PmagPy is a software package for analyzing paleomagnetic and rock magnetic data using Python. If you need to install Python and PmagPy locally, follow the install instructions. This notebook demonstrates the use of PmagPy functions from within a Jupyter notebook. For examples of how to use PmagPy scripts on the command line, see the static version of PmagPy_cli.ipynb.

You are currently looking at static html, but you may want to run this notebook interactively. You can check out a Binder demo of this notebook (please be patient, it loads very slowly).

You can also run this notebook locally. To do so, you'll need to first install Python and PmagPy (see the Cookbook for install instructions). You can then launch the notebook from your command line (see more details here).

Many PmagPy scripts are designed to work with data in the MagIC format. This notebook uses Data Model 3.0: https://www.earthref.org/MagIC/data-models/3.0 There are nine basic tables: contribution, locations, sites, samples, specimens, measurements, criteria, ages and images. These are tab-delimited data tables with the first line consisting of a delimiter and the table name (e.g., tab measurements). All of the examples here are tab delimited. The second line contains the column names (e.g., specimen experiment method_codes treat_temp.....). Each subsequent line is a single record.
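The two-line header can be written and parsed with plain Python. This is a minimal sketch of the format described above (the file content and column names here are illustrative, not from a real contribution):

```python
import csv
import io

# a minimal MagIC-format measurements table, built in memory:
# line 1: delimiter type and table name; line 2: column names; then one record per line
magic_text = ("tab\tmeasurements\n"
              "specimen\tmethod_codes\ttreat_temp\n"
              "sp01\tLT-AF-Z\t273\n")

lines = magic_text.splitlines()
delim, table_name = lines[0].split('\t')   # 'tab', 'measurements'
# everything after the first line is an ordinary tab-delimited table
reader = csv.DictReader(io.StringIO('\n'.join(lines[1:])), delimiter='\t')
records = list(reader)                     # one dict per record

print(table_name, records)
```

This is the structure every table shares; only the table name on line one and the columns on line two change.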

MagIC tables have many columns, only some of which are used in a particular instance. So combining files of the same type must be done carefully to ensure that the right data come under the right headings. The program combine_magic can be used to combine any number of MagIC files of a given type.
It reads in MagIC formatted files of a common type (e.g., sites.txt) and combines them into a single file, taking care that all the columns are preserved. For example, if there are both AF and thermal data from a study and we created a measurements.txt formatted file for each, we could use combine_magic.py on the command line to combine them together into a single measurements.txt file. In a notebook, we use ipmag.combine_magic().
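The careful column alignment that ipmag.combine_magic() takes care of can be sketched as a union of all column names followed by padding each record with blanks. This is a conceptual stand-in with illustrative data, not the actual implementation:

```python
# two sites-type record lists with partly different columns (illustrative data)
af_recs = [{'site': 'a1', 'dir_dec': '10.1'}]
thermal_recs = [{'site': 'a2', 'int_abs': '4.2e-5'}]

# take the union of all column names so no value lands under the wrong heading
columns = []
for rec in af_recs + thermal_recs:
    for col in rec:
        if col not in columns:
            columns.append(col)

# pad every record with blanks for the columns it lacks
combined = [{col: rec.get(col, '') for col in columns}
            for rec in af_recs + thermal_recs]
print(combined)
```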

Files downloaded from the MagIC search interface have ages that are in the original units, but what is often desired is for them to be in a single unit. For example, if we searched the MagIC database for all absolute paleointensity data (records with method codes of 'LP-PI-TRM') from the last five million years, the data sets have a variety of age units. We can use pmag.convert_ages() to convert them all to millions of years.
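The core of the conversion is a per-unit scale factor applied to each record. The following is a simplified stand-in for pmag.convert_ages() covering only a few units (the real function handles many more units plus the age_low/age_high columns):

```python
# simplified stand-in for pmag.convert_ages(): normalize ages to Ma
TO_MA = {'Ma': 1.0, 'ka': 1e-3, 'Years BP': 1e-6, 'Ga': 1e3}

def ages_to_ma(records):
    """Return copies of the records with 'age' converted to millions of years."""
    out = []
    for rec in records:
        rec = dict(rec)
        rec['age'] = float(rec['age']) * TO_MA[rec['age_unit']]
        rec['age_unit'] = 'Ma'
        out.append(rec)
    return out

recs = [{'age': '780', 'age_unit': 'ka'}, {'age': '2.5', 'age_unit': 'Ma'}]
print(ages_to_ma(recs))
```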

First we follow the instructions for unpacking downloaded files in download_magic.

After some minimal filtering using Pandas, we can convert a DataFrame to a list of dictionaries required by most PmagPy functions and use pmag.convert_ages() to convert all the ages. The converted list of dictionaries can then be turned back into a Pandas DataFrame and either plotted or filtered further as desired.

In this example, we filter for data older than the Brunhes (0.78 Ma) and younger than 5 Ma, then plot them against latitude. We can also use vdm_b to plot the intensities expected from the present dipole moment (~80 ZAm$^2$).
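The expected field follows from the standard dipole relation B = mu0 * v * sqrt(1 + 3 sin^2(lat)) / (4 pi r^3), which is the formula vdm_b evaluates. A self-contained sketch of that relation (assuming Earth's mean radius; pmag.vdm_b is what should be used in practice):

```python
from math import pi, radians, sin, sqrt

MU0 = 4 * pi * 1e-7   # permeability of free space, T m/A
R_EARTH = 6.371e6     # Earth's mean radius, m

def dipole_b(vdm, lat):
    """Surface field in tesla from a virtual dipole moment (A m^2) at latitude lat (degrees)."""
    lam = radians(lat)
    return MU0 * vdm * sqrt(1 + 3 * sin(lam) ** 2) / (4 * pi * R_EARTH ** 3)

# the present dipole moment of ~80 ZAm^2 gives roughly 31 microtesla at the equator,
# and twice that at the poles
print(dipole_b(80e21, 0) * 1e6)
print(dipole_b(80e21, 90) * 1e6)
```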

# read in the sites.txt file as a dataframe
site_df = pd.read_csv('data_files/convert_ages/sites.txt', sep='\t', header=1)
# get rid of any records without intensity data or latitude
site_df = site_df.dropna(subset=['int_abs', 'lat'])
# pick out the sites with 'age' filled in
site_df_age = site_df.dropna(subset=['age'])
# pick out those with age_low and age_high filled in
site_df_lowhigh = site_df.dropna(subset=['age_low', 'age_high'])
# concatenate the two
site_all_ages = pd.concat([site_df_age, site_df_lowhigh])
# get rid of duplicates (records with age, age_high AND age_low)
site_all_ages.drop_duplicates(inplace=True)
# Pandas reads in blanks as NaN, which pmag.convert_ages hates
# this replaces all the NaNs with blanks
site_all_ages.fillna('', inplace=True)
# convert to a list of dictionaries
sites = site_all_ages.to_dict('records')
# convert the ages to Ma
converted_df = pmag.convert_ages(sites)
# turn it back into a DataFrame
site_ages = pd.DataFrame(converted_df)
# filter away
site_ages = site_ages[site_ages.age.astype(float) <= 5]
site_ages = site_ages[site_ages.age.astype(float) >= 0.05]

It is frequently desirable to format tables for publications from the MagIC formatted files. This example is for the sites.txt formatted file. It will create a site information table with the location and age information, and directions and/or intensity summary tables. The function to call is ipmag.sites_extract().

This program unpacks the .txt files downloaded from the MagIC database into individual text files. It also has an option to create separate files for each location.

As an example, go to the MagIC database at http://earthref.org/MAGIC/doi/10.1029/2003gc000661 and download the contribution. Make a folder called MagIC_download and move the downloaded .txt file into it. Now use the program download_magic to unpack the .txt file (magic_contribution_16533.txt).

To do this within a notebook, use the function ipmag.download_magic().

We can just turn around and try to upload the file downloaded in download_magic. For this we use ipmag.upload_magic() in the same directory as for the download. You can try to upload the file you create to the MagIC data base as a private contribution here: https://www2.earthref.org/MagIC/upload

In [28]:

help(ipmag.upload_magic)

Help on function upload_magic in module pmagpy.ipmag:
upload_magic(concat=1, dir_path='.', dmodel=None, vocab='', contribution=None)
Finds all magic files in a given directory, and compiles them into an
upload.txt file which can be uploaded into the MagIC database.
Parameters
----------
concat : boolean where True means do concatenate to upload.txt file in dir_path,
False means write a new file (default is False)
dir_path : string for input/output directory (default ".")
dmodel : pmagpy data_model.DataModel object,
if not provided will be created (default None)
vocab : pmagpy controlled_vocabularies3.Vocabulary object,
if not provided will be created (default None)
contribution : pmagpy contribution_builder.Contribution object, if not provided will be created
in directory (default None)
Returns
----------
tuple of either: (False, error_message, errors, all_failing_items)
if there was a problem creating/validating the upload file
or: (filename, '', None, None) if the file creation was fully successful.

MagIC data model 3 took out redundant columns in the MagIC tables, so the hierarchy of specimens (in the measurements and specimens tables) up to samples, sites and locations is lost. To put these back into the measurements table, we have the function cb.add_sites_to_meas_table(), which is super handy when data analysis requires it.
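Conceptually this is a join of the measurements table up through the specimens and samples tables. A minimal sketch with illustrative records (not the actual cb.add_sites_to_meas_table() implementation, which works on Contribution objects):

```python
# illustrative lookup tables: the hierarchy lives in the specimens and samples tables
specimens = {'sp01': 'sa01'}   # specimen -> sample
samples = {'sa01': 'si01'}     # sample -> site
measurements = [{'specimen': 'sp01', 'magn_moment': '1.2e-8'}]

# walk each measurement record up the hierarchy and fill in sample and site
for rec in measurements:
    rec['sample'] = specimens.get(rec['specimen'], '')
    rec['site'] = samples.get(rec['sample'], '')

print(measurements)
```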

The MagIC data model has several different forms of magnetization with different normalizations (moment, volume, or mass). So to find the one used in a particular measurements table we can use this handy function.
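The lookup amounts to checking which of the candidate magnetization columns actually contains data. A simplified stand-in (the candidate list matches the MagIC normalizations named above, but PmagPy's own precedence rules may differ):

```python
# the magnetization columns a MagIC measurements table may use
MAGN_COLS = ['magn_moment', 'magn_volume', 'magn_mass']

def magnetization_col(records):
    """Return the first magnetization column with any data, else None."""
    for col in MAGN_COLS:
        if any(rec.get(col) not in (None, '') for rec in records):
            return col
    return None

recs = [{'specimen': 'sp01', 'magn_volume': '2.1'},
        {'specimen': 'sp02', 'magn_volume': '1.7'}]
print(magnetization_col(recs))
```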

We imported this module as convert. It provides many functions for creating MagIC format files from non-MagIC formats. The MagIC formatted files can then be used with PmagPy programs and uploaded to the MagIC database. Let's take a look at the options:

These are measurement data for a single specimen, so we can take a quickie look at the data in an equal area projection.

In [36]:

help(ipmag.plot_di)

Help on function plot_di in module pmagpy.ipmag:
plot_di(dec=None, inc=None, di_block=None, color='k', marker='o', markersize=20, legend='no', label='', title='', edge='')
Plot declination, inclination data on an equal area plot.
Before this function is called a plot needs to be initialized with code that looks
something like:
>fignum = 1
>plt.figure(num=fignum,figsize=(10,10),dpi=160)
>ipmag.plot_net(fignum)
Required Parameters
-----------
dec : declination being plotted
inc : inclination being plotted
or
di_block: a nested list of [dec,inc,1.0]
(di_block can be provided instead of dec, inc in which case it will be used)
Optional Parameters (defaults are used if not specified)
-----------
color : the default color is black. Other colors can be chosen (e.g. 'r')
marker : the default marker is a circle ('o')
markersize : default size is 20
label : the default label is blank ('')
legend : the default is no legend ('no'). Putting 'yes' will plot a legend.
edge : marker edge color - if blank, is color of marker

This program converts Micromag hysteresis files into MagIC formatted files. Because this program creates files for uploading to the MagIC database, specimens should also have sample/site/location information, which can be provided on the command line. If this information is not available, for example if this is a synthetic specimen, specify syn=True to use a synthetic name.

Someone named Lima Tango has measured a synthetic specimen named myspec for hysteresis and saved the data in a file named agm_magic_example.agm in the agm_magic/agm_directory folder. The backfield IRM curve for the same specimen was saved in the same directory as agm_magic_example.irm. Use the function convert.agm() to convert the data into a measurements.txt output file. For the backfield IRM file, set the keyword "bak" to True. These were measured using cgs units, so be sure to set the units keyword argument properly. Combine the two output files together using the instructions for combine_magic. The agm files can be plotted using hysteresis_magic but the back-field plots are broken.

# read in the measurements data
meas_data = pd.read_csv('data_files/convert_2_magic/agm_magic/agm.magic', sep='\t', header=1)
# pick out the hysteresis data using the method code for the hysteresis lab protocol
hyst_data = meas_data[meas_data.method_codes.str.contains('LP-HYS')]
# make a list of specimens
specimens = hyst_data.specimen.unique()
cnt = 1
for specimen in specimens:
    # make the dictionary for figures that pmagplotlib likes
    HDD = {'hyst': cnt, 'deltaM': cnt + 1, 'DdeltaM': cnt + 2}
    spec_data = hyst_data[hyst_data.specimen == specimen]
    # make a list of the field data
    B = spec_data.meas_field_dc.tolist()
    # make a list of the magnetization data
    M = spec_data.magn_moment.tolist()
    # call the plotting function
    hpars = pmagplotlib.plot_hdd(HDD, B, M, specimen)
    hpars['specimen'] = specimen
    # print out the hysteresis parameters
    print(specimen, ': \n', hpars)
    cnt += 3

Craig Jones’ PaleoMag software package (http://cires.colorado.edu/people/jones.craig/PMag3.html) imports various file formats, including the ’CIT’ format developed for the Caltech lab and now used in the magnetometer control software that ships with 2G magnetometers equipped with a vertical sample changer system. The documentation for the CIT sample format is here: http://cires.colorado.edu/people/jones.craig/PMag_Formats.html#SAM_format. Demagnetization data for each specimen are in their own file in a directory with all the data for a site or study. These files are strictly formatted with fields determined by the character number in the line. There must be a file with the suffix ‘.sam’ in the same directory as the specimen data files which gives details about the specimens and lists the specimen measurement files in the directory.

The first line in the .sam file is a comment (in this case the site name), the second is the latitude and longitude followed by a declination correction. In these data, the declination correction was applied to the specimen orientations so the value of the declination correction is set to be 0.

For detailed description of the .sam and sample file formats, check the PaleoMag Formats website linked to above.
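The first two lines of a .sam file can be picked apart as described above. This is a whitespace-splitting sketch with made-up content, just to show the layout; real CIT files are fixed-column formatted, so convert.cit() should be used in practice:

```python
# illustrative .sam header: a comment line, then latitude, longitude and
# declination correction, then the list of specimen measurement files
sam_text = """PI47 site
47.6  -87.0  0.0
PI47-1a
PI47-2a
"""

lines = sam_text.splitlines()
comment = lines[0]                                       # free-form comment (here, a site name)
lat, lon, dec_corr = (float(x) for x in lines[1].split())
spec_files = lines[2:]                                   # the specimen measurement files

print(comment, lat, lon, dec_corr, spec_files)
```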

Use the function convert.cit() to convert the CIT data files from the Swanson-Hysell lab at Berkeley for the PI47 site in the data_files/convert_2_magic/cit_magic/PI47 directory. The site (PI47) was part of a data set published in Fairchild et al. (2016) (available in the MagIC database: https://earthref.org/MagIC/11292/). The location name was “Slate Islands”, the naming convention was #2, the specimen name is specified with 1 character, we don’t wish to average replicate measurements, and the samples were collected by drilling and oriented with a magnetic compass (”FS-FD" and "SO-MAG”).

Use the function convert.cit() to convert the CIT data files from the USGS lab at Menlo Park. The data file is in the data_files/convert_2_magic/cit_magic/USGS/bl9-1 directory, the file name is bl9-1.sam, and the analyst was Hagstrum. The location name was “Boring volcanic field”, and the site name was set by Hagstrum to BL9001 because the site name cannot be determined from the sample name with the currently available options. The samples were collected by drilling and oriented with a magnetic compass and sun compass (”FS-FD" and "SO-MAG”), the measurements are in Oersted instead of the standard millitesla, and we don’t wish to average replicate measurements.