Welcome to MTH5’s documentation!


MTH5

MTH5 is an HDF5 data container for magnetotelluric time series data, but could be extended to other data types. This package provides tools for reading/writing/manipulating MTH5 files.

MTH5 uses h5py to interact with the HDF5 file, xarray to work with the data conveniently, and mt_metadata to handle all metadata.

This project is in cooperation with the Incorporated Research Institutions for Seismology (IRIS), the U.S. Geological Survey, and other collaborators. Facilities of the IRIS Consortium are supported by the National Science Foundation’s Seismological Facilities for the Advancement of Geoscience (SAGE) Award under Cooperative Support Agreement EAR-1851048. USGS is partially funded through the Community for Data Integration and IMAGe through the Minerals Resources Program.

  • Version: 0.3.1

  • Free software: MIT license

  • Documentation: https://mth5.readthedocs.io.

  • Examples: Jupyter Notebook examples are in docs/examples/notebooks and can be run interactively on Binder

  • Suggested Citation: Peacock, J. R., Kappler, K., Ronan, T., Heagy, L., Kelbert, A., Frassetto, A. (2022) MTH5: An archive and exchangeable data format for magnetotelluric time series data, Computers & Geosciences, 162, doi:10.1016/j.cageo.2022.105102

Features

  • Read and write HDF5 files formatted for magnetotelluric time series.

  • From MTH5 a user can create an MTH5 file; get, add, and remove stations, runs, channels, and filters; and manage all associated metadata.

  • Data are stored as xarray objects, which house the data and metadata together and index the data by time.

  • Readers for some data types are included as plugins:

    Instrument    File Types
    NIMS          [.bin, .bnn]
    LEMI424       [.txt]
    USGS ASCII    [.asc, .gzip]
    Zonge ZEN     [.z3d]
    Phoenix       [.td_*, .bin]
    miniSEED      [.mseed]

Introduction

The goal of MTH5 is to provide a self-describing, hierarchical data format for working with, sharing, and archiving magnetotelluric time series data. MTH5 was cooperatively developed with community input and follows logically how magnetotelluric data are collected. This module provides open-source tools to interact with an MTH5 file.

The metadata follows the standards proposed by the IRIS-PASSCAL MT Software working group and documented in MT Metadata Standards. Note: if you would like to comment or contribute, check out Issues or Slack.

MTH5 Format

  • The basic format of MTH5 is illustrated below, where metadata is attached at each level.

MTH5 File Version 0.1.0

_images/example_mt_file_structure.svg

MTH5 file version 0.1.0 was the original file version, with Survey as the highest level of the file. This has the limitation that only one Survey can be saved in a single file; storing multiple Surveys requires a higher level, Experiment.

Important: Some MTH5 0.1.0 files have already been archived on ScienceBase, and version 0.1.0 has been used as the working format for Aurora, so it is documented here for reference. Moving forward the new format will be 0.2.0, as described below.

MTH5 File Version 0.2.0

_images/example_mt_file_structure_v2.svg

MTH5 file version 0.2.0 has Experiment as the top level. This allows multiple Surveys to be included in a single file and therefore allows more flexibility. For example, if you would like to remote-reference stations in a local survey against stations from a different survey collected at the same time, you can keep all those surveys and stations in the same file, which makes processing easier.

Hint

MTH5 is comprehensively logged, so if any problems arise you can always check mth5_debug.log (if you are in debug mode; change the mode in mth5.__init__) and mth5_error.log, both of which are written to your current working directory.

Examples

Make a simple MTH5 with one station, 2 runs, and 2 channels (version 0.2.0)

from mth5.mth5 import MTH5

mth5_object = MTH5()
mth5_object.open_mth5(r"/home/mt/example_mth5.h5", "a")

# add a survey
survey_group = mth5_object.add_survey("example")

# add a station with metadata
station_group = mth5_object.add_station("mt001", survey="example")
station_group = survey_group.stations_group.add_station("mt002")
station_group.metadata.location.latitude = "40:05:01"
station_group.metadata.location.longitude = -122.3432
station_group.metadata.location.elevation = 403.1
station_group.metadata.acquired_by.author = "me"
station_group.metadata.orientation.reference_frame = "geomagnetic"

# IMPORTANT: Must always use the write_metadata method when metadata is updated.
station_group.write_metadata()

# add runs
run_01 = mth5_object.add_run("mt002", "001", survey="example")
run_02 = station_group.add_run("002")

# add channels
ex = mth5_object.add_channel("mt002", "001", "ex", "electric", None, survey="example")
hy = run_01.add_channel("hy", "magnetic", None)

print(mth5_object)

/:
====================
        |- Group: Experiment
        --------------------
                |- Group: Reports
                -----------------
                |- Group: Standards
                -------------------
                        --> Dataset: summary
                        ......................
                |- Group: Surveys
                -----------------
                        |- Group: example
                        -----------------
                                |- Group: Filters
                                -----------------
                                        |- Group: coefficient
                                        ---------------------
                                        |- Group: fap
                                        -------------
                                        |- Group: fir
                                        -------------
                                        |- Group: time_delay
                                        --------------------
                                        |- Group: zpk
                                        -------------
                                |- Group: Reports
                                -----------------
                                |- Group: Standards
                                -------------------
                                        --> Dataset: summary
                                        ......................
                                |- Group: Stations
                                ------------------
                                        |- Group: mt001
                                        ---------------
                                        |- Group: mt002
                                        ---------------
                                                |- Group: 001
                                                -------------
                                                        --> Dataset: ex
                                                        .................
                                                        --> Dataset: hy
                                                        .................
                                                |- Group: 002
                                                -------------

Installation

Stable release

To install MTH5, run this command in your terminal:

$ pip install mth5

This is the preferred method to install MTH5, as it will always install the most recent stable release.

If you don’t have pip installed, this Python installation guide can help you through the process.

Conda Forge

To install MTH5 from Conda run this command:

$ conda config --add channels conda-forge
$ conda config --set channel_priority strict
$ conda install mth5

This installs the same package that pip pulls from PyPI, and it should work better if you are using an Anaconda environment.

From sources

The sources for MTH5 can be downloaded from the Github repo.

You can either clone the public repository:

$ git clone https://github.com/kujaku11/mth5.git

Or download the tarball:

$ curl -OJL https://github.com/kujaku11/mth5/tarball/master

Once you have a copy of the source, you can install it with:

$ python setup.py install

Or you can install it in editable mode so you can adjust the code as needed:

$ pip install -e .

Contributing

Contributions are welcome, and they are greatly appreciated! Every little bit helps, and credit will always be given.

You can contribute in many ways:

Types of Contributions

Report Bugs

Report bugs at https://github.com/kujaku11/mth5/issues.

If you are reporting a bug, please include:

  • Your operating system name and version.

  • Any details about your local setup that might be helpful in troubleshooting.

  • Detailed steps to reproduce the bug.

Fix Bugs

Look through the GitHub issues for bugs. Anything tagged with “bug” and “help wanted” is open to whoever wants to implement it.

Implement Features

Look through the GitHub issues for features. Anything tagged with “enhancement” and “help wanted” is open to whoever wants to implement it.

Write Documentation

MTH5 could always use more documentation, whether as part of the official MTH5 docs, in docstrings, or even on the web in blog posts, articles, and such.

Submit Feedback

The best way to send feedback is to file an issue at https://github.com/kujaku11/mth5/issues.

If you are proposing a feature:

  • Explain in detail how it would work.

  • Keep the scope as narrow as possible, to make it easier to implement.

  • Remember that this is a volunteer-driven project, and that contributions are welcome :)

Get Started!

Ready to contribute? Here’s how to set up mth5 for local development.

  1. Fork the mth5 repo on GitHub.

  2. Clone your fork locally:

    $ git clone git@github.com:your_name_here/mth5.git
    
  3. Install your local copy into a virtualenv. Assuming you have virtualenvwrapper installed, this is how you set up your fork for local development:

    $ mkvirtualenv mth5
    $ cd mth5/
    $ python setup.py develop
    
  4. Create a branch for local development:

    $ git checkout -b name-of-your-bugfix-or-feature
    

    Now you can make your changes locally.

  5. When you’re done making changes, check that your changes pass flake8 and the tests, including testing other Python versions with tox:

    $ flake8 mth5 tests
    $ pytest
    $ tox
    

    To get flake8 and tox, just pip install them into your virtualenv.

  6. Commit your changes and push your branch to GitHub:

    $ git add .
    $ git commit -m "Your detailed description of your changes."
    $ git push origin name-of-your-bugfix-or-feature
    
  7. Submit a pull request through the GitHub website.

Pull Request Guidelines

Before you submit a pull request, check that it meets these guidelines:

  1. The pull request should include tests.

  2. If the pull request adds functionality, the docs should be updated. Put your new functionality into a function with a docstring, and add the feature to the list in README.rst.

  3. The pull request should work for Python 3.5, 3.6, 3.7 and 3.8, and for PyPy. Check the Actions report to make sure that the tests pass for all supported Python versions.

Tips

To run a subset of tests:

$ pytest tests/test_mth5.py

Deploying

A reminder for the maintainers on how to deploy. Make sure all your changes are committed (including an entry in HISTORY.rst). Then run:

$ bump2version patch # possible: major / minor / patch
$ git push
$ git push --tags

GitHub Actions will then deploy to PyPI if tests pass.

Credits

This project is in cooperation with the Incorporated Research Institutions for Seismology (IRIS), the U.S. Geological Survey, and other collaborators. Facilities of the IRIS Consortium are supported by the National Science Foundation’s Seismological Facilities for the Advancement of Geoscience (SAGE) Award under Cooperative Support Agreement EAR-1851048. USGS is partially funded through the Community for Data Integration and IMAGe through the Minerals Resources Program.

Suggested Citation: Peacock, J. R., Kappler, K., Ronan, T., Heagy, L., Kelbert, A., Frassetto, A. (2022) MTH5: An archive and exchangeable data format for magnetotelluric time series data, Computers & Geosciences, 162, doi:10.1016/j.cageo.2022.105102

Development Lead

Contributors

  • Karl Kappler

  • Lindsey Heagy

  • Tim Ronan

  • Anna Kelbert

  • Laura Keyson

History

0.1.0 (2021-06-30)

  • First release on PyPI.

0.2.0 (2021-10-31)

  • Updated the structure of MTH5 to have Experiment as the top level

  • Updated tests

  • Backwards compatibility works

  • Updated Docs

0.2.5 (2022-04-07)

  • fixed bugs

  • Added TransferFunctions and TransferFunction groups at the station level that can now hold transfer functions

  • Added channel_summary and tf_summary tables that are updated upon close if the file is in ‘w’ mode

  • Updated tests

  • Updated documentation

  • Note: tests for make_mth5 from IRIS are currently not working as there has been some reorganization with data at the DMC

0.2.6 (2022-07-01)

  • Added calibration functions

  • minor bug fixes

  • updated tests

  • updated documentation

0.2.7 (2022-09-14)

  • Rebased IO module to make a module for each data logger type

  • Updated tests

  • Updated documentation

  • Factored make_mth5

0.3.0 (2022-09-25)

  • change default initialize_mth5 to use append mode, issue #92 by @kkappler in #94

  • Fix issue 105 by @kkappler in PR #106

  • adding in parallel mth5 tutorial by @nre900 in #110

  • adding in new tutorial and modifications to mth5_in_parallel.ipynb by @nre900 in issue #112

  • Add phoenix reader by @kujaku11 in PR #103

  • Remove response by @kujaku11 in PR #100

0.3.1 (2023-01-18)

  • Speed up station and survey validation

  • Tutorial updates by @nre900

  • remove kwarg specifying default value

  • update initialize_mth5

  • Have a single metadata object for ChannelTS and RunTS

  • Use h5 Paths to get groups and datasets

  • Bump wheel from 0.33.6 to 0.38.1

GOTCHAS

There are some gotchas, or things you should understand, when using HDF5 files in general and MTH5 in particular.

Compression

Compression can slow down making an MTH5 file, so you should understand the compression parameters. See H5 Compression and DataSets for more information.

Compression is set in MTH5 when you instantiate an MTH5 object:

>>> m = MTH5(shuffle=None, fletcher32=None, compression=None, compression_opts=None)

The compression parameters will be validated using mth5.helpers.validate_compression

Datasets can use chunks; chunks defaults to True, which lets h5py pick the most efficient way to chunk the data.
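The same knobs MTH5 exposes at instantiation can be seen in a standalone h5py sketch (plain h5py, not the MTH5 API): chunking, the gzip compression level, the shuffle filter, and fletcher32 checksums.

```python
import h5py
import numpy as np

# Synthetic data; the filter settings mirror the MTH5 keyword arguments.
data = np.random.default_rng(0).normal(size=100_000)

with h5py.File("compression_demo.h5", "w") as f:
    f.create_dataset(
        "ex",
        data=data,
        chunks=True,         # let h5py choose the chunk shape
        compression="gzip",
        compression_opts=4,  # gzip level 0-9; 4 is the default
        shuffle=True,        # byte shuffle often improves the gzip ratio
        fletcher32=True,     # per-chunk checksums
    )

# The filters are recorded in the file and visible on re-opening:
with h5py.File("compression_demo.h5", "r") as f:
    dset = f["ex"]
    opts = (dset.compression, dset.compression_opts, dset.shuffle, dset.fletcher32)
```

Because the filters are stored per dataset, any HDF5 reader decompresses the data transparently; the cost is paid once at write time.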

Lossless compression filters
GZIP filter ("gzip")

Available with every installation of HDF5, so it’s best where portability is required. Good compression, moderate speed. compression_opts sets the compression level and may be an integer from 0 to 9, default is 4.

LZF filter ("lzf")

Available with every installation of h5py (C source code also available). Low to moderate compression, very fast. No options.

SZIP filter ("szip")

Patent-encumbered filter used in the NASA community. Not available with all installations of HDF5 due to legal reasons. Consult the HDF5 docs for filter options.

Logging

Logging is great, but it can have dramatic effects on performance, mainly because I’m new to logging and probably haven’t written the loggers most efficiently. By default the logging level is set to INFO, which runs as you might expect with slight overhead. If you change the logging level to DEBUG, expect a slowdown. You should only do this if you are a developer or are curious as to why something looks weird.

Conventions

Some conventions that have been implemented:

  • All metadata names are lower case and _ separated.

Survey Names

Survey names are not standardized yet, but will be in the future. Survey names are commonly a string of up to 10 characters, though there is no limit, with characters representing the project or geographic location. For example, a survey in Long Valley could be lv, a survey of the continental US could be conus, and an NSF-funded project on imaging magma under Mount St. Helens (iMUSH) could be imush.

Station Names

Station names are not standardized yet, but will be in the future. Station names are often 4-6 characters. Convention is a 2 or 3 character survey representation followed by a 2-4 digit number representing the station number.

For example a survey in California could begin with ca and station 1 would be ca001.

Now that 3-D grids are more common, people name lines or rows with a letter or number. For instance, the EarthScope/MT Array across the US uses {state}{line}{number}, so a station could be can10 for station 10 on the N easting line in California.

Future suggestion is:

{survey_name}{line}{row}{number}
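The suggested pattern can be sketched with a small helper. This function is hypothetical, purely illustrative, and not part of the MTH5 API.

```python
# Hypothetical helper illustrating the suggested
# {survey_name}{line}{row}{number} pattern with a zero-padded number;
# not part of MTH5.
def station_name(survey, line="", row="", number=0, width=3):
    return f"{survey}{line}{row}{number:0{width}d}"

# EarthScope-style name: state "ca", line "n", station 10
print(station_name("ca", line="n", number=10, width=2))  # can10
```

Zero-padding the number keeps station names sorting correctly as plain strings, which matters when names become HDF5 group keys.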

Run Names

Run names have not been standardized yet but will be in the future. There are two conventions: alphabetic run names like run a, and numeric names like run 001.

The disadvantage of alphabetic names is that you can run out of letters if the number of runs gets large. Alphabetic names are common in long period experiments where you have only a handful of runs, such as when you need to change batteries or fix cables.

Numeric names are more flexible and will probably be the standard in the future. For broadband experiments where multiple sampling rates are used there can be many runs. For example, if you have a continuous band sampling at 50 samples/second with 5-second bursts of 1000 samples/second every minute, you will have 60 short runs per hour. Keeping track of these can be tedious, so a numeric counter is easiest.

A future suggestion is a 4-digit string, perhaps including the sample rate as a character code similar to seismic FDSN standards:

{sample_rate}{run_number}

Channel Names

Channel names have not been standardized yet but will be in the future. Direction indicators are often given as:

  • x: northing direction or strike direction

  • y: easting direction or across strike direction

  • z: vertical direction

For MT we have 2 main types of channels electric and magnetic.

Electric channel names often start with an e and are followed by a directional indicator, like ex for an electric channel in the northing direction.

Magnetic channel names often start with an h or b followed by a directional indicator, like hz for a vertical magnetic channel.

Extending this to other data types, we have auxiliary channels, which could be any other type of geophysical measurement, like temperature or battery voltage. We suggest using the full name at the moment; maybe in the future we will have measurement codes like seismic FDSN, but those can be cryptic.

In the future channel names will likely be standardized as:

{channel_type}{channel_direction}{channel_number}

to allow for flexibility to other methods like IP and DC surveys.
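The proposed channel naming can also be sketched with a small helper. This function and its type-code mapping are hypothetical illustrations, not part of the MTH5 API.

```python
# Hypothetical helper illustrating the proposed
# {channel_type}{channel_direction}{channel_number} pattern;
# the code mapping follows the conventions described above
# (e for electric, h for magnetic). Not part of MTH5.
TYPE_CODES = {"electric": "e", "magnetic": "h", "auxiliary": "aux"}

def channel_name(channel_type, direction="", number=None):
    suffix = "" if number is None else str(number)
    return f"{TYPE_CODES[channel_type]}{direction}{suffix}"

print(channel_name("electric", "x"))        # ex
print(channel_name("magnetic", "z"))        # hz
print(channel_name("auxiliary", number=1))  # aux1
```

Keeping the type, direction, and number as separate pieces is what lets the same scheme extend to IP and DC surveys, where direction may be absent but channel numbers multiply.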

Usage v0.1.0

Important

This usage describes using MTH5 version 0.1.0 where Survey is the top level of the MTH5 file structure. The current version is 0.2.0. Some version 0.1.0 files have been archived and this is here for reference.

Basic Groups

Each MTH5 file has default groups. A ‘group’ is basically like a folder that can contain other groups or datasets. These are:

  • SurveyGroup: The master or root group of the HDF5 file

  • FiltersGroup: Holds all filters and filter information

  • ZPKGroup: Holds pole zero filters

  • FAPGroup: Holds frequency look up table filters

  • FIRGroup: Holds finite impulse response filters

  • CoefficientGroup: Holds coefficient filters

  • TimeDelayGroup: Holds time delay filters

  • Reports: Holds any reports relevant to the survey

  • StandardsGroup: A summary of metadata standards used

  • StationsGroup: Holds all the stations and subsequent data

    • StationGroup: Holds a single station

      • RunGroup: Holds a single run

        • ChannelDataset: Holds a single channel

Each group also has a summary table to make it easier to search and access different parts of the file. Each entry in the table will have an HDF5 reference that can be directly used to get the appropriate group or dataset without using the path.
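The mechanism behind those table entries can be shown with plain h5py (this sketch is not the MTH5 API; within MTH5 the equivalent lookup is m.from_reference(ref), using the hdf5_reference column of a summary table). An HDF5 object reference lets you jump straight to a group or dataset without spelling out its path.

```python
import h5py
import numpy as np

# Build a tiny file with an MTH5-like path, grab a raw object
# reference to the channel dataset, then dereference it without
# using the path. File name and data are illustrative only.
with h5py.File("reference_demo.h5", "w") as f:
    dset = f.create_dataset("Survey/Stations/mt001/001/ex", data=np.arange(10))

    ref = dset.ref            # raw object reference, storable in a summary table
    resolved = f[ref]         # dereference: no path needed
    resolved_name = resolved.name
```

This is why searching a summary table and following its reference column is faster and less error-prone than reconstructing survey/station/run paths by hand.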

See also

mth5.groups

Example

Example of Working with a Version 0.1.0 MTH5 File
[1]:
from mth5.mth5 import MTH5
2022-03-24T18:12:02 [line 157] numexpr.utils._init_num_threads - INFO: NumExpr defaulting to 8 threads.
2022-03-24 18:12:04,296 [line 135] mth5.setup_logger - INFO: Logging file can be found C:\Users\jpeacock\Documents\GitHub\mth5\logs\mth5_debug.log
Initialize an MTH5 object with file version 0.1.0
[2]:
m = MTH5(file_version="0.1.0")

Have a look at the attributes of the file

[3]:
m.file_attributes
[3]:
{'file.type': 'MTH5',
 'file.version': '0.1.0',
 'file.access.platform': 'Windows-10-10.0.19041-SP0',
 'file.access.time': '2022-03-25T01:12:04.433011+00:00',
 'mth5.software.version': '0.2.4',
 'mth5.software.name': 'mth5',
 'data_level': 1}

Here are the data set options

[4]:
m.dataset_options
[4]:
{'compression': 'gzip',
 'compression_opts': 9,
 'shuffle': True,
 'fletcher32': True}

The file is currently not open yet

[5]:
m
[5]:
HDF5 file is closed and cannot be accessed.
Open a new file

We will open the file in mode w here, which will overwrite the file if it already exists. If you don’t want to do that, or are unsure whether a file already exists, the safest option is mode a.

Context Manager

It is strongly encouraged that you use with when making an MTH5 file, even if you want to open the file again afterwards.

with MTH5(**kwargs) as m:
    m.open_mth5(filename)
    #pack MTH5

With this pattern your MTH5 file will be made safely even if something goes wrong during packing: the with statement automatically flushes and closes the MTH5 upon exit, including when errors are encountered.

Here we are just showing an example and how to interrogate an MTH5 file.

[6]:
m.open_mth5(r"example.h5", "w")
2022-03-24 18:12:04,481 [line 604] mth5.mth5.MTH5.open_mth5 - WARNING: example.h5 will be overwritten in 'w' mode
2022-03-24 18:12:04,787 [line 687] mth5.mth5.MTH5._initialize_file - INFO: Initialized MTH5 0.1.0 file example.h5 in mode w

Now that we have initiated a file, let’s see what’s in an empty file.

[7]:
m
[7]:
/:
====================
    |- Group: Survey
    ----------------
        |- Group: Filters
        -----------------
            |- Group: coefficient
            ---------------------
            |- Group: fap
            -------------
            |- Group: fir
            -------------
            |- Group: time_delay
            --------------------
            |- Group: zpk
            -------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
            --> Dataset: summary
            ......................
        |- Group: Stations
        ------------------
        --> Dataset: channel_summary
        ..............................
        --> Dataset: tf_summary
        .........................

We can see the groups that are initialized by default. Here are the methods an MTH5 object contains. You can open/close an MTH5 file; add/remove stations, runs, and channels; read from an mt_metadata.timeseries.Experiment object to fill the metadata and structure before adding data; and create an mt_metadata.timeseries.Experiment object for archiving.

[8]:
print("\n".join(sorted([func for func in dir(m) if callable(getattr(m, func)) and not func.startswith("_")])))
2022-03-24 18:12:04,823 [line 470] mth5.mth5.MTH5.experiment_group - INFO: File version 0.1.0 does not have an Experiment Group
2022-03-24 18:12:04,833 [line 497] mth5.mth5.MTH5.surveys_group - INFO: File version 0.1.0 does not have a survey_group, try surveys_group
add_channel
add_run
add_station
add_survey
add_transfer_function
close_mth5
from_experiment
from_reference
get_channel
get_run
get_station
get_survey
get_transfer_function
h5_is_read
h5_is_write
has_group
open_mth5
remove_channel
remove_run
remove_station
remove_survey
remove_transfer_function
to_experiment
validate_file
Add a station

Here we will add a station called mt001. This will return a StationGroup object.

[9]:
station_group = m.add_station("mt001")

Add some metadata to this station like location, who acquired it, and the reference frame in which the data were collected.

[10]:
station_group.metadata.location.latitude = "40:05:01"
station_group.metadata.location.longitude = -122.3432
station_group.metadata.location.elevation = 403.1
station_group.metadata.acquired_by.author = "me"
station_group.metadata.orientation.reference_frame = "geomagnetic"

# IMPORTANT: Must always use the write_metadata method when metadata is updated.
station_group.write_metadata()
[11]:
station_group.metadata
[11]:
{
    "station": {
        "acquired_by.author": "me",
        "channels_recorded": [],
        "data_type": "BBMT",
        "geographic_name": null,
        "hdf5_reference": "<HDF5 object reference>",
        "id": "mt001",
        "location.declination.model": "WMM",
        "location.declination.value": 0.0,
        "location.elevation": 403.1,
        "location.latitude": 40.08361111111111,
        "location.longitude": -122.3432,
        "mth5_type": "Station",
        "orientation.method": null,
        "orientation.reference_frame": "geomagnetic",
        "provenance.creation_time": "1980-01-01T00:00:00+00:00",
        "provenance.software.author": "none",
        "provenance.software.name": null,
        "provenance.software.version": null,
        "provenance.submitter.email": null,
        "provenance.submitter.organization": null,
        "run_list": [],
        "time_period.end": "1980-01-01T00:00:00+00:00",
        "time_period.start": "1980-01-01T00:00:00+00:00"
    }
}
Add a Run

We can now add a run to the new station. We can do this in two ways: directly from the MTH5 object m, or from the newly created station_group.

[12]:
run_01 = m.add_run("mt001", "001")
run_02 = station_group.add_run("002")
[13]:
station_group
[13]:
/Survey/Stations/mt001:
====================
    |- Group: 001
    -------------
    |- Group: 002
    -------------
    |- Group: Transfer_Functions
    ----------------------------
Add a Channel

Again we can do this in two ways: directly from the MTH5 object m, or from the newly created run_01 or run_02 group. There are only three types of channels: electric, magnetic, and auxiliary, and the type must be specified when a channel is initiated. We will initiate the channel with data=None, which will create an empty data set.

[14]:
ex = m.add_channel("mt001", "001", "ex", "electric", None)
hy = run_01.add_channel("hy", "magnetic", None)
[15]:
run_01
[15]:
/Survey/Stations/mt001/001:
====================
    --> Dataset: ex
    .................
    --> Dataset: hy
    .................

Now, let’s see what the contents are of this file

[16]:
m
[16]:
/:
====================
    |- Group: Survey
    ----------------
        |- Group: Filters
        -----------------
            |- Group: coefficient
            ---------------------
            |- Group: fap
            -------------
            |- Group: fir
            -------------
            |- Group: time_delay
            --------------------
            |- Group: zpk
            -------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
            --> Dataset: summary
            ......................
        |- Group: Stations
        ------------------
            |- Group: mt001
            ---------------
                |- Group: 001
                -------------
                    --> Dataset: ex
                    .................
                    --> Dataset: hy
                    .................
                |- Group: 002
                -------------
                |- Group: Transfer_Functions
                ----------------------------
        --> Dataset: channel_summary
        ..............................
        --> Dataset: tf_summary
        .........................
Channel Summary

This is a summary of all channels in the file; it can take a long time to build if the data file is large.

[17]:
%time

m.channel_summary.clear_table()
m.channel_summary.summarize()

ch_df = m.channel_summary.to_dataframe()
ch_df
Wall time: 0 ns
[17]:
survey station run latitude longitude elevation component start end n_samples sample_rate measurement_type azimuth tilt units hdf5_reference run_hdf5_reference station_hdf5_reference
0 none mt001 001 40.083611 -122.3432 403.1 ex 1980-01-01 00:00:00+00:00 1980-01-01 00:00:00+00:00 1 0.0 electric 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
1 none mt001 001 40.083611 -122.3432 403.1 hy 1980-01-01 00:00:00+00:00 1980-01-01 00:00:00+00:00 1 0.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
Close MTH5 file

This part is important, be sure to close the file in order to save any changes. This function flushes metadata and data to the HDF5 file and then closes it. Note that once a file is closed all groups lose their link to the file and cannot retrieve any data.

[18]:
m.close_mth5()
station_group
2022-03-24 18:12:05,104 [line 747] mth5.mth5.MTH5.close_mth5 - INFO: Flushing and closing example.h5
2022-03-24 18:12:05,110 [line 113] mth5.groups.base.Station.__str__ - WARNING: MTH5 file is closed and cannot be accessed.
[18]:
MTH5 file is closed and cannot be accessed.

Usage v0.2.0

Important

This usage describes using MTH5 version 0.2.0, where Experiment is the top level of the MTH5 file structure. This is the current version.

Basic Groups

Each MTH5 file has default groups. A ‘group’ is basically like a folder that can contain other groups or datasets. These are:

  • ExperimentGroup: The master or root group of the HDF5 file
    • ReportsGroup: Holds any reports relevant to the experiment

    • StandardsGroup: A summary of metadata standards used

    • SurveysGroup: Holds the surveys in an experiment

      • SurveyGroup: Holds a single survey

      • FiltersGroup: Holds all filters and filter information

      • ZPKGroup: Holds pole zero filters

      • FAPGroup: Holds frequency look up table filters

      • FIRGroup: Holds finite impulse response filters

      • CoefficientGroup: Holds coefficient filters

      • TimeDelayGroup: Holds time delay filters

      • Reports: Holds any reports relevant to the survey

      • StationsGroup: Holds all the stations and subsequent data

        • StationGroup: Holds a single station

          • RunGroup: Holds a single run

            • ChannelDataset: Holds a single channel

Each group also has a summary table to make it easier to search and access different parts of the file. Each entry in the table will have an HDF5 reference that can be directly used to get the appropriate group or dataset without using the path.

See also

mth5.groups

Example

Example of Working with a Version 0.2.0 MTH5 File
[1]:
from mth5.mth5 import MTH5
2022-04-12T21:21:46 [line 157] numexpr.utils._init_num_threads - INFO: NumExpr defaulting to 8 threads.
2022-04-12 21:21:47,999 [line 135] mth5.setup_logger - INFO: Logging file can be found C:\Users\jpeacock\Documents\GitHub\mth5\logs\mth5_debug.log
Initialize an MTH5 object with file version 0.2.0
[2]:
m = MTH5(file_version="0.2.0")

Have a look at the attributes of the file

[3]:
m.file_attributes
[3]:
{'file.type': 'MTH5',
 'file.version': '0.2.0',
 'file.access.platform': 'Windows-10-10.0.19041-SP0',
 'file.access.time': '2022-04-13T04:21:53.788393+00:00',
 'mth5.software.version': '0.2.5',
 'mth5.software.name': 'mth5',
 'data_level': 1}

Here are the data set options

[4]:
m.dataset_options
[4]:
{'compression': 'gzip',
 'compression_opts': 9,
 'shuffle': True,
 'fletcher32': True}

The file is not open yet

[5]:
m
[5]:
HDF5 file is closed and cannot be accessed.
Open a new file

We will open the file in mode w here, which will overwrite the file if it already exists. If you don't want to do that, or are unsure whether a file already exists, the safest option is mode a.

Context Manager

It is strongly encouraged that when you make an MTH5 file, even if you plan to open it again afterwards, you use a with statement.

with MTH5(**kwargs) as m:
    m.open_mth5(filename)
    #pack MTH5

Using this pattern, your MTH5 file will be written safely even if something goes wrong during the packing. The with statement automatically flushes and closes the MTH5 file upon exit, including when an error is encountered.
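The close-on-error behavior that makes the with statement safe can be illustrated with plain Python (a toy stand-in class, not the MTH5 API):

```python
class ToyFile:
    """Minimal stand-in for a file-like object that must always be closed."""

    def __init__(self):
        self.closed = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self.closed = True  # flush/close happens here, error or not
        return False        # re-raise any exception after cleanup

f = ToyFile()
try:
    with f:
        raise RuntimeError("something went wrong while packing")
except RuntimeError:
    pass

print(f.closed)  # True -- __exit__ ran even though an exception was raised
```

MTH5's context manager works the same way: the `__exit__` method flushes and closes the HDF5 file whether or not the packing code raised.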

Here we are just showing an example of how to interrogate an MTH5 file.

[6]:
m.open_mth5(r"example.h5", "w")
2022-04-12 21:22:02,707 [line 591] mth5.mth5.MTH5.open_mth5 - WARNING: example.h5 will be overwritten in 'w' mode
2022-04-12 21:22:03,000 [line 656] mth5.mth5.MTH5._initialize_file - INFO: Initialized MTH5 0.2.0 file example.h5 in mode w

Now that we have initiated a file, let’s see what’s in an empty file.

[20]:
m
[20]:
/:
====================
    |- Group: Experiment
    --------------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
            --> Dataset: summary
            ......................
        |- Group: Surveys
        -----------------
            |- Group: example
            -----------------
                |- Group: Filters
                -----------------
                    |- Group: coefficient
                    ---------------------
                    |- Group: fap
                    -------------
                    |- Group: fir
                    -------------
                    |- Group: time_delay
                    --------------------
                    |- Group: zpk
                    -------------
                |- Group: Reports
                -----------------
                |- Group: Standards
                -------------------
                    --> Dataset: summary
                    ......................
                |- Group: Stations
                ------------------
                    |- Group: mt001
                    ---------------
                        |- Group: Transfer_Functions
                        ----------------------------
                    |- Group: mt002
                    ---------------
                        |- Group: 001
                        -------------
                            --> Dataset: ex
                            .................
                            --> Dataset: hy
                            .................
                        |- Group: 002
                        -------------
                        |- Group: Transfer_Functions
                        ----------------------------
        --> Dataset: channel_summary
        ..............................
        --> Dataset: tf_summary
        .........................

We can see the default groups that are initiated when the file is created. Here are the methods an MTH5 object provides: you can open/close an MTH5 file; add/remove stations, runs, and channels; read from an mt_metadata.timeseries.Experiment object to fill the metadata and structure before adding data; and create an mt_metadata.timeseries.Experiment object for archiving.

[8]:
print("\n".join(sorted([func for func in dir(m) if callable(getattr(m, func)) and not func.startswith("_")])))
2022-04-12 21:22:11,930 [line 519] mth5.mth5.MTH5.filters_group - INFO: File version 0.2.0 does not have a FiltersGroup at the experiment level
2022-04-12 21:22:11,932 [line 541] mth5.mth5.MTH5.stations_group - INFO: File version 0.2.0 does not have a Stations. try surveys_group.
2022-04-12 21:22:11,934 [line 479] mth5.mth5.MTH5.survey_group - INFO: File version 0.2.0 does not have a survey_group, try surveys_group
add_channel
add_run
add_station
add_survey
add_transfer_function
close_mth5
from_experiment
from_reference
get_channel
get_run
get_station
get_survey
get_transfer_function
h5_is_read
h5_is_write
has_group
open_mth5
remove_channel
remove_run
remove_station
remove_survey
remove_transfer_function
to_experiment
validate_file
Add a Survey

The first step is to add a survey, here we will add the survey example. This will return a SurveyGroup object which will commonly be the main group we work with.

[9]:
survey_group = m.add_survey("example")
Add a station

Here we will add a station called mt001. This will return a StationGroup object. We can add a station in 2 ways: directly from the MTH5 object m, or from the newly created survey_group. Note that if we add it from m, we need to include the survey name example.

[10]:
station_group = m.add_station("mt001", survey="example")
station_group = survey_group.stations_group.add_station("mt002")

Add some metadata to this station like location, who acquired it, and the reference frame in which the data were collected.

[11]:
station_group.metadata.location.latitude = "40:05:01"
station_group.metadata.location.longitude = -122.3432
station_group.metadata.location.elevation = 403.1
station_group.metadata.acquired_by.author = "me"
station_group.metadata.orientation.reference_frame = "geomagnetic"

# IMPORTANT: Must always use the write_metadata method when metadata is updated.
station_group.write_metadata()
[12]:
station_group.metadata
[12]:
{
    "station": {
        "acquired_by.name": "me",
        "channels_recorded": [],
        "data_type": "BBMT",
        "geographic_name": null,
        "hdf5_reference": "<HDF5 object reference>",
        "id": "mt002",
        "location.declination.model": "WMM",
        "location.declination.value": 0.0,
        "location.elevation": 403.1,
        "location.latitude": 40.08361111111111,
        "location.longitude": -122.3432,
        "mth5_type": "Station",
        "orientation.method": null,
        "orientation.reference_frame": "geomagnetic",
        "provenance.creation_time": "1980-01-01T00:00:00+00:00",
        "provenance.software.author": "none",
        "provenance.software.name": null,
        "provenance.software.version": null,
        "provenance.submitter.email": null,
        "provenance.submitter.organization": null,
        "run_list": [],
        "time_period.end": "1980-01-01T00:00:00+00:00",
        "time_period.start": "1980-01-01T00:00:00+00:00"
    }
}
Add a Run

We can now add a run to the new station. We can do this in 2 ways: directly from the MTH5 object m, or from the newly created station_group.

[13]:
run_01 = m.add_run("mt002", "001", survey="example")
run_02 = station_group.add_run("002")
[14]:
station_group
[14]:
/Experiment/Surveys/example/Stations/mt002:
====================
    |- Group: 001
    -------------
    |- Group: 002
    -------------
    |- Group: Transfer_Functions
    ----------------------------
Add a Channel

Again we can do this in 2 ways: directly from the MTH5 object m, or from the newly created run_01 or run_02 group. There are only 3 types of channels: electric, magnetic, and auxiliary. The type needs to be specified when a channel is initiated. We will initiate the channel with data=None, which will create an empty dataset.

[15]:
ex = m.add_channel("mt002", "001", "ex", "electric", None, survey="example")
hy = run_01.add_channel("hy", "magnetic", None)
[16]:
hy
[16]:
Channel Magnetic:
-------------------
        component:        hy
        data type:        magnetic
        data format:      int32
        data shape:       (1,)
        start:            1980-01-01T00:00:00+00:00
        end:              1980-01-01T00:00:00+00:00
        sample rate:      0.0

Now, let’s see what the contents are of this file

[17]:
m
[17]:
/:
====================
    |- Group: Experiment
    --------------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
            --> Dataset: summary
            ......................
        |- Group: Surveys
        -----------------
            |- Group: example
            -----------------
                |- Group: Filters
                -----------------
                    |- Group: coefficient
                    ---------------------
                    |- Group: fap
                    -------------
                    |- Group: fir
                    -------------
                    |- Group: time_delay
                    --------------------
                    |- Group: zpk
                    -------------
                |- Group: Reports
                -----------------
                |- Group: Standards
                -------------------
                    --> Dataset: summary
                    ......................
                |- Group: Stations
                ------------------
                    |- Group: mt001
                    ---------------
                        |- Group: Transfer_Functions
                        ----------------------------
                    |- Group: mt002
                    ---------------
                        |- Group: 001
                        -------------
                            --> Dataset: ex
                            .................
                            --> Dataset: hy
                            .................
                        |- Group: 002
                        -------------
                        |- Group: Transfer_Functions
                        ----------------------------
        --> Dataset: channel_summary
        ..............................
        --> Dataset: tf_summary
        .........................
Channel Summary

We can have a look at what channels are in this file. This can take a long time if you have lots of data. This returns a pandas.DataFrame object, which can therefore be queried with standard Pandas methods.

Note: the number of samples is 1 even though we did not add any data. This is because we initialize the dataset to be extendable, and it needs at least one element to be initialized. We set the max shape to (1, None), which means it can be extended to an arbitrary length.

[18]:
%time

m.channel_summary.clear_table()
m.channel_summary.summarize()

ch_df = m.channel_summary.to_dataframe()
ch_df
Wall time: 0 ns
[18]:
survey station run latitude longitude elevation component start end n_samples sample_rate measurement_type azimuth tilt units hdf5_reference run_hdf5_reference station_hdf5_reference
0 example mt002 001 40.083611 -122.3432 403.1 ex 1980-01-01 00:00:00+00:00 1980-01-01 00:00:00+00:00 1 0.0 electric 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
1 example mt002 001 40.083611 -122.3432 403.1 hy 1980-01-01 00:00:00+00:00 1980-01-01 00:00:00+00:00 1 0.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
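Because the summary is an ordinary pandas.DataFrame, standard Pandas filtering applies. A minimal sketch, using a small stand-in DataFrame with the same column names as the summary above (fake values, not pulled from an MTH5 file):

```python
import pandas as pd

# Stand-in for the channel summary: same column names, made-up rows
ch_df = pd.DataFrame(
    {
        "station": ["mt002", "mt002"],
        "run": ["001", "001"],
        "component": ["ex", "hy"],
        "measurement_type": ["electric", "magnetic"],
        "sample_rate": [0.0, 0.0],
    }
)

# Boolean indexing selects just the magnetic channels
magnetics = ch_df[ch_df.measurement_type == "magnetic"]
print(magnetics.component.tolist())  # ['hy']
```

The same expression works on the real `ch_df` returned by `m.channel_summary.to_dataframe()`.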
Access channel through HDF5 Reference

The channel summary table contains a column labeled hdf5_reference, which is an internal HDF5 reference that can be used directly to access that specific group or dataset. MTH5 provides a method to use this reference and return the proper group object. Here we will request the first channel in the table.

[19]:
h5_reference = ch_df.iloc[0].hdf5_reference
ex = m.from_reference(h5_reference)
ex
[19]:
Channel Electric:
-------------------
        component:        ex
        data type:        electric
        data format:      int32
        data shape:       (1,)
        start:            1980-01-01T00:00:00+00:00
        end:              1980-01-01T00:00:00+00:00
        sample rate:      0.0
Close MTH5 file

This part is important, be sure to close the file in order to save any changes. This function flushes metadata and data to the HDF5 file and then closes it. Note that once a file is closed all groups lose their link to the file and cannot retrieve any data.

[ ]:
m.close_mth5()
station_group

MTH5 Groups

Each group within an MTH5 file has a corresponding group object that provides convenience functions to interact with the HDF5 file and add, get, and remove groups or datasets within that group. For example, a station has a corresponding mth5.groups.StationGroup object that provides functions to add, get, and remove a run. These are meant to let the user interact with the HDF5 file without much trouble.

v0.1.0 Groups

  • SurveyGroup: The master or root group of the HDF5 file

  • FiltersGroup: Holds all filters and filter information

    • ZPKGroup: Holds pole zero filters

    • FAPGroup: Holds frequency look up table filters

    • FIRGroup: Holds finite impulse response filters

    • CoefficientGroup: Holds coefficient filters

    • TimeDelayGroup: Holds time delay filters

  • Reports: Holds any reports relevant to the survey

  • StandardsGroup: A summary of metadata standards used

  • StationsGroup: Holds all the stations and subsequent data

    • StationGroup: Holds a single station

      • RunGroup: Holds a single run

        • ChannelDataset: Holds a single channel

v0.2.0 Groups

  • ExperimentGroup: The master or root group of the HDF5 file

  • ReportsGroup: Holds any reports relevant to the experiment

  • StandardsGroup: A summary of metadata standards used

  • SurveysGroup: Holds the surveys in an experiment

    • SurveyGroup: Holds a single survey

    • FiltersGroup: Holds all filters and filter information

      • ZPKGroup: Holds pole zero filters

      • FAPGroup: Holds frequency look up table filters

      • FIRGroup: Holds finite impulse response filters

      • CoefficientGroup: Holds coefficient filters

      • TimeDelayGroup: Holds time delay filters

    • Reports: Holds any reports relevant to the survey

    • StationsGroup: Holds all the stations and subsequent data

      • StationGroup: Holds a single station

        • RunGroup: Holds a single run

          • ChannelDataset: Holds a single channel

Note

Each group contains a weak reference to the HDF5 group, this way when a file is closed there are no lingering references to a closed HDF5 file.
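The effect of holding only a weak reference can be shown with the standard-library weakref module (a generic Python sketch, not MTH5 internals):

```python
import weakref

class Node:
    """Stands in for an open HDF5 group."""
    pass

node = Node()
ref = weakref.ref(node)  # hold only a weak reference to the object
print(ref() is node)     # True -- the target is still alive

del node                 # analogous to the file being closed
print(ref())             # None -- the weak reference did not keep it alive
```

Because the group objects hold only weak references, closing the file leaves nothing dangling: any later access simply finds the reference dead, as in the "MTH5 file is closed" warnings shown earlier.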

Base Groups

Each group inherits mth5.groups.base.BaseGroup which provides basic methods for the group. This includes setting up the proper metadata and group structure. The __str__ and __repr__ methods are overwritten to show the structure of the group.

A groups_list property is provided that lists all the subgroups in the group.

Group Descriptions

There are 2 survey containers mth5.groups.MasterSurveysGroup and mth5.groups.SurveyGroup.

Master Surveys Group

mth5.groups.MasterSurveysGroup is an umbrella container that holds a collection of mth5.groups.SurveyGroup objects and contains a summary table of all surveys within the file. Use mth5.groups.MasterSurveysGroup to add/get/remove surveys.

No metadata currently accompanies mth5.groups.MasterSurveysGroup.

Add/get will return a mth5.groups.SurveyGroup. If adding a survey that has the same name as an existing survey, the mth5.groups.SurveyGroup returned will be that of the existing survey and no survey will be added. Change the name or update the existing survey. If getting a survey that does not exist, a mth5.utils.exceptions.MTH5Error will be raised.

Add/Get/Remove Survey

Add/get/remove surveys can be done from the mth5.MTH5.surveys_group which is a mth5.groups.MasterSurveysGroup object.

>>> # v0.1.0 does not have a surveys_group
>>> # v0.2.0
>>> surveys = mth5_obj.surveys_group
>>> type(surveys)
mth5.groups.MasterSurveysGroup

From the surveys_group you can add/remove/get a survey.

To add a survey:

>>> new_survey = surveys.add_survey('survey_01')
>>> print(type(new_survey))
mth5.groups.SurveyGroup

To get an existing survey:

>>> existing_survey = surveys.get_survey('MT001')

To remove an existing survey:

>>> surveys.remove_survey('MT002')
>>> surveys.group_list
['MT001']
Summary Table

This table includes all channels within the Survey. This can be quite large and can take a long time to build.

Column              Description
------------------  ---------------------------------------
survey              Survey ID for channel
station             Station ID for channel
run                 Run ID for channel
latitude            Survey latitude (decimal degrees)
longitude           Survey longitude (decimal degrees)
elevation           Survey elevation (meters)
component           Channel component
start               Start time for the channel (ISO format)
end                 End time for the channel (ISO format)
n_samples           Number of samples in the channel
sample_rate         Channel sample rate
measurement_type    Channel measurement type
azimuth             Channel azimuth (degrees from N)
tilt                Channel tilt (degrees from horizontal)
units               Channel units
hdf5_reference      HDF5 internal reference

Survey Group

A single survey is contained within a mth5.groups.SurveyGroup object, which has the appropriate metadata for a single survey. mth5.groups.SurveyGroup contains all the runs for that survey.

Survey Metadata

Metadata is accessed through the metadata property, which is a mt_metadata.timeseries.Survey object.

>>> type(new_survey.metadata)
mt_metadata.timeseries.Survey
>>> new_survey.metadata
{
        "survey": {
                "citation_dataset.doi": null,
                "citation_journal.doi": null,
                "country": null,
                "datum": null,
                "geographic_name": null,
                "id": null,
                "name": null,
                "northwest_corner.latitude": 0.0,
                "northwest_corner.longitude": 0.0,
                "project": null,
                "project_lead.email": null,
                "project_lead.organization": null,
                "release_license": "CC-0",
                "southeast_corner.latitude": 0.0,
                "southeast_corner.longitude": 0.0,
                "summary": null,
                "time_period.end_date": "1980-01-01",
                "time_period.start_date": "1980-01-01"
        }
}

See also

mth5.groups.SurveyGroup and mt_metadata.timeseries.Survey

There are 2 station containers mth5.groups.MasterStationsGroup and mth5.groups.StationGroup.

Master Stations Group

mth5.groups.MasterStationsGroup is an umbrella container that holds a collection of mth5.groups.StationGroup objects and contains a summary table that summarizes all stations within the survey. Use mth5.groups.MasterStationsGroup to add/get/remove stations.

No metadata currently accompanies mth5.groups.MasterStationsGroup.

Add/get will return a mth5.groups.StationGroup. If adding a station that has the same name as an existing station, the mth5.groups.StationGroup returned will be that of the existing station and no station will be added. Change the name or update the existing station. If getting a station that does not exist, a mth5.utils.exceptions.MTH5Error will be raised.

Add/Get/Remove Station

Add/get/remove stations can be done from the mth5.MTH5.stations_group which is a mth5.groups.MasterStationsGroup object.

>>> # v0.1.0
>>> stations = mth5_obj.stations_group
>>> # v0.2.0
>>> stations = mth5_obj.get_survey("example").stations_group
>>> type(stations)
mth5.groups.MasterStationsGroup
>>> stations
/Survey/Stations:
====================
    |- Group: MT001
        ---------------
                |- Group: MT001a
                ----------------
                    --> Dataset: Ex
                    .................
                    --> Dataset: Ey
                    .................
                    --> Dataset: Hx
                    .................
                    --> Dataset: Hy
                    .................
                    --> Dataset: Hz
                    .................

From the stations_group you can add/remove/get a station.

To add a station:

>>> new_station = stations.add_station('MT002')
>>> print(type(new_station))
mth5.groups.StationGroup
>>> new_station
/Survey/Stations/MT002:
====================

To get an existing station:

>>> existing_station = stations.get_station('MT001')

To remove an existing station:

>>> stations.remove_station('MT002')
>>> stations.group_list
['MT001']
Summary Table

Column              Description
------------------  ----------------------------------------------
archive_id          Station archive name
start               Start time of the station (ISO format)
end                 End time of the station (ISO format)
components          All components measured by the station
measurement_type    All measurement types collected by the station
location.latitude   Station latitude (decimal degrees)
location.longitude  Station longitude (decimal degrees)
location.elevation  Station elevation (meters)
hdf5_reference      Internal HDF5 reference

Station Group

A single station is contained within a mth5.groups.StationGroup object, which has the appropriate metadata for a single station. mth5.groups.StationGroup contains all the runs for that station.

Summary Table

The summary table in mth5.groups.StationGroup summarizes all runs for that station.

Column              Description
------------------  ---------------------------------------
id                  Run ID
start               Start time of the run (ISO format)
end                 End time of the run (ISO format)
components          All components measured for that run
measurement_type    Type of measurement for that run
sample_rate         Sample rate of the run (samples/second)
hdf5_reference      Internal HDF5 reference

Station Metadata

Metadata is accessed through the metadata property, which is a mt_metadata.timeseries.Station object.

>>> type(new_station.metadata)
mt_metadata.timeseries.Station
>>> new_station.metadata
{
        "station": {
                "acquired_by.author": null,
                "acquired_by.comments": null,
                "archive_id": "FL001",
                "channel_layout": "X",
                "channels_recorded": [
                        "Hx",
                        "Hy",
                        "Hz",
                        "Ex",
                        "Ey"
                ],
                "comments": null,
                "data_type": "BB, LP",
                "geographic_name": "Beachy Keen, FL, USA",
                "hdf5_reference": "<HDF5 object reference>",
                "id": "FL001",
                "location.declination.comments": "Declination obtained from the instrument GNSS NMEA sequence",
                "location.declination.model": "Unknown",
                "location.declination.value": -4.1,
                "location.elevation": 0.0,
                "location.latitude": 29.7203555,
                "location.longitude": -83.4854715,
                "mth5_type": "Station",
                "orientation.method": "compass",
                "orientation.reference_frame": "geographic",
                "provenance.comments": null,
                "provenance.creation_time": "2020-05-29T21:08:40+00:00",
                "provenance.log": null,
                "provenance.software.author": "Anna Kelbert, USGS",
                "provenance.software.name": "mth5_metadata.m",
                "provenance.software.version": "2020-05-29",
                "provenance.submitter.author": "Anna Kelbert, USGS",
                "provenance.submitter.email": "akelbert@usgs.gov",
                "provenance.submitter.organization": "USGS Geomagnetism Program",
                "time_period.end": "2015-01-29T16:18:14+00:00",
                "time_period.start": "2015-01-08T19:49:15+00:00"
        }
}

See also

mth5.groups.StationGroup and mt_metadata.timeseries.Station

Runs

A run is a collection of channels recorded at similar start and end times at the same sample rate for a given station. A run is contained within a mth5.groups.RunGroup object. A run is the next level down from a station.

The main way to add/remove/get a run object is through a mth5.groups.StationGroup object

Accessing through StationGroup

You can get a mth5.groups.StationGroup using

>>> # v0.1.0
>>> new_station = mth5_obj.add_station('MT003')
>>> # v0.2.0
>>> new_station = mth5_obj.add_station("MT003", survey="example")
Add Run
>>> # if you don't already have a run name one can be assigned based on existing runs
>>> new_run_name = new_station.make_run_name()
>>> new_run = new_station.add_run(new_run_name)
>>> new_run.metadata.id
'001'
Get Run

Similar methods for get/remove a run

>>> existing_run = new_station.get_run('MT003a')
Remove Run
>>> new_station.remove_run('MT003a')
Summary Table

The summary table summarizes all channels for that run.

Column              Description
------------------  ---------------------------------------
component           Component name
start               Start time of the channel (ISO format)
end                 End time of the channel (ISO format)
n_samples           Number of samples for the channel
measurement_type    Measurement type of the channel
units               Units of the channel data
hdf5_reference      HDF5 internal reference

Metadata

Metadata is accessed through the metadata property, which is a mt_metadata.timeseries.Run object.

>>> type(new_run.metadata)
mt_metadata.timeseries.Run
>>> new_run.metadata
{
        "run": {
                "acquired_by.author": "BB",
                "acquired_by.comments": "it's cold in florida",
                "channels_recorded_auxiliary": null,
                "channels_recorded_electric": null,
                "channels_recorded_magnetic": null,
                "comments": null,
                "data_logger.firmware.author": "Barry Narod",
                "data_logger.firmware.name": null,
                "data_logger.firmware.version": null,
                "data_logger.id": "1305-1",
                "data_logger.manufacturer": "Barry Narod",
                "data_logger.model": "NIMS",
                "data_logger.power_source.comments": "voltage measurements not recorded",
                "data_logger.power_source.id": null,
                "data_logger.power_source.type": "battery",
                "data_logger.power_source.voltage.end": null,
                "data_logger.power_source.voltage.start": null,
                "data_logger.timing_system.comments": null,
                "data_logger.timing_system.drift": 0.0,
                "data_logger.timing_system.type": "GPS",
                "data_logger.timing_system.uncertainty": 1.0,
                "data_logger.type": null,
                "data_type": "BB, LP",
                "hdf5_reference": "<HDF5 object reference>",
                "id": "MT003a",
                "metadata_by.author": "Anna Kelbert; Paul Bedrosian",
                "metadata_by.comments": "Paul Bedrosian: Ey, electrode dug up",
                "mth5_type": "Run",
                "provenance.comments": null,
                "provenance.log": null,
                "sample_rate": 8.0,
                "time_period.end": "2015-01-19T14:54:54+00:00",
                "time_period.start": "2015-01-08T19:49:15+00:00"
        }
}

See also

mth5.groups.RunGroup and mt_metadata.timeseries.Run for more information.

Time Series Objects

All datasets at the channel level are represented by mth5.timeseries.ChannelTS objects, which are based on xarray.DataArray objects. This keeps memory usage minimal because xarray is lazy and only loads what is called for. Another benefit is that metadata can directly accompany the data. Currently the model is that all metadata are input into a mth5.metadata.Base object to be validated first, and then the xarray.DataArray can be updated. This is not automated at this point, so the user needs to call update_xarray_metadata when metadata values are changed. Another advantage of using xarray is that the time series data are indexed by time, making it easier to align, trim, extract, and sort.

All run datasets are represented by mth5.timeseries.RunTS objects, which are based on xarray.Dataset, a collection of xarray.DataArray objects. The benefit of using xarray is that many methods such as aligning, indexing, and sorting are already developed and robust, so usability is improved without extra coding.

Another reason why xarray was picked as the basis for representing the data is that it works seamlessly with other programs like Dask for parallel computing, and plotting tools like hvplot.
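The time-alignment benefit can be sketched with plain xarray (toy data, not ChannelTS objects):

```python
import numpy as np
import pandas as pd
import xarray as xr

# Two channels that started one second apart, both sampled at 1 Hz
t0 = pd.date_range("2020-01-01T00:00:00", periods=5, freq="1s")
t1 = pd.date_range("2020-01-01T00:00:01", periods=5, freq="1s")

ex = xr.DataArray(np.arange(5.0), coords={"time": t0}, dims="time")
hy = xr.DataArray(np.arange(5.0), coords={"time": t1}, dims="time")

# Trim both channels to their common time window in one call
ex_a, hy_a = xr.align(ex, hy, join="inner")
print(ex_a.time.size)  # 4 overlapping samples
```

Because ChannelTS and RunTS data are indexed by time in exactly this way, operations like trimming runs to a common window come for free from xarray.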

Examples

ChannelTS

A ChannelTS object is a container for a single channel. The data are stored in an xarray.DataArray and indexed by time according to the metadata provided. Here we will make a simple electric channel and look at how to interrogate it.

[1]:
%matplotlib inline
import numpy as np
from mth5.timeseries import ChannelTS
from mt_metadata.timeseries import Electric, Run, Station

Here we create some metadata; the key attributes are time_period.start and sample_rate.

[2]:
ex_metadata = Electric()
ex_metadata.time_period.start = "2020-01-01T00:00:00"
ex_metadata.sample_rate = 8.0
ex_metadata.component = "ex"
ex_metadata.dipole_length = 100.
ex_metadata.units = "millivolts"

Create Station and Run metadata

[3]:
station_metadata = Station(id="mt001")
run_metadata = Run(id="001")

Create “realistic” data

[4]:
n_samples = 4096
t = np.arange(n_samples)
data = np.sum([np.cos(2*np.pi*w*t + phi) for w, phi in zip(np.logspace(-3, 3, 20), np.random.rand(20))], axis=0)
[5]:
ex = ChannelTS(channel_type="electric",
              data=data,
              channel_metadata=ex_metadata,
              run_metadata=run_metadata,
              station_metadata=station_metadata)
[6]:
ex
[6]:
Channel Summary:
        Survey:       0
        Station:      mt001
        Run:          001
        Channel Type: Electric
        Component:    ex
        Sample Rate:  8.0
        Start:        2020-01-01T00:00:00+00:00
        End:          2020-01-01T00:08:31.875000+00:00
        N Samples:    4096
Get a slice of the data

Here we will provide a start time of the slice and the number of samples that we want the slice to be

[7]:
ex_slice = ex.get_slice("2020-01-01T00:00:00", n_samples=256)
[8]:
ex_slice
[8]:
Channel Summary:
        Survey:       0
        Station:      mt001
        Run:          001
        Channel Type: Electric
        Component:    ex
        Sample Rate:  8.0
        Start:        2020-01-01T00:00:00+00:00
        End:          2020-01-01T00:00:31.875000+00:00
        N Samples:    256
Plot the data

This is a work in progress, but this can be done through the xarray container.

[9]:
ex_slice.plot()
[9]:
[<matplotlib.lines.Line2D at 0x293bb639ca0>]
_images/examples_notebooks_channel_ts_example_14_1.png
Convert to an xarray

We can convert the ChannelTS object to an xarray.DataArray which could be easier to use.

[10]:
ex_xarray = ex.to_xarray()
[11]:
ex_xarray
[11]:
<xarray.DataArray 'ex' (time: 4096)>
array([16.43704101,  2.7440483 ,  4.62460915, ..., -3.61015321,
       -0.73752295,  1.85945637])
Coordinates:
  * time     (time) datetime64[ns] 2020-01-01 ... 2020-01-01T00:08:31.875000
Attributes:
    channel_number:             0
    component:                  ex
    data_quality.rating.value:  0
    dipole_length:              100.0
    filter.applied:             [False]
    filter.name:                []
    measurement_azimuth:        0.0
    measurement_tilt:           0.0
    negative.elevation:         0.0
    negative.id:                None
    negative.latitude:          0.0
    negative.longitude:         0.0
    negative.manufacturer:      None
    negative.type:              None
    positive.elevation:         0.0
    positive.id:                None
    positive.latitude:          0.0
    positive.longitude:         0.0
    positive.manufacturer:      None
    positive.type:              None
    sample_rate:                8.0
    time_period.end:            2020-01-01T00:08:31.875000+00:00
    time_period.start:          2020-01-01T00:00:00+00:00
    type:                       electric
    units:                      millivolts
    station.id:                 mt001
    run.id:                     001
[12]:
ex.plot()
ex_xarray.plot()
[12]:
[<matplotlib.lines.Line2D at 0x293bbe13190>]
_images/examples_notebooks_channel_ts_example_18_1.png
Convert to an Obspy.Trace object

The ChannelTS object can be converted to an obspy.Trace object. This can be useful when dealing with data received from a mainly seismic archive like IRIS, and for using some of the tools provided by Obspy.

Note: there is a loss of information when doing this, because an obspy.Trace is based on the miniSEED data format, which carries minimal metadata.

[13]:
ex.station_metadata.fdsn.id = "mt001"
ex_trace = ex.to_obspy_trace()
[14]:
ex_trace
[14]:
.mt001..MQN | 2020-01-01T00:00:00.000000Z - 2020-01-01T00:08:31.875000Z | 8.0 Hz, 4096 samples
Convert from an Obspy.Trace object

We can reverse that and convert an obspy.Trace into a ChannelTS, which again is useful when dealing with seismic-dominated archives.

[15]:
ex_from_trace = ChannelTS()
ex_from_trace.from_obspy_trace(ex_trace)
[16]:
ex_from_trace
[16]:
Channel Summary:
        Survey:       0
        Station:      mt001
        Run:          sr8_001
        Channel Type: Electric
        Component:    ex
        Sample Rate:  8.0
        Start:        2020-01-01T00:00:00+00:00
        End:          2020-01-01T00:08:31.875000+00:00
        N Samples:    4096
[17]:
ex
[17]:
Channel Summary:
        Survey:       0
        Station:      mt001
        Run:          001
        Channel Type: Electric
        Component:    ex
        Sample Rate:  8.0
        Start:        2020-01-01T00:00:00+00:00
        End:          2020-01-01T00:08:31.875000+00:00
        N Samples:    4096

By comparison you can see the loss of metadata information.

Calibrate
Removing the instrument response to calibrate the data is an important processing step. A convenience method ChannelTS.remove_instrument_response is supplied just for this.
Currently, it calibrates the whole time series at once and therefore may be slow for large data sets.

See Also: Make Data From IRIS examples for working examples.

[18]:
help(ex.remove_instrument_response)
Help on method remove_instrument_response in module mth5.timeseries.channel_ts:

remove_instrument_response(**kwargs) method of mth5.timeseries.channel_ts.ChannelTS instance
    Remove instrument response from the given channel response filter

    The order of operations is important (if applied):

        1) detrend
        2) zero mean
        3) zero pad
        4) time window
        5) frequency window
        6) remove response
        7) undo time window
        8) bandpass

    **kwargs**

    :param plot: to plot the calibration process [ False | True ]
    :type plot: boolean, default True
    :param detrend: Remove linear trend of the time series
    :type detrend: boolean, default True
    :param zero_mean: Remove the mean of the time series
    :type zero_mean: boolean, default True
    :param zero_pad: pad the time series to the next power of 2 for efficiency
    :type zero_pad: boolean, default True
    :param t_window: Time domain window name see `scipy.signal.windows` for options
    :type t_window: string, default None
    :param t_window_params: Time domain window parameters, parameters can be
    found in `scipy.signal.windows`
    :type t_window_params: dictionary
    :param f_window: Frequency domain window name see `scipy.signal.windows` for options
    :type f_window: string, default None
    :param f_window_params: Frequency window parameters, parameters can be
    found in `scipy.signal.windows`
    :type f_window_params: dictionary
    :param bandpass: bandpass frequency and order {"low":, "high":, "order":,}
    :type bandpass: dictionary
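The order of operations above can be sketched with plain NumPy/SciPy. This is a minimal, hypothetical illustration of frequency-domain response removal, not the actual MTH5 implementation: a flat stand-in gain of 2 replaces the real channel response (which MTH5 builds from the filter metadata), and the windowing and bandpass steps are skipped.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 8.0
data = rng.standard_normal(4096)

# 1) detrend and 2) zero mean
data = signal.detrend(data)
data -= data.mean()

# 3) zero pad to the next power of 2
n_fft = int(2 ** np.ceil(np.log2(data.size)))
freqs = np.fft.rfftfreq(n_fft, d=1.0 / fs)

# 6) divide out the instrument response in the frequency domain
# (stand-in flat gain of 2; steps 4, 5, 7, 8 skipped in this sketch)
response = np.where(freqs > 0, 2.0, 1.0)
spectrum = np.fft.rfft(data, n=n_fft) / response
calibrated = np.fft.irfft(spectrum, n=n_fft)[: data.size]
```

On a real ChannelTS with filters attached, the equivalent call would be something like ex.remove_instrument_response() with the kwargs described above.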

XArray Accessor for Decimate and Filtering

A common task when working with time series is decimating or downsampling the data. xarray has some builtins for resampling, but these do not apply a filter prior to downsampling and therefore have aliasing issues. We have added some utilities for decimation and filtering following the package xr-scipy. When MTH5 is imported, a sps_filters accessor is added to xarray.DataArray and xarray.Dataset, providing some filtering methods as well as decimation and resampling. Access these methods as DataArray.sps_filters.decimate or Dataset.sps_filters.decimate.

Methods include:

  • lowpass: low-pass filter the data

  • highpass: high-pass filter the data

  • bandpass: band-pass filter the data

  • bandstop: filter out frequencies using a band-stop (notch) filter

  • decimate: mimics scipy.signal.decimate by filtering the data first and then decimating. Can be inaccurate around the edges of the time series because it assumes a periodic signal.

  • resample_poly: uses scipy.signal.resample_poly for downsampling; more accurate and usually faster for real signals (the default for resampling)

  • detrend: uses scipy.signal.detrend with keyword type to remove trends in the data

Note: In future versions of MTH5 filters will be added to ChannelTS.
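To see why the anti-alias filter matters, here is a plain SciPy comparison on a toy signal (not MTH5 code): a 3.3 Hz tone sampled at 8 Hz aliases to 0.3 Hz under naive subsampling to 1 sample per second, but is suppressed by decimate and resample_poly.

```python
import numpy as np
from scipy import signal

fs = 8.0                                # original sample rate
t = np.arange(1024) / fs
tone = np.sin(2 * np.pi * 3.3 * t)      # above the 0.5 Hz Nyquist of the target rate

naive = tone[::8]                             # plain subsampling: aliases to 0.3 Hz
decimated = signal.decimate(tone, 8)          # IIR anti-alias filter, then downsample
resampled = signal.resample_poly(tone, 1, 8)  # polyphase FIR downsampling

# the aliased copy keeps full amplitude; the filtered versions suppress it
print(np.abs(naive).max())
print(np.abs(decimated[10:-10]).max())
print(np.abs(resampled[10:-10]).max())
```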

Compare Downsampling methods
Decimate

Here we will decimate to a new sample rate of 1 sample per second.

[19]:
decimated_ex = ex.decimate(1)
[20]:
ex.plot()
decimated_ex.plot()
[20]:
[<matplotlib.lines.Line2D at 0x293bd0293a0>]
_images/examples_notebooks_channel_ts_example_31_1.png
resample_poly

resample_poly is more accurate for real signals because there is no innate assumption of periodicity. As you can see in the decimate case there are edge effects; have a look at the scipy documentation for more information. In the plot below you'll notice no edge effects and values similar to decimate.

Note: Strongly suggest using ChannelTS.resample_poly for downsampling data.

[28]:
resample_poly_ex = ex.resample_poly(1)
[29]:
ex.plot()
decimated_ex.plot()
resample_poly_ex.plot()
[29]:
[<matplotlib.lines.Line2D at 0x293bf9291f0>]
_images/examples_notebooks_channel_ts_example_34_1.png
Merge Channels

A common step in working with time series is combining different segments of data collected for the same channel. The channels being combined must have the same component. There are two builtin methods for combining channels: + and merge(). If the sample rates are the same you can use channel_01 + channel_02; if they are not, use ChannelTS.merge.

  1. added_channel = channel_01 + channel_02

  2. merged_channel = channel_01.merge(channel_02)

Both methods use xarray.combine_by_coords([ch1, ch2], combine_attrs='override'). The combine_by_coords method simply concatenates along similar dimensions and does not enforce a monotonic dimension variable. Therefore, xarray.DataArray.reindex is used to create a monotonically increasing time index. Any gaps are filled using 1-D interpolation. The default method is slinear, which is probably the most useful for processing time series. If you want more control over the interpolation method use ChannelTS.merge([ch1, ch2, ...], gap_method=interpolation_method). For more information on interpolation methods see scipy.interpolate.interp1d. Similarly, if you want more control over how the datasets are merged, use xarray tools directly.
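The combine-reindex-interpolate mechanics can be illustrated with plain xarray on toy data (hypothetical values, not MTH5 objects):

```python
import numpy as np
import pandas as pd
import xarray as xr

# two segments of the same channel with a two-sample gap between them
t1 = pd.date_range("2020-01-01T00:00:00", periods=4, freq="1s")
t2 = pd.date_range("2020-01-01T00:00:06", periods=4, freq="1s")
seg1 = xr.DataArray(np.arange(4.0), coords={"time": t1}, dims="time").to_dataset(name="ex")
seg2 = xr.DataArray(np.arange(6.0, 10.0), coords={"time": t2}, dims="time").to_dataset(name="ex")

# concatenate along time, keeping the first dataset's attributes
combined = xr.combine_by_coords([seg1, seg2], combine_attrs="override")

# reindex onto a monotonically increasing time axis (the gap becomes NaN)
full_time = pd.date_range(t1[0], t2[-1], freq="1s")
combined = combined.reindex(time=full_time)

# fill the gap with 1-D interpolation ("slinear" requires scipy)
filled = combined["ex"].interpolate_na(dim="time", method="slinear")
```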

Add Channels

Adding channels combines them two at a time, so if you are adding multiple channels together (channel_01 + channel_02 + channel_03 ...) it is better to use merge. Also use merge if you want more control over how the channels are combined and how time gaps are interpolated.

[21]:
ex2 = ex.copy()
ex2.start = "2020-01-01T00:08:45"
[22]:
added_channel = ex + ex2
[23]:
added_channel
[23]:
Channel Summary:
        Survey:       0
        Station:      mt001
        Run:          001
        Channel Type: Electric
        Component:    ex
        Sample Rate:  8.0
        Start:        2020-01-01T00:00:00+00:00
        End:          2020-01-01T00:17:16.875000+00:00
        N Samples:    8296
[24]:
added_channel.plot()
[24]:
[<matplotlib.lines.Line2D at 0x293bd05e340>]
_images/examples_notebooks_channel_ts_example_39_1.png
Merge Channels And Resample

If channels have different sample rates or you want to combine channels and resample to a lower sample rate, use the keyword argument new_sample_rate.

[25]:
merged_channel = ex.merge(ex2, new_sample_rate=1)
[26]:
merged_channel
[26]:
Channel Summary:
        Survey:       0
        Station:      mt001
        Run:          001
        Channel Type: Electric
        Component:    ex
        Sample Rate:  1.0
        Start:        2020-01-01T00:00:00+00:00
        End:          2020-01-01T00:17:16+00:00
        N Samples:    1037
[27]:
added_channel.plot()
merged_channel.plot()
[27]:
[<matplotlib.lines.Line2D at 0x293bd512340>]
_images/examples_notebooks_channel_ts_example_43_1.png
RunTS

mth5.timeseries.RunTS is a container to hold multiple synchronous channels of the same sample rate. The data are contained in an xarray.Dataset, which is a collection of ChannelTS.to_xarray() objects.

[1]:
%matplotlib inline
import numpy as np
from mth5.timeseries import ChannelTS, RunTS
from mt_metadata.timeseries import Electric, Magnetic, Auxiliary, Run, Station
Create a Run

We will create a common run that has all 5 channels of an MT measurement (Hx, Hy, Hz, Ex, Ey) plus an auxiliary channel. We will make individual channels first and then add them into a RunTS object.

[2]:
channel_list = []
common_start = "2020-01-01T00:00:00"
sample_rate = 8.0
n_samples = 4096
t = np.arange(n_samples)
data = np.sum([np.cos(2*np.pi*w*t + phi) for w, phi in zip(np.logspace(-3, 3, 20), np.random.rand(20))], axis=0)

station_metadata = Station(id="mt001")
run_metadata = Run(id="001")
Create magnetic channels
[3]:
for component in ["hx", "hy", "hz"]:
    h_metadata = Magnetic(component=component)
    h_metadata.time_period.start = common_start
    h_metadata.sample_rate = sample_rate
    if component in ["hy"]:
        h_metadata.measurement_azimuth = 90
    h_channel = ChannelTS(
        channel_type="magnetic",
        data=data,
        channel_metadata=h_metadata,
        run_metadata=run_metadata,
        station_metadata=station_metadata)
    channel_list.append(h_channel)
Create electric channels
[4]:
for component in ["ex", "ey"]:
    e_metadata = Electric(component=component)
    e_metadata.time_period.start = common_start
    e_metadata.sample_rate = sample_rate
    if component in ["ey"]:
        e_metadata.measurement_azimuth = 90
    e_channel = ChannelTS(
        channel_type="electric",
        data=data,
        channel_metadata=e_metadata,
        run_metadata=run_metadata,
        station_metadata=station_metadata)
    channel_list.append(e_channel)
Create auxiliary channel
[5]:
aux_metadata = Auxiliary(component="temperature")
aux_metadata.time_period.start = common_start
aux_metadata.sample_rate = sample_rate
aux_channel = ChannelTS(
        channel_type="auxiliary",
        data=np.random.rand(n_samples) * 30,
        channel_metadata=aux_metadata,
        run_metadata=run_metadata,
        station_metadata=station_metadata)
aux_channel.channel_metadata.type = "temperature"
channel_list.append(aux_channel)
Create RunTS object

Now that we have made individual channels we can make a RunTS object by inputting a list of ChannelTS objects.

Note: This can also be a list of xarray.DataArray objects formatted like a channel.

[6]:
run = RunTS(channel_list)
[7]:
run
[7]:
RunTS Summary:
        Survey:      0
        Station:     mt001
        Run:         001
        Start:       2020-01-01T00:00:00+00:00
        End:         2020-01-01T00:08:31.875000+00:00
        Sample Rate: 8.0
        Components:  ['hx', 'hy', 'hz', 'ex', 'ey', 'temperature']
Plot Run

Again, this is a hack at the moment; we are working on a better visualization, but this works for now.

[8]:
run.plot()
[8]:
_images/examples_notebooks_run_ts_example_14_0.png
_images/examples_notebooks_run_ts_example_14_1.png
Decimate and Filters

A common task when working with time series is decimating or downsampling the data. xarray has some builtins for resampling, but these do not apply a filter prior to downsampling and therefore have aliasing issues. We have added some utilities for decimation and filtering following the package xr-scipy. When MTH5 is imported, a sps_filters accessor is added to xarray.DataArray and xarray.Dataset, providing some filtering methods as well as decimation and resampling. Access these methods as DataArray.sps_filters.decimate or Dataset.sps_filters.decimate.

Methods include:

  • lowpass: low-pass filter the data

  • highpass: high-pass filter the data

  • bandpass: band-pass filter the data

  • bandstop: filter out frequencies using a band-stop (notch) filter

  • decimate: mimics scipy.signal.decimate by filtering the data first and then decimating. Can be inaccurate around the edges of the time series because it assumes a periodic signal.

  • resample_poly: uses scipy.signal.resample_poly for downsampling; more accurate and usually faster for real signals (the default for resampling)

  • detrend: uses scipy.signal.detrend with keyword type to remove trends in the data

Note: In future versions of MTH5 filters will be added to RunTS.

Compare Downsampling methods
Decimate

Here we will decimate to a new sample rate of 1 sample per second.

[9]:
decimated_run = run.decimate(1)
2023-09-27T16:30:24.906668-0700 | WARNING | mth5.timeseries.run_ts | validate_metadata | end time of dataset 2020-01-01T00:08:31+00:00 does not match metadata end 2020-01-01T00:08:31.875000+00:00 updating metatdata value to 2020-01-01T00:08:31+00:00
resample_poly

resample_poly is more accurate for real signals because there is no innate assumption of periodicity. As you can see in the decimate case there are edge effects; have a look at the scipy documentation for more information. In the plot below you'll notice no edge effects and values similar to decimate.

Note: Strongly suggest using RunTS.resample_poly for downsampling data.

[10]:
resample_poly_run = run.resample_poly(1)
2023-09-27T16:30:25.076013-0700 | WARNING | mth5.timeseries.run_ts | validate_metadata | end time of dataset 2020-01-01T00:08:31+00:00 does not match metadata end 2020-01-01T00:08:31.875000+00:00 updating metatdata value to 2020-01-01T00:08:31+00:00
[12]:
decimated_run.plot()
resample_poly_run.plot()
[12]:
_images/examples_notebooks_run_ts_example_19_0.png
_images/examples_notebooks_run_ts_example_19_1.png
_images/examples_notebooks_run_ts_example_19_2.png
Merge Runs

It can be beneficial to add runs together to create a longer time series. To combine runs with +, each run must have the same sample rate (merge can resample; see below). There are 2 builtin options for combining runs:

  1. added_run = run_01 + run_02

  2. merged_run = run_01.merge(run_02)

Both methods use xarray.combine_by_coords([run1, run2], combine_attrs='override'). The combine_by_coords method simply concatenates along similar dimensions and does not enforce a monotonic dimension variable. Therefore, xarray.Dataset.reindex is used to create a monotonically increasing time index. Any gaps are filled using 1-D interpolation. The default method is slinear, which is probably the most useful for processing time series. If you want more control over the interpolation method use RunTS.merge([run1, run2, ...], gap_method=interpolation_method). For more information on interpolation methods see scipy.interpolate.interp1d. Similarly, if you want more control over how the datasets are merged, use xarray tools directly.

Using +

Using run_01 + run_02 will combine the runs and interpolate onto a new monotonic time index. Any gaps will be linearly interpolated. If the channel sets differ, NaNs will be placed where channels do not overlap: if you have hz in run_01 but not run_02, then NaNs will be placed in hz for the time period of run_02.

[14]:
for ch in channel_list:
    ch.start = "2020-01-01T00:08:45"
run_02 = RunTS(channel_list)
[15]:
added_run = run + run_02
2023-09-27T16:32:27.726365-0700 | WARNING | mth5.timeseries.run_ts | validate_metadata | end time of dataset 2020-01-01T00:17:16.875000+00:00 does not match metadata end 2020-01-01T00:08:31.875000+00:00 updating metatdata value to 2020-01-01T00:17:16.875000+00:00
2023-09-27T16:32:27.726365-0700 | WARNING | mth5.timeseries.run_ts | validate_metadata | sample rate of dataset 8.0 does not match metadata sample rate 1.0 updating metatdata value to 8.0
[16]:
added_run
[16]:
RunTS Summary:
        Survey:      0
        Station:     mt001
        Run:         001
        Start:       2020-01-01T00:00:00+00:00
        End:         2020-01-01T00:17:16.875000+00:00
        Sample Rate: 8.0
        Components:  ['hx', 'hy', 'hz', 'ex', 'ey', 'temperature']
Plot combined run

Plotting you can see the linear interpolation in the gap of time between the runs. Here it makes a spike because the first point in the time series is a spike.

[17]:
added_run.plot()
[17]:
_images/examples_notebooks_run_ts_example_26_0.png
_images/examples_notebooks_run_ts_example_26_1.png
Using Merge and Decimating

Using merge is a little more flexible, and many runs can be merged together at the same time. Note that run_01 + run_02 + run_03 only combines two at a time, which can be inefficient when multiple runs are combined. RunTS.merge provides the option of combining multiple runs and also decimating to a lower sample rate with the keyword new_sample_rate.

Runs with different sample rates

When using merge, not all runs need to have the same sample rate, but you need to set new_sample_rate to the common sample rate for the combined runs. This decimates the data by first applying an IIR filter to remove aliasing and then downsampling to the desired sample rate, mimicking scipy.signal.decimate.

Change interpolation method

To change how gaps are interpolated, change the gap_method parameter. See scipy.interpolate.interp1d (https://docs.scipy.org/doc/scipy/reference/generated/scipy.interpolate.interp1d.html).

Options are

  • ‘linear’

  • ‘nearest’

  • ‘nearest-up’

  • ‘zero’

  • ‘slinear’ (default)

  • ‘quadratic’

  • ‘cubic’

  • ‘previous’

  • ‘next’

[18]:
merged_run = run.merge(run_02, new_sample_rate=1, gap_method="slinear")
2023-09-27T16:32:29.163041-0700 | WARNING | mth5.timeseries.run_ts | validate_metadata | end time of dataset 2020-01-01T00:08:31+00:00 does not match metadata end 2020-01-01T00:08:31.875000+00:00 updating metatdata value to 2020-01-01T00:08:31+00:00
2023-09-27T16:32:29.316779-0700 | WARNING | mth5.timeseries.run_ts | validate_metadata | end time of dataset 2020-01-01T00:17:16+00:00 does not match metadata end 2020-01-01T00:17:16.875000+00:00 updating metatdata value to 2020-01-01T00:17:16+00:00
2023-09-27T16:32:29.499618-0700 | WARNING | mth5.timeseries.run_ts | validate_metadata | end time of dataset 2020-01-01T00:17:16+00:00 does not match metadata end 2020-01-01T00:08:31.875000+00:00 updating metatdata value to 2020-01-01T00:17:16+00:00
[19]:
merged_run
[19]:
RunTS Summary:
        Survey:      0
        Station:     mt001
        Run:         001
        Start:       2020-01-01T00:00:00+00:00
        End:         2020-01-01T00:17:16+00:00
        Sample Rate: 1.0
        Components:  ['hx', 'hy', 'hz', 'ex', 'ey', 'temperature']
Plot

Plot the merged runs downsampled to 1 second, and the linear interpolation is more obvious.

[20]:
merged_run.plot()
[20]:
_images/examples_notebooks_run_ts_example_31_0.png
_images/examples_notebooks_run_ts_example_31_1.png
To/From Obspy.Stream

When data are downloaded from an FDSN client, or collected as miniSEED, Obspy is used to contain the data as a Stream object. Transformation between RunTS and Stream is supported.

To Obspy.Stream
[21]:
stream = run.to_obspy_stream()
[22]:
stream
[22]:
6 Trace(s) in Stream:
.mt001..MFN | 2020-01-01T00:00:00.000000Z - 2020-01-01T00:08:31.875000Z | 8.0 Hz, 4096 samples
.mt001..MFE | 2020-01-01T00:00:00.000000Z - 2020-01-01T00:08:31.875000Z | 8.0 Hz, 4096 samples
.mt001..MFZ | 2020-01-01T00:00:00.000000Z - 2020-01-01T00:08:31.875000Z | 8.0 Hz, 4096 samples
.mt001..MQN | 2020-01-01T00:00:00.000000Z - 2020-01-01T00:08:31.875000Z | 8.0 Hz, 4096 samples
.mt001..MQE | 2020-01-01T00:00:00.000000Z - 2020-01-01T00:08:31.875000Z | 8.0 Hz, 4096 samples
.mt001..MKN | 2020-01-01T00:00:00.000000Z - 2020-01-01T00:08:31.875000Z | 8.0 Hz, 4096 samples
From Obspy.Stream
[23]:
new_run = RunTS()
new_run.from_obspy_stream(stream)
[24]:
new_run
[24]:
RunTS Summary:
        Survey:      0
        Station:     mt001
        Run:         sr8_001
        Start:       2020-01-01T00:00:00+00:00
        End:         2020-01-01T00:08:31.875000+00:00
        Sample Rate: 8.0
        Components:  ['hx', 'hy', 'hz', 'ex', 'ey', 'temperaturex']

Filters

This section includes information about filters, which can be a painful and somewhat obtuse topic.

Filters are basically anything that manipulates the time series data into something else. This includes scaling, removing instrument response, removing trends, notch filtering, passband filtering, etc. We want to work with data in physical units, but the data are often collected in digital counts. To transform between the two we need to apply or unapply certain filters. When going from physical units to digital counts we are applying filters (channel_metadata.filter.applied = [True]), whereas going from counts to physical units we are unapplying the filters (channel_metadata.filter.applied = [False]). This may seem backwards, but it is the way most archived data are thought of. Any filter applied to physical units for the purpose of cleaning the data is then channel_metadata.filter.applied = [True].

Note: currently there are no tools to convert digital counts to physical units; this is a work in progress. Nevertheless, all the information is there for you to do the transformation.

Supported filters in mt_metadata.timeseries.filters are:

  • CoefficientFilter: scales the data by a given real-valued factor.

  • FIRFilter: a finite impulse response filter, commonly an anti-alias filter, represented as a 1-D array of real-valued coefficients.

  • PoleZeroFilter: often a type of bandpass filter, represented as symmetric complex poles and zeros with a scale factor.

  • TimeDelayFilter: delays the data by a real-valued time delay; a negative value is a delay and a positive value is a prediction.

  • FrequencyResponseTableFilter: a frequency, amplitude, phase look-up table, commonly in physical units and degrees. This is commonly how manufacturers provide instrument responses.

  • ChannelResponseFilter: a comprehensive representation of all filters applied to a channel. Contains a list of all filters and should be the most commonly used filter object, as it can compute the total response and estimate the passband and normalization frequency.

Filter Base

All filters inherit from a FilterBase class, which has attributes and methods common to all filters.

Units

The two attributes common to all filters are units_in and units_out. These attributes are key, as they describe how the filter transforms the data. Units should be SI units given as all-lowercase full names; for example, Volts would be volts and V/m would be volts per meter. In serialized output the units are represented in their abbreviated form.

Complex Response

A method is provided to compute the complex response. This is slightly different for each filter, and the method is overridden by each filter type.

Pass Band

A method is also provided to estimate the "pass band" of a given filter, the band where the response is flat. The estimate is only approximate and should be used with caution. It works well for simple filters; for more complex filters it tries to pick the longest segment with a flat response. CoefficientFilter and TimeDelayFilter objects have an infinite pass band. FrequencyResponseTableFilter estimates the pass band within the given frequencies unless told otherwise; if the requested frequency range is outside the calibration frequencies, extrapolation is applied, which should also be done with caution.
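A toy version of that flat-segment search can be written in plain NumPy (the tolerance logic here is hypothetical; the real FilterBase.pass_band implementation may differ):

```python
import numpy as np

frequencies = np.logspace(-5, 5, 500)
# toy low-pass amplitude response with a corner near 10 Hz
amplitude = 1.0 / np.sqrt(1.0 + (frequencies / 10.0) ** 4)

# "flat" where the log-log slope of the response is near zero
slope = np.gradient(np.log(amplitude), np.log(frequencies))
flat = np.abs(slope) < 1e-2
pass_band = frequencies[flat]
print(pass_band.min(), pass_band.max())
```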

[1]:
%matplotlib inline
import numpy as np

# make a general frequency array to calculate complex response
# note that these are linear frequencies and the complex response uses angular frequencies
frequencies = np.logspace(-5, 5, 500)

Coefficient Filter

A coefficient filter is relatively simple. It includes a scale factor represented as gain.

[2]:
from mt_metadata.timeseries.filters import CoefficientFilter
[3]:
cf = CoefficientFilter(units_in="volts", units_out="V", gain=100.0, name="example_coefficient_filter")
cf
[3]:
{
    "coefficient_filter": {
        "calibration_date": "1980-01-01",
        "gain": 100.0,
        "name": "example_coefficient_filter",
        "type": "coefficient",
        "units_in": "V",
        "units_out": "V"
    }
}
[4]:
cf.plot_response(frequencies, x_units="frequency")
_images/examples_notebooks_filters_example_6_0.png

FIR Filter

A finite impulse response filter is commonly used as an anti-alias filter and is represented as a 1-D array of real-valued coefficients.

[5]:
from mt_metadata.timeseries.filters import FIRFilter
[6]:
fir = FIRFilter(
    units_in="volts",
    units_out="volts",
    name="example_fir",
    decimation_input_sample_rate=32000,
    gain=0.999904,
    symmetry="EVEN",
    decimation_factor=16,
)

fir.coefficients = [
    1.0828314e-06, 1.7808272e-06, 3.2410387e-06, 5.4627321e-06, 8.682945e-06, 1.3240843e-05, 1.9565294e-05,
    2.8185128e-05, 3.9656901e-05, 5.4688699e-05, 7.4153548e-05, 9.8989171e-05, 0.00013036761, 0.00016954952,
    0.00021798223, 0.00027731725, 0.00034936491, 0.00043613836, 0.00053984317, 0.0006628664, 0.00080777059,
    0.00097733398, 0.0011744311, 0.0014021378, 0.0016635987, 0.0019620692, 0.0023008469, 0.0026832493,
    0.0031125348, 0.0035918986, 0.0041243695, 0.0047127693, 0.0053596641, 0.0060672448, 0.0068373145,
    0.0076711699, 0.0085695535, 0.0095325625, 0.010559602, 0.01164928, 0.012799387, 0.014006814, 0.015267504,
    0.016576445, 0.017927598, 0.019313928, 0.020727372, 0.022158878, 0.023598416, 0.025035053, 0.026456987,
    0.027851671, 0.029205887, 0.030505868, 0.031737458, 0.032886244, 0.03393773, 0.034877509, 0.035691477,
    0.036365997, 0.036888145, 0.037245877, 0.03742826, 0.037425674, 0.03723, 0.036834806, 0.036235519,
    0.035429578, 0.034416564, 0.033198304, 0.031778947, 0.030165028, 0.028365461, 0.026391543, 0.024256891,
    0.021977346, 0.019570865, 0.017057346, 0.014458441, 0.011797323, 0.009098433, 0.0063871844, 0.0036896705,
    0.0010323179, -0.0015584482, -0.0040565561, -0.0064366534, -0.0086744577, -0.010747112, -0.012633516,
    -0.014314661, -0.0157739, -0.01699724, -0.017973563, -0.018694809, -0.019156145, -0.01935605, -0.019296378,
    -0.018982368, -0.018422581, -0.017628808, -0.016615927, -0.015401681, -0.014006456, -0.012452973,
    -0.010765962, -0.0089718029, -0.0070981304, -0.0051734182, -0.0032265538, -0.0012863991, 0.0006186511,
    0.0024610918, 0.0042147399, 0.0058551184, 0.0073598339, 0.0087089101, 0.0098850802, 0.01087404, 0.011664648,
    0.012249067, 0.012622863, 0.012785035, 0.012737988, 0.012487462, 0.01204238, 0.011414674, 0.010619033,
    0.0096726287, 0.0085947802, 0.007406604, 0.006130626, 0.0047903718, 0.0034099557, 0.002013647, 0.00062545808,
    -0.0007312728, -0.0020342721, -0.0032627005, -0.0043974896, -0.0054216445, -0.0063205026, -0.0070819431,
    -0.0076965573, -0.0081577515, -0.0084618134, -0.0086079109, -0.0085980454, -0.0084369555, -0.0081319623,
    -0.0076927822, -0.0071312929, -0.0064612622, -0.005698049, -0.0048582871, -0.0039595389, -0.0030199524,
    -0.0020579058, -0.0010916605, -0.00013902181, 0.00078298012, 0.0016583927, 0.0024726097, 0.0032126096,
    0.0038671608, 0.0044269804, 0.0048848582, 0.005235733, 0.0054767267, 0.0056071337, 0.0056283739,
    0.0055438946, 0.0053590466, 0.0050809216, 0.0047181565, 0.0042807218, 0.0037796798, 0.0032269375,
    0.0026349833, 0.0020166242, 0.001384723, 0.00075194426, 0.00013050952, -0.00046802824, -0.0010329896,
    -0.0015547526, -0.0020249044, -0.0024363673, -0.0027834927, -0.0030621202, -0.0032696081, -0.0034048304,
    -0.003468141, -0.0034613123, -0.003387446, -0.0032508562, -0.003056939, -0.0028120177, -0.0025231787,
    -0.0021980959, -0.0018448512, -0.001471751, -0.0010871479, -0.00069926586, -0.00031603608, 5.5052958e-05,
    0.00040709358, 0.00073387625, 0.0010299878, 0.0012908906, 0.0015129783, 0.0016936094, 0.0018311206,
    0.0019248178, 0.0019749457, 0.0019826426, 0.0019498726, 0.0018793481, 0.001774437, 0.0016390601,
    0.0014775817, 0.0012946966, 0.0010953132, 0.00088443852, 0.00066706788, 0.00044807568, 0.00023212004,
    2.3551687e-05, -0.00017366471, -0.00035601447, -0.00052048819, -0.00066461961, -0.0007865114, -0.00088484585,
    -0.00095888576, -0.0010084547, -0.0010339168, -0.0010361352, -0.0010164281, -0.00097651512, -0.0009184563,
    -0.00084458862, -0.00075745879, -0.00065975531, -0.0005542388, -0.00044368068, -0.00033079527, -0.00021818698,
    -0.0001082966, -3.3548122e-06, 9.4652831e-05, 0.00018401834, 0.00026333242, 0.00033149892, 0.00038773834,
    0.00043159194, 0.00046290434, 0.0004818163, 0.0004887383, 0.00048432534, 0.00046944799, 0.00044515711,
    0.0004126484, 0.0003732258, 0.00032826542, 0.0002791743, 0.00022736372, 0.0001742071, 0.00012101705,
    6.9016627e-05, 1.9314915e-05, -2.7108661e-05, -6.9420195e-05, -0.00010694123, -0.00013915499, -0.00016570302,
    -0.00018639189, -0.00020117435, -0.00021015058, -0.00021354953, -0.00021171506, -0.00020509114, -0.00019420317,
    -0.00017963824, -0.00016202785, -0.00014203126, -0.00012030972, -9.7522447e-05, -7.4297881e-05, -5.1229126e-05,
    -2.8859566e-05, -7.6714587e-06, 1.191924e-05, 2.9567975e-05, 4.5006156e-05, 5.8043028e-05, 6.8557049e-05,
    7.6507284e-05, 8.1911501e-05, 8.4854546e-05, 8.5473977e-05, 8.3953237e-05, 8.0514517e-05, 7.5410928e-05,
    6.8914269e-05, 6.1308667e-05, 5.2886291e-05, 4.3926615e-05, 3.4708195e-05, 2.5484322e-05, 1.6489239e-05,
    7.9309229e-06, -1.335887e-08, -7.1985955e-06, -1.3510014e-05, -1.8867066e-05, -2.32245e-05, -2.6558888e-05,
    -2.8887769e-05, -3.0242758e-05, -3.0684769e-05, -3.0291432e-05, -2.9154804e-05, -2.7376906e-05, -2.5070442e-05,
    -2.2347936e-05, -1.9321938e-05, -1.6109379e-05, -1.2808429e-05, -9.5234091e-06, -6.3382245e-06,
    -3.3291028e-06, -5.5987789e-07, 1.9187619e-06, 4.072373e-06, 5.8743913e-06, 7.312286e-06, 8.3905516e-06,
    9.10975e-06, 9.4979405e-06, 9.5751602e-06, 9.3762619e-06, 8.9382884e-06, 8.301693e-06, 7.5045982e-06,
    6.592431e-06, 5.6033018e-06, 4.5713236e-06, 3.5404175e-06, 2.5302468e-06, 1.5771827e-06, 6.9930724e-07,
    -8.6047464e-08, -7.6676685e-07, -1.3332562e-06, -1.7875625e-06, -2.1266892e-06, -2.35267e-06, -2.4824365e-06,
    -2.5098916e-06, -2.4598471e-06, -2.335345e-06, -2.1523615e-06, -1.9251499e-06, -1.6707684e-06, -1.3952398e-06,
    -1.1173763e-06, -8.4543007e-07, -5.7948262e-07, -3.444687e-07, -1.2505329e-07, 6.1743521e-08, 2.1873758e-07,
    3.4424812e-07, 4.3748074e-07, 4.931357e-07, 5.2551894e-07, 5.344753e-07, 5.136161e-07, 4.9029785e-07,
    4.3492003e-07, 3.8198567e-07, 3.2236682e-07, 2.6023093e-07, 1.9363162e-07, 1.3382508e-07, 6.8672463e-08,
    2.1443693e-08, -1.9671351e-09, -4.7522178e-08, -6.2719053e-08, -1.0190665e-07, -1.2015286e-07, -1.103657e-07,
    -1.0294882e-07, -1.1965994e-07, -1.3612285e-07, -1.463918e-07, -1.4752351e-07, 3.9802276e-07]
[7]:
fir.plot_response(frequencies, x_units="frequency", pb_tol=.5)
print(f"Pass Band frequency range estimation: {fir.pass_band(frequencies, tol=.5)}")
_images/examples_notebooks_filters_example_10_0.png
Pass Band frequency range estimation: [1.00000000e-05 1.18071285e+01]

Pole Zero Filter

A pole-zero filter is a mathematical way to represent a filter using complex-valued poles and zeros and a scaling factor. The advantage of the pole-zero representation is that the response at arbitrary frequencies can be computed without extrapolation.
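To make the representation concrete, here is a minimal sketch (not the mt_metadata implementation; the function name `zpk_response` is invented for illustration) that evaluates H(s) = k · Π(s − z) / Π(s − p) at s = 2πif, using the same poles and normalization factor as the example below:

```python
import numpy as np

def zpk_response(frequencies, poles, zeros, k):
    """Evaluate H(s) = k * prod(s - z) / prod(s - p) at s = 2j*pi*f."""
    s = 2j * np.pi * np.asarray(frequencies, dtype=float)
    num = np.ones_like(s)
    for z in zeros:
        num = num * (s - z)
    den = np.ones_like(s)
    for p in poles:
        den = den * (s - p)
    return k * num / den

poles = [-6.283185 + 10.882477j, -6.283185 - 10.882477j, -12.566371 + 0j]
h = zpk_response([0.01, 0.1, 1.0], poles, [], 2002.269)
print(np.abs(h))  # amplitude is near 1 in the low-frequency pass band
```

Because any frequency can be plugged into s, no extrapolation is ever needed, unlike a calibration table.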

[8]:
from mt_metadata.timeseries.filters import PoleZeroFilter
[9]:
pz = PoleZeroFilter(units_in="volts", units_out="nanotesla", name="example_zpk_response")
pz.poles = [(-6.283185+10.882477j), (-6.283185-10.882477j), (-12.566371+0j)]
pz.zeros = []
pz.normalization_factor = 2002.269
pz
[9]:
{
    "pole_zero_filter": {
        "calibration_date": "1980-01-01",
        "gain": 1.0,
        "name": "example_zpk_response",
        "normalization_factor": 2002.269,
        "poles": {
            "real": [
                -6.283185,
                -6.283185,
                -12.566371
            ],
            "imag": [
                10.882477,
                -10.882477,
                0.0
            ]
        },
        "type": "zpk",
        "units_in": "V",
        "units_out": "nT",
        "zeros": {
            "real": [],
            "imag": []
        }
    }
}
[10]:
pz.plot_response(frequencies, x_units="frequency", pb_tol=1e-2)
print(f"Pass Band frequency range estimation: {pz.pass_band(frequencies, tol=1e-2)}")
_images/examples_notebooks_filters_example_14_0.png
Pass Band frequency range estimation: [1.00000000e-05 5.36363132e-01]

Time Delay Filter

A time delay filter is often incorporated in the A/D converter, which controls multiple channels and creates a small time delay as it shifts between the channels. A positive value advances time and a negative value delays time. For causality the value should always be negative. The delay is given in seconds.
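In the frequency domain a pure time shift leaves the amplitude untouched and adds a linear phase ramp. A short sketch under the sign convention stated above (the function name `time_delay_response` is illustrative, not the mt_metadata API):

```python
import numpy as np

def time_delay_response(frequencies, delay):
    # A pure time shift only changes phase: x(t + delay) <-> X(f) * exp(2j*pi*f*delay)
    # (delay < 0 shifts the signal later in time, per the convention above)
    f = np.asarray(frequencies, dtype=float)
    return np.exp(2j * np.pi * f * delay)

h = time_delay_response([0.0, 1.0, 10.0], -0.25)
print(np.abs(h))   # amplitude is 1 at every frequency
print(np.angle(h)) # phase ramps linearly with frequency
```

This is why the pass band of a time delay filter below spans the full frequency range: a delay never attenuates the signal.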

[11]:
from mt_metadata.timeseries.filters import TimeDelayFilter
[12]:
td = TimeDelayFilter(units_in="volts", units_out="volts", name="example_time_delay", delay=-.25)
td.pass_band(frequencies)
[12]:
array([1.e-05, 1.e+05])
[13]:
td.plot_response(frequencies, x_units="frequency")
_images/examples_notebooks_filters_example_18_0.png

FrequencyResponseTableFilter

Commonly, to calibrate the frequency response of an instrument, a manufacturer will run a calibration taking measurements at discrete frequencies and then supply a table of frequency, amplitude, and phase for the user. This is more accurate than a pole-zero representation, but it has limitations: frequencies outside the calibrated range require extrapolation, and frequencies between calibration points require interpolation.

Note: phase is assumed to be in radians.
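The interpolation step can be sketched with plain NumPy: interpolate amplitude and phase separately, then recombine them into a complex response. This is an illustration only (the three-point calibration table is made up, and `np.interp` does linear interpolation, comparable to the "slinear" option used later in this notebook):

```python
import numpy as np

# Hypothetical 3-point calibration table for illustration
cal_freq = np.array([0.1, 1.0, 10.0])
cal_amp = np.array([1.0, 10.0, 10.0])
cal_phase = np.array([0.5, 0.0, -0.5])  # radians, per the note above

# evaluate only inside the calibrated range to avoid extrapolation
f = np.array([0.5, 2.0])
amp = np.interp(f, cal_freq, cal_amp)
phase = np.interp(f, cal_freq, cal_phase)
response = amp * np.exp(1j * phase)
print(np.abs(response))
```

Keeping the requested frequencies inside [cal_freq.min(), cal_freq.max()] is what makes this safe; stepping outside triggers the extrapolation warnings shown below.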

[14]:
from mt_metadata.timeseries.filters import FrequencyResponseTableFilter
[15]:
fap = FrequencyResponseTableFilter(units_in="volts", units_out="nanotesla", name="example_fap")

fap.frequencies = [
    1.95312000e-03,   2.76214000e-03,   3.90625000e-03,
     5.52427000e-03,   7.81250000e-03,   1.10485000e-02,
     1.56250000e-02,   2.20971000e-02,   3.12500000e-02,
     4.41942000e-02,   6.25000000e-02,   8.83883000e-02,
     1.25000000e-01,   1.76780000e-01,   2.50000000e-01,
     3.53550000e-01,   5.00000000e-01,   7.07110000e-01,
     1.00000000e+00,   1.41420000e+00,   2.00000000e+00,
     2.82840000e+00,   4.00000000e+00,   5.65690000e+00,
     8.00000000e+00,   1.13140000e+01,   1.60000000e+01,
     2.26270000e+01,   3.20000000e+01,   4.52550000e+01,
     6.40000000e+01,   9.05100000e+01,   1.28000000e+02,
     1.81020000e+02,   2.56000000e+02,   3.62040000e+02,
     5.12000000e+02,   7.24080000e+02,   1.02400000e+03,
     1.44820000e+03,   2.04800000e+03,   2.89630000e+03,
     4.09600000e+03,   5.79260000e+03,   8.19200000e+03,
     1.15850000e+04]

fap.amplitudes = [
    1.59009000e-03,   3.07497000e-03,   5.52793000e-03,
    9.47448000e-03,   1.54565000e-02,   2.49498000e-02,
    3.96462000e-02,   7.87192000e-02,   1.57134000e-01,
    3.09639000e-01,   5.94224000e-01,   1.12698000e+00,
    2.01092000e+00,   3.33953000e+00,   5.00280000e+00,
    6.62396000e+00,   7.97545000e+00,   8.82872000e+00,
    9.36883000e+00,   9.64102000e+00,   9.79664000e+00,
    9.87183000e+00,   9.90666000e+00,   9.92845000e+00,
    9.93559000e+00,   9.93982000e+00,   9.94300000e+00,
    9.93546000e+00,   9.93002000e+00,   9.90873000e+00,
    9.86383000e+00,   9.78129000e+00,   9.61814000e+00,
    9.26461000e+00,   8.60175000e+00,   7.18337000e+00,
    4.46123000e+00,  -8.72600000e-01,  -5.15684000e+00,
    -2.95111000e+00,  -9.28512000e-01,  -2.49850000e-01,
    -5.75682000e-02,  -1.34293000e-02,  -1.02708000e-03,
    1.09577000e-03]

fap.phases = [
    7.60824000e-02,   1.09174000e-01,   1.56106000e-01,
    2.22371000e-01,   3.12020000e-01,   4.41080000e-01,
    6.23548000e-01,   8.77188000e-01,   1.23360000e+00,
    1.71519000e+00,   2.35172000e+00,   3.13360000e+00,
    3.98940000e+00,   4.67269000e+00,   4.96593000e+00,
    4.65875000e+00,   3.95441000e+00,   3.11098000e+00,
    2.30960000e+00,   1.68210000e+00,   1.17928000e+00,
    8.20015000e-01,   5.36474000e-01,   3.26955000e-01,
    1.48051000e-01,  -8.24275000e-03,  -1.66064000e-01,
    -3.48852000e-01,  -5.66625000e-01,  -8.62435000e-01,
    -1.25347000e+00,  -1.81065000e+00,  -2.55245000e+00,
    -3.61512000e+00,  -5.00185000e+00,  -6.86158000e+00,
    -8.78698000e+00,  -9.08920000e+00,  -4.22925000e+00,
     2.15533000e-01,   6.00661000e-01,   3.12368000e-01,
     1.31660000e-01,   5.01553000e-02,   1.87239000e-02,
     6.68243000e-03]
[16]:
from matplotlib import pyplot as plt
[17]:
fap.plot_response(frequencies, x_units="frequency", unwrap=True, pb_tol=1E-2, interpolation_method="slinear")
print(f"Pass Band frequency range estimation: {fap.pass_band(fap.frequencies, tol=1e-2,  interpolation_method='slinear')}")
2023-09-27T16:22:20.641040-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
2023-09-27T16:22:20.641040-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
_images/examples_notebooks_filters_example_23_1.png
Pass Band frequency range estimation: [  2.   181.02]

IMPORTANT: As you can see above, extrapolation is unstable and can change the data in unknown ways. If you need values outside the calibrated frequencies, try a different interpolation method or a different curve-fitting algorithm than the one provided.
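One simple defensive measure is to mask the requested frequencies down to the calibrated band before evaluating the table, so no extrapolation happens at all. A sketch (the band limits below are the first and last calibration frequencies from the table above):

```python
import numpy as np

# Calibrated range taken from the fap table above
cal_min, cal_max = 1.95312e-3, 1.1585e4
frequencies = np.logspace(-5, 5, 500)

# keep only the frequencies where the calibration table is trusted
in_band = (frequencies >= cal_min) & (frequencies <= cal_max)
safe_frequencies = frequencies[in_band]
print(safe_frequencies.min(), safe_frequencies.max())
```

Evaluating the filter on `safe_frequencies` instead of `frequencies` avoids the extrapolation warnings, at the cost of not covering the full requested range.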

ChannelResponseFilter

These individual filters are useful, but in practice all filters are combined into a single filter so the total response can be calculated. The mt_metadata.timeseries.filters.ChannelResponseFilter is provided for this purpose.
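The combination itself is simple in principle: for a cascade of linear stages, the total frequency response is the element-wise product of the individual complex responses. A toy sketch with made-up stage responses (not the ChannelResponseFilter internals):

```python
import numpy as np

# Toy per-stage responses evaluated at the same frequencies; for a cascade
# of linear filters the total response is the product of the stages
frequencies = np.array([0.1, 1.0, 10.0])
stage_1 = np.array([2.0, 2.0, 2.0]) * np.exp(1j * 0.1)   # e.g. a flat gain stage
stage_2 = np.array([1.0, 0.5, 0.1]) * np.exp(-1j * 0.2)  # e.g. a low-pass stage
total = stage_1 * stage_2
print(np.abs(total))
```

Amplitudes multiply and phases add, which is why a single bad stage (or bad unit ordering, see below) corrupts the whole chain.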

[18]:
from mt_metadata.timeseries.filters import ChannelResponseFilter
[19]:
channel_response = ChannelResponseFilter()
channel_response.filters_list = [fir, td, fap]
channel_response.frequencies = frequencies
[20]:
channel_response.plot_response(x_units="frequency", pb_tol=1e-1, include_delay=True)
2023-09-27T16:22:21.020759-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
2023-09-27T16:22:21.042704-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
2023-09-27T16:22:21.055112-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
2023-09-27T16:22:21.076144-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
2023-09-27T16:22:21.092895-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
_images/examples_notebooks_filters_example_28_1.png

Estimate Total Sensitivity

For a quick calibration you can divide the time series by the total sensitivity; the result will be accurate within the pass band.
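That quick calibration is just a scalar division. A sketch, assuming the sensitivity is expressed as counts per physical unit (the numbers below are invented for illustration, though 9.405 matches the value computed in the next cell):

```python
import numpy as np

sensitivity = 9.405  # total sensitivity, assumed counts per physical unit
raw_counts = np.array([940.5, -1881.0, 0.0])

calibrated = raw_counts / sensitivity
print(calibrated)
```

This ignores the frequency dependence of the response, which is why it is only trustworthy inside the pass band.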

[21]:
print(f"Normalization Frequency: {channel_response.normalization_frequency} Hz")
print(f"Pass Band: {channel_response.pass_band} Hz")
print(f"Total Sensitivity: {channel_response.compute_instrument_sensitivity(sig_figs=3)}")
2023-09-27T16:22:21.571180-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
2023-09-27T16:22:21.593486-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
Normalization Frequency: 1.097 Hz
2023-09-27T16:22:21.609817-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
Pass Band: [ 0.1018629  11.80712847] Hz
2023-09-27T16:22:21.629781-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
2023-09-27T16:22:21.648280-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
2023-09-27T16:22:21.661262-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
2023-09-27T16:22:21.681687-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
2023-09-27T16:22:21.705189-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
2023-09-27T16:22:21.725088-0700 | WARNING | mt_metadata.timeseries.filters.frequency_response_table_filter | complex_response | Extrapolating, use values outside calibration frequencies with caution
Total Sensitivity: 9.405

Check that units are consistent between filters

The output units of each filter must match the input units of the next filter. This check is performed when the filter list is set. In the example below the unit order is incorrect.
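The check amounts to walking the chain pairwise. A minimal sketch of the idea (the function `check_unit_chain` and its tuple input are invented for illustration; the real check lives in ChannelResponseFilter._check_consistency_of_units):

```python
def check_unit_chain(filters):
    """Raise if units_out of one stage != units_in of the next stage.
    `filters` is a list of (name, units_in, units_out) tuples (illustrative)."""
    for (name_a, _, out_a), (name_b, in_b, _) in zip(filters, filters[1:]):
        if out_a != in_b:
            raise ValueError(
                f"The input units for {name_b} should be {out_a} not {in_b}"
            )

# a correct ordering passes silently
check_unit_chain([("fir", "V", "V"), ("fap", "V", "nT")])

# a wrong ordering raises
try:
    check_unit_chain([("fap", "V", "nT"), ("pz", "V", "nT")])
except ValueError as err:
    print(err)
```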

[22]:
channel_response = ChannelResponseFilter()
channel_response.filters_list = [cf, fap, pz]
2023-09-27T16:22:21.747735-0700 | ERROR | mt_metadata.timeseries.filters.channel_response_filter | _check_consistency_of_units | Unit consistency is incorrect. The input units for example_zpk_response should be nT not V
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
~\AppData\Local\Temp\1\ipykernel_19688\4206849502.py in <cell line: 2>()
      1 channel_response = ChannelResponseFilter()
----> 2 channel_response.filters_list = [cf, fap, pz]

~\OneDrive - DOI\Documents\GitHub\mt_metadata\mt_metadata\base\metadata.py in __setattr__(self, name, value)
    360
    361         if name in skip_list:
--> 362             super().__setattr__(name, value)
    363             return
    364         if not name.startswith("_"):

~\OneDrive - DOI\Documents\GitHub\mt_metadata\mt_metadata\timeseries\filters\channel_response_filter.py in filters_list(self, filters_list)
     70         """set the filters list and validate the list"""
     71         self._filters_list = self._validate_filters_list(filters_list)
---> 72         self._check_consistency_of_units()
     73
     74     @property

~\OneDrive - DOI\Documents\GitHub\mt_metadata\mt_metadata\timeseries\filters\channel_response_filter.py in _check_consistency_of_units(self)
    338                     )
    339                     self.logger.error(msg)
--> 340                     raise ValueError(msg)
    341                 previous_units = mt_filter.units_out
    342

ValueError: Unit consistency is incorrect. The input units for example_zpk_response should be nT not V

Remove Instrument Response

This is an example notebook for removing the instrument response to calibrate the data into physical units.

Note: The workflow in this notebook is incorporated into the class mth5.timeseries.ts_filters.RemoveInstrumentResponse and is exposed through mth5.timeseries.ChannelTS as mth5.timeseries.ChannelTS.remove_instrument_response and through mth5.timeseries.RunTS as mth5.timeseries.RunTS.calibrate. An example is shown at the bottom of the notebook.

All the filters are represented as mt_metadata.timeseries.filters objects. More information about these can be found in mt_metadata filters.

This example will read in an existing MTH5 file that has one station (CAS04), one run (a), and two channels (ex and hx).
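At its core, instrument response removal is a frequency-domain deconvolution: transform the time series, divide by the instrument's complex response, and transform back. Here is a simplified sketch of that idea (this is not the MTH5 implementation; the function `remove_response`, its `water_level` guard, and the flat-gain test response are all invented for illustration):

```python
import numpy as np

def remove_response(data, sample_rate, response_func, water_level=1e-8):
    """Simplified frequency-domain deconvolution: divide the spectrum by the
    instrument response, guarding against division by tiny values."""
    n = data.size
    freqs = np.fft.rfftfreq(n, d=1.0 / sample_rate)
    spectrum = np.fft.rfft(data)
    response = np.asarray(response_func(freqs), dtype=complex)
    # the water level keeps the division stable where the response is tiny
    response = np.where(np.abs(response) < water_level, water_level, response)
    return np.fft.irfft(spectrum / response, n=n)

# Sanity check: removing a flat gain of 2 should simply halve the signal
data = np.sin(2 * np.pi * 1.0 * np.arange(256) / 16.0)
calibrated = remove_response(data, 16.0, lambda f: np.full_like(f, 2.0, dtype=complex))
print(np.allclose(calibrated, data / 2.0))
```

The real workflow in this notebook adds the pieces a sketch like this glosses over: detrending, tapering, delay handling, and per-stage responses from the filter objects described above.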

[9]:
from mth5.mth5 import MTH5
%matplotlib widget

Open the MTH5 file

This file contains two stations, CAS04 and NVR08, both MTArray/EarthScope stations. Each has only one run. You can make this MTH5 file using make_mth5_driver_v0.2.0.ipynb

[3]:
m = MTH5()
m.open_mth5(r"8P_CAS04_NVR08.h5")

Have a look at the channel summary. This is a table of all the channels available in the MTH5 file.

[4]:
m.channel_summary.summarize()
ch_df = m.channel_summary.to_dataframe()
ch_df
[4]:
survey station run latitude longitude elevation component start end n_samples sample_rate measurement_type azimuth tilt units hdf5_reference run_hdf5_reference station_hdf5_reference
0 CONUS South CAS04 CAS04a 37.633351 -121.468382 329.387 ex 1980-01-01 00:00:00+00:00 1980-01-01 00:00:00+00:00 1 0.0 electric 13.2 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
1 CONUS South CAS04 CAS04a 37.633351 -121.468382 329.387 ey 1980-01-01 00:00:00+00:00 1980-01-01 00:00:00+00:00 1 0.0 electric 103.2 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
2 CONUS South CAS04 CAS04a 37.633351 -121.468382 329.387 hx 1980-01-01 00:00:00+00:00 1980-01-01 00:00:00+00:00 1 0.0 magnetic 13.2 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
3 CONUS South CAS04 CAS04a 37.633351 -121.468382 329.387 hy 1980-01-01 00:00:00+00:00 1980-01-01 00:00:00+00:00 1 0.0 magnetic 103.2 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
4 CONUS South CAS04 CAS04a 37.633351 -121.468382 329.387 hz 1980-01-01 00:00:00+00:00 1980-01-01 00:00:00+00:00 1 0.0 magnetic 13.2 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
65 CONUS South NVR08 c 38.326630 -118.082382 1375.425 ex 2020-06-14 18:00:44+00:00 2020-06-24 15:55:47+00:00 856503 1.0 electric 12.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
66 CONUS South NVR08 c 38.326630 -118.082382 1375.425 ey 2020-06-14 18:00:44+00:00 2020-06-24 15:55:47+00:00 856503 1.0 electric 102.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
67 CONUS South NVR08 c 38.326630 -118.082382 1375.425 hx 2020-06-14 18:00:44+00:00 2020-06-24 15:55:47+00:00 856503 1.0 magnetic 12.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
68 CONUS South NVR08 c 38.326630 -118.082382 1375.425 hy 2020-06-14 18:00:44+00:00 2020-06-24 15:55:47+00:00 856503 1.0 magnetic 102.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
69 CONUS South NVR08 c 38.326630 -118.082382 1375.425 hz 2020-06-14 18:00:44+00:00 2020-06-24 15:55:47+00:00 856503 1.0 magnetic 0.0 90.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>

70 rows × 18 columns

Get observatory data

https://geomag.usgs.gov/ws/data/?id=FRN&type=adjusted&elements=H&sampling_period=1&format=json&starttime=2020-06-02T19:00:00Z&endtime=2020-06-02T22:07:46Z
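The request URL above can be built programmatically from its query parameters, which makes it easy to swap the observatory, element, or time window. A sketch using only the standard library (the parameter names are exactly those in the URL above; the resulting response could then be fetched with urllib.request.urlopen):

```python
from urllib.parse import urlencode

# Rebuild the USGS geomag web-service request from its query parameters
base = "https://geomag.usgs.gov/ws/data/"
params = {
    "id": "FRN",
    "type": "adjusted",
    "elements": "H",
    "sampling_period": 1,
    "format": "json",
    "starttime": "2020-06-02T19:00:00Z",
    "endtime": "2020-06-02T22:07:46Z",
}
url = f"{base}?{urlencode(params)}"
print(url)
```

Note that urlencode percent-encodes the colons in the timestamps; the service accepts either form.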

[16]:
import json
frn_json = json.loads('{"type":"Timeseries","metadata":{"intermagnet":{"imo":{"iaga_code":"FRN","name":"Fresno","coordinates":[240.282,37.091,331.0]},"reported_orientation":"H","sensor_orientation":"HDZF","data_type":"adjusted","sampling_period":1,"digital_sampling_rate":0.01},"status":200,"generated":"2022-04-11T23:46:00Z","url":null},"times":["2020-06-02T19:00:00.000Z","2020-06-02T19:00:01.000Z","2020-06-02T19:00:02.000Z","2020-06-02T19:00:03.000Z","2020-06-02T19:00:04.000Z","2020-06-02T19:00:05.000Z","2020-06-02T19:00:06.000Z","2020-06-02T19:00:07.000Z","2020-06-02T19:00:08.000Z","2020-06-02T19:00:09.000Z","2020-06-02T19:00:10.000Z","2020-06-02T19:00:11.000Z","2020-06-02T19:00:12.000Z","2020-06-02T19:00:13.000Z","2020-06-02T19:00:14.000Z","2020-06-02T19:00:15.000Z","2020-06-02T19:00:16.000Z","2020-06-02T19:00:17.000Z","2020-06-02T19:00:18.000Z","2020-06-02T19:00:19.000Z","2020-06-02T19:00:20.000Z","2020-06-02T19:00:21.000Z","2020-06-02T19:00:22.000Z","2020-06-02T19:00:23.000Z","2020-06-02T19:00:24.000Z","2020-06-02T19:00:25.000Z","2020-06-02T19:00:26.000Z","2020-06-02T19:00:27.000Z","2020-06-02T19:00:28.000Z","2020-06-02T19:00:29.000Z","2020-06-02T19:00:30.000Z","2020-06-02T19:00:31.000Z","2020-06-02T19:00:32.000Z","2020-06-02T19:00:33.000Z","2020-06-02T19:00:34.000Z","2020-06-02T19:00:35.000Z","2020-06-02T19:00:36.000Z","2020-06-02T19:00:37.000Z","2020-06-02T19:00:38.000Z","2020-06-02T19:00:39.000Z","2020-06-02T19:00:40.000Z","2020-06-02T19:00:41.000Z","2020-06-02T19:00:42.000Z","2020-06-02T19:00:43.000Z","2020-06-02T19:00:44.000Z","2020-06-02T19:00:45.000Z","2020-06-02T19:00:46.000Z","2020-06-02T19:00:47.000Z","2020-06-02T19:00:48.000Z","2020-06-02T19:00:49.000Z","2020-06-02T19:00:50.000Z","2020-06-02T19:00:51.000Z","2020-06-02T19:00:52.000Z","2020-06-02T19:00:53.000Z","2020-06-02T19:00:54.000Z","2020-06-02T19:00:55.000Z","2020-06-02T19:00:56.000Z","2020-06-02T19:00:57.000Z","2020-06-02T19:00:58.000Z","2020-06-02T19:00:59.000Z","2020-06-02T19:01:00.000Z","2020-
06-02T19:01:01.000Z","2020-06-02T19:01:02.000Z","2020-06-02T19:01:03.000Z","2020-06-02T19:01:04.000Z","2020-06-02T19:01:05.000Z","2020-06-02T19:01:06.000Z","2020-06-02T19:01:07.000Z","2020-06-02T19:01:08.000Z","2020-06-02T19:01:09.000Z","2020-06-02T19:01:10.000Z","2020-06-02T19:01:11.000Z","2020-06-02T19:01:12.000Z","2020-06-02T19:01:13.000Z","2020-06-02T19:01:14.000Z","2020-06-02T19:01:15.000Z","2020-06-02T19:01:16.000Z","2020-06-02T19:01:17.000Z","2020-06-02T19:01:18.000Z","2020-06-02T19:01:19.000Z","2020-06-02T19:01:20.000Z","2020-06-02T19:01:21.000Z","2020-06-02T19:01:22.000Z","2020-06-02T19:01:23.000Z","2020-06-02T19:01:24.000Z","2020-06-02T19:01:25.000Z","2020-06-02T19:01:26.000Z","2020-06-02T19:01:27.000Z","2020-06-02T19:01:28.000Z","2020-06-02T19:01:29.000Z","2020-06-02T19:01:30.000Z","2020-06-02T19:01:31.000Z","2020-06-02T19:01:32.000Z","2020-06-02T19:01:33.000Z","2020-06-02T19:01:34.000Z","2020-06-02T19:01:35.000Z","2020-06-02T19:01:36.000Z","2020-06-02T19:01:37.000Z","2020-06-02T19:01:38.000Z","2020-06-02T19:01:39.000Z","2020-06-02T19:01:40.000Z","2020-06-02T19:01:41.000Z","2020-06-02T19:01:42.000Z","2020-06-02T19:01:43.000Z","2020-06-02T19:01:44.000Z","2020-06-02T19:01:45.000Z","2020-06-02T19:01:46.000Z","2020-06-02T19:01:47.000Z","2020-06-02T19:01:48.000Z","2020-06-02T19:01:49.000Z","2020-06-02T19:01:50.000Z","2020-06-02T19:01:51.000Z","2020-06-02T19:01:52.000Z","2020-06-02T19:01:53.000Z","2020-06-02T19:01:54.000Z","2020-06-02T19:01:55.000Z","2020-06-02T19:01:56.000Z","2020-06-02T19:01:57.000Z","2020-06-02T19:01:58.000Z","2020-06-02T19:01:59.000Z","2020-06-02T19:02:00.000Z","2020-06-02T19:02:01.000Z","2020-06-02T19:02:02.000Z","2020-06-02T19:02:03.000Z","2020-06-02T19:02:04.000Z","2020-06-02T19:02:05.000Z","2020-06-02T19:02:06.000Z","2020-06-02T19:02:07.000Z","2020-06-02T19:02:08.000Z","2020-06-02T19:02:09.000Z","2020-06-02T19:02:10.000Z","2020-06-02T19:02:11.000Z","2020-06-02T19:02:12.000Z","2020-06-02T19:02:13.000Z","2020-06-02T19:02:14.000Z","2020-06
-02T19:02:15.000Z","2020-06-02T19:02:16.000Z","2020-06-02T19:02:17.000Z","2020-06-02T19:02:18.000Z","2020-06-02T19:02:19.000Z","2020-06-02T19:02:20.000Z","2020-06-02T19:02:21.000Z","2020-06-02T19:02:22.000Z","2020-06-02T19:02:23.000Z","2020-06-02T19:02:24.000Z","2020-06-02T19:02:25.000Z","2020-06-02T19:02:26.000Z","2020-06-02T19:02:27.000Z","2020-06-02T19:02:28.000Z","2020-06-02T19:02:29.000Z","2020-06-02T19:02:30.000Z","2020-06-02T19:02:31.000Z","2020-06-02T19:02:32.000Z","2020-06-02T19:02:33.000Z","2020-06-02T19:02:34.000Z","2020-06-02T19:02:35.000Z","2020-06-02T19:02:36.000Z","2020-06-02T19:02:37.000Z","2020-06-02T19:02:38.000Z","2020-06-02T19:02:39.000Z","2020-06-02T19:02:40.000Z","2020-06-02T19:02:41.000Z","2020-06-02T19:02:42.000Z","2020-06-02T19:02:43.000Z","2020-06-02T19:02:44.000Z","2020-06-02T19:02:45.000Z","2020-06-02T19:02:46.000Z","2020-06-02T19:02:47.000Z","2020-06-02T19:02:48.000Z","2020-06-02T19:02:49.000Z","2020-06-02T19:02:50.000Z","2020-06-02T19:02:51.000Z","2020-06-02T19:02:52.000Z","2020-06-02T19:02:53.000Z","2020-06-02T19:02:54.000Z","2020-06-02T19:02:55.000Z","2020-06-02T19:02:56.000Z","2020-06-02T19:02:57.000Z","2020-06-02T19:02:58.000Z","2020-06-02T19:02:59.000Z","2020-06-02T19:03:00.000Z","2020-06-02T19:03:01.000Z","2020-06-02T19:03:02.000Z","2020-06-02T19:03:03.000Z","2020-06-02T19:03:04.000Z","2020-06-02T19:03:05.000Z","2020-06-02T19:03:06.000Z","2020-06-02T19:03:07.000Z","2020-06-02T19:03:08.000Z","2020-06-02T19:03:09.000Z","2020-06-02T19:03:10.000Z","2020-06-02T19:03:11.000Z","2020-06-02T19:03:12.000Z","2020-06-02T19:03:13.000Z","2020-06-02T19:03:14.000Z","2020-06-02T19:03:15.000Z","2020-06-02T19:03:16.000Z","2020-06-02T19:03:17.000Z","2020-06-02T19:03:18.000Z","2020-06-02T19:03:19.000Z","2020-06-02T19:03:20.000Z","2020-06-02T19:03:21.000Z","2020-06-02T19:03:22.000Z","2020-06-02T19:03:23.000Z","2020-06-02T19:03:24.000Z","2020-06-02T19:03:25.000Z","2020-06-02T19:03:26.000Z","2020-06-02T19:03:27.000Z","2020-06-02T19:03:28.000Z","2020-06-0
2T19:03:29.000Z","2020-06-02T19:03:30.000Z","2020-06-02T19:03:31.000Z","2020-06-02T19:03:32.000Z","2020-06-02T19:03:33.000Z","2020-06-02T19:03:34.000Z","2020-06-02T19:03:35.000Z","2020-06-02T19:03:36.000Z","2020-06-02T19:03:37.000Z","2020-06-02T19:03:38.000Z","2020-06-02T19:03:39.000Z","2020-06-02T19:03:40.000Z","2020-06-02T19:03:41.000Z","2020-06-02T19:03:42.000Z","2020-06-02T19:03:43.000Z","2020-06-02T19:03:44.000Z","2020-06-02T19:03:45.000Z","2020-06-02T19:03:46.000Z","2020-06-02T19:03:47.000Z","2020-06-02T19:03:48.000Z","2020-06-02T19:03:49.000Z","2020-06-02T19:03:50.000Z","2020-06-02T19:03:51.000Z","2020-06-02T19:03:52.000Z","2020-06-02T19:03:53.000Z","2020-06-02T19:03:54.000Z","2020-06-02T19:03:55.000Z","2020-06-02T19:03:56.000Z","2020-06-02T19:03:57.000Z","2020-06-02T19:03:58.000Z","2020-06-02T19:03:59.000Z","2020-06-02T19:04:00.000Z","2020-06-02T19:04:01.000Z","2020-06-02T19:04:02.000Z","2020-06-02T19:04:03.000Z","2020-06-02T19:04:04.000Z","2020-06-02T19:04:05.000Z","2020-06-02T19:04:06.000Z","2020-06-02T19:04:07.000Z","2020-06-02T19:04:08.000Z","2020-06-02T19:04:09.000Z","2020-06-02T19:04:10.000Z","2020-06-02T19:04:11.000Z","2020-06-02T19:04:12.000Z","2020-06-02T19:04:13.000Z","2020-06-02T19:04:14.000Z","2020-06-02T19:04:15.000Z","2020-06-02T19:04:16.000Z","2020-06-02T19:04:17.000Z","2020-06-02T19:04:18.000Z","2020-06-02T19:04:19.000Z","2020-06-02T19:04:20.000Z","2020-06-02T19:04:21.000Z","2020-06-02T19:04:22.000Z","2020-06-02T19:04:23.000Z","2020-06-02T19:04:24.000Z","2020-06-02T19:04:25.000Z","2020-06-02T19:04:26.000Z","2020-06-02T19:04:27.000Z","2020-06-02T19:04:28.000Z","2020-06-02T19:04:29.000Z","2020-06-02T19:04:30.000Z","2020-06-02T19:04:31.000Z","2020-06-02T19:04:32.000Z","2020-06-02T19:04:33.000Z","2020-06-02T19:04:34.000Z","2020-06-02T19:04:35.000Z","2020-06-02T19:04:36.000Z","2020-06-02T19:04:37.000Z","2020-06-02T19:04:38.000Z","2020-06-02T19:04:39.000Z","2020-06-02T19:04:40.000Z","2020-06-02T19:04:41.000Z","2020-06-02T19:04:42.000Z","2020-06-02T
19:04:43.000Z","2020-06-02T19:04:44.000Z","2020-06-02T19:04:45.000Z","2020-06-02T19:04:46.000Z","2020-06-02T19:04:47.000Z","2020-06-02T19:04:48.000Z","2020-06-02T19:04:49.000Z","2020-06-02T19:04:50.000Z","2020-06-02T19:04:51.000Z","2020-06-02T19:04:52.000Z","2020-06-02T19:04:53.000Z","2020-06-02T19:04:54.000Z","2020-06-02T19:04:55.000Z","2020-06-02T19:04:56.000Z","2020-06-02T19:04:57.000Z","2020-06-02T19:04:58.000Z","2020-06-02T19:04:59.000Z","2020-06-02T19:05:00.000Z","2020-06-02T19:05:01.000Z","2020-06-02T19:05:02.000Z","2020-06-02T19:05:03.000Z","2020-06-02T19:05:04.000Z","2020-06-02T19:05:05.000Z","2020-06-02T19:05:06.000Z","2020-06-02T19:05:07.000Z","2020-06-02T19:05:08.000Z","2020-06-02T19:05:09.000Z","2020-06-02T19:05:10.000Z","2020-06-02T19:05:11.000Z","2020-06-02T19:05:12.000Z","2020-06-02T19:05:13.000Z","2020-06-02T19:05:14.000Z","2020-06-02T19:05:15.000Z","2020-06-02T19:05:16.000Z","2020-06-02T19:05:17.000Z","2020-06-02T19:05:18.000Z","2020-06-02T19:05:19.000Z","2020-06-02T19:05:20.000Z","2020-06-02T19:05:21.000Z","2020-06-02T19:05:22.000Z","2020-06-02T19:05:23.000Z","2020-06-02T19:05:24.000Z","2020-06-02T19:05:25.000Z","2020-06-02T19:05:26.000Z","2020-06-02T19:05:27.000Z","2020-06-02T19:05:28.000Z","2020-06-02T19:05:29.000Z","2020-06-02T19:05:30.000Z","2020-06-02T19:05:31.000Z","2020-06-02T19:05:32.000Z","2020-06-02T19:05:33.000Z","2020-06-02T19:05:34.000Z","2020-06-02T19:05:35.000Z","2020-06-02T19:05:36.000Z","2020-06-02T19:05:37.000Z","2020-06-02T19:05:38.000Z","2020-06-02T19:05:39.000Z","2020-06-02T19:05:40.000Z","2020-06-02T19:05:41.000Z","2020-06-02T19:05:42.000Z","2020-06-02T19:05:43.000Z","2020-06-02T19:05:44.000Z","2020-06-02T19:05:45.000Z","2020-06-02T19:05:46.000Z","2020-06-02T19:05:47.000Z","2020-06-02T19:05:48.000Z","2020-06-02T19:05:49.000Z","2020-06-02T19:05:50.000Z","2020-06-02T19:05:51.000Z","2020-06-02T19:05:52.000Z","2020-06-02T19:05:53.000Z","2020-06-02T19:05:54.000Z","2020-06-02T19:05:55.000Z","2020-06-02T19:05:56.000Z","2020-06-02T19
:05:57.000Z","2020-06-02T19:05:58.000Z","2020-06-02T19:05:59.000Z","2020-06-02T19:06:00.000Z","2020-06-02T19:06:01.000Z","2020-06-02T19:06:02.000Z","2020-06-02T19:06:03.000Z","2020-06-02T19:06:04.000Z","2020-06-02T19:06:05.000Z","2020-06-02T19:06:06.000Z","2020-06-02T19:06:07.000Z","2020-06-02T19:06:08.000Z","2020-06-02T19:06:09.000Z","2020-06-02T19:06:10.000Z","2020-06-02T19:06:11.000Z","2020-06-02T19:06:12.000Z","2020-06-02T19:06:13.000Z","2020-06-02T19:06:14.000Z","2020-06-02T19:06:15.000Z","2020-06-02T19:06:16.000Z","2020-06-02T19:06:17.000Z","2020-06-02T19:06:18.000Z","2020-06-02T19:06:19.000Z","2020-06-02T19:06:20.000Z","2020-06-02T19:06:21.000Z","2020-06-02T19:06:22.000Z","2020-06-02T19:06:23.000Z","2020-06-02T19:06:24.000Z","2020-06-02T19:06:25.000Z","2020-06-02T19:06:26.000Z","2020-06-02T19:06:27.000Z","2020-06-02T19:06:28.000Z","2020-06-02T19:06:29.000Z","2020-06-02T19:06:30.000Z","2020-06-02T19:06:31.000Z","2020-06-02T19:06:32.000Z","2020-06-02T19:06:33.000Z","2020-06-02T19:06:34.000Z","2020-06-02T19:06:35.000Z","2020-06-02T19:06:36.000Z","2020-06-02T19:06:37.000Z","2020-06-02T19:06:38.000Z","2020-06-02T19:06:39.000Z","2020-06-02T19:06:40.000Z","2020-06-02T19:06:41.000Z","2020-06-02T19:06:42.000Z","2020-06-02T19:06:43.000Z","2020-06-02T19:06:44.000Z","2020-06-02T19:06:45.000Z","2020-06-02T19:06:46.000Z","2020-06-02T19:06:47.000Z","2020-06-02T19:06:48.000Z","2020-06-02T19:06:49.000Z","2020-06-02T19:06:50.000Z","2020-06-02T19:06:51.000Z","2020-06-02T19:06:52.000Z","2020-06-02T19:06:53.000Z","2020-06-02T19:06:54.000Z","2020-06-02T19:06:55.000Z","2020-06-02T19:06:56.000Z","2020-06-02T19:06:57.000Z","2020-06-02T19:06:58.000Z","2020-06-02T19:06:59.000Z","2020-06-02T19:07:00.000Z","2020-06-02T19:07:01.000Z","2020-06-02T19:07:02.000Z","2020-06-02T19:07:03.000Z","2020-06-02T19:07:04.000Z","2020-06-02T19:07:05.000Z","2020-06-02T19:07:06.000Z","2020-06-02T19:07:07.000Z","2020-06-02T19:07:08.000Z","2020-06-02T19:07:09.000Z","2020-06-02T19:07:10.000Z","2020-06-02T19:0
7:11.000Z","2020-06-02T19:07:12.000Z","2020-06-02T19:07:13.000Z","2020-06-02T19:07:14.000Z","2020-06-02T19:07:15.000Z","2020-06-02T19:07:16.000Z","2020-06-02T19:07:17.000Z","2020-06-02T19:07:18.000Z","2020-06-02T19:07:19.000Z","2020-06-02T19:07:20.000Z","2020-06-02T19:07:21.000Z","2020-06-02T19:07:22.000Z","2020-06-02T19:07:23.000Z","2020-06-02T19:07:24.000Z","2020-06-02T19:07:25.000Z","2020-06-02T19:07:26.000Z","2020-06-02T19:07:27.000Z","2020-06-02T19:07:28.000Z","2020-06-02T19:07:29.000Z","2020-06-02T19:07:30.000Z","2020-06-02T19:07:31.000Z","2020-06-02T19:07:32.000Z","2020-06-02T19:07:33.000Z","2020-06-02T19:07:34.000Z","2020-06-02T19:07:35.000Z","2020-06-02T19:07:36.000Z","2020-06-02T19:07:37.000Z","2020-06-02T19:07:38.000Z","2020-06-02T19:07:39.000Z","2020-06-02T19:07:40.000Z","2020-06-02T19:07:41.000Z","2020-06-02T19:07:42.000Z","2020-06-02T19:07:43.000Z","2020-06-02T19:07:44.000Z","2020-06-02T19:07:45.000Z","2020-06-02T19:07:46.000Z","2020-06-02T19:07:47.000Z","2020-06-02T19:07:48.000Z","2020-06-02T19:07:49.000Z","2020-06-02T19:07:50.000Z","2020-06-02T19:07:51.000Z","2020-06-02T19:07:52.000Z","2020-06-02T19:07:53.000Z","2020-06-02T19:07:54.000Z","2020-06-02T19:07:55.000Z","2020-06-02T19:07:56.000Z","2020-06-02T19:07:57.000Z","2020-06-02T19:07:58.000Z","2020-06-02T19:07:59.000Z","2020-06-02T19:08:00.000Z","2020-06-02T19:08:01.000Z","2020-06-02T19:08:02.000Z","2020-06-02T19:08:03.000Z","2020-06-02T19:08:04.000Z","2020-06-02T19:08:05.000Z","2020-06-02T19:08:06.000Z","2020-06-02T19:08:07.000Z","2020-06-02T19:08:08.000Z","2020-06-02T19:08:09.000Z","2020-06-02T19:08:10.000Z","2020-06-02T19:08:11.000Z","2020-06-02T19:08:12.000Z","2020-06-02T19:08:13.000Z","2020-06-02T19:08:14.000Z","2020-06-02T19:08:15.000Z","2020-06-02T19:08:16.000Z","2020-06-02T19:08:17.000Z","2020-06-02T19:08:18.000Z","2020-06-02T19:08:19.000Z","2020-06-02T19:08:20.000Z","2020-06-02T19:08:21.000Z","2020-06-02T19:08:22.000Z","2020-06-02T19:08:23.000Z","2020-06-02T19:08:24.000Z","2020-06-02T19:08:
[Output truncated: per-second ``time`` coordinate values from ``2020-06-02T19:08:25.000Z`` through ``2020-06-02T19:30:38.000Z`` at a 1 s sample interval.]
"2020-06-02T19:30:39.000Z","2020-06-02T19:30:40.000Z","2020-06-02T19:30:41.000Z","2020-06-02T19:30:42.000Z","2020-06-02T19:30:43.000Z","2020-06-02T19:30:44.000Z","2020-06-02T19:30:45.000Z","2020-06-02T19:30:46.000Z","2020-06-02T19:30:47.000Z","2020-06-02T19:30:48.000Z","2020-06-02T19:30:49.000Z","2020-06-02T19:30:50.000Z","2020-06-02T19:30:51.000Z","2020-06-02T19:30:52.000Z","2020-06-02T19:30:53.000Z","2020-06-02T19:30:54.000Z","2020-06-02T19:30:55.000Z","2020-06-02T19:30:56.000Z","2020-06-02T19:30:57.000Z","2020-06-02T19:30:58.000Z","2020-06-02T19:30:59.000Z","2020-06-02T19:31:00.000Z","2020-06-02T19:31:01.000Z","2020-06-02T19:31:02.000Z","2020-06-02T19:31:03.000Z","2020-06-02T19:31:04.000Z","2020-06-02T19:31:05.000Z","2020-06-02T19:31:06.000Z","2020-06-02T19:31:07.000Z","2020-06-02T19:31:08.000Z","2020-06-02T19:31:09.000Z","2020-06-02T19:31:10.000Z","2020-06-02T19:31:11.000Z","2020-06-02T19:31:12.000Z","2020-06-02T19:31:13.000Z","2020-06-02T19:31:14.000Z","2020-06-02T19:31:15.000Z","2020-06-02T19:31:16.000Z","2020-06-02T19:31:17.000Z","2020-06-02T19:31:18.000Z","2020-06-02T19:31:19.000Z","2020-06-02T19:31:20.000Z","2020-06-02T19:31:21.000Z","2020-06-02T19:31:22.000Z","2020-06-02T19:31:23.000Z","2020-06-02T19:31:24.000Z","2020-06-02T19:31:25.000Z","2020-06-02T19:31:26.000Z","2020-06-02T19:31:27.000Z","2020-06-02T19:31:28.000Z","2020-06-02T19:31:29.000Z","2020-06-02T19:31:30.000Z","2020-06-02T19:31:31.000Z","2020-06-02T19:31:32.000Z","2020-06-02T19:31:33.000Z","2020-06-02T19:31:34.000Z","2020-06-02T19:31:35.000Z","2020-06-02T19:31:36.000Z","2020-06-02T19:31:37.000Z","2020-06-02T19:31:38.000Z","2020-06-02T19:31:39.000Z","2020-06-02T19:31:40.000Z","2020-06-02T19:31:41.000Z","2020-06-02T19:31:42.000Z","2020-06-02T19:31:43.000Z","2020-06-02T19:31:44.000Z","2020-06-02T19:31:45.000Z","2020-06-02T19:31:46.000Z","2020-06-02T19:31:47.000Z","2020-06-02T19:31:48.000Z","2020-06-02T19:31:49.000Z","2020-06-02T19:31:50.000Z","2020-06-02T19:31:51.000Z","2020-06-02T19:31:52.000Z","2
020-06-02T19:31:53.000Z","2020-06-02T19:31:54.000Z","2020-06-02T19:31:55.000Z","2020-06-02T19:31:56.000Z","2020-06-02T19:31:57.000Z","2020-06-02T19:31:58.000Z","2020-06-02T19:31:59.000Z","2020-06-02T19:32:00.000Z","2020-06-02T19:32:01.000Z","2020-06-02T19:32:02.000Z","2020-06-02T19:32:03.000Z","2020-06-02T19:32:04.000Z","2020-06-02T19:32:05.000Z","2020-06-02T19:32:06.000Z","2020-06-02T19:32:07.000Z","2020-06-02T19:32:08.000Z","2020-06-02T19:32:09.000Z","2020-06-02T19:32:10.000Z","2020-06-02T19:32:11.000Z","2020-06-02T19:32:12.000Z","2020-06-02T19:32:13.000Z","2020-06-02T19:32:14.000Z","2020-06-02T19:32:15.000Z","2020-06-02T19:32:16.000Z","2020-06-02T19:32:17.000Z","2020-06-02T19:32:18.000Z","2020-06-02T19:32:19.000Z","2020-06-02T19:32:20.000Z","2020-06-02T19:32:21.000Z","2020-06-02T19:32:22.000Z","2020-06-02T19:32:23.000Z","2020-06-02T19:32:24.000Z","2020-06-02T19:32:25.000Z","2020-06-02T19:32:26.000Z","2020-06-02T19:32:27.000Z","2020-06-02T19:32:28.000Z","2020-06-02T19:32:29.000Z","2020-06-02T19:32:30.000Z","2020-06-02T19:32:31.000Z","2020-06-02T19:32:32.000Z","2020-06-02T19:32:33.000Z","2020-06-02T19:32:34.000Z","2020-06-02T19:32:35.000Z","2020-06-02T19:32:36.000Z","2020-06-02T19:32:37.000Z","2020-06-02T19:32:38.000Z","2020-06-02T19:32:39.000Z","2020-06-02T19:32:40.000Z","2020-06-02T19:32:41.000Z","2020-06-02T19:32:42.000Z","2020-06-02T19:32:43.000Z","2020-06-02T19:32:44.000Z","2020-06-02T19:32:45.000Z","2020-06-02T19:32:46.000Z","2020-06-02T19:32:47.000Z","2020-06-02T19:32:48.000Z","2020-06-02T19:32:49.000Z","2020-06-02T19:32:50.000Z","2020-06-02T19:32:51.000Z","2020-06-02T19:32:52.000Z","2020-06-02T19:32:53.000Z","2020-06-02T19:32:54.000Z","2020-06-02T19:32:55.000Z","2020-06-02T19:32:56.000Z","2020-06-02T19:32:57.000Z","2020-06-02T19:32:58.000Z","2020-06-02T19:32:59.000Z","2020-06-02T19:33:00.000Z","2020-06-02T19:33:01.000Z","2020-06-02T19:33:02.000Z","2020-06-02T19:33:03.000Z","2020-06-02T19:33:04.000Z","2020-06-02T19:33:05.000Z","2020-06-02T19:33:06.000Z","202
0-06-02T19:33:07.000Z","2020-06-02T19:33:08.000Z","2020-06-02T19:33:09.000Z","2020-06-02T19:33:10.000Z","2020-06-02T19:33:11.000Z","2020-06-02T19:33:12.000Z","2020-06-02T19:33:13.000Z","2020-06-02T19:33:14.000Z","2020-06-02T19:33:15.000Z","2020-06-02T19:33:16.000Z","2020-06-02T19:33:17.000Z","2020-06-02T19:33:18.000Z","2020-06-02T19:33:19.000Z","2020-06-02T19:33:20.000Z","2020-06-02T19:33:21.000Z","2020-06-02T19:33:22.000Z","2020-06-02T19:33:23.000Z","2020-06-02T19:33:24.000Z","2020-06-02T19:33:25.000Z","2020-06-02T19:33:26.000Z","2020-06-02T19:33:27.000Z","2020-06-02T19:33:28.000Z","2020-06-02T19:33:29.000Z","2020-06-02T19:33:30.000Z","2020-06-02T19:33:31.000Z","2020-06-02T19:33:32.000Z","2020-06-02T19:33:33.000Z","2020-06-02T19:33:34.000Z","2020-06-02T19:33:35.000Z","2020-06-02T19:33:36.000Z","2020-06-02T19:33:37.000Z","2020-06-02T19:33:38.000Z","2020-06-02T19:33:39.000Z","2020-06-02T19:33:40.000Z","2020-06-02T19:33:41.000Z","2020-06-02T19:33:42.000Z","2020-06-02T19:33:43.000Z","2020-06-02T19:33:44.000Z","2020-06-02T19:33:45.000Z","2020-06-02T19:33:46.000Z","2020-06-02T19:33:47.000Z","2020-06-02T19:33:48.000Z","2020-06-02T19:33:49.000Z","2020-06-02T19:33:50.000Z","2020-06-02T19:33:51.000Z","2020-06-02T19:33:52.000Z","2020-06-02T19:33:53.000Z","2020-06-02T19:33:54.000Z","2020-06-02T19:33:55.000Z","2020-06-02T19:33:56.000Z","2020-06-02T19:33:57.000Z","2020-06-02T19:33:58.000Z","2020-06-02T19:33:59.000Z","2020-06-02T19:34:00.000Z","2020-06-02T19:34:01.000Z","2020-06-02T19:34:02.000Z","2020-06-02T19:34:03.000Z","2020-06-02T19:34:04.000Z","2020-06-02T19:34:05.000Z","2020-06-02T19:34:06.000Z","2020-06-02T19:34:07.000Z","2020-06-02T19:34:08.000Z","2020-06-02T19:34:09.000Z","2020-06-02T19:34:10.000Z","2020-06-02T19:34:11.000Z","2020-06-02T19:34:12.000Z","2020-06-02T19:34:13.000Z","2020-06-02T19:34:14.000Z","2020-06-02T19:34:15.000Z","2020-06-02T19:34:16.000Z","2020-06-02T19:34:17.000Z","2020-06-02T19:34:18.000Z","2020-06-02T19:34:19.000Z","2020-06-02T19:34:20.000Z","2020-
06-02T19:34:21.000Z","2020-06-02T19:34:22.000Z","2020-06-02T19:34:23.000Z","2020-06-02T19:34:24.000Z","2020-06-02T19:34:25.000Z","2020-06-02T19:34:26.000Z","2020-06-02T19:34:27.000Z","2020-06-02T19:34:28.000Z","2020-06-02T19:34:29.000Z","2020-06-02T19:34:30.000Z","2020-06-02T19:34:31.000Z","2020-06-02T19:34:32.000Z","2020-06-02T19:34:33.000Z","2020-06-02T19:34:34.000Z","2020-06-02T19:34:35.000Z","2020-06-02T19:34:36.000Z","2020-06-02T19:34:37.000Z","2020-06-02T19:34:38.000Z","2020-06-02T19:34:39.000Z","2020-06-02T19:34:40.000Z","2020-06-02T19:34:41.000Z","2020-06-02T19:34:42.000Z","2020-06-02T19:34:43.000Z","2020-06-02T19:34:44.000Z","2020-06-02T19:34:45.000Z","2020-06-02T19:34:46.000Z","2020-06-02T19:34:47.000Z","2020-06-02T19:34:48.000Z","2020-06-02T19:34:49.000Z","2020-06-02T19:34:50.000Z","2020-06-02T19:34:51.000Z","2020-06-02T19:34:52.000Z","2020-06-02T19:34:53.000Z","2020-06-02T19:34:54.000Z","2020-06-02T19:34:55.000Z","2020-06-02T19:34:56.000Z","2020-06-02T19:34:57.000Z","2020-06-02T19:34:58.000Z","2020-06-02T19:34:59.000Z","2020-06-02T19:35:00.000Z","2020-06-02T19:35:01.000Z","2020-06-02T19:35:02.000Z","2020-06-02T19:35:03.000Z","2020-06-02T19:35:04.000Z","2020-06-02T19:35:05.000Z","2020-06-02T19:35:06.000Z","2020-06-02T19:35:07.000Z","2020-06-02T19:35:08.000Z","2020-06-02T19:35:09.000Z","2020-06-02T19:35:10.000Z","2020-06-02T19:35:11.000Z","2020-06-02T19:35:12.000Z","2020-06-02T19:35:13.000Z","2020-06-02T19:35:14.000Z","2020-06-02T19:35:15.000Z","2020-06-02T19:35:16.000Z","2020-06-02T19:35:17.000Z","2020-06-02T19:35:18.000Z","2020-06-02T19:35:19.000Z","2020-06-02T19:35:20.000Z","2020-06-02T19:35:21.000Z","2020-06-02T19:35:22.000Z","2020-06-02T19:35:23.000Z","2020-06-02T19:35:24.000Z","2020-06-02T19:35:25.000Z","2020-06-02T19:35:26.000Z","2020-06-02T19:35:27.000Z","2020-06-02T19:35:28.000Z","2020-06-02T19:35:29.000Z","2020-06-02T19:35:30.000Z","2020-06-02T19:35:31.000Z","2020-06-02T19:35:32.000Z","2020-06-02T19:35:33.000Z","2020-06-02T19:35:34.000Z","2020-06
-02T19:35:35.000Z","2020-06-02T19:35:36.000Z","2020-06-02T19:35:37.000Z","2020-06-02T19:35:38.000Z","2020-06-02T19:35:39.000Z","2020-06-02T19:35:40.000Z","2020-06-02T19:35:41.000Z","2020-06-02T19:35:42.000Z","2020-06-02T19:35:43.000Z","2020-06-02T19:35:44.000Z","2020-06-02T19:35:45.000Z","2020-06-02T19:35:46.000Z","2020-06-02T19:35:47.000Z","2020-06-02T19:35:48.000Z","2020-06-02T19:35:49.000Z","2020-06-02T19:35:50.000Z","2020-06-02T19:35:51.000Z","2020-06-02T19:35:52.000Z","2020-06-02T19:35:53.000Z","2020-06-02T19:35:54.000Z","2020-06-02T19:35:55.000Z","2020-06-02T19:35:56.000Z","2020-06-02T19:35:57.000Z","2020-06-02T19:35:58.000Z","2020-06-02T19:35:59.000Z","2020-06-02T19:36:00.000Z","2020-06-02T19:36:01.000Z","2020-06-02T19:36:02.000Z","2020-06-02T19:36:03.000Z","2020-06-02T19:36:04.000Z","2020-06-02T19:36:05.000Z","2020-06-02T19:36:06.000Z","2020-06-02T19:36:07.000Z","2020-06-02T19:36:08.000Z","2020-06-02T19:36:09.000Z","2020-06-02T19:36:10.000Z","2020-06-02T19:36:11.000Z","2020-06-02T19:36:12.000Z","2020-06-02T19:36:13.000Z","2020-06-02T19:36:14.000Z","2020-06-02T19:36:15.000Z","2020-06-02T19:36:16.000Z","2020-06-02T19:36:17.000Z","2020-06-02T19:36:18.000Z","2020-06-02T19:36:19.000Z","2020-06-02T19:36:20.000Z","2020-06-02T19:36:21.000Z","2020-06-02T19:36:22.000Z","2020-06-02T19:36:23.000Z","2020-06-02T19:36:24.000Z","2020-06-02T19:36:25.000Z","2020-06-02T19:36:26.000Z","2020-06-02T19:36:27.000Z","2020-06-02T19:36:28.000Z","2020-06-02T19:36:29.000Z","2020-06-02T19:36:30.000Z","2020-06-02T19:36:31.000Z","2020-06-02T19:36:32.000Z","2020-06-02T19:36:33.000Z","2020-06-02T19:36:34.000Z","2020-06-02T19:36:35.000Z","2020-06-02T19:36:36.000Z","2020-06-02T19:36:37.000Z","2020-06-02T19:36:38.000Z","2020-06-02T19:36:39.000Z","2020-06-02T19:36:40.000Z","2020-06-02T19:36:41.000Z","2020-06-02T19:36:42.000Z","2020-06-02T19:36:43.000Z","2020-06-02T19:36:44.000Z","2020-06-02T19:36:45.000Z","2020-06-02T19:36:46.000Z","2020-06-02T19:36:47.000Z","2020-06-02T19:36:48.000Z","2020-06-0
2T19:36:49.000Z","2020-06-02T19:36:50.000Z","2020-06-02T19:36:51.000Z","2020-06-02T19:36:52.000Z","2020-06-02T19:36:53.000Z","2020-06-02T19:36:54.000Z","2020-06-02T19:36:55.000Z","2020-06-02T19:36:56.000Z","2020-06-02T19:36:57.000Z","2020-06-02T19:36:58.000Z","2020-06-02T19:36:59.000Z","2020-06-02T19:37:00.000Z","2020-06-02T19:37:01.000Z","2020-06-02T19:37:02.000Z","2020-06-02T19:37:03.000Z","2020-06-02T19:37:04.000Z","2020-06-02T19:37:05.000Z","2020-06-02T19:37:06.000Z","2020-06-02T19:37:07.000Z","2020-06-02T19:37:08.000Z","2020-06-02T19:37:09.000Z","2020-06-02T19:37:10.000Z","2020-06-02T19:37:11.000Z","2020-06-02T19:37:12.000Z","2020-06-02T19:37:13.000Z","2020-06-02T19:37:14.000Z","2020-06-02T19:37:15.000Z","2020-06-02T19:37:16.000Z","2020-06-02T19:37:17.000Z","2020-06-02T19:37:18.000Z","2020-06-02T19:37:19.000Z","2020-06-02T19:37:20.000Z","2020-06-02T19:37:21.000Z","2020-06-02T19:37:22.000Z","2020-06-02T19:37:23.000Z","2020-06-02T19:37:24.000Z","2020-06-02T19:37:25.000Z","2020-06-02T19:37:26.000Z","2020-06-02T19:37:27.000Z","2020-06-02T19:37:28.000Z","2020-06-02T19:37:29.000Z","2020-06-02T19:37:30.000Z","2020-06-02T19:37:31.000Z","2020-06-02T19:37:32.000Z","2020-06-02T19:37:33.000Z","2020-06-02T19:37:34.000Z","2020-06-02T19:37:35.000Z","2020-06-02T19:37:36.000Z","2020-06-02T19:37:37.000Z","2020-06-02T19:37:38.000Z","2020-06-02T19:37:39.000Z","2020-06-02T19:37:40.000Z","2020-06-02T19:37:41.000Z","2020-06-02T19:37:42.000Z","2020-06-02T19:37:43.000Z","2020-06-02T19:37:44.000Z","2020-06-02T19:37:45.000Z","2020-06-02T19:37:46.000Z","2020-06-02T19:37:47.000Z","2020-06-02T19:37:48.000Z","2020-06-02T19:37:49.000Z","2020-06-02T19:37:50.000Z","2020-06-02T19:37:51.000Z","2020-06-02T19:37:52.000Z","2020-06-02T19:37:53.000Z","2020-06-02T19:37:54.000Z","2020-06-02T19:37:55.000Z","2020-06-02T19:37:56.000Z","2020-06-02T19:37:57.000Z","2020-06-02T19:37:58.000Z","2020-06-02T19:37:59.000Z","2020-06-02T19:38:00.000Z","2020-06-02T19:38:01.000Z","2020-06-02T19:38:02.000Z","2020-06-02T
19:38:03.000Z","2020-06-02T19:38:04.000Z","2020-06-02T19:38:05.000Z","2020-06-02T19:38:06.000Z","2020-06-02T19:38:07.000Z","2020-06-02T19:38:08.000Z","2020-06-02T19:38:09.000Z","2020-06-02T19:38:10.000Z","2020-06-02T19:38:11.000Z","2020-06-02T19:38:12.000Z","2020-06-02T19:38:13.000Z","2020-06-02T19:38:14.000Z","2020-06-02T19:38:15.000Z","2020-06-02T19:38:16.000Z","2020-06-02T19:38:17.000Z","2020-06-02T19:38:18.000Z","2020-06-02T19:38:19.000Z","2020-06-02T19:38:20.000Z","2020-06-02T19:38:21.000Z","2020-06-02T19:38:22.000Z","2020-06-02T19:38:23.000Z","2020-06-02T19:38:24.000Z","2020-06-02T19:38:25.000Z","2020-06-02T19:38:26.000Z","2020-06-02T19:38:27.000Z","2020-06-02T19:38:28.000Z","2020-06-02T19:38:29.000Z","2020-06-02T19:38:30.000Z","2020-06-02T19:38:31.000Z","2020-06-02T19:38:32.000Z","2020-06-02T19:38:33.000Z","2020-06-02T19:38:34.000Z","2020-06-02T19:38:35.000Z","2020-06-02T19:38:36.000Z","2020-06-02T19:38:37.000Z","2020-06-02T19:38:38.000Z","2020-06-02T19:38:39.000Z","2020-06-02T19:38:40.000Z","2020-06-02T19:38:41.000Z","2020-06-02T19:38:42.000Z","2020-06-02T19:38:43.000Z","2020-06-02T19:38:44.000Z","2020-06-02T19:38:45.000Z","2020-06-02T19:38:46.000Z","2020-06-02T19:38:47.000Z","2020-06-02T19:38:48.000Z","2020-06-02T19:38:49.000Z","2020-06-02T19:38:50.000Z","2020-06-02T19:38:51.000Z","2020-06-02T19:38:52.000Z","2020-06-02T19:38:53.000Z","2020-06-02T19:38:54.000Z","2020-06-02T19:38:55.000Z","2020-06-02T19:38:56.000Z","2020-06-02T19:38:57.000Z","2020-06-02T19:38:58.000Z","2020-06-02T19:38:59.000Z","2020-06-02T19:39:00.000Z","2020-06-02T19:39:01.000Z","2020-06-02T19:39:02.000Z","2020-06-02T19:39:03.000Z","2020-06-02T19:39:04.000Z","2020-06-02T19:39:05.000Z","2020-06-02T19:39:06.000Z","2020-06-02T19:39:07.000Z","2020-06-02T19:39:08.000Z","2020-06-02T19:39:09.000Z","2020-06-02T19:39:10.000Z","2020-06-02T19:39:11.000Z","2020-06-02T19:39:12.000Z","2020-06-02T19:39:13.000Z","2020-06-02T19:39:14.000Z","2020-06-02T19:39:15.000Z","2020-06-02T19:39:16.000Z","2020-06-02T19
:39:17.000Z","2020-06-02T19:39:18.000Z","2020-06-02T19:39:19.000Z","2020-06-02T19:39:20.000Z","2020-06-02T19:39:21.000Z","2020-06-02T19:39:22.000Z","2020-06-02T19:39:23.000Z","2020-06-02T19:39:24.000Z","2020-06-02T19:39:25.000Z","2020-06-02T19:39:26.000Z","2020-06-02T19:39:27.000Z","2020-06-02T19:39:28.000Z","2020-06-02T19:39:29.000Z","2020-06-02T19:39:30.000Z","2020-06-02T19:39:31.000Z","2020-06-02T19:39:32.000Z","2020-06-02T19:39:33.000Z","2020-06-02T19:39:34.000Z","2020-06-02T19:39:35.000Z","2020-06-02T19:39:36.000Z","2020-06-02T19:39:37.000Z","2020-06-02T19:39:38.000Z","2020-06-02T19:39:39.000Z","2020-06-02T19:39:40.000Z","2020-06-02T19:39:41.000Z","2020-06-02T19:39:42.000Z","2020-06-02T19:39:43.000Z","2020-06-02T19:39:44.000Z","2020-06-02T19:39:45.000Z","2020-06-02T19:39:46.000Z","2020-06-02T19:39:47.000Z","2020-06-02T19:39:48.000Z","2020-06-02T19:39:49.000Z","2020-06-02T19:39:50.000Z","2020-06-02T19:39:51.000Z","2020-06-02T19:39:52.000Z","2020-06-02T19:39:53.000Z","2020-06-02T19:39:54.000Z","2020-06-02T19:39:55.000Z","2020-06-02T19:39:56.000Z","2020-06-02T19:39:57.000Z","2020-06-02T19:39:58.000Z","2020-06-02T19:39:59.000Z","2020-06-02T19:40:00.000Z","2020-06-02T19:40:01.000Z","2020-06-02T19:40:02.000Z","2020-06-02T19:40:03.000Z","2020-06-02T19:40:04.000Z","2020-06-02T19:40:05.000Z","2020-06-02T19:40:06.000Z","2020-06-02T19:40:07.000Z","2020-06-02T19:40:08.000Z","2020-06-02T19:40:09.000Z","2020-06-02T19:40:10.000Z","2020-06-02T19:40:11.000Z","2020-06-02T19:40:12.000Z","2020-06-02T19:40:13.000Z","2020-06-02T19:40:14.000Z","2020-06-02T19:40:15.000Z","2020-06-02T19:40:16.000Z","2020-06-02T19:40:17.000Z","2020-06-02T19:40:18.000Z","2020-06-02T19:40:19.000Z","2020-06-02T19:40:20.000Z","2020-06-02T19:40:21.000Z","2020-06-02T19:40:22.000Z","2020-06-02T19:40:23.000Z","2020-06-02T19:40:24.000Z","2020-06-02T19:40:25.000Z","2020-06-02T19:40:26.000Z","2020-06-02T19:40:27.000Z","2020-06-02T19:40:28.000Z","2020-06-02T19:40:29.000Z","2020-06-02T19:40:30.000Z","2020-06-02T19:4
0:31.000Z","2020-06-02T19:40:32.000Z","2020-06-02T19:40:33.000Z","2020-06-02T19:40:34.000Z","2020-06-02T19:40:35.000Z","2020-06-02T19:40:36.000Z","2020-06-02T19:40:37.000Z","2020-06-02T19:40:38.000Z","2020-06-02T19:40:39.000Z","2020-06-02T19:40:40.000Z","2020-06-02T19:40:41.000Z","2020-06-02T19:40:42.000Z","2020-06-02T19:40:43.000Z","2020-06-02T19:40:44.000Z","2020-06-02T19:40:45.000Z","2020-06-02T19:40:46.000Z","2020-06-02T19:40:47.000Z","2020-06-02T19:40:48.000Z","2020-06-02T19:40:49.000Z","2020-06-02T19:40:50.000Z","2020-06-02T19:40:51.000Z","2020-06-02T19:40:52.000Z","2020-06-02T19:40:53.000Z","2020-06-02T19:40:54.000Z","2020-06-02T19:40:55.000Z","2020-06-02T19:40:56.000Z","2020-06-02T19:40:57.000Z","2020-06-02T19:40:58.000Z","2020-06-02T19:40:59.000Z","2020-06-02T19:41:00.000Z","2020-06-02T19:41:01.000Z","2020-06-02T19:41:02.000Z","2020-06-02T19:41:03.000Z","2020-06-02T19:41:04.000Z","2020-06-02T19:41:05.000Z","2020-06-02T19:41:06.000Z","2020-06-02T19:41:07.000Z","2020-06-02T19:41:08.000Z","2020-06-02T19:41:09.000Z","2020-06-02T19:41:10.000Z","2020-06-02T19:41:11.000Z","2020-06-02T19:41:12.000Z","2020-06-02T19:41:13.000Z","2020-06-02T19:41:14.000Z","2020-06-02T19:41:15.000Z","2020-06-02T19:41:16.000Z","2020-06-02T19:41:17.000Z","2020-06-02T19:41:18.000Z","2020-06-02T19:41:19.000Z","2020-06-02T19:41:20.000Z","2020-06-02T19:41:21.000Z","2020-06-02T19:41:22.000Z","2020-06-02T19:41:23.000Z","2020-06-02T19:41:24.000Z","2020-06-02T19:41:25.000Z","2020-06-02T19:41:26.000Z","2020-06-02T19:41:27.000Z","2020-06-02T19:41:28.000Z","2020-06-02T19:41:29.000Z","2020-06-02T19:41:30.000Z","2020-06-02T19:41:31.000Z","2020-06-02T19:41:32.000Z","2020-06-02T19:41:33.000Z","2020-06-02T19:41:34.000Z","2020-06-02T19:41:35.000Z","2020-06-02T19:41:36.000Z","2020-06-02T19:41:37.000Z","2020-06-02T19:41:38.000Z","2020-06-02T19:41:39.000Z","2020-06-02T19:41:40.000Z","2020-06-02T19:41:41.000Z","2020-06-02T19:41:42.000Z","2020-06-02T19:41:43.000Z","2020-06-02T19:41:44.000Z","2020-06-02T19:41:
45.000Z","2020-06-02T19:41:46.000Z","2020-06-02T19:41:47.000Z","2020-06-02T19:41:48.000Z","2020-06-02T19:41:49.000Z","2020-06-02T19:41:50.000Z","2020-06-02T19:41:51.000Z","2020-06-02T19:41:52.000Z","2020-06-02T19:41:53.000Z","2020-06-02T19:41:54.000Z","2020-06-02T19:41:55.000Z","2020-06-02T19:41:56.000Z","2020-06-02T19:41:57.000Z","2020-06-02T19:41:58.000Z","2020-06-02T19:41:59.000Z","2020-06-02T19:42:00.000Z","2020-06-02T19:42:01.000Z","2020-06-02T19:42:02.000Z","2020-06-02T19:42:03.000Z","2020-06-02T19:42:04.000Z","2020-06-02T19:42:05.000Z","2020-06-02T19:42:06.000Z","2020-06-02T19:42:07.000Z","2020-06-02T19:42:08.000Z","2020-06-02T19:42:09.000Z","2020-06-02T19:42:10.000Z","2020-06-02T19:42:11.000Z","2020-06-02T19:42:12.000Z","2020-06-02T19:42:13.000Z","2020-06-02T19:42:14.000Z","2020-06-02T19:42:15.000Z","2020-06-02T19:42:16.000Z","2020-06-02T19:42:17.000Z","2020-06-02T19:42:18.000Z","2020-06-02T19:42:19.000Z","2020-06-02T19:42:20.000Z","2020-06-02T19:42:21.000Z","2020-06-02T19:42:22.000Z","2020-06-02T19:42:23.000Z","2020-06-02T19:42:24.000Z","2020-06-02T19:42:25.000Z","2020-06-02T19:42:26.000Z","2020-06-02T19:42:27.000Z","2020-06-02T19:42:28.000Z","2020-06-02T19:42:29.000Z","2020-06-02T19:42:30.000Z","2020-06-02T19:42:31.000Z","2020-06-02T19:42:32.000Z","2020-06-02T19:42:33.000Z","2020-06-02T19:42:34.000Z","2020-06-02T19:42:35.000Z","2020-06-02T19:42:36.000Z","2020-06-02T19:42:37.000Z","2020-06-02T19:42:38.000Z","2020-06-02T19:42:39.000Z","2020-06-02T19:42:40.000Z","2020-06-02T19:42:41.000Z","2020-06-02T19:42:42.000Z","2020-06-02T19:42:43.000Z","2020-06-02T19:42:44.000Z","2020-06-02T19:42:45.000Z","2020-06-02T19:42:46.000Z","2020-06-02T19:42:47.000Z","2020-06-02T19:42:48.000Z","2020-06-02T19:42:49.000Z","2020-06-02T19:42:50.000Z","2020-06-02T19:42:51.000Z","2020-06-02T19:42:52.000Z","2020-06-02T19:42:53.000Z","2020-06-02T19:42:54.000Z","2020-06-02T19:42:55.000Z","2020-06-02T19:42:56.000Z","2020-06-02T19:42:57.000Z","2020-06-02T19:42:58.000Z","2020-06-02T19:42:59
.000Z","2020-06-02T19:43:00.000Z","2020-06-02T19:43:01.000Z","2020-06-02T19:43:02.000Z","2020-06-02T19:43:03.000Z","2020-06-02T19:43:04.000Z","2020-06-02T19:43:05.000Z","2020-06-02T19:43:06.000Z","2020-06-02T19:43:07.000Z","2020-06-02T19:43:08.000Z","2020-06-02T19:43:09.000Z","2020-06-02T19:43:10.000Z","2020-06-02T19:43:11.000Z","2020-06-02T19:43:12.000Z","2020-06-02T19:43:13.000Z","2020-06-02T19:43:14.000Z","2020-06-02T19:43:15.000Z","2020-06-02T19:43:16.000Z","2020-06-02T19:43:17.000Z","2020-06-02T19:43:18.000Z","2020-06-02T19:43:19.000Z","2020-06-02T19:43:20.000Z","2020-06-02T19:43:21.000Z","2020-06-02T19:43:22.000Z","2020-06-02T19:43:23.000Z","2020-06-02T19:43:24.000Z","2020-06-02T19:43:25.000Z","2020-06-02T19:43:26.000Z","2020-06-02T19:43:27.000Z","2020-06-02T19:43:28.000Z","2020-06-02T19:43:29.000Z","2020-06-02T19:43:30.000Z","2020-06-02T19:43:31.000Z","2020-06-02T19:43:32.000Z","2020-06-02T19:43:33.000Z","2020-06-02T19:43:34.000Z","2020-06-02T19:43:35.000Z","2020-06-02T19:43:36.000Z","2020-06-02T19:43:37.000Z","2020-06-02T19:43:38.000Z","2020-06-02T19:43:39.000Z","2020-06-02T19:43:40.000Z","2020-06-02T19:43:41.000Z","2020-06-02T19:43:42.000Z","2020-06-02T19:43:43.000Z","2020-06-02T19:43:44.000Z","2020-06-02T19:43:45.000Z","2020-06-02T19:43:46.000Z","2020-06-02T19:43:47.000Z","2020-06-02T19:43:48.000Z","2020-06-02T19:43:49.000Z","2020-06-02T19:43:50.000Z","2020-06-02T19:43:51.000Z","2020-06-02T19:43:52.000Z","2020-06-02T19:43:53.000Z","2020-06-02T19:43:54.000Z","2020-06-02T19:43:55.000Z","2020-06-02T19:43:56.000Z","2020-06-02T19:43:57.000Z","2020-06-02T19:43:58.000Z","2020-06-02T19:43:59.000Z","2020-06-02T19:44:00.000Z","2020-06-02T19:44:01.000Z","2020-06-02T19:44:02.000Z","2020-06-02T19:44:03.000Z","2020-06-02T19:44:04.000Z","2020-06-02T19:44:05.000Z","2020-06-02T19:44:06.000Z","2020-06-02T19:44:07.000Z","2020-06-02T19:44:08.000Z","2020-06-02T19:44:09.000Z","2020-06-02T19:44:10.000Z","2020-06-02T19:44:11.000Z","2020-06-02T19:44:12.000Z","2020-06-02T19:44:13.0
00Z","2020-06-02T19:44:14.000Z","2020-06-02T19:44:15.000Z","2020-06-02T19:44:16.000Z","2020-06-02T19:44:17.000Z","2020-06-02T19:44:18.000Z","2020-06-02T19:44:19.000Z","2020-06-02T19:44:20.000Z","2020-06-02T19:44:21.000Z","2020-06-02T19:44:22.000Z","2020-06-02T19:44:23.000Z","2020-06-02T19:44:24.000Z","2020-06-02T19:44:25.000Z","2020-06-02T19:44:26.000Z","2020-06-02T19:44:27.000Z","2020-06-02T19:44:28.000Z","2020-06-02T19:44:29.000Z","2020-06-02T19:44:30.000Z","2020-06-02T19:44:31.000Z","2020-06-02T19:44:32.000Z","2020-06-02T19:44:33.000Z","2020-06-02T19:44:34.000Z","2020-06-02T19:44:35.000Z","2020-06-02T19:44:36.000Z","2020-06-02T19:44:37.000Z","2020-06-02T19:44:38.000Z","2020-06-02T19:44:39.000Z","2020-06-02T19:44:40.000Z","2020-06-02T19:44:41.000Z","2020-06-02T19:44:42.000Z","2020-06-02T19:44:43.000Z","2020-06-02T19:44:44.000Z","2020-06-02T19:44:45.000Z","2020-06-02T19:44:46.000Z","2020-06-02T19:44:47.000Z","2020-06-02T19:44:48.000Z","2020-06-02T19:44:49.000Z","2020-06-02T19:44:50.000Z","2020-06-02T19:44:51.000Z","2020-06-02T19:44:52.000Z","2020-06-02T19:44:53.000Z","2020-06-02T19:44:54.000Z","2020-06-02T19:44:55.000Z","2020-06-02T19:44:56.000Z","2020-06-02T19:44:57.000Z","2020-06-02T19:44:58.000Z","2020-06-02T19:44:59.000Z","2020-06-02T19:45:00.000Z","2020-06-02T19:45:01.000Z","2020-06-02T19:45:02.000Z","2020-06-02T19:45:03.000Z","2020-06-02T19:45:04.000Z","2020-06-02T19:45:05.000Z","2020-06-02T19:45:06.000Z","2020-06-02T19:45:07.000Z","2020-06-02T19:45:08.000Z","2020-06-02T19:45:09.000Z","2020-06-02T19:45:10.000Z","2020-06-02T19:45:11.000Z","2020-06-02T19:45:12.000Z","2020-06-02T19:45:13.000Z","2020-06-02T19:45:14.000Z","2020-06-02T19:45:15.000Z","2020-06-02T19:45:16.000Z","2020-06-02T19:45:17.000Z","2020-06-02T19:45:18.000Z","2020-06-02T19:45:19.000Z","2020-06-02T19:45:20.000Z","2020-06-02T19:45:21.000Z","2020-06-02T19:45:22.000Z","2020-06-02T19:45:23.000Z","2020-06-02T19:45:24.000Z","2020-06-02T19:45:25.000Z","2020-06-02T19:45:26.000Z","2020-06-02T19:45:27.000
Z","2020-06-02T19:45:28.000Z","2020-06-02T19:45:29.000Z","2020-06-02T19:45:30.000Z","2020-06-02T19:45:31.000Z","2020-06-02T19:45:32.000Z","2020-06-02T19:45:33.000Z","2020-06-02T19:45:34.000Z","2020-06-02T19:45:35.000Z","2020-06-02T19:45:36.000Z","2020-06-02T19:45:37.000Z","2020-06-02T19:45:38.000Z","2020-06-02T19:45:39.000Z","2020-06-02T19:45:40.000Z","2020-06-02T19:45:41.000Z","2020-06-02T19:45:42.000Z","2020-06-02T19:45:43.000Z","2020-06-02T19:45:44.000Z","2020-06-02T19:45:45.000Z","2020-06-02T19:45:46.000Z","2020-06-02T19:45:47.000Z","2020-06-02T19:45:48.000Z","2020-06-02T19:45:49.000Z","2020-06-02T19:45:50.000Z","2020-06-02T19:45:51.000Z","2020-06-02T19:45:52.000Z","2020-06-02T19:45:53.000Z","2020-06-02T19:45:54.000Z","2020-06-02T19:45:55.000Z","2020-06-02T19:45:56.000Z","2020-06-02T19:45:57.000Z","2020-06-02T19:45:58.000Z","2020-06-02T19:45:59.000Z","2020-06-02T19:46:00.000Z","2020-06-02T19:46:01.000Z","2020-06-02T19:46:02.000Z","2020-06-02T19:46:03.000Z","2020-06-02T19:46:04.000Z","2020-06-02T19:46:05.000Z","2020-06-02T19:46:06.000Z","2020-06-02T19:46:07.000Z","2020-06-02T19:46:08.000Z","2020-06-02T19:46:09.000Z","2020-06-02T19:46:10.000Z","2020-06-02T19:46:11.000Z","2020-06-02T19:46:12.000Z","2020-06-02T19:46:13.000Z","2020-06-02T19:46:14.000Z","2020-06-02T19:46:15.000Z","2020-06-02T19:46:16.000Z","2020-06-02T19:46:17.000Z","2020-06-02T19:46:18.000Z","2020-06-02T19:46:19.000Z","2020-06-02T19:46:20.000Z","2020-06-02T19:46:21.000Z","2020-06-02T19:46:22.000Z","2020-06-02T19:46:23.000Z","2020-06-02T19:46:24.000Z","2020-06-02T19:46:25.000Z","2020-06-02T19:46:26.000Z","2020-06-02T19:46:27.000Z","2020-06-02T19:46:28.000Z","2020-06-02T19:46:29.000Z","2020-06-02T19:46:30.000Z","2020-06-02T19:46:31.000Z","2020-06-02T19:46:32.000Z","2020-06-02T19:46:33.000Z","2020-06-02T19:46:34.000Z","2020-06-02T19:46:35.000Z","2020-06-02T19:46:36.000Z","2020-06-02T19:46:37.000Z","2020-06-02T19:46:38.000Z","2020-06-02T19:46:39.000Z","2020-06-02T19:46:40.000Z","2020-06-02T19:46:41.000Z"
,"2020-06-02T19:46:42.000Z","2020-06-02T19:46:43.000Z","2020-06-02T19:46:44.000Z","2020-06-02T19:46:45.000Z","2020-06-02T19:46:46.000Z","2020-06-02T19:46:47.000Z","2020-06-02T19:46:48.000Z","2020-06-02T19:46:49.000Z","2020-06-02T19:46:50.000Z","2020-06-02T19:46:51.000Z","2020-06-02T19:46:52.000Z","2020-06-02T19:46:53.000Z","2020-06-02T19:46:54.000Z","2020-06-02T19:46:55.000Z","2020-06-02T19:46:56.000Z","2020-06-02T19:46:57.000Z","2020-06-02T19:46:58.000Z","2020-06-02T19:46:59.000Z","2020-06-02T19:47:00.000Z","2020-06-02T19:47:01.000Z","2020-06-02T19:47:02.000Z","2020-06-02T19:47:03.000Z","2020-06-02T19:47:04.000Z","2020-06-02T19:47:05.000Z","2020-06-02T19:47:06.000Z","2020-06-02T19:47:07.000Z","2020-06-02T19:47:08.000Z","2020-06-02T19:47:09.000Z","2020-06-02T19:47:10.000Z","2020-06-02T19:47:11.000Z","2020-06-02T19:47:12.000Z","2020-06-02T19:47:13.000Z","2020-06-02T19:47:14.000Z","2020-06-02T19:47:15.000Z","2020-06-02T19:47:16.000Z","2020-06-02T19:47:17.000Z","2020-06-02T19:47:18.000Z","2020-06-02T19:47:19.000Z","2020-06-02T19:47:20.000Z","2020-06-02T19:47:21.000Z","2020-06-02T19:47:22.000Z","2020-06-02T19:47:23.000Z","2020-06-02T19:47:24.000Z","2020-06-02T19:47:25.000Z","2020-06-02T19:47:26.000Z","2020-06-02T19:47:27.000Z","2020-06-02T19:47:28.000Z","2020-06-02T19:47:29.000Z","2020-06-02T19:47:30.000Z","2020-06-02T19:47:31.000Z","2020-06-02T19:47:32.000Z","2020-06-02T19:47:33.000Z","2020-06-02T19:47:34.000Z","2020-06-02T19:47:35.000Z","2020-06-02T19:47:36.000Z","2020-06-02T19:47:37.000Z","2020-06-02T19:47:38.000Z","2020-06-02T19:47:39.000Z","2020-06-02T19:47:40.000Z","2020-06-02T19:47:41.000Z","2020-06-02T19:47:42.000Z","2020-06-02T19:47:43.000Z","2020-06-02T19:47:44.000Z","2020-06-02T19:47:45.000Z","2020-06-02T19:47:46.000Z","2020-06-02T19:47:47.000Z","2020-06-02T19:47:48.000Z","2020-06-02T19:47:49.000Z","2020-06-02T19:47:50.000Z","2020-06-02T19:47:51.000Z","2020-06-02T19:47:52.000Z","2020-06-02T19:47:53.000Z","2020-06-02T19:47:54.000Z","2020-06-02T19:47:55.000Z","
[output truncated: time coordinate array of one-second timestamps continuing from 2020-06-02T19:47:56Z]
2T20:10:09.000Z","2020-06-02T20:10:10.000Z","2020-06-02T20:10:11.000Z","2020-06-02T20:10:12.000Z","2020-06-02T20:10:13.000Z","2020-06-02T20:10:14.000Z","2020-06-02T20:10:15.000Z","2020-06-02T20:10:16.000Z","2020-06-02T20:10:17.000Z","2020-06-02T20:10:18.000Z","2020-06-02T20:10:19.000Z","2020-06-02T20:10:20.000Z","2020-06-02T20:10:21.000Z","2020-06-02T20:10:22.000Z","2020-06-02T20:10:23.000Z","2020-06-02T20:10:24.000Z","2020-06-02T20:10:25.000Z","2020-06-02T20:10:26.000Z","2020-06-02T20:10:27.000Z","2020-06-02T20:10:28.000Z","2020-06-02T20:10:29.000Z","2020-06-02T20:10:30.000Z","2020-06-02T20:10:31.000Z","2020-06-02T20:10:32.000Z","2020-06-02T20:10:33.000Z","2020-06-02T20:10:34.000Z","2020-06-02T20:10:35.000Z","2020-06-02T20:10:36.000Z","2020-06-02T20:10:37.000Z","2020-06-02T20:10:38.000Z","2020-06-02T20:10:39.000Z","2020-06-02T20:10:40.000Z","2020-06-02T20:10:41.000Z","2020-06-02T20:10:42.000Z","2020-06-02T20:10:43.000Z","2020-06-02T20:10:44.000Z","2020-06-02T20:10:45.000Z","2020-06-02T20:10:46.000Z","2020-06-02T20:10:47.000Z","2020-06-02T20:10:48.000Z","2020-06-02T20:10:49.000Z","2020-06-02T20:10:50.000Z","2020-06-02T20:10:51.000Z","2020-06-02T20:10:52.000Z","2020-06-02T20:10:53.000Z","2020-06-02T20:10:54.000Z","2020-06-02T20:10:55.000Z","2020-06-02T20:10:56.000Z","2020-06-02T20:10:57.000Z","2020-06-02T20:10:58.000Z","2020-06-02T20:10:59.000Z","2020-06-02T20:11:00.000Z","2020-06-02T20:11:01.000Z","2020-06-02T20:11:02.000Z","2020-06-02T20:11:03.000Z","2020-06-02T20:11:04.000Z","2020-06-02T20:11:05.000Z","2020-06-02T20:11:06.000Z","2020-06-02T20:11:07.000Z","2020-06-02T20:11:08.000Z","2020-06-02T20:11:09.000Z","2020-06-02T20:11:10.000Z","2020-06-02T20:11:11.000Z","2020-06-02T20:11:12.000Z","2020-06-02T20:11:13.000Z","2020-06-02T20:11:14.000Z","2020-06-02T20:11:15.000Z","2020-06-02T20:11:16.000Z","2020-06-02T20:11:17.000Z","2020-06-02T20:11:18.000Z","2020-06-02T20:11:19.000Z","2020-06-02T20:11:20.000Z","2020-06-02T20:11:21.000Z","2020-06-02T20:11:22.000Z","2020-06-02T
20:11:23.000Z","2020-06-02T20:11:24.000Z","2020-06-02T20:11:25.000Z","2020-06-02T20:11:26.000Z","2020-06-02T20:11:27.000Z","2020-06-02T20:11:28.000Z","2020-06-02T20:11:29.000Z","2020-06-02T20:11:30.000Z","2020-06-02T20:11:31.000Z","2020-06-02T20:11:32.000Z","2020-06-02T20:11:33.000Z","2020-06-02T20:11:34.000Z","2020-06-02T20:11:35.000Z","2020-06-02T20:11:36.000Z","2020-06-02T20:11:37.000Z","2020-06-02T20:11:38.000Z","2020-06-02T20:11:39.000Z","2020-06-02T20:11:40.000Z","2020-06-02T20:11:41.000Z","2020-06-02T20:11:42.000Z","2020-06-02T20:11:43.000Z","2020-06-02T20:11:44.000Z","2020-06-02T20:11:45.000Z","2020-06-02T20:11:46.000Z","2020-06-02T20:11:47.000Z","2020-06-02T20:11:48.000Z","2020-06-02T20:11:49.000Z","2020-06-02T20:11:50.000Z","2020-06-02T20:11:51.000Z","2020-06-02T20:11:52.000Z","2020-06-02T20:11:53.000Z","2020-06-02T20:11:54.000Z","2020-06-02T20:11:55.000Z","2020-06-02T20:11:56.000Z","2020-06-02T20:11:57.000Z","2020-06-02T20:11:58.000Z","2020-06-02T20:11:59.000Z","2020-06-02T20:12:00.000Z","2020-06-02T20:12:01.000Z","2020-06-02T20:12:02.000Z","2020-06-02T20:12:03.000Z","2020-06-02T20:12:04.000Z","2020-06-02T20:12:05.000Z","2020-06-02T20:12:06.000Z","2020-06-02T20:12:07.000Z","2020-06-02T20:12:08.000Z","2020-06-02T20:12:09.000Z","2020-06-02T20:12:10.000Z","2020-06-02T20:12:11.000Z","2020-06-02T20:12:12.000Z","2020-06-02T20:12:13.000Z","2020-06-02T20:12:14.000Z","2020-06-02T20:12:15.000Z","2020-06-02T20:12:16.000Z","2020-06-02T20:12:17.000Z","2020-06-02T20:12:18.000Z","2020-06-02T20:12:19.000Z","2020-06-02T20:12:20.000Z","2020-06-02T20:12:21.000Z","2020-06-02T20:12:22.000Z","2020-06-02T20:12:23.000Z","2020-06-02T20:12:24.000Z","2020-06-02T20:12:25.000Z","2020-06-02T20:12:26.000Z","2020-06-02T20:12:27.000Z","2020-06-02T20:12:28.000Z","2020-06-02T20:12:29.000Z","2020-06-02T20:12:30.000Z","2020-06-02T20:12:31.000Z","2020-06-02T20:12:32.000Z","2020-06-02T20:12:33.000Z","2020-06-02T20:12:34.000Z","2020-06-02T20:12:35.000Z","2020-06-02T20:12:36.000Z","2020-06-02T20
:12:37.000Z","2020-06-02T20:12:38.000Z","2020-06-02T20:12:39.000Z","2020-06-02T20:12:40.000Z","2020-06-02T20:12:41.000Z","2020-06-02T20:12:42.000Z","2020-06-02T20:12:43.000Z","2020-06-02T20:12:44.000Z","2020-06-02T20:12:45.000Z","2020-06-02T20:12:46.000Z","2020-06-02T20:12:47.000Z","2020-06-02T20:12:48.000Z","2020-06-02T20:12:49.000Z","2020-06-02T20:12:50.000Z","2020-06-02T20:12:51.000Z","2020-06-02T20:12:52.000Z","2020-06-02T20:12:53.000Z","2020-06-02T20:12:54.000Z","2020-06-02T20:12:55.000Z","2020-06-02T20:12:56.000Z","2020-06-02T20:12:57.000Z","2020-06-02T20:12:58.000Z","2020-06-02T20:12:59.000Z","2020-06-02T20:13:00.000Z","2020-06-02T20:13:01.000Z","2020-06-02T20:13:02.000Z","2020-06-02T20:13:03.000Z","2020-06-02T20:13:04.000Z","2020-06-02T20:13:05.000Z","2020-06-02T20:13:06.000Z","2020-06-02T20:13:07.000Z","2020-06-02T20:13:08.000Z","2020-06-02T20:13:09.000Z","2020-06-02T20:13:10.000Z","2020-06-02T20:13:11.000Z","2020-06-02T20:13:12.000Z","2020-06-02T20:13:13.000Z","2020-06-02T20:13:14.000Z","2020-06-02T20:13:15.000Z","2020-06-02T20:13:16.000Z","2020-06-02T20:13:17.000Z","2020-06-02T20:13:18.000Z","2020-06-02T20:13:19.000Z","2020-06-02T20:13:20.000Z","2020-06-02T20:13:21.000Z","2020-06-02T20:13:22.000Z","2020-06-02T20:13:23.000Z","2020-06-02T20:13:24.000Z","2020-06-02T20:13:25.000Z","2020-06-02T20:13:26.000Z","2020-06-02T20:13:27.000Z","2020-06-02T20:13:28.000Z","2020-06-02T20:13:29.000Z","2020-06-02T20:13:30.000Z","2020-06-02T20:13:31.000Z","2020-06-02T20:13:32.000Z","2020-06-02T20:13:33.000Z","2020-06-02T20:13:34.000Z","2020-06-02T20:13:35.000Z","2020-06-02T20:13:36.000Z","2020-06-02T20:13:37.000Z","2020-06-02T20:13:38.000Z","2020-06-02T20:13:39.000Z","2020-06-02T20:13:40.000Z","2020-06-02T20:13:41.000Z","2020-06-02T20:13:42.000Z","2020-06-02T20:13:43.000Z","2020-06-02T20:13:44.000Z","2020-06-02T20:13:45.000Z","2020-06-02T20:13:46.000Z","2020-06-02T20:13:47.000Z","2020-06-02T20:13:48.000Z","2020-06-02T20:13:49.000Z","2020-06-02T20:13:50.000Z","2020-06-02T20:1
3:51.000Z","2020-06-02T20:13:52.000Z","2020-06-02T20:13:53.000Z","2020-06-02T20:13:54.000Z","2020-06-02T20:13:55.000Z","2020-06-02T20:13:56.000Z","2020-06-02T20:13:57.000Z","2020-06-02T20:13:58.000Z","2020-06-02T20:13:59.000Z","2020-06-02T20:14:00.000Z","2020-06-02T20:14:01.000Z","2020-06-02T20:14:02.000Z","2020-06-02T20:14:03.000Z","2020-06-02T20:14:04.000Z","2020-06-02T20:14:05.000Z","2020-06-02T20:14:06.000Z","2020-06-02T20:14:07.000Z","2020-06-02T20:14:08.000Z","2020-06-02T20:14:09.000Z","2020-06-02T20:14:10.000Z","2020-06-02T20:14:11.000Z","2020-06-02T20:14:12.000Z","2020-06-02T20:14:13.000Z","2020-06-02T20:14:14.000Z","2020-06-02T20:14:15.000Z","2020-06-02T20:14:16.000Z","2020-06-02T20:14:17.000Z","2020-06-02T20:14:18.000Z","2020-06-02T20:14:19.000Z","2020-06-02T20:14:20.000Z","2020-06-02T20:14:21.000Z","2020-06-02T20:14:22.000Z","2020-06-02T20:14:23.000Z","2020-06-02T20:14:24.000Z","2020-06-02T20:14:25.000Z","2020-06-02T20:14:26.000Z","2020-06-02T20:14:27.000Z","2020-06-02T20:14:28.000Z","2020-06-02T20:14:29.000Z","2020-06-02T20:14:30.000Z","2020-06-02T20:14:31.000Z","2020-06-02T20:14:32.000Z","2020-06-02T20:14:33.000Z","2020-06-02T20:14:34.000Z","2020-06-02T20:14:35.000Z","2020-06-02T20:14:36.000Z","2020-06-02T20:14:37.000Z","2020-06-02T20:14:38.000Z","2020-06-02T20:14:39.000Z","2020-06-02T20:14:40.000Z","2020-06-02T20:14:41.000Z","2020-06-02T20:14:42.000Z","2020-06-02T20:14:43.000Z","2020-06-02T20:14:44.000Z","2020-06-02T20:14:45.000Z","2020-06-02T20:14:46.000Z","2020-06-02T20:14:47.000Z","2020-06-02T20:14:48.000Z","2020-06-02T20:14:49.000Z","2020-06-02T20:14:50.000Z","2020-06-02T20:14:51.000Z","2020-06-02T20:14:52.000Z","2020-06-02T20:14:53.000Z","2020-06-02T20:14:54.000Z","2020-06-02T20:14:55.000Z","2020-06-02T20:14:56.000Z","2020-06-02T20:14:57.000Z","2020-06-02T20:14:58.000Z","2020-06-02T20:14:59.000Z","2020-06-02T20:15:00.000Z","2020-06-02T20:15:01.000Z","2020-06-02T20:15:02.000Z","2020-06-02T20:15:03.000Z","2020-06-02T20:15:04.000Z","2020-06-02T20:15:
05.000Z","2020-06-02T20:15:06.000Z","2020-06-02T20:15:07.000Z","2020-06-02T20:15:08.000Z","2020-06-02T20:15:09.000Z","2020-06-02T20:15:10.000Z","2020-06-02T20:15:11.000Z","2020-06-02T20:15:12.000Z","2020-06-02T20:15:13.000Z","2020-06-02T20:15:14.000Z","2020-06-02T20:15:15.000Z","2020-06-02T20:15:16.000Z","2020-06-02T20:15:17.000Z","2020-06-02T20:15:18.000Z","2020-06-02T20:15:19.000Z","2020-06-02T20:15:20.000Z","2020-06-02T20:15:21.000Z","2020-06-02T20:15:22.000Z","2020-06-02T20:15:23.000Z","2020-06-02T20:15:24.000Z","2020-06-02T20:15:25.000Z","2020-06-02T20:15:26.000Z","2020-06-02T20:15:27.000Z","2020-06-02T20:15:28.000Z","2020-06-02T20:15:29.000Z","2020-06-02T20:15:30.000Z","2020-06-02T20:15:31.000Z","2020-06-02T20:15:32.000Z","2020-06-02T20:15:33.000Z","2020-06-02T20:15:34.000Z","2020-06-02T20:15:35.000Z","2020-06-02T20:15:36.000Z","2020-06-02T20:15:37.000Z","2020-06-02T20:15:38.000Z","2020-06-02T20:15:39.000Z","2020-06-02T20:15:40.000Z","2020-06-02T20:15:41.000Z","2020-06-02T20:15:42.000Z","2020-06-02T20:15:43.000Z","2020-06-02T20:15:44.000Z","2020-06-02T20:15:45.000Z","2020-06-02T20:15:46.000Z","2020-06-02T20:15:47.000Z","2020-06-02T20:15:48.000Z","2020-06-02T20:15:49.000Z","2020-06-02T20:15:50.000Z","2020-06-02T20:15:51.000Z","2020-06-02T20:15:52.000Z","2020-06-02T20:15:53.000Z","2020-06-02T20:15:54.000Z","2020-06-02T20:15:55.000Z","2020-06-02T20:15:56.000Z","2020-06-02T20:15:57.000Z","2020-06-02T20:15:58.000Z","2020-06-02T20:15:59.000Z","2020-06-02T20:16:00.000Z","2020-06-02T20:16:01.000Z","2020-06-02T20:16:02.000Z","2020-06-02T20:16:03.000Z","2020-06-02T20:16:04.000Z","2020-06-02T20:16:05.000Z","2020-06-02T20:16:06.000Z","2020-06-02T20:16:07.000Z","2020-06-02T20:16:08.000Z","2020-06-02T20:16:09.000Z","2020-06-02T20:16:10.000Z","2020-06-02T20:16:11.000Z","2020-06-02T20:16:12.000Z","2020-06-02T20:16:13.000Z","2020-06-02T20:16:14.000Z","2020-06-02T20:16:15.000Z","2020-06-02T20:16:16.000Z","2020-06-02T20:16:17.000Z","2020-06-02T20:16:18.000Z","2020-06-02T20:16:19
.000Z","2020-06-02T20:16:20.000Z","2020-06-02T20:16:21.000Z","2020-06-02T20:16:22.000Z","2020-06-02T20:16:23.000Z","2020-06-02T20:16:24.000Z","2020-06-02T20:16:25.000Z","2020-06-02T20:16:26.000Z","2020-06-02T20:16:27.000Z","2020-06-02T20:16:28.000Z","2020-06-02T20:16:29.000Z","2020-06-02T20:16:30.000Z","2020-06-02T20:16:31.000Z","2020-06-02T20:16:32.000Z","2020-06-02T20:16:33.000Z","2020-06-02T20:16:34.000Z","2020-06-02T20:16:35.000Z","2020-06-02T20:16:36.000Z","2020-06-02T20:16:37.000Z","2020-06-02T20:16:38.000Z","2020-06-02T20:16:39.000Z","2020-06-02T20:16:40.000Z","2020-06-02T20:16:41.000Z","2020-06-02T20:16:42.000Z","2020-06-02T20:16:43.000Z","2020-06-02T20:16:44.000Z","2020-06-02T20:16:45.000Z","2020-06-02T20:16:46.000Z","2020-06-02T20:16:47.000Z","2020-06-02T20:16:48.000Z","2020-06-02T20:16:49.000Z","2020-06-02T20:16:50.000Z","2020-06-02T20:16:51.000Z","2020-06-02T20:16:52.000Z","2020-06-02T20:16:53.000Z","2020-06-02T20:16:54.000Z","2020-06-02T20:16:55.000Z","2020-06-02T20:16:56.000Z","2020-06-02T20:16:57.000Z","2020-06-02T20:16:58.000Z","2020-06-02T20:16:59.000Z","2020-06-02T20:17:00.000Z","2020-06-02T20:17:01.000Z","2020-06-02T20:17:02.000Z","2020-06-02T20:17:03.000Z","2020-06-02T20:17:04.000Z","2020-06-02T20:17:05.000Z","2020-06-02T20:17:06.000Z","2020-06-02T20:17:07.000Z","2020-06-02T20:17:08.000Z","2020-06-02T20:17:09.000Z","2020-06-02T20:17:10.000Z","2020-06-02T20:17:11.000Z","2020-06-02T20:17:12.000Z","2020-06-02T20:17:13.000Z","2020-06-02T20:17:14.000Z","2020-06-02T20:17:15.000Z","2020-06-02T20:17:16.000Z","2020-06-02T20:17:17.000Z","2020-06-02T20:17:18.000Z","2020-06-02T20:17:19.000Z","2020-06-02T20:17:20.000Z","2020-06-02T20:17:21.000Z","2020-06-02T20:17:22.000Z","2020-06-02T20:17:23.000Z","2020-06-02T20:17:24.000Z","2020-06-02T20:17:25.000Z","2020-06-02T20:17:26.000Z","2020-06-02T20:17:27.000Z","2020-06-02T20:17:28.000Z","2020-06-02T20:17:29.000Z","2020-06-02T20:17:30.000Z","2020-06-02T20:17:31.000Z","2020-06-02T20:17:32.000Z","2020-06-02T20:17:33.0
00Z","2020-06-02T20:17:34.000Z","2020-06-02T20:17:35.000Z","2020-06-02T20:17:36.000Z","2020-06-02T20:17:37.000Z","2020-06-02T20:17:38.000Z","2020-06-02T20:17:39.000Z","2020-06-02T20:17:40.000Z","2020-06-02T20:17:41.000Z","2020-06-02T20:17:42.000Z","2020-06-02T20:17:43.000Z","2020-06-02T20:17:44.000Z","2020-06-02T20:17:45.000Z","2020-06-02T20:17:46.000Z","2020-06-02T20:17:47.000Z","2020-06-02T20:17:48.000Z","2020-06-02T20:17:49.000Z","2020-06-02T20:17:50.000Z","2020-06-02T20:17:51.000Z","2020-06-02T20:17:52.000Z","2020-06-02T20:17:53.000Z","2020-06-02T20:17:54.000Z","2020-06-02T20:17:55.000Z","2020-06-02T20:17:56.000Z","2020-06-02T20:17:57.000Z","2020-06-02T20:17:58.000Z","2020-06-02T20:17:59.000Z","2020-06-02T20:18:00.000Z","2020-06-02T20:18:01.000Z","2020-06-02T20:18:02.000Z","2020-06-02T20:18:03.000Z","2020-06-02T20:18:04.000Z","2020-06-02T20:18:05.000Z","2020-06-02T20:18:06.000Z","2020-06-02T20:18:07.000Z","2020-06-02T20:18:08.000Z","2020-06-02T20:18:09.000Z","2020-06-02T20:18:10.000Z","2020-06-02T20:18:11.000Z","2020-06-02T20:18:12.000Z","2020-06-02T20:18:13.000Z","2020-06-02T20:18:14.000Z","2020-06-02T20:18:15.000Z","2020-06-02T20:18:16.000Z","2020-06-02T20:18:17.000Z","2020-06-02T20:18:18.000Z","2020-06-02T20:18:19.000Z","2020-06-02T20:18:20.000Z","2020-06-02T20:18:21.000Z","2020-06-02T20:18:22.000Z","2020-06-02T20:18:23.000Z","2020-06-02T20:18:24.000Z","2020-06-02T20:18:25.000Z","2020-06-02T20:18:26.000Z","2020-06-02T20:18:27.000Z","2020-06-02T20:18:28.000Z","2020-06-02T20:18:29.000Z","2020-06-02T20:18:30.000Z","2020-06-02T20:18:31.000Z","2020-06-02T20:18:32.000Z","2020-06-02T20:18:33.000Z","2020-06-02T20:18:34.000Z","2020-06-02T20:18:35.000Z","2020-06-02T20:18:36.000Z","2020-06-02T20:18:37.000Z","2020-06-02T20:18:38.000Z","2020-06-02T20:18:39.000Z","2020-06-02T20:18:40.000Z","2020-06-02T20:18:41.000Z","2020-06-02T20:18:42.000Z","2020-06-02T20:18:43.000Z","2020-06-02T20:18:44.000Z","2020-06-02T20:18:45.000Z","2020-06-02T20:18:46.000Z","2020-06-02T20:18:47.000
Z","2020-06-02T20:18:48.000Z","2020-06-02T20:18:49.000Z","2020-06-02T20:18:50.000Z","2020-06-02T20:18:51.000Z","2020-06-02T20:18:52.000Z","2020-06-02T20:18:53.000Z","2020-06-02T20:18:54.000Z","2020-06-02T20:18:55.000Z","2020-06-02T20:18:56.000Z","2020-06-02T20:18:57.000Z","2020-06-02T20:18:58.000Z","2020-06-02T20:18:59.000Z","2020-06-02T20:19:00.000Z","2020-06-02T20:19:01.000Z","2020-06-02T20:19:02.000Z","2020-06-02T20:19:03.000Z","2020-06-02T20:19:04.000Z","2020-06-02T20:19:05.000Z","2020-06-02T20:19:06.000Z","2020-06-02T20:19:07.000Z","2020-06-02T20:19:08.000Z","2020-06-02T20:19:09.000Z","2020-06-02T20:19:10.000Z","2020-06-02T20:19:11.000Z","2020-06-02T20:19:12.000Z","2020-06-02T20:19:13.000Z","2020-06-02T20:19:14.000Z","2020-06-02T20:19:15.000Z","2020-06-02T20:19:16.000Z","2020-06-02T20:19:17.000Z","2020-06-02T20:19:18.000Z","2020-06-02T20:19:19.000Z","2020-06-02T20:19:20.000Z","2020-06-02T20:19:21.000Z","2020-06-02T20:19:22.000Z","2020-06-02T20:19:23.000Z","2020-06-02T20:19:24.000Z","2020-06-02T20:19:25.000Z","2020-06-02T20:19:26.000Z","2020-06-02T20:19:27.000Z","2020-06-02T20:19:28.000Z","2020-06-02T20:19:29.000Z","2020-06-02T20:19:30.000Z","2020-06-02T20:19:31.000Z","2020-06-02T20:19:32.000Z","2020-06-02T20:19:33.000Z","2020-06-02T20:19:34.000Z","2020-06-02T20:19:35.000Z","2020-06-02T20:19:36.000Z","2020-06-02T20:19:37.000Z","2020-06-02T20:19:38.000Z","2020-06-02T20:19:39.000Z","2020-06-02T20:19:40.000Z","2020-06-02T20:19:41.000Z","2020-06-02T20:19:42.000Z","2020-06-02T20:19:43.000Z","2020-06-02T20:19:44.000Z","2020-06-02T20:19:45.000Z","2020-06-02T20:19:46.000Z","2020-06-02T20:19:47.000Z","2020-06-02T20:19:48.000Z","2020-06-02T20:19:49.000Z","2020-06-02T20:19:50.000Z","2020-06-02T20:19:51.000Z","2020-06-02T20:19:52.000Z","2020-06-02T20:19:53.000Z","2020-06-02T20:19:54.000Z","2020-06-02T20:19:55.000Z","2020-06-02T20:19:56.000Z","2020-06-02T20:19:57.000Z","2020-06-02T20:19:58.000Z","2020-06-02T20:19:59.000Z","2020-06-02T20:20:00.000Z","2020-06-02T20:20:01.000Z"
,"2020-06-02T20:20:02.000Z","2020-06-02T20:20:03.000Z","2020-06-02T20:20:04.000Z","2020-06-02T20:20:05.000Z","2020-06-02T20:20:06.000Z","2020-06-02T20:20:07.000Z","2020-06-02T20:20:08.000Z","2020-06-02T20:20:09.000Z","2020-06-02T20:20:10.000Z","2020-06-02T20:20:11.000Z","2020-06-02T20:20:12.000Z","2020-06-02T20:20:13.000Z","2020-06-02T20:20:14.000Z","2020-06-02T20:20:15.000Z","2020-06-02T20:20:16.000Z","2020-06-02T20:20:17.000Z","2020-06-02T20:20:18.000Z","2020-06-02T20:20:19.000Z","2020-06-02T20:20:20.000Z","2020-06-02T20:20:21.000Z","2020-06-02T20:20:22.000Z","2020-06-02T20:20:23.000Z","2020-06-02T20:20:24.000Z","2020-06-02T20:20:25.000Z","2020-06-02T20:20:26.000Z","2020-06-02T20:20:27.000Z","2020-06-02T20:20:28.000Z","2020-06-02T20:20:29.000Z","2020-06-02T20:20:30.000Z","2020-06-02T20:20:31.000Z","2020-06-02T20:20:32.000Z","2020-06-02T20:20:33.000Z","2020-06-02T20:20:34.000Z","2020-06-02T20:20:35.000Z","2020-06-02T20:20:36.000Z","2020-06-02T20:20:37.000Z","2020-06-02T20:20:38.000Z","2020-06-02T20:20:39.000Z","2020-06-02T20:20:40.000Z","2020-06-02T20:20:41.000Z","2020-06-02T20:20:42.000Z","2020-06-02T20:20:43.000Z","2020-06-02T20:20:44.000Z","2020-06-02T20:20:45.000Z","2020-06-02T20:20:46.000Z","2020-06-02T20:20:47.000Z","2020-06-02T20:20:48.000Z","2020-06-02T20:20:49.000Z","2020-06-02T20:20:50.000Z","2020-06-02T20:20:51.000Z","2020-06-02T20:20:52.000Z","2020-06-02T20:20:53.000Z","2020-06-02T20:20:54.000Z","2020-06-02T20:20:55.000Z","2020-06-02T20:20:56.000Z","2020-06-02T20:20:57.000Z","2020-06-02T20:20:58.000Z","2020-06-02T20:20:59.000Z","2020-06-02T20:21:00.000Z","2020-06-02T20:21:01.000Z","2020-06-02T20:21:02.000Z","2020-06-02T20:21:03.000Z","2020-06-02T20:21:04.000Z","2020-06-02T20:21:05.000Z","2020-06-02T20:21:06.000Z","2020-06-02T20:21:07.000Z","2020-06-02T20:21:08.000Z","2020-06-02T20:21:09.000Z","2020-06-02T20:21:10.000Z","2020-06-02T20:21:11.000Z","2020-06-02T20:21:12.000Z","2020-06-02T20:21:13.000Z","2020-06-02T20:21:14.000Z","2020-06-02T20:21:15.000Z","
2020-06-02T20:21:16.000Z","2020-06-02T20:21:17.000Z","2020-06-02T20:21:18.000Z","2020-06-02T20:21:19.000Z","2020-06-02T20:21:20.000Z","2020-06-02T20:21:21.000Z","2020-06-02T20:21:22.000Z","2020-06-02T20:21:23.000Z","2020-06-02T20:21:24.000Z","2020-06-02T20:21:25.000Z","2020-06-02T20:21:26.000Z","2020-06-02T20:21:27.000Z","2020-06-02T20:21:28.000Z","2020-06-02T20:21:29.000Z","2020-06-02T20:21:30.000Z","2020-06-02T20:21:31.000Z","2020-06-02T20:21:32.000Z","2020-06-02T20:21:33.000Z","2020-06-02T20:21:34.000Z","2020-06-02T20:21:35.000Z","2020-06-02T20:21:36.000Z","2020-06-02T20:21:37.000Z","2020-06-02T20:21:38.000Z","2020-06-02T20:21:39.000Z","2020-06-02T20:21:40.000Z","2020-06-02T20:21:41.000Z","2020-06-02T20:21:42.000Z","2020-06-02T20:21:43.000Z","2020-06-02T20:21:44.000Z","2020-06-02T20:21:45.000Z","2020-06-02T20:21:46.000Z","2020-06-02T20:21:47.000Z","2020-06-02T20:21:48.000Z","2020-06-02T20:21:49.000Z","2020-06-02T20:21:50.000Z","2020-06-02T20:21:51.000Z","2020-06-02T20:21:52.000Z","2020-06-02T20:21:53.000Z","2020-06-02T20:21:54.000Z","2020-06-02T20:21:55.000Z","2020-06-02T20:21:56.000Z","2020-06-02T20:21:57.000Z","2020-06-02T20:21:58.000Z","2020-06-02T20:21:59.000Z","2020-06-02T20:22:00.000Z","2020-06-02T20:22:01.000Z","2020-06-02T20:22:02.000Z","2020-06-02T20:22:03.000Z","2020-06-02T20:22:04.000Z","2020-06-02T20:22:05.000Z","2020-06-02T20:22:06.000Z","2020-06-02T20:22:07.000Z","2020-06-02T20:22:08.000Z","2020-06-02T20:22:09.000Z","2020-06-02T20:22:10.000Z","2020-06-02T20:22:11.000Z","2020-06-02T20:22:12.000Z","2020-06-02T20:22:13.000Z","2020-06-02T20:22:14.000Z","2020-06-02T20:22:15.000Z","2020-06-02T20:22:16.000Z","2020-06-02T20:22:17.000Z","2020-06-02T20:22:18.000Z","2020-06-02T20:22:19.000Z","2020-06-02T20:22:20.000Z","2020-06-02T20:22:21.000Z","2020-06-02T20:22:22.000Z","2020-06-02T20:22:23.000Z","2020-06-02T20:22:24.000Z","2020-06-02T20:22:25.000Z","2020-06-02T20:22:26.000Z","2020-06-02T20:22:27.000Z","2020-06-02T20:22:28.000Z","2020-06-02T20:22:29.000Z","20
20-06-02T20:22:30.000Z","2020-06-02T20:22:31.000Z","2020-06-02T20:22:32.000Z","2020-06-02T20:22:33.000Z","2020-06-02T20:22:34.000Z","2020-06-02T20:22:35.000Z","2020-06-02T20:22:36.000Z","2020-06-02T20:22:37.000Z","2020-06-02T20:22:38.000Z","2020-06-02T20:22:39.000Z","2020-06-02T20:22:40.000Z","2020-06-02T20:22:41.000Z","2020-06-02T20:22:42.000Z","2020-06-02T20:22:43.000Z","2020-06-02T20:22:44.000Z","2020-06-02T20:22:45.000Z","2020-06-02T20:22:46.000Z","2020-06-02T20:22:47.000Z","2020-06-02T20:22:48.000Z","2020-06-02T20:22:49.000Z","2020-06-02T20:22:50.000Z","2020-06-02T20:22:51.000Z","2020-06-02T20:22:52.000Z","2020-06-02T20:22:53.000Z","2020-06-02T20:22:54.000Z","2020-06-02T20:22:55.000Z","2020-06-02T20:22:56.000Z","2020-06-02T20:22:57.000Z","2020-06-02T20:22:58.000Z","2020-06-02T20:22:59.000Z","2020-06-02T20:23:00.000Z","2020-06-02T20:23:01.000Z","2020-06-02T20:23:02.000Z","2020-06-02T20:23:03.000Z","2020-06-02T20:23:04.000Z","2020-06-02T20:23:05.000Z","2020-06-02T20:23:06.000Z","2020-06-02T20:23:07.000Z","2020-06-02T20:23:08.000Z","2020-06-02T20:23:09.000Z","2020-06-02T20:23:10.000Z","2020-06-02T20:23:11.000Z","2020-06-02T20:23:12.000Z","2020-06-02T20:23:13.000Z","2020-06-02T20:23:14.000Z","2020-06-02T20:23:15.000Z","2020-06-02T20:23:16.000Z","2020-06-02T20:23:17.000Z","2020-06-02T20:23:18.000Z","2020-06-02T20:23:19.000Z","2020-06-02T20:23:20.000Z","2020-06-02T20:23:21.000Z","2020-06-02T20:23:22.000Z","2020-06-02T20:23:23.000Z","2020-06-02T20:23:24.000Z","2020-06-02T20:23:25.000Z","2020-06-02T20:23:26.000Z","2020-06-02T20:23:27.000Z","2020-06-02T20:23:28.000Z","2020-06-02T20:23:29.000Z","2020-06-02T20:23:30.000Z","2020-06-02T20:23:31.000Z","2020-06-02T20:23:32.000Z","2020-06-02T20:23:33.000Z","2020-06-02T20:23:34.000Z","2020-06-02T20:23:35.000Z","2020-06-02T20:23:36.000Z","2020-06-02T20:23:37.000Z","2020-06-02T20:23:38.000Z","2020-06-02T20:23:39.000Z","2020-06-02T20:23:40.000Z","2020-06-02T20:23:41.000Z","2020-06-02T20:23:42.000Z","2020-06-02T20:23:43.000Z","2020
-06-02T20:23:44.000Z","2020-06-02T20:23:45.000Z","2020-06-02T20:23:46.000Z","2020-06-02T20:23:47.000Z","2020-06-02T20:23:48.000Z","2020-06-02T20:23:49.000Z","2020-06-02T20:23:50.000Z","2020-06-02T20:23:51.000Z","2020-06-02T20:23:52.000Z","2020-06-02T20:23:53.000Z","2020-06-02T20:23:54.000Z","2020-06-02T20:23:55.000Z","2020-06-02T20:23:56.000Z","2020-06-02T20:23:57.000Z","2020-06-02T20:23:58.000Z","2020-06-02T20:23:59.000Z","2020-06-02T20:24:00.000Z","2020-06-02T20:24:01.000Z","2020-06-02T20:24:02.000Z","2020-06-02T20:24:03.000Z","2020-06-02T20:24:04.000Z","2020-06-02T20:24:05.000Z","2020-06-02T20:24:06.000Z","2020-06-02T20:24:07.000Z","2020-06-02T20:24:08.000Z","2020-06-02T20:24:09.000Z","2020-06-02T20:24:10.000Z","2020-06-02T20:24:11.000Z","2020-06-02T20:24:12.000Z","2020-06-02T20:24:13.000Z","2020-06-02T20:24:14.000Z","2020-06-02T20:24:15.000Z","2020-06-02T20:24:16.000Z","2020-06-02T20:24:17.000Z","2020-06-02T20:24:18.000Z","2020-06-02T20:24:19.000Z","2020-06-02T20:24:20.000Z","2020-06-02T20:24:21.000Z","2020-06-02T20:24:22.000Z","2020-06-02T20:24:23.000Z","2020-06-02T20:24:24.000Z","2020-06-02T20:24:25.000Z","2020-06-02T20:24:26.000Z","2020-06-02T20:24:27.000Z","2020-06-02T20:24:28.000Z","2020-06-02T20:24:29.000Z","2020-06-02T20:24:30.000Z","2020-06-02T20:24:31.000Z","2020-06-02T20:24:32.000Z","2020-06-02T20:24:33.000Z","2020-06-02T20:24:34.000Z","2020-06-02T20:24:35.000Z","2020-06-02T20:24:36.000Z","2020-06-02T20:24:37.000Z","2020-06-02T20:24:38.000Z","2020-06-02T20:24:39.000Z","2020-06-02T20:24:40.000Z","2020-06-02T20:24:41.000Z","2020-06-02T20:24:42.000Z","2020-06-02T20:24:43.000Z","2020-06-02T20:24:44.000Z","2020-06-02T20:24:45.000Z","2020-06-02T20:24:46.000Z","2020-06-02T20:24:47.000Z","2020-06-02T20:24:48.000Z","2020-06-02T20:24:49.000Z","2020-06-02T20:24:50.000Z","2020-06-02T20:24:51.000Z","2020-06-02T20:24:52.000Z","2020-06-02T20:24:53.000Z","2020-06-02T20:24:54.000Z","2020-06-02T20:24:55.000Z","2020-06-02T20:24:56.000Z","2020-06-02T20:24:57.000Z","2020-0
6-02T20:24:58.000Z","2020-06-02T20:24:59.000Z","2020-06-02T20:25:00.000Z","2020-06-02T20:25:01.000Z","2020-06-02T20:25:02.000Z","2020-06-02T20:25:03.000Z","2020-06-02T20:25:04.000Z","2020-06-02T20:25:05.000Z","2020-06-02T20:25:06.000Z","2020-06-02T20:25:07.000Z","2020-06-02T20:25:08.000Z","2020-06-02T20:25:09.000Z","2020-06-02T20:25:10.000Z","2020-06-02T20:25:11.000Z","2020-06-02T20:25:12.000Z","2020-06-02T20:25:13.000Z","2020-06-02T20:25:14.000Z","2020-06-02T20:25:15.000Z","2020-06-02T20:25:16.000Z","2020-06-02T20:25:17.000Z","2020-06-02T20:25:18.000Z","2020-06-02T20:25:19.000Z","2020-06-02T20:25:20.000Z","2020-06-02T20:25:21.000Z","2020-06-02T20:25:22.000Z","2020-06-02T20:25:23.000Z","2020-06-02T20:25:24.000Z","2020-06-02T20:25:25.000Z","2020-06-02T20:25:26.000Z","2020-06-02T20:25:27.000Z","2020-06-02T20:25:28.000Z","2020-06-02T20:25:29.000Z","2020-06-02T20:25:30.000Z","2020-06-02T20:25:31.000Z","2020-06-02T20:25:32.000Z","2020-06-02T20:25:33.000Z","2020-06-02T20:25:34.000Z","2020-06-02T20:25:35.000Z","2020-06-02T20:25:36.000Z","2020-06-02T20:25:37.000Z","2020-06-02T20:25:38.000Z","2020-06-02T20:25:39.000Z","2020-06-02T20:25:40.000Z","2020-06-02T20:25:41.000Z","2020-06-02T20:25:42.000Z","2020-06-02T20:25:43.000Z","2020-06-02T20:25:44.000Z","2020-06-02T20:25:45.000Z","2020-06-02T20:25:46.000Z","2020-06-02T20:25:47.000Z","2020-06-02T20:25:48.000Z","2020-06-02T20:25:49.000Z","2020-06-02T20:25:50.000Z","2020-06-02T20:25:51.000Z","2020-06-02T20:25:52.000Z","2020-06-02T20:25:53.000Z","2020-06-02T20:25:54.000Z","2020-06-02T20:25:55.000Z","2020-06-02T20:25:56.000Z","2020-06-02T20:25:57.000Z","2020-06-02T20:25:58.000Z","2020-06-02T20:25:59.000Z","2020-06-02T20:26:00.000Z","2020-06-02T20:26:01.000Z","2020-06-02T20:26:02.000Z","2020-06-02T20:26:03.000Z","2020-06-02T20:26:04.000Z","2020-06-02T20:26:05.000Z","2020-06-02T20:26:06.000Z","2020-06-02T20:26:07.000Z","2020-06-02T20:26:08.000Z","2020-06-02T20:26:09.000Z","2020-06-02T20:26:10.000Z","2020-06-02T20:26:11.000Z","2020-06-
02T20:26:12.000Z","2020-06-02T20:26:13.000Z","2020-06-02T20:26:14.000Z","2020-06-02T20:26:15.000Z","2020-06-02T20:26:16.000Z","2020-06-02T20:26:17.000Z","2020-06-02T20:26:18.000Z","2020-06-02T20:26:19.000Z","2020-06-02T20:26:20.000Z","2020-06-02T20:26:21.000Z","2020-06-02T20:26:22.000Z","2020-06-02T20:26:23.000Z","2020-06-02T20:26:24.000Z","2020-06-02T20:26:25.000Z","2020-06-02T20:26:26.000Z","2020-06-02T20:26:27.000Z","2020-06-02T20:26:28.000Z","2020-06-02T20:26:29.000Z","2020-06-02T20:26:30.000Z","2020-06-02T20:26:31.000Z","2020-06-02T20:26:32.000Z","2020-06-02T20:26:33.000Z","2020-06-02T20:26:34.000Z","2020-06-02T20:26:35.000Z","2020-06-02T20:26:36.000Z","2020-06-02T20:26:37.000Z","2020-06-02T20:26:38.000Z","2020-06-02T20:26:39.000Z","2020-06-02T20:26:40.000Z","2020-06-02T20:26:41.000Z","2020-06-02T20:26:42.000Z","2020-06-02T20:26:43.000Z","2020-06-02T20:26:44.000Z","2020-06-02T20:26:45.000Z","2020-06-02T20:26:46.000Z","2020-06-02T20:26:47.000Z","2020-06-02T20:26:48.000Z","2020-06-02T20:26:49.000Z","2020-06-02T20:26:50.000Z","2020-06-02T20:26:51.000Z","2020-06-02T20:26:52.000Z","2020-06-02T20:26:53.000Z","2020-06-02T20:26:54.000Z","2020-06-02T20:26:55.000Z","2020-06-02T20:26:56.000Z","2020-06-02T20:26:57.000Z","2020-06-02T20:26:58.000Z","2020-06-02T20:26:59.000Z","2020-06-02T20:27:00.000Z","2020-06-02T20:27:01.000Z","2020-06-02T20:27:02.000Z","2020-06-02T20:27:03.000Z","2020-06-02T20:27:04.000Z","2020-06-02T20:27:05.000Z","2020-06-02T20:27:06.000Z","2020-06-02T20:27:07.000Z","2020-06-02T20:27:08.000Z","2020-06-02T20:27:09.000Z","2020-06-02T20:27:10.000Z","2020-06-02T20:27:11.000Z","2020-06-02T20:27:12.000Z","2020-06-02T20:27:13.000Z","2020-06-02T20:27:14.000Z","2020-06-02T20:27:15.000Z","2020-06-02T20:27:16.000Z","2020-06-02T20:27:17.000Z","2020-06-02T20:27:18.000Z","2020-06-02T20:27:19.000Z","2020-06-02T20:27:20.000Z","2020-06-02T20:27:21.000Z","2020-06-02T20:27:22.000Z","2020-06-02T20:27:23.000Z","2020-06-02T20:27:24.000Z","2020-06-02T20:27:25.000Z","2020-06-02
T20:27:26.000Z", ... (elided: consecutive 1-second UTC timestamps from 2020-06-02T20:27:26Z through 2020-06-02T20:49:39Z) ... ,"2020-06-02T20:49:39
.000Z","2020-06-02T20:49:40.000Z","2020-06-02T20:49:41.000Z","2020-06-02T20:49:42.000Z","2020-06-02T20:49:43.000Z","2020-06-02T20:49:44.000Z","2020-06-02T20:49:45.000Z","2020-06-02T20:49:46.000Z","2020-06-02T20:49:47.000Z","2020-06-02T20:49:48.000Z","2020-06-02T20:49:49.000Z","2020-06-02T20:49:50.000Z","2020-06-02T20:49:51.000Z","2020-06-02T20:49:52.000Z","2020-06-02T20:49:53.000Z","2020-06-02T20:49:54.000Z","2020-06-02T20:49:55.000Z","2020-06-02T20:49:56.000Z","2020-06-02T20:49:57.000Z","2020-06-02T20:49:58.000Z","2020-06-02T20:49:59.000Z","2020-06-02T20:50:00.000Z","2020-06-02T20:50:01.000Z","2020-06-02T20:50:02.000Z","2020-06-02T20:50:03.000Z","2020-06-02T20:50:04.000Z","2020-06-02T20:50:05.000Z","2020-06-02T20:50:06.000Z","2020-06-02T20:50:07.000Z","2020-06-02T20:50:08.000Z","2020-06-02T20:50:09.000Z","2020-06-02T20:50:10.000Z","2020-06-02T20:50:11.000Z","2020-06-02T20:50:12.000Z","2020-06-02T20:50:13.000Z","2020-06-02T20:50:14.000Z","2020-06-02T20:50:15.000Z","2020-06-02T20:50:16.000Z","2020-06-02T20:50:17.000Z","2020-06-02T20:50:18.000Z","2020-06-02T20:50:19.000Z","2020-06-02T20:50:20.000Z","2020-06-02T20:50:21.000Z","2020-06-02T20:50:22.000Z","2020-06-02T20:50:23.000Z","2020-06-02T20:50:24.000Z","2020-06-02T20:50:25.000Z","2020-06-02T20:50:26.000Z","2020-06-02T20:50:27.000Z","2020-06-02T20:50:28.000Z","2020-06-02T20:50:29.000Z","2020-06-02T20:50:30.000Z","2020-06-02T20:50:31.000Z","2020-06-02T20:50:32.000Z","2020-06-02T20:50:33.000Z","2020-06-02T20:50:34.000Z","2020-06-02T20:50:35.000Z","2020-06-02T20:50:36.000Z","2020-06-02T20:50:37.000Z","2020-06-02T20:50:38.000Z","2020-06-02T20:50:39.000Z","2020-06-02T20:50:40.000Z","2020-06-02T20:50:41.000Z","2020-06-02T20:50:42.000Z","2020-06-02T20:50:43.000Z","2020-06-02T20:50:44.000Z","2020-06-02T20:50:45.000Z","2020-06-02T20:50:46.000Z","2020-06-02T20:50:47.000Z","2020-06-02T20:50:48.000Z","2020-06-02T20:50:49.000Z","2020-06-02T20:50:50.000Z","2020-06-02T20:50:51.000Z","2020-06-02T20:50:52.000Z","2020-06-02T20:50:53.0
00Z","2020-06-02T20:50:54.000Z","2020-06-02T20:50:55.000Z","2020-06-02T20:50:56.000Z","2020-06-02T20:50:57.000Z","2020-06-02T20:50:58.000Z","2020-06-02T20:50:59.000Z","2020-06-02T20:51:00.000Z","2020-06-02T20:51:01.000Z","2020-06-02T20:51:02.000Z","2020-06-02T20:51:03.000Z","2020-06-02T20:51:04.000Z","2020-06-02T20:51:05.000Z","2020-06-02T20:51:06.000Z","2020-06-02T20:51:07.000Z","2020-06-02T20:51:08.000Z","2020-06-02T20:51:09.000Z","2020-06-02T20:51:10.000Z","2020-06-02T20:51:11.000Z","2020-06-02T20:51:12.000Z","2020-06-02T20:51:13.000Z","2020-06-02T20:51:14.000Z","2020-06-02T20:51:15.000Z","2020-06-02T20:51:16.000Z","2020-06-02T20:51:17.000Z","2020-06-02T20:51:18.000Z","2020-06-02T20:51:19.000Z","2020-06-02T20:51:20.000Z","2020-06-02T20:51:21.000Z","2020-06-02T20:51:22.000Z","2020-06-02T20:51:23.000Z","2020-06-02T20:51:24.000Z","2020-06-02T20:51:25.000Z","2020-06-02T20:51:26.000Z","2020-06-02T20:51:27.000Z","2020-06-02T20:51:28.000Z","2020-06-02T20:51:29.000Z","2020-06-02T20:51:30.000Z","2020-06-02T20:51:31.000Z","2020-06-02T20:51:32.000Z","2020-06-02T20:51:33.000Z","2020-06-02T20:51:34.000Z","2020-06-02T20:51:35.000Z","2020-06-02T20:51:36.000Z","2020-06-02T20:51:37.000Z","2020-06-02T20:51:38.000Z","2020-06-02T20:51:39.000Z","2020-06-02T20:51:40.000Z","2020-06-02T20:51:41.000Z","2020-06-02T20:51:42.000Z","2020-06-02T20:51:43.000Z","2020-06-02T20:51:44.000Z","2020-06-02T20:51:45.000Z","2020-06-02T20:51:46.000Z","2020-06-02T20:51:47.000Z","2020-06-02T20:51:48.000Z","2020-06-02T20:51:49.000Z","2020-06-02T20:51:50.000Z","2020-06-02T20:51:51.000Z","2020-06-02T20:51:52.000Z","2020-06-02T20:51:53.000Z","2020-06-02T20:51:54.000Z","2020-06-02T20:51:55.000Z","2020-06-02T20:51:56.000Z","2020-06-02T20:51:57.000Z","2020-06-02T20:51:58.000Z","2020-06-02T20:51:59.000Z","2020-06-02T20:52:00.000Z","2020-06-02T20:52:01.000Z","2020-06-02T20:52:02.000Z","2020-06-02T20:52:03.000Z","2020-06-02T20:52:04.000Z","2020-06-02T20:52:05.000Z","2020-06-02T20:52:06.000Z","2020-06-02T20:52:07.000
Z","2020-06-02T20:52:08.000Z","2020-06-02T20:52:09.000Z","2020-06-02T20:52:10.000Z","2020-06-02T20:52:11.000Z","2020-06-02T20:52:12.000Z","2020-06-02T20:52:13.000Z","2020-06-02T20:52:14.000Z","2020-06-02T20:52:15.000Z","2020-06-02T20:52:16.000Z","2020-06-02T20:52:17.000Z","2020-06-02T20:52:18.000Z","2020-06-02T20:52:19.000Z","2020-06-02T20:52:20.000Z","2020-06-02T20:52:21.000Z","2020-06-02T20:52:22.000Z","2020-06-02T20:52:23.000Z","2020-06-02T20:52:24.000Z","2020-06-02T20:52:25.000Z","2020-06-02T20:52:26.000Z","2020-06-02T20:52:27.000Z","2020-06-02T20:52:28.000Z","2020-06-02T20:52:29.000Z","2020-06-02T20:52:30.000Z","2020-06-02T20:52:31.000Z","2020-06-02T20:52:32.000Z","2020-06-02T20:52:33.000Z","2020-06-02T20:52:34.000Z","2020-06-02T20:52:35.000Z","2020-06-02T20:52:36.000Z","2020-06-02T20:52:37.000Z","2020-06-02T20:52:38.000Z","2020-06-02T20:52:39.000Z","2020-06-02T20:52:40.000Z","2020-06-02T20:52:41.000Z","2020-06-02T20:52:42.000Z","2020-06-02T20:52:43.000Z","2020-06-02T20:52:44.000Z","2020-06-02T20:52:45.000Z","2020-06-02T20:52:46.000Z","2020-06-02T20:52:47.000Z","2020-06-02T20:52:48.000Z","2020-06-02T20:52:49.000Z","2020-06-02T20:52:50.000Z","2020-06-02T20:52:51.000Z","2020-06-02T20:52:52.000Z","2020-06-02T20:52:53.000Z","2020-06-02T20:52:54.000Z","2020-06-02T20:52:55.000Z","2020-06-02T20:52:56.000Z","2020-06-02T20:52:57.000Z","2020-06-02T20:52:58.000Z","2020-06-02T20:52:59.000Z","2020-06-02T20:53:00.000Z","2020-06-02T20:53:01.000Z","2020-06-02T20:53:02.000Z","2020-06-02T20:53:03.000Z","2020-06-02T20:53:04.000Z","2020-06-02T20:53:05.000Z","2020-06-02T20:53:06.000Z","2020-06-02T20:53:07.000Z","2020-06-02T20:53:08.000Z","2020-06-02T20:53:09.000Z","2020-06-02T20:53:10.000Z","2020-06-02T20:53:11.000Z","2020-06-02T20:53:12.000Z","2020-06-02T20:53:13.000Z","2020-06-02T20:53:14.000Z","2020-06-02T20:53:15.000Z","2020-06-02T20:53:16.000Z","2020-06-02T20:53:17.000Z","2020-06-02T20:53:18.000Z","2020-06-02T20:53:19.000Z","2020-06-02T20:53:20.000Z","2020-06-02T20:53:21.000Z"
,"2020-06-02T20:53:22.000Z","2020-06-02T20:53:23.000Z","2020-06-02T20:53:24.000Z","2020-06-02T20:53:25.000Z","2020-06-02T20:53:26.000Z","2020-06-02T20:53:27.000Z","2020-06-02T20:53:28.000Z","2020-06-02T20:53:29.000Z","2020-06-02T20:53:30.000Z","2020-06-02T20:53:31.000Z","2020-06-02T20:53:32.000Z","2020-06-02T20:53:33.000Z","2020-06-02T20:53:34.000Z","2020-06-02T20:53:35.000Z","2020-06-02T20:53:36.000Z","2020-06-02T20:53:37.000Z","2020-06-02T20:53:38.000Z","2020-06-02T20:53:39.000Z","2020-06-02T20:53:40.000Z","2020-06-02T20:53:41.000Z","2020-06-02T20:53:42.000Z","2020-06-02T20:53:43.000Z","2020-06-02T20:53:44.000Z","2020-06-02T20:53:45.000Z","2020-06-02T20:53:46.000Z","2020-06-02T20:53:47.000Z","2020-06-02T20:53:48.000Z","2020-06-02T20:53:49.000Z","2020-06-02T20:53:50.000Z","2020-06-02T20:53:51.000Z","2020-06-02T20:53:52.000Z","2020-06-02T20:53:53.000Z","2020-06-02T20:53:54.000Z","2020-06-02T20:53:55.000Z","2020-06-02T20:53:56.000Z","2020-06-02T20:53:57.000Z","2020-06-02T20:53:58.000Z","2020-06-02T20:53:59.000Z","2020-06-02T20:54:00.000Z","2020-06-02T20:54:01.000Z","2020-06-02T20:54:02.000Z","2020-06-02T20:54:03.000Z","2020-06-02T20:54:04.000Z","2020-06-02T20:54:05.000Z","2020-06-02T20:54:06.000Z","2020-06-02T20:54:07.000Z","2020-06-02T20:54:08.000Z","2020-06-02T20:54:09.000Z","2020-06-02T20:54:10.000Z","2020-06-02T20:54:11.000Z","2020-06-02T20:54:12.000Z","2020-06-02T20:54:13.000Z","2020-06-02T20:54:14.000Z","2020-06-02T20:54:15.000Z","2020-06-02T20:54:16.000Z","2020-06-02T20:54:17.000Z","2020-06-02T20:54:18.000Z","2020-06-02T20:54:19.000Z","2020-06-02T20:54:20.000Z","2020-06-02T20:54:21.000Z","2020-06-02T20:54:22.000Z","2020-06-02T20:54:23.000Z","2020-06-02T20:54:24.000Z","2020-06-02T20:54:25.000Z","2020-06-02T20:54:26.000Z","2020-06-02T20:54:27.000Z","2020-06-02T20:54:28.000Z","2020-06-02T20:54:29.000Z","2020-06-02T20:54:30.000Z","2020-06-02T20:54:31.000Z","2020-06-02T20:54:32.000Z","2020-06-02T20:54:33.000Z","2020-06-02T20:54:34.000Z","2020-06-02T20:54:35.000Z","
2020-06-02T20:54:36.000Z","2020-06-02T20:54:37.000Z","2020-06-02T20:54:38.000Z","2020-06-02T20:54:39.000Z","2020-06-02T20:54:40.000Z","2020-06-02T20:54:41.000Z","2020-06-02T20:54:42.000Z","2020-06-02T20:54:43.000Z","2020-06-02T20:54:44.000Z","2020-06-02T20:54:45.000Z","2020-06-02T20:54:46.000Z","2020-06-02T20:54:47.000Z","2020-06-02T20:54:48.000Z","2020-06-02T20:54:49.000Z","2020-06-02T20:54:50.000Z","2020-06-02T20:54:51.000Z","2020-06-02T20:54:52.000Z","2020-06-02T20:54:53.000Z","2020-06-02T20:54:54.000Z","2020-06-02T20:54:55.000Z","2020-06-02T20:54:56.000Z","2020-06-02T20:54:57.000Z","2020-06-02T20:54:58.000Z","2020-06-02T20:54:59.000Z","2020-06-02T20:55:00.000Z","2020-06-02T20:55:01.000Z","2020-06-02T20:55:02.000Z","2020-06-02T20:55:03.000Z","2020-06-02T20:55:04.000Z","2020-06-02T20:55:05.000Z","2020-06-02T20:55:06.000Z","2020-06-02T20:55:07.000Z","2020-06-02T20:55:08.000Z","2020-06-02T20:55:09.000Z","2020-06-02T20:55:10.000Z","2020-06-02T20:55:11.000Z","2020-06-02T20:55:12.000Z","2020-06-02T20:55:13.000Z","2020-06-02T20:55:14.000Z","2020-06-02T20:55:15.000Z","2020-06-02T20:55:16.000Z","2020-06-02T20:55:17.000Z","2020-06-02T20:55:18.000Z","2020-06-02T20:55:19.000Z","2020-06-02T20:55:20.000Z","2020-06-02T20:55:21.000Z","2020-06-02T20:55:22.000Z","2020-06-02T20:55:23.000Z","2020-06-02T20:55:24.000Z","2020-06-02T20:55:25.000Z","2020-06-02T20:55:26.000Z","2020-06-02T20:55:27.000Z","2020-06-02T20:55:28.000Z","2020-06-02T20:55:29.000Z","2020-06-02T20:55:30.000Z","2020-06-02T20:55:31.000Z","2020-06-02T20:55:32.000Z","2020-06-02T20:55:33.000Z","2020-06-02T20:55:34.000Z","2020-06-02T20:55:35.000Z","2020-06-02T20:55:36.000Z","2020-06-02T20:55:37.000Z","2020-06-02T20:55:38.000Z","2020-06-02T20:55:39.000Z","2020-06-02T20:55:40.000Z","2020-06-02T20:55:41.000Z","2020-06-02T20:55:42.000Z","2020-06-02T20:55:43.000Z","2020-06-02T20:55:44.000Z","2020-06-02T20:55:45.000Z","2020-06-02T20:55:46.000Z","2020-06-02T20:55:47.000Z","2020-06-02T20:55:48.000Z","2020-06-02T20:55:49.000Z","20
20-06-02T20:55:50.000Z","2020-06-02T20:55:51.000Z","2020-06-02T20:55:52.000Z","2020-06-02T20:55:53.000Z","2020-06-02T20:55:54.000Z","2020-06-02T20:55:55.000Z","2020-06-02T20:55:56.000Z","2020-06-02T20:55:57.000Z","2020-06-02T20:55:58.000Z","2020-06-02T20:55:59.000Z","2020-06-02T20:56:00.000Z","2020-06-02T20:56:01.000Z","2020-06-02T20:56:02.000Z","2020-06-02T20:56:03.000Z","2020-06-02T20:56:04.000Z","2020-06-02T20:56:05.000Z","2020-06-02T20:56:06.000Z","2020-06-02T20:56:07.000Z","2020-06-02T20:56:08.000Z","2020-06-02T20:56:09.000Z","2020-06-02T20:56:10.000Z","2020-06-02T20:56:11.000Z","2020-06-02T20:56:12.000Z","2020-06-02T20:56:13.000Z","2020-06-02T20:56:14.000Z","2020-06-02T20:56:15.000Z","2020-06-02T20:56:16.000Z","2020-06-02T20:56:17.000Z","2020-06-02T20:56:18.000Z","2020-06-02T20:56:19.000Z","2020-06-02T20:56:20.000Z","2020-06-02T20:56:21.000Z","2020-06-02T20:56:22.000Z","2020-06-02T20:56:23.000Z","2020-06-02T20:56:24.000Z","2020-06-02T20:56:25.000Z","2020-06-02T20:56:26.000Z","2020-06-02T20:56:27.000Z","2020-06-02T20:56:28.000Z","2020-06-02T20:56:29.000Z","2020-06-02T20:56:30.000Z","2020-06-02T20:56:31.000Z","2020-06-02T20:56:32.000Z","2020-06-02T20:56:33.000Z","2020-06-02T20:56:34.000Z","2020-06-02T20:56:35.000Z","2020-06-02T20:56:36.000Z","2020-06-02T20:56:37.000Z","2020-06-02T20:56:38.000Z","2020-06-02T20:56:39.000Z","2020-06-02T20:56:40.000Z","2020-06-02T20:56:41.000Z","2020-06-02T20:56:42.000Z","2020-06-02T20:56:43.000Z","2020-06-02T20:56:44.000Z","2020-06-02T20:56:45.000Z","2020-06-02T20:56:46.000Z","2020-06-02T20:56:47.000Z","2020-06-02T20:56:48.000Z","2020-06-02T20:56:49.000Z","2020-06-02T20:56:50.000Z","2020-06-02T20:56:51.000Z","2020-06-02T20:56:52.000Z","2020-06-02T20:56:53.000Z","2020-06-02T20:56:54.000Z","2020-06-02T20:56:55.000Z","2020-06-02T20:56:56.000Z","2020-06-02T20:56:57.000Z","2020-06-02T20:56:58.000Z","2020-06-02T20:56:59.000Z","2020-06-02T20:57:00.000Z","2020-06-02T20:57:01.000Z","2020-06-02T20:57:02.000Z","2020-06-02T20:57:03.000Z","2020
-06-02T20:57:04.000Z","2020-06-02T20:57:05.000Z","2020-06-02T20:57:06.000Z","2020-06-02T20:57:07.000Z","2020-06-02T20:57:08.000Z","2020-06-02T20:57:09.000Z","2020-06-02T20:57:10.000Z","2020-06-02T20:57:11.000Z","2020-06-02T20:57:12.000Z","2020-06-02T20:57:13.000Z","2020-06-02T20:57:14.000Z","2020-06-02T20:57:15.000Z","2020-06-02T20:57:16.000Z","2020-06-02T20:57:17.000Z","2020-06-02T20:57:18.000Z","2020-06-02T20:57:19.000Z","2020-06-02T20:57:20.000Z","2020-06-02T20:57:21.000Z","2020-06-02T20:57:22.000Z","2020-06-02T20:57:23.000Z","2020-06-02T20:57:24.000Z","2020-06-02T20:57:25.000Z","2020-06-02T20:57:26.000Z","2020-06-02T20:57:27.000Z","2020-06-02T20:57:28.000Z","2020-06-02T20:57:29.000Z","2020-06-02T20:57:30.000Z","2020-06-02T20:57:31.000Z","2020-06-02T20:57:32.000Z","2020-06-02T20:57:33.000Z","2020-06-02T20:57:34.000Z","2020-06-02T20:57:35.000Z","2020-06-02T20:57:36.000Z","2020-06-02T20:57:37.000Z","2020-06-02T20:57:38.000Z","2020-06-02T20:57:39.000Z","2020-06-02T20:57:40.000Z","2020-06-02T20:57:41.000Z","2020-06-02T20:57:42.000Z","2020-06-02T20:57:43.000Z","2020-06-02T20:57:44.000Z","2020-06-02T20:57:45.000Z","2020-06-02T20:57:46.000Z","2020-06-02T20:57:47.000Z","2020-06-02T20:57:48.000Z","2020-06-02T20:57:49.000Z","2020-06-02T20:57:50.000Z","2020-06-02T20:57:51.000Z","2020-06-02T20:57:52.000Z","2020-06-02T20:57:53.000Z","2020-06-02T20:57:54.000Z","2020-06-02T20:57:55.000Z","2020-06-02T20:57:56.000Z","2020-06-02T20:57:57.000Z","2020-06-02T20:57:58.000Z","2020-06-02T20:57:59.000Z","2020-06-02T20:58:00.000Z","2020-06-02T20:58:01.000Z","2020-06-02T20:58:02.000Z","2020-06-02T20:58:03.000Z","2020-06-02T20:58:04.000Z","2020-06-02T20:58:05.000Z","2020-06-02T20:58:06.000Z","2020-06-02T20:58:07.000Z","2020-06-02T20:58:08.000Z","2020-06-02T20:58:09.000Z","2020-06-02T20:58:10.000Z","2020-06-02T20:58:11.000Z","2020-06-02T20:58:12.000Z","2020-06-02T20:58:13.000Z","2020-06-02T20:58:14.000Z","2020-06-02T20:58:15.000Z","2020-06-02T20:58:16.000Z","2020-06-02T20:58:17.000Z","2020-0
6-02T20:58:18.000Z","2020-06-02T20:58:19.000Z","2020-06-02T20:58:20.000Z","2020-06-02T20:58:21.000Z","2020-06-02T20:58:22.000Z","2020-06-02T20:58:23.000Z","2020-06-02T20:58:24.000Z","2020-06-02T20:58:25.000Z","2020-06-02T20:58:26.000Z","2020-06-02T20:58:27.000Z","2020-06-02T20:58:28.000Z","2020-06-02T20:58:29.000Z","2020-06-02T20:58:30.000Z","2020-06-02T20:58:31.000Z","2020-06-02T20:58:32.000Z","2020-06-02T20:58:33.000Z","2020-06-02T20:58:34.000Z","2020-06-02T20:58:35.000Z","2020-06-02T20:58:36.000Z","2020-06-02T20:58:37.000Z","2020-06-02T20:58:38.000Z","2020-06-02T20:58:39.000Z","2020-06-02T20:58:40.000Z","2020-06-02T20:58:41.000Z","2020-06-02T20:58:42.000Z","2020-06-02T20:58:43.000Z","2020-06-02T20:58:44.000Z","2020-06-02T20:58:45.000Z","2020-06-02T20:58:46.000Z","2020-06-02T20:58:47.000Z","2020-06-02T20:58:48.000Z","2020-06-02T20:58:49.000Z","2020-06-02T20:58:50.000Z","2020-06-02T20:58:51.000Z","2020-06-02T20:58:52.000Z","2020-06-02T20:58:53.000Z","2020-06-02T20:58:54.000Z","2020-06-02T20:58:55.000Z","2020-06-02T20:58:56.000Z","2020-06-02T20:58:57.000Z","2020-06-02T20:58:58.000Z","2020-06-02T20:58:59.000Z","2020-06-02T20:59:00.000Z","2020-06-02T20:59:01.000Z","2020-06-02T20:59:02.000Z","2020-06-02T20:59:03.000Z","2020-06-02T20:59:04.000Z","2020-06-02T20:59:05.000Z","2020-06-02T20:59:06.000Z","2020-06-02T20:59:07.000Z","2020-06-02T20:59:08.000Z","2020-06-02T20:59:09.000Z","2020-06-02T20:59:10.000Z","2020-06-02T20:59:11.000Z","2020-06-02T20:59:12.000Z","2020-06-02T20:59:13.000Z","2020-06-02T20:59:14.000Z","2020-06-02T20:59:15.000Z","2020-06-02T20:59:16.000Z","2020-06-02T20:59:17.000Z","2020-06-02T20:59:18.000Z","2020-06-02T20:59:19.000Z","2020-06-02T20:59:20.000Z","2020-06-02T20:59:21.000Z","2020-06-02T20:59:22.000Z","2020-06-02T20:59:23.000Z","2020-06-02T20:59:24.000Z","2020-06-02T20:59:25.000Z","2020-06-02T20:59:26.000Z","2020-06-02T20:59:27.000Z","2020-06-02T20:59:28.000Z","2020-06-02T20:59:29.000Z","2020-06-02T20:59:30.000Z","2020-06-02T20:59:31.000Z","2020-06-
02T20:59:32.000Z","2020-06-02T20:59:33.000Z","2020-06-02T20:59:34.000Z","2020-06-02T20:59:35.000Z","2020-06-02T20:59:36.000Z","2020-06-02T20:59:37.000Z","2020-06-02T20:59:38.000Z","2020-06-02T20:59:39.000Z","2020-06-02T20:59:40.000Z","2020-06-02T20:59:41.000Z","2020-06-02T20:59:42.000Z","2020-06-02T20:59:43.000Z","2020-06-02T20:59:44.000Z","2020-06-02T20:59:45.000Z","2020-06-02T20:59:46.000Z","2020-06-02T20:59:47.000Z","2020-06-02T20:59:48.000Z","2020-06-02T20:59:49.000Z","2020-06-02T20:59:50.000Z","2020-06-02T20:59:51.000Z","2020-06-02T20:59:52.000Z","2020-06-02T20:59:53.000Z","2020-06-02T20:59:54.000Z","2020-06-02T20:59:55.000Z","2020-06-02T20:59:56.000Z","2020-06-02T20:59:57.000Z","2020-06-02T20:59:58.000Z","2020-06-02T20:59:59.000Z","2020-06-02T21:00:00.000Z","2020-06-02T21:00:01.000Z","2020-06-02T21:00:02.000Z","2020-06-02T21:00:03.000Z","2020-06-02T21:00:04.000Z","2020-06-02T21:00:05.000Z","2020-06-02T21:00:06.000Z","2020-06-02T21:00:07.000Z","2020-06-02T21:00:08.000Z","2020-06-02T21:00:09.000Z","2020-06-02T21:00:10.000Z","2020-06-02T21:00:11.000Z","2020-06-02T21:00:12.000Z","2020-06-02T21:00:13.000Z","2020-06-02T21:00:14.000Z","2020-06-02T21:00:15.000Z","2020-06-02T21:00:16.000Z","2020-06-02T21:00:17.000Z","2020-06-02T21:00:18.000Z","2020-06-02T21:00:19.000Z","2020-06-02T21:00:20.000Z","2020-06-02T21:00:21.000Z","2020-06-02T21:00:22.000Z","2020-06-02T21:00:23.000Z","2020-06-02T21:00:24.000Z","2020-06-02T21:00:25.000Z","2020-06-02T21:00:26.000Z","2020-06-02T21:00:27.000Z","2020-06-02T21:00:28.000Z","2020-06-02T21:00:29.000Z","2020-06-02T21:00:30.000Z","2020-06-02T21:00:31.000Z","2020-06-02T21:00:32.000Z","2020-06-02T21:00:33.000Z","2020-06-02T21:00:34.000Z","2020-06-02T21:00:35.000Z","2020-06-02T21:00:36.000Z","2020-06-02T21:00:37.000Z","2020-06-02T21:00:38.000Z","2020-06-02T21:00:39.000Z","2020-06-02T21:00:40.000Z","2020-06-02T21:00:41.000Z","2020-06-02T21:00:42.000Z","2020-06-02T21:00:43.000Z","2020-06-02T21:00:44.000Z","2020-06-02T21:00:45.000Z","2020-06-02
T21:00:46.000Z","2020-06-02T21:00:47.000Z","2020-06-02T21:00:48.000Z","2020-06-02T21:00:49.000Z","2020-06-02T21:00:50.000Z","2020-06-02T21:00:51.000Z","2020-06-02T21:00:52.000Z","2020-06-02T21:00:53.000Z","2020-06-02T21:00:54.000Z","2020-06-02T21:00:55.000Z","2020-06-02T21:00:56.000Z","2020-06-02T21:00:57.000Z","2020-06-02T21:00:58.000Z","2020-06-02T21:00:59.000Z","2020-06-02T21:01:00.000Z","2020-06-02T21:01:01.000Z","2020-06-02T21:01:02.000Z","2020-06-02T21:01:03.000Z","2020-06-02T21:01:04.000Z","2020-06-02T21:01:05.000Z","2020-06-02T21:01:06.000Z","2020-06-02T21:01:07.000Z","2020-06-02T21:01:08.000Z","2020-06-02T21:01:09.000Z","2020-06-02T21:01:10.000Z","2020-06-02T21:01:11.000Z","2020-06-02T21:01:12.000Z","2020-06-02T21:01:13.000Z","2020-06-02T21:01:14.000Z","2020-06-02T21:01:15.000Z","2020-06-02T21:01:16.000Z","2020-06-02T21:01:17.000Z","2020-06-02T21:01:18.000Z","2020-06-02T21:01:19.000Z","2020-06-02T21:01:20.000Z","2020-06-02T21:01:21.000Z","2020-06-02T21:01:22.000Z","2020-06-02T21:01:23.000Z","2020-06-02T21:01:24.000Z","2020-06-02T21:01:25.000Z","2020-06-02T21:01:26.000Z","2020-06-02T21:01:27.000Z","2020-06-02T21:01:28.000Z","2020-06-02T21:01:29.000Z","2020-06-02T21:01:30.000Z","2020-06-02T21:01:31.000Z","2020-06-02T21:01:32.000Z","2020-06-02T21:01:33.000Z","2020-06-02T21:01:34.000Z","2020-06-02T21:01:35.000Z","2020-06-02T21:01:36.000Z","2020-06-02T21:01:37.000Z","2020-06-02T21:01:38.000Z","2020-06-02T21:01:39.000Z","2020-06-02T21:01:40.000Z","2020-06-02T21:01:41.000Z","2020-06-02T21:01:42.000Z","2020-06-02T21:01:43.000Z","2020-06-02T21:01:44.000Z","2020-06-02T21:01:45.000Z","2020-06-02T21:01:46.000Z","2020-06-02T21:01:47.000Z","2020-06-02T21:01:48.000Z","2020-06-02T21:01:49.000Z","2020-06-02T21:01:50.000Z","2020-06-02T21:01:51.000Z","2020-06-02T21:01:52.000Z","2020-06-02T21:01:53.000Z","2020-06-02T21:01:54.000Z","2020-06-02T21:01:55.000Z","2020-06-02T21:01:56.000Z","2020-06-02T21:01:57.000Z","2020-06-02T21:01:58.000Z","2020-06-02T21:01:59.000Z","2020-06-02T2
1:02:00.000Z","2020-06-02T21:02:01.000Z","2020-06-02T21:02:02.000Z","2020-06-02T21:02:03.000Z","2020-06-02T21:02:04.000Z","2020-06-02T21:02:05.000Z","2020-06-02T21:02:06.000Z","2020-06-02T21:02:07.000Z","2020-06-02T21:02:08.000Z","2020-06-02T21:02:09.000Z","2020-06-02T21:02:10.000Z","2020-06-02T21:02:11.000Z","2020-06-02T21:02:12.000Z","2020-06-02T21:02:13.000Z","2020-06-02T21:02:14.000Z","2020-06-02T21:02:15.000Z","2020-06-02T21:02:16.000Z","2020-06-02T21:02:17.000Z","2020-06-02T21:02:18.000Z","2020-06-02T21:02:19.000Z","2020-06-02T21:02:20.000Z","2020-06-02T21:02:21.000Z","2020-06-02T21:02:22.000Z","2020-06-02T21:02:23.000Z","2020-06-02T21:02:24.000Z","2020-06-02T21:02:25.000Z","2020-06-02T21:02:26.000Z","2020-06-02T21:02:27.000Z","2020-06-02T21:02:28.000Z","2020-06-02T21:02:29.000Z","2020-06-02T21:02:30.000Z","2020-06-02T21:02:31.000Z","2020-06-02T21:02:32.000Z","2020-06-02T21:02:33.000Z","2020-06-02T21:02:34.000Z","2020-06-02T21:02:35.000Z","2020-06-02T21:02:36.000Z","2020-06-02T21:02:37.000Z","2020-06-02T21:02:38.000Z","2020-06-02T21:02:39.000Z","2020-06-02T21:02:40.000Z","2020-06-02T21:02:41.000Z","2020-06-02T21:02:42.000Z","2020-06-02T21:02:43.000Z","2020-06-02T21:02:44.000Z","2020-06-02T21:02:45.000Z","2020-06-02T21:02:46.000Z","2020-06-02T21:02:47.000Z","2020-06-02T21:02:48.000Z","2020-06-02T21:02:49.000Z","2020-06-02T21:02:50.000Z","2020-06-02T21:02:51.000Z","2020-06-02T21:02:52.000Z","2020-06-02T21:02:53.000Z","2020-06-02T21:02:54.000Z","2020-06-02T21:02:55.000Z","2020-06-02T21:02:56.000Z","2020-06-02T21:02:57.000Z","2020-06-02T21:02:58.000Z","2020-06-02T21:02:59.000Z","2020-06-02T21:03:00.000Z","2020-06-02T21:03:01.000Z","2020-06-02T21:03:02.000Z","2020-06-02T21:03:03.000Z","2020-06-02T21:03:04.000Z","2020-06-02T21:03:05.000Z","2020-06-02T21:03:06.000Z","2020-06-02T21:03:07.000Z","2020-06-02T21:03:08.000Z","2020-06-02T21:03:09.000Z","2020-06-02T21:03:10.000Z","2020-06-02T21:03:11.000Z","2020-06-02T21:03:12.000Z","2020-06-02T21:03:13.000Z","2020-06-02T21:
03:14.000Z","2020-06-02T21:03:15.000Z","2020-06-02T21:03:16.000Z","2020-06-02T21:03:17.000Z","2020-06-02T21:03:18.000Z","2020-06-02T21:03:19.000Z","2020-06-02T21:03:20.000Z","2020-06-02T21:03:21.000Z","2020-06-02T21:03:22.000Z","2020-06-02T21:03:23.000Z","2020-06-02T21:03:24.000Z","2020-06-02T21:03:25.000Z","2020-06-02T21:03:26.000Z","2020-06-02T21:03:27.000Z","2020-06-02T21:03:28.000Z","2020-06-02T21:03:29.000Z","2020-06-02T21:03:30.000Z","2020-06-02T21:03:31.000Z","2020-06-02T21:03:32.000Z","2020-06-02T21:03:33.000Z","2020-06-02T21:03:34.000Z","2020-06-02T21:03:35.000Z","2020-06-02T21:03:36.000Z","2020-06-02T21:03:37.000Z","2020-06-02T21:03:38.000Z","2020-06-02T21:03:39.000Z","2020-06-02T21:03:40.000Z","2020-06-02T21:03:41.000Z","2020-06-02T21:03:42.000Z","2020-06-02T21:03:43.000Z","2020-06-02T21:03:44.000Z","2020-06-02T21:03:45.000Z","2020-06-02T21:03:46.000Z","2020-06-02T21:03:47.000Z","2020-06-02T21:03:48.000Z","2020-06-02T21:03:49.000Z","2020-06-02T21:03:50.000Z","2020-06-02T21:03:51.000Z","2020-06-02T21:03:52.000Z","2020-06-02T21:03:53.000Z","2020-06-02T21:03:54.000Z","2020-06-02T21:03:55.000Z","2020-06-02T21:03:56.000Z","2020-06-02T21:03:57.000Z","2020-06-02T21:03:58.000Z","2020-06-02T21:03:59.000Z","2020-06-02T21:04:00.000Z","2020-06-02T21:04:01.000Z","2020-06-02T21:04:02.000Z","2020-06-02T21:04:03.000Z","2020-06-02T21:04:04.000Z","2020-06-02T21:04:05.000Z","2020-06-02T21:04:06.000Z","2020-06-02T21:04:07.000Z","2020-06-02T21:04:08.000Z","2020-06-02T21:04:09.000Z","2020-06-02T21:04:10.000Z","2020-06-02T21:04:11.000Z","2020-06-02T21:04:12.000Z","2020-06-02T21:04:13.000Z","2020-06-02T21:04:14.000Z","2020-06-02T21:04:15.000Z","2020-06-02T21:04:16.000Z","2020-06-02T21:04:17.000Z","2020-06-02T21:04:18.000Z","2020-06-02T21:04:19.000Z","2020-06-02T21:04:20.000Z","2020-06-02T21:04:21.000Z","2020-06-02T21:04:22.000Z","2020-06-02T21:04:23.000Z","2020-06-02T21:04:24.000Z","2020-06-02T21:04:25.000Z","2020-06-02T21:04:26.000Z","2020-06-02T21:04:27.000Z","2020-06-02T21:04
:28.000Z","2020-06-02T21:04:29.000Z","2020-06-02T21:04:30.000Z","2020-06-02T21:04:31.000Z","2020-06-02T21:04:32.000Z","2020-06-02T21:04:33.000Z","2020-06-02T21:04:34.000Z","2020-06-02T21:04:35.000Z","2020-06-02T21:04:36.000Z","2020-06-02T21:04:37.000Z","2020-06-02T21:04:38.000Z","2020-06-02T21:04:39.000Z","2020-06-02T21:04:40.000Z","2020-06-02T21:04:41.000Z","2020-06-02T21:04:42.000Z","2020-06-02T21:04:43.000Z","2020-06-02T21:04:44.000Z","2020-06-02T21:04:45.000Z","2020-06-02T21:04:46.000Z","2020-06-02T21:04:47.000Z","2020-06-02T21:04:48.000Z","2020-06-02T21:04:49.000Z","2020-06-02T21:04:50.000Z","2020-06-02T21:04:51.000Z","2020-06-02T21:04:52.000Z","2020-06-02T21:04:53.000Z","2020-06-02T21:04:54.000Z","2020-06-02T21:04:55.000Z","2020-06-02T21:04:56.000Z","2020-06-02T21:04:57.000Z","2020-06-02T21:04:58.000Z","2020-06-02T21:04:59.000Z","2020-06-02T21:05:00.000Z","2020-06-02T21:05:01.000Z","2020-06-02T21:05:02.000Z","2020-06-02T21:05:03.000Z","2020-06-02T21:05:04.000Z","2020-06-02T21:05:05.000Z","2020-06-02T21:05:06.000Z","2020-06-02T21:05:07.000Z","2020-06-02T21:05:08.000Z","2020-06-02T21:05:09.000Z","2020-06-02T21:05:10.000Z","2020-06-02T21:05:11.000Z","2020-06-02T21:05:12.000Z","2020-06-02T21:05:13.000Z","2020-06-02T21:05:14.000Z","2020-06-02T21:05:15.000Z","2020-06-02T21:05:16.000Z","2020-06-02T21:05:17.000Z","2020-06-02T21:05:18.000Z","2020-06-02T21:05:19.000Z","2020-06-02T21:05:20.000Z","2020-06-02T21:05:21.000Z","2020-06-02T21:05:22.000Z","2020-06-02T21:05:23.000Z","2020-06-02T21:05:24.000Z","2020-06-02T21:05:25.000Z","2020-06-02T21:05:26.000Z","2020-06-02T21:05:27.000Z","2020-06-02T21:05:28.000Z","2020-06-02T21:05:29.000Z","2020-06-02T21:05:30.000Z","2020-06-02T21:05:31.000Z","2020-06-02T21:05:32.000Z","2020-06-02T21:05:33.000Z","2020-06-02T21:05:34.000Z","2020-06-02T21:05:35.000Z","2020-06-02T21:05:36.000Z","2020-06-02T21:05:37.000Z","2020-06-02T21:05:38.000Z","2020-06-02T21:05:39.000Z","2020-06-02T21:05:40.000Z","2020-06-02T21:05:41.000Z","2020-06-02T21:05:4
2.000Z","2020-06-02T21:05:43.000Z","2020-06-02T21:05:44.000Z","2020-06-02T21:05:45.000Z","2020-06-02T21:05:46.000Z","2020-06-02T21:05:47.000Z","2020-06-02T21:05:48.000Z","2020-06-02T21:05:49.000Z","2020-06-02T21:05:50.000Z","2020-06-02T21:05:51.000Z","2020-06-02T21:05:52.000Z","2020-06-02T21:05:53.000Z","2020-06-02T21:05:54.000Z","2020-06-02T21:05:55.000Z","2020-06-02T21:05:56.000Z","2020-06-02T21:05:57.000Z","2020-06-02T21:05:58.000Z","2020-06-02T21:05:59.000Z","2020-06-02T21:06:00.000Z","2020-06-02T21:06:01.000Z","2020-06-02T21:06:02.000Z","2020-06-02T21:06:03.000Z","2020-06-02T21:06:04.000Z","2020-06-02T21:06:05.000Z","2020-06-02T21:06:06.000Z","2020-06-02T21:06:07.000Z","2020-06-02T21:06:08.000Z","2020-06-02T21:06:09.000Z","2020-06-02T21:06:10.000Z","2020-06-02T21:06:11.000Z","2020-06-02T21:06:12.000Z","2020-06-02T21:06:13.000Z","2020-06-02T21:06:14.000Z","2020-06-02T21:06:15.000Z","2020-06-02T21:06:16.000Z","2020-06-02T21:06:17.000Z","2020-06-02T21:06:18.000Z","2020-06-02T21:06:19.000Z","2020-06-02T21:06:20.000Z","2020-06-02T21:06:21.000Z","2020-06-02T21:06:22.000Z","2020-06-02T21:06:23.000Z","2020-06-02T21:06:24.000Z","2020-06-02T21:06:25.000Z","2020-06-02T21:06:26.000Z","2020-06-02T21:06:27.000Z","2020-06-02T21:06:28.000Z","2020-06-02T21:06:29.000Z","2020-06-02T21:06:30.000Z","2020-06-02T21:06:31.000Z","2020-06-02T21:06:32.000Z","2020-06-02T21:06:33.000Z","2020-06-02T21:06:34.000Z","2020-06-02T21:06:35.000Z","2020-06-02T21:06:36.000Z","2020-06-02T21:06:37.000Z","2020-06-02T21:06:38.000Z","2020-06-02T21:06:39.000Z","2020-06-02T21:06:40.000Z","2020-06-02T21:06:41.000Z","2020-06-02T21:06:42.000Z","2020-06-02T21:06:43.000Z","2020-06-02T21:06:44.000Z","2020-06-02T21:06:45.000Z","2020-06-02T21:06:46.000Z","2020-06-02T21:06:47.000Z","2020-06-02T21:06:48.000Z","2020-06-02T21:06:49.000Z","2020-06-02T21:06:50.000Z","2020-06-02T21:06:51.000Z","2020-06-02T21:06:52.000Z","2020-06-02T21:06:53.000Z","2020-06-02T21:06:54.000Z","2020-06-02T21:06:55.000Z","2020-06-02T21:06:56.
["2020-06-02T21:06:57.000Z" … "2020-06-02T21:29:09.000Z"]  (truncated notebook output: a one-second-cadence UTC time index; the full per-sample timestamp listing is omitted for brevity)
20-06-02T21:29:10.000Z","2020-06-02T21:29:11.000Z","2020-06-02T21:29:12.000Z","2020-06-02T21:29:13.000Z","2020-06-02T21:29:14.000Z","2020-06-02T21:29:15.000Z","2020-06-02T21:29:16.000Z","2020-06-02T21:29:17.000Z","2020-06-02T21:29:18.000Z","2020-06-02T21:29:19.000Z","2020-06-02T21:29:20.000Z","2020-06-02T21:29:21.000Z","2020-06-02T21:29:22.000Z","2020-06-02T21:29:23.000Z","2020-06-02T21:29:24.000Z","2020-06-02T21:29:25.000Z","2020-06-02T21:29:26.000Z","2020-06-02T21:29:27.000Z","2020-06-02T21:29:28.000Z","2020-06-02T21:29:29.000Z","2020-06-02T21:29:30.000Z","2020-06-02T21:29:31.000Z","2020-06-02T21:29:32.000Z","2020-06-02T21:29:33.000Z","2020-06-02T21:29:34.000Z","2020-06-02T21:29:35.000Z","2020-06-02T21:29:36.000Z","2020-06-02T21:29:37.000Z","2020-06-02T21:29:38.000Z","2020-06-02T21:29:39.000Z","2020-06-02T21:29:40.000Z","2020-06-02T21:29:41.000Z","2020-06-02T21:29:42.000Z","2020-06-02T21:29:43.000Z","2020-06-02T21:29:44.000Z","2020-06-02T21:29:45.000Z","2020-06-02T21:29:46.000Z","2020-06-02T21:29:47.000Z","2020-06-02T21:29:48.000Z","2020-06-02T21:29:49.000Z","2020-06-02T21:29:50.000Z","2020-06-02T21:29:51.000Z","2020-06-02T21:29:52.000Z","2020-06-02T21:29:53.000Z","2020-06-02T21:29:54.000Z","2020-06-02T21:29:55.000Z","2020-06-02T21:29:56.000Z","2020-06-02T21:29:57.000Z","2020-06-02T21:29:58.000Z","2020-06-02T21:29:59.000Z","2020-06-02T21:30:00.000Z","2020-06-02T21:30:01.000Z","2020-06-02T21:30:02.000Z","2020-06-02T21:30:03.000Z","2020-06-02T21:30:04.000Z","2020-06-02T21:30:05.000Z","2020-06-02T21:30:06.000Z","2020-06-02T21:30:07.000Z","2020-06-02T21:30:08.000Z","2020-06-02T21:30:09.000Z","2020-06-02T21:30:10.000Z","2020-06-02T21:30:11.000Z","2020-06-02T21:30:12.000Z","2020-06-02T21:30:13.000Z","2020-06-02T21:30:14.000Z","2020-06-02T21:30:15.000Z","2020-06-02T21:30:16.000Z","2020-06-02T21:30:17.000Z","2020-06-02T21:30:18.000Z","2020-06-02T21:30:19.000Z","2020-06-02T21:30:20.000Z","2020-06-02T21:30:21.000Z","2020-06-02T21:30:22.000Z","2020-06-02T21:30:23.000Z","2020
-06-02T21:30:24.000Z","2020-06-02T21:30:25.000Z","2020-06-02T21:30:26.000Z","2020-06-02T21:30:27.000Z","2020-06-02T21:30:28.000Z","2020-06-02T21:30:29.000Z","2020-06-02T21:30:30.000Z","2020-06-02T21:30:31.000Z","2020-06-02T21:30:32.000Z","2020-06-02T21:30:33.000Z","2020-06-02T21:30:34.000Z","2020-06-02T21:30:35.000Z","2020-06-02T21:30:36.000Z","2020-06-02T21:30:37.000Z","2020-06-02T21:30:38.000Z","2020-06-02T21:30:39.000Z","2020-06-02T21:30:40.000Z","2020-06-02T21:30:41.000Z","2020-06-02T21:30:42.000Z","2020-06-02T21:30:43.000Z","2020-06-02T21:30:44.000Z","2020-06-02T21:30:45.000Z","2020-06-02T21:30:46.000Z","2020-06-02T21:30:47.000Z","2020-06-02T21:30:48.000Z","2020-06-02T21:30:49.000Z","2020-06-02T21:30:50.000Z","2020-06-02T21:30:51.000Z","2020-06-02T21:30:52.000Z","2020-06-02T21:30:53.000Z","2020-06-02T21:30:54.000Z","2020-06-02T21:30:55.000Z","2020-06-02T21:30:56.000Z","2020-06-02T21:30:57.000Z","2020-06-02T21:30:58.000Z","2020-06-02T21:30:59.000Z","2020-06-02T21:31:00.000Z","2020-06-02T21:31:01.000Z","2020-06-02T21:31:02.000Z","2020-06-02T21:31:03.000Z","2020-06-02T21:31:04.000Z","2020-06-02T21:31:05.000Z","2020-06-02T21:31:06.000Z","2020-06-02T21:31:07.000Z","2020-06-02T21:31:08.000Z","2020-06-02T21:31:09.000Z","2020-06-02T21:31:10.000Z","2020-06-02T21:31:11.000Z","2020-06-02T21:31:12.000Z","2020-06-02T21:31:13.000Z","2020-06-02T21:31:14.000Z","2020-06-02T21:31:15.000Z","2020-06-02T21:31:16.000Z","2020-06-02T21:31:17.000Z","2020-06-02T21:31:18.000Z","2020-06-02T21:31:19.000Z","2020-06-02T21:31:20.000Z","2020-06-02T21:31:21.000Z","2020-06-02T21:31:22.000Z","2020-06-02T21:31:23.000Z","2020-06-02T21:31:24.000Z","2020-06-02T21:31:25.000Z","2020-06-02T21:31:26.000Z","2020-06-02T21:31:27.000Z","2020-06-02T21:31:28.000Z","2020-06-02T21:31:29.000Z","2020-06-02T21:31:30.000Z","2020-06-02T21:31:31.000Z","2020-06-02T21:31:32.000Z","2020-06-02T21:31:33.000Z","2020-06-02T21:31:34.000Z","2020-06-02T21:31:35.000Z","2020-06-02T21:31:36.000Z","2020-06-02T21:31:37.000Z","2020-0
6-02T21:31:38.000Z","2020-06-02T21:31:39.000Z","2020-06-02T21:31:40.000Z","2020-06-02T21:31:41.000Z","2020-06-02T21:31:42.000Z","2020-06-02T21:31:43.000Z","2020-06-02T21:31:44.000Z","2020-06-02T21:31:45.000Z","2020-06-02T21:31:46.000Z","2020-06-02T21:31:47.000Z","2020-06-02T21:31:48.000Z","2020-06-02T21:31:49.000Z","2020-06-02T21:31:50.000Z","2020-06-02T21:31:51.000Z","2020-06-02T21:31:52.000Z","2020-06-02T21:31:53.000Z","2020-06-02T21:31:54.000Z","2020-06-02T21:31:55.000Z","2020-06-02T21:31:56.000Z","2020-06-02T21:31:57.000Z","2020-06-02T21:31:58.000Z","2020-06-02T21:31:59.000Z","2020-06-02T21:32:00.000Z","2020-06-02T21:32:01.000Z","2020-06-02T21:32:02.000Z","2020-06-02T21:32:03.000Z","2020-06-02T21:32:04.000Z","2020-06-02T21:32:05.000Z","2020-06-02T21:32:06.000Z","2020-06-02T21:32:07.000Z","2020-06-02T21:32:08.000Z","2020-06-02T21:32:09.000Z","2020-06-02T21:32:10.000Z","2020-06-02T21:32:11.000Z","2020-06-02T21:32:12.000Z","2020-06-02T21:32:13.000Z","2020-06-02T21:32:14.000Z","2020-06-02T21:32:15.000Z","2020-06-02T21:32:16.000Z","2020-06-02T21:32:17.000Z","2020-06-02T21:32:18.000Z","2020-06-02T21:32:19.000Z","2020-06-02T21:32:20.000Z","2020-06-02T21:32:21.000Z","2020-06-02T21:32:22.000Z","2020-06-02T21:32:23.000Z","2020-06-02T21:32:24.000Z","2020-06-02T21:32:25.000Z","2020-06-02T21:32:26.000Z","2020-06-02T21:32:27.000Z","2020-06-02T21:32:28.000Z","2020-06-02T21:32:29.000Z","2020-06-02T21:32:30.000Z","2020-06-02T21:32:31.000Z","2020-06-02T21:32:32.000Z","2020-06-02T21:32:33.000Z","2020-06-02T21:32:34.000Z","2020-06-02T21:32:35.000Z","2020-06-02T21:32:36.000Z","2020-06-02T21:32:37.000Z","2020-06-02T21:32:38.000Z","2020-06-02T21:32:39.000Z","2020-06-02T21:32:40.000Z","2020-06-02T21:32:41.000Z","2020-06-02T21:32:42.000Z","2020-06-02T21:32:43.000Z","2020-06-02T21:32:44.000Z","2020-06-02T21:32:45.000Z","2020-06-02T21:32:46.000Z","2020-06-02T21:32:47.000Z","2020-06-02T21:32:48.000Z","2020-06-02T21:32:49.000Z","2020-06-02T21:32:50.000Z","2020-06-02T21:32:51.000Z","2020-06-
02T21:32:52.000Z","2020-06-02T21:32:53.000Z","2020-06-02T21:32:54.000Z","2020-06-02T21:32:55.000Z","2020-06-02T21:32:56.000Z","2020-06-02T21:32:57.000Z","2020-06-02T21:32:58.000Z","2020-06-02T21:32:59.000Z","2020-06-02T21:33:00.000Z","2020-06-02T21:33:01.000Z","2020-06-02T21:33:02.000Z","2020-06-02T21:33:03.000Z","2020-06-02T21:33:04.000Z","2020-06-02T21:33:05.000Z","2020-06-02T21:33:06.000Z","2020-06-02T21:33:07.000Z","2020-06-02T21:33:08.000Z","2020-06-02T21:33:09.000Z","2020-06-02T21:33:10.000Z","2020-06-02T21:33:11.000Z","2020-06-02T21:33:12.000Z","2020-06-02T21:33:13.000Z","2020-06-02T21:33:14.000Z","2020-06-02T21:33:15.000Z","2020-06-02T21:33:16.000Z","2020-06-02T21:33:17.000Z","2020-06-02T21:33:18.000Z","2020-06-02T21:33:19.000Z","2020-06-02T21:33:20.000Z","2020-06-02T21:33:21.000Z","2020-06-02T21:33:22.000Z","2020-06-02T21:33:23.000Z","2020-06-02T21:33:24.000Z","2020-06-02T21:33:25.000Z","2020-06-02T21:33:26.000Z","2020-06-02T21:33:27.000Z","2020-06-02T21:33:28.000Z","2020-06-02T21:33:29.000Z","2020-06-02T21:33:30.000Z","2020-06-02T21:33:31.000Z","2020-06-02T21:33:32.000Z","2020-06-02T21:33:33.000Z","2020-06-02T21:33:34.000Z","2020-06-02T21:33:35.000Z","2020-06-02T21:33:36.000Z","2020-06-02T21:33:37.000Z","2020-06-02T21:33:38.000Z","2020-06-02T21:33:39.000Z","2020-06-02T21:33:40.000Z","2020-06-02T21:33:41.000Z","2020-06-02T21:33:42.000Z","2020-06-02T21:33:43.000Z","2020-06-02T21:33:44.000Z","2020-06-02T21:33:45.000Z","2020-06-02T21:33:46.000Z","2020-06-02T21:33:47.000Z","2020-06-02T21:33:48.000Z","2020-06-02T21:33:49.000Z","2020-06-02T21:33:50.000Z","2020-06-02T21:33:51.000Z","2020-06-02T21:33:52.000Z","2020-06-02T21:33:53.000Z","2020-06-02T21:33:54.000Z","2020-06-02T21:33:55.000Z","2020-06-02T21:33:56.000Z","2020-06-02T21:33:57.000Z","2020-06-02T21:33:58.000Z","2020-06-02T21:33:59.000Z","2020-06-02T21:34:00.000Z","2020-06-02T21:34:01.000Z","2020-06-02T21:34:02.000Z","2020-06-02T21:34:03.000Z","2020-06-02T21:34:04.000Z","2020-06-02T21:34:05.000Z","2020-06-02
T21:34:06.000Z","2020-06-02T21:34:07.000Z","2020-06-02T21:34:08.000Z","2020-06-02T21:34:09.000Z","2020-06-02T21:34:10.000Z","2020-06-02T21:34:11.000Z","2020-06-02T21:34:12.000Z","2020-06-02T21:34:13.000Z","2020-06-02T21:34:14.000Z","2020-06-02T21:34:15.000Z","2020-06-02T21:34:16.000Z","2020-06-02T21:34:17.000Z","2020-06-02T21:34:18.000Z","2020-06-02T21:34:19.000Z","2020-06-02T21:34:20.000Z","2020-06-02T21:34:21.000Z","2020-06-02T21:34:22.000Z","2020-06-02T21:34:23.000Z","2020-06-02T21:34:24.000Z","2020-06-02T21:34:25.000Z","2020-06-02T21:34:26.000Z","2020-06-02T21:34:27.000Z","2020-06-02T21:34:28.000Z","2020-06-02T21:34:29.000Z","2020-06-02T21:34:30.000Z","2020-06-02T21:34:31.000Z","2020-06-02T21:34:32.000Z","2020-06-02T21:34:33.000Z","2020-06-02T21:34:34.000Z","2020-06-02T21:34:35.000Z","2020-06-02T21:34:36.000Z","2020-06-02T21:34:37.000Z","2020-06-02T21:34:38.000Z","2020-06-02T21:34:39.000Z","2020-06-02T21:34:40.000Z","2020-06-02T21:34:41.000Z","2020-06-02T21:34:42.000Z","2020-06-02T21:34:43.000Z","2020-06-02T21:34:44.000Z","2020-06-02T21:34:45.000Z","2020-06-02T21:34:46.000Z","2020-06-02T21:34:47.000Z","2020-06-02T21:34:48.000Z","2020-06-02T21:34:49.000Z","2020-06-02T21:34:50.000Z","2020-06-02T21:34:51.000Z","2020-06-02T21:34:52.000Z","2020-06-02T21:34:53.000Z","2020-06-02T21:34:54.000Z","2020-06-02T21:34:55.000Z","2020-06-02T21:34:56.000Z","2020-06-02T21:34:57.000Z","2020-06-02T21:34:58.000Z","2020-06-02T21:34:59.000Z","2020-06-02T21:35:00.000Z","2020-06-02T21:35:01.000Z","2020-06-02T21:35:02.000Z","2020-06-02T21:35:03.000Z","2020-06-02T21:35:04.000Z","2020-06-02T21:35:05.000Z","2020-06-02T21:35:06.000Z","2020-06-02T21:35:07.000Z","2020-06-02T21:35:08.000Z","2020-06-02T21:35:09.000Z","2020-06-02T21:35:10.000Z","2020-06-02T21:35:11.000Z","2020-06-02T21:35:12.000Z","2020-06-02T21:35:13.000Z","2020-06-02T21:35:14.000Z","2020-06-02T21:35:15.000Z","2020-06-02T21:35:16.000Z","2020-06-02T21:35:17.000Z","2020-06-02T21:35:18.000Z","2020-06-02T21:35:19.000Z","2020-06-02T2
1:35:20.000Z","2020-06-02T21:35:21.000Z","2020-06-02T21:35:22.000Z","2020-06-02T21:35:23.000Z","2020-06-02T21:35:24.000Z","2020-06-02T21:35:25.000Z","2020-06-02T21:35:26.000Z","2020-06-02T21:35:27.000Z","2020-06-02T21:35:28.000Z","2020-06-02T21:35:29.000Z","2020-06-02T21:35:30.000Z","2020-06-02T21:35:31.000Z","2020-06-02T21:35:32.000Z","2020-06-02T21:35:33.000Z","2020-06-02T21:35:34.000Z","2020-06-02T21:35:35.000Z","2020-06-02T21:35:36.000Z","2020-06-02T21:35:37.000Z","2020-06-02T21:35:38.000Z","2020-06-02T21:35:39.000Z","2020-06-02T21:35:40.000Z","2020-06-02T21:35:41.000Z","2020-06-02T21:35:42.000Z","2020-06-02T21:35:43.000Z","2020-06-02T21:35:44.000Z","2020-06-02T21:35:45.000Z","2020-06-02T21:35:46.000Z","2020-06-02T21:35:47.000Z","2020-06-02T21:35:48.000Z","2020-06-02T21:35:49.000Z","2020-06-02T21:35:50.000Z","2020-06-02T21:35:51.000Z","2020-06-02T21:35:52.000Z","2020-06-02T21:35:53.000Z","2020-06-02T21:35:54.000Z","2020-06-02T21:35:55.000Z","2020-06-02T21:35:56.000Z","2020-06-02T21:35:57.000Z","2020-06-02T21:35:58.000Z","2020-06-02T21:35:59.000Z","2020-06-02T21:36:00.000Z","2020-06-02T21:36:01.000Z","2020-06-02T21:36:02.000Z","2020-06-02T21:36:03.000Z","2020-06-02T21:36:04.000Z","2020-06-02T21:36:05.000Z","2020-06-02T21:36:06.000Z","2020-06-02T21:36:07.000Z","2020-06-02T21:36:08.000Z","2020-06-02T21:36:09.000Z","2020-06-02T21:36:10.000Z","2020-06-02T21:36:11.000Z","2020-06-02T21:36:12.000Z","2020-06-02T21:36:13.000Z","2020-06-02T21:36:14.000Z","2020-06-02T21:36:15.000Z","2020-06-02T21:36:16.000Z","2020-06-02T21:36:17.000Z","2020-06-02T21:36:18.000Z","2020-06-02T21:36:19.000Z","2020-06-02T21:36:20.000Z","2020-06-02T21:36:21.000Z","2020-06-02T21:36:22.000Z","2020-06-02T21:36:23.000Z","2020-06-02T21:36:24.000Z","2020-06-02T21:36:25.000Z","2020-06-02T21:36:26.000Z","2020-06-02T21:36:27.000Z","2020-06-02T21:36:28.000Z","2020-06-02T21:36:29.000Z","2020-06-02T21:36:30.000Z","2020-06-02T21:36:31.000Z","2020-06-02T21:36:32.000Z","2020-06-02T21:36:33.000Z","2020-06-02T21:
36:34.000Z","2020-06-02T21:36:35.000Z","2020-06-02T21:36:36.000Z","2020-06-02T21:36:37.000Z","2020-06-02T21:36:38.000Z","2020-06-02T21:36:39.000Z","2020-06-02T21:36:40.000Z","2020-06-02T21:36:41.000Z","2020-06-02T21:36:42.000Z","2020-06-02T21:36:43.000Z","2020-06-02T21:36:44.000Z","2020-06-02T21:36:45.000Z","2020-06-02T21:36:46.000Z","2020-06-02T21:36:47.000Z","2020-06-02T21:36:48.000Z","2020-06-02T21:36:49.000Z","2020-06-02T21:36:50.000Z","2020-06-02T21:36:51.000Z","2020-06-02T21:36:52.000Z","2020-06-02T21:36:53.000Z","2020-06-02T21:36:54.000Z","2020-06-02T21:36:55.000Z","2020-06-02T21:36:56.000Z","2020-06-02T21:36:57.000Z","2020-06-02T21:36:58.000Z","2020-06-02T21:36:59.000Z","2020-06-02T21:37:00.000Z","2020-06-02T21:37:01.000Z","2020-06-02T21:37:02.000Z","2020-06-02T21:37:03.000Z","2020-06-02T21:37:04.000Z","2020-06-02T21:37:05.000Z","2020-06-02T21:37:06.000Z","2020-06-02T21:37:07.000Z","2020-06-02T21:37:08.000Z","2020-06-02T21:37:09.000Z","2020-06-02T21:37:10.000Z","2020-06-02T21:37:11.000Z","2020-06-02T21:37:12.000Z","2020-06-02T21:37:13.000Z","2020-06-02T21:37:14.000Z","2020-06-02T21:37:15.000Z","2020-06-02T21:37:16.000Z","2020-06-02T21:37:17.000Z","2020-06-02T21:37:18.000Z","2020-06-02T21:37:19.000Z","2020-06-02T21:37:20.000Z","2020-06-02T21:37:21.000Z","2020-06-02T21:37:22.000Z","2020-06-02T21:37:23.000Z","2020-06-02T21:37:24.000Z","2020-06-02T21:37:25.000Z","2020-06-02T21:37:26.000Z","2020-06-02T21:37:27.000Z","2020-06-02T21:37:28.000Z","2020-06-02T21:37:29.000Z","2020-06-02T21:37:30.000Z","2020-06-02T21:37:31.000Z","2020-06-02T21:37:32.000Z","2020-06-02T21:37:33.000Z","2020-06-02T21:37:34.000Z","2020-06-02T21:37:35.000Z","2020-06-02T21:37:36.000Z","2020-06-02T21:37:37.000Z","2020-06-02T21:37:38.000Z","2020-06-02T21:37:39.000Z","2020-06-02T21:37:40.000Z","2020-06-02T21:37:41.000Z","2020-06-02T21:37:42.000Z","2020-06-02T21:37:43.000Z","2020-06-02T21:37:44.000Z","2020-06-02T21:37:45.000Z","2020-06-02T21:37:46.000Z","2020-06-02T21:37:47.000Z","2020-06-02T21:37
:48.000Z","2020-06-02T21:37:49.000Z","2020-06-02T21:37:50.000Z","2020-06-02T21:37:51.000Z","2020-06-02T21:37:52.000Z","2020-06-02T21:37:53.000Z","2020-06-02T21:37:54.000Z","2020-06-02T21:37:55.000Z","2020-06-02T21:37:56.000Z","2020-06-02T21:37:57.000Z","2020-06-02T21:37:58.000Z","2020-06-02T21:37:59.000Z","2020-06-02T21:38:00.000Z","2020-06-02T21:38:01.000Z","2020-06-02T21:38:02.000Z","2020-06-02T21:38:03.000Z","2020-06-02T21:38:04.000Z","2020-06-02T21:38:05.000Z","2020-06-02T21:38:06.000Z","2020-06-02T21:38:07.000Z","2020-06-02T21:38:08.000Z","2020-06-02T21:38:09.000Z","2020-06-02T21:38:10.000Z","2020-06-02T21:38:11.000Z","2020-06-02T21:38:12.000Z","2020-06-02T21:38:13.000Z","2020-06-02T21:38:14.000Z","2020-06-02T21:38:15.000Z","2020-06-02T21:38:16.000Z","2020-06-02T21:38:17.000Z","2020-06-02T21:38:18.000Z","2020-06-02T21:38:19.000Z","2020-06-02T21:38:20.000Z","2020-06-02T21:38:21.000Z","2020-06-02T21:38:22.000Z","2020-06-02T21:38:23.000Z","2020-06-02T21:38:24.000Z","2020-06-02T21:38:25.000Z","2020-06-02T21:38:26.000Z","2020-06-02T21:38:27.000Z","2020-06-02T21:38:28.000Z","2020-06-02T21:38:29.000Z","2020-06-02T21:38:30.000Z","2020-06-02T21:38:31.000Z","2020-06-02T21:38:32.000Z","2020-06-02T21:38:33.000Z","2020-06-02T21:38:34.000Z","2020-06-02T21:38:35.000Z","2020-06-02T21:38:36.000Z","2020-06-02T21:38:37.000Z","2020-06-02T21:38:38.000Z","2020-06-02T21:38:39.000Z","2020-06-02T21:38:40.000Z","2020-06-02T21:38:41.000Z","2020-06-02T21:38:42.000Z","2020-06-02T21:38:43.000Z","2020-06-02T21:38:44.000Z","2020-06-02T21:38:45.000Z","2020-06-02T21:38:46.000Z","2020-06-02T21:38:47.000Z","2020-06-02T21:38:48.000Z","2020-06-02T21:38:49.000Z","2020-06-02T21:38:50.000Z","2020-06-02T21:38:51.000Z","2020-06-02T21:38:52.000Z","2020-06-02T21:38:53.000Z","2020-06-02T21:38:54.000Z","2020-06-02T21:38:55.000Z","2020-06-02T21:38:56.000Z","2020-06-02T21:38:57.000Z","2020-06-02T21:38:58.000Z","2020-06-02T21:38:59.000Z","2020-06-02T21:39:00.000Z","2020-06-02T21:39:01.000Z","2020-06-02T21:39:0
2.000Z","2020-06-02T21:39:03.000Z","2020-06-02T21:39:04.000Z","2020-06-02T21:39:05.000Z","2020-06-02T21:39:06.000Z","2020-06-02T21:39:07.000Z","2020-06-02T21:39:08.000Z","2020-06-02T21:39:09.000Z","2020-06-02T21:39:10.000Z","2020-06-02T21:39:11.000Z","2020-06-02T21:39:12.000Z","2020-06-02T21:39:13.000Z","2020-06-02T21:39:14.000Z","2020-06-02T21:39:15.000Z","2020-06-02T21:39:16.000Z","2020-06-02T21:39:17.000Z","2020-06-02T21:39:18.000Z","2020-06-02T21:39:19.000Z","2020-06-02T21:39:20.000Z","2020-06-02T21:39:21.000Z","2020-06-02T21:39:22.000Z","2020-06-02T21:39:23.000Z","2020-06-02T21:39:24.000Z","2020-06-02T21:39:25.000Z","2020-06-02T21:39:26.000Z","2020-06-02T21:39:27.000Z","2020-06-02T21:39:28.000Z","2020-06-02T21:39:29.000Z","2020-06-02T21:39:30.000Z","2020-06-02T21:39:31.000Z","2020-06-02T21:39:32.000Z","2020-06-02T21:39:33.000Z","2020-06-02T21:39:34.000Z","2020-06-02T21:39:35.000Z","2020-06-02T21:39:36.000Z","2020-06-02T21:39:37.000Z","2020-06-02T21:39:38.000Z","2020-06-02T21:39:39.000Z","2020-06-02T21:39:40.000Z","2020-06-02T21:39:41.000Z","2020-06-02T21:39:42.000Z","2020-06-02T21:39:43.000Z","2020-06-02T21:39:44.000Z","2020-06-02T21:39:45.000Z","2020-06-02T21:39:46.000Z","2020-06-02T21:39:47.000Z","2020-06-02T21:39:48.000Z","2020-06-02T21:39:49.000Z","2020-06-02T21:39:50.000Z","2020-06-02T21:39:51.000Z","2020-06-02T21:39:52.000Z","2020-06-02T21:39:53.000Z","2020-06-02T21:39:54.000Z","2020-06-02T21:39:55.000Z","2020-06-02T21:39:56.000Z","2020-06-02T21:39:57.000Z","2020-06-02T21:39:58.000Z","2020-06-02T21:39:59.000Z","2020-06-02T21:40:00.000Z","2020-06-02T21:40:01.000Z","2020-06-02T21:40:02.000Z","2020-06-02T21:40:03.000Z","2020-06-02T21:40:04.000Z","2020-06-02T21:40:05.000Z","2020-06-02T21:40:06.000Z","2020-06-02T21:40:07.000Z","2020-06-02T21:40:08.000Z","2020-06-02T21:40:09.000Z","2020-06-02T21:40:10.000Z","2020-06-02T21:40:11.000Z","2020-06-02T21:40:12.000Z","2020-06-02T21:40:13.000Z","2020-06-02T21:40:14.000Z","2020-06-02T21:40:15.000Z","2020-06-02T21:40:16.
000Z","2020-06-02T21:40:17.000Z","2020-06-02T21:40:18.000Z","2020-06-02T21:40:19.000Z","2020-06-02T21:40:20.000Z","2020-06-02T21:40:21.000Z","2020-06-02T21:40:22.000Z","2020-06-02T21:40:23.000Z","2020-06-02T21:40:24.000Z","2020-06-02T21:40:25.000Z","2020-06-02T21:40:26.000Z","2020-06-02T21:40:27.000Z","2020-06-02T21:40:28.000Z","2020-06-02T21:40:29.000Z","2020-06-02T21:40:30.000Z","2020-06-02T21:40:31.000Z","2020-06-02T21:40:32.000Z","2020-06-02T21:40:33.000Z","2020-06-02T21:40:34.000Z","2020-06-02T21:40:35.000Z","2020-06-02T21:40:36.000Z","2020-06-02T21:40:37.000Z","2020-06-02T21:40:38.000Z","2020-06-02T21:40:39.000Z","2020-06-02T21:40:40.000Z","2020-06-02T21:40:41.000Z","2020-06-02T21:40:42.000Z","2020-06-02T21:40:43.000Z","2020-06-02T21:40:44.000Z","2020-06-02T21:40:45.000Z","2020-06-02T21:40:46.000Z","2020-06-02T21:40:47.000Z","2020-06-02T21:40:48.000Z","2020-06-02T21:40:49.000Z","2020-06-02T21:40:50.000Z","2020-06-02T21:40:51.000Z","2020-06-02T21:40:52.000Z","2020-06-02T21:40:53.000Z","2020-06-02T21:40:54.000Z","2020-06-02T21:40:55.000Z","2020-06-02T21:40:56.000Z","2020-06-02T21:40:57.000Z","2020-06-02T21:40:58.000Z","2020-06-02T21:40:59.000Z","2020-06-02T21:41:00.000Z","2020-06-02T21:41:01.000Z","2020-06-02T21:41:02.000Z","2020-06-02T21:41:03.000Z","2020-06-02T21:41:04.000Z","2020-06-02T21:41:05.000Z","2020-06-02T21:41:06.000Z","2020-06-02T21:41:07.000Z","2020-06-02T21:41:08.000Z","2020-06-02T21:41:09.000Z","2020-06-02T21:41:10.000Z","2020-06-02T21:41:11.000Z","2020-06-02T21:41:12.000Z","2020-06-02T21:41:13.000Z","2020-06-02T21:41:14.000Z","2020-06-02T21:41:15.000Z","2020-06-02T21:41:16.000Z","2020-06-02T21:41:17.000Z","2020-06-02T21:41:18.000Z","2020-06-02T21:41:19.000Z","2020-06-02T21:41:20.000Z","2020-06-02T21:41:21.000Z","2020-06-02T21:41:22.000Z","2020-06-02T21:41:23.000Z","2020-06-02T21:41:24.000Z","2020-06-02T21:41:25.000Z","2020-06-02T21:41:26.000Z","2020-06-02T21:41:27.000Z","2020-06-02T21:41:28.000Z","2020-06-02T21:41:29.000Z","2020-06-02T21:41:30.00
0Z","2020-06-02T21:41:31.000Z","2020-06-02T21:41:32.000Z","2020-06-02T21:41:33.000Z","2020-06-02T21:41:34.000Z","2020-06-02T21:41:35.000Z","2020-06-02T21:41:36.000Z","2020-06-02T21:41:37.000Z","2020-06-02T21:41:38.000Z","2020-06-02T21:41:39.000Z","2020-06-02T21:41:40.000Z","2020-06-02T21:41:41.000Z","2020-06-02T21:41:42.000Z","2020-06-02T21:41:43.000Z","2020-06-02T21:41:44.000Z","2020-06-02T21:41:45.000Z","2020-06-02T21:41:46.000Z","2020-06-02T21:41:47.000Z","2020-06-02T21:41:48.000Z","2020-06-02T21:41:49.000Z","2020-06-02T21:41:50.000Z","2020-06-02T21:41:51.000Z","2020-06-02T21:41:52.000Z","2020-06-02T21:41:53.000Z","2020-06-02T21:41:54.000Z","2020-06-02T21:41:55.000Z","2020-06-02T21:41:56.000Z","2020-06-02T21:41:57.000Z","2020-06-02T21:41:58.000Z","2020-06-02T21:41:59.000Z","2020-06-02T21:42:00.000Z","2020-06-02T21:42:01.000Z","2020-06-02T21:42:02.000Z","2020-06-02T21:42:03.000Z","2020-06-02T21:42:04.000Z","2020-06-02T21:42:05.000Z","2020-06-02T21:42:06.000Z","2020-06-02T21:42:07.000Z","2020-06-02T21:42:08.000Z","2020-06-02T21:42:09.000Z","2020-06-02T21:42:10.000Z","2020-06-02T21:42:11.000Z","2020-06-02T21:42:12.000Z","2020-06-02T21:42:13.000Z","2020-06-02T21:42:14.000Z","2020-06-02T21:42:15.000Z","2020-06-02T21:42:16.000Z","2020-06-02T21:42:17.000Z","2020-06-02T21:42:18.000Z","2020-06-02T21:42:19.000Z","2020-06-02T21:42:20.000Z","2020-06-02T21:42:21.000Z","2020-06-02T21:42:22.000Z","2020-06-02T21:42:23.000Z","2020-06-02T21:42:24.000Z","2020-06-02T21:42:25.000Z","2020-06-02T21:42:26.000Z","2020-06-02T21:42:27.000Z","2020-06-02T21:42:28.000Z","2020-06-02T21:42:29.000Z","2020-06-02T21:42:30.000Z","2020-06-02T21:42:31.000Z","2020-06-02T21:42:32.000Z","2020-06-02T21:42:33.000Z","2020-06-02T21:42:34.000Z","2020-06-02T21:42:35.000Z","2020-06-02T21:42:36.000Z","2020-06-02T21:42:37.000Z","2020-06-02T21:42:38.000Z","2020-06-02T21:42:39.000Z","2020-06-02T21:42:40.000Z","2020-06-02T21:42:41.000Z","2020-06-02T21:42:42.000Z","2020-06-02T21:42:43.000Z","2020-06-02T21:42:44.000Z
","2020-06-02T21:42:45.000Z","2020-06-02T21:42:46.000Z","2020-06-02T21:42:47.000Z","2020-06-02T21:42:48.000Z","2020-06-02T21:42:49.000Z","2020-06-02T21:42:50.000Z","2020-06-02T21:42:51.000Z","2020-06-02T21:42:52.000Z","2020-06-02T21:42:53.000Z","2020-06-02T21:42:54.000Z","2020-06-02T21:42:55.000Z","2020-06-02T21:42:56.000Z","2020-06-02T21:42:57.000Z","2020-06-02T21:42:58.000Z","2020-06-02T21:42:59.000Z","2020-06-02T21:43:00.000Z","2020-06-02T21:43:01.000Z","2020-06-02T21:43:02.000Z","2020-06-02T21:43:03.000Z","2020-06-02T21:43:04.000Z","2020-06-02T21:43:05.000Z","2020-06-02T21:43:06.000Z","2020-06-02T21:43:07.000Z","2020-06-02T21:43:08.000Z","2020-06-02T21:43:09.000Z","2020-06-02T21:43:10.000Z","2020-06-02T21:43:11.000Z","2020-06-02T21:43:12.000Z","2020-06-02T21:43:13.000Z","2020-06-02T21:43:14.000Z","2020-06-02T21:43:15.000Z","2020-06-02T21:43:16.000Z","2020-06-02T21:43:17.000Z","2020-06-02T21:43:18.000Z","2020-06-02T21:43:19.000Z","2020-06-02T21:43:20.000Z","2020-06-02T21:43:21.000Z","2020-06-02T21:43:22.000Z","2020-06-02T21:43:23.000Z","2020-06-02T21:43:24.000Z","2020-06-02T21:43:25.000Z","2020-06-02T21:43:26.000Z","2020-06-02T21:43:27.000Z","2020-06-02T21:43:28.000Z","2020-06-02T21:43:29.000Z","2020-06-02T21:43:30.000Z","2020-06-02T21:43:31.000Z","2020-06-02T21:43:32.000Z","2020-06-02T21:43:33.000Z","2020-06-02T21:43:34.000Z","2020-06-02T21:43:35.000Z","2020-06-02T21:43:36.000Z","2020-06-02T21:43:37.000Z","2020-06-02T21:43:38.000Z","2020-06-02T21:43:39.000Z","2020-06-02T21:43:40.000Z","2020-06-02T21:43:41.000Z","2020-06-02T21:43:42.000Z","2020-06-02T21:43:43.000Z","2020-06-02T21:43:44.000Z","2020-06-02T21:43:45.000Z","2020-06-02T21:43:46.000Z","2020-06-02T21:43:47.000Z","2020-06-02T21:43:48.000Z","2020-06-02T21:43:49.000Z","2020-06-02T21:43:50.000Z","2020-06-02T21:43:51.000Z","2020-06-02T21:43:52.000Z","2020-06-02T21:43:53.000Z","2020-06-02T21:43:54.000Z","2020-06-02T21:43:55.000Z","2020-06-02T21:43:56.000Z","2020-06-02T21:43:57.000Z","2020-06-02T21:43:58.000Z",
"2020-06-02T21:43:59.000Z","2020-06-02T21:44:00.000Z","2020-06-02T21:44:01.000Z","2020-06-02T21:44:02.000Z","2020-06-02T21:44:03.000Z","2020-06-02T21:44:04.000Z","2020-06-02T21:44:05.000Z","2020-06-02T21:44:06.000Z","2020-06-02T21:44:07.000Z","2020-06-02T21:44:08.000Z","2020-06-02T21:44:09.000Z","2020-06-02T21:44:10.000Z","2020-06-02T21:44:11.000Z","2020-06-02T21:44:12.000Z","2020-06-02T21:44:13.000Z","2020-06-02T21:44:14.000Z","2020-06-02T21:44:15.000Z","2020-06-02T21:44:16.000Z","2020-06-02T21:44:17.000Z","2020-06-02T21:44:18.000Z","2020-06-02T21:44:19.000Z","2020-06-02T21:44:20.000Z","2020-06-02T21:44:21.000Z","2020-06-02T21:44:22.000Z","2020-06-02T21:44:23.000Z","2020-06-02T21:44:24.000Z","2020-06-02T21:44:25.000Z","2020-06-02T21:44:26.000Z","2020-06-02T21:44:27.000Z","2020-06-02T21:44:28.000Z","2020-06-02T21:44:29.000Z","2020-06-02T21:44:30.000Z","2020-06-02T21:44:31.000Z","2020-06-02T21:44:32.000Z","2020-06-02T21:44:33.000Z","2020-06-02T21:44:34.000Z","2020-06-02T21:44:35.000Z","2020-06-02T21:44:36.000Z","2020-06-02T21:44:37.000Z","2020-06-02T21:44:38.000Z","2020-06-02T21:44:39.000Z","2020-06-02T21:44:40.000Z","2020-06-02T21:44:41.000Z","2020-06-02T21:44:42.000Z","2020-06-02T21:44:43.000Z","2020-06-02T21:44:44.000Z","2020-06-02T21:44:45.000Z","2020-06-02T21:44:46.000Z","2020-06-02T21:44:47.000Z","2020-06-02T21:44:48.000Z","2020-06-02T21:44:49.000Z","2020-06-02T21:44:50.000Z","2020-06-02T21:44:51.000Z","2020-06-02T21:44:52.000Z","2020-06-02T21:44:53.000Z","2020-06-02T21:44:54.000Z","2020-06-02T21:44:55.000Z","2020-06-02T21:44:56.000Z","2020-06-02T21:44:57.000Z","2020-06-02T21:44:58.000Z","2020-06-02T21:44:59.000Z","2020-06-02T21:45:00.000Z","2020-06-02T21:45:01.000Z","2020-06-02T21:45:02.000Z","2020-06-02T21:45:03.000Z","2020-06-02T21:45:04.000Z","2020-06-02T21:45:05.000Z","2020-06-02T21:45:06.000Z","2020-06-02T21:45:07.000Z","2020-06-02T21:45:08.000Z","2020-06-02T21:45:09.000Z","2020-06-02T21:45:10.000Z","2020-06-02T21:45:11.000Z","2020-06-02T21:45:12.000Z","2
020-06-02T21:45:13.000Z","2020-06-02T21:45:14.000Z","2020-06-02T21:45:15.000Z","2020-06-02T21:45:16.000Z","2020-06-02T21:45:17.000Z","2020-06-02T21:45:18.000Z","2020-06-02T21:45:19.000Z","2020-06-02T21:45:20.000Z","2020-06-02T21:45:21.000Z","2020-06-02T21:45:22.000Z","2020-06-02T21:45:23.000Z","2020-06-02T21:45:24.000Z","2020-06-02T21:45:25.000Z","2020-06-02T21:45:26.000Z","2020-06-02T21:45:27.000Z","2020-06-02T21:45:28.000Z","2020-06-02T21:45:29.000Z","2020-06-02T21:45:30.000Z","2020-06-02T21:45:31.000Z","2020-06-02T21:45:32.000Z","2020-06-02T21:45:33.000Z","2020-06-02T21:45:34.000Z","2020-06-02T21:45:35.000Z","2020-06-02T21:45:36.000Z","2020-06-02T21:45:37.000Z","2020-06-02T21:45:38.000Z","2020-06-02T21:45:39.000Z","2020-06-02T21:45:40.000Z","2020-06-02T21:45:41.000Z","2020-06-02T21:45:42.000Z","2020-06-02T21:45:43.000Z","2020-06-02T21:45:44.000Z","2020-06-02T21:45:45.000Z","2020-06-02T21:45:46.000Z","2020-06-02T21:45:47.000Z","2020-06-02T21:45:48.000Z","2020-06-02T21:45:49.000Z","2020-06-02T21:45:50.000Z","2020-06-02T21:45:51.000Z","2020-06-02T21:45:52.000Z","2020-06-02T21:45:53.000Z","2020-06-02T21:45:54.000Z","2020-06-02T21:45:55.000Z","2020-06-02T21:45:56.000Z","2020-06-02T21:45:57.000Z","2020-06-02T21:45:58.000Z","2020-06-02T21:45:59.000Z","2020-06-02T21:46:00.000Z","2020-06-02T21:46:01.000Z","2020-06-02T21:46:02.000Z","2020-06-02T21:46:03.000Z","2020-06-02T21:46:04.000Z","2020-06-02T21:46:05.000Z","2020-06-02T21:46:06.000Z","2020-06-02T21:46:07.000Z","2020-06-02T21:46:08.000Z","2020-06-02T21:46:09.000Z","2020-06-02T21:46:10.000Z","2020-06-02T21:46:11.000Z","2020-06-02T21:46:12.000Z","2020-06-02T21:46:13.000Z","2020-06-02T21:46:14.000Z","2020-06-02T21:46:15.000Z","2020-06-02T21:46:16.000Z","2020-06-02T21:46:17.000Z","2020-06-02T21:46:18.000Z","2020-06-02T21:46:19.000Z","2020-06-02T21:46:20.000Z","2020-06-02T21:46:21.000Z","2020-06-02T21:46:22.000Z","2020-06-02T21:46:23.000Z","2020-06-02T21:46:24.000Z","2020-06-02T21:46:25.000Z","2020-06-02T21:46:26.000Z","202
... (notebook output truncated: the JSON response continues with a "times" array of one-second ISO-8601 timestamps running through 2020-06-02T22:07:46Z, followed by a "values" entry for channel "H" — metadata: element "H", network "NT", station "FRN", location "A0" — whose "values" array holds the matching samples, all near 23160.8-23161.1) ...
3160.968,23160.978,23160.994,23161.005,23161.033,23161.043,23160.988,23160.997,23160.982,23160.958,23160.969,23160.989,23160.956,23160.958,23160.958,23160.972,23160.972,23161.017,23161.017,23161.02,23161.006,23160.994,23160.991,23161.006,23161.001,23160.98,23161.001,23160.996,23160.979,23161.024,23161.028,23161.002,23161.058,23161.038,23161.057,23161.024,23161.096,23161.033,23161.058,23161.043,23161.066,23161.093,23161.095,23161.085,23161.056,23161.095,23161.135,23161.144,23161.124,23161.114,23161.135,23161.168,23161.188,23161.132,23161.179,23161.172,23161.16,23161.133,23161.159,23161.135,23161.124,23161.146,23161.144,23161.125,23161.124,23161.1,23161.109,23161.121,23161.114,23161.124,23161.135,23161.122,23161.104,23161.097,23161.091,23161.093,23161.083,23161.075,23161.055,23161.056,23161.059,23161.065,23161.059,23161.057,23161.063,23161.053,23161.026,23161.017,23161.053,23161.051,23161.055,23161.05,23161.064,23161.042,23161.073,23161.031,23161.058,23161.069,23161.081,23161.098,23161.08,23161.088,23161.075,23161.089,23161.082,23161.072,23161.094,23161.083,23161.072,23161.051,23161.063,23161.067,23161.055,23161.062,23161.045,23161.048,23161.046,23161.055,23161.052,23161.043,23161.077,23161.063,23161.045,23161.046,23161.024,23161.036,23161.046,23161.004,23160.999,23161.032,23161.015,23161.016,23161.002,23160.999,23160.989,23161.015,23161.043,23161.051,23161.03,23161.05,23161.056,23161.023,23161.025,23161.058,23161.08,23161.074,23161.066,23161.074,23161.049,23161.032,23161.05,23161.083,23161.08,23161.095,23161.102,23161.115,23161.132,23161.141,23161.154,23161.169,23161.169,23161.185,23161.143,23161.165,23161.159,23161.149,23161.131,23161.13,23161.137,23161.135,23161.137,23161.15,23161.156,23161.157,23161.178,23161.148,23161.174,23161.144,23161.169,23161.161,23161.144,23161.148,23161.114,23161.104,23161.102,23161.084,23161.067,23161.079,23161.053,23161.096,23161.099,23161.092,23161.081,23161.111,23161.119,23161.103,23161.105,23161.086,23161.084,23161.07,23161.078,23161.
095,23161.102,23161.08,23161.083,23161.071,23161.08,23161.101,23161.11,23161.086,23161.102,23161.129,23161.117,23161.127,23161.131,23161.15,23161.136,23161.141,23161.155,23161.135,23161.152,23161.169,23161.166,23161.16,23161.149,23161.165,23161.164,23161.186,23161.196,23161.171,23161.157,23161.164,23161.143,23161.183,23161.203,23161.199,23161.2,23161.183,23161.189,23161.18,23161.179,23161.194,23161.152,23161.152,23161.149,23161.174,23161.157,23161.121,23161.086,23161.107,23161.113,23161.091,23161.069,23161.092,23161.099,23161.076,23161.053,23161.064,23161.052,23161.06,23161.049,23161.048,23161.055,23161.071,23161.038,23161.052,23161.069,23161.053,23161.078,23161.066,23161.059,23161.103,23161.084,23161.073,23161.061,23161.063,23161.072,23161.082,23161.078,23161.072,23161.062,23161.055,23161.093,23161.088,23161.088,23161.091,23161.101,23161.062,23161.075,23161.082,23161.058,23161.066,23161.048,23161.07,23161.067,23161.064,23161.066,23161.061,23161.065,23161.05,23161.063,23161.021,23161.063,23161.041,23161.033,23161.014,23161.011,23161.013,23161.032,23161.033,23161.022,23161.011,23160.984,23160.989,23160.976,23160.982,23160.973,23160.975,23160.984,23160.981,23160.955,23160.962,23160.969,23160.995,23160.918,23160.964,23160.974,23160.946,23160.984,23160.976,23160.973,23160.978,23160.981,23160.997,23160.987,23160.967,23160.971,23160.979,23160.972,23160.991,23161.005,23160.968,23160.989,23160.941,23161.01,23160.92,23160.977,23160.997,23160.996,23160.914,23160.903,23160.957,23160.942,23160.901,23160.939,23160.91,23160.961,23160.899,23160.911,23160.872,23160.913,23160.916,23160.904,23160.879,23160.898,23160.893,23160.867,23160.878,23160.886,23160.847,23160.858,23160.81,23160.85,23160.837,23160.828,23160.831,23160.825,23160.795,23160.826,23160.803,23160.799,23160.775,23160.758,23160.748,23160.755,23160.752,23160.756,23160.754,23160.736,23160.769,23160.746,23160.743,23160.732,23160.754,23160.751,23160.723,23160.762,23160.767,23160.76,23160.745,23160.756,23160.8,23160.76,23160.
776,23160.789,23160.79,23160.786,23160.8,23160.808,23160.827,23160.808,23160.818,23160.846,23160.849,23160.823,23160.781,23160.81,23160.817,23160.829,23160.83,23160.826,23160.866,23160.846,23160.838,23160.841,23160.829,23160.86,23160.843,23160.847,23160.827,23160.84,23160.849,23160.811,23160.815,23160.81,23160.784,23160.792,23160.749,23160.777,23160.764,23160.771,23160.759,23160.76,23160.703,23160.701,23160.735,23160.711,23160.711,23160.719,23160.693,23160.708,23160.714,23160.703,23160.681,23160.714,23160.711,23160.711,23160.71,23160.716,23160.756,23160.756,23160.761,23160.767,23160.807,23160.799,23160.778,23160.822,23160.838,23160.848,23160.812,23160.827,23160.815,23160.833,23160.812,23160.826,23160.799,23160.755,23160.753,23160.751,23160.766,23160.745,23160.736,23160.764,23160.758,23160.723,23160.735,23160.68,23160.696,23160.678,23160.7,23160.673,23160.681,23160.692,23160.678,23160.658,23160.668,23160.692,23160.677,23160.683,23160.681,23160.693,23160.683,23160.688,23160.691,23160.685,23160.685,23160.684,23160.674,23160.648,23160.673,23160.692,23160.658,23160.626,23160.68,23160.661,23160.637,23160.622,23160.648,23160.617,23160.604,23160.548,23160.565,23160.571,23160.548,23160.56,23160.543,23160.535,23160.501,23160.468,23160.456,23160.455,23160.454,23160.424,23160.433,23160.421,23160.422,23160.401,23160.391,23160.407,23160.386,23160.371,23160.361,23160.355,23160.361,23160.392,23160.374,23160.351,23160.354,23160.364,23160.367,23160.351,23160.381,23160.341,23160.333,23160.35,23160.34,23160.334,23160.3,23160.294,23160.329,23160.32,23160.317,23160.275,23160.274,23160.262,23160.293,23160.287,23160.286,23160.268,23160.274,23160.291,23160.293,23160.318,23160.286,23160.267,23160.24,23160.236,23160.206,23160.198,23160.206,23160.2,23160.17,23160.162,23160.089,23160.099,23160.101,23160.101,23160.107,23160.081,23160.08,23160.048,23160.063,23160.047,23160.025,23160.012,23160.007,23160.007,23159.996,23159.973,23159.963,23159.974,23159.928,23159.941,23159.892,23159.929,23159.938,2
3159.961,23159.933,23159.887,23159.949,23159.93,23159.903,23159.918,23159.911,23159.927,23159.896,23159.913,23159.895,23159.878,23159.878,23159.86,23159.866,23159.851,23159.84,23159.829,23159.809,23159.767,23159.789,23159.792,23159.796,23159.789,23159.792,23159.739,23159.746,23159.765,23159.754,23159.727,23159.736,23159.749,23159.722,23159.702,23159.68,23159.718,23159.691,23159.676,23159.698,23159.671,23159.675,23159.718,23159.69,23159.608,23159.628,23159.629,23159.636,23159.657,23159.638,23159.645,23159.643,23159.633,23159.625,23159.634,23159.59,23159.618,23159.621,23159.608,23159.62,23159.606,23159.591,23159.583,23159.609,23159.596,23159.595,23159.611,23159.607,23159.608,23159.574,23159.615,23159.619,23159.615,23159.614,23159.623,23159.638,23159.641,23159.601,23159.628,23159.607,23159.604,23159.627,23159.622,23159.629,23159.645,23159.647,23159.645,23159.609,23159.575,23159.584,23159.579,23159.587,23159.555,23159.558,23159.55,23159.541,23159.537,23159.512,23159.516,23159.511,23159.497,23159.476,23159.478,23159.461,23159.5,23159.485,23159.46,23159.463,23159.451,23159.45,23159.433,23159.408,23159.419,23159.438,23159.421,23159.404,23159.42,23159.414,23159.416,23159.396,23159.408,23159.418,23159.435,23159.443,23159.478,23159.444,23159.462,23159.455,23159.492,23159.483,23159.511,23159.517,23159.53,23159.558,23159.557,23159.567,23159.597,23159.629,23159.633,23159.642,23159.666,23159.692,23159.686,23159.696,23159.709,23159.712,23159.704,23159.696,23159.702,23159.68,23159.67,23159.66,23159.63,23159.63,23159.628,23159.592,23159.598,23159.577,23159.567,23159.556,23159.582,23159.556,23159.568,23159.581,23159.554,23159.559,23159.567,23159.574,23159.578,23159.554,23159.563,23159.579,23159.562,23159.552,23159.543,23159.525,23159.492,23159.489,23159.482,23159.489,23159.477,23159.477,23159.504,23159.491,23159.481,23159.471,23159.519,23159.512,23159.53,23159.55,23159.533,23159.546,23159.554,23159.554,23159.568,23159.585,23159.568,23159.572,23159.612,23159.604,23159.605,23159.631,23
159.653,23159.656,23159.659,23159.681,23159.665,23159.68,23159.7,23159.688,23159.702,23159.75,23159.733,23159.742,23159.725,23159.702,23159.728,23159.739,23159.728,23159.723,23159.705,23159.703,23159.717,23159.736,23159.732,23159.748,23159.759,23159.746,23159.77,23159.78,23159.785,23159.818,23159.841,23159.841,23159.854,23159.869,23159.847,23159.866,23159.881,23159.83,23159.839,23159.849,23159.845,23159.858,23159.905,23159.886,23159.908,23159.918,23159.92,23159.931,23159.939,23159.955,23159.993,23159.967,23159.979,23160.012,23160.028,23160.027,23160.043,23160.031,23160.044,23160.075,23160.071,23160.087,23160.111,23160.12,23160.127,23160.099,23160.125,23160.081,23160.11,23160.107,23160.089,23160.095,23160.152,23160.096,23160.148,23160.125,23160.176,23160.204,23160.193,23160.262,23160.275,23160.24,23160.261,23160.273,23160.293,23160.313,23160.283,23160.317,23160.377,23160.326,23160.357,23160.356,23160.383,23160.361,23160.365,23160.369,23160.377,23160.402,23160.394,23160.411,23160.392,23160.399,23160.416,23160.452,23160.454,23160.45,23160.463,23160.446,23160.461,23160.449,23160.451,23160.453,23160.455,23160.447,23160.49,23160.481,23160.503,23160.509,23160.507,23160.483,23160.512,23160.519,23160.524,23160.526,23160.547,23160.554,23160.587,23160.579,23160.578,23160.574,23160.554,23160.55,23160.556,23160.571,23160.572,23160.551,23160.567,23160.546,23160.551,23160.532,23160.544,23160.536,23160.503,23160.512,23160.527,23160.503,23160.503,23160.478,23160.467,23160.484,23160.498,23160.494,23160.522,23160.492,23160.501,23160.511,23160.527,23160.514,23160.547,23160.55,23160.605,23160.567,23160.618,23160.601,23160.589,23160.615,23160.596,23160.629,23160.613,23160.581,23160.61,23160.636,23160.601,23160.561,23160.602,23160.614,23160.584,23160.587,23160.541,23160.556,23160.546,23160.556,23160.478,23160.565,23160.533,23160.515,23160.489,23160.549,23160.56,23160.565,23160.616,23160.641,23160.591,23160.599,23160.542,23160.576,23160.605,23160.625,23160.66,23160.69,23160.683,23160.692,2
3160.709,23160.7,23160.691,23160.714,23160.708,23160.69,23160.692,23160.704,23160.669,23160.644,23160.662,23160.662,23160.638,23160.608,23160.611,23160.633,23160.622,23160.631,23160.637,23160.645,23160.648,23160.609,23160.62,23160.605,23160.613,23160.593,23160.622,23160.592,23160.576,23160.548,23160.554,23160.532,23160.526,23160.516,23160.52,23160.543,23160.532,23160.527,23160.555,23160.541,23160.554,23160.562,23160.55,23160.6,23160.623,23160.614,23160.611,23160.587,23160.614,23160.607,23160.616,23160.591,23160.558,23160.576,23160.619,23160.61,23160.607,23160.627,23160.627,23160.612,23160.618,23160.599,23160.612,23160.584,23160.569,23160.56,23160.539,23160.533,23160.528,23160.527,23160.5,23160.493,23160.485,23160.492,23160.489,23160.475,23160.493,23160.511,23160.506,23160.486,23160.495,23160.482,23160.43,23160.436,23160.42,23160.398,23160.4,23160.427,23160.418,23160.417,23160.408,23160.396,23160.455,23160.46,23160.468,23160.484,23160.504,23160.518,23160.532,23160.526,23160.523,23160.51,23160.51,23160.483,23160.475,23160.45,23160.435,23160.438,23160.431,23160.442,23160.437,23160.448,23160.425,23160.441,23160.402,23160.405,23160.407,23160.391,23160.365,23160.332,23160.344,23160.334,23160.33,23160.322,23160.287,23160.309,23160.323,23160.294,23160.327,23160.332,23160.317,23160.312,23160.338,23160.352,23160.348,23160.338,23160.333,23160.328,23160.331,23160.31,23160.278,23160.286,23160.261,23160.265,23160.296,23160.283,23160.266,23160.282,23160.326,23160.315,23160.298,23160.333,23160.302,23160.295,23160.327,23160.31,23160.299,23160.309,23160.311,23160.304,23160.297,23160.326,23160.345,23160.346,23160.361,23160.351,23160.373,23160.425,23160.4,23160.423,23160.424,23160.394,23160.434,23160.412,23160.426,23160.432,23160.416,23160.415,23160.382,23160.338,23160.344,23160.394,23160.368,23160.324,23160.367,23160.36,23160.354,23160.377,23160.363,23160.353,23160.383,23160.384,23160.371,23160.355,23160.357,23160.356,23160.356,23160.351,23160.381,23160.397,23160.41,23160.421,23160.40
6,23160.439,23160.453,23160.478,23160.476,23160.494,23160.518,23160.51,23160.494,23160.523,23160.491,23160.533,23160.523,23160.505,23160.532,23160.547,23160.543,23160.558,23160.557,23160.547,23160.537,23160.571,23160.592,23160.577,23160.609,23160.594,23160.614,23160.557,23160.575,23160.583,23160.621,23160.611,23160.626,23160.614,23160.617,23160.617,23160.64,23160.63,23160.629,23160.603,23160.607,23160.618,23160.608,23160.612,23160.633,23160.628,23160.614,23160.635,23160.655,23160.627,23160.64,23160.652,23160.654,23160.654,23160.632,23160.627,23160.611,23160.621,23160.645,23160.642,23160.645,23160.663,23160.674,23160.669,23160.697,23160.695,23160.702,23160.713,23160.754,23160.748,23160.778,23160.795,23160.796,23160.834,23160.81,23160.812,23160.843,23160.848,23160.85,23160.834,23160.852,23160.893,23160.91,23160.898,23160.869,23160.875,23160.88,23160.875,23160.875,23160.868,23160.861,23160.875,23160.859,23160.883,23160.905,23160.868,23160.86,23160.889,23160.899,23160.9,23160.896,23160.877,23160.904,23160.927,23160.905,23160.889,23160.888,23160.923,23160.905,23160.932,23160.924,23160.925,23160.934,23160.938,23160.977,23160.973,23161.0,23160.988,23161.017,23160.999,23161.025,23161.031,23161.018,23161.038,23161.029,23161.026,23161.059,23161.067,23161.06,23161.083,23161.104,23161.098,23161.099,23161.097,23161.113,23161.104,23161.105,23161.096,23161.093,23161.083,23161.065,23161.09,23161.071,23161.074,23161.035,23161.042,23161.048,23161.041,23161.069,23161.054,23161.032,23161.033,23160.99,23160.971,23160.996,23160.952,23160.97,23160.93,23160.995,23160.952,23160.938,23160.967,23160.967,23160.958,23160.959,23160.949,23160.952,23160.954,23160.948,23160.943,23160.973,23160.977,23160.958,23160.972,23160.946,23160.943,23160.951,23160.929,23160.92,23160.935,23160.944,23160.937,23160.927,23160.913,23160.935,23160.93,23160.911,23160.877,23160.864,23160.863,23160.864,23160.862,23160.852,23160.874,23160.831,23160.839,23160.84,23160.845,23160.82,23160.785,23160.787,23160.781,23160.821,
23160.77,23160.781,23160.8,23160.785,23160.805,23160.809,23160.811,23160.811,23160.804,23160.775,23160.802,23160.803,23160.827,23160.821,23160.821,23160.84,23160.836,23160.818,23160.843,23160.837,23160.895,23160.851,23160.817,23160.807,23160.837,23160.835,23160.848,23160.849,23160.823,23160.809,23160.863,23160.828,23160.806,23160.84,23160.814,23160.814,23160.807,23160.816,23160.821,23160.803,23160.772,23160.818,23160.826,23160.786,23160.782,23160.772,23160.769,23160.769,23160.748,23160.729,23160.786,23160.689,23160.692,23160.748,23160.737,23160.724,23160.745,23160.709,23160.718,23160.748,23160.693,23160.71,23160.719,23160.721,23160.733,23160.741,23160.748,23160.742,23160.745,23160.768,23160.731,23160.729,23160.756,23160.748,23160.728,23160.741,23160.745,23160.728,23160.73,23160.711,23160.707,23160.712,23160.695,23160.657,23160.651,23160.665,23160.64,23160.614,23160.638,23160.61,23160.595,23160.598,23160.621,23160.597,23160.587,23160.614,23160.592,23160.607,23160.587,23160.582,23160.6,23160.584,23160.583,23160.564,23160.567,23160.564,23160.56,23160.584,23160.576,23160.571,23160.571,23160.59,23160.57,23160.588,23160.595,23160.607,23160.61,23160.597,23160.58,23160.614,23160.571,23160.601,23160.544,23160.525,23160.523,23160.541,23160.525,23160.563,23160.538,23160.515,23160.507,23160.531,23160.554,23160.556,23160.547,23160.591,23160.586,23160.546,23160.606,23160.593,23160.56,23160.553,23160.547,23160.565,23160.526,23160.532,23160.557,23160.569,23160.548,23160.554,23160.55,23160.572,23160.571,23160.565,23160.529,23160.529,23160.53,23160.554,23160.549,23160.555,23160.578,23160.576,23160.594,23160.582,23160.564,23160.541,23160.533,23160.521,23160.512,23160.482,23160.461,23160.454,23160.424,23160.364,23160.383,23160.41,23160.351,23160.342,23160.368,23160.372,23160.362,23160.333,23160.383,23160.369,23160.373,23160.372,23160.369,23160.381,23160.343,23160.361,23160.385,23160.357,23160.368,23160.341,23160.34,23160.358,23160.346,23160.36,23160.303,23160.323,23160.297,23160.332,23
160.354,23160.287,23160.282,23160.299,23160.293,23160.287,23160.239,23160.237,23160.226,23160.224,23160.199,23160.229,23160.224,23160.219,23160.217,23160.235,23160.219,23160.237,23160.238,23160.208,23160.236,23160.287,23160.29,23160.254,23160.228,23160.226,23160.207,23160.194,23160.216,23160.18,23160.195,23160.203,23160.204,23160.229,23160.24,23160.221,23160.218,23160.259,23160.246,23160.25,23160.26,23160.241,23160.252,23160.246,23160.26,23160.278,23160.276,23160.296,23160.254,23160.277,23160.267,23160.258,23160.259,23160.247,23160.241,23160.219,23160.204,23160.158,23160.155,23160.115,23160.142,23160.097,23160.061,23160.088,23160.085,23160.083,23160.079,23160.074,23160.091,23160.079,23160.096,23160.093,23160.096,23160.133,23160.135,23160.136,23160.139,23160.139,23160.201,23160.175,23160.215,23160.181,23160.229,23160.217,23160.162,23160.194,23160.174,23160.135,23160.186,23160.115,23160.1,23160.07,23160.067,23160.017,23160.02,23159.988,23159.962,23159.95,23159.959,23159.975,23159.982,23159.978,23159.998,23159.998,23159.989,23160.007,23160.002,23159.989,23160.003,23159.997,23160.016,23160.016,23160.01,23160.021,23160.017,23160.047,23160.034,23160.069,23160.04,23160.049,23160.074,23160.046,23160.063,23160.079,23160.083,23160.073,23160.081,23160.075,23160.064,23160.086,23160.057,23160.054,23160.029,23160.041,23160.037,23160.06,23160.035,23160.04,23160.051,23160.035,23160.048,23160.036,23160.014,23160.032,23160.042,23160.053,23160.045,23160.014,23160.072,23160.074,23160.035,23160.024,23160.064,23160.035,23160.05,23160.069,23160.058,23160.057,23160.049,23160.029,23160.026,23160.075,23160.067,23160.051,23160.052,23160.051,23160.023,23160.028,23160.021,23160.04,23160.051,23160.05,23160.048,23160.04,23160.113,23160.081,23160.101,23160.104,23160.13,23160.131,23160.131,23160.136,23160.132,23160.147,23160.144,23160.14,23160.134,23160.163,23160.167,23160.133,23160.124,23160.104,23160.11,23160.095,23160.099,23160.099,23160.118,23160.095,23160.12,23160.11,23160.121,23160.102,23160.
113,23160.097,23160.113,23160.115,23160.063,23160.097,23160.08,23160.085,23160.136,23160.153,23160.158,23160.151,23160.164,23160.176,23160.175,23160.178,23160.193,23160.164,23160.146,23160.158,23160.129,23160.111,23160.136,23160.102,23160.075,23160.069,23160.075,23160.066,23160.063,23160.056,23160.063,23160.049,23160.044,23160.012,23160.008,23160.021,23160.0,23159.995,23159.981,23160.004,23159.977,23159.952,23160.013,23160.015,23160.011,23160.033,23160.021,23160.056,23160.06,23160.109,23160.047,23160.109,23160.145,23160.133,23160.093,23160.072,23160.058,23160.077,23160.025,23159.991,23159.962,23159.927,23159.914,23159.923,23159.915,23159.913,23159.894,23159.906,23159.886,23159.881,23159.89,23159.87,23159.874,23159.87,23159.855,23159.85,23159.836,23159.82,23159.771,23159.784,23159.795,23159.804,23159.829,23159.833,23159.847,23159.863,23159.831,23159.938,23159.892,23159.946,23159.977,23159.964,23160.007,23159.995,23159.971,23159.989,23159.988,23160.003,23159.987,23159.968,23159.975,23159.958,23159.966,23159.923,23159.953,23159.925,23159.893,23159.882,23159.889,23159.878,23159.861,23159.836,23159.802,23159.792,23159.78,23159.778,23159.786,23159.772,23159.775,23159.777,23159.749,23159.76,23159.747,23159.743,23159.773,23159.776,23159.767,23159.741,23159.735,23159.756,23159.792,23159.797,23159.816,23159.856,23159.88,23159.966,23160.024,23160.002,23160.057,23160.042,23160.07,23160.066,23160.101,23160.058,23160.031,23159.967,23159.92,23159.861,23159.821,23159.761,23159.743,23159.794,23159.708,23159.741,23159.775,23159.79,23159.801,23159.82,23159.821,23159.805,23159.816,23159.79,23159.785,23159.791,23159.776,23159.764,23159.774,23159.754,23159.743,23159.762,23159.807,23159.815,23159.846,23159.871,23159.926,23159.985,23159.986,23160.005,23160.036,23160.02,23160.055,23160.051,23160.026,23159.984,23159.955,23159.948,23159.928,23159.913,23159.869,23159.884,23159.864,23159.836,23159.859,23159.838,23159.828,23159.82,23159.822,23159.853,23159.808,23159.791,23159.829,23159.834,23159
.857,23159.894,23159.881,23159.883,23159.858,23159.967,23159.986,23160.019,23160.026,23160.019,23160.049,23160.1,23160.136,23160.152,23160.165,23160.188,23160.191,23160.24,23160.23,23160.261,23160.254,23160.248,23160.231,23160.201,23160.179,23160.182,23160.149,23160.127,23160.129,23160.091,23160.073,23160.068,23160.072,23160.078,23160.058,23160.061,23160.063,23160.071,23160.056,23160.071,23160.087,23160.12,23160.112,23160.159,23160.144,23160.125,23160.124,23160.161,23160.17,23160.177,23160.17,23160.184,23160.189,23160.19,23160.218,23160.243,23160.243,23160.254,23160.271,23160.265,23160.266,23160.283,23160.28,23160.297,23160.318,23160.307,23160.318,23160.351,23160.339,23160.361,23160.371,23160.358,23160.398,23160.38,23160.359,23160.387,23160.36,23160.402,23160.362,23160.369,23160.379,23160.386,23160.362,23160.36,23160.349,23160.411,23160.42,23160.393,23160.413,23160.401,23160.426,23160.411,23160.402,23160.391,23160.393,23160.398,23160.413,23160.419,23160.434,23160.461,23160.495,23160.48,23160.512,23160.517,23160.531,23160.536,23160.546,23160.551,23160.588,23160.59,23160.61,23160.649,23160.651,23160.666,23160.689,23160.72,23160.743,23160.761,23160.782,23160.79,23160.796,23160.791,23160.792,23160.791,23160.769,23160.771,23160.755,23160.748,23160.749,23160.724,23160.738,23160.744,23160.706,23160.72,23160.743,23160.746,23160.753,23160.767,23160.78,23160.761,23160.744,23160.754,23160.744,23160.739,23160.725,23160.721,23160.726,23160.706,23160.698,23160.714,23160.761,23160.766,23160.769,23160.789,23160.809,23160.802,23160.813,23160.827,23160.805,23160.799,23160.824,23160.844,23160.865,23160.845,23160.844,23160.852,23160.851,23160.859,23160.828,23160.874,23160.843,23160.858,23160.887,23160.822,23160.85,23160.85,23160.887,23160.867,23160.846,23160.862,23160.801,23160.815,23160.773,23160.759,23160.784,23160.768,23160.763,23160.797,23160.784,23160.767,23160.77,23160.794,23160.77,23160.785,23160.78,23160.765,23160.773,23160.75,23160.773,23160.77,23160.742,23160.75,23160.74,2316
0.724,23160.727,23160.734,23160.748,23160.748,23160.738,23160.744,23160.709,23160.724,23160.713,23160.695,23160.701,23160.703,23160.682,23160.716,23160.704,23160.68,23160.693,23160.681,23160.692,23160.691,23160.699,23160.708,23160.701,23160.714,23160.714,23160.712,23160.721,23160.713,23160.714,23160.714,23160.686,23160.693,23160.682,23160.678,23160.669,23160.636,23160.666,23160.669,23160.657,23160.633,23160.663,23160.663,23160.637,23160.608,23160.613,23160.608,23160.586,23160.577,23160.536,23160.55,23160.556,23160.535,23160.545,23160.494,23160.526,23160.487,23160.512,23160.485,23160.477,23160.452,23160.446,23160.426,23160.441,23160.424,23160.419,23160.44,23160.432,23160.409,23160.41,23160.385,23160.375,23160.373,23160.378,23160.395,23160.38,23160.412,23160.401,23160.418,23160.392,23160.412,23160.383,23160.396,23160.4,23160.4,23160.4,23160.406,23160.408,23160.378,23160.363,23160.371,23160.369,23160.388,23160.391,23160.394,23160.381,23160.383,23160.372,23160.386,23160.381,23160.381,23160.389,23160.377,23160.379,23160.391,23160.369,23160.36,23160.369,23160.363,23160.364,23160.346,23160.359,23160.346,23160.369,23160.395,23160.373,23160.385,23160.396,23160.386,23160.379,23160.381,23160.342,23160.369,23160.359,23160.354,23160.326,23160.345,23160.338,23160.332,23160.321,23160.347,23160.347,23160.34,23160.333,23160.342,23160.328,23160.327,23160.348,23160.318,23160.348,23160.312,23160.315,23160.316,23160.331,23160.294,23160.33,23160.363,23160.349,23160.389,23160.385,23160.38,23160.396,23160.388,23160.416,23160.382,23160.407,23160.412,23160.419,23160.408,23160.386,23160.425,23160.416,23160.389,23160.415,23160.384,23160.376,23160.391,23160.41,23160.424,23160.418,23160.402,23160.412,23160.404,23160.392,23160.372,23160.374,23160.365,23160.361,23160.382,23160.402,23160.406,23160.398,23160.39,23160.399,23160.417,23160.418,23160.413,23160.43,23160.429,23160.436,23160.411,23160.452,23160.449,23160.438,23160.457,23160.457,23160.464,23160.463,23160.464,23160.451,23160.431,23160.409,23
160.41,23160.422,23160.436,23160.456,23160.429,23160.406,23160.412,23160.402,23160.425,23160.409,23160.393,23160.383,23160.376,23160.39,23160.367,23160.395,23160.378,23160.335,23160.34,23160.312,23160.302,23160.358,23160.255,23160.266,23160.249,23160.233,23160.246,23160.198,23160.187,23160.181,23160.162,23160.131,23160.157,23160.123,23160.154,23160.153,23160.117,23160.123,23160.127,23160.144,23160.124,23160.101,23160.085,23160.096,23160.083,23160.101,23160.082,23160.078,23160.085,23160.12,23160.062,23160.108,23160.075,23160.114,23160.111,23160.086,23160.105,23160.086,23160.084,23160.073,23160.061,23160.041,23160.047,23160.087,23160.087,23160.116,23160.089,23160.104,23160.104,23160.113,23160.127,23160.135,23160.141,23160.107,23160.103,23160.126,23160.108,23160.113,23160.099,23160.095,23160.12,23160.102,23160.11,23160.131,23160.135,23160.141,23160.126,23160.12,23160.143,23160.125,23160.134,23160.156,23160.138,23160.126,23160.14,23160.163,23160.19,23160.153,23160.198,23160.209,23160.217,23160.21,23160.205,23160.222,23160.247,23160.239,23160.248,23160.259,23160.279,23160.285,23160.276,23160.274,23160.285,23160.285,23160.286,23160.316,23160.323,23160.33,23160.329,23160.346,23160.373,23160.389,23160.396,23160.398,23160.404,23160.426,23160.422,23160.43,23160.454,23160.445,23160.471,23160.488,23160.484,23160.503,23160.518,23160.54,23160.533,23160.51,23160.523,23160.516,23160.532,23160.535,23160.529,23160.56,23160.561,23160.584,23160.56,23160.549,23160.593,23160.59,23160.595,23160.6,23160.594,23160.603,23160.619,23160.638,23160.629,23160.656,23160.669,23160.6,23160.635,23160.609,23160.612,23160.617,23160.681,23160.653,23160.666,23160.653,23160.64,23160.664,23160.685,23160.651,23160.725,23160.667,23160.698,23160.684,23160.694,23160.703,23160.681,23160.736,23160.708,23160.718,23160.712,23160.742,23160.73,23160.721,23160.735,23160.722,23160.732,23160.749,23160.733,23160.761,23160.762,23160.728,23160.746,23160.754,23160.745,23160.766,23160.756,23160.771,23160.747,23160.775,23160
.777,23160.765,23160.787,23160.769,23160.768,23160.759,23160.769,23160.751,23160.753,23160.751,23160.758,23160.767,23160.781,23160.77,23160.758,23160.77,23160.753,23160.756,23160.775,23160.752,23160.767,23160.759,23160.757,23160.755,23160.76,23160.731,23160.75,23160.712,23160.725,23160.78,23160.74,23160.735,23160.755,23160.767,23160.773,23160.771,23160.735,23160.745,23160.734,23160.743,23160.752,23160.728,23160.751,23160.753,23160.747,23160.753,23160.772,23160.756,23160.751,23160.738,23160.753,23160.765,23160.773,23160.757,23160.772,23160.777,23160.798,23160.764,23160.745,23160.776,23160.782,23160.794,23160.789,23160.806,23160.794,23160.801,23160.778,23160.788,23160.821,23160.797,23160.837,23160.816,23160.806,23160.796,23160.775,23160.788,23160.807,23160.773,23160.794,23160.765,23160.785,23160.76,23160.75,23160.736,23160.743,23160.729,23160.775,23160.744,23160.721,23160.739,23160.738,23160.724,23160.722,23160.721,23160.681,23160.704,23160.706,23160.71,23160.696,23160.694,23160.708,23160.69,23160.69,23160.706,23160.716,23160.705,23160.726,23160.715,23160.715,23160.7,23160.709,23160.699,23160.72,23160.723,23160.726,23160.739,23160.7,23160.706,23160.693,23160.706,23160.718,23160.732,23160.707,23160.726,23160.731,23160.748,23160.782,23160.758,23160.74,23160.75,23160.789,23160.789,23160.783,23160.794,23160.797,23160.808,23160.806,23160.807,23160.82,23160.832,23160.802,23160.821,23160.829,23160.794,23160.811,23160.817,23160.815,23160.832,23160.845,23160.826,23160.809,23160.791,23160.788,23160.774,23160.764,23160.754,23160.748,23160.706,23160.775,23160.769,23160.77,23160.77,23160.754,23160.77,23160.791,23160.778,23160.83,23160.859,23160.884,23160.881,23160.882,23160.898,23160.906,23160.915,23160.942,23160.95,23160.983,23160.989,23161.027,23161.024,23161.041,23161.066,23161.097,23161.081,23161.099,23161.11,23161.127,23161.135,23161.155,23161.145,23161.16,23161.168,23161.162,23161.176,23161.181,23161.198,23161.203,23161.17,23161.201,23161.2,23161.199,23161.182,23161.201,2316
[Raw channel sample values (magnetic time series, ~23157–23166 nT) omitted — notebook output residue, not documentation content.]
23160.577,23160.594,23160.63,23160.623,23160.61,23160.631,23160.656,23160.663,23160.663,23160.691,23160.696,23160.689,23160.697,23160.694,23160.757,23160.756,23160.749,23160.734,23160.762,23160.766,23160.795,23160.802,23160.805,23160.819,23160.855,23160.878,23160.853,23160.874,23160.863,23160.895,23160.924,23160.922,23160.927,23160.907,23160.919,23160.919,23160.915,23160.939,23160.951,23160.94,23160.957,23160.987,23160.986,23160.996,23160.963,23161.011,23160.969,23160.986,23160.968,23160.965,23160.942,23160.972,23160.976,23160.985,23160.976,23161.019,23161.006,23161.027,23161.043,23161.053,23161.066,23161.076,23161.113,23161.111,23161.098,23161.102,23161.137,23161.141,23161.115,23161.123,23161.108,23161.138,23161.137,23161.141,23161.149,23161.142,23161.174,23161.212,23161.215,23161.228,23161.247,23161.265,23161.293,23161.292,23161.29,23161.314,23161.312,23161.305,23161.316,23161.304,23161.302,23161.313,23161.327,23161.345,23161.333,23161.347,23161.374,23161.383,23161.397,23161.397,23161.433,23161.413,23161.426,23161.439,23161.451,23161.452,23161.487,23161.465,23161.469,23161.479,23161.479,23161.51,23161.55,23161.525,23161.56,23161.557,23161.562,23161.565,23161.592,23161.589,23161.6,23161.596,23161.602,23161.601,23161.606,23161.627,23161.632,23161.631,23161.646,23161.618,23161.646,23161.676,23161.666,23161.686,23161.693,23161.702,23161.718,23161.728,23161.739,23161.717,23161.742,23161.777,23161.749,23161.777,23161.778,23161.775,23161.769,23161.785,23161.766,23161.765,23161.793,23161.806,23161.834,23161.833,23161.823,23161.825,23161.822,23161.857,23161.845,23161.88,23161.872,23161.907,23161.884,23161.899,23161.912,23161.911,23161.942,23161.906,23161.948,23161.956,23161.988,23161.962,23161.966,23161.969,23161.982,23161.978,23162.013,23162.024,23162.016,23162.024,23162.056,23162.043,23162.056,23162.051,23162.076,23162.088,23162.106,23162.128,23162.115,23162.127,23162.135,23162.14,23162.141,23162.191,23162.188,23162.186,23162.179,23162.202,23162.206,23162.239,23162.256,2
3162.25,23162.246,23162.276,23162.266,23162.291,23162.288,23162.285,23162.285,23162.291,23162.308,23162.351,23162.338,23162.355,23162.357,23162.357,23162.358,23162.348,23162.344,23162.387,23162.385,23162.404,23162.423,23162.395,23162.404,23162.441,23162.411,23162.392,23162.428,23162.423,23162.409,23162.434,23162.428,23162.42,23162.42,23162.434,23162.433,23162.445,23162.44,23162.45,23162.46,23162.522,23162.53,23162.526,23162.537,23162.557,23162.562,23162.549,23162.55,23162.568,23162.56,23162.574,23162.544,23162.56,23162.569,23162.586,23162.579,23162.6,23162.579,23162.579,23162.593,23162.586,23162.587,23162.629,23162.574,23162.61,23162.623,23162.639,23162.626,23162.638,23162.636,23162.645,23162.627,23162.686,23162.674,23162.682,23162.71,23162.68,23162.701,23162.704,23162.699,23162.696,23162.721,23162.751,23162.733,23162.752,23162.743,23162.711,23162.707,23162.728,23162.726,23162.717,23162.749,23162.708,23162.713,23162.742,23162.718,23162.73,23162.738,23162.728,23162.769,23162.748,23162.761,23162.75,23162.735,23162.751,23162.755,23162.741,23162.739,23162.773,23162.733,23162.772,23162.769,23162.805,23162.812,23162.793,23162.802,23162.815,23162.822,23162.789,23162.801,23162.79,23162.802,23162.795,23162.801,23162.809,23162.82,23162.817,23162.829,23162.831,23162.855,23162.888,23162.868,23162.875,23162.89,23162.91,23162.905,23162.911,23162.917,23162.898,23162.903,23162.917,23162.913,23162.933,23162.927,23162.929,23162.914,23162.924,23162.938,23162.936,23162.936,23162.949,23162.953,23162.96,23162.966,23162.981,23162.961,23162.958,23162.968,23162.968,23162.936,23162.925,23162.967,23162.949,23162.959,23162.962,23163.0,23162.994,23162.98,23162.99,23162.981,23162.977,23162.994,23163.002,23162.98,23163.009,23163.004,23163.013,23162.991,23163.024,23163.029,23163.017,23163.02,23163.031,23163.019,23163.022,23163.02,23163.009,23163.014,23163.042,23163.03,23163.043,23163.047,23163.023,23163.046,23163.027,23163.02,23163.038,23163.041,23163.015,23163.053,23163.057,23163.06,23163.065,231
63.079,23163.063,23163.073,23163.06,23163.091,23163.121,23163.079,23163.1,23163.083,23163.067,23163.102,23163.108,23163.124,23163.121,23163.093,23163.115,23163.097,23163.108,23163.1,23163.131,23163.141,23163.127,23163.115,23163.124,23163.112,23163.1,23163.136,23163.125,23163.139,23163.117,23163.145,23163.116,23163.113,23163.112,23163.146,23163.124,23163.105,23163.108,23163.119,23163.144,23163.153,23163.131,23163.124,23163.093,23163.09,23163.105,23163.097,23163.097,23163.101,23163.089,23163.114,23163.133,23163.129,23163.093,23163.115,23163.109,23163.106,23163.103,23163.134,23163.104,23163.141,23163.111,23163.1,23163.104,23163.114,23163.111,23163.116,23163.142,23163.133,23163.127,23163.143,23163.124,23163.109,23163.136,23163.137,23163.165,23163.163,23163.138,23163.14,23163.142,23163.14,23163.118,23163.132,23163.143,23163.165,23163.171,23163.159,23163.164,23163.149,23163.158,23163.181,23163.179,23163.177,23163.17,23163.198,23163.173,23163.16,23163.15,23163.132,23163.127,23163.157,23163.12,23163.123,23163.129,23163.093,23163.102,23163.108,23163.093,23163.109,23163.084,23163.054,23163.076,23163.052,23163.045,23163.045,23163.04,23163.038,23163.028,23163.041,23163.023,23163.015,23163.035,23163.052,23162.996,23163.026,23163.005,23163.013,23163.02,23163.012,23162.993,23163.008,23163.008,23163.035,23162.997,23163.027,23163.011,23163.002,23163.0,23163.019,23163.004,23163.007,23163.011,23162.995,23163.029,23163.002,23163.007,23163.01,23162.967,23162.982,23163.0,23162.984,23162.958,23162.961,23162.966,23162.954,23162.948,23162.943,23162.947,23162.965,23162.939,23162.939,23162.909,23162.949,23162.925,23162.931,23162.958,23162.937,23162.936,23162.954,23162.943,23162.97,23162.964,23162.974,23162.993,23163.002,23163.004,23163.015,23163.031,23163.026,23163.061,23163.06,23163.073,23163.067,23163.112,23163.122,23163.136,23163.128,23163.147,23163.169,23163.188,23163.172,23163.186,23163.195,23163.18,23163.173,23163.18,23163.158,23163.165,23163.147,23163.158,23163.148,23163.12,23163.144,2
3163.156,23163.147,23163.12,23163.144,23163.126,23163.13,23163.133,23163.128,23163.124,23163.106,23163.108,23163.09,23163.085,23163.052,23163.049,23163.045,23163.065,23163.069,23163.059,23163.038,23163.063,23163.045,23163.051,23163.059,23163.031,23163.093,23163.074,23163.101,23163.085,23163.1,23163.077,23163.069,23163.093,23163.093,23163.115,23163.114,23163.124,23163.161,23163.138,23163.143,23163.154,23163.154,23163.162,23163.191,23163.164,23163.191,23163.183,23163.186,23163.212,23163.201,23163.219,23163.213,23163.23,23163.256,23163.255,23163.25,23163.257,23163.29,23163.315,23163.328,23163.309,23163.321,23163.314,23163.333,23163.351,23163.374,23163.373,23163.368,23163.384,23163.428,23163.403,23163.419,23163.42,23163.451,23163.465,23163.449,23163.465,23163.478,23163.469,23163.536,23163.513,23163.523,23163.523,23163.558,23163.537,23163.569,23163.547,23163.553,23163.581,23163.561,23163.598,23163.57,23163.603,23163.638,23163.628,23163.633,23163.612,23163.627,23163.66,23163.664,23163.68,23163.697,23163.719,23163.721,23163.719,23163.741,23163.734,23163.755,23163.784,23163.77,23163.781,23163.778,23163.796,23163.815,23163.809,23163.82,23163.822,23163.857,23163.853,23163.886,23163.891,23163.891,23163.927,23163.889,23163.956,23163.968,23163.978,23164.004,23163.956,23163.979,23163.976,23163.996,23163.999,23164.008,23164.026,23164.018,23164.045,23164.088,23164.083,23164.119,23164.101,23164.123,23164.141,23164.163,23164.164,23164.166,23164.154,23164.162,23164.16,23164.154,23164.218,23164.197,23164.201,23164.201,23164.192,23164.204,23164.238,23164.206,23164.227,23164.252,23164.255,23164.294,23164.274,23164.29,23164.316,23164.329,23164.334,23164.356,23164.329,23164.359,23164.382,23164.382,23164.367,23164.391,23164.386,23164.413,23164.433,23164.432,23164.412,23164.507,23164.505,23164.532,23164.514,23164.538,23164.539,23164.558,23164.549,23164.565,23164.568,23164.583,23164.607,23164.585,23164.624,23164.609,23164.645,23164.644,23164.642,23164.657,23164.678,23164.697,23164.673,23164.6
92,23164.712,23164.691,23164.714,23164.728,23164.793,23164.769,23164.72,23164.787,23164.737,23164.769,23164.775,23164.78,23164.781,23164.816,23164.81,23164.828,23164.842,23164.849,23164.86,23164.879,23164.879,23164.858,23164.886,23164.89,23164.902,23164.903,23164.897,23164.922,23164.921,23164.94,23164.96,23164.949,23164.966,23164.98,23164.961,23164.999,23165.003,23165.005,23165.034,23165.01,23165.018,23165.032,23165.027,23165.018,23165.084,23165.062,23165.079,23165.097,23165.119,23165.125,23165.117,23165.106,23165.123,23165.131,23165.126,23165.13,23165.171,23165.155,23165.175,23165.17,23165.179,23165.155,23165.168,23165.182,23165.188,23165.192,23165.214,23165.22,23165.24,23165.239,23165.25,23165.267,23165.211,23165.263,23165.293,23165.284,23165.305,23165.293,23165.303,23165.276,23165.284,23165.309,23165.299,23165.31,23165.326,23165.336,23165.335,23165.346,23165.362,23165.363,23165.33,23165.371,23165.371,23165.394,23165.386,23165.419,23165.398,23165.438,23165.418,23165.415,23165.417,23165.438,23165.484,23165.508,23165.517,23165.532,23165.546,23165.564,23165.559,23165.584,23165.601,23165.606,23165.594,23165.617,23165.632,23165.627,23165.63,23165.639,23165.661,23165.665,23165.674,23165.681,23165.682,23165.72,23165.704,23165.724,23165.72,23165.719,23165.749,23165.73,23165.725,23165.707,23165.741,23165.742,23165.762,23165.759,23165.79,23165.757,23165.768,23165.77,23165.801,23165.796,23165.795,23165.797,23165.807,23165.813,23165.795,23165.788,23165.788,23165.794,23165.797,23165.83,23165.838,23165.807,23165.836,23165.868,23165.894,23165.895,23165.887,23165.927,23165.924,23165.883,23165.881,23165.921,23165.901,23165.9,23165.923,23165.957,23165.931,23165.95,23165.986,23165.975,23165.976,23165.97,23165.98,23166.004,23166.007,23166.02,23165.993,23166.014,23166.0,23166.022,23166.038,23166.06,23166.044,23166.071,23166.109,23166.068,23166.07,23166.057,23166.077,23166.082,23166.082,23166.076,23166.083,23166.077,23166.056,23166.079,23166.068,23166.068,23166.064,23166.12,23166.115,2
3166.131,23166.116,23166.154,23166.175,23166.182,23166.213,23166.188,23166.204,23166.197,23166.202,23166.218,23166.203,23166.199,23166.231,23166.222,23166.274,23166.277,23166.265,23166.287,23166.307,23166.306,23166.328,23166.331,23166.363,23166.334,23166.327,23166.364,23166.356,23166.323,23166.36,23166.388,23166.386,23166.396,23166.41,23166.424,23166.45,23166.435,23166.457,23166.493,23166.504,23166.488,23166.481,23166.473,23166.52,23166.512,23166.529,23166.53,23166.544,23166.523,23166.525,23166.526,23166.532,23166.526,23166.588,23166.558,23166.59,23166.59,23166.605,23166.604,23166.617,23166.614,23166.62,23166.614,23166.607,23166.611,23166.627,23166.638,23166.637,23166.666,23166.679,23166.693,23166.687,23166.693,23166.694,23166.659,23166.68,23166.672,23166.666,23166.693,23166.644,23166.673,23166.652,23166.656,23166.677,23166.702,23166.696,23166.699,23166.736,23166.753,23166.735,23166.79,23166.778,23166.748,23166.772,23166.742,23166.768,23166.777,23166.796,23166.786,23166.793,23166.84,23166.837,23166.847,23166.839,23166.879,23166.874,23166.865,23166.861,23166.885,23166.903,23166.88,23166.914,23166.893,23166.918,23166.909,23166.915,23166.936,23166.949,23166.909,23166.925,23166.938,23166.924,23166.94,23166.937,23166.93,23166.921,23166.908,23166.922,23166.905,23166.92,23166.923,23166.916,23166.95,23166.955,23166.939,23166.954,23166.952,23166.944,23166.935,23166.949,23166.961,23166.977,23166.996,23166.975,23166.976,23166.963,23166.996,23166.984,23167.005,23167.005,23167.059,23167.061,23167.069,23167.033,23167.036,23167.021,23167.047,23167.041,23167.075,23167.042,23167.038,23167.052,23167.063,23167.042,23167.031,23167.084,23167.063,23167.08,23167.07,23167.074,23167.087,23167.077,23167.105,23167.107,23167.106,23167.094,23167.122,23167.123,23167.119,23167.102,23167.126,23167.126,23167.093,23167.105,23167.11,23167.143,23167.123,23167.137,23167.133,23167.163,23167.121,23167.178,23167.163,23167.169,23167.168,23167.177,23167.175,23167.153,23167.138,23167.153,23167.153,23167.156,
23167.14,23167.142,23167.142,23167.145,23167.138,23167.139,23167.138,23167.132,23167.165,23167.16,23167.171,23167.178,23167.175,23167.18,23167.168,23167.146,23167.185,23167.201,23167.197,23167.188,23167.196,23167.227,23167.233,23167.215,23167.226,23167.217,23167.239,23167.23,23167.249,23167.26,23167.233,23167.24,23167.22,23167.246,23167.252,23167.251,23167.269,23167.27,23167.27,23167.258,23167.255,23167.273,23167.288,23167.274,23167.262,23167.262,23167.262,23167.273,23167.295,23167.294,23167.286,23167.307,23167.305,23167.281,23167.281,23167.291,23167.276,23167.303,23167.316,23167.306,23167.324,23167.33,23167.327,23167.332,23167.355,23167.36,23167.347,23167.344,23167.337,23167.307,23167.323,23167.318,23167.304,23167.316,23167.325,23167.301,23167.297,23167.291,23167.333,23167.306,23167.312,23167.318,23167.327,23167.33,23167.337,23167.358,23167.353,23167.35,23167.36,23167.374,23167.382,23167.378,23167.38,23167.382,23167.378,23167.397,23167.414,23167.389,23167.379,23167.378,23167.368,23167.394,23167.364,23167.376,23167.351,23167.392,23167.39,23167.356,23167.385,23167.351,23167.353,23167.371,23167.362,23167.374,23167.354,23167.376,23167.365,23167.35,23167.353,23167.355,23167.344,23167.357,23167.345,23167.347,23167.349,23167.312,23167.291,23167.306,23167.283,23167.289,23167.284,23167.279,23167.285,23167.297,23167.294,23167.305,23167.289,23167.287,23167.265,23167.303,23167.32,23167.298,23167.278,23167.246,23167.275,23167.283,23167.283,23167.313,23167.306,23167.264,23167.302,23167.299,23167.313,23167.297,23167.314,23167.315,23167.313,23167.328,23167.316,23167.319,23167.294,23167.308,23167.324,23167.338,23167.329,23167.321,23167.308,23167.321,23167.311,23167.307,23167.293,23167.307,23167.303,23167.307,23167.287,23167.303,23167.33,23167.297,23167.317,23167.292,23167.262,23167.277,23167.305,23167.291,23167.29,23167.274,23167.287,23167.306,23167.277,23167.266,23167.284,23167.285,23167.282,23167.294,23167.274,23167.277,23167.28,23167.288,23167.299,23167.286,23167.296,23167.295,2
3167.288,23167.266,23167.282,23167.284,23167.287,23167.277,23167.285,23167.293,23167.297,23167.281,23167.295,23167.283,23167.26,23167.26,23167.291,23167.271,23167.284,23167.264,23167.285,23167.277,23167.254,23167.27,23167.278,23167.266,23167.288,23167.278,23167.267,23167.282,23167.261,23167.269,23167.272,23167.27,23167.251,23167.258,23167.249,23167.244,23167.242,23167.223,23167.233,23167.254,23167.226,23167.242,23167.251,23167.267,23167.242,23167.247,23167.273,23167.266,23167.273,23167.264,23167.284,23167.279,23167.237,23167.262,23167.274,23167.239,23167.228,23167.236,23167.23,23167.238,23167.228,23167.24,23167.281,23167.275,23167.273,23167.297,23167.3,23167.325,23167.33,23167.338,23167.337,23167.347,23167.349,23167.354,23167.363,23167.371,23167.391,23167.404,23167.409,23167.392,23167.435,23167.417,23167.456,23167.435,23167.423,23167.432,23167.423,23167.433,23167.413,23167.444,23167.394,23167.427,23167.4,23167.387,23167.37,23167.362,23167.37,23167.385,23167.374,23167.37,23167.373,23167.361,23167.36,23167.367,23167.395,23167.41,23167.376,23167.35,23167.41,23167.385,23167.414,23167.409,23167.39,23167.412,23167.399,23167.4,23167.405,23167.383,23167.413,23167.425,23167.417,23167.443,23167.436,23167.436,23167.439,23167.461,23167.475,23167.486,23167.497,23167.509,23167.512,23167.504,23167.5,23167.542,23167.507,23167.492,23167.516,23167.489,23167.523,23167.519,23167.507,23167.494,23167.512,23167.529,23167.572,23167.58,23167.573,23167.596,23167.601,23167.592,23167.623,23167.622,23167.651,23167.653,23167.644,23167.686,23167.68,23167.684,23167.709,23167.695,23167.702,23167.711,23167.701,23167.744,23167.737,23167.726,23167.709,23167.748,23167.749,23167.746,23167.743,23167.733,23167.721,23167.725,23167.725,23167.728,23167.755,23167.733,23167.742,23167.768,23167.776,23167.8,23167.803,23167.814,23167.845,23167.879,23167.891,23167.906,23167.919,23167.947,23167.956,23167.942,23167.936,23167.933,23167.965,23167.965,23167.962,23167.933,23167.964,23167.948,23167.918,23167.923,23167.92
2,23167.915,23167.932,23167.922,23167.921,23167.926,23167.942,23167.952,23167.957,23167.951,23167.986,23167.982,23167.964,23167.971,23167.957,23167.954,23167.971,23167.95,23167.947,23167.926,23167.955,23167.962,23167.972,23167.966,23167.959,23167.948,23167.97,23168.011,23168.027,23168.038,23168.06,23168.077,23168.067,23168.088,23168.096,23168.115,23168.092,23168.097,23168.132,23168.14,23168.131,23168.12,23168.159,23168.175,23168.199,23168.188,23168.198,23168.221,23168.21,23168.23,23168.246,23168.239,23168.27,23168.266,23168.255,23168.26,23168.27,23168.253,23168.276,23168.281,23168.309,23168.292,23168.29,23168.289,23168.292,23168.277,23168.301,23168.283,23168.333,23168.324,23168.356,23168.347,23168.348,23168.337,23168.333,23168.339,23168.349,23168.345,23168.385,23168.372,23168.383,23168.394,23168.383,23168.398,23168.413,23168.429,23168.432,23168.412,23168.402,23168.423,23168.44,23168.437,23168.453,23168.479,23168.459,23168.469,23168.484,23168.465,23168.485,23168.484,23168.504,23168.518,23168.51,23168.499,23168.528,23168.541,23168.544,23168.554,23168.583,23168.576,23168.56,23168.562,23168.536,23168.56,23168.575,23168.575,23168.594,23168.611,23168.6,23168.608,23168.632,23168.632,23168.646,23168.658,23168.65,23168.679,23168.672,23168.674,23168.671,23168.647,23168.707,23168.695,23168.669,23168.627,23168.675,23168.661,23168.671,23168.702,23168.7,23168.702,23168.728,23168.724,23168.725,23168.706,23168.728,23168.73,23168.723,23168.708,23168.724,23168.745,23168.744,23168.707,23168.709,23168.695,23168.715,23168.726,23168.719,23168.733,23168.763,23168.736,23168.721,23168.714,23168.752,23168.734,23168.724,23168.717,23168.726,23168.769,23168.763,23168.786,23168.777,23168.783,23168.766,23168.784,23168.817,23168.837,23168.812,23168.834,23168.838,23168.85,23168.85,23168.862,23168.862,23168.883,23168.915,23168.918,23168.929,23168.911,23168.926,23168.958,23168.979,23168.984,23168.979,23168.974,23168.979,23168.971,23168.961,23168.957,23168.953,23168.961,23168.949,23168.94,23168.947,23
168.965,23168.966,23168.981,23168.991,23168.974,23169.001,23168.986,23168.983,23168.998,23169.004,23169.035,23169.023,23169.009,23168.988,23168.992,23168.98,23168.95,23168.968,23168.945,23168.941,23168.949,23168.973,23168.969,23168.973,23169.01,23169.018,23169.047,23169.018,23169.016,23169.029,23169.059,23169.067,23169.065,23169.05,23169.051,23169.081,23169.058,23169.053,23169.05,23169.057,23169.082,23169.054,23169.076,23169.107,23169.099,23169.136,23169.093,23169.124,23169.116,23169.1,23169.122,23169.091,23169.069,23169.042,23169.064,23169.051,23169.048,23169.056,23169.05,23169.078,23169.079,23169.06,23169.042,23169.068,23169.064,23169.076,23169.102,23169.053,23169.071,23169.076,23169.075,23169.069,23169.05,23169.067,23169.066,23169.071,23169.033,23169.065,23169.045,23169.083,23169.026,23169.053,23169.048,23169.033,23169.06,23169.028,23169.035,23169.028,23169.034,23169.021,23169.009,23169.031,23168.986,23168.987,23169.019,23169.017,23169.01,23169.013,23169.028,23169.0,23169.018,23169.016,23169.039,23169.037,23169.031,23169.056,23169.044,23169.069,23169.048,23169.054,23169.06,23169.07,23169.081,23169.061,23169.068,23169.067,23169.042,23169.045,23169.063,23169.077,23169.082,23169.094,23169.091,23169.084,23169.074,23169.086,23169.067,23169.061,23169.08,23169.095,23169.063,23169.075,23169.084,23169.059,23169.055,23169.031,23169.052,23169.069,23169.044,23169.045,23169.016,23169.017,23169.003,23168.99,23169.004,23169.015,23168.982,23168.989,23169.016,23168.99,23168.987,23168.99,23168.98,23168.991,23168.979,23168.989,23168.996,23169.008,23168.941,23168.969,23168.944,23168.936,23168.943,23168.995,23168.983,23168.944,23169.002,23169.021,23169.056,23169.103,23169.116,23169.125,23169.189,23169.221,23169.249,23169.308,23169.344,23169.407,23169.436,23169.483,23169.516,23169.564,23169.57,23169.653,23169.68,23169.697,23169.674,23169.69,23169.696,23169.644,23169.616,23169.567,23169.517,23169.445,23169.404,23169.326,23169.235,23169.167,23169.078,23169.015,23168.95,23168.868,23168.8
34,23168.786,23168.75,23168.677,23168.623,23168.598,23168.552,23168.519,23168.529,23168.496,23168.495,23168.492,23168.503,23168.519,23168.5,23168.547,23168.55,23168.579,23168.619,23168.635,23168.651,23168.737,23168.755,23168.789,23168.792,23168.834,23168.843,23168.858,23168.864,23168.888,23168.877,23168.866,23168.872,23168.839,23168.832,23168.838,23168.805,23168.818,23168.775,23168.789,23168.778,23168.784,23168.778,23168.768,23168.795,23168.767,23168.805,23168.765,23168.708,23168.72,23168.769,23168.815,23168.815,23168.815,23168.821,23168.834,23168.826,23168.823,23168.868,23168.85,23168.865,23168.877,23168.891,23168.873,23168.913,23168.95,23168.909,23168.925,23168.923,23168.942,23168.963,23168.932,23168.93,23168.936,23168.919,23168.944,23168.94,23168.955,23168.937,23168.939,23168.911,23168.904,23168.896,23168.847,23168.849,23168.856,23168.859,23168.843,23168.806,23168.802,23168.775,23168.765,23168.746,23168.728,23168.731,23168.728,23168.755,23168.729,23168.739,23168.712,23168.72,23168.716,23168.712,23168.716,23168.699,23168.679,23168.679,23168.664,23168.671,23168.671,23168.635,23168.615,23168.609,23168.581,23168.569,23168.567,23168.536,23168.544,23168.506,23168.501,23168.45,23168.428,23168.4,23168.365,23168.37,23168.334,23168.317,23168.289,23168.263,23168.255,23168.246,23168.203,23168.174,23168.211,23168.223,23168.17,23168.128,23168.155,23168.135,23168.14,23168.111,23168.099,23168.114,23168.121,23168.13,23168.144,23168.135,23168.154,23168.199,23168.21,23168.22,23168.237,23168.204,23168.214,23168.253,23168.289,23168.295,23168.344,23168.346,23168.356,23168.378,23168.402,23168.422,23168.447,23168.438,23168.452,23168.486,23168.48,23168.514,23168.507,23168.509,23168.48,23168.513,23168.505,23168.521,23168.502,23168.487,23168.487,23168.471,23168.474,23168.491,23168.465,23168.445,23168.449,23168.429,23168.419,23168.422,23168.441,23168.425,23168.425,23168.407,23168.416,23168.43,23168.446,23168.441,23168.439,23168.441,23168.441,23168.417,23168.458,23168.445,23168.453,23168.43,
23168.399,23168.414,23168.387,23168.411,23168.39,23168.371,23168.363,23168.437,23168.38,23168.404,23168.387,23168.394,23168.383,23168.358,23168.357,23168.372,23168.349,23168.355,23168.324,23168.289,23168.265,23168.283,23168.283,23168.22,23168.231,23168.253,23168.256,23168.248,23168.268,23168.199,23168.206,23168.204,23168.214,23168.198,23168.189,23168.201,23168.181,23168.185,23168.178,23168.181,23168.212,23168.214,23168.226,23168.194,23168.205,23168.176,23168.205,23168.216,23168.232,23168.273,23168.212,23168.209,23168.215,23168.205,23168.219,23168.226,23168.234,23168.244,23168.255,23168.248,23168.291,23168.261,23168.278,23168.284,23168.293,23168.296,23168.291,23168.313,23168.303,23168.286,23168.296,23168.291,23168.274,23168.241,23168.248,23168.213,23168.199,23168.204,23168.173,23168.152,23168.148,23168.128,23168.105,23168.072,23168.061,23168.086,23168.063,23168.021,23168.02,23168.011,23168.01,23168.042,23168.043,23168.026,23168.028,23168.036,23168.065,23168.059,23168.055,23168.057,23168.071,23168.076,23168.074,23168.078,23168.081,23168.095,23168.103,23168.12,23168.116,23168.104,23168.085,23168.099,23168.128,23168.125,23168.154,23168.15,23168.17,23168.166,23168.185,23168.176,23168.128,23168.135,23168.148,23168.131,23168.096,23168.08,23168.058,23168.038,23168.021,23168.041,23168.037,23168.046,23168.04,23168.051,23168.043,23168.086,23168.048,23168.068,23168.046,23168.063,23168.06,23168.077,23168.073,23168.094,23168.078,23168.075,23168.07,23168.089,23168.065,23168.059,23168.048,23168.051,23168.036,23168.017,23168.045,23168.007,23168.016,23168.01,23168.009,23168.005,23167.99,23168.002,23167.963,23167.97,23167.991,23167.975,23167.98,23167.978,23167.963,23167.984,23167.975,23167.967,23167.957,23167.968,23167.996,23167.997,23167.996,23167.987,23168.023,23168.022,23168.017,23168.026,23168.039,23168.104,23168.108,23168.083,23168.09,23168.114,23168.115,23168.113,23168.138,23168.158,23168.176,23168.179,23168.177,23168.189,23168.18,23168.201,23168.231,23168.207,23168.203,23168.18
1,23168.195,23168.18,23168.195,23168.193,23168.203,23168.177,23168.158,23168.163,23168.145,23168.137,23168.149,23168.141,23168.121,23168.117,23168.089,23168.081,23168.104,23168.099,23168.111,23168.121,23168.1,23168.083,23168.075,23168.058,23168.085,23168.054,23168.08,23168.089,23168.07,23168.055,23168.076,23168.076,23168.045,23168.057,23168.066,23168.075,23168.064,23168.07,23168.061,23168.049,23168.045,23168.043,23168.043,23168.033,23168.026,23168.056,23168.041,23168.073,23168.079,23168.073,23168.087,23168.098,23168.083,23168.073,23168.073,23168.074,23168.067,23168.082,23168.097,23168.117,23168.101,23168.117,23168.129,23168.129,23168.092,23168.116,23168.171,23168.144,23168.138,23168.146,23168.104,23168.174,23168.163,23168.165,23168.168,23168.168,23168.188,23168.225,23168.193,23168.204,23168.225,23168.205,23168.228,23168.231,23168.225,23168.186,23168.204,23168.221,23168.236,23168.234,23168.255,23168.213,23168.232,23168.24,23168.228,23168.236,23168.259,23168.274,23168.306,23168.299,23168.309,23168.289,23168.29,23168.276,23168.281,23168.268,23168.268,23168.296,23168.264,23168.306,23168.3,23168.289,23168.289,23168.304,23168.318,23168.301,23168.314,23168.294,23168.306,23168.299,23168.304,23168.31,23168.292,23168.282,23168.31,23168.297,23168.312,23168.305,23168.341,23168.341,23168.32,23168.333,23168.319,23168.348,23168.36,23168.345,23168.351,23168.356,23168.351,23168.368,23168.369,23168.348,23168.363,23168.392,23168.415,23168.401,23168.415,23168.436,23168.431,23168.413,23168.418,23168.449,23168.443,23168.436,23168.462,23168.465,23168.455,23168.461,23168.456,23168.474,23168.462,23168.468,23168.482,23168.487,23168.484,23168.486,23168.496,23168.499,23168.477,23168.463,23168.437,23168.461,23168.479,23168.467,23168.466,23168.443,23168.461,23168.47,23168.463,23168.493,23168.454,23168.44,23168.414,23168.42,23168.454,23168.454,23168.458,23168.467,23168.455,23168.482,23168.496,23168.515,23168.513,23168.521,23168.511,23168.55,23168.567,23168.564,23168.584,23168.575,23168.584,23168.
597,23168.594,23168.598,23168.592,23168.585,23168.614,23168.594,23168.599,23168.62,23168.613,23168.593,23168.599,23168.583,23168.59,23168.603,23168.569,23168.604,23168.616,23168.595,23168.592,23168.602,23168.573,23168.546,23168.547,23168.554,23168.538,23168.539,23168.526,23168.523,23168.526,23168.518,23168.538,23168.503,23168.498,23168.533,23168.541,23168.523,23168.53,23168.537,23168.526,23168.524,23168.533,23168.553,23168.56,23168.558,23168.554,23168.545,23168.528,23168.559,23168.521,23168.513,23168.518,23168.495,23168.503,23168.481,23168.46,23168.46,23168.452,23168.433,23168.435,23168.42,23168.418,23168.407,23168.392,23168.428,23168.414,23168.37,23168.353,23168.352,23168.351,23168.359,23168.341,23168.355,23168.336,23168.34,23168.34,23168.343,23168.349,23168.322,23168.304,23168.311,23168.301,23168.324,23168.333,23168.301,23168.302,23168.31,23168.295,23168.287,23168.254,23168.25,23168.287,23168.272,23168.26,23168.269,23168.266,23168.266,23168.263,23168.248,23168.248,23168.252,23168.235,23168.248,23168.23,23168.221,23168.227,23168.235,23168.237,23168.239,23168.225,23168.209,23168.206,23168.177,23168.186,23168.172,23168.154,23168.169,23168.14,23168.144,23168.16,23168.156,23168.159,23168.152,23168.148,23168.171,23168.171,23168.148,23168.16,23168.178,23168.191,23168.165,23168.16,23168.187,23168.201,23168.21,23168.224,23168.235,23168.23,23168.265,23168.235,23168.27,23168.247,23168.294,23168.281,23168.272,23168.297,23168.285,23168.308,23168.309,23168.319,23168.289,23168.263,23168.275,23168.281,23168.271,23168.277,23168.286,23168.303,23168.287,23168.307,23168.284,23168.297,23168.269,23168.275,23168.314,23168.325,23168.319,23168.337,23168.33,23168.306,23168.334,23168.327,23168.34,23168.37,23168.361,23168.372,23168.356,23168.374,23168.391,23168.387,23168.399,23168.4,23168.401,23168.397,23168.435,23168.451,23168.442,23168.454,23168.471,23168.45,23168.458,23168.455,23168.446,23168.472,23168.472,23168.459,23168.459,23168.461,23168.46,23168.446,23168.449,23168.459,23168.457,2316
8.451,23168.464,23168.477,23168.485,23168.478,23168.479,23168.474,23168.487,23168.481,23168.507,23168.526,23168.519,23168.492,23168.512,23168.526,23168.545,23168.564,23168.559,23168.551,23168.568,23168.544,23168.523,23168.526,23168.55,23168.572,23168.541,23168.521,23168.556,23168.535,23168.549,23168.557,23168.53,23168.55,23168.551,23168.554,23168.518,23168.551,23168.537,23168.524,23168.536,23168.528,23168.526,23168.505,23168.528,23168.508,23168.534,23168.518,23168.489,23168.505,23168.492,23168.502,23168.481,23168.482,23168.467,23168.483,23168.472,23168.506,23168.469,23168.493,23168.493,23168.475,23168.504,23168.5,23168.496,23168.517,23168.485,23168.509,23168.504,23168.496,23168.554,23168.548,23168.545,23168.543,23168.542,23168.571,23168.553,23168.533,23168.533,23168.559,23168.525,23168.521,23168.537,23168.546,23168.57,23168.56,23168.564,23168.564,23168.578,23168.584,23168.561,23168.573,23168.591,23168.582,23168.6,23168.58,23168.584,23168.593,23168.592,23168.579,23168.594,23168.587,23168.579,23168.582,23168.572,23168.63,23168.606,23168.636,23168.627,23168.635,23168.661,23168.66,23168.655,23168.68,23168.699,23168.696,23168.678,23168.677,23168.682,23168.687,23168.688,23168.702,23168.688,23168.685,23168.707,23168.721,23168.727,23168.767,23168.748,23168.784,23168.784,23168.81,23168.822,23168.831,23168.809,23168.83,23168.813,23168.821,23168.808,23168.839,23168.849,23168.865,23168.869,23168.851,23168.829,23168.882,23168.889,23168.891,23168.908,23168.913,23168.934,23168.919,23168.955,23168.958,23168.979,23168.95,23168.92,23168.955,23168.944,23168.964,23168.974,23168.99,23168.982,23168.965,23168.962,23168.994,23169.01,23169.017,23169.011,23169.022,23169.032,23169.069,23169.04,23169.02,23169.04,23169.011,23169.014,23169.012,23169.027,23169.056,23169.042,23169.053,23169.052,23169.079,23169.035,23169.017,23169.063,23169.059,23169.091,23169.055,23169.083,23169.07,23169.09,23169.085,23169.081,23169.071,23169.055,23169.058,23169.07,23169.05,23169.073,23169.074,23169.108,23169.115,
23169.089,23169.077,23169.063,23169.093,23169.083,23169.097,23169.05,23169.076,23169.039,23169.069,23169.049,23169.03,23169.021,23169.045,23169.043,23169.038,23169.059,23169.041,23169.061,23169.071,23169.051,23169.08,23169.057,23169.08,23169.099,23169.093,23169.074,23169.079,23169.075,23169.086,23169.099,23169.101,23169.106,23169.129,23169.132,23169.139,23169.147,23169.169,23169.174,23169.154,23169.147,23169.163,23169.178,23169.176,23169.183,23169.159,23169.168,23169.176,23169.172,23169.19,23169.209,23169.217,23169.204,23169.236,23169.265,23169.262,23169.265,23169.278,23169.303,23169.321,23169.333,23169.302,23169.317,23169.324,23169.336,23169.362,23169.359,23169.372,23169.388,23169.376,23169.385,23169.407,23169.408,23169.41,23169.397,23169.413,23169.405,23169.385,23169.414,23169.412,23169.376,23169.406,23169.401,23169.41,23169.402,23169.394,23169.425,23169.388,23169.425,23169.406,23169.407,23169.427,23169.442,23169.432,23169.436,23169.434,23169.44,23169.429,23169.444,23169.472,23169.431,23169.447,23169.475,23169.487,23169.496,23169.496,23169.503,23169.495,23169.489,23169.464,23169.482,23169.516,23169.518,23169.524,23169.514,23169.533,23169.548,23169.548,23169.57,23169.568,23169.562,23169.582,23169.586,23169.588,23169.597,23169.594,23169.602,23169.575,23169.586,23169.597,23169.603,23169.622,23169.607,23169.603,23169.619,23169.626,23169.644,23169.651,23169.632,23169.655,23169.658,23169.659,23169.659,23169.656,23169.69,23169.684,23169.694,23169.705,23169.71,23169.699,23169.719,23169.714,23169.713,23169.719,23169.75,23169.726,23169.747,23169.749,23169.784,23169.779,23169.806,23169.815,23169.828,23169.809,23169.826,23169.864,23169.871,23169.886,23169.918,23169.906,23169.933,23169.926,23169.926,23169.921,23169.943,23169.907,23169.935,23169.917,23169.932,23169.936,23169.941,23169.962,23169.968,23169.989,23169.992,23169.975,23169.989,23169.985,23170.012,23170.004,23170.004,23170.006,23170.012,23169.996,23170.004,23170.036,23170.055,23170.057,23170.068,23170.097,23170.072,23
170.123,23170.104,23170.115,23170.134,23170.172,23170.159,23170.181,23170.181,23170.17,23170.173,23170.131,23170.143,23170.18,23170.144,23170.157,23170.158,23170.186,23170.19,23170.185,23170.173,23170.182,23170.172,23170.19,23170.217,23170.222,23170.229,23170.224,23170.211,23170.199,23170.217,23170.192,23170.197,23170.186,23170.191,23170.171,23170.161,23170.137,23170.147,23170.156,23170.142,23170.136,23170.13,23170.149,23170.179,23170.163,23170.169,23170.165,23170.17,23170.159,23170.165,23170.212,23170.173,23170.199,23170.214,23170.198,23170.219,23170.2,23170.221,23170.249,23170.25,23170.245,23170.209,23170.239,23170.239,23170.245,23170.236,23170.233,23170.256,23170.227,23170.255,23170.256,23170.269,23170.283,23170.264,23170.281,23170.279,23170.273,23170.297,23170.303,23170.281,23170.281,23170.276,23170.292,23170.278,23170.259,23170.279,23170.276,23170.264,23170.256,23170.277,23170.28,23170.263,23170.284,23170.302,23170.3,23170.307,23170.302,23170.352,23170.319,23170.324,23170.315,23170.332,23170.346,23170.352,23170.35,23170.363,23170.379,23170.393,23170.394,23170.385,23170.396,23170.411,23170.392,23170.392,23170.395,23170.424,23170.395,23170.397,23170.395,23170.371,23170.365,23170.402,23170.407,23170.412,23170.428,23170.448,23170.431,23170.421,23170.464,23170.48,23170.458,23170.455,23170.482,23170.486,23170.484,23170.502,23170.502,23170.532,23170.527,23170.526,23170.534,23170.539,23170.586,23170.576,23170.583,23170.598,23170.59,23170.609,23170.621,23170.639,23170.641,23170.646,23170.659,23170.677,23170.681,23170.705,23170.719,23170.724,23170.718,23170.736,23170.73,23170.773,23170.772,23170.798,23170.805,23170.816,23170.822,23170.874,23170.881,23170.87,23170.873,23170.881,23170.917,23170.934,23170.946,23170.966,23170.981,23170.965,23170.976,23170.979,23171.002,23171.009,23171.013,23171.024,23171.03,23171.041,23171.061,23171.079,23171.096,23171.114,23171.133,23171.145,23171.176,23171.199,23171.208,23171.219,23171.238,23171.233,23171.268,23171.261,23171.282,23171.287,
23171.286,23171.32,23171.337,23171.338,23171.368,23171.377,23171.428,23171.448,23171.47,23171.483,23171.475,23171.5,23171.526,23171.521,23171.56,23171.531,23171.588,23171.537,23171.547,23171.568,23171.584,23171.573,23171.546,23171.55,23171.613,23171.625,23171.633,23171.628,23171.659,23171.676,23171.686,23171.669,23171.716,23171.689,23171.71,23171.733,23171.763,23171.742,23171.723,23171.715,23171.741,23171.748,23171.748,23171.779,23171.792,23171.825,23171.851,23171.806,23171.824,23171.836,23171.849,23171.857,23171.856,23171.864,23171.884,23171.857,23171.861,23171.87,23171.86,23171.852,23171.845,23171.875,23171.888,23171.864,23171.859,23171.906,23171.923,23171.893,23171.898,23171.9,23171.926,23171.912,23171.897,23171.883,23171.863,23171.859,23171.858,23171.811,23171.846,23171.846,23171.875,23171.867,23171.858,23171.855,23171.849,23171.887,23171.875,23171.902,23171.86,23171.887,23171.895,23171.874,23171.889,23171.917,23171.911,23171.887,23171.904,23171.9,23171.91,23171.908,23171.965,23171.973,23171.963,23171.979,23171.989,23171.994,23171.999,23171.999,23172.02,23171.981,23172.02,23172.035,23172.026,23172.025,23172.024,23172.025,23172.014,23172.037,23172.084,23172.06,23172.058,23172.064,23172.076,23172.083,23172.09,23172.1,23172.118,23172.112,23172.132,23172.167,23172.2,23172.232,23172.24,23172.228,23172.246,23172.25,23172.252,23172.231,23172.229,23172.227,23172.232,23172.222,23172.241,23172.279,23172.268,23172.3,23172.322,23172.341,23172.389,23172.406,23172.453,23172.481,23172.485,23172.466,23172.495,23172.533,23172.534,23172.535,23172.531,23172.533,23172.526,23172.514,23172.518,23172.519,23172.565,23172.604,23172.655,23172.702,23172.728,23172.75,23172.763,23172.789,23172.811,23172.82,23172.809,23172.82,23172.816,23172.818,23172.821,23172.81,23172.808,23172.814,23172.837,23172.867,23172.882,23172.865,23172.885,23172.907,23172.956,23172.935,23172.996,23172.991,23172.956,23172.994,23173.002,23173.0,23173.013,23173.053,23173.048,23173.05,23173.068,23173.079,23173.111,2317
3.128,23173.114,23173.139,23173.16,23173.151,23173.185,23173.213,23173.23,23173.213,23173.213,23173.249,23173.27,23173.295,23173.327,23173.346,23173.343,23173.351,23173.397,23173.419,23173.412,23173.439,23173.474,23173.476,23173.497,23173.514,23173.519,23173.538,23173.535,23173.541,23173.572,23173.568,23173.59,23173.599,23173.594,23173.628,23173.645,23173.618,23173.649,23173.657,23173.685,23173.687,23173.684,23173.714,23173.724,23173.742,23173.73,23173.749,23173.769,23173.775,23173.763,23173.801,23173.802,23173.802,23173.801,23173.839,23173.814,23173.822,23173.849,23173.852,23173.862,23173.843,23173.869,23173.87,23173.894,23173.876,23173.871,23173.877,23173.896,23173.897,23173.899,23173.9,23173.889,23173.924,23173.914,23173.963,23173.941,23173.919,23173.951,23173.949,23173.938,23173.932,23173.932,23173.975,23173.945,23173.939,23173.933,23173.947,23173.951,23173.968,23173.959,23173.932,23173.968,23173.932,23173.919,23173.926,23173.94,23173.948,23173.926,23173.918,23173.89,23173.903,23173.95,23173.915,23173.902,23173.919,23173.904,23173.926,23173.95,23173.937,23173.938,23173.933,23173.942,23173.943,23173.999,23173.99,23174.035,23174.005,23174.028,23174.017,23174.001,23173.995,23173.997,23174.007,23174.014,23174.01,23174.034,23174.028,23174.052,23174.025,23174.057,23174.04,23174.033,23174.035,23174.045,23174.039,23174.035,23174.055,23174.065,23174.065,23174.069,23174.059,23174.089,23174.083,23174.073,23174.063,23174.076,23174.095,23174.079,23174.083,23174.109,23174.076,23174.097,23174.09,23174.09,23174.11,23174.135,23174.112,23174.145,23174.136,23174.139,23174.156,23174.119,23174.128,23174.133,23174.124,23174.116,23174.126,23174.117,23174.153,23174.164,23174.151,23174.131,23174.11,23174.114,23174.125,23174.131,23174.095,23174.111,23174.081,23174.088,23174.082,23174.1,23174.115,23174.089,23174.076,23174.109,23174.118,23174.134,23174.142,23174.134,23174.148,23174.171,23174.15,23174.143,23174.176,23174.185,23174.169,23174.17,23174.192,23174.199,23174.214,23174.218,23174.2
07,23174.204,23174.2,23174.22,23174.231,23174.223,23174.208,23174.214,23174.222,23174.217,23174.222,23174.206,23174.178,23174.217,23174.205,23174.201,23174.188,23174.21,23174.217,23174.216,23174.227,23174.214,23174.249,23174.257,23174.259,23174.24,23174.241,23174.261,23174.278,23174.239,23174.27,23174.262,23174.276,23174.301,23174.305,23174.301,23174.321,23174.329,23174.357,23174.358,23174.371,23174.371,23174.399,23174.402,23174.426,23174.436,23174.435,23174.453,23174.465,23174.454,23174.453,23174.443,23174.454,23174.439,23174.463,23174.451,23174.45,23174.447,23174.465,23174.459,23174.448,23174.456,23174.45,23174.464,23174.482,23174.473,23174.457,23174.473,23174.506,23174.486,23174.498,23174.504,23174.522,23174.497,23174.548,23174.559,23174.557,23174.556,23174.553,23174.554,23174.528,23174.538,23174.57,23174.559,23174.563,23174.541,23174.56,23174.572,23174.553,23174.56,23174.577,23174.566,23174.57,23174.554,23174.573,23174.566,23174.579,23174.556,23174.588,23174.594,23174.584,23174.591,23174.581,23174.562,23174.563,23174.55,23174.535,23174.535,23174.555,23174.548,23174.535,23174.539,23174.527,23174.536,23174.528,23174.489,23174.501,23174.503,23174.492,23174.487,23174.47,23174.463,23174.445,23174.423,23174.422,23174.409,23174.428,23174.442,23174.397,23174.429,23174.429,23174.42,23174.44,23174.466,23174.456,23174.46,23174.471,23174.441,23174.453,23174.45,23174.462,23174.476,23174.469,23174.48,23174.486,23174.453,23174.446,23174.455,23174.442,23174.393,23174.402,23174.405,23174.403,23174.387,23174.365,23174.376,23174.365,23174.345,23174.329,23174.323,23174.339,23174.363,23174.347,23174.344,23174.375,23174.38,23174.358,23174.382,23174.369,23174.382,23174.38,23174.376,23174.353,23174.347,23174.354,23174.34,23174.33,23174.332,23174.328,23174.333,23174.33,23174.318,23174.303,23174.288,23174.29,23174.306,23174.285,23174.276,23174.274,23174.29,23174.274,23174.263,23174.225,23174.256,23174.273,23174.25,23174.262,23174.275,23174.264,23174.261,23174.284,23174.281,23174.252,2317
4.275,23174.28,23174.266,23174.307,23174.289,23174.27,23174.305,23174.304,23174.311,23174.26,23174.28,23174.274,23174.311,23174.298,23174.303,23174.291,23174.277,23174.286,23174.279,23174.303,23174.31,23174.315,23174.304,23174.3,23174.318,23174.319,23174.343,23174.343,23174.351,23174.351,23174.346,23174.364,23174.37,23174.388,23174.384,23174.389,23174.42,23174.41,23174.425,23174.426,23174.439,23174.438,23174.452,23174.461,23174.435,23174.411,23174.439,23174.439,23174.414,23174.417,23174.405,23174.415,23174.425,23174.445,23174.464,23174.455,23174.46,23174.467,23174.475,23174.43,23174.447,23174.441,23174.414,23174.433,23174.44,23174.417,23174.403,23174.374,23174.374,23174.39,23174.356,23174.341,23174.332,23174.332,23174.335,23174.342,23174.315,23174.317,23174.329,23174.343,23174.356,23174.345,23174.345,23174.38,23174.368,23174.344,23174.365,23174.392,23174.379,23174.367,23174.354,23174.355,23174.332,23174.339,23174.347,23174.361,23174.39,23174.387,23174.365,23174.335,23174.345,23174.341,23174.351,23174.34,23174.339,23174.31,23174.332,23174.303,23174.276,23174.265,23174.286,23174.25,23174.282,23174.292,23174.28,23174.291,23174.293,23174.291,23174.299,23174.282,23174.29,23174.277,23174.262,23174.268,23174.259,23174.292,23174.26,23174.267,23174.295,23174.286,23174.289,23174.31,23174.317,23174.329,23174.321,23174.361,23174.334,23174.358,23174.36,23174.387,23174.411,23174.372,23174.386,23174.409,23174.389,23174.396,23174.427,23174.433,23174.423,23174.427,23174.462,23174.462,23174.453,23174.453,23174.478,23174.472,23174.468,23174.453,23174.43,23174.431,23174.453,23174.422,23174.457,23174.46,23174.469,23174.467,23174.479,23174.501,23174.515,23174.503,23174.472,23174.48,23174.479,23174.51,23174.464,23174.446,23174.458,23174.435,23174.437,23174.463,23174.414,23174.422,23174.428,23174.429,23174.407,23174.437,23174.455,23174.44,23174.445,23174.44,23174.426,23174.435,23174.46,23174.428,23174.421,23174.411,23174.414,23174.401,23174.392,23174.373,23174.376,23174.367,23174.395,23174
.393,23174.38,23174.413,23174.404,23174.392,23174.414,23174.44,23174.423,23174.421,23174.42,23174.39,23174.404,23174.366,23174.361,23174.376,23174.339,23174.347,23174.323,23174.321,23174.324,23174.342,23174.321,23174.314,23174.348,23174.378,23174.373,23174.393,23174.363,23174.388,23174.413,23174.391,23174.41,23174.393,23174.366,23174.353,23174.314,23174.3,23174.293,23174.285,23174.264,23174.257,23174.272,23174.268,23174.283,23174.287,23174.286,23174.313,23174.339,23174.334,23174.338,23174.336,23174.338,23174.33,23174.317,23174.269,23174.267,23174.237,23174.262,23174.23,23174.214,23174.221,23174.26,23174.245,23174.299,23174.317,23174.317,23174.356,23174.345,23174.358,23174.327,23174.323,23174.315,23174.319,23174.322,23174.303,23174.296,23174.282,23174.272,23174.233,23174.236,23174.27,23174.25,23174.229,23174.216,23174.24,23174.225,23174.205,23174.214,23174.217,23174.19,23174.164,23174.161,23174.167,23174.157,23174.152,23174.165,23174.175,23174.146,23174.139,23174.15,23174.162,23174.175,23174.158,23174.216,23174.219,23174.207,23174.205,23174.202,23174.194,23174.186,23174.189,23174.185,23174.171,23174.172,23174.168,23174.153,23174.163,23174.147,23174.128,23174.117,23174.142,23174.161,23174.143,23174.105,23174.124,23174.096,23174.091,23174.079,23174.09,23174.074,23174.074,23174.033,23174.038,23174.026,23174.025,23174.012,23173.994,23174.008,23173.981,23173.979,23173.961,23173.978,23173.948,23173.938,23173.912,23173.932,23173.935,23173.951,23173.916,23173.913,23173.888,23173.887,23173.9,23173.881,23173.86,23173.89,23173.872,23173.864,23173.874,23173.885,23173.877,23173.875,23173.84,23173.842,23173.83,23173.817,23173.839,23173.831,23173.828,23173.849,23173.864,23173.863,23173.87,23173.889,23173.879,23173.896,23173.893,23173.898,23173.9,23173.909,23173.879,23173.878,23173.892,23173.869,23173.851,23173.867,23173.891,23173.861,23173.873,23173.869,23173.868,23173.872,23173.894,23173.877,23173.873,23173.87,23173.87,23173.875,23173.877,23173.864,23173.862,23173.85,23173.84,2317
3.845,23173.86,23173.882,23173.868,23173.861,23173.827,23173.83,23173.865,23173.84,23173.851,23173.842,23173.848,23173.847,23173.849,23173.836,23173.865,23173.864,23173.839,23173.833,23173.815,23173.809,23173.823,23173.822,23173.806,23173.809,23173.805,23173.782,23173.8,23173.804,23173.793,23173.793,23173.798,23173.813,23173.793,23173.777,23173.778,23173.744,23173.738,23173.753,23173.729,23173.765,23173.737,23173.744,23173.738,23173.748,23173.747,23173.763,23173.717,23173.761,23173.768,23173.774,23173.778,23173.761,23173.787,23173.781,23173.801,23173.818,23173.803,23173.792,23173.842,23173.818,23173.83,23173.807,23173.803,23173.816,23173.807,23173.818,23173.789,23173.767,23173.787,23173.768,23173.751,23173.768,23173.765,23173.763,23173.787,23173.767,23173.79,23173.81,23173.805,23173.809,23173.821,23173.838,23173.828,23173.826,23173.828,23173.834,23173.824,23173.834,23173.853,23173.893,23173.895,23173.883,23173.905,23173.9,23173.898,23173.919,23173.937,23173.929,23173.944,23173.959,23173.967,23173.96,23173.981,23173.954,23173.989,23173.983,23173.987,23174.01,23173.978,23173.988,23173.974,23173.962,23173.964,23173.946,23173.972,23173.966,23173.985,23173.979,23173.969,23173.959,23173.976,23174.003,23173.981,23173.983,23173.991,23174.016,23174.004,23173.976,23173.985,23173.999,23174.032,23174.037,23174.057,23174.05,23174.048,23174.087,23174.075,23174.081,23174.098,23174.135,23174.141,23174.129,23174.132,23174.141,23174.169,23174.14,23174.124,23174.144,23174.164,23174.156,23174.149,23174.159,23174.116,23174.137,23174.16,23174.167,23174.165,23174.17,23174.182,23174.174,23174.195,23174.164,23174.158,23174.186,23174.193,23174.175,23174.17,23174.171,23174.155,23174.174,23174.167,23174.177,23174.171,23174.159,23174.151,23174.166,23174.149,23174.152,23174.14,23174.139,23174.113,23174.11,23174.102,23174.113,23174.088,23174.097,23174.095,23174.095,23174.091,23174.097,23174.081,23174.078,23174.076,23174.06,23174.067,23174.084,23174.116,23174.098,23174.117,23174.124,23174.106,2317
4.107,23174.116,23174.119,23174.084,23174.093,23174.099,23174.102,23174.106,23174.111]}]}')

Get EX channel

Here we will get an EX channel; it will return a ChannelDataset object.

Change ex to hx to look at the magnetic channels

[5]:
ex = m.get_channel("CAS04", "a", "hx", survey="CONUS_South")
ChannelTS object

Convert the ex channel to a ChannelTS object, which is based on an xarray.DataArray and has methods for working with the data. More information can be found in Time Series Objects.

[6]:
exts = ex.to_channel_ts()
exts = exts.get_slice(start=exts.start, n_samples=4096)
Channel Response Filter

With the ChannelTS object comes the ChannelResponseFilter object. This includes all the filters that need to be applied to convert to physical units. Here we print out what the channel_response_filter attribute includes.

[7]:
exts.channel_response_filter
[7]:
Filters Included:
=========================
pole_zero_filter:
        calibration_date = 1980-01-01
        comments = NIMS magnetic field 3 pole Butterworth 0.5 low pass (analog)
        gain = 1.0
        name = magnetic_butterworth_low_pass
        normalization_factor = 2002.26936395594
        poles = [ -6.283185+10.882477j  -6.283185-10.882477j -12.566371 +0.j      ]
        type = zpk
        units_in = nT
        units_out = V
        zeros = []
--------------------
coefficient_filter:
        calibration_date = 1980-01-01
        comments = analog to digital conversion (magnetic)
        gain = 100.0
        name = magnetic_analog_to_digital
        type = coefficient
        units_in = V
        units_out = count
--------------------
time_delay_filter:
        calibration_date = 1980-01-01
        comments = time offset in seconds (digital)
        delay = -0.192
        gain = 1.0
        name = hx_time_offset
        type = time delay
        units_in = count
        units_out = count
--------------------
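
These three filters combine multiplicatively into the total channel response. As a rough sketch of evaluating the printed parameters (this is not the mt_metadata implementation, and the time-delay sign convention is an assumption):

```python
import numpy as np

def zpk_response(f, poles, zeros, normalization, gain=1.0):
    # Evaluate a pole-zero (zpk) filter at frequencies f in Hz.
    s = 2j * np.pi * np.asarray(f, dtype=float)
    num = np.ones_like(s)
    for z in zeros:
        num = num * (s - z)
    den = np.ones_like(s)
    for p in poles:
        den = den * (s - p)
    return gain * normalization * num / den

def combined_response(f):
    # Values copied from the filter printout above.
    poles = [-6.283185 + 10.882477j, -6.283185 - 10.882477j, -12.566371 + 0j]
    butterworth = zpk_response(f, poles, [], 2002.26936395594)  # nT -> V
    a2d_gain = 100.0                                            # V -> count
    # -0.192 s time delay as a linear phase term (sign convention assumed).
    delay = np.exp(2j * np.pi * np.asarray(f, dtype=float) * -0.192)
    return butterworth * a2d_gain * delay
```

At low frequency the magnitude is close to the 100 count/V gain, and it rolls off above the Butterworth corner, consistent with the plotted response.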
Plot the channel response

The gray areas are the pass bands of the filters and the vertical black line is the normalization frequency. These are estimated values.

[8]:
exts.channel_response_filter.plot_response(x_units="frequency")
_images/examples_notebooks_remove_instrument_response_example_15_0.png

Remove Instrument Response

Here we will attempt to remove the instrument response to calibrate the data into physical units. First, create a few helper functions:

  • zero_pad will pad an input array to a power of 2.
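
The helper itself is not shown here; a minimal sketch of what a zero_pad function might look like (the name and purpose come from the bullet above, the implementation is an assumption):

```python
import numpy as np

def zero_pad(x):
    """Pad a 1-D array with trailing zeros up to the next power of 2."""
    n = int(2 ** np.ceil(np.log2(x.size)))
    padded = np.zeros(n, dtype=x.dtype)
    padded[: x.size] = x
    return padded
```

Padding to a power of 2 keeps the FFT used during deconvolution fast.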

[20]:
calibrated_ex = exts.remove_instrument_response(plot=True)
[18]:
from matplotlib import pyplot as plt
import numpy as np
from scipy import signal
[29]:
# plot normalized version of original and calibrated
fig = plt.figure(2, figsize=[5,4])
fig.clf()
ax_original = fig.add_subplot(2, 1, 1)
original_ts = exts.ts.copy()
original_ts = original_ts - original_ts.mean()
original_ts = original_ts / original_ts.max()
frn_ts = np.array(frn_json["values"][0]["values"], dtype=float)
frn_ts = signal.detrend(frn_ts, type="linear")
frn_ts = frn_ts - frn_ts.mean()
#frn_ts = frn_ts / frn_ts.max()
l1, = ax_original.twinx().plot(exts._ts.time, original_ts, color=(0, 0, 0))
l2, = ax_original.plot(exts._ts.time, calibrated_ex.ts, color=(.75, .1, .1))
l3, = ax_original.plot(np.array(frn_json["times"], dtype=np.datetime64), frn_ts, color=(.2, .1, .75))
ax_original.set_ylabel("nT")
ax_original.set_xlim((exts._ts.time[0], exts._ts.time[-1]))


ax_original_fft = fig.add_subplot(2, 1, 2)
f = np.fft.rfftfreq(original_ts.size, d=1)
original_fft = abs(np.fft.rfft(original_ts))
calibrated = abs(np.fft.rfft(calibrated_ex.ts))
response = abs(exts.channel_response_filter.complex_response(f))[::-1]
response[-1] = response[-2]
frn_spectra = abs(np.fft.rfft(frn_ts))
f_frn = np.fft.rfftfreq(frn_ts.size, d=1)
l1, = ax_original_fft.loglog(f, original_fft / original_fft.max(), color=(0, 0, 0))
l2, = ax_original_fft.loglog(f, calibrated / calibrated.max(), color=(.75, .1, .2))
l3, = ax_original_fft.loglog(f, response / response.max(), color=(.1, .75, .2))
l4, = ax_original_fft.loglog(f_frn, frn_spectra / frn_spectra.max(), color=(.2, .1, .75))
ax_original_fft.set_ylim((1E-6, 10))
fig.legend(
    [l1, l2, l4, l3],
    ["Original", "Calibrated", "Observatory", "Response"],
    ncol=4,
    loc="upper center",
)

fig.canvas.toolbar_visible = True
fig.canvas.header_visible = True
fig.canvas.resizable = True

fig.tight_layout()

plt.show()
C:\Users\jpeacock\AppData\Local\Temp\11\ipykernel_9464\1879432517.py:14: DeprecationWarning: parsing timezone aware datetimes is deprecated; this will raise an error in the future
  l3, = ax_original.plot(np.array(frn_json["times"], dtype=np.datetime64), frn_ts, color=(.2, .1, .75))
[ ]:
#m.close_mth5()
[ ]:

File Readers

Basics

The file readers have been set up to behave like plugins, though at the moment this is a hand-rolled approach rather than true Python plugins. Further work is needed to implement them as proper plugins, but for now it works.

There is a generic reader, read_file, that is loaded when MTH5 is imported. It will pick the correct reader based on the file extension; if the extension is ambiguous, the user can specify the file type.

>>> import mth5
>>> run_obj = mth5.read_file(r"/home/mt_data/mt001.bin")
>>> run_obj = mth5.read_file(r"/home/mt_data/mt001.bin", file_type='nims')

Supported Data Types

The following file types are supported:

File Structure    MTH5 Key      File Types       Returns
==============    ==========    =============    =========
NIMS              nims          [.bin, .bnn]     RunTS
LEMI424           lemi          [.txt]           RunTS
USGS ASCII        usgs_ascii    [.asc, .gzip]    RunTS
Zonge Z3D         zen           [.z3d]           ChannelTS
Phoenix           phoenix       [.td_*, .bin]    ChannelTS

As you can see, NIMS, USGS ASCII, and LEMI424 return a mth5.timeseries.RunTS object, while Zonge Z3D and Phoenix return a mth5.timeseries.ChannelTS object. The return type depends on the structure of the file: long-period instruments usually record each channel as a single block of data, so all channels are in one file, whereas broadband instruments record each channel in its own file. It might make sense to return the same data type for all readers, but for now this is the way it is. Also returned is any extra metadata that does not belong to a channel or run; specifically, most files have information about location and other station metadata that can be helpful in filling out the station metadata.
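
The extension-based dispatch can be sketched as follows. The mapping paraphrases the table above and is illustrative only; the real read_file logic may differ. Note that .bin is shared by NIMS and Phoenix, which is why read_file also accepts an explicit file_type argument:

```python
from pathlib import Path

# Illustrative extension -> MTH5 key mapping (unambiguous extensions only).
READERS = {
    ".bnn": "nims",
    ".txt": "lemi",
    ".asc": "usgs_ascii",
    ".z3d": "zen",
}

def guess_file_type(fn):
    """Return the reader key for a file, or None if ambiguous or unknown."""
    return READERS.get(Path(fn).suffix.lower())
```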

Collection

To further ease data intake, collection classes have been developed for each data type. For example, LEMI424 files are typically one day long, but a single file does not necessarily define a single run; a run may span multiple days and therefore multiple files. The role of the LEMICollection object is to organize the files in a logical way and assign a run ID to each continuous block of data, which may span multiple files. See the next section for further examples.
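
The run-assignment idea can be sketched with a small, hypothetical helper (not the MTH5 API): files whose start time follows the previous file's end time within a tolerance share a run ID.

```python
from datetime import timedelta

def assign_runs(file_spans, tol=timedelta(seconds=61)):
    """Assign a run ID to each (start, end) span, sorted by start time.
    A file continues the previous run if it starts within `tol` of the
    previous file's end; otherwise a new run begins."""
    run_ids = []
    run_number = 0
    prev_end = None
    for start, end in file_spans:
        if prev_end is None or start - prev_end > tol:
            run_number += 1
        run_ids.append(f"sr1_{run_number:04d}")
        prev_end = end
    return run_ids
```

Two consecutive daily files merge into one run, while a gap of a day starts a new one.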

Examples of Reading in Data

Note

In these examples the data are stored locally, so the notebooks will not run if you don’t have the data. We are working on setting up a repository to store example data; if you have your own data you can easily swap in your directory path.

Make an MTH5 from LEMI data

This notebook provides an example of how to read in LEMI (.TXT) files into an MTH5.

[1]:
from mth5.mth5 import MTH5
from mth5.io.lemi import LEMICollection
from mth5 import read_file
2022-09-07 18:07:09,774 [line 135] mth5.setup_logger - INFO: Logging file can be found C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\logs\mth5_debug.log
LEMI Collection

We will use the LEMICollection to assemble the .txt files into a logical order by schedule action or run. The output LEMI files include all data for each channel.

IMPORTANT: LEMICollection assumes the given file path is for a single station.

Metadata: we need to input the station_id and the survey_id to provide minimal metadata when making an MTH5 file.

LEMICollection.get_runs() will return a two-level ordered dictionary (OrderedDict). The first level is keyed by station ID; each value is in turn an ordered dictionary keyed by run ID. Therefore you can loop over stations and runs.
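
A minimal sketch of walking that two-level dictionary, with dummy strings standing in for the run DataFrames:

```python
from collections import OrderedDict

# Dummy stand-in for LEMICollection.get_runs() output:
# station ID -> run ID -> run DataFrame (strings used here for brevity).
runs = OrderedDict(
    [("mt001", OrderedDict([("sr1_0001", "df1"), ("sr1_0002", "df2")]))]
)

# Loop over stations and runs.
pairs = [
    (station_id, run_id)
    for station_id, station_runs in runs.items()
    for run_id in station_runs
]
```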

Note: n_samples is an estimate based on file size, not the data. To get an accurate number you should read in the full file.
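
The file-size estimate amounts to simple arithmetic. Assuming each 1 Hz LEMI424 text record is a fixed-width 152-byte line (a value inferred from the file_size and n_samples columns in the tables below, not documented), the estimate is integer division:

```python
# Hypothetical estimate: assume each 1 Hz LEMI424 record is a
# fixed-width ASCII line of 152 bytes.
BYTES_PER_RECORD = 152

def estimate_n_samples(file_size):
    """Estimate the sample count of a LEMI424 text file from its size."""
    return file_size // BYTES_PER_RECORD
```

This reproduces the table values, e.g. a 66,272-byte file gives 436 samples and a full 13,132,800-byte day gives 86,400 samples.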

[2]:
zc = LEMICollection(r"c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0110")
zc.station_id = "mt001"
zc.survey_id = "test"
runs = zc.get_runs(sample_rates=[1])
print(f"Found {len(runs)} station with {len(runs[list(runs.keys())[0]])} runs")
Found 1 station with 5 runs
[3]:
for run_id, run_df in runs[zc.station_id].items():
    display(run_df)
survey station run start end channel_id component fn sample_rate file_size n_samples sequence_number instrument_id calibration_fn
0 test mt001 sr1_0001 2020-09-30 20:21:00+00:00 2020-09-30 20:28:15+00:00 1 temperature_e,temperature_h,e1,e2,bx,by,bz c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0... 1.0 66272 436 0 LEMI424 None
survey station run start end channel_id component fn sample_rate file_size n_samples sequence_number instrument_id calibration_fn
1 test mt001 sr1_0002 2020-09-30 20:29:00+00:00 2020-09-30 20:42:16+00:00 1 temperature_e,temperature_h,e1,e2,bx,by,bz c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0... 1.0 121144 797 0 LEMI424 None
survey station run start end channel_id component fn sample_rate file_size n_samples sequence_number instrument_id calibration_fn
2 test mt001 sr1_0003 2020-09-30 20:54:00+00:00 2020-09-30 21:11:01+00:00 1 temperature_e,temperature_h,e1,e2,bx,by,bz c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0... 1.0 155344 1022 0 LEMI424 None
survey station run start end channel_id component fn sample_rate file_size n_samples sequence_number instrument_id calibration_fn
3 test mt001 sr1_0004 2020-09-30 21:12:00+00:00 2020-09-30 21:13:45+00:00 1 temperature_e,temperature_h,e1,e2,bx,by,bz c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0... 1.0 16112 106 0 LEMI424 None
survey station run start end channel_id component fn sample_rate file_size n_samples sequence_number instrument_id calibration_fn
4 test mt001 sr1_0005 2020-09-30 21:14:00+00:00 2020-09-30 23:59:59+00:00 1 temperature_e,temperature_h,e1,e2,bx,by,bz c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0... 1.0 1513920 9960 0 LEMI424 None
5 test mt001 sr1_0005 2020-10-01 00:00:00+00:00 2020-10-01 23:59:59+00:00 1 temperature_e,temperature_h,e1,e2,bx,by,bz c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0... 1.0 13132800 86400 0 LEMI424 None
6 test mt001 sr1_0005 2020-10-02 00:00:00+00:00 2020-10-02 23:59:59+00:00 1 temperature_e,temperature_h,e1,e2,bx,by,bz c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0... 1.0 13132800 86400 0 LEMI424 None
7 test mt001 sr1_0005 2020-10-03 00:00:00+00:00 2020-10-03 23:59:59+00:00 1 temperature_e,temperature_h,e1,e2,bx,by,bz c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0... 1.0 13132800 86400 0 LEMI424 None
8 test mt001 sr1_0005 2020-10-04 00:00:00+00:00 2020-10-04 23:59:59+00:00 1 temperature_e,temperature_h,e1,e2,bx,by,bz c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0... 1.0 13132800 86400 0 LEMI424 None
9 test mt001 sr1_0005 2020-10-05 00:00:00+00:00 2020-10-05 23:59:59+00:00 1 temperature_e,temperature_h,e1,e2,bx,by,bz c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0... 1.0 13132801 86400 0 LEMI424 None
10 test mt001 sr1_0005 2020-10-06 00:00:00+00:00 2020-10-06 23:59:59+00:00 1 temperature_e,temperature_h,e1,e2,bx,by,bz c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0... 1.0 13132800 86400 0 LEMI424 None
11 test mt001 sr1_0005 2020-10-07 00:00:00+00:00 2020-10-07 14:19:46+00:00 1 temperature_e,temperature_h,e1,e2,bx,by,bz c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0... 1.0 7841224 51587 0 LEMI424 None
Build MTH5

Now that we have a logical collection of files, let’s load them into an MTH5. We will simply loop over the stations, runs, and channels in the ordered dictionary.

There are a few things to keep in mind:

  • The LEMI raw files come with very little metadata, so as a user you will have to manually input most of it.

  • The output files from a LEMI are already calibrated into units of nT and mV/km (I think), therefore there are no filters to apply to calibrate the data.

  • Since this is an MTH5 file version 0.2.0, the filters are in the survey_group, so add them there.

[4]:
m = MTH5()
m.open_mth5(zc.file_path.joinpath("from_lemi.h5"))
2022-09-07 18:07:12,663 [line 663] mth5.mth5.MTH5._initialize_file - INFO: Initialized MTH5 0.2.0 file c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0110\from_lemi.h5 in mode a
[5]:
survey_group = m.add_survey(zc.survey_id)
[6]:
%%time
for station_id in runs.keys():
    station_group = survey_group.stations_group.add_station(station_id)
    for run_id, run_df in runs[station_id].items():
        run_group = station_group.add_run(run_id)
        run_ts = read_file(run_df.fn.to_list())
        run_group.from_runts(run_ts)
    station_group.metadata.update(run_ts.station_metadata)
    station_group.write_metadata()
Wall time: 42 s
[7]:
%%time
station_group.validate_station_metadata()
station_group.write_metadata()

survey_group.update_survey_metadata()
survey_group.write_metadata()
Wall time: 27.3 s
MTH5 Structure

Have a look at the MTH5 structure and make sure it looks correct.

[8]:
m
[8]:
/:
====================
    |- Group: Experiment
    --------------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
            --> Dataset: summary
            ......................
        |- Group: Surveys
        -----------------
            |- Group: test
            --------------
                |- Group: Filters
                -----------------
                    |- Group: coefficient
                    ---------------------
                    |- Group: fap
                    -------------
                    |- Group: fir
                    -------------
                    |- Group: time_delay
                    --------------------
                    |- Group: zpk
                    -------------
                |- Group: Reports
                -----------------
                |- Group: Standards
                -------------------
                    --> Dataset: summary
                    ......................
                |- Group: Stations
                ------------------
                    |- Group: mt001
                    ---------------
                        |- Group: Transfer_Functions
                        ----------------------------
                        |- Group: sr1_0001
                        ------------------
                            --> Dataset: bx
                            .................
                            --> Dataset: by
                            .................
                            --> Dataset: bz
                            .................
                            --> Dataset: e1
                            .................
                            --> Dataset: e2
                            .................
                            --> Dataset: temperature_e
                            ............................
                            --> Dataset: temperature_h
                            ............................
                        |- Group: sr1_0002
                        ------------------
                            --> Dataset: bx
                            .................
                            --> Dataset: by
                            .................
                            --> Dataset: bz
                            .................
                            --> Dataset: e1
                            .................
                            --> Dataset: e2
                            .................
                            --> Dataset: temperature_e
                            ............................
                            --> Dataset: temperature_h
                            ............................
                        |- Group: sr1_0003
                        ------------------
                            --> Dataset: bx
                            .................
                            --> Dataset: by
                            .................
                            --> Dataset: bz
                            .................
                            --> Dataset: e1
                            .................
                            --> Dataset: e2
                            .................
                            --> Dataset: temperature_e
                            ............................
                            --> Dataset: temperature_h
                            ............................
                        |- Group: sr1_0004
                        ------------------
                            --> Dataset: bx
                            .................
                            --> Dataset: by
                            .................
                            --> Dataset: bz
                            .................
                            --> Dataset: e1
                            .................
                            --> Dataset: e2
                            .................
                            --> Dataset: temperature_e
                            ............................
                            --> Dataset: temperature_h
                            ............................
                        |- Group: sr1_0005
                        ------------------
                            --> Dataset: bx
                            .................
                            --> Dataset: by
                            .................
                            --> Dataset: bz
                            .................
                            --> Dataset: e1
                            .................
                            --> Dataset: e2
                            .................
                            --> Dataset: temperature_e
                            ............................
                            --> Dataset: temperature_h
                            ............................
        --> Dataset: channel_summary
        ..............................
        --> Dataset: tf_summary
        .........................
Channel Summary

Have a look at the channel summary and make sure everything looks good.

[9]:
m.channel_summary.summarize()
m.channel_summary.to_dataframe()
[9]:
survey station run latitude longitude elevation component start end n_samples sample_rate measurement_type azimuth tilt units hdf5_reference run_hdf5_reference station_hdf5_reference
0 test mt001 a 34.080655 -107.214079 2202.8 bx 2020-09-30 20:21:00+00:00 2020-09-30 20:28:16+00:00 436 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
1 test mt001 a 34.080655 -107.214079 2202.8 by 2020-09-30 20:21:00+00:00 2020-09-30 20:28:16+00:00 436 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
2 test mt001 a 34.080655 -107.214079 2202.8 bz 2020-09-30 20:21:00+00:00 2020-09-30 20:28:16+00:00 436 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
3 test mt001 a 34.080655 -107.214079 2202.8 e1 2020-09-30 20:21:00+00:00 2020-09-30 20:28:16+00:00 436 1.0 electric 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
4 test mt001 a 34.080655 -107.214079 2202.8 e2 2020-09-30 20:21:00+00:00 2020-09-30 20:28:16+00:00 436 1.0 electric 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
5 test mt001 a 34.080655 -107.214079 2202.8 temperature_e 2020-09-30 20:21:00+00:00 2020-09-30 20:28:16+00:00 436 1.0 auxiliary 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
6 test mt001 a 34.080655 -107.214079 2202.8 temperature_h 2020-09-30 20:21:00+00:00 2020-09-30 20:28:16+00:00 436 1.0 auxiliary 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
7 test mt001 a 34.080655 -107.214079 2202.8 bx 2020-09-30 20:29:00+00:00 2020-09-30 20:42:17+00:00 797 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
8 test mt001 a 34.080655 -107.214079 2202.8 by 2020-09-30 20:29:00+00:00 2020-09-30 20:42:17+00:00 797 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
9 test mt001 a 34.080655 -107.214079 2202.8 bz 2020-09-30 20:29:00+00:00 2020-09-30 20:42:17+00:00 797 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
10 test mt001 a 34.080655 -107.214079 2202.8 e1 2020-09-30 20:29:00+00:00 2020-09-30 20:42:17+00:00 797 1.0 electric 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
11 test mt001 a 34.080655 -107.214079 2202.8 e2 2020-09-30 20:29:00+00:00 2020-09-30 20:42:17+00:00 797 1.0 electric 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
12 test mt001 a 34.080655 -107.214079 2202.8 temperature_e 2020-09-30 20:29:00+00:00 2020-09-30 20:42:17+00:00 797 1.0 auxiliary 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
13 test mt001 a 34.080655 -107.214079 2202.8 temperature_h 2020-09-30 20:29:00+00:00 2020-09-30 20:42:17+00:00 797 1.0 auxiliary 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
14 test mt001 a 34.080655 -107.214079 2202.8 bx 2020-09-30 20:54:00+00:00 2020-09-30 21:11:02+00:00 1022 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
15 test mt001 a 34.080655 -107.214079 2202.8 by 2020-09-30 20:54:00+00:00 2020-09-30 21:11:02+00:00 1022 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
16 test mt001 a 34.080655 -107.214079 2202.8 bz 2020-09-30 20:54:00+00:00 2020-09-30 21:11:02+00:00 1022 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
17 test mt001 a 34.080655 -107.214079 2202.8 e1 2020-09-30 20:54:00+00:00 2020-09-30 21:11:02+00:00 1022 1.0 electric 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
18 test mt001 a 34.080655 -107.214079 2202.8 e2 2020-09-30 20:54:00+00:00 2020-09-30 21:11:02+00:00 1022 1.0 electric 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
19 test mt001 a 34.080655 -107.214079 2202.8 temperature_e 2020-09-30 20:54:00+00:00 2020-09-30 21:11:02+00:00 1022 1.0 auxiliary 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
20 test mt001 a 34.080655 -107.214079 2202.8 temperature_h 2020-09-30 20:54:00+00:00 2020-09-30 21:11:02+00:00 1022 1.0 auxiliary 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
21 test mt001 a 34.080655 -107.214079 2202.8 bx 2020-09-30 21:12:00+00:00 2020-09-30 21:13:46+00:00 106 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
22 test mt001 a 34.080655 -107.214079 2202.8 by 2020-09-30 21:12:00+00:00 2020-09-30 21:13:46+00:00 106 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
23 test mt001 a 34.080655 -107.214079 2202.8 bz 2020-09-30 21:12:00+00:00 2020-09-30 21:13:46+00:00 106 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
24 test mt001 a 34.080655 -107.214079 2202.8 e1 2020-09-30 21:12:00+00:00 2020-09-30 21:13:46+00:00 106 1.0 electric 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
25 test mt001 a 34.080655 -107.214079 2202.8 e2 2020-09-30 21:12:00+00:00 2020-09-30 21:13:46+00:00 106 1.0 electric 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
26 test mt001 a 34.080655 -107.214079 2202.8 temperature_e 2020-09-30 21:12:00+00:00 2020-09-30 21:13:46+00:00 106 1.0 auxiliary 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
27 test mt001 a 34.080655 -107.214079 2202.8 temperature_h 2020-09-30 21:12:00+00:00 2020-09-30 21:13:46+00:00 106 1.0 auxiliary 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
28 test mt001 a 34.080655 -107.214079 2202.8 bx 2020-09-30 21:14:00+00:00 2020-10-07 17:05:47+00:00 589907 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
29 test mt001 a 34.080655 -107.214079 2202.8 by 2020-09-30 21:14:00+00:00 2020-10-07 17:05:47+00:00 589907 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
30 test mt001 a 34.080655 -107.214079 2202.8 bz 2020-09-30 21:14:00+00:00 2020-10-07 17:05:47+00:00 589907 1.0 magnetic 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
31 test mt001 a 34.080655 -107.214079 2202.8 e1 2020-09-30 21:14:00+00:00 2020-10-07 17:05:47+00:00 589907 1.0 electric 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
32 test mt001 a 34.080655 -107.214079 2202.8 e2 2020-09-30 21:14:00+00:00 2020-10-07 17:05:47+00:00 589907 1.0 electric 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
33 test mt001 a 34.080655 -107.214079 2202.8 temperature_e 2020-09-30 21:14:00+00:00 2020-10-07 17:05:47+00:00 589907 1.0 auxiliary 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
34 test mt001 a 34.080655 -107.214079 2202.8 temperature_h 2020-09-30 21:14:00+00:00 2020-10-07 17:05:47+00:00 589907 1.0 auxiliary 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
Close the MTH5

This is important: you should close the file after you are done using it. Otherwise the file can be left locked or corrupted if you try to open it with another program or Python interpreter.

[10]:
m.close_mth5()
2022-09-07 18:08:23,271 [line 744] mth5.mth5.MTH5.close_mth5 - INFO: Flushing and closing c:\Users\jpeacock\OneDrive - DOI\mt\lemi\DATA0110\from_lemi.h5
Make an MTH5 from NIMS data

This notebook provides an example of how to read NIMS (.BIN) files into an MTH5. Each NIMS file represents a single run.

[1]:
from mth5.mth5 import MTH5
from mth5.io.nims import NIMSCollection
from mth5 import read_file
2022-09-07 13:02:52,583 [line 135] mth5.setup_logger - INFO: Logging file can be found C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\logs\mth5_debug.log
NIMS Collection

We will use the NIMSCollection to assemble the .BIN files into a logical order by run. Each NIMS file contains all the data for every channel of a single run, so the collection is relatively simple.

Metadata: we need to input the survey_id to provide minimal metadata when making an MTH5 file.

NIMSCollection.get_runs() will return a two-level ordered dictionary (OrderedDict). The first level is keyed by station ID; each station entry is in turn an ordered dictionary keyed by run ID. Therefore you can loop over stations and runs.

Note: n_samples and end are estimates based on the file size, not the data. To get accurate values you should read in the full file.
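Since the estimated end time is derived from the sample count and sample rate rather than read from the data, the arithmetic can be sketched as below (estimate_end is a hypothetical helper for illustration, not part of mth5):

```python
from datetime import datetime, timedelta, timezone

def estimate_end(start: datetime, n_samples: int, sample_rate: float) -> datetime:
    # Estimate the end time from the number of samples and the sample rate,
    # mirroring how the collection estimates `end` from the file size.
    return start + timedelta(seconds=n_samples / sample_rate)

# 800 samples at 8 samples/second -> 100 seconds of data
start = datetime(2019, 9, 26, 18, 29, 29, tzinfo=timezone.utc)
print(estimate_end(start, 800, 8.0))
```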

[2]:
nc = NIMSCollection(r"c:\Users\jpeacock\OneDrive - DOI\mt\nims")
nc.survey_id = "test"
runs = nc.get_runs(sample_rates=[8])
print(f"Found {len(runs)} station with {len(runs[list(runs.keys())[0]])} runs")
2022-09-07 13:02:53,062 [line 123] mth5.io.nims.header.NIMS.read_header - INFO: Reading NIMS file c:\Users\jpeacock\OneDrive - DOI\mt\nims\mnp300a.BIN
2022-09-07 13:02:53,081 [line 242] mth5.io.nims.header.NIMS.end_time - WARNING: Estimating end time from n_samples
2022-09-07 13:02:53,086 [line 123] mth5.io.nims.header.NIMS.read_header - INFO: Reading NIMS file c:\Users\jpeacock\OneDrive - DOI\mt\nims\mnp300b.BIN
2022-09-07 13:02:53,099 [line 242] mth5.io.nims.header.NIMS.end_time - WARNING: Estimating end time from n_samples
Found 1 station with 2 runs
[3]:
for run_id, run_df in runs["mnp300"].items():
    display(run_df)
survey station run start end channel_id component fn sample_rate file_size n_samples sequence_number instrument_id calibration_fn
0 test mnp300 mnp300a 2019-09-26 18:29:29+00:00 2019-10-01 15:03:23+00:00 1 hx,hy,hz,ex,ey,temperature c:\Users\jpeacock\OneDrive - DOI\mt\nims\mnp30... 8 54972155 3357078 1 NIMS None
survey station run start end channel_id component fn sample_rate file_size n_samples sequence_number instrument_id calibration_fn
1 test mnp300 mnp300b 2019-10-01 16:16:42+00:00 2019-10-03 22:55:52+00:00 1 hx,hy,hz,ex,ey,temperature c:\Users\jpeacock\OneDrive - DOI\mt\nims\mnp30... 8 25774314 1574003 2 NIMS None
Build MTH5

Now that we have a logical collection of files, let's load them into an MTH5. We will simply loop over the stations, runs, and channels in the ordered dictionary.

There are a few things to keep in mind:

  • The raw NIMS files come with very little metadata, so as a user you will have to manually input most of it.

  • The NIMS data are recorded in counts; here we set calibrate = True and call run_ts.calibrate() to apply the instrument filters and convert to physical units (nT and mV/km).

  • Since this is an MTH5 file version 0.2.0 the filters are in the survey_group, so add them there.

TODO:

- make sure filters get propagated through to mth5
- think about run names
[4]:
calibrate = True
m = MTH5()
if calibrate:
    m.data_level = 2
m.open_mth5(nc.file_path.joinpath("from_nims.h5"))

2022-09-07 13:02:53,674 [line 663] mth5.mth5.MTH5._initialize_file - INFO: Initialized MTH5 0.2.0 file c:\Users\jpeacock\OneDrive - DOI\mt\nims\from_nims.h5 in mode a
[5]:
survey_group = m.add_survey(nc.survey_id)
[6]:
%%time
for station_id in runs.keys():
    station_group = survey_group.stations_group.add_station(station_id)
    for run_id, run_df in runs[station_id].items():
        run_group = station_group.add_run(run_id)
        run_ts = read_file(run_df.fn.unique()[0])
        if calibrate:
            run_ts = run_ts.calibrate()
        run_group.from_runts(run_ts)
    station_group.metadata.update(run_ts.station_metadata)
    station_group.write_metadata()
2022-09-07 13:02:54,234 [line 123] mth5.io.nims.header.NIMS.read_header - INFO: Reading NIMS file c:\Users\jpeacock\OneDrive - DOI\mt\nims\mnp300a.BIN
2022-09-07 13:02:54,649 [line 927] mth5.io.nims.header.NIMS.read_nims - WARNING: odd number of bytes 54971209, not even blocks cutting down the data by 72 bits
2022-09-07 13:02:55,667 [line 1019] mth5.io.nims.header.NIMS.read_nims - INFO: Reading took 1.43 seconds
2022-09-07 13:02:56,835 [line 288] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: end time of dataset 2019-10-01T15:07:08+00:00 does not match metadata end 2019-10-01T15:07:07.875000+00:00 updating metatdata value to 2019-10-01T15:07:08+00:00
2022-09-07 13:02:57,909 [line 288] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: end time of dataset 2019-10-01T15:07:08+00:00 does not match metadata end 2019-10-01T15:07:07.875000+00:00 updating metatdata value to 2019-10-01T15:07:08+00:00
2022-09-07 13:03:07,154 [line 123] mth5.io.nims.header.NIMS.read_header - INFO: Reading NIMS file c:\Users\jpeacock\OneDrive - DOI\mt\nims\mnp300b.BIN
2022-09-07 13:03:07,769 [line 1019] mth5.io.nims.header.NIMS.read_nims - INFO: Reading took 0.61 seconds
2022-09-07 13:03:08,620 [line 288] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: end time of dataset 2019-10-03T23:01:04+00:00 does not match metadata end 2019-10-03T23:01:03.875000+00:00 updating metatdata value to 2019-10-03T23:01:04+00:00
2022-09-07 13:03:09,250 [line 288] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: end time of dataset 2019-10-03T23:01:04+00:00 does not match metadata end 2019-10-03T23:01:03.875000+00:00 updating metatdata value to 2019-10-03T23:01:04+00:00
Wall time: 26.9 s
[7]:
%%time
station_group.validate_station_metadata()
station_group.write_metadata()

survey_group.update_survey_metadata()
survey_group.write_metadata()
Wall time: 7.68 s
MTH5 Structure

Have a look at the MTH5 structure and make sure it looks correct.

[8]:
m
[8]:
/:
====================
    |- Group: Experiment
    --------------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
            --> Dataset: summary
            ......................
        |- Group: Surveys
        -----------------
            |- Group: test
            --------------
                |- Group: Filters
                -----------------
                    |- Group: coefficient
                    ---------------------
                        |- Group: dipole_101.00
                        -----------------------
                        |- Group: dipole_106.00
                        -----------------------
                        |- Group: dipole_109.00
                        -----------------------
                        |- Group: e_analog_to_digital
                        -----------------------------
                        |- Group: h_analog_to_digital
                        -----------------------------
                        |- Group: to_mt_units
                        ---------------------
                    |- Group: fap
                    -------------
                    |- Group: fir
                    -------------
                    |- Group: time_delay
                    --------------------
                        |- Group: ex_time_offset
                        ------------------------
                        |- Group: ey_time_offset
                        ------------------------
                        |- Group: hx_time_offset
                        ------------------------
                        |- Group: hy_time_offset
                        ------------------------
                        |- Group: hz_time_offset
                        ------------------------
                    |- Group: zpk
                    -------------
                        |- Group: nims_1_pole_butterworth
                        ---------------------------------
                            --> Dataset: poles
                            ....................
                            --> Dataset: zeros
                            ....................
                        |- Group: nims_3_pole_butterworth
                        ---------------------------------
                            --> Dataset: poles
                            ....................
                            --> Dataset: zeros
                            ....................
                        |- Group: nims_5_pole_butterworth
                        ---------------------------------
                            --> Dataset: poles
                            ....................
                            --> Dataset: zeros
                            ....................
                |- Group: Reports
                -----------------
                |- Group: Standards
                -------------------
                    --> Dataset: summary
                    ......................
                |- Group: Stations
                ------------------
                    |- Group: mnp300
                    ----------------
                        |- Group: Transfer_Functions
                        ----------------------------
                        |- Group: mnp300a
                        -----------------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                            --> Dataset: temperature
                            ..........................
                        |- Group: mnp300b
                        -----------------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                            --> Dataset: temperature
                            ..........................
        --> Dataset: channel_summary
        ..............................
        --> Dataset: tf_summary
        .........................
Channel Summary

Have a look at the channel summary and make sure everything looks good.

[9]:
m.channel_summary.summarize()
m.channel_summary.to_dataframe()
[9]:
survey station run latitude longitude elevation component start end n_samples sample_rate measurement_type azimuth tilt units hdf5_reference run_hdf5_reference station_hdf5_reference
0 test mnp300 mnp300a 34.726823 -115.735015 940.0 ex 2019-09-26 18:33:21+00:00 2019-10-01 15:07:08+00:00 3357016 8.0 electric 0.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
1 test mnp300 mnp300a 34.726823 -115.735015 940.0 ey 2019-09-26 18:33:21+00:00 2019-10-01 15:07:08+00:00 3357016 8.0 electric 90.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
2 test mnp300 mnp300a 34.726823 -115.735015 940.0 hx 2019-09-26 18:33:21+00:00 2019-10-01 15:07:08+00:00 3357016 8.0 magnetic 0.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
3 test mnp300 mnp300a 34.726823 -115.735015 940.0 hy 2019-09-26 18:33:21+00:00 2019-10-01 15:07:08+00:00 3357016 8.0 magnetic 90.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
4 test mnp300 mnp300a 34.726823 -115.735015 940.0 hz 2019-09-26 18:33:21+00:00 2019-10-01 15:07:08+00:00 3357016 8.0 magnetic 0.0 90.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
5 test mnp300 mnp300a 34.726823 -115.735015 940.0 temperature 2019-09-26 18:33:21+00:00 2019-10-01 15:07:08+00:00 3357016 8.0 auxiliary 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
6 test mnp300 mnp300b 34.726823 -115.735015 940.0 ex 2019-10-01 16:22:01+00:00 2019-10-03 23:01:04+00:00 1573944 8.0 electric 0.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
7 test mnp300 mnp300b 34.726823 -115.735015 940.0 ey 2019-10-01 16:22:01+00:00 2019-10-03 23:01:04+00:00 1573944 8.0 electric 90.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
8 test mnp300 mnp300b 34.726823 -115.735015 940.0 hx 2019-10-01 16:22:01+00:00 2019-10-03 23:01:04+00:00 1573944 8.0 magnetic 0.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
9 test mnp300 mnp300b 34.726823 -115.735015 940.0 hy 2019-10-01 16:22:01+00:00 2019-10-03 23:01:04+00:00 1573944 8.0 magnetic 90.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
10 test mnp300 mnp300b 34.726823 -115.735015 940.0 hz 2019-10-01 16:22:01+00:00 2019-10-03 23:01:04+00:00 1573944 8.0 magnetic 0.0 90.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
11 test mnp300 mnp300b 34.726823 -115.735015 940.0 temperature 2019-10-01 16:22:01+00:00 2019-10-03 23:01:04+00:00 1573944 8.0 auxiliary 0.0 0.0 none <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
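Because channel_summary.to_dataframe() returns an ordinary pandas DataFrame, the summary can be sliced with standard pandas operations. A minimal sketch, using a small stand-in frame with the same column names as the summary above:

```python
import pandas as pd

# Stand-in for a channel summary DataFrame (same column names, made-up subset)
df = pd.DataFrame(
    {
        "run": ["mnp300a", "mnp300a", "mnp300b", "mnp300b"],
        "component": ["ex", "hx", "ex", "hx"],
        "n_samples": [3357016, 3357016, 1573944, 1573944],
    }
)

# Find the run with the most samples, then keep only its magnetic channels
longest_run = df.loc[df.n_samples.idxmax(), "run"]
h_channels = df[(df.run == longest_run) & df.component.str.startswith("h")]
print(h_channels.component.tolist())
```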
Close the MTH5

This is important: you should close the file after you are done using it. Otherwise the file can be left locked or corrupted if you try to open it with another program or Python interpreter.

[10]:
m.close_mth5()
2022-09-07 13:03:29,078 [line 744] mth5.mth5.MTH5.close_mth5 - INFO: Flushing and closing c:\Users\jpeacock\OneDrive - DOI\mt\nims\from_nims.h5
[11]:
run_ts.plot()
_images/examples_notebooks_make_mth5_from_nims_16_0.png
[ ]:

Make an MTH5 from Phoenix Data

This example demonstrates how to read Phoenix data into an MTH5 file. The data come from the example data in PhoenixGeoPy; here I downloaded those data into a local folder on my computer by forking the main branch.

Imports
[1]:
from pathlib import Path

from mth5.mth5 import MTH5
from mth5 import read_file
from mth5.io.phoenix import ReceiverMetadataJSON, PhoenixCollection
2022-08-31 13:19:07,431 [line 135] mth5.setup_logger - INFO: Logging file can be found C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\logs\mth5_debug.log
Data Directory

Specify the station directory. Phoenix places each channel in a folder under the station directory, named by the channel number. There is also a recmeta.json file containing metadata output by the receiver that can be useful. In PhoenixGeoPy/sample_data there are two folders: one for native data (.bin files, which are the raw data in counts sampled at 24k), and one for segmented files, which are calibrated to millivolts and decimated or segmented according to the recording configuration. Most of the time you would use the segmented files.

[2]:
station_dir = Path(r"c:\Users\jpeacock\OneDrive - DOI\mt\phoenix_example_data\10291_2019-09-06-015630")
File Collection

We’ve developed a collection dataframe to help sort out which files are which and which files can be grouped together into runs. Continuous runs will be given a single run name and segmented data will have sequential run names. Both will follow the pattern sr{sample_rate}_#### for the run.id.
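The sr{sample_rate}_#### naming can be illustrated with a small hypothetical helper (not part of mth5); rates of 1000 samples per second and above appear abbreviated with a k suffix in the summaries below:

```python
def run_id(sample_rate: float, sequence: int) -> str:
    # Illustrative only: build a run name following the sr{sample_rate}_####
    # pattern, abbreviating kilohertz rates as in the summaries (24000 -> "24k").
    sr = f"{sample_rate / 1000:g}k" if sample_rate >= 1000 else f"{sample_rate:g}"
    return f"sr{sr}_{sequence:04d}"

print(run_id(150, 1))      # continuous 150 sps run
print(run_id(24000, 124))  # 124th 24k burst
```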

Here PhoenixCollection.get_runs returns a two-level ordered dictionary (OrderedDict). The first level is keyed by station ID; each station entry is in turn an ordered dictionary keyed by run ID. Therefore you can loop over stations and runs.

Receiver Metadata

The data logger or receiver outputs a JSON file (recmeta.json) containing useful metadata that is missing from the data files. This file can be read into an object with methods to translate to mt_metadata objects; PhoenixCollection reads it automatically and stores it in the receiver_metadata attribute.

[3]:
phx_collection = PhoenixCollection(file_path=station_dir)
run_dict = phx_collection.get_runs(sample_rates=[150, 24000])
Initiate MTH5

First initialize an MTH5 file; we can use the receiver metadata to fill in some survey metadata.

[4]:
m = MTH5()
m.open_mth5(station_dir.joinpath("mth5_from_phoenix.h5"), "w")
2022-08-31 13:19:32,755 [line 596] mth5.mth5.MTH5.open_mth5 - WARNING: mth5_from_phoenix.h5 will be overwritten in 'w' mode
2022-08-31 13:19:33,267 [line 663] mth5.mth5.MTH5._initialize_file - INFO: Initialized MTH5 0.2.0 file c:\Users\jpeacock\OneDrive - DOI\mt\phoenix_example_data\10291_2019-09-06-015630\mth5_from_phoenix.h5 in mode w
[5]:
survey_metadata = phx_collection.receiver_metadata.survey_metadata
survey_group = m.add_survey(survey_metadata.id)
Loop through Stations, Runs, and Channels

Using the run_dict output by PhoenixCollection.get_runs we can simply loop through the runs without knowing whether the data are continuous or discontinuous; read_file will take care of that.

Users should note the Phoenix file structure. Inside the folder are files with extensions of .td_24k and .td_150.

  • .td_24k files are usually bursts of a few seconds of data sampled at 24k samples per second to capture high-frequency information. The returned object is a mth5.timeseries.ChannelTS.

  • .td_150 files contain data continuously sampled at 150 samples per second. These files usually have a set length, commonly an hour. The returned object is a mth5.timeseries.ChannelTS.
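Before looping, it can help to see how many burst versus continuous files are present. A minimal sketch grouping paths by suffix (the file names below are made up for illustration):

```python
from pathlib import Path

def group_by_suffix(paths):
    # Bucket channel files by extension so burst (.td_24k) and
    # continuous (.td_150) data can be counted or handled separately.
    groups = {}
    for p in paths:
        groups.setdefault(p.suffix, []).append(p)
    return groups

files = [
    Path("10128_608783F4.td_24k"),  # hypothetical burst file
    Path("10128_60878421.td_150"),  # hypothetical continuous files
    Path("10128_608784EE.td_150"),
]
groups = group_by_suffix(files)
print({suffix: len(fns) for suffix, fns in groups.items()})
```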

[6]:
%%time

for station_id, station_dict in run_dict.items():
    station_metadata = phx_collection.receiver_metadata.station_metadata
    station_group = survey_group.stations_group.add_station(
        station_metadata.id,
        station_metadata=station_metadata
    )
    for run_id, run_df in station_dict.items():
        run_metadata = phx_collection.receiver_metadata.run_metadata
        run_metadata.id = run_id
        run_metadata.sample_rate = float(run_df.sample_rate.unique()[0])

        run_group = station_group.add_run(run_metadata.id, run_metadata=run_metadata)
        for row in run_df.itertuples():
            ch_ts = read_file(row.fn, **{"channel_map":phx_collection.receiver_metadata.channel_map})
            ch_metadata = phx_collection.receiver_metadata.get_ch_metadata(
                ch_ts.channel_metadata.channel_number
            )
            # need to update the time period and sample rate as estimated from the data not the metadata
            ch_metadata.sample_rate = ch_ts.sample_rate
            ch_metadata.time_period.update(ch_ts.channel_metadata.time_period)
            ch_ts.channel_metadata.update(ch_metadata)

            # add channel to the run group
            ch_dataset = run_group.from_channel_ts(ch_ts)
Wall time: 3min 57s
Update metadata before closing

Need to update the metadata to account for added stations, runs, and channels.

[7]:
%%time
station_group.validate_station_metadata()
station_group.write_metadata()

survey_group.update_survey_metadata()
survey_group.write_metadata()
Wall time: 9min 19s
[8]:
m.channel_summary.summarize()
m.channel_summary.to_dataframe()
[8]:
survey station run latitude longitude elevation component start end n_samples sample_rate measurement_type azimuth tilt units hdf5_reference run_hdf5_reference station_hdf5_reference
0 UofS REMOTE sr150_0001 50.979666 -106.794989 559.472154 e1 2019-09-05 18:56:30+00:00 2019-09-06 07:15:38+00:00 6652200 150.0 electric 0.0 0.0 millivolts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
1 UofS REMOTE sr150_0001 50.979666 -106.794989 559.472154 e2 2019-09-05 18:56:30+00:00 2019-09-06 07:15:38+00:00 6652200 150.0 electric 0.0 0.0 millivolts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
2 UofS REMOTE sr150_0001 50.979666 -106.794989 559.472154 h1 2019-09-05 18:56:30+00:00 2019-09-06 07:15:38+00:00 6652200 150.0 magnetic 0.0 0.0 millivolts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
3 UofS REMOTE sr150_0001 50.979666 -106.794989 559.472154 h2 2019-09-05 18:56:30+00:00 2019-09-06 07:15:38+00:00 6652200 150.0 magnetic 0.0 0.0 millivolts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
4 UofS REMOTE sr150_0001 50.979666 -106.794989 559.472154 h3 2019-09-05 18:56:30+00:00 2019-09-06 07:15:38+00:00 6652200 150.0 magnetic 0.0 0.0 millivolts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ... ...
995 UofS REMOTE sr24k_0124 50.979666 -106.794989 559.472154 h2 2019-09-06 14:14:31+00:00 2019-09-06 14:14:33+00:00 48000 24000.0 magnetic 0.0 0.0 millivolts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
996 UofS REMOTE sr24k_0124 50.979666 -106.794989 559.472154 h3 2019-09-06 14:14:31+00:00 2019-09-06 14:14:33+00:00 48000 24000.0 magnetic 0.0 0.0 millivolts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
997 UofS REMOTE sr24k_0124 50.979666 -106.794989 559.472154 h4 2019-09-06 14:14:31+00:00 2019-09-06 14:14:33+00:00 48000 24000.0 magnetic 0.0 0.0 millivolts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
998 UofS REMOTE sr24k_0124 50.979666 -106.794989 559.472154 h5 2019-09-06 14:14:31+00:00 2019-09-06 14:14:33+00:00 48000 24000.0 magnetic 0.0 0.0 millivolts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
999 UofS REMOTE sr24k_0124 50.979666 -106.794989 559.472154 h6 2019-09-06 14:14:31+00:00 2019-09-06 14:14:33+00:00 48000 24000.0 magnetic 0.0 0.0 millivolts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>

1000 rows × 18 columns

[9]:
m.close_mth5()
2022-08-31 13:32:58,389 [line 744] mth5.mth5.MTH5.close_mth5 - INFO: Flushing and closing c:\Users\jpeacock\OneDrive - DOI\mt\phoenix_example_data\10291_2019-09-06-015630\mth5_from_phoenix.h5
Make an MTH5 from ZEN data

This notebook provides an example of how to read ZEN (.Z3D) files into an MTH5.

[1]:
from mth5.mth5 import MTH5
from mth5.io.zen import Z3DCollection
from mth5 import read_file
2022-09-07 18:20:17,973 [line 135] mth5.setup_logger - INFO: Logging file can be found C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\logs\mth5_debug.log
Z3D Collection

We will use the Z3DCollection to assemble the .z3d files into a logical order by schedule action or run.

Note: n_samples is an estimate based on file size, not on the data itself. To get an accurate number you should read in the full file. The same applies to start and end: start is based on the schedule start time, which is usually 2 seconds earlier than the data start because of the instrument buffer while changing sampling rates, and end is computed from the file size and sample rate.
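The file-size estimate is simple arithmetic; a minimal sketch under assumed header and sample sizes (these values are hypothetical — real Z3D header and sample widths differ):

```python
# hypothetical sizes; real Z3D headers and sample widths differ
header_size = 4096      # bytes of header/metadata at the top of the file
bytes_per_sample = 4    # assuming 32-bit samples
file_size = 9641572     # bytes on disk

# estimated sample count from file size alone, no data read required
n_samples_estimate = (file_size - header_size) // bytes_per_sample
```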

Z3DCollection.get_runs() returns a two-level ordered dictionary (OrderedDict). The first level is keyed by station ID; each value is in turn an ordered dictionary keyed by run ID. Therefore you can loop over stations and runs.
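The nested loop over that structure can be sketched with a stand-in OrderedDict (the keys and DataFrames below are illustrative; the real ones come from get_runs()):

```python
from collections import OrderedDict

import pandas as pd

# hypothetical stand-in for Z3DCollection.get_runs() output:
# station ID -> run ID -> DataFrame of channel files
runs = OrderedDict(
    {
        "100": OrderedDict(
            {
                "sr4096_0001": pd.DataFrame({"component": ["ex", "ey", "hx"]}),
                "sr256_0002": pd.DataFrame({"component": ["ex", "ey"]}),
            }
        )
    }
)

# loop over stations, then over runs within each station
pairs = []
for station_id, station_runs in runs.items():
    for run_id, run_df in station_runs.items():
        pairs.append((station_id, run_id, len(run_df)))
```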

[2]:
zc = Z3DCollection(r"c:\Users\jpeacock\OneDrive - DOI\mt\example_z3d_data")
runs = zc.get_runs(sample_rates=[4096, 256])
print(f"Found {len(runs)} station with {len(runs[list(runs.keys())[0]])} runs")
Found 1 station with 2 runs
[3]:
runs["100"]["sr4096_0001"]
[3]:
survey station run start end channel_id component fn sample_rate file_size n_samples sequence_number instrument_id calibration_fn
5 100 sr4096_0001 2022-05-17 12:59:57+00:00 2022-05-17 13:09:53.349854+00:00 4 ex c:\Users\jpeacock\OneDrive - DOI\mt\example_z3... 4096.0 9641572 2442649 1 ZEN_024 None
6 100 sr4096_0001 2022-05-17 12:59:57+00:00 2022-05-17 13:09:53.351807+00:00 5 ey c:\Users\jpeacock\OneDrive - DOI\mt\example_z3... 4096.0 9641604 2442657 1 ZEN_024 None
7 100 sr4096_0001 2022-05-17 12:59:57+00:00 2022-05-17 13:09:53.348877+00:00 1 hx c:\Users\jpeacock\OneDrive - DOI\mt\example_z3... 4096.0 9644628 2442645 1 ZEN_024 None
8 100 sr4096_0001 2022-05-17 12:59:57+00:00 2022-05-17 13:09:53.351318+00:00 2 hy c:\Users\jpeacock\OneDrive - DOI\mt\example_z3... 4096.0 9644156 2442655 1 ZEN_024 None
9 100 sr4096_0001 2022-05-17 12:59:57+00:00 2022-05-17 13:09:53.351562+00:00 3 hz c:\Users\jpeacock\OneDrive - DOI\mt\example_z3... 4096.0 9644160 2442656 1 ZEN_024 None
Build MTH5

Now that we have a logical collection of files, let's load them into an MTH5. We will simply loop over the stations, runs, and channels in the ordered dictionary.

There are a few things that we need to keep track of.

  • The station metadata pulled directly from the Z3D files can be input into the station metadata; be sure to use the write_metadata method to write the metadata to the MTH5.

  • The Z3D files have the coil response and zen response embedded in the file, so we can put those into the appropriate filter container in MTH5. This is important for calibrating later.

  • Since this is an MTH5 version 0.2.0 file, the filters are in the survey_group, so add them there.

  • If you want to calibrate the data, set calibrate to True.

[4]:
calibrate = True
m = MTH5()
if calibrate:
    m.data_level = 2
m.open_mth5(zc.file_path.joinpath("from_z3d.h5"))
2022-09-07 18:20:18,860 [line 663] mth5.mth5.MTH5._initialize_file - INFO: Initialized MTH5 0.2.0 file c:\Users\jpeacock\OneDrive - DOI\mt\example_z3d_data\from_z3d.h5 in mode a
[5]:
survey_group = m.add_survey("test")
[6]:
%%time
for station_id in runs.keys():
    station_group = survey_group.stations_group.add_station(station_id)
    station_group.metadata.update(zc.station_metadata_dict[station_id])
    station_group.write_metadata()
    for run_id, run_df in runs[station_id].items():
        run_group = station_group.add_run(run_id)
        for row in run_df.itertuples():
            ch_ts = read_file(row.fn)
            # NOTE: this is where the calibration occurs
            if calibrate:
                ch_ts = ch_ts.remove_instrument_response()
            run_group.from_channel_ts(ch_ts)
2022-09-07 18:20:22,178 [line 221] mt_metadata.base.metadata.frequency_response_table_filter.complex_response - WARNING: Extrapolating, use values outside calibration frequencies with caution
2022-09-07 18:20:23,720 [line 221] mt_metadata.base.metadata.frequency_response_table_filter.complex_response - WARNING: Extrapolating, use values outside calibration frequencies with caution
2022-09-07 18:20:25,140 [line 221] mt_metadata.base.metadata.frequency_response_table_filter.complex_response - WARNING: Extrapolating, use values outside calibration frequencies with caution
2022-09-07 18:20:29,176 [line 221] mt_metadata.base.metadata.frequency_response_table_filter.complex_response - WARNING: Extrapolating, use values outside calibration frequencies with caution
2022-09-07 18:20:30,853 [line 221] mt_metadata.base.metadata.frequency_response_table_filter.complex_response - WARNING: Extrapolating, use values outside calibration frequencies with caution
2022-09-07 18:20:32,481 [line 221] mt_metadata.base.metadata.frequency_response_table_filter.complex_response - WARNING: Extrapolating, use values outside calibration frequencies with caution
Wall time: 14 s
[7]:
%%time
station_group.validate_station_metadata()
station_group.write_metadata()

survey_group.update_survey_metadata()
survey_group.write_metadata()
Wall time: 5.76 s
MTH5 Structure

Have a look at the MTH5 structure and make sure it looks correct.

[8]:
m
[8]:
/:
====================
    |- Group: Experiment
    --------------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
            --> Dataset: summary
            ......................
        |- Group: Surveys
        -----------------
            |- Group: test
            --------------
                |- Group: Filters
                -----------------
                    |- Group: coefficient
                    ---------------------
                        |- Group: dipole_55.00m
                        -----------------------
                        |- Group: dipole_56.00m
                        -----------------------
                        |- Group: zen_counts2mv
                        -----------------------
                    |- Group: fap
                    -------------
                        |- Group: ant4_2314_response
                        ----------------------------
                            --> Dataset: fap_table
                            ........................
                        |- Group: ant4_2324_response
                        ----------------------------
                            --> Dataset: fap_table
                            ........................
                        |- Group: ant4_2334_response
                        ----------------------------
                            --> Dataset: fap_table
                            ........................
                    |- Group: fir
                    -------------
                    |- Group: time_delay
                    --------------------
                    |- Group: zpk
                    -------------
                |- Group: Reports
                -----------------
                |- Group: Standards
                -------------------
                    --> Dataset: summary
                    ......................
                |- Group: Stations
                ------------------
                    |- Group: 100
                    -------------
                        |- Group: Transfer_Functions
                        ----------------------------
                        |- Group: sr256_0002
                        --------------------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                        |- Group: sr4096_0001
                        ---------------------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
        --> Dataset: channel_summary
        ..............................
        --> Dataset: tf_summary
        .........................
Channel Summary

Have a look at the channel summary and make sure everything looks good.

[9]:
m.channel_summary.summarize()
m.channel_summary.to_dataframe()
[9]:
survey station run latitude longitude elevation component start end n_samples sample_rate measurement_type azimuth tilt units hdf5_reference run_hdf5_reference station_hdf5_reference
0 test 100 sr256_0002 40.497576 -116.821188 1456.7 ex 2022-05-17 13:09:58+00:00 2022-05-17 15:54:42+00:00 2530304 256.0 electric 0.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
1 test 100 sr256_0002 40.497576 -116.821188 1456.7 ey 2022-05-17 13:09:58+00:00 2022-05-17 15:54:42+00:00 2530304 256.0 electric 0.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
2 test 100 sr256_0002 40.497576 -116.821188 1456.7 hx 2022-05-17 13:09:58+00:00 2022-05-17 15:54:42+00:00 2530304 256.0 magnetic 0.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
3 test 100 sr256_0002 40.497576 -116.821188 1456.7 hy 2022-05-17 13:09:58+00:00 2022-05-17 15:54:42+00:00 2530304 256.0 magnetic 90.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
4 test 100 sr256_0002 40.497576 -116.821188 1456.7 hz 2022-05-17 13:09:58+00:00 2022-05-17 15:54:42+00:00 2530304 256.0 magnetic 0.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
5 test 100 sr4096_0001 40.497576 -116.821188 1456.7 ex 2022-05-17 12:59:58+00:00 2022-05-17 13:09:41.997559+00:00 2392054 4096.0 electric 0.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
6 test 100 sr4096_0001 40.497576 -116.821188 1456.7 ey 2022-05-17 12:59:58+00:00 2022-05-17 13:09:41.999023+00:00 2392060 4096.0 electric 0.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
7 test 100 sr4096_0001 40.497576 -116.821188 1456.7 hx 2022-05-17 12:59:58+00:00 2022-05-17 13:09:41.996094+00:00 2392048 4096.0 magnetic 0.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
8 test 100 sr4096_0001 40.497576 -116.821188 1456.7 hy 2022-05-17 12:59:58+00:00 2022-05-17 13:09:41.997070+00:00 2392052 4096.0 magnetic 90.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
9 test 100 sr4096_0001 40.497576 -116.821188 1456.7 hz 2022-05-17 12:59:58+00:00 2022-05-17 13:09:41.998779+00:00 2392059 4096.0 magnetic 0.0 0.0 count <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
Close the MTH5

This is important: close the file after you are done using it. Otherwise bad things can happen if you try to open it with another program or Python interpreter.

[10]:
m.close_mth5()
2022-09-07 18:20:39,445 [line 744] mth5.mth5.MTH5.close_mth5 - INFO: Flushing and closing c:\Users\jpeacock\OneDrive - DOI\mt\example_z3d_data\from_z3d.h5

Adding Plugins

Everyone has their own file structure, so various readers are needed for the different data formats. If you have a data format that isn't supported, adding a reader would be a welcome contribution. To keep things somewhat uniform, here are some guidelines for adding a reader.

Reader Structure

The reader should be set up with a class that contains the metadata, which is inherited by a class that holds the data. This makes it a little easier to separate parsing from reading. It helps if the metadata attributes have names similar to the standards, but they do not have to; it just means you will have to do some translation.

It helps to expose important information that is passed on to mth5.timeseries.ChannelTS or mth5.timeseries.RunTS as properties, where plain attributes are not appropriate.

from mth5.timeseries import ChannelTS, RunTS

class MyFileMetadata:
    """Read metadata into appropriate objects"""
    def __init__(self, fn):
        self.fn = fn
        self.start = None
        self.end = None
        self.sample_rate = None

    def read_metadata(self):
        """Read in metadata and fill attribute values"""
        pass

class MyFile(MyFileMetadata):
    """Inherit metadata and read data"""
    def __init__(self, fn):
        super().__init__(fn)
        self.data = None

    @property
    def station_metadata(self):
        """Any station metadata within the file"""
        # self.latitude would be filled by read_metadata
        station_meta_dict = {}
        station_meta_dict["location.latitude"] = self.latitude

        return {"Station": station_meta_dict}

    @property
    def run_metadata(self):
        """Any run metadata within the file"""
        run_meta_dict = {}
        run_meta_dict["id"] = f"{self.station}a"

        return {"Run": run_meta_dict}

    @property
    def channel_metadata(self):
        """Channel metadata filled from information in the file"""
        channel_meta_dict = {}
        channel_meta_dict["time_period.start"] = self.start
        channel_meta_dict["time_period.end"] = self.end
        channel_meta_dict["sample_rate"] = self.sample_rate

        return {"Electric": channel_meta_dict}

    @property
    def ex(self):
        """ex convenience property"""
        # self.data can be a pandas dataframe or numpy structured array
        return ChannelTS(
            "electric",
            data=self.data["ex"],
            channel_metadata=self.channel_metadata,
            station_metadata=self.station_metadata,
            run_metadata=self.run_metadata,
        )

    def read_my_file(self):
        """Read in data"""
        # suggest reading into a data type like numpy, pandas, xarray;
        # xarray is the main object used for time series data in mth5
        return RunTS([self.ex, self.ey, self.hx, self.hy, self.hz])


def read_my_file(fn):
    """The helper function to read the file"""
    new_obj = MyFile(fn)
    return new_obj.read_my_file()

See also

mth5.io.zen and mth5.io.nims for working examples.

Once you have come up with a reader, you can add it to the reader module. You just need to add a file name and the associated file types.

In the ‘readers’ dictionary in mth5.reader, add a line like:

"my_file": {"file_types": ["dat", "data"], "reader": my_file.read_my_file},

Then you can see if your reader works:

>>> import mth5
>>> run = mth5.read_file(r"/home/mt_data/test.dat", file_type='my_file')
Collections Structure

If you add a reader, you should also add a collection class that can sort your given file structure into runs. This should inherit from mth5.io.collection.Collection.

import pandas as pd

from mth5.io import Collection
from mth5.io.my_file import MyFileReader

class MyCollection(Collection):

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.file_ext = "my_file_extension"

    def to_dataframe(self, sample_rates, run_name_zeros=4, calibration_path=None):
        """
        Create a :class:`pandas.DataFrame` from my_file_type files.  This should
        be specific enough to your file structure and generic enough to plug in.

        This method should only read the metadata from the files, and not open
        the entire file.

        :param sample_rates: sample rates to collect, defaults to [1]
        :type sample_rates: int or list, optional
        :param run_name_zeros: number of zeros to assign to the run name,
            defaults to 4
        :type run_name_zeros: int, optional
        :param calibration_path: path to calibration files, defaults to None
        :type calibration_path: string or Path, optional
        :return: Dataframe with information on each file in the given
            directory.
        :rtype: :class:`pandas.DataFrame`

        """

        entries = []
        for fn in self.get_files(self.file_ext):
            my_file_obj = MyFileReader(fn)
            my_file_obj.read_metadata()
            n_samples = int(my_file_obj.n_samples)

            entry = {}
            entry["survey"] = self.survey_metadata.id
            entry["station"] = self.station_metadata.id
            entry["run"] = None
            entry["start"] = my_file_obj.start.isoformat()
            entry["end"] = my_file_obj.end.isoformat()
            entry["channel_id"] = my_file_obj.channel_metadata.id
            entry["component"] = my_file_obj.channel_metadata.component
            entry["fn"] = fn
            entry["sample_rate"] = my_file_obj.sample_rate
            entry["file_size"] = my_file_obj.file_size
            entry["n_samples"] = n_samples
            entry["sequence_number"] = 0
            entry["instrument_id"] = "MyInstrument"
            entry["calibration_fn"] = None

            entries.append(entry)

        # make pandas dataframe and set data types
        df = self._sort_df(
            self._set_df_dtypes(pd.DataFrame(entries)), run_name_zeros
        )

        return df

    def assign_run_names(self, df, zeros=4):
        """
        Assign run names based on the file structure. Remember a run is
        defined as a continuously recorded block of data, so if your file
        structure splits up files during a run, make sure there is logic to
        assign the same run ID to files that belong to the same run.

        Below is an example of testing the start time of one file against
        the end time of the previous file: for a continuous recording the
        next file starts no more than one sample interval after the
        previous file ends.

        Run names are assigned as sr{sample_rate}_{run_number:0{zeros}}.

        :param df: Dataframe with the appropriate columns
        :type df: :class:`pandas.DataFrame`
        :param zeros: number of zeros in run name, defaults to 4
        :type zeros: int, optional
        :return: Dataframe with run names
        :rtype: :class:`pandas.DataFrame`

        """

        count = 1
        for row in df.itertuples():
            if row.Index == 0:
                df.loc[row.Index, "run"] = f"sr{int(row.sample_rate)}_{count:0{zeros}}"
                previous_end = row.end
            else:
                # start a new run if the gap is more than one sample interval
                if (row.start - previous_end).total_seconds() > 1.0 / row.sample_rate:
                    count += 1
                df.loc[row.Index, "run"] = f"sr{int(row.sample_rate)}_{count:0{zeros}}"
                previous_end = row.end

        return df
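The run-continuity test in assign_run_names boils down to timestamp arithmetic; a minimal sketch with hypothetical timestamps (for a continuous recording the next file starts one sample interval after the previous file ends):

```python
import pandas as pd

# hypothetical timestamps: a continuous recording resumes one sample
# interval after the previous file ends
sample_rate = 256.0
previous_end = pd.Timestamp("2022-05-17 13:09:41.996094+00:00")
next_start = previous_end + pd.Timedelta(seconds=1.0 / sample_rate)

# same run if the gap is no more than one sample interval
gap = (next_start - previous_end).total_seconds()
same_run = gap <= 1.0 / sample_rate
```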

Transfer Functions

Transfer functions can be stored in an MTH5 for a comprehensive representation of the data. They are stored at the Station level in a group called TransferFunctionsGroup. Within this group, each transfer function is stored as its own TransferFunctionGroup, which has a dataset for each statistical estimate provided with the transfer function. Currently supported estimates are:

Estimate                 | Description                                                     | Shape                              | Data Type
-------------------------|-----------------------------------------------------------------|------------------------------------|----------
transfer_function        | Full transfer function                                          | n_periods x n_inputs x n_outputs   | complex
transfer_function_error  | Full transfer function error estimation                         | n_periods x n_inputs x n_outputs   | real
impedance                | Only horizontal components, traditional impedance tensor        | n_periods x 2 x 2                  | complex
impedance_error          | Only horizontal components, traditional impedance tensor error  | n_periods x 2 x 2                  | real
tipper                   | Horizontal and vertical magnetic transfer function, Tipper      | n_periods x 1 x 2                  | complex
tipper_error             | Horizontal and vertical magnetic transfer function, Tipper error| n_periods x 1 x 2                  | real
inverse_signal_power     | Covariance of input channels (sources)                          | n_periods x n_inputs x n_inputs    | complex
residual_covariance      | Covariance of output channels (responses)                       | n_periods x n_outputs x n_outputs  | complex

Note: There are plans to add phase tensor and resistivity/phase estimations in the future.

This example demonstrates how transfer functions can be added to an MTH5 file.

TF object

The TF object comes from mt_metadata.transfer_functions.core.TF and is meant to be the common container for transfer functions. It has readers for:

  • EDI

  • EMTF XML

  • Z-files (EMTF output)

  • J-files (BIRRP output)

  • AVG files (Zonge output)

The TF object has two important metadata objects: survey_metadata and station_metadata. Metadata from the other file formats is translated into these containers and translated back when writing a file.

The statistical estimates are stored as xarray.Datasets with coordinates of period, input_channels, and output_channels; this way the transfer function can be generalized. impedance and tipper are stored within transfer_function, and TF provides convenience functions to access impedance, tipper, and their associated errors. Variances are stored as covariances for input channels (inverse_signal_power) and output channels (residual_covariance) when possible, and the transfer_function_error is stored as well.
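A minimal sketch of that layout as a labeled xarray array (the coordinate names below follow the description above; the actual mt_metadata coordinate names and dtypes may differ):

```python
import numpy as np
import xarray as xr

# hypothetical dimensions: 2 input channels (hx, hy), 3 outputs (ex, ey, hz)
periods = np.logspace(-2, 3, 6)
tf = xr.DataArray(
    np.zeros((periods.size, 2, 3), dtype=complex),
    dims=("period", "input", "output"),
    coords={
        "period": periods,
        "input": ["hx", "hy"],
        "output": ["ex", "ey", "hz"],
    },
    name="transfer_function",
)

# impedance is the horizontal-output sub-block of the full transfer function
impedance = tf.sel(input=["hx", "hy"], output=["ex", "ey"])
```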

Note: There are future plans to include phase tensor and resistivity/phase representation as well.

[1]:
from mth5.mth5 import MTH5

from mt_metadata import TF_XML, TF_EDI_SPECTRA, TF_ZMM, TF_EDI_CGG
from mt_metadata.transfer_functions.core import TF
2023-04-21 10:31:36,442 [line 141] mth5.setup_logger - INFO: Logging file can be found C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\logs\mth5_debug.log
[2]:
m = MTH5(file_version="0.2.0")
m.open_mth5(r"transfer_function_example.h5", "w")

2023-04-21 10:31:42,346 [line 672] mth5.mth5.MTH5._initialize_file - INFO: Initialized MTH5 0.2.0 file transfer_function_example.h5 in mode w

Read in Transfer functions

  • TF_XML: An example of EMTF XML format, the preferred format for archiving

  • TF_EDI_SPECTRA: An example of an EDI file stored as spectra

  • TF_EDI_CGG: An example of an output file from a contractor

  • TF_ZMM: An example of an output file from EMTF

[3]:
tf1 = TF(TF_XML)
tf1.read_tf_file()

tf2 = TF(TF_EDI_SPECTRA)
tf2.read_tf_file()

tf3 = TF(TF_EDI_CGG)
tf3.read_tf_file()

tf4 = TF(TF_ZMM)
tf4.read_tf_file()

Add TF_XML to the MTH5

When we add a transfer function to an MTH5, it looks for the survey.id and station.id; if it doesn't find them, they are created. If information is provided on which runs were processed and which channels were used, those are filled in as well.

Note: If you have multiple transfer functions for a given station, be sure to rename the file; for an EDI this is in the HEADER under the attribute DATAID. Name it something like station_sample_rate or station_runs.

[4]:
tf_group_01 = m.add_transfer_function(tf1)
tf_group_01
2023-04-21 10:31:49,566 [line 1054] mth5.mth5.MTH5.get_survey - WARNING: /Experiment/Surveys/CONUS_South does not exist, check survey_list for existing names.
[4]:
/Experiment/Surveys/CONUS_South/Stations/NMX20/Transfer_Functions/NMX20:
====================
    --> Dataset: inverse_signal_power
    ...................................
    --> Dataset: period
    .....................
    --> Dataset: residual_covariance
    ..................................
    --> Dataset: transfer_function
    ................................
    --> Dataset: transfer_function_error
    ......................................
[5]:
est = tf_group_01.get_estimate("transfer_function")
tf_group_01.has_estimate("covariance")
[5]:
True
Have a look at what was added to the MTH5

Note that an EMTF XML file has comprehensive metadata which can be used to populate the MTH5 as necessary, including runs and channels.

Add an example EDI

Here the survey is not specified, so we need to fill that information in ourselves; otherwise an error is raised (see below).

[6]:
#tf_group_02 = m.add_transfer_function(tf2)

Here we give the survey the id unknown_survey. Also note that because the data are stored as spectra in the EDI, we can calculate the inverse_signal_power and residual_covariance.

[6]:
tf2.survey_metadata.id = "unknown_survey"
tf_group_02 = m.add_transfer_function(tf2)
print(tf_group_02.has_estimate("covariance"))
2023-04-21 10:32:07,195 [line 1054] mth5.mth5.MTH5.get_survey - WARNING: /Experiment/Surveys/unknown_survey does not exist, check survey_list for existing names.
True

Add a typical EDI file

This file only has impedance and tipper and minimal metadata, which are converted into a full transfer function for storage.

[7]:
tf3.survey_metadata.id = "unknown_survey"
tf_group_03 = m.add_transfer_function(tf3)
tf_group_03.has_estimate("covariance")
2023-04-21 10:32:11,616 [line 222] mth5.groups.base.TransferFunction.get_estimate - ERROR: residual_covariance does not exist, check groups_list for existing names
[7]:
False

Add an output from EMTF

A ZMM file contains the full covariance and transfer functions but has minimal metadata.

[8]:
print(tf4)
Station: 300
--------------------------------------------------
        Survey:            None
        Project:           None
        Acquired by:       None
        Acquired date:     1980-01-01
        Latitude:          34.727
        Longitude:         -115.735
        Elevation:         0.000
        Declination:
                Value:     13.1
                Model:     WMM
        Coordinate System: geographic
        Impedance:         True
        Tipper:            True
        N Periods:     38
        Period Range:
                Min:   1.16364E+00 s
                Max:   1.09227E+04 s
        Frequency Range:
                Min:   9.15527E-05 Hz
                Max:   8.59372E-01 Hz
[9]:
tf4.survey_metadata.id = "unknown_survey"
tf_group_04 = m.add_transfer_function(tf4)
tf_group_04.has_estimate("impedance")
[9]:
True

Have a look at the MTH5 file

Everything has now been filled in in the MTH5, including metadata about runs and channels.

[10]:
m
[10]:
/:
====================
    |- Group: Experiment
    --------------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
            --> Dataset: summary
            ......................
        |- Group: Surveys
        -----------------
            |- Group: CONUS_South
            ---------------------
                |- Group: Filters
                -----------------
                    |- Group: coefficient
                    ---------------------
                    |- Group: fap
                    -------------
                    |- Group: fir
                    -------------
                    |- Group: time_delay
                    --------------------
                    |- Group: zpk
                    -------------
                |- Group: Reports
                -----------------
                |- Group: Standards
                -------------------
                    --> Dataset: summary
                    ......................
                |- Group: Stations
                ------------------
                    |- Group: NMX20
                    ---------------
                        |- Group: NMX20a
                        ----------------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                        |- Group: NMX20b
                        ----------------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                        |- Group: Transfer_Functions
                        ----------------------------
                            |- Group: NMX20
                            ---------------
                                --> Dataset: inverse_signal_power
                                ...................................
                                --> Dataset: period
                                .....................
                                --> Dataset: residual_covariance
                                ..................................
                                --> Dataset: transfer_function
                                ................................
                                --> Dataset: transfer_function_error
                                ......................................
            |- Group: unknown_survey
            ------------------------
                |- Group: Filters
                -----------------
                    |- Group: coefficient
                    ---------------------
                    |- Group: fap
                    -------------
                    |- Group: fir
                    -------------
                    |- Group: time_delay
                    --------------------
                    |- Group: zpk
                    -------------
                |- Group: Reports
                -----------------
                |- Group: Standards
                -------------------
                    --> Dataset: summary
                    ......................
                |- Group: Stations
                ------------------
                    |- Group: 300
                    -------------
                        |- Group: 300a
                        --------------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                        |- Group: Transfer_Functions
                        ----------------------------
                            |- Group: 300
                            -------------
                                --> Dataset: inverse_signal_power
                                ...................................
                                --> Dataset: period
                                .....................
                                --> Dataset: residual_covariance
                                ..................................
                                --> Dataset: transfer_function
                                ................................
                                --> Dataset: transfer_function_error
                                ......................................
                    |- Group: SAGE_2005
                    -------------------
                        |- Group: SAGE_2005a
                        --------------------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                        |- Group: Transfer_Functions
                        ----------------------------
                            |- Group: SAGE_2005
                            -------------------
                                --> Dataset: inverse_signal_power
                                ...................................
                                --> Dataset: period
                                .....................
                                --> Dataset: residual_covariance
                                ..................................
                                --> Dataset: transfer_function
                                ................................
                                --> Dataset: transfer_function_error
                                ......................................
            |- Group: unknown_survey_001
            ----------------------------
                |- Group: Filters
                -----------------
                    |- Group: coefficient
                    ---------------------
                    |- Group: fap
                    -------------
                    |- Group: fir
                    -------------
                    |- Group: time_delay
                    --------------------
                    |- Group: zpk
                    -------------
                |- Group: Reports
                -----------------
                |- Group: Standards
                -------------------
                    --> Dataset: summary
                    ......................
                |- Group: Stations
                ------------------
                    |- Group: TEST01
                    ----------------
                        |- Group: TEST01a
                        -----------------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                        |- Group: Transfer_Functions
                        ----------------------------
                            |- Group: TEST01
                            ----------------
                                --> Dataset: period
                                .....................
                                --> Dataset: transfer_function
                                ................................
                                --> Dataset: transfer_function_error
                                ......................................
        --> Dataset: channel_summary
        ..............................
        --> Dataset: tf_summary
        .........................

Get a transfer function object from MTH5

To retrieve a transfer function from the MTH5 file, the convenience method m.get_transfer_function is supplied. You only need to know the station.id, tf.id, and survey.id. Here the tf.id is the same as the station.id.

[11]:
tf1_h5 = m.get_transfer_function(tf1.station_metadata.id, tf1.tf_id, tf1.survey_metadata.id)
[12]:
print(tf1)
print(tf1_h5)
Station: NMX20
--------------------------------------------------
        Survey:            CONUS South
        Project:           USMTArray
        Acquired by:       National Geoelectromagnetic Facility
        Acquired date:     2020-09-20
        Latitude:          34.471
        Longitude:         -108.712
        Elevation:         1940.050
        Declination:
                Value:     9.09
                Model:     WMM
        Coordinate System: geographic
        Impedance:         True
        Tipper:            True
        N Periods:     33
        Period Range:
                Min:   4.65455E+00 s
                Max:   2.91271E+04 s
        Frequency Range:
                Min:   3.43323E-05 Hz
                Max:   2.14844E-01 Hz
Station: NMX20
--------------------------------------------------
        Survey:            CONUS South
        Project:           USMTArray
        Acquired by:       National Geoelectromagnetic Facility
        Acquired date:     2020-09-20
        Latitude:          34.471
        Longitude:         -108.712
        Elevation:         1940.050
        Declination:
                Value:     9.09
                Model:     WMM
        Coordinate System: geographic
        Impedance:         True
        Tipper:            True
        N Periods:     33
        Period Range:
                Min:   4.65455E+00 s
                Max:   2.91271E+04 s
        Frequency Range:
                Min:   3.43323E-05 Hz
                Max:   2.14844E-01 Hz

Summarize what transfer functions are in the file

The MTH5 file has a property called tf_summary, which provides an array that can be converted to a pandas.DataFrame summarizing the transfer functions in the file. The hdf5_reference column can be used to retrieve a transfer function directly. This table is updated when the file is closed, so the next time you open the file it should be up to date. Treat this table as read-only; if you want to change metadata, do it directly on the transfer function object.

[13]:
m.tf_summary.clear_table()
m.tf_summary.summarize()
tf_df = m.tf_summary.to_dataframe()
tf_df
[13]:
station survey latitude longitude elevation tf_id units has_impedance has_tipper has_covariance period_min period_max hdf5_reference station_hdf5_reference
0 NMX20 CONUS_South 34.470528 -108.712288 1940.05 NMX20 none True True True 4.654550 29127.110000 <HDF5 object reference> <HDF5 object reference>
1 300 unknown_survey 34.727000 -115.735000 0.00 300 none True True True 1.163640 10922.666990 <HDF5 object reference> <HDF5 object reference>
2 SAGE_2005 unknown_survey 35.550000 -106.283333 0.00 SAGE_2005 none True True True 0.004196 209.731544 <HDF5 object reference> <HDF5 object reference>
3 TEST01 unknown_survey_001 -30.930285 127.229230 175.27 TEST01 none True True False 0.001212 1211.527490 <HDF5 object reference> <HDF5 object reference>
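
Because the summary is a pandas.DataFrame, it can be filtered with ordinary pandas operations before pulling a transfer function by reference. A minimal sketch using a toy DataFrame with a few of the same columns (the values are illustrative only, not read from a file):

```python
import pandas as pd

# Toy stand-in for m.tf_summary.to_dataframe() (illustrative values only)
tf_df = pd.DataFrame(
    {
        "station": ["NMX20", "300", "SAGE_2005", "TEST01"],
        "has_impedance": [True, True, True, True],
        "has_covariance": [True, True, True, False],
        "period_min": [4.65455, 1.16364, 0.004196, 0.001212],
        "period_max": [29127.11, 10922.66699, 209.731544, 1211.52749],
    }
)

# Select transfer functions that include covariance estimates and
# extend to periods longer than 1000 s
long_period = tf_df[(tf_df.has_covariance) & (tf_df.period_max > 1000)]
print(long_period.station.tolist())  # ['NMX20', '300']
```

In the real file, each selected row's hdf5_reference entry could then be passed to m.from_reference as shown in the section below.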
Get TF from reference
[14]:
tf_object = m.from_reference(tf_df.iloc[0].hdf5_reference)

print(tf_object)
Station: NMX20
--------------------------------------------------
        Survey:            CONUS South
        Project:           USMTArray
        Acquired by:       National Geoelectromagnetic Facility
        Acquired date:     2020-09-20
        Latitude:          34.471
        Longitude:         -108.712
        Elevation:         1940.050
        Declination:
                Value:     9.09
                Model:     WMM
        Coordinate System: geographic
        Impedance:         True
        Tipper:            True
        N Periods:     33
        Period Range:
                Min:   4.65455E+00 s
                Max:   2.91271E+04 s
        Frequency Range:
                Min:   3.43323E-05 Hz
                Max:   2.14844E-01 Hz

MTpy

To analyze, plot, or prepare input files, have a look at MTpy. Note: MTpy version 2.0 will use MTH5 as its storage mechanism and TF objects to read/write files.

[15]:
m.close_mth5()
2023-04-21 10:32:58,441 [line 753] mth5.mth5.MTH5.close_mth5 - INFO: Flushing and closing transfer_function_example.h5
[ ]:

Examples

Make MTH5 from IRIS Data Managment Center v0.1.0

This example demonstrates how to build an MTH5 file from data archived at IRIS; it should also work with MT data stored at any FDSN data center (probably).

We will use the mth5.clients.FDSN class to build the file. There is also a second way, using the more generic mth5.clients.MakeMTH5 class, which is highlighted below.

Note: this example assumes that the data availability (network, station, channel, start, end) is already known. If you do not know what data you want to download, use IRIS tools to determine data availability.

[1]:
from pathlib import Path

import numpy as np
import pandas as pd
from mth5.mth5 import MTH5
from mth5.clients import FDSN

from matplotlib import pyplot as plt
%matplotlib widget
2023-03-23 14:32:18,703 [line 135] mth5.setup_logger - INFO: Logging file can be found C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\logs\mth5_debug.log
Set the path for saving files to the current working directory
[2]:
default_path = Path().cwd()
Initialize an FDSN object

Here we set the MTH5 file version to 0.1.0 so that the file can contain only one survey, and set the client to "IRIS". Under the hood, obspy.clients tools are used for the request; see the list of available FDSN clients.

Note: Only the "IRIS" client has been tested.

[3]:
fdsn_object = FDSN(mth5_version='0.1.0')
fdsn_object.client = "IRIS"
Make the data inquiry as a DataFrame

There are a few ways to make an inquiry to request data.

  1. Make a DataFrame by hand. Here we will make a list of entries and then create a DataFrame with the proper column names.

  2. Create a CSV file with a row for each entry. Be aware of the formatting requirements: the column names must match those below, and date-times must be in YYYY-MM-DDThh:mm:ss format.

Column Name   Description
-----------   -----------
network       FDSN Network code (2 letters)
station       FDSN Station code (usually 5 characters)
location      FDSN Location code (typically not used for MT)
channel       FDSN Channel code (3 characters)
start         Start time (YYYY-MM-DDThh:mm:ss) UTC
end           End time (YYYY-MM-DDThh:mm:ss) UTC
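
The date-time format can be checked before submitting a request. A small sketch using Python's standard datetime module (the helper name is made up for illustration):

```python
from datetime import datetime


def is_valid_request_time(value: str) -> bool:
    """Return True if value matches the YYYY-MM-DDThh:mm:ss format."""
    try:
        datetime.strptime(value, "%Y-%m-%dT%H:%M:%S")
        return True
    except ValueError:
        return False


print(is_valid_request_time("2020-06-02T19:00:00"))  # True: correct format
print(is_valid_request_time("06/02/2020 19:00"))     # False: wrong format
```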

[6]:
channels = ["LFE", "LFN", "LFZ", "LQE", "LQN"]
CAS04 = ["8P", "CAS04",  '2020-06-02T19:00:00', '2020-07-13T19:00:00']

request_list = []
for entry in [CAS04]:
    for channel in channels:
        request_list.append(
            [entry[0], entry[1], "", channel, entry[2], entry[3]]
        )

# Turn list into dataframe
request_df =  pd.DataFrame(request_list, columns=fdsn_object.request_columns)
request_df
[6]:
network station location channel start end
0 8P CAS04 LFE 2020-06-02T19:00:00 2020-07-13T19:00:00
1 8P CAS04 LFN 2020-06-02T19:00:00 2020-07-13T19:00:00
2 8P CAS04 LFZ 2020-06-02T19:00:00 2020-07-13T19:00:00
3 8P CAS04 LQE 2020-06-02T19:00:00 2020-07-13T19:00:00
4 8P CAS04 LQN 2020-06-02T19:00:00 2020-07-13T19:00:00
Save the request as a CSV

It's helpful to be able to save the request as a CSV so you can modify it and reuse it later. A CSV can be passed as a request to MakeMTH5.

[7]:
request_df.to_csv(default_path.joinpath("fdsn_request.csv"))
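
Note that to_csv as called above also writes the DataFrame index as an extra first column, so pass index_col=0 when reading the file back. A self-contained round-trip sketch (the temporary path is hypothetical, and keep_default_na=False keeps the empty location field as an empty string rather than NaN):

```python
import tempfile
from pathlib import Path

import pandas as pd

request_df = pd.DataFrame(
    [["8P", "CAS04", "", "LFE", "2020-06-02T19:00:00", "2020-07-13T19:00:00"]],
    columns=["network", "station", "location", "channel", "start", "end"],
)

# Write the request, then read it back for later reuse or editing
csv_path = Path(tempfile.mkdtemp()) / "fdsn_request.csv"
request_df.to_csv(csv_path)

# index_col=0 drops the index column that to_csv wrote;
# keep_default_na=False preserves empty location codes as ""
round_trip = pd.read_csv(csv_path, index_col=0, keep_default_na=False)
print(round_trip.equals(request_df))
```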
Get only the metadata from IRIS

It can be helpful to make sure that your request is what you expect. To do so, you can request only the metadata from IRIS. The request is quick and light, so you shouldn't need to worry about speed. This returns a StationXML file, which is loaded into an obspy.Inventory object.

[8]:
inventory, data = fdsn_object.get_inventory_from_df(request_df, data=False)

Have a look at the Inventory to make sure it contains what was requested.

[9]:
inventory
[9]:
Inventory created at 2023-03-23T21:32:59.642492Z
        Created by: ObsPy 1.3.0
                    https://www.obspy.org
        Sending institution: MTH5
        Contains:
                Networks (1):
                        8P
                Stations (1):
                        8P.CAS04 (Corral Hollow, CA, USA)
                Channels (5):
                        8P.CAS04..LFZ, 8P.CAS04..LFN, 8P.CAS04..LFE, 8P.CAS04..LQN,
                        8P.CAS04..LQE
Make an MTH5 from a request

Now that we've created a request and made sure it's what we expect, we can make an MTH5 file. The input can be either the DataFrame or the CSV file.

We are going to time it to get an indication of how long it might take. It should take about 4 minutes.

Note: we are setting interact=False. If you want to keep the file open so you can interrogate it, set interact=True.

Make an MTH5 using MakeMTH5

Another way to make a file is to use the mth5.clients.MakeMTH5 class, which is more generic than FDSN but doesn't have as many methods. The MakeMTH5 class is meant to be a convenience interface to the various clients.

from mth5.clients import MakeMTH5

make_mth5_object = MakeMTH5(mth5_version='0.1.0', interact=False)
mth5_filename = make_mth5_object.from_fdsn_client(request_df, client="IRIS")
[8]:
%%time

mth5_filename = fdsn_object.make_mth5_from_fdsn_client(request_df, interact=False)

print(f"Created {mth5_filename}")
2022-09-12 17:00:43,438 [line 605] mth5.mth5.MTH5.open_mth5 - WARNING: 8P_CAS04.h5 will be overwritten in 'w' mode
2022-09-12 17:00:43,981 [line 672] mth5.mth5.MTH5._initialize_file - INFO: Initialized MTH5 0.1.0 file C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\docs\examples\notebooks\8P_CAS04.h5 in mode w
2022-09-12 17:00:52,792 [line 120] mt_metadata.base.metadata.station.add_run - WARNING: Run a is being overwritten with current information
2022-09-12T17:00:53 [line 134] obspy_stages.create_filter_from_stage - INFO: Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter
2022-09-12T17:00:53 [line 134] obspy_stages.create_filter_from_stage - INFO: Converting PoleZerosResponseStage electric_dipole_92.000 to a CoefficientFilter
2022-09-12T17:00:53 [line 134] obspy_stages.create_filter_from_stage - INFO: Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter
2022-09-12T17:00:53 [line 134] obspy_stages.create_filter_from_stage - INFO: Converting PoleZerosResponseStage electric_dipole_92.000 to a CoefficientFilter
2022-09-12T17:00:53 [line 134] obspy_stages.create_filter_from_stage - INFO: Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter
2022-09-12T17:00:53 [line 134] obspy_stages.create_filter_from_stage - INFO: Converting PoleZerosResponseStage electric_dipole_92.000 to a CoefficientFilter
2022-09-12T17:00:53 [line 134] obspy_stages.create_filter_from_stage - INFO: Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter
2022-09-12T17:00:53 [line 134] obspy_stages.create_filter_from_stage - INFO: Converting PoleZerosResponseStage electric_dipole_92.000 to a CoefficientFilter
2022-09-12T17:00:53 [line 134] obspy_stages.create_filter_from_stage - INFO: Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter
2022-09-12T17:00:53 [line 134] obspy_stages.create_filter_from_stage - INFO: Converting PoleZerosResponseStage electric_dipole_92.000 to a CoefficientFilter
2022-09-12 17:00:56,417 [line 784] mth5.groups.base.Station.add_run - INFO: run a already exists, returning existing group.
2022-09-12 17:00:56,963 [line 272] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: start time of dataset 2020-06-02T19:00:00+00:00 does not match metadata start 2020-06-02T18:41:43+00:00 updating metatdata value to 2020-06-02T19:00:00+00:00
2022-09-12 17:00:56,963 [line 286] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: end time of dataset 2020-06-02T22:07:47+00:00 does not match metadata end 2020-06-02T22:07:46+00:00 updating metatdata value to 2020-06-02T22:07:47+00:00
2022-09-12 17:01:09,192 [line 784] mth5.groups.base.Station.add_run - INFO: run b already exists, returning existing group.
2022-09-12 17:01:09,740 [line 286] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: end time of dataset 2020-06-12T17:52:24+00:00 does not match metadata end 2020-06-12T17:52:23+00:00 updating metatdata value to 2020-06-12T17:52:24+00:00
2022-09-12 17:01:22,183 [line 784] mth5.groups.base.Station.add_run - INFO: run c already exists, returning existing group.
2022-09-12 17:01:22,937 [line 286] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: end time of dataset 2020-07-01T17:33:00+00:00 does not match metadata end 2020-07-01T17:32:59+00:00 updating metatdata value to 2020-07-01T17:33:00+00:00
2022-09-12 17:01:38,198 [line 784] mth5.groups.base.Station.add_run - INFO: run d already exists, returning existing group.
2022-09-12 17:01:38,778 [line 286] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: end time of dataset 2020-07-13T19:00:01+00:00 does not match metadata end 2020-07-13T21:46:12+00:00 updating metatdata value to 2020-07-13T19:00:01+00:00
2022-09-12 17:01:49,581 [line 753] mth5.mth5.MTH5.close_mth5 - INFO: Flushing and closing C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\docs\examples\notebooks\8P_CAS04.h5
Created C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\docs\examples\notebooks\8P_CAS04.h5
Wall time: 1min 6s
[9]:
# open file already created
mth5_object = MTH5()
mth5_object.open_mth5(mth5_filename)
Have a look at the contents of the created file
[10]:
mth5_object
[10]:
/:
====================
    |- Group: Survey
    ----------------
        |- Group: Filters
        -----------------
            |- Group: coefficient
            ---------------------
                |- Group: electric_analog_to_digital
                ------------------------------------
                |- Group: electric_dipole_92.000
                --------------------------------
                |- Group: electric_si_units
                ---------------------------
                |- Group: magnetic_analog_to_digital
                ------------------------------------
            |- Group: fap
            -------------
            |- Group: fir
            -------------
            |- Group: time_delay
            --------------------
                |- Group: electric_time_offset
                ------------------------------
                |- Group: hx_time_offset
                ------------------------
                |- Group: hy_time_offset
                ------------------------
                |- Group: hz_time_offset
                ------------------------
            |- Group: zpk
            -------------
                |- Group: electric_butterworth_high_pass
                ----------------------------------------
                    --> Dataset: poles
                    ....................
                    --> Dataset: zeros
                    ....................
                |- Group: electric_butterworth_low_pass
                ---------------------------------------
                    --> Dataset: poles
                    ....................
                    --> Dataset: zeros
                    ....................
                |- Group: magnetic_butterworth_low_pass
                ---------------------------------------
                    --> Dataset: poles
                    ....................
                    --> Dataset: zeros
                    ....................
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
            --> Dataset: summary
            ......................
        |- Group: Stations
        ------------------
            |- Group: CAS04
            ---------------
                |- Group: Transfer_Functions
                ----------------------------
                |- Group: a
                -----------
                    --> Dataset: ex
                    .................
                    --> Dataset: ey
                    .................
                    --> Dataset: hx
                    .................
                    --> Dataset: hy
                    .................
                    --> Dataset: hz
                    .................
                |- Group: b
                -----------
                    --> Dataset: ex
                    .................
                    --> Dataset: ey
                    .................
                    --> Dataset: hx
                    .................
                    --> Dataset: hy
                    .................
                    --> Dataset: hz
                    .................
                |- Group: c
                -----------
                    --> Dataset: ex
                    .................
                    --> Dataset: ey
                    .................
                    --> Dataset: hx
                    .................
                    --> Dataset: hy
                    .................
                    --> Dataset: hz
                    .................
                |- Group: d
                -----------
                    --> Dataset: ex
                    .................
                    --> Dataset: ey
                    .................
                    --> Dataset: hx
                    .................
                    --> Dataset: hy
                    .................
                    --> Dataset: hz
                    .................
        --> Dataset: channel_summary
        ..............................
        --> Dataset: tf_summary
        .........................
Channel Summary

A convenience table is supplied with an MTH5 file. This table provides information about each channel present in the file. It also provides the columns hdf5_reference, run_hdf5_reference, and station_hdf5_reference; these are internal references within the HDF5 file and can be used to directly access a group or dataset via the mth5_object.from_reference method.

Note: When an MTH5 file is closed the table is re-summarized, so the next time you open the file the channel_summary will be up to date. The same applies to the tf_summary.

[11]:
mth5_object.channel_summary.clear_table()
mth5_object.channel_summary.summarize()

ch_df = mth5_object.channel_summary.to_dataframe()
ch_df
[11]:
survey station run latitude longitude elevation component start end n_samples sample_rate measurement_type azimuth tilt units hdf5_reference run_hdf5_reference station_hdf5_reference
0 CONUS South CAS04 a 37.633351 -121.468382 329.3875 ex 2020-06-02 19:00:00+00:00 2020-06-02 22:07:47+00:00 11267 1.0 electric 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
1 CONUS South CAS04 a 37.633351 -121.468382 329.3875 ey 2020-06-02 19:00:00+00:00 2020-06-02 22:07:47+00:00 11267 1.0 electric 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
2 CONUS South CAS04 a 37.633351 -121.468382 329.3875 hx 2020-06-02 19:00:00+00:00 2020-06-02 22:07:47+00:00 11267 1.0 magnetic 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
3 CONUS South CAS04 a 37.633351 -121.468382 329.3875 hy 2020-06-02 19:00:00+00:00 2020-06-02 22:07:47+00:00 11267 1.0 magnetic 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
4 CONUS South CAS04 a 37.633351 -121.468382 329.3875 hz 2020-06-02 19:00:00+00:00 2020-06-02 22:07:47+00:00 11267 1.0 magnetic 0.0 90.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
5 CONUS South CAS04 b 37.633351 -121.468382 329.3875 ex 2020-06-02 22:24:55+00:00 2020-06-12 17:52:24+00:00 847649 1.0 electric 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
6 CONUS South CAS04 b 37.633351 -121.468382 329.3875 ey 2020-06-02 22:24:55+00:00 2020-06-12 17:52:24+00:00 847649 1.0 electric 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
7 CONUS South CAS04 b 37.633351 -121.468382 329.3875 hx 2020-06-02 22:24:55+00:00 2020-06-12 17:52:24+00:00 847649 1.0 magnetic 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
8 CONUS South CAS04 b 37.633351 -121.468382 329.3875 hy 2020-06-02 22:24:55+00:00 2020-06-12 17:52:24+00:00 847649 1.0 magnetic 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
9 CONUS South CAS04 b 37.633351 -121.468382 329.3875 hz 2020-06-02 22:24:55+00:00 2020-06-12 17:52:24+00:00 847649 1.0 magnetic 0.0 90.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
10 CONUS South CAS04 c 37.633351 -121.468382 329.3875 ex 2020-06-12 18:32:17+00:00 2020-07-01 17:33:00+00:00 1638043 1.0 electric 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
11 CONUS South CAS04 c 37.633351 -121.468382 329.3875 ey 2020-06-12 18:32:17+00:00 2020-07-01 17:33:00+00:00 1638043 1.0 electric 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
12 CONUS South CAS04 c 37.633351 -121.468382 329.3875 hx 2020-06-12 18:32:17+00:00 2020-07-01 17:33:00+00:00 1638043 1.0 magnetic 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
13 CONUS South CAS04 c 37.633351 -121.468382 329.3875 hy 2020-06-12 18:32:17+00:00 2020-07-01 17:33:00+00:00 1638043 1.0 magnetic 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
14 CONUS South CAS04 c 37.633351 -121.468382 329.3875 hz 2020-06-12 18:32:17+00:00 2020-07-01 17:33:00+00:00 1638043 1.0 magnetic 0.0 90.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
15 CONUS South CAS04 d 37.633351 -121.468382 329.3875 ex 2020-07-01 19:36:55+00:00 2020-07-13 19:00:01+00:00 1034586 1.0 electric 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
16 CONUS South CAS04 d 37.633351 -121.468382 329.3875 ey 2020-07-01 19:36:55+00:00 2020-07-13 19:00:01+00:00 1034586 1.0 electric 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
17 CONUS South CAS04 d 37.633351 -121.468382 329.3875 hx 2020-07-01 19:36:55+00:00 2020-07-13 19:00:01+00:00 1034586 1.0 magnetic 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
18 CONUS South CAS04 d 37.633351 -121.468382 329.3875 hy 2020-07-01 19:36:55+00:00 2020-07-13 19:00:01+00:00 1034586 1.0 magnetic 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
19 CONUS South CAS04 d 37.633351 -121.468382 329.3875 hz 2020-07-01 19:36:55+00:00 2020-07-13 19:00:01+00:00 1034586 1.0 magnetic 0.0 90.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
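
The channel summary lends itself to quick pandas computations; for example, run durations can be derived from the start and end columns. A sketch using a toy DataFrame standing in for ch_df (only a few rows and columns are reproduced):

```python
import pandas as pd

# Toy stand-in for mth5_object.channel_summary.to_dataframe()
ch_df = pd.DataFrame(
    {
        "run": ["a", "a", "b"],
        "start": pd.to_datetime(
            ["2020-06-02 19:00:00", "2020-06-02 19:00:00", "2020-06-02 22:24:55"],
            utc=True,
        ),
        "end": pd.to_datetime(
            ["2020-06-02 22:07:47", "2020-06-02 22:07:47", "2020-06-12 17:52:24"],
            utc=True,
        ),
        "n_samples": [11267, 11267, 847649],
        "sample_rate": [1.0, 1.0, 1.0],
    }
)

# Collapse to one row per run and compute each run's duration in seconds
durations = (
    ch_df.groupby("run")
    .first()
    .assign(duration_s=lambda df: (df.end - df.start).dt.total_seconds())
)
print(durations.duration_s.to_dict())  # {'a': 11267.0, 'b': 847649.0}
```

For 1 Hz data the duration in seconds closely tracks n_samples, which is a quick sanity check on the summary.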
Have a look at a station

Let's grab one station, CAS04, and have a look at its metadata and contents. Here we will grab it from the mth5_object.

[12]:
cas04 = mth5_object.get_station("CAS04")
cas04.metadata
[12]:
{
    "station": {
        "acquired_by.name": null,
        "channels_recorded": [],
        "data_type": "MT",
        "fdsn.id": "CAS04",
        "geographic_name": "Corral Hollow, CA, USA",
        "hdf5_reference": "<HDF5 object reference>",
        "id": "CAS04",
        "location.declination.comments": "igrf.m by Drew Compston",
        "location.declination.model": "IGRF-13",
        "location.declination.value": 13.1745887285666,
        "location.elevation": 329.3875,
        "location.latitude": 37.633351,
        "location.longitude": -121.468382,
        "mth5_type": "Station",
        "orientation.method": "compass",
        "orientation.reference_frame": "geographic",
        "provenance.creation_time": "1980-01-01T00:00:00+00:00",
        "provenance.software.author": "Anna Kelbert, USGS",
        "provenance.software.name": "mth5_metadata.m",
        "provenance.software.version": "2022-03-31",
        "provenance.submitter.email": null,
        "provenance.submitter.organization": null,
        "run_list": [
            "a",
            "b",
            "c",
            "d"
        ],
        "time_period.end": "2020-07-13T21:46:12+00:00",
        "time_period.start": "2020-06-02T18:41:43+00:00"
    }
}
Changing Metadata

If you want to change the metadata of any group, be sure to use the write_metadata method. Here’s an example:

[13]:
cas04.metadata.location.declination.value = -13.5
cas04.write_metadata()
print(cas04.metadata.location.declination)
declination:
        comments = igrf.m by Drew Compston
        model = IGRF-13
        value = -13.5
Have a look at a single channel

Let's pick out a channel and interrogate it. There are a couple of ways to do this:

  1. Get a channel from its hdf5_reference [demonstrated here]

  2. Get a channel from the mth5_object

  3. Get a station first, then get a channel

[14]:
ex = mth5_object.from_reference(ch_df.iloc[0].hdf5_reference).to_channel_ts()
print(ex)
Channel Summary:
        Station:      CAS04
        Run:          a
        Channel Type: electric
        Component:    ex
        Sample Rate:  1.0
        Start:        2020-06-02T19:00:00+00:00
        End:          2020-06-02T22:07:47+00:00
        N Samples:    11267
[15]:
ex.channel_metadata
[15]:
{
    "electric": {
        "channel_number": 0,
        "comments": "run_ids: [c,b,a]",
        "component": "ex",
        "data_quality.rating.value": 0,
        "dipole_length": 92.0,
        "filter.applied": [
            false
        ],
        "filter.name": [
            "electric_si_units",
            "electric_dipole_92.000",
            "electric_butterworth_low_pass",
            "electric_butterworth_high_pass",
            "electric_analog_to_digital",
            "electric_time_offset"
        ],
        "hdf5_reference": "<HDF5 object reference>",
        "measurement_azimuth": 13.2,
        "measurement_tilt": 0.0,
        "mth5_type": "Electric",
        "negative.elevation": 329.4,
        "negative.id": "200406D",
        "negative.latitude": 37.633351,
        "negative.longitude": -121.468382,
        "negative.manufacturer": "Oregon State University",
        "negative.model": "Pb-PbCl2 kaolin gel Petiau 2 chamber type",
        "negative.type": "electrode",
        "positive.elevation": 329.4,
        "positive.id": "200406B",
        "positive.latitude": 37.633351,
        "positive.longitude": -121.468382,
        "positive.manufacturer": "Oregon State University",
        "positive.model": "Pb-PbCl2 kaolin gel Petiau 2 chamber type",
        "positive.type": "electrode",
        "sample_rate": 1.0,
        "time_period.end": "2020-06-02T22:07:47+00:00",
        "time_period.start": "2020-06-02T19:00:00+00:00",
        "type": "electric",
        "units": "digital counts"
    }
}
Calibrate time series data

Most data loggers output data in digital counts. A series of filters representing the various instrument responses is then applied to get the data into physical units, after which the data can be analyzed and processed. Commonly this is done during the processing step, but it is important to be able to look at time series data in physical units. MTH5 provides a remove_instrument_response method in the ChannelTS object. Here’s an example:

[16]:
print(ex.channel_response_filter)
ex.channel_response_filter.plot_response(np.logspace(-4, 1, 50))
Filters Included:
=========================
coefficient_filter:
        calibration_date = 1980-01-01
        comments = practical to SI unit conversion
        gain = 1e-06
        name = electric_si_units
        type = coefficient
        units_in = mV/km
        units_out = V/m
--------------------
coefficient_filter:
        calibration_date = 1980-01-01
        comments = electric dipole for electric field
        gain = 92.0
        name = electric_dipole_92.000
        type = coefficient
        units_in = V/m
        units_out = V
--------------------
pole_zero_filter:
        calibration_date = 1980-01-01
        comments = NIMS electric field 5 pole Butterworth 0.5 low pass (analog)
        gain = 1.0
        name = electric_butterworth_low_pass
        normalization_factor = 313383.493219835
        poles = [ -3.883009+11.951875j  -3.883009-11.951875j -10.166194 +7.386513j
 -10.166194 -7.386513j -12.566371 +0.j      ]
        type = zpk
        units_in = V
        units_out = V
        zeros = []
--------------------
pole_zero_filter:
        calibration_date = 1980-01-01
        comments = NIMS electric field 1 pole Butterworth high pass (analog)
        gain = 1.0
        name = electric_butterworth_high_pass
        normalization_factor = 1.00000000378188
        poles = [-0.000167+0.j]
        type = zpk
        units_in = V
        units_out = V
        zeros = [0.+0.j]
--------------------
coefficient_filter:
        calibration_date = 1980-01-01
        comments = analog to digital conversion (electric)
        gain = 409600000.0
        name = electric_analog_to_digital
        type = coefficient
        units_in = V
        units_out = count
--------------------
time_delay_filter:
        calibration_date = 1980-01-01
        comments = time offset in seconds (digital)
        delay = -0.285
        gain = 1.0
        name = electric_time_offset
        type = time delay
        units_in = count
        units_out = count
--------------------

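For the coefficient (frequency-independent) stages in the listing above, removing the response amounts to dividing by the product of the gains. A minimal numpy sketch using the gains printed above, deliberately ignoring the frequency-dependent Butterworth stages and the time offset:

```python
import numpy as np

# gains from the coefficient filters above (units_in -> units_out)
si_units = 1e-06                 # mV/km -> V/m
dipole = 92.0                    # V/m -> V (dipole length in meters)
analog_to_digital = 409600000.0  # V -> count

total_gain = si_units * dipole * analog_to_digital  # mV/km -> count
counts = np.array([37683.2, 75366.4])

# invert the chain: digital counts back to electric field in mV/km
e_field = counts / total_gain
print(e_field)  # -> [1. 2.] mV/km
```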
[17]:
ex.remove_instrument_response(plot=True)
C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\mth5\timeseries\ts_filters.py:500: UserWarning: Attempted to set non-positive left xlim on a log-scaled axis.
Invalid limit will be ignored.
  ax2.set_xlim((f[0], f[-1]))
[17]:
Channel Summary:
        Station:      CAS04
        Run:          a
        Channel Type: electric
        Component:    ex
        Sample Rate:  1.0
        Start:        2020-06-02T19:00:00+00:00
        End:          2020-06-02T22:07:47+00:00
        N Samples:    11267
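The pole-zero stages can be checked the same way by evaluating the response directly from the printed poles and normalization factor, using H(s) = gain * normalization * prod(s - z) / prod(s - p) with s = 2*pi*j*f. A sketch for the electric_butterworth_low_pass filter above: the gain comes out ~1 at DC and ~0.707 at the corner (about 2 Hz, from the pole magnitudes):

```python
import numpy as np

# poles and normalization of electric_butterworth_low_pass (from the listing above)
poles = np.array([-3.883009 + 11.951875j, -3.883009 - 11.951875j,
                  -10.166194 + 7.386513j, -10.166194 - 7.386513j,
                  -12.566371 + 0j])
zeros = np.array([])
normalization = 313383.493219835

def zpk_response(frequency, poles, zeros, normalization, gain=1.0):
    """Evaluate H(s) = gain * norm * prod(s - z) / prod(s - p) at s = 2*pi*j*f."""
    s = 2j * np.pi * frequency
    numerator = np.prod(s - zeros) if zeros.size else 1.0
    return gain * normalization * numerator / np.prod(s - poles)

print(abs(zpk_response(0.0, poles, zeros, normalization)))  # ~1.0 at DC
print(abs(zpk_response(2.0, poles, zeros, normalization)))  # ~0.707 at the corner
```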
Have a look at a run

Let’s pick out a run, take a slice of it, and interrogate it. There are a few ways: 1. get a run from its run_hdf5_reference [demonstrated here]; 2. get a run from the mth5_object; 3. get a station first, then get a run from it.

[18]:
run_from_reference = mth5_object.from_reference(ch_df.iloc[0].run_hdf5_reference).to_runts(start=ch_df.iloc[0].start.isoformat(), n_samples=360)
print(run_from_reference)
RunTS Summary:
        Station:     CAS04
        Run:         a
        Start:       2020-06-02T19:00:00+00:00
        End:         2020-06-02T19:06:01+00:00
        Sample Rate: 1.0
        Components:  ['ex', 'ey', 'hx', 'hy', 'hz']
[19]:
run_from_reference.plot()
Calibrate Run
[20]:
calibrated_run = run_from_reference.calibrate()
calibrated_run.plot()
2022-09-12 17:02:02,653 [line 286] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: end time of dataset 2020-06-02T19:06:01+00:00 does not match metadata end 2020-06-02T22:07:47+00:00 updating metatdata value to 2020-06-02T19:06:01+00:00
Load Transfer Functions

You can download the transfer functions for CAS04 and NVR08 from IRIS SPUD EMTF. This has already been done in EMTF XML format, and the files will be loaded here.

[21]:
cas04_tf = r"USMTArray.CAS04.2020.xml"
[22]:
from mt_metadata.transfer_functions.core import TF
[23]:
for tf_fn in [cas04_tf]:
    tf_obj = TF(tf_fn)
    tf_obj.read_tf_file()
    mth5_object.add_transfer_function(tf_obj)
Have a look at the transfer function summary
[24]:
mth5_object.tf_summary.summarize()
tf_df = mth5_object.tf_summary.to_dataframe()
tf_df
[24]:
station survey latitude longitude elevation tf_id units has_impedance has_tipper has_covariance period_min period_max hdf5_reference station_hdf5_reference
0 CAS04 CONUS_South 37.633351 -121.468382 329.387 CAS04 none True True True 4.65455 29127.11 <HDF5 object reference> <HDF5 object reference>
Plot the transfer functions using MTpy

Note: This currently works on branch mtpy/v2_plots

[25]:
from mtpy import MTCollection
2022-09-12T17:02:07 [line 121] mtpy.utils.mtpy_decorator._check_gdal_data - INFO: GDAL_DATA is set to: C:\Users\jpeacock\Anaconda3\envs\em\Library\share\gdal
2022-09-12 17:02:07,362 [line 133] matplotlib.get_mtpy_logger - INFO: Logging file can be found C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mtpy\logs\matplotlib_warn.log
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
~\AppData\Local\Temp\11\ipykernel_6372\2507741362.py in <cell line: 1>()
----> 1 from mtpy import MTCollection

ImportError: cannot import name 'MTCollection' from 'mtpy' (C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mtpy\mtpy\__init__.py)
[ ]:
mc = MTCollection()
mc.open_collection(r"8P_CAS04_NVR08")
[ ]:
pmr = mc.plot_mt_response(["CAS04", "NVR08"], plot_style="1")
Plot Station locations

Here we can plot station locations for all stations in the file, or we can give it a bounding box. If you have internet access, a basemap will be plotted using Contextily.

[ ]:
st = mc.plot_stations(pad=.9, fig_num=5, fig_size=[6, 4])
[ ]:
st.fig.get_axes()[0].set_xlim((-121.9, -117.75))
st.fig.get_axes()[0].set_ylim((37.35, 38.5))
st.update_plot()
[ ]:
mth5_object.close_mth5()
[ ]:

Make MTH5 from IRIS Data Management Center v0.2.0

This example demonstrates how to build an MTH5 from data archived at IRIS; it should work with any MT data stored at an FDSN data center (probably).

We will use the mth5.clients.FDSN class to build the file. There is also a second way, using the more generic mth5.clients.MakeMTH5 class, which is highlighted below.

Note: this example assumes that the data availability (Network, Station, Channel, Start, End) is already known. If you do not know which data you want to download, use IRIS tools to determine data availability.

[1]:
from pathlib import Path

import numpy as np
import pandas as pd
from mth5.mth5 import MTH5
from mth5.clients.make_mth5 import FDSN

from matplotlib import pyplot as plt
%matplotlib widget
Set the path for saving files to the current working directory
[2]:
default_path = Path().cwd()
Initialize a MakeMTH5 object

Here we set the MTH5 file version to 0.2.0 so that we can have multiple surveys in a single file, and set the client to “IRIS”. Under the hood, obspy.clients tools are used for the request. Here are the available FDSN clients.

Note: Only the “IRIS” client has been tested.

[3]:
fdsn_object = FDSN(mth5_version='0.2.0')
fdsn_object.client = "IRIS"
Make the data inquiry as a DataFrame

There are a few ways to make the inquiry to request data.

  1. Make a DataFrame by hand. Here we will make a list of entries and then create a DataFrame with the proper column names.

  2. Create a CSV file with a row for each entry. There is some formatting to be aware of: the column names must match those below, and the date-times must be in YYYY-MM-DDThh:mm:ss format.

Column Name   Description
-----------   ----------------------------------------------
network       FDSN Network code (2 letters)
station       FDSN Station code (usually 5 characters)
location      FDSN Location code (typically not used for MT)
channel       FDSN Channel code (3 characters)
start         Start time (YYYY-MM-DDThh:mm:ss) UTC
end           End time (YYYY-MM-DDThh:mm:ss) UTC

[4]:
channels = ["LFE", "LFN", "LFZ", "LQE", "LQN"]
CAS04 = ["8P", "CAS04", "2020-06-02T19:00:00", "2020-07-13T19:00:00"]
NVR08 = ["8P", "NVR08", "2020-06-02T19:00:00", "2020-07-13T19:00:00"]

request_list = []
for entry in [CAS04, NVR08]:
    for channel in channels:
        request_list.append(
            [entry[0], entry[1], "", channel, entry[2], entry[3]]
        )

# Turn list into dataframe
request_df = pd.DataFrame(request_list, columns=fdsn_object.request_columns)
request_df
[4]:
network station location channel start end
0 8P CAS04 LFE 2020-06-02T19:00:00 2020-07-13T19:00:00
1 8P CAS04 LFN 2020-06-02T19:00:00 2020-07-13T19:00:00
2 8P CAS04 LFZ 2020-06-02T19:00:00 2020-07-13T19:00:00
3 8P CAS04 LQE 2020-06-02T19:00:00 2020-07-13T19:00:00
4 8P CAS04 LQN 2020-06-02T19:00:00 2020-07-13T19:00:00
5 8P NVR08 LFE 2020-06-02T19:00:00 2020-07-13T19:00:00
6 8P NVR08 LFN 2020-06-02T19:00:00 2020-07-13T19:00:00
7 8P NVR08 LFZ 2020-06-02T19:00:00 2020-07-13T19:00:00
8 8P NVR08 LQE 2020-06-02T19:00:00 2020-07-13T19:00:00
9 8P NVR08 LQN 2020-06-02T19:00:00 2020-07-13T19:00:00
Save the request as a CSV

It’s helpful to be able to save the request as a CSV so it can be modified and reused later. A CSV can be input as a request to MakeMTH5.

[5]:
request_df.to_csv(default_path.joinpath("fdsn_request.csv"))
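When writing the request, passing index=False keeps the CSV limited to the request columns, so the file can be read back unchanged later. A minimal round-trip sketch, using an in-memory buffer in place of a file on disk:

```python
import io

import pandas as pd

request_df = pd.DataFrame(
    [["8P", "CAS04", "", "LFE", "2020-06-02T19:00:00", "2020-07-13T19:00:00"]],
    columns=["network", "station", "location", "channel", "start", "end"],
)

buffer = io.StringIO()  # stands in for a file on disk
request_df.to_csv(buffer, index=False)  # index=False: write only the request columns

buffer.seek(0)
# keep_default_na=False preserves the empty MT location code as "" rather than NaN
round_trip = pd.read_csv(buffer, keep_default_na=False)
print(round_trip.equals(request_df))  # True
```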
Get only the metadata from IRIS

It can be helpful to make sure that your request is what you would expect. For that, you can request only the metadata from IRIS. The request is quick and light, so there is no need to worry about speed. This returns a StationXML file, which is loaded into an obspy.Inventory object.

[6]:
inventory, data = fdsn_object.get_inventory_from_df(request_df, data=False)

Have a look at the Inventory to make sure it contains what is requested.

[7]:
inventory
[7]:
Inventory created at 2023-07-22T21:45:37.964795Z
        Created by: ObsPy 1.4.0
                    https://www.obspy.org
        Sending institution: MTH5
        Contains:
                Networks (1):
                        8P
                Stations (2):
                        8P.CAS04 (Corral Hollow, CA, USA)
                        8P.NVR08 (Rhodes Salt Marsh, NV, USA)
                Channels (10):
                        8P.CAS04..LFZ, 8P.CAS04..LFN, 8P.CAS04..LFE, 8P.CAS04..LQN,
                        8P.CAS04..LQE, 8P.NVR08..LFZ, 8P.NVR08..LFN, 8P.NVR08..LFE,
                        8P.NVR08..LQN, 8P.NVR08..LQE
Make an MTH5 from a request

Now that we’ve created a request and made sure that it’s what we expect, we can make an MTH5 file. The input can be either the DataFrame or the CSV file.

We are going to time it just to get an indication of how long it might take. It should take about 4 minutes.

Note: we are setting interact=False. If you want to keep the file open to interrogate it, set interact=True.

Make an MTH5 using MakeMTH5

Another way to make a file is using the mth5.clients.MakeMTH5 class, which is more generic than FDSN but doesn’t have as many methods. The MakeMTH5 class is meant to be a convenience wrapper for the various clients.

from mth5.clients import MakeMTH5

make_mth5_object = MakeMTH5(mth5_version='0.2.0', interact=False)
mth5_filename = make_mth5_object.from_fdsn_client(request_df, client="IRIS")
[8]:
%%time

mth5_object = fdsn_object.make_mth5_from_fdsn_client(request_df, interact=False)

print(f"Created {mth5_object}")
2023-07-22T14:46:04 [line 677] mth5.mth5.MTH5._initialize_file - INFO: Initialized MTH5 0.2.0 file /home/kkappler/software/irismt/mth5/docs/examples/notebooks/8P_CAS04_NVR08.h5 in mode w
2023-07-22T14:47:23.232455-0700 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter.
2023-07-22T14:47:23.240907-0700 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | Converting PoleZerosResponseStage electric_dipole_92.000 to a CoefficientFilter.
2023-07-22T14:47:23.286284-0700 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter.
2023-07-22T14:47:23.295961-0700 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | Converting PoleZerosResponseStage electric_dipole_92.000 to a CoefficientFilter.
2023-07-22T14:47:23.447693-0700 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter.
2023-07-22T14:47:23.455765-0700 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | Converting PoleZerosResponseStage electric_dipole_94.000 to a CoefficientFilter.
2023-07-22T14:47:23.504153-0700 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | Converting PoleZerosResponseStage electric_si_units to a CoefficientFilter.
2023-07-22T14:47:23.512332-0700 | INFO | mt_metadata.timeseries.filters.obspy_stages | create_filter_from_stage | Converting PoleZerosResponseStage electric_dipole_94.000 to a CoefficientFilter.
2023-07-22T14:47:24 [line 311] mth5.groups.base.Station._add_group - INFO: RunGroup a already exists, returning existing group.
2023-07-22T14:47:24 [line 664] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: start time of dataset 2020-06-02T19:00:00+00:00 does not match metadata start 2020-06-02T18:41:43+00:00 updating metatdata value to 2020-06-02T19:00:00+00:00
2023-07-22T14:47:24 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id a
2023-07-22T14:47:24 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id a
2023-07-22T14:47:25 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id a
2023-07-22T14:47:25 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id a
2023-07-22T14:47:25 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id a
2023-07-22T14:47:25 [line 311] mth5.groups.base.Station._add_group - INFO: RunGroup b already exists, returning existing group.
2023-07-22T14:47:25 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id b
2023-07-22T14:47:26 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id b
2023-07-22T14:47:26 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id b
2023-07-22T14:47:26 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id b
2023-07-22T14:47:26 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id b
2023-07-22T14:47:26 [line 311] mth5.groups.base.Station._add_group - INFO: RunGroup c already exists, returning existing group.
2023-07-22T14:47:26 [line 936] mth5.timeseries.run_ts.RunTS.from_obspy_stream - WARNING: could not find ey
2023-07-22T14:47:27 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id c
2023-07-22T14:47:27 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id c
2023-07-22T14:47:27 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id c
2023-07-22T14:47:27 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id c
2023-07-22T14:47:27 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id c
2023-07-22T14:47:27 [line 311] mth5.groups.base.Station._add_group - INFO: RunGroup d already exists, returning existing group.
2023-07-22T14:47:27 [line 936] mth5.timeseries.run_ts.RunTS.from_obspy_stream - WARNING: could not find ex
2023-07-22T14:47:27 [line 936] mth5.timeseries.run_ts.RunTS.from_obspy_stream - WARNING: could not find ey
2023-07-22T14:47:28 [line 678] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: end time of dataset 2020-07-13T19:00:00+00:00 does not match metadata end 2020-07-13T21:46:12+00:00 updating metatdata value to 2020-07-13T19:00:00+00:00
2023-07-22T14:47:28 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id d
2023-07-22T14:47:28 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id d
2023-07-22T14:47:28 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id d
2023-07-22T14:47:28 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id d
2023-07-22T14:47:28 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id d
2023-07-22T14:47:28 [line 311] mth5.groups.base.Station._add_group - INFO: RunGroup a already exists, returning existing group.
2023-07-22T14:47:29 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id a
2023-07-22T14:47:29 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id a
2023-07-22T14:47:29 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id a
2023-07-22T14:47:29 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id a
2023-07-22T14:47:29 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id a
2023-07-22T14:47:29 [line 311] mth5.groups.base.Station._add_group - INFO: RunGroup b already exists, returning existing group.
2023-07-22T14:47:30 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id b
2023-07-22T14:47:30 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id b
2023-07-22T14:47:30 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id b
2023-07-22T14:47:30 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id b
2023-07-22T14:47:30 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id b
2023-07-22T14:47:31 [line 311] mth5.groups.base.Station._add_group - INFO: RunGroup c already exists, returning existing group.
2023-07-22T14:47:31 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id c
2023-07-22T14:47:31 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id c
2023-07-22T14:47:31 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id c
2023-07-22T14:47:31 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id c
2023-07-22T14:47:32 [line 623] mth5.groups.base.Run.from_runts - WARNING: Channel run.id sr1_001 !=  group run.id c
2023-07-22T14:47:32 [line 758] mth5.mth5.MTH5.close_mth5 - INFO: Flushing and closing /home/kkappler/software/irismt/mth5/docs/examples/notebooks/8P_CAS04_NVR08.h5
Created /home/kkappler/software/irismt/mth5/docs/examples/notebooks/8P_CAS04_NVR08.h5
CPU times: user 9.73 s, sys: 279 ms, total: 10 s
Wall time: 1min 28s
[9]:
# open file already created
mth5_object = MTH5()
mth5_object.open_mth5("8P_CAS04_NVR08.h5")
Have a look at the contents of the created file
[10]:
mth5_object
[10]:
/:
====================
    |- Group: Experiment
    --------------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
            --> Dataset: summary
            ......................
        |- Group: Surveys
        -----------------
            |- Group: CONUS_South
            ---------------------
                |- Group: Filters
                -----------------
                    |- Group: coefficient
                    ---------------------
                        |- Group: electric_analog_to_digital
                        ------------------------------------
                        |- Group: electric_dipole_92.000
                        --------------------------------
                        |- Group: electric_dipole_94.000
                        --------------------------------
                        |- Group: electric_si_units
                        ---------------------------
                        |- Group: magnetic_analog_to_digital
                        ------------------------------------
                    |- Group: fap
                    -------------
                    |- Group: fir
                    -------------
                    |- Group: time_delay
                    --------------------
                        |- Group: electric_time_offset
                        ------------------------------
                        |- Group: hx_time_offset
                        ------------------------
                        |- Group: hy_time_offset
                        ------------------------
                        |- Group: hz_time_offset
                        ------------------------
                    |- Group: zpk
                    -------------
                        |- Group: electric_butterworth_high_pass
                        ----------------------------------------
                            --> Dataset: poles
                            ....................
                            --> Dataset: zeros
                            ....................
                        |- Group: electric_butterworth_low_pass
                        ---------------------------------------
                            --> Dataset: poles
                            ....................
                            --> Dataset: zeros
                            ....................
                        |- Group: magnetic_butterworth_low_pass
                        ---------------------------------------
                            --> Dataset: poles
                            ....................
                            --> Dataset: zeros
                            ....................
                |- Group: Reports
                -----------------
                |- Group: Standards
                -------------------
                    --> Dataset: summary
                    ......................
                |- Group: Stations
                ------------------
                    |- Group: CAS04
                    ---------------
                        |- Group: Fourier_Coefficients
                        ------------------------------
                        |- Group: Transfer_Functions
                        ----------------------------
                        |- Group: a
                        -----------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                        |- Group: b
                        -----------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                        |- Group: c
                        -----------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                        |- Group: d
                        -----------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                    |- Group: NVR08
                    ---------------
                        |- Group: Fourier_Coefficients
                        ------------------------------
                        |- Group: Transfer_Functions
                        ----------------------------
                        |- Group: a
                        -----------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                        |- Group: b
                        -----------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
                        |- Group: c
                        -----------
                            --> Dataset: ex
                            .................
                            --> Dataset: ey
                            .................
                            --> Dataset: hx
                            .................
                            --> Dataset: hy
                            .................
                            --> Dataset: hz
                            .................
        --> Dataset: channel_summary
        ..............................
        --> Dataset: tf_summary
        .........................
Channel Summary

A convenience table is supplied with an MTH5 file. This table provides some information about each channel present in the file. It also provides the columns hdf5_reference, run_hdf5_reference, and station_hdf5_reference; these are internal references within an HDF5 file and can be used to directly access a group or dataset with the mth5_object.from_reference method.

Note: When an MTH5 file is closed, the table is re-summarized, so the next time you open the file the channel_summary will be up to date. The same applies to the tf_summary.

[11]:
mth5_object.channel_summary.clear_table()
mth5_object.channel_summary.summarize()

ch_df = mth5_object.channel_summary.to_dataframe()
ch_df
[11]:
survey station run latitude longitude elevation component start end n_samples sample_rate measurement_type azimuth tilt units hdf5_reference run_hdf5_reference station_hdf5_reference
0 CONUS South CAS04 a 37.633351 -121.468382 329.3875 ex 2020-06-02 19:00:00+00:00 2020-06-02 22:07:46+00:00 11267 1.0 electric 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
1 CONUS South CAS04 a 37.633351 -121.468382 329.3875 ey 2020-06-02 19:00:00+00:00 2020-06-02 22:07:46+00:00 11267 1.0 electric 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
2 CONUS South CAS04 a 37.633351 -121.468382 329.3875 hx 2020-06-02 19:00:00+00:00 2020-06-02 22:07:46+00:00 11267 1.0 magnetic 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
3 CONUS South CAS04 a 37.633351 -121.468382 329.3875 hy 2020-06-02 19:00:00+00:00 2020-06-02 22:07:46+00:00 11267 1.0 magnetic 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
4 CONUS South CAS04 a 37.633351 -121.468382 329.3875 hz 2020-06-02 19:00:00+00:00 2020-06-02 22:07:46+00:00 11267 1.0 magnetic 0.0 90.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
5 CONUS South CAS04 b 37.633351 -121.468382 329.3875 ex 2020-06-02 22:24:55+00:00 2020-06-12 17:52:23+00:00 847649 1.0 electric 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
6 CONUS South CAS04 b 37.633351 -121.468382 329.3875 ey 2020-06-02 22:24:55+00:00 2020-06-12 17:52:23+00:00 847649 1.0 electric 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
7 CONUS South CAS04 b 37.633351 -121.468382 329.3875 hx 2020-06-02 22:24:55+00:00 2020-06-12 17:52:23+00:00 847649 1.0 magnetic 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
8 CONUS South CAS04 b 37.633351 -121.468382 329.3875 hy 2020-06-02 22:24:55+00:00 2020-06-12 17:52:23+00:00 847649 1.0 magnetic 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
9 CONUS South CAS04 b 37.633351 -121.468382 329.3875 hz 2020-06-02 22:24:55+00:00 2020-06-12 17:52:23+00:00 847649 1.0 magnetic 0.0 90.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
10 CONUS South CAS04 c 37.633351 -121.468382 329.3875 ex 2020-06-12 18:32:17+00:00 2020-07-01 17:32:59+00:00 1638043 1.0 electric 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
11 CONUS South CAS04 c 37.633351 -121.468382 329.3875 ey 2020-06-12 18:32:17+00:00 2020-07-01 17:32:59+00:00 1638043 1.0 electric 0.0 0.0 counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
12 CONUS South CAS04 c 37.633351 -121.468382 329.3875 hx 2020-06-12 18:32:17+00:00 2020-07-01 17:32:59+00:00 1638043 1.0 magnetic 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
13 CONUS South CAS04 c 37.633351 -121.468382 329.3875 hy 2020-06-12 18:32:17+00:00 2020-07-01 17:32:59+00:00 1638043 1.0 magnetic 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
14 CONUS South CAS04 c 37.633351 -121.468382 329.3875 hz 2020-06-12 18:32:17+00:00 2020-07-01 17:32:59+00:00 1638043 1.0 magnetic 0.0 90.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
15 CONUS South CAS04 d 37.633351 -121.468382 329.3875 ex 2020-07-01 19:36:55+00:00 2020-07-13 19:00:00+00:00 1034586 1.0 electric 0.0 0.0 counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
16 CONUS South CAS04 d 37.633351 -121.468382 329.3875 ey 2020-07-01 19:36:55+00:00 2020-07-13 19:00:00+00:00 1034586 1.0 electric 0.0 0.0 counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
17 CONUS South CAS04 d 37.633351 -121.468382 329.3875 hx 2020-07-01 19:36:55+00:00 2020-07-13 19:00:00+00:00 1034586 1.0 magnetic 13.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
18 CONUS South CAS04 d 37.633351 -121.468382 329.3875 hy 2020-07-01 19:36:55+00:00 2020-07-13 19:00:00+00:00 1034586 1.0 magnetic 103.2 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
19 CONUS South CAS04 d 37.633351 -121.468382 329.3875 hz 2020-07-01 19:36:55+00:00 2020-07-13 19:00:00+00:00 1034586 1.0 magnetic 0.0 90.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
20 CONUS South NVR08 a 38.326630 -118.082382 1375.4250 ex 2020-06-03 19:10:11+00:00 2020-06-03 19:57:51+00:00 2861 1.0 electric 12.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
21 CONUS South NVR08 a 38.326630 -118.082382 1375.4250 ey 2020-06-03 19:10:11+00:00 2020-06-03 19:57:51+00:00 2861 1.0 electric 102.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
22 CONUS South NVR08 a 38.326630 -118.082382 1375.4250 hx 2020-06-03 19:10:11+00:00 2020-06-03 19:57:51+00:00 2861 1.0 magnetic 12.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
23 CONUS South NVR08 a 38.326630 -118.082382 1375.4250 hy 2020-06-03 19:10:11+00:00 2020-06-03 19:57:51+00:00 2861 1.0 magnetic 102.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
24 CONUS South NVR08 a 38.326630 -118.082382 1375.4250 hz 2020-06-03 19:10:11+00:00 2020-06-03 19:57:51+00:00 2861 1.0 magnetic 0.0 90.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
25 CONUS South NVR08 b 38.326630 -118.082382 1375.4250 ex 2020-06-03 20:14:13+00:00 2020-06-14 16:56:02+00:00 938510 1.0 electric 12.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
26 CONUS South NVR08 b 38.326630 -118.082382 1375.4250 ey 2020-06-03 20:14:13+00:00 2020-06-14 16:56:02+00:00 938510 1.0 electric 102.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
27 CONUS South NVR08 b 38.326630 -118.082382 1375.4250 hx 2020-06-03 20:14:13+00:00 2020-06-14 16:56:02+00:00 938510 1.0 magnetic 12.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
28 CONUS South NVR08 b 38.326630 -118.082382 1375.4250 hy 2020-06-03 20:14:13+00:00 2020-06-14 16:56:02+00:00 938510 1.0 magnetic 102.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
29 CONUS South NVR08 b 38.326630 -118.082382 1375.4250 hz 2020-06-03 20:14:13+00:00 2020-06-14 16:56:02+00:00 938510 1.0 magnetic 0.0 90.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
30 CONUS South NVR08 c 38.326630 -118.082382 1375.4250 ex 2020-06-14 18:00:44+00:00 2020-06-24 15:55:46+00:00 856503 1.0 electric 12.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
31 CONUS South NVR08 c 38.326630 -118.082382 1375.4250 ey 2020-06-14 18:00:44+00:00 2020-06-24 15:55:46+00:00 856503 1.0 electric 102.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
32 CONUS South NVR08 c 38.326630 -118.082382 1375.4250 hx 2020-06-14 18:00:44+00:00 2020-06-24 15:55:46+00:00 856503 1.0 magnetic 12.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
33 CONUS South NVR08 c 38.326630 -118.082382 1375.4250 hy 2020-06-14 18:00:44+00:00 2020-06-24 15:55:46+00:00 856503 1.0 magnetic 102.6 0.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
34 CONUS South NVR08 c 38.326630 -118.082382 1375.4250 hz 2020-06-14 18:00:44+00:00 2020-06-24 15:55:46+00:00 856503 1.0 magnetic 0.0 90.0 digital counts <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
Check Filters
[15]:
ch_df["n_filters"] = -1
for i_row, row in ch_df.iterrows():
    channel = mth5_object.get_channel(row.station, row.run, row.component, row.survey)
    n_filters = len(channel.channel_response_filter.filters_list)
    # use .loc rather than chained indexing so the assignment writes back to ch_df
    ch_df.loc[i_row, "n_filters"] = n_filters

Take a look at the dataframe below and check whether any rows have n_filters=0; that would be unexpected for data drawn from an FDSN archive.

[16]:
ch_df[["station", "run", "component", "start", "end", "n_filters"]]

[16]:
station run component start end n_filters
0 CAS04 a ex 2020-06-02 19:00:00+00:00 2020-06-02 22:07:46+00:00 6
1 CAS04 a ey 2020-06-02 19:00:00+00:00 2020-06-02 22:07:46+00:00 6
2 CAS04 a hx 2020-06-02 19:00:00+00:00 2020-06-02 22:07:46+00:00 3
3 CAS04 a hy 2020-06-02 19:00:00+00:00 2020-06-02 22:07:46+00:00 3
4 CAS04 a hz 2020-06-02 19:00:00+00:00 2020-06-02 22:07:46+00:00 3
5 CAS04 b ex 2020-06-02 22:24:55+00:00 2020-06-12 17:52:23+00:00 6
6 CAS04 b ey 2020-06-02 22:24:55+00:00 2020-06-12 17:52:23+00:00 6
7 CAS04 b hx 2020-06-02 22:24:55+00:00 2020-06-12 17:52:23+00:00 3
8 CAS04 b hy 2020-06-02 22:24:55+00:00 2020-06-12 17:52:23+00:00 3
9 CAS04 b hz 2020-06-02 22:24:55+00:00 2020-06-12 17:52:23+00:00 3
10 CAS04 c ex 2020-06-12 18:32:17+00:00 2020-07-01 17:32:59+00:00 6
11 CAS04 c ey 2020-06-12 18:32:17+00:00 2020-07-01 17:32:59+00:00 0
12 CAS04 c hx 2020-06-12 18:32:17+00:00 2020-07-01 17:32:59+00:00 3
13 CAS04 c hy 2020-06-12 18:32:17+00:00 2020-07-01 17:32:59+00:00 3
14 CAS04 c hz 2020-06-12 18:32:17+00:00 2020-07-01 17:32:59+00:00 3
15 CAS04 d ex 2020-07-01 19:36:55+00:00 2020-07-13 19:00:00+00:00 0
16 CAS04 d ey 2020-07-01 19:36:55+00:00 2020-07-13 19:00:00+00:00 0
17 CAS04 d hx 2020-07-01 19:36:55+00:00 2020-07-13 19:00:00+00:00 3
18 CAS04 d hy 2020-07-01 19:36:55+00:00 2020-07-13 19:00:00+00:00 3
19 CAS04 d hz 2020-07-01 19:36:55+00:00 2020-07-13 19:00:00+00:00 3
20 NVR08 a ex 2020-06-03 19:10:11+00:00 2020-06-03 19:57:51+00:00 6
21 NVR08 a ey 2020-06-03 19:10:11+00:00 2020-06-03 19:57:51+00:00 6
22 NVR08 a hx 2020-06-03 19:10:11+00:00 2020-06-03 19:57:51+00:00 3
23 NVR08 a hy 2020-06-03 19:10:11+00:00 2020-06-03 19:57:51+00:00 3
24 NVR08 a hz 2020-06-03 19:10:11+00:00 2020-06-03 19:57:51+00:00 3
25 NVR08 b ex 2020-06-03 20:14:13+00:00 2020-06-14 16:56:02+00:00 6
26 NVR08 b ey 2020-06-03 20:14:13+00:00 2020-06-14 16:56:02+00:00 6
27 NVR08 b hx 2020-06-03 20:14:13+00:00 2020-06-14 16:56:02+00:00 3
28 NVR08 b hy 2020-06-03 20:14:13+00:00 2020-06-14 16:56:02+00:00 3
29 NVR08 b hz 2020-06-03 20:14:13+00:00 2020-06-14 16:56:02+00:00 3
30 NVR08 c ex 2020-06-14 18:00:44+00:00 2020-06-24 15:55:46+00:00 6
31 NVR08 c ey 2020-06-14 18:00:44+00:00 2020-06-24 15:55:46+00:00 6
32 NVR08 c hx 2020-06-14 18:00:44+00:00 2020-06-24 15:55:46+00:00 3
33 NVR08 c hy 2020-06-14 18:00:44+00:00 2020-06-24 15:55:46+00:00 3
34 NVR08 c hz 2020-06-14 18:00:44+00:00 2020-06-24 15:55:46+00:00 3
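A boolean mask makes the check above mechanical. The sketch below uses a hypothetical miniature dataframe (demo_df) in place of the full channel summary, mimicking the zero-filter rows seen for CAS04 runs c and d:

```python
import pandas as pd

# Hypothetical miniature channel summary; in the notebook the real frame
# comes from mth5_object.channel_summary with the n_filters column added.
demo_df = pd.DataFrame(
    {
        "station": ["CAS04", "CAS04", "CAS04"],
        "run": ["c", "d", "d"],
        "component": ["ey", "ex", "hx"],
        "n_filters": [0, 0, 3],
    }
)

# Channels with no filters attached cannot be calibrated to physical units.
missing = demo_df[demo_df.n_filters == 0]
print(missing[["station", "run", "component"]])
```

Any rows returned here are candidates for re-requesting metadata from the archive.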
Have a look at a station

Let's grab one station, CAS04, and have a look at its metadata and contents. Here we will grab it from the mth5_object.

[17]:
cas04 = mth5_object.get_station("CAS04", survey="CONUS_South")
cas04.metadata
[17]:
{
    "station": {
        "acquired_by.name": null,
        "channels_recorded": [
            "ex",
            "ey",
            "hx",
            "hy",
            "hz"
        ],
        "data_type": "MT",
        "fdsn.id": "CAS04",
        "geographic_name": "Corral Hollow, CA, USA",
        "hdf5_reference": "<HDF5 object reference>",
        "id": "CAS04",
        "location.declination.comments": "igrf.m by Drew Compston",
        "location.declination.model": "IGRF-13",
        "location.declination.value": 13.1745887285666,
        "location.elevation": 329.3875,
        "location.latitude": 37.633351,
        "location.longitude": -121.468382,
        "mth5_type": "Station",
        "orientation.method": "compass",
        "orientation.reference_frame": "geographic",
        "provenance.creation_time": "1980-01-01T00:00:00+00:00",
        "provenance.software.author": "Anna Kelbert, USGS",
        "provenance.software.name": "mth5_metadata.m",
        "provenance.software.version": "2022-03-31",
        "provenance.submitter.email": null,
        "provenance.submitter.organization": null,
        "release_license": "CC0-1.0",
        "run_list": [
            "a",
            "b",
            "c",
            "d"
        ],
        "time_period.end": "2020-07-13T21:46:12+00:00",
        "time_period.start": "2020-06-02T18:41:43+00:00"
    }
}
Changing Metadata

If you want to change the metadata of any group, be sure to use the write_metadata method. Here’s an example:

[18]:
cas04.metadata.location.declination.value = -13.5
cas04.write_metadata()
print(cas04.metadata.location.declination)
declination:
        comments = igrf.m by Drew Compston
        model = IGRF-13
        value = -13.5
Have a look at a single channel

Let’s pick out a channel and interrogate it. There are a few ways to get a channel: 1. from its hdf5_reference [demonstrated here]; 2. directly from the mth5_object; 3. by getting the station first and then the channel.

[19]:
ex = mth5_object.from_reference(ch_df.iloc[0].hdf5_reference).to_channel_ts()
print(ex)
Channel Summary:
        Survey:       CONUS South
        Station:      CAS04
        Run:          a
        Channel Type: Electric
        Component:    ex
        Sample Rate:  1.0
        Start:        2020-06-02T19:00:00+00:00
        End:          2020-06-02T22:07:46+00:00
        N Samples:    11267
[20]:
ex.channel_metadata
[20]:
{
    "electric": {
        "channel_number": 0,
        "comments": "run_ids: [c,b,a]",
        "component": "ex",
        "data_quality.rating.value": 0,
        "dipole_length": 92.0,
        "filter.applied": [
            false,
            false,
            false,
            false,
            false,
            false
        ],
        "filter.name": [
            "electric_si_units",
            "electric_dipole_92.000",
            "electric_butterworth_low_pass",
            "electric_butterworth_high_pass",
            "electric_analog_to_digital",
            "electric_time_offset"
        ],
        "hdf5_reference": "<HDF5 object reference>",
        "measurement_azimuth": 13.2,
        "measurement_tilt": 0.0,
        "mth5_type": "Electric",
        "negative.elevation": 329.4,
        "negative.id": "200406D",
        "negative.latitude": 37.633351,
        "negative.longitude": -121.468382,
        "negative.manufacturer": "Oregon State University",
        "negative.model": "Pb-PbCl2 kaolin gel Petiau 2 chamber type",
        "negative.type": "electrode",
        "positive.elevation": 329.4,
        "positive.id": "200406B",
        "positive.latitude": 37.633351,
        "positive.longitude": -121.468382,
        "positive.manufacturer": "Oregon State University",
        "positive.model": "Pb-PbCl2 kaolin gel Petiau 2 chamber type",
        "positive.type": "electrode",
        "sample_rate": 1.0,
        "time_period.end": "2020-06-02T22:07:46+00:00",
        "time_period.start": "2020-06-02T19:00:00+00:00",
        "type": "electric",
        "units": "digital counts"
    }
}
Calibrate time series data

Most data loggers output data in digital counts. A series of filters representing the various instrument responses is then applied to convert the data to physical units, after which the data can be analyzed and processed. Commonly this is done during the processing step, but it is important to be able to look at time series data in physical units. Here we provide a remove_instrument_response method in the ChannelTS object. Here’s an example:

[21]:
print(ex.channel_response_filter)
ex.channel_response_filter.plot_response(np.logspace(-4, 1, 50))
Filters Included:
=========================
coefficient_filter:
        calibration_date = 1980-01-01
        comments = practical to SI unit conversion
        gain = 1e-06
        name = electric_si_units
        type = coefficient
        units_in = mV/km
        units_out = V/m
--------------------
coefficient_filter:
        calibration_date = 1980-01-01
        comments = electric dipole for electric field
        gain = 92.0
        name = electric_dipole_92.000
        type = coefficient
        units_in = V/m
        units_out = V
--------------------
pole_zero_filter:
        calibration_date = 1980-01-01
        comments = NIMS electric field 5 pole Butterworth 0.5 low pass (analog)
        gain = 1.0
        name = electric_butterworth_low_pass
        normalization_factor = 313383.493219835
        poles = [ -3.883009+11.951875j  -3.883009-11.951875j -10.166194 +7.386513j
 -10.166194 -7.386513j -12.566371 +0.j      ]
        type = zpk
        units_in = V
        units_out = V
        zeros = []
--------------------
pole_zero_filter:
        calibration_date = 1980-01-01
        comments = NIMS electric field 1 pole Butterworth high pass (analog)
        gain = 1.0
        name = electric_butterworth_high_pass
        normalization_factor = 1.00000000378188
        poles = [-0.000167+0.j]
        type = zpk
        units_in = V
        units_out = V
        zeros = [0.+0.j]
--------------------
coefficient_filter:
        calibration_date = 1980-01-01
        comments = analog to digital conversion (electric)
        gain = 409600000.0
        name = electric_analog_to_digital
        type = coefficient
        units_in = V
        units_out = count
--------------------
time_delay_filter:
        calibration_date = 1980-01-01
        comments = time offset in seconds (digital)
        delay = -0.285
        gain = 1.0
        name = electric_time_offset
        type = time delay
        units_in = count
        units_out = count
--------------------
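The coefficient filters in this chain compose multiplicatively, so the overall counts-per-mV/km scale factor can be sanity checked by hand. A sketch using the gains printed above (the Butterworth and time-delay stages have unit gain in the passband, so only the three coefficient filters are included):

```python
# Gains copied from the printed filter list above (coefficient filters only).
gains = {
    "electric_si_units": 1e-06,             # mV/km -> V/m
    "electric_dipole_92.000": 92.0,         # V/m -> V
    "electric_analog_to_digital": 4.096e8,  # V -> counts
}

# Combined scale factor from mV/km to digital counts.
scale = 1.0
for g in gains.values():
    scale *= g

print(scale)  # roughly 3.7e4 counts per mV/km
```

remove_instrument_response effectively inverts this chain (plus the frequency-dependent Butterworth stages) to recover mV/km from counts.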

[22]:
ex.remove_instrument_response(plot=True)
/home/kkappler/software/irismt/mth5/mth5/timeseries/ts_filters.py:498: UserWarning: Attempt to set non-positive xlim on a log-scaled axis will be ignored.
  ax2.set_xlim((f[0], f[-1]))
[22]:
Channel Summary:
        Survey:       CONUS South
        Station:      CAS04
        Run:          a
        Channel Type: Electric
        Component:    ex
        Sample Rate:  1.0
        Start:        2020-06-02T19:00:00+00:00
        End:          2020-06-02T22:07:46+00:00
        N Samples:    11267
Have a look at a run

Let’s pick out a run, take a slice of it, and interrogate it. There are a few ways to get a run: 1. from its run_hdf5_reference [demonstrated here]; 2. directly from the mth5_object; 3. by getting the station first and then the run.

[23]:
run_from_reference = mth5_object.from_reference(ch_df.iloc[0].run_hdf5_reference).to_runts(start=ch_df.iloc[0].start.isoformat(), n_samples=360)
print(run_from_reference)
2023-07-22T14:50:42 [line 678] mth5.timeseries.run_ts.RunTS.validate_metadata - WARNING: end time of dataset 2020-06-02T19:05:59+00:00 does not match metadata end 2020-06-02T22:07:46+00:00 updating metatdata value to 2020-06-02T19:05:59+00:00
RunTS Summary:
        Survey:      CONUS South
        Station:     CAS04
        Run:         a
        Start:       2020-06-02T19:00:00+00:00
        End:         2020-06-02T19:05:59+00:00
        Sample Rate: 1.0
        Components:  ['ex', 'ey', 'hx', 'hy', 'hz']
[24]:
run_from_reference.plot()
Calibrate Run
[25]:
calibrated_run = run_from_reference.calibrate()
calibrated_run.plot()
Load Transfer Functions

You can download the transfer functions for CAS04 and NVR08 from IRIS SPUD EMTF. This has already been done in EMTF XML format, and the files are loaded here.

[26]:
cas04_tf = r"USMTArray.CAS04.2020.xml"
nvr08_tf = r"USMTArray.NVR08.2020.xml"
[27]:
from mt_metadata.transfer_functions.core import TF
[28]:
for tf_fn in [cas04_tf, nvr08_tf]:
    tf_obj = TF(tf_fn)
    tf_obj.read_tf_file()
    mth5_object.add_transfer_function(tf_obj)
Have a look at the transfer function summary
[24]:
mth5_object.tf_summary.summarize()
tf_df = mth5_object.tf_summary.to_dataframe()
tf_df
[24]:
station survey latitude longitude elevation tf_id units has_impedance has_tipper has_covariance period_min period_max hdf5_reference station_hdf5_reference
0 CAS04 CONUS_South 37.633351 -121.468382 329.387 CAS04 none True True True 4.65455 29127.11 <HDF5 object reference> <HDF5 object reference>
1 NVR08 CONUS_South 38.326630 -118.082382 1375.425 NVR08 none True True True 4.65455 29127.11 <HDF5 object reference> <HDF5 object reference>
Plot the transfer functions using MTpy

Note: This currently works on branch mtpy/v2_plots

[25]:
from mtpy import MTCollection
2022-09-12T15:31:25 [line 121] mtpy.utils.mtpy_decorator._check_gdal_data - INFO: GDAL_DATA is set to: C:\Users\jpeacock\Anaconda3\envs\em\Library\share\gdal
2022-09-12 15:31:25,580 [line 133] matplotlib.get_mtpy_logger - INFO: Logging file can be found C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mtpy\logs\matplotlib_warn.log
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
~\AppData\Local\Temp\11\ipykernel_21912\2507741362.py in <cell line: 1>()
----> 1 from mtpy import MTCollection

ImportError: cannot import name 'MTCollection' from 'mtpy' (C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mtpy\mtpy\__init__.py)
[ ]:
mc = MTCollection()
mc.open_collection(r"8P_CAS04_NVR08")
[ ]:
pmr = mc.plot_mt_response(["CAS04", "NVR08"], plot_style="1")
Plot Station locations

Here we can plot station locations for all stations in the file, or we can give it a bounding box. If you have internet access, a basemap will be plotted using Contextily.

[ ]:
st = mc.plot_stations(pad=.9, fig_num=5, fig_size=[6, 4])
[ ]:
st.fig.get_axes()[0].set_xlim((-121.9, -117.75))
st.fig.get_axes()[0].set_ylim((37.35, 38.5))
st.update_plot()
[ ]:
mth5_object.close_mth5()
[ ]:

Build MTH5 from USGS Geomagnetic data

It’s common to look at observatory data for geomagnetic storms or to use it as a remote reference. The USGS provides geomagnetic observatory data for observatories in North America. In the future this will be expanded to various other observatories using well-developed packages like geomagpy.

You will need to know ahead of time what observatories you would like to download data from, dates, and type of data. There are no wildcards. See USGS Geomagnetic webservices for more information on allowed options.

Here we will download 2 days of data from 2 different observatories for the x and y components of calibrated data (‘adjusted’).

[1]:
import pandas as pd

from mth5.clients import MakeMTH5
2023-03-23 15:44:33,968 [line 135] mth5.setup_logger - INFO: Logging file can be found C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\logs\mth5_debug.log
Create a request DataFrame

The request input is in the form of a pandas.DataFrame with the following columns:

observatory
        Observatory code. Options: BDT, BOU, TST, BRW, BRT, BSL, CMO, CMT, DED, DHT, FRD, FRN, GUA, HON, NEW, SHU, SIT, SJG, TUC, USGS, BLC, BRD, CBB, EUA, FCC, IQA, MEA, OTT, RES, SNK, STJ, VIC, YKC, HAD, HER, KAK

type
        The type of data to download. Options: variation, adjusted (default), quasi-definitive, definitive

elements
        Components or elements of the geomagnetic data to download; should be a list. Options: D, DIST, DST, E, E-E, E-N, F, G, H, SQ, SV, UK1, UK2, UK3, UK4, X, Y, Z

sampling_period
        Sampling period of data to download in seconds. Options: 1, 60, 3600

start
        Start time (YYYY-MM-DDThh:mm:ss) in UTC

end
        End time (YYYY-MM-DDThh:mm:ss) in UTC

[2]:
request_df = pd.DataFrame(
    {
        "observatory": ["frn", "frn", "ott", "ott"],
        "type": ["adjusted"] * 4,
        "elements": [["x", "y"]] * 4,
        "sampling_period": [1] * 4,
        "start": [
            "2022-01-01T00:00:00",
            "2022-01-03T00:00:00",
            "2022-01-01T00:00:00",
            "2022-01-03T00:00:00",
        ],
        "end": [
            "2022-01-02T00:00:00",
            "2022-01-04T00:00:00",
            "2022-01-02T00:00:00",
            "2022-01-04T00:00:00",
        ],
    }
)
[3]:
request_df
[3]:
observatory type elements sampling_period start end
0 frn adjusted [x, y] 1 2022-01-01T00:00:00 2022-01-02T00:00:00
1 frn adjusted [x, y] 1 2022-01-03T00:00:00 2022-01-04T00:00:00
2 ott adjusted [x, y] 1 2022-01-01T00:00:00 2022-01-02T00:00:00
3 ott adjusted [x, y] 1 2022-01-03T00:00:00 2022-01-04T00:00:00
Adding Run ID

When the request is input, run names are automatically assigned to the different windows of time as f"sp{sampling_period}_{count:03}", so the first run is sp1_001. Alternatively, you can add a run column and name the runs as you like.
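The naming scheme can be sketched in plain Python; here the request windows are assumed to be ordered chronologically per observatory, with the counter starting at 1:

```python
sampling_period = 1  # seconds, matching the request above

# Enumerate time windows in order; the run counter is zero-padded to 3 digits.
run_names = [f"sp{sampling_period}_{count:03}" for count in range(1, 4)]
print(run_names)  # ['sp1_001', 'sp1_002', 'sp1_003']
```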

Create MTH5

Once the request is complete, get the data. The file name is created automatically as usgs_geomag_{list of observatories}_{list of elements}.h5.
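Following that pattern, the file name for this request can be anticipated. A sketch; the exact joining of observatory and element codes is an assumption inferred from the log output shown below:

```python
observatories = ["frn", "ott"]
elements = ["x", "y"]

# Assumed construction: unique observatory codes joined with underscores,
# element codes concatenated.
filename = f"usgs_geomag_{'_'.join(observatories)}_{''.join(elements)}.h5"
print(filename)  # usgs_geomag_frn_ott_xy.h5
```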

[4]:
make_mth5_object = MakeMTH5(mth5_version="0.2.0", interact=True)
mth5_object = make_mth5_object.from_usgs_geomag(request_df)
2023-03-23 15:44:34,903 [line 674] mth5.mth5.MTH5._initialize_file - INFO: Initialized MTH5 0.2.0 file C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\docs\examples\notebooks\usgs_geomag_frn_ott_xy.h5 in mode a
2023-03-23 15:44:43,406 [line 311] mth5.groups.base.MasterStation.add_station - INFO: Station Fresno already exists, returning existing group.
2023-03-23 15:44:48,843 [line 311] mth5.groups.base.MasterStation.add_station - INFO: Station Ottowa already exists, returning existing group.
Check to make sure everything was downloaded properly
[8]:
mth5_object.channel_summary.summarize()
mth5_object.channel_summary.to_dataframe()
[8]:
survey station run latitude longitude elevation component start end n_samples sample_rate measurement_type azimuth tilt units hdf5_reference run_hdf5_reference station_hdf5_reference
0 USGS-GEOMAG Fresno sp1_001 37.091 -119.718 331.0 hx 2022-01-01 00:00:00+00:00 2022-01-02 00:00:00+00:00 86401 1.0 magnetic 0.0 0.0 nanotesla <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
1 USGS-GEOMAG Fresno sp1_001 37.091 -119.718 331.0 hy 2022-01-01 00:00:00+00:00 2022-01-02 00:00:00+00:00 86401 1.0 magnetic 90.0 0.0 nanotesla <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
2 USGS-GEOMAG Fresno sp1_002 37.091 -119.718 331.0 hx 2022-01-03 00:00:00+00:00 2022-01-04 00:00:00+00:00 86401 1.0 magnetic 0.0 0.0 nanotesla <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
3 USGS-GEOMAG Fresno sp1_002 37.091 -119.718 331.0 hy 2022-01-03 00:00:00+00:00 2022-01-04 00:00:00+00:00 86401 1.0 magnetic 90.0 0.0 nanotesla <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
4 USGS-GEOMAG Ottowa sp1_001 45.400 -75.500 0.0 hx 2022-01-01 00:00:00+00:00 2022-01-02 00:00:00+00:00 86401 1.0 magnetic 0.0 0.0 nanotesla <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
5 USGS-GEOMAG Ottowa sp1_001 45.400 -75.500 0.0 hy 2022-01-01 00:00:00+00:00 2022-01-02 00:00:00+00:00 86401 1.0 magnetic 90.0 0.0 nanotesla <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
6 USGS-GEOMAG Ottowa sp1_002 45.400 -75.500 0.0 hx 2022-01-03 00:00:00+00:00 2022-01-04 00:00:00+00:00 86401 1.0 magnetic 0.0 0.0 nanotesla <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
7 USGS-GEOMAG Ottowa sp1_002 45.400 -75.500 0.0 hy 2022-01-03 00:00:00+00:00 2022-01-04 00:00:00+00:00 86401 1.0 magnetic 90.0 0.0 nanotesla <HDF5 object reference> <HDF5 object reference> <HDF5 object reference>
Have a look at a run
[12]:
run = mth5_object.get_run("Fresno", "sp1_001", "USGS-GEOMAG")
[13]:
run_ts = run.to_runts()
run_ts.plot()
_images/examples_notebooks_make_mth5_from_geomag_12_0.png
Close the MTH5 file

IMPORTANT: Be sure to close the file, otherwise bad things can happen.

[14]:
mth5_object.close_mth5()
2023-03-23 15:48:55,374 [line 755] mth5.mth5.MTH5.close_mth5 - INFO: Flushing and closing C:\Users\jpeacock\OneDrive - DOI\Documents\GitHub\mth5\docs\examples\notebooks\usgs_geomag_frn_ott_xy.h5
[ ]:

Parallel HDF5/MTH5

Creating an mth5 file using Parallel HDF5

This tutorial will examine how to build an mth5 file and write MT time series data for each station in a survey in parallel by utilising h5py’s Parallel HDF5 functionality. Parallel HDF5 allows users to open files across multiple parallel processes by utilising the Message Passing Interface (MPI) standard in mpi4py.

Note that in order to use Parallel HDF5, both HDF5 and h5py must be compiled with MPI support turned on.

For this example, we will be converting Earth Data Logger (ASCII) time series data from 93 stations of the AusLAMP Musgraves Province survey (see https://dx.doi.org/10.25914/5eaa30d63bd17). The ASCII time series were concatenated per run for each station, and the total volume of time series data was 106 GB. 90 of the 93 stations had a single run; stations SA246, SA299 and SA324-2 had multiple runs.

This example was tested on the National Computational Infrastructure’s Gadi HPC system. Gadi is Australia’s most powerful supercomputer, a highly parallel cluster comprising more than 200,000 processor cores on ten different types of compute nodes. The example also makes use of the NCI-geophysics 2022.06 module which contains Parallel HDF5.

Building an mth5 skeleton

Building our mth5 file is a two-step process: 1. create an mth5 skeleton; 2. populate the skeleton with the time series data in parallel using Parallel HDF5.

Let’s start by building the mth5 skeleton script which requires the following libraries:

[ ]:
from mth5.mth5 import MTH5
import numpy as np
from os import path
import os
import glob
import time

from mt_metadata.utils.mttime import MTime

startTime = time.time()

Next we can define our working directories and file paths, which will need to be changed for your use case:

[ ]:
### directory on Gadi file system that contains merged ASCII time series per station run
work_dir = '/g/data/.../.../merged_data_all'

### full path to the concatenated Earth Data Logger ASCII time series files
full_path_to_files = sorted(glob.glob(work_dir+"/*"))

### directory on Gadi to write final mth5 file to
mth5_test_dir = '/g/data/.../.../mth5_outdir'

### name of the mth5 file
hdf5_filename = 'example_mth5_file.h5'

### full path to the mth5 file
h5_fn = path.join(mth5_test_dir, hdf5_filename)

Now we can define: 1. the Earth Data Logger channels 2. the stations in our survey 3. the survey name 4. the run number for stations with a single run

[ ]:
### define raw time series data channels from Earth Data Logger

raw_data_channels = ['EX','EY','BX','BY','BZ','TP','ambientTemperature']


### define stations to go into mth5 file

stations_all = ['SA225-2','SA227',   'SA242',  'SA243',  'SA245',
                'SA247',  'SA248',   'SA249',  'SA250',  'SA251',
                'SA252',  'SA26W-2', 'SA270',  'SA271',  'SA272',
                'SA273',  'SA274-2', 'SA274',  'SA275',  'SA276',
                'SA277',  'SA293-2', 'SA294',  'SA295',  'SA296',
                'SA297',  'SA298',   'SA300',  'SA301',  'SA319',
                'SA320',  'SA320-2', 'SA321',  'SA322',  'SA323',
                'SA324',  'SA325-2', 'SA325',  'SA326N', 'SA326S',
                'SA344',  'SA344-2', 'SA345',  'SA346',  'SA347',
                'SA348',  'SA349',   'SA350',  'SA351',             ### 49 single run SA stations
                'WA10',   'WA13',    'WA14',   'WA15',   'WA26',
                'WA27',   'WA29',    'WA30',   'WA31',   'WA42',
                'WA43',   'WA44',    'WA45',   'WA46',   'WA47',
                'WA54',   'WA55',    'WA56',   'WA57',   'WA58',
                'WA60',   'WA61',    'WA62',   'WA63',   'WA64',
                'WA65',   'WA66',    'WA67',   'WA68',   'WA69',
                'WA70',   'WA71',    'WA72',   'WA73',   'WA74',
                'WA75',   'WANT19',  'WANT38', 'WANT45', 'WASA302',
                'WASA327',                          ### 41 single run WA stations
                'SA246',  'SA299',   'SA324-2']     ### 3 stations with multiple runs


### define survey name and run number (for stations with a single run)

survey_name = "AusLAMP_Musgraves"
run_number = "001"

We will define some functions that will be used to create our MTH5 skeleton:

[ ]:
def make_mth5_dir(mth5_output_directory):
### creates mth5 output directory if it doesn't already exist
    try:
        os.makedirs(mth5_output_directory)
    except FileExistsError:
        # directory already exists
        print('directory already exists!')
        pass


def remove_existing_mth5_file(mth5_file):
### removes existing mth5 file if it exists
    if path.exists(mth5_file):
        os.unlink(mth5_file)
        print(f"INFO: Removed existing file {mth5_file}")
    else:
        print("File does not exist")


def channel_line_count(channels):
### counts lines in electromagnetic ASCII files
    EX = [file for file in channels if file.endswith('EX')]
    EY = [file for file in channels if file.endswith('EY')]
    BX = [file for file in channels if file.endswith('BX')]
    BY = [file for file in channels if file.endswith('BY')]
    BZ = [file for file in channels if file.endswith('BZ')]

    count_ex = sum(1 for line in open(EX[0]))
    count_ey = sum(1 for line in open(EY[0]))
    count_bx = sum(1 for line in open(BX[0]))
    count_by = sum(1 for line in open(BY[0]))
    count_bz = sum(1 for line in open(BZ[0]))

    return count_ex, count_ey, count_bx, count_by, count_bz


def create_mth5_group_station_run_channel(station):
### creates the mth5 groups, stations, runs and channels for the mth5 skeleton
    add_station = m.add_station(station, survey=survey_name)
    channels = []
    for file in full_path_to_files:
        if station in file:
            channels.append(file)
        else:
            continue
### for stations with a single run:
    if len(channels) == len(raw_data_channels):
        add_run = m.add_run(station, run_number, survey=survey_name)

        count_ex,count_ey,count_bx,count_by,count_bz = channel_line_count(channels)

        ex_zeros = np.zeros(count_ex,)
        ey_zeros = np.zeros(count_ey,)
        bx_zeros = np.zeros(count_bx,)
        by_zeros = np.zeros(count_by,)
        bz_zeros = np.zeros(count_bz,)

        ex = m.add_channel(station, run_number, "ex", "electric", ex_zeros, survey=survey_name)
        ey = m.add_channel(station, run_number, "ey", "electric", ey_zeros, survey=survey_name)
        bx = m.add_channel(station, run_number, "bx", "magnetic", bx_zeros, survey=survey_name)
        by = m.add_channel(station, run_number, "by", "magnetic", by_zeros, survey=survey_name)
        bz = m.add_channel(station, run_number, "bz", "magnetic", bz_zeros, survey=survey_name)

        #######################################################################################
        # Note: At the time of writing this example, the resizing of datasets caused h5py     #
        # parallel to fail when running using the mpio driver. A workaround was to create     #
        # 'zeros' arrays of size count_<xx> (see above).                                      #
        #                                                                                     #
        # ex = m.add_channel(station, run_number, "ex", "electric", None, survey=survey_name) #
        # ey = m.add_channel(station, run_number, "ey", "electric", None, survey=survey_name) #
        # bx = m.add_channel(station, run_number, "bx", "magnetic", None, survey=survey_name) #
        # by = m.add_channel(station, run_number, "by", "magnetic", None, survey=survey_name) #
        # bz = m.add_channel(station, run_number, "bz", "magnetic", None, survey=survey_name) #
        #                                                                                     #
        # ex.hdf5_dataset.resize((count_ex,))                                                 #
        # ey.hdf5_dataset.resize((count_ey,))                                                 #
        # bx.hdf5_dataset.resize((count_bx,))                                                 #
        # by.hdf5_dataset.resize((count_by,))                                                 #
        # bz.hdf5_dataset.resize((count_bz,))                                                 #
        #######################################################################################

### for stations with multiple runs:
    elif len(channels) > len(raw_data_channels):
        sort_files = sorted(channels)
        number_of_channels = len(raw_data_channels)
        split_lists = [sort_files[x:x+number_of_channels] for x in range(0, len(sort_files), number_of_channels)]
        for i,group in enumerate(split_lists):
            mrun_number = i+1
            run = "00%i" % mrun_number
            add_run = m.add_run(station, run, survey=survey_name)
            count_ex,count_ey,count_bx,count_by,count_bz = channel_line_count(group)
            ex_zeros = np.zeros(count_ex,)
            ey_zeros = np.zeros(count_ey,)
            bx_zeros = np.zeros(count_bx,)
            by_zeros = np.zeros(count_by,)
            bz_zeros = np.zeros(count_bz,)

            ex = m.add_channel(station, run, "ex", "electric", ex_zeros, survey=survey_name)
            ey = m.add_channel(station, run, "ey", "electric", ey_zeros, survey=survey_name)
            bx = m.add_channel(station, run, "bx", "magnetic", bx_zeros, survey=survey_name)
            by = m.add_channel(station, run, "by", "magnetic", by_zeros, survey=survey_name)
            bz = m.add_channel(station, run, "bz", "magnetic", bz_zeros, survey=survey_name)

    elif len(channels) < len(raw_data_channels):
        print('you are likely missing some channels')
        print(station)

    else:
        print('something has gone wrong')
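
For multi-run stations, the list comprehension above simply chunks the sorted channel files into consecutive groups of len(raw_data_channels), one group per run. The chunking logic can be checked in isolation (the file names below are hypothetical stand-ins for the merged ASCII files):

```python
# Standalone check of the run-splitting logic used above (file names are hypothetical).
raw_data_channels = ['EX', 'EY', 'BX', 'BY', 'BZ', 'TP', 'ambientTemperature']

# Two runs' worth of files for an imaginary multi-run station SA246.
sort_files = sorted(
    f"SA246_run{run}.{ch}" for run in (1, 2) for ch in raw_data_channels
)

number_of_channels = len(raw_data_channels)
split_lists = [sort_files[x:x + number_of_channels]
               for x in range(0, len(sort_files), number_of_channels)]

print(len(split_lists))     # one sub-list per run -> 2
print(len(split_lists[0]))  # each sub-list holds all channels of one run -> 7
```

Because the files are sorted, all of run 1's channels land in the first sub-list and all of run 2's in the second, which is what allows each group to be handed to channel_line_count unchanged.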

The final step is to create our mth5 skeleton. Note that the Parallel HDF5 version used in this example does not support compression, so compression was turned off when generating the MTH5 skeleton:

[ ]:
### create mth5 directory (if it doesn't already exist)
make_mth5_dir(mth5_test_dir)

### remove any existing mth5 file in our mth5 directory
remove_existing_mth5_file(h5_fn)

start = MTime()
start.now()

### ensure compression is turned off
m = MTH5(file_version='0.2.0',shuffle=None,fletcher32=None,compression=None,compression_opts=None)

### open mth5 file in write mode
m.open_mth5(h5_fn, "w")

### add survey group
survey_group = m.add_survey(survey_name)

### create station, run and channel groups for all stations in our survey
for station in sorted(stations_all):
    create_mth5_group_station_run_channel(station)
m.close_mth5()

### print total time to run our mth5 skeleton script
print('The script took {0} seconds !'.format(time.time()-startTime))

Putting this all together into a Python script (mth5_skeleton.py):

[ ]:
from mth5.mth5 import MTH5
import numpy as np
from os import path
import os
import glob
import time

from mt_metadata.utils.mttime import MTime


startTime = time.time()


### define working directories and file paths

work_dir = '/g/data/.../.../merged_data_all'
mth5_test_dir = '/g/data/.../.../mth5_outdir'
hdf5_filename = 'example_mth5_file.h5'
h5_fn = mth5_test_dir+'/'+hdf5_filename
full_path_to_files = sorted(glob.glob(work_dir+"/*"))


### define raw time series data channels

raw_data_channels = ['EX','EY','BX','BY','BZ','TP','ambientTemperature']


### define stations to go into mth5 file

stations_all = ['SA225-2','SA227',   'SA242',  'SA243',  'SA245',
                'SA247',  'SA248',   'SA249',  'SA250',  'SA251',
                'SA252',  'SA26W-2', 'SA270',  'SA271',  'SA272',
                'SA273',  'SA274-2', 'SA274',  'SA275',  'SA276',
                'SA277',  'SA293-2', 'SA294',  'SA295',  'SA296',
                'SA297',  'SA298',   'SA300',  'SA301',  'SA319',
                'SA320',  'SA320-2', 'SA321',  'SA322',  'SA323',
                'SA324',  'SA325-2', 'SA325',  'SA326N', 'SA326S',
                'SA344',  'SA344-2', 'SA345',  'SA346',  'SA347',
                'SA348',  'SA349',   'SA350',  'SA351',             ### 49 single run SA stations
                'WA10',   'WA13',    'WA14',   'WA15',   'WA26',
                'WA27',   'WA29',    'WA30',   'WA31',   'WA42',
                'WA43',   'WA44',    'WA45',   'WA46',   'WA47',
                'WA54',   'WA55',    'WA56',   'WA57',   'WA58',
                'WA60',   'WA61',    'WA62',   'WA63',   'WA64',
                'WA65',   'WA66',    'WA67',   'WA68',   'WA69',
                'WA70',   'WA71',    'WA72',   'WA73',   'WA74',
                'WA75',   'WANT19',  'WANT38', 'WANT45', 'WASA302',
                'WASA327',                          ### 41 single run WA stations
                'SA246',  'SA299',   'SA324-2']     ### 3 stations with multiple runs


### define survey name and run number (for stations with a single run)

survey_name = "AusLAMP_Musgraves"
run_number = "001"

### define functions

def make_mth5_dir(mth5_output_directory):
### creates mth5 output directory if it doesn't already exist
    try:
        os.makedirs(mth5_output_directory)
    except FileExistsError:
        # directory already exists
        print('directory already exists!')
        pass


def remove_existing_mth5_file(mth5_file):
### removes existing mth5 file if it exists
    if path.exists(mth5_file):
        os.unlink(mth5_file)
        print("INFO: Removed existing file {mth5_file}")
    else:
        print("File does not exist")


def channel_line_count(channels):
### counts lines in electromagnetic ASCII files
    EX = [file for file in channels if file.endswith('EX')]
    EY = [file for file in channels if file.endswith('EY')]
    BX = [file for file in channels if file.endswith('BX')]
    BY = [file for file in channels if file.endswith('BY')]
    BZ = [file for file in channels if file.endswith('BZ')]

    with open(EX[0]) as fh:
        count_ex = sum(1 for _ in fh)
    with open(EY[0]) as fh:
        count_ey = sum(1 for _ in fh)
    with open(BX[0]) as fh:
        count_bx = sum(1 for _ in fh)
    with open(BY[0]) as fh:
        count_by = sum(1 for _ in fh)
    with open(BZ[0]) as fh:
        count_bz = sum(1 for _ in fh)

    return count_ex, count_ey, count_bx, count_by, count_bz


def create_mth5_group_station_run_channel(station):
### creates the mth5 groups, stations, runs and channels for the mth5 skeleton
    add_station = m.add_station(station, survey=survey_name)
    channels = []
    for file in full_path_to_files:
        if station in file:
            channels.append(file)
        else:
            continue
### for stations with a single run:
    if len(channels) == len(raw_data_channels):
        add_run = m.add_run(station, run_number, survey=survey_name)

        count_ex,count_ey,count_bx,count_by,count_bz = channel_line_count(channels)

        ex_zeros = np.zeros(count_ex,)
        ey_zeros = np.zeros(count_ey,)
        bx_zeros = np.zeros(count_bx,)
        by_zeros = np.zeros(count_by,)
        bz_zeros = np.zeros(count_bz,)

        ex = m.add_channel(station, run_number, "ex", "electric", ex_zeros, survey=survey_name)
        ey = m.add_channel(station, run_number, "ey", "electric", ey_zeros, survey=survey_name)
        bx = m.add_channel(station, run_number, "bx", "magnetic", bx_zeros, survey=survey_name)
        by = m.add_channel(station, run_number, "by", "magnetic", by_zeros, survey=survey_name)
        bz = m.add_channel(station, run_number, "bz", "magnetic", bz_zeros, survey=survey_name)

        #######################################################################################
        # Note: At the time of writing this example, the resizing of datasets caused h5py     #
        # parallel to fail when running using the mpio driver. A workaround was to create     #
        # 'zeros' arrays of size count_<xx> (see above).                                      #
        #                                                                                     #
        # ex = m.add_channel(station, run_number, "ex", "electric", None, survey=survey_name) #
        # ey = m.add_channel(station, run_number, "ey", "electric", None, survey=survey_name) #
        # bx = m.add_channel(station, run_number, "bx", "magnetic", None, survey=survey_name) #
        # by = m.add_channel(station, run_number, "by", "magnetic", None, survey=survey_name) #
        # bz = m.add_channel(station, run_number, "bz", "magnetic", None, survey=survey_name) #
        #                                                                                     #
        # ex.hdf5_dataset.resize((count_ex,))                                                 #
        # ey.hdf5_dataset.resize((count_ey,))                                                 #
        # bx.hdf5_dataset.resize((count_bx,))                                                 #
        # by.hdf5_dataset.resize((count_by,))                                                 #
        # bz.hdf5_dataset.resize((count_bz,))                                                 #
        #######################################################################################

### for stations with multiple runs:
    elif len(channels) > len(raw_data_channels):
        sort_files = sorted(channels)
        number_of_channels = len(raw_data_channels)
        split_lists = [sort_files[x:x+number_of_channels] for x in range(0, len(sort_files), number_of_channels)]
        for i,group in enumerate(split_lists):
            mrun_number = i+1
            run = "00%i" % mrun_number
            add_run = m.add_run(station, run, survey=survey_name)
            count_ex,count_ey,count_bx,count_by,count_bz = channel_line_count(group)
            ex_zeros = np.zeros(count_ex,)
            ey_zeros = np.zeros(count_ey,)
            bx_zeros = np.zeros(count_bx,)
            by_zeros = np.zeros(count_by,)
            bz_zeros = np.zeros(count_bz,)

            ex = m.add_channel(station, run, "ex", "electric", ex_zeros, survey=survey_name)
            ey = m.add_channel(station, run, "ey", "electric", ey_zeros, survey=survey_name)
            bx = m.add_channel(station, run, "bx", "magnetic", bx_zeros, survey=survey_name)
            by = m.add_channel(station, run, "by", "magnetic", by_zeros, survey=survey_name)
            bz = m.add_channel(station, run, "bz", "magnetic", bz_zeros, survey=survey_name)

    elif len(channels) < len(raw_data_channels):
        print('you are likely missing some channels')
        print(station)

    else:
        print('something has gone wrong')


### create mth5 file skeleton

make_mth5_dir(mth5_test_dir)
remove_existing_mth5_file(h5_fn)
start = MTime()
start.now()

m = MTH5(file_version='0.2.0',shuffle=None,fletcher32=None,compression=None,compression_opts=None)

m.open_mth5(h5_fn, "w")
survey_group = m.add_survey(survey_name)
for station in sorted(stations_all):
    create_mth5_group_station_run_channel(station)
m.close_mth5()

### print total time to run script

print('The script took {0} seconds !'.format(time.time()-startTime))

For the mth5 skeleton, we have populated a single mth5 file with all station groups in our survey. No actual time series data has been added yet (this will happen in parallel in the next steps). It is important to note that creating the file structure is a collective operation that modifies the structure of an HDF5 file, and all processes must participate. It is for this reason that mth5_skeleton.py cannot be run in parallel when generating a single mth5 file for all stations (see Collective versus independent operations in https://docs.h5py.org/en/stable/mpi.html).

The following is a summary of how long mth5_skeleton.py took to run for the AusLAMP Musgraves Province survey.

  • run on: Gadi

  • number of stations: 93

  • number of runs: 101

  • NCPUs used: 1

  • memory used: 30 GB

  • walltime used: 1264 seconds (21m04s)

  • CPU time used: 1162 seconds (19m22s)

  • service units: 5.27
Populating our mth5 skeleton with time series data in parallel

Now that we have created our mth5 skeleton, it’s time to populate the time series data from each station in our survey using Parallel HDF5.

First we will import the required Python libraries. The important libraries are mpi4py and h5py.

[ ]:
from mpi4py import MPI
import h5py
import os, psutil
import glob
import numpy as np
import shutil
import sys
from os import path
import time

startTime = time.time()

Next we will define the MPI communicator, rank and size:

[ ]:
comm = MPI.COMM_WORLD
rank = MPI.COMM_WORLD.rank
size = MPI.COMM_WORLD.size

Once again, we will define our: 1. working directories and file paths 2. Earth Data Logger channels 3. survey stations

[ ]:
### directory on Gadi file system that contains merged ASCII time series per station per run
work_dir = '/g/data/.../.../merged_data_all'

### directory to write final mth5 file
mth5_test_dir = '/g/data/.../.../mth5_outdir'

### name of mth5 file
hdf5_filename = 'example_mth5_file.h5'

### full path to mth5 file
h5_fn = mth5_test_dir+'/'+hdf5_filename

### full path to concatenated Earth Data Logger ASCII time series files
full_path_files = sorted(glob.glob(work_dir+"/*"))

### raw data channels
raw_data_channels = ['EX','EY','BX','BY','BZ','TP','ambientTemperature']

### MT stations to go into mth5 file
stations_all = ['SA225-2','SA227',   'SA242',  'SA243',  'SA245',
                'SA247',  'SA248',   'SA249',  'SA250',  'SA251',
                'SA252',  'SA26W-2', 'SA270',  'SA271',  'SA272',
                'SA273',  'SA274-2', 'SA274',  'SA275',  'SA276',
                'SA277',  'SA293-2', 'SA294',  'SA295',  'SA296',
                'SA297',  'SA298',   'SA300',  'SA301',  'SA319',
                'SA320',  'SA320-2', 'SA321',  'SA322',  'SA323',
                'SA324',  'SA325-2', 'SA325',  'SA326N', 'SA326S',
                'SA344',  'SA344-2', 'SA345',  'SA346',  'SA347',
                'SA348',  'SA349',   'SA350',  'SA351',             ### 49 single run SA stations
                'WA10',   'WA13',    'WA14',   'WA15',   'WA26',
                'WA27',   'WA29',    'WA30',   'WA31',   'WA42',
                'WA43',   'WA44',    'WA45',   'WA46',   'WA47',
                'WA54',   'WA55',    'WA56',   'WA57',   'WA58',
                'WA60',   'WA61',    'WA62',   'WA63',   'WA64',
                'WA65',   'WA66',    'WA67',   'WA68',   'WA69',
                'WA70',   'WA71',    'WA72',   'WA73',   'WA74',
                'WA75',   'WANT19',  'WANT38', 'WANT45', 'WASA302',
                'WASA327',                          ### 41 single run WA stations
                'SA246',  'SA299',   'SA324-2']     ### 3 stations with multiple runs


Let’s define the functions that will be used to populate our MTH5 skeleton with our Earth Data Logger time series data:

[ ]:
def channel_data_extraction(channels):
### extracts electromagnetic time series data from concatenated (per run) Earth Data Logger ASCII time series files
    EX = [file for file in channels if file.endswith('EX')]
    EY = [file for file in channels if file.endswith('EY')]
    BX = [file for file in channels if file.endswith('BX')]
    BY = [file for file in channels if file.endswith('BY')]
    BZ = [file for file in channels if file.endswith('BZ')]
    with open(EX[0], 'r') as file:
        EX1 = file.read().splitlines()
        ex = np.array(EX1).astype(np.int32)
    with open(EY[0], 'r') as file:
        EY1 = file.read().splitlines()
        ey = np.array(EY1).astype(np.int32)
    with open(BX[0], 'r') as file:
        BX1 = file.read().splitlines()
        bx = np.array(BX1).astype(np.int32)
    with open(BY[0], 'r') as file:
        BY1 = file.read().splitlines()
        by = np.array(BY1).astype(np.int32)
    with open(BZ[0], 'r') as file:
        BZ1 = file.read().splitlines()
        bz = np.array(BZ1).astype(np.int32)

    return ex, ey, bx, by, bz


def write_channels(list_of_stations,full_path_to_files):
### writes time series data to the mth5 skeleton file in an embarrassingly parallel way - that is,
### each rank is dedicated to writing data from a single station.
    for i,station in enumerate(sorted(list_of_stations)):
        if i%size!=rank:
            continue
        channels = []
        for file in full_path_to_files:
            if station in file:
                channels.append(file)
            else:
                continue

### for stations with a single run:
        if len(channels) == len(raw_data_channels):
            run = '001'
            station_group = 'Experiment/Surveys/AusLAMP_Musgraves/Stations/'
            site_run_path = station_group+station+'/'+run
            channel_ex_path = site_run_path+'/'+'ex'
            channel_ey_path = site_run_path+'/'+'ey'
            channel_bx_path = site_run_path+'/'+'bx'
            channel_by_path = site_run_path+'/'+'by'
            channel_bz_path = site_run_path+'/'+'bz'

            channel_ex = f[channel_ex_path]
            channel_ey = f[channel_ey_path]
            channel_bx = f[channel_bx_path]
            channel_by = f[channel_by_path]
            channel_bz = f[channel_bz_path]

            ex,ey,bx,by,bz = channel_data_extraction(channels)

            channel_ex[:] = ex
            channel_ey[:] = ey
            channel_bx[:] = bx
            channel_by[:] = by
            channel_bz[:] = bz

            process = psutil.Process(os.getpid())
            print('this run took %d MB of memory' % (process.memory_info().rss / 1024 ** 2))
            print("Station number %d (%s) being done by processor %d of %d" % (i, station, rank, size))

### for stations with multiple runs:
        elif len(channels) > len(raw_data_channels):
            sort_files = sorted(channels)
            number_of_channels = len(raw_data_channels)
            split_lists = [sort_files[x:x+number_of_channels] for x in range(0, len(sort_files), number_of_channels)]
            for j,group in enumerate(split_lists):
                mrun_number = j+1
                run = "%03i" % mrun_number
                station_group = 'Experiment/Surveys/AusLAMP_Musgraves/Stations/'
                site_run_path = station_group+station+'/'+run
                channel_ex_path = site_run_path+'/'+'ex'
                channel_ey_path = site_run_path+'/'+'ey'
                channel_bx_path = site_run_path+'/'+'bx'
                channel_by_path = site_run_path+'/'+'by'
                channel_bz_path = site_run_path+'/'+'bz'

                channel_ex = f[channel_ex_path]
                channel_ey = f[channel_ey_path]
                channel_bx = f[channel_bx_path]
                channel_by = f[channel_by_path]
                channel_bz = f[channel_bz_path]

                ex,ey,bx,by,bz = channel_data_extraction(group)

                channel_ex[:] = ex
                channel_ey[:] = ey
                channel_bx[:] = bx
                channel_by[:] = by
                channel_bz[:] = bz

                process = psutil.Process(os.getpid())
                print('this run took %d MB of memory' % (process.memory_info().rss / 1024 ** 2))
                print("Station number %d (%s) being done by processor %d of %d" % (i, station, rank, size))

        elif len(channels) < len(raw_data_channels):
            print('you are likely missing some channels')

        else:
            print('something has gone wrong')
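
The if i%size!=rank: continue guard is what makes this embarrassingly parallel: station i is handled by rank i % size, so with 96 ranks and 93 stations every rank writes at most one station (and three ranks sit idle). The assignment pattern can be verified without MPI by simulating the ranks in plain Python (the station names below are placeholders):

```python
# Simulate the round-robin rank assignment used in write_channels (no MPI needed).
def stations_for_rank(stations, rank, size):
    """Return the stations a given MPI rank would process."""
    return [s for i, s in enumerate(sorted(stations)) if i % size == rank]

stations = [f"SA{n:03d}" for n in range(93)]   # 93 placeholder station names
size = 96                                      # number of MPI ranks requested

assignments = [stations_for_rank(stations, r, size) for r in range(size)]

# Every station is assigned to exactly one rank, and no rank gets more than one.
assert sorted(sum(assignments, [])) == sorted(stations)
assert max(len(a) for a in assignments) == 1
print("ranks with no work:", sum(1 for a in assignments if not a))  # -> 3
```

Because the ranks never overlap on a station, no inter-rank coordination is needed beyond the collective file open.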

Finally, we need to open our MTH5 skeleton in append mode utilising the mpio driver. We can then write our time series channels for all our stations in parallel and close the file:

[ ]:
### write to mth5 file in parallel using the mpio driver
with h5py.File(h5_fn,'a',driver='mpio',comm=MPI.COMM_WORLD) as f:
    write_channels(stations_all,full_path_files)

### print total time for script to run
if rank==0:
    print('The script took {0} seconds !'.format(time.time()-startTime))

Putting this all together into a Python script (mth5_muscle.py):

[ ]:
from mpi4py import MPI
import h5py
import os, psutil
import glob
import numpy as np
import shutil
import sys
from os import path
import time

startTime = time.time()


### define MPI comm, rank and size

comm = MPI.COMM_WORLD
rank = MPI.COMM_WORLD.rank
size = MPI.COMM_WORLD.size


### define working directories, hdf5 file name and the full path to the ASCII MT time series files

work_dir = '/g/data/.../.../merged_data_all'
mth5_test_dir = '/g/data/.../.../mth5_outdir'
hdf5_filename = 'example_mth5_file.h5'
h5_fn = mth5_test_dir+'/'+hdf5_filename
full_path_files = sorted(glob.glob(work_dir+"/*"))


### define raw data channels

raw_data_channels = ['EX','EY','BX','BY','BZ','TP','ambientTemperature']


### define MT stations to go into mth5 file

stations_all = ['SA225-2','SA227',   'SA242',  'SA243',  'SA245',
                'SA247',  'SA248',   'SA249',  'SA250',  'SA251',
                'SA252',  'SA26W-2', 'SA270',  'SA271',  'SA272',
                'SA273',  'SA274-2', 'SA274',  'SA275',  'SA276',
                'SA277',  'SA293-2', 'SA294',  'SA295',  'SA296',
                'SA297',  'SA298',   'SA300',  'SA301',  'SA319',
                'SA320',  'SA320-2', 'SA321',  'SA322',  'SA323',
                'SA324',  'SA325-2', 'SA325',  'SA326N', 'SA326S',
                'SA344',  'SA344-2', 'SA345',  'SA346',  'SA347',
                'SA348',  'SA349',   'SA350',  'SA351',             ### 49 single run SA stations
                'WA10',   'WA13',    'WA14',   'WA15',   'WA26',
                'WA27',   'WA29',    'WA30',   'WA31',   'WA42',
                'WA43',   'WA44',    'WA45',   'WA46',   'WA47',
                'WA54',   'WA55',    'WA56',   'WA57',   'WA58',
                'WA60',   'WA61',    'WA62',   'WA63',   'WA64',
                'WA65',   'WA66',    'WA67',   'WA68',   'WA69',
                'WA70',   'WA71',    'WA72',   'WA73',   'WA74',
                'WA75',   'WANT19',  'WANT38', 'WANT45', 'WASA302',
                'WASA327',                          ### 41 single run WA stations
                'SA246',  'SA299',   'SA324-2']     ### 3 stations with multiple runs


### define functions

def channel_data_extraction(channels):
    EX = [file for file in channels if file.endswith('EX')]
    EY = [file for file in channels if file.endswith('EY')]
    BX = [file for file in channels if file.endswith('BX')]
    BY = [file for file in channels if file.endswith('BY')]
    BZ = [file for file in channels if file.endswith('BZ')]
    with open(EX[0], 'r') as file:
        EX1 = file.read().splitlines()
        ex = np.array(EX1).astype(np.int32)
    with open(EY[0], 'r') as file:
        EY1 = file.read().splitlines()
        ey = np.array(EY1).astype(np.int32)
    with open(BX[0], 'r') as file:
        BX1 = file.read().splitlines()
        bx = np.array(BX1).astype(np.int32)
    with open(BY[0], 'r') as file:
        BY1 = file.read().splitlines()
        by = np.array(BY1).astype(np.int32)
    with open(BZ[0], 'r') as file:
        BZ1 = file.read().splitlines()
        bz = np.array(BZ1).astype(np.int32)

    return ex, ey, bx, by, bz

def write_channels(list_of_stations,full_path_to_files):
    for i,station in enumerate(sorted(list_of_stations)):
        if i%size!=rank:
            continue
        channels = []
        for file in full_path_to_files:
            if station in file:
                channels.append(file)
            else:
                continue
        if len(channels) == len(raw_data_channels):
            run = '001'
            station_group = 'Experiment/Surveys/AusLAMP_Musgraves/Stations/'
            site_run_path = station_group+station+'/'+run
            channel_ex_path = site_run_path+'/'+'ex'
            channel_ey_path = site_run_path+'/'+'ey'
            channel_bx_path = site_run_path+'/'+'bx'
            channel_by_path = site_run_path+'/'+'by'
            channel_bz_path = site_run_path+'/'+'bz'

            channel_ex = f[channel_ex_path]
            channel_ey = f[channel_ey_path]
            channel_bx = f[channel_bx_path]
            channel_by = f[channel_by_path]
            channel_bz = f[channel_bz_path]

            ex,ey,bx,by,bz = channel_data_extraction(channels)

            channel_ex[:] = ex
            channel_ey[:] = ey
            channel_bx[:] = bx
            channel_by[:] = by
            channel_bz[:] = bz

            process = psutil.Process(os.getpid())
            print('this run took %d MB of memory' % (process.memory_info().rss / 1024 ** 2))
            print("Station number %d (%s) being done by processor %d of %d" % (i, station, rank, size))


        elif len(channels) > len(raw_data_channels):
            sort_files = sorted(channels)
            number_of_channels = len(raw_data_channels)
            split_lists = [sort_files[x:x+number_of_channels] for x in range(0, len(sort_files), number_of_channels)]
            for j,group in enumerate(split_lists):
                mrun_number = j+1
                run = "%03i" % mrun_number
                station_group = 'Experiment/Surveys/AusLAMP_Musgraves/Stations/'
                site_run_path = station_group+station+'/'+run
                channel_ex_path = site_run_path+'/'+'ex'
                channel_ey_path = site_run_path+'/'+'ey'
                channel_bx_path = site_run_path+'/'+'bx'
                channel_by_path = site_run_path+'/'+'by'
                channel_bz_path = site_run_path+'/'+'bz'

                channel_ex = f[channel_ex_path]
                channel_ey = f[channel_ey_path]
                channel_bx = f[channel_bx_path]
                channel_by = f[channel_by_path]
                channel_bz = f[channel_bz_path]

                ex,ey,bx,by,bz = channel_data_extraction(group)

                channel_ex[:] = ex
                channel_ey[:] = ey
                channel_bx[:] = bx
                channel_by[:] = by
                channel_bz[:] = bz

                process = psutil.Process(os.getpid())
                print('this run took %d MB of memory' % (process.memory_info().rss / 1024 ** 2))
                print("Station number %d (%s) being done by processor %d of %d" % (i, station, rank, size))

        elif len(channels) < len(raw_data_channels):
            print('you are likely missing some channels')

        else:
            print('something has gone wrong')


### write to mth5 file in parallel using the mpio driver

with h5py.File(h5_fn,'a',driver='mpio',comm=MPI.COMM_WORLD) as f:
    write_channels(stations_all,full_path_files)

### print total time for script to run

if rank==0:
    print('The script took {0} seconds !'.format(time.time()-startTime))

Now that we have created our mth5_muscle.py script, we next need to create a job submission script to submit to the Gadi PBSPro scheduler. The job submission script specifies the queue to use and the duration/resources needed for the job.

#!/bin/bash

#PBS -N mth5_mpi4py
#PBS -q hugemem
#PBS -P fp0
#PBS -l walltime=0:30:00
#PBS -l ncpus=96
#PBS -l mem=1000GB
#PBS -l jobfs=10GB
#PBS -l storage=gdata/fp0+gdata/my80+gdata/lm70+gdata/up99

module use /g/data/up99/modulefiles
module load NCI-geophys/22.06

cd ${PBS_O_WORKDIR}

mpirun -np $PBS_NCPUS python3 mth5_muscle.py > ./pbs_job_logs/$PBS_JOBID.log

For our jobscript above (mth5_muscle.sh), we have requested: 1. to use the hugemem queue 2. 96 CPUs (2 nodes) 3. 1 TB of memory 4. half an hour of walltime

We also need to declare the NCI project codes whose /g/data storage is used by mth5_muscle.py (via the #PBS -l storage directive). The project code up99 is required to make use of the NCI-geophys/22.06 module that contains Parallel HDF5.

To submit this jobscript (mth5_muscle.sh) on Gadi:

$ qsub mth5_muscle.sh

This job will process the 93 stations in our survey using 96 MPI ranks (one station per rank).

The following is a summary of how long mth5_muscle.py took to run for the AusLAMP Musgraves Province survey.

  • run on: Gadi

  • number of stations: 93

  • number of runs: 101

  • NCPUs used: 96 (2 nodes)

  • memory used: 736 GB

  • walltime used: 836 seconds (13m56s)

  • CPU time used: 48570 seconds (13h29m30s)

  • service units: 66.88
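
As a cross-check, the service-unit figure follows from the core count and walltime, assuming Gadi's hugemem queue is charged at 3 SU per core-hour (the charge rate is an assumption here; consult the NCI documentation for current rates):

```python
# Back-of-the-envelope check of the PBS job cost.
ncpus = 96
walltime_s = 836
charge_rate = 3  # SU per core-hour on the hugemem queue (assumption)

service_units = ncpus * (walltime_s / 3600) * charge_rate
print(f"{service_units:.2f}")  # -> 66.88
```
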
Concluding remarks

Generating one mth5 file for many stations can take a significant amount of time if no parallelism is introduced. For the Musgraves example above, if using the mth5 library alone it would have taken approximately 14 hours to generate our final mth5 file. By utilising Parallel HDF5 we have managed to reduce this time to approximately 35 minutes.
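
Those figures can be sanity-checked from the two job summaries above: the serial estimate is roughly the skeleton walltime plus the muscle job's aggregate CPU time, while the parallel total is the sum of the two walltimes:

```python
# Rough speedup estimate from the two job summaries (all times in seconds).
skeleton_walltime = 1264   # serial skeleton job
muscle_walltime = 836      # parallel write job (96 ranks)
muscle_cpu_time = 48570    # aggregate CPU time of the parallel write

serial_estimate = skeleton_walltime + muscle_cpu_time  # "approximately 14 hours"
parallel_total = skeleton_walltime + muscle_walltime   # 2100 s = 35 minutes

print(round(serial_estimate / 3600, 1))  # hours, serial
print(parallel_total // 60)              # minutes, parallel -> 35
```
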

For the “all stations in a single mth5 file” model, the bottleneck lies in generating the mth5 skeleton as this can’t be done in parallel. The authors have created the tutorial mth5_in_parallel_one_file_per_station that shows how to generate a single mth5 file per station in parallel, which yields much quicker results than the “all stations in a single mth5 file” model.

This example only dealt with the writing of MT time series data and did not consider mt_metadata. The mth5_in_parallel_one_file_per_station tutorial demonstrates how one could add mt_metadata into their mth5 automations.

Creating one mth5 file per station in parallel

In this tutorial, we will demonstrate how to generate one mth5 file per station (with associated mt_metadata) from a survey in parallel on HPC.

For this example, we will be converting Earth Data Logger (ASCII) time series data from 93 stations of the AusLAMP Musgraves Province survey (see https://dx.doi.org/10.25914/5eaa30d63bd17). The ASCII time series were concatenated per run for each station and the total volume of time series data was 106 GB. 90 of the 93 stations were a single run - stations SA246, SA299 and SA324-2 had multiple runs.

This example was tested on the National Computational Infrastructure’s Gadi HPC system. Gadi is Australia’s most powerful supercomputer, a highly parallel cluster comprising more than 200,000 processor cores on ten different types of compute nodes. The example also makes use of the NCI-geophysics 2022.06 module which contains the Python libraries used in this tutorial.

Building our mth5_onefileperstation.py code

In the mth5_in_parallel tutorial, we built a single mth5 file containing all stations in our survey. In that example, we had to construct two codes: 1. mth5_skeleton.py which created the mth5 file structure without adding in any time series data. This code could not be run in parallel as the processes involved were collective (see Collective versus independent operations in https://docs.h5py.org/en/stable/mpi.html). 2. mth5_muscle.py which added in the time series data for each station in parallel (i.e. one station per MPI rank).

For this example, we will be creating a single file per station with associated mt_metadata. As a result, we only need to create one code, as the generation of each file is independent (i.e. the different MPI ranks do not need to communicate with each other). The authors use mpi4py for this tutorial, but note that other libraries such as Dask or multiprocessing could also be used.
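
Since the per-station work is fully independent, the same fan-out could be sketched with the standard library instead of MPI. The snippet below uses multiprocessing.dummy (a thread-backed Pool with the same API, so the sketch runs anywhere); build_station_file is a hypothetical stand-in for the real per-station MTH5 builder, and for CPU-bound work on a single node you would swap in multiprocessing.Pool:

```python
# Illustrative only: fan one task per station out over a worker pool.
# build_station_file is a hypothetical stand-in for the real per-station MTH5 builder.
from multiprocessing.dummy import Pool  # thread-backed, same API as multiprocessing.Pool

def build_station_file(station):
    # Real code would create <station>.h5 and add runs, channels and mt_metadata here.
    return f"{station}.h5"

stations = ['SA246', 'SA299', 'SA324-2']  # trimmed station list for the sketch

with Pool(processes=4) as pool:
    written = pool.map(build_station_file, stations)

print(written)  # -> ['SA246.h5', 'SA299.h5', 'SA324-2.h5']
```

pool.map preserves input order, so the results line up with the station list regardless of which worker finished first.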

Our mth5_onefileperstation.py code requires the following Python libraries:

[ ]:
from mpi4py import MPI
import h5py
from mth5.mth5 import MTH5
import numpy as np
from os import path
import os, psutil
import glob
import nc_time_axis
import time

from mt_metadata import timeseries as metadata
from mt_metadata.utils.mttime import MTime
import json

startTime = time.time()

As we are using mpi4py, we will need to define the MPI communicator, rank and size:

[ ]:
comm = MPI.COMM_WORLD
rank = MPI.COMM_WORLD.rank
size = MPI.COMM_WORLD.size

Next, we will define our: 1. working directories and file paths 2. Earth Data Logger channels 3. survey stations 4. survey name and run number (for stations with a single run)

[ ]:
### define working directories and file paths

work_dir = '/g/data/.../.../merged_data_all'
mth5_output_directory = '/g/data/.../.../mth5_outdir_single_file_per_station'
full_path_to_mth5_files = sorted(glob.glob(mth5_output_directory+"/*"))
full_path_to_ascii_files = sorted(glob.glob(work_dir+"/*"))
mt_metadata_dir = '/g/data/.../.../mt_metadata_json'
full_path_to_mt_metadata = sorted(glob.glob(mt_metadata_dir+"/*"))
survey_file_name = 'survey.json'
survey_json = mt_metadata_dir+'/'+survey_file_name

### define raw time series data channels

raw_data_channels = ['EX','EY','BX','BY','BZ','TP','ambientTemperature']


### define the stations in our survey

stations_all = ['SA225-2','SA227',   'SA242',  'SA243',  'SA245',
                'SA247',  'SA248',   'SA249',  'SA250',  'SA251',
                'SA252',  'SA26W-2', 'SA270',  'SA271',  'SA272',
                'SA273',  'SA274-2', 'SA274',  'SA275',  'SA276',
                'SA277',  'SA293-2', 'SA294',  'SA295',  'SA296',
                'SA297',  'SA298',   'SA300',  'SA301',  'SA319',
                'SA320',  'SA320-2', 'SA321',  'SA322',  'SA323',
                'SA324',  'SA325-2', 'SA325',  'SA326N', 'SA326S',
                'SA344',  'SA344-2', 'SA345',  'SA346',  'SA347',
                'SA348',  'SA349',   'SA350',  'SA351',             ### 49 single run SA stations
                'WA10',   'WA13',    'WA14',   'WA15',   'WA26',
                'WA27',   'WA29',    'WA30',   'WA31',   'WA42',
                'WA43',   'WA44',    'WA45',   'WA46',   'WA47',
                'WA54',   'WA55',    'WA56',   'WA57',   'WA58',
                'WA60',   'WA61',    'WA62',   'WA63',   'WA64',
                'WA65',   'WA66',    'WA67',   'WA68',   'WA69',
                'WA70',   'WA71',    'WA72',   'WA73',   'WA74',
                'WA75',   'WANT19',  'WANT38', 'WANT45', 'WASA302',
                'WASA327',                          ### 41 single run WA stations
                'SA246',  'SA299',   'SA324-2']     ### 3 stations with multiple runs


### define the survey name and run number (for stations with a single run)

survey_name = "AusLAMP_Musgraves"
run_number = "001"

We will also be adding mt_metadata into our mth5 files. For this example, a single json file was created per run and the contents of each json file looked something like this:

{
     "electric_ex": {
         "channel_number": 0,
         "component": null,
         "data_quality.rating.value": 0,
         "dipole_length": null,
         "filter.applied": [
             false
         ],
         "filter.name": [],
         "measurement_azimuth": 0.0,
         "measurement_tilt": 0.0,
         "negative.elevation": 0.0,
         "negative.id": null,
         "negative.latitude": 0.0,
         "negative.longitude": 0.0,
         "negative.manufacturer": null,
         "negative.type": null,
         "positive.elevation": 0.0,
         "positive.id": null,
         "positive.latitude": 0.0,
         "positive.longitude": 0.0,
         "positive.manufacturer": null,
         "positive.type": null,
         "sample_rate": 10.0,
         "time_period.end": "1980-01-01T00:00:00+00:00",
         "time_period.start": "1980-01-01T00:00:00+00:00",
         "type": "electric",
         "units": null
     },

     "electric_ey": {
         "channel_number": 0,
         "component": null,
         "data_quality.rating.value": 0,
         "dipole_length": null,
         "filter.applied": [
             false
         ],
         "filter.name": [],
         "measurement_azimuth": 0.0,
         "measurement_tilt": 0.0,
         "negative.elevation": 0.0,
         "negative.id": null,
         "negative.latitude": 0.0,
         "negative.longitude": 0.0,
         "negative.manufacturer": null,
         "negative.type": null,
         "positive.elevation": 0.0,
         "positive.id": null,
         "positive.latitude": 0.0,
         "positive.longitude": 0.0,
         "positive.manufacturer": null,
         "positive.type": null,
         "sample_rate": 10.0,
         "time_period.end": "1980-01-01T00:00:00+00:00",
         "time_period.start": "1980-01-01T00:00:00+00:00",
         "type": "electric",
         "units": null
     },


     "magnetic_bx": {
         "channel_number": 0,
         "component": null,
         "data_quality.rating.value": 0,
         "filter.applied": [
             false
         ],
         "filter.name": [],
         "location.elevation": 0.0,
         "location.latitude": 0.0,
         "location.longitude": 0.0,
         "measurement_azimuth": 0.0,
         "measurement_tilt": 0.0,
         "sample_rate": 10.0,
         "sensor.id": null,
         "sensor.manufacturer": null,
         "sensor.type": null,
         "time_period.end": "1980-01-01T00:00:00+00:00",
         "time_period.start": "1980-01-01T00:00:00+00:00",
         "type": "magnetic",
         "units": null
     },


     "magnetic_by": {
         "channel_number": 0,
         "component": null,
         "data_quality.rating.value": 0,
         "filter.applied": [
             false
         ],
         "filter.name": [],
         "location.elevation": 0.0,
         "location.latitude": 0.0,
         "location.longitude": 0.0,
         "measurement_azimuth": 0.0,
         "measurement_tilt": 0.0,
         "sample_rate": 10.0,
         "sensor.id": null,
         "sensor.manufacturer": null,
         "sensor.type": null,
         "time_period.end": "1980-01-01T00:00:00+00:00",
         "time_period.start": "1980-01-01T00:00:00+00:00",
         "type": "magnetic",
         "units": null
     },


     "magnetic_bz": {
         "channel_number": 0,
         "component": null,
         "data_quality.rating.value": 0,
         "filter.applied": [
             false
         ],
         "filter.name": [],
         "location.elevation": 0.0,
         "location.latitude": 0.0,
         "location.longitude": 0.0,
         "measurement_azimuth": 0.0,
         "measurement_tilt": 0.0,
         "sample_rate": 10.0,
         "sensor.id": null,
         "sensor.manufacturer": null,
         "sensor.type": null,
         "time_period.end": "1980-01-01T00:00:00+00:00",
         "time_period.start": "1980-01-01T00:00:00+00:00",
         "type": "magnetic",
         "units": null
     },


     "run": {
         "channels_recorded_auxiliary": [],
         "channels_recorded_electric": [],
         "channels_recorded_magnetic": [],
         "data_logger.firmware.author": null,
         "data_logger.firmware.name": null,
         "data_logger.firmware.version": null,
         "data_logger.id": null,
         "data_logger.manufacturer": null,
         "data_logger.timing_system.drift": 0.0,
         "data_logger.timing_system.type": "GPS",
         "data_logger.timing_system.uncertainty": 0.0,
         "data_logger.type": null,
         "data_type": "LPMT",
         "id": null,
         "sample_rate": 10.0,
         "time_period.end": "1980-01-01T00:00:00+00:00",
         "time_period.start": "1980-01-01T00:00:00+00:00"
     },

     "station": {
         "acquired_by.name": null,
         "channels_recorded": [],
         "data_type": "LPMT",
         "geographic_name": null,
         "id": null,
         "location.declination.model": "WMM",
         "location.declination.value": 0.0,
         "location.elevation": 0.0,
         "location.latitude": 0.0,
         "location.longitude": 0.0,
         "orientation.method": null,
         "orientation.reference_frame": "geographic",
         "provenance.creation_time": "1980-01-01T00:00:00+00:00",
         "provenance.software.author": "none",
         "provenance.software.name": null,
         "provenance.software.version": null,
         "provenance.submitter.email": null,
         "provenance.submitter.organization": null,
         "run_list": [],
         "time_period.end": "1980-01-01T00:00:00+00:00",
         "time_period.start": "1980-01-01T00:00:00+00:00"
     },

     "survey": {
         "citation_dataset.doi": null,
         "citation_journal.doi": null,
         "country": null,
         "datum": "WGS84",
         "geographic_name": null,
         "id": null,
         "name": null,
         "northwest_corner.latitude": 0.0,
         "northwest_corner.longitude": 0.0,
         "project": null,
         "project_lead.email": null,
         "project_lead.organization": null,
         "release_license": "CC-0",
         "southeast_corner.latitude": 0.0,
         "southeast_corner.longitude": 0.0,
         "summary": null,
         "time_period.end_date": "1980-01-01",
         "time_period.start_date": "1980-01-01"
     },
 }

Note that the mt_metadata in these json files were synthetic and were mainly used to show how mt_metadata could be added into our automation.
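Templates like the one above can be generated programmatically with the standard library. The sketch below reproduces only a few of the keys for brevity; the values are the same synthetic defaults shown above:

```python
# Build an abbreviated version of the per-run json template shown above.
import json

template = {
    "run": {"data_type": "LPMT", "sample_rate": 10.0, "id": None},
    "station": {"id": None, "location.latitude": 0.0,
                "orientation.reference_frame": "geographic"},
}

# indent/sort_keys match the pretty-printed layout of the template above.
text = json.dumps(template, indent=4, sort_keys=True)
print(text)
```

Writing one such file per run (e.g. with `json.dump` to `<station>.json`) produces the inputs consumed later by `create_mth5_group_station_run_channel`.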

We will now define functions that will be used to create our MTH5 files:

[ ]:
def make_mth5_dir(mth5_output_directory):
### creates the mth5 output directory if it doesn't already exist
    try:
        os.makedirs(mth5_output_directory)
    except FileExistsError:
        # directory already exists
        print('directory already exists!')
        pass


def remove_existing_mth5_files(mth5_files):
### removes existing mth5 files if they exist
    for mth5_file in mth5_files:
        if path.exists(mth5_file):
            os.unlink(mth5_file)
            print("INFO: Removed existing file {}".format(mth5_file))
        else:
            print("File does not exist")


def channel_data_extraction(channels):
### extracts electromagnetic time series data from concatenated (per run) Earth Data Logger
### ASCII time series files
    EX = [file for file in channels if file.endswith('EX')]
    EY = [file for file in channels if file.endswith('EY')]
    BX = [file for file in channels if file.endswith('BX')]
    BY = [file for file in channels if file.endswith('BY')]
    BZ = [file for file in channels if file.endswith('BZ')]
    with open(EX[0], 'r') as file:
        EX1 = file.read().splitlines()
        ex_ts = np.array(EX1).astype(np.int32)
    with open(EY[0], 'r') as file:
        EY1 = file.read().splitlines()
        ey_ts = np.array(EY1).astype(np.int32)
    with open(BX[0], 'r') as file:
        BX1 = file.read().splitlines()
        bx_ts = np.array(BX1).astype(np.int32)
    with open(BY[0], 'r') as file:
        BY1 = file.read().splitlines()
        by_ts = np.array(BY1).astype(np.int32)
    with open(BZ[0], 'r') as file:
        BZ1 = file.read().splitlines()
        bz_ts = np.array(BZ1).astype(np.int32)

    return ex_ts, ey_ts, bx_ts, by_ts, bz_ts


def create_mth5_group_station_run_channel(station,mt_metadata):
### creates the mth5 groups, stations, runs and channels for each mth5 file
### and populates mt_metadata from the relevant json file
    with open(mt_metadata[0], 'r') as json_file:
        json_load = json.load(json_file)

    station_dict = json_load['station']
    add_station = m.add_station(station, survey=survey_name)
    add_station.metadata.from_dict(station_dict)
    add_station.write_metadata()

    channels = []
    for file in full_path_to_ascii_files:
        if station in file:
            channels.append(file)
        else:
            continue

### for stations with a single run:
    if len(channels) == len(raw_data_channels):
        run_dict = json_load['run']
        ex_dict = json_load['electric_ex']
        ey_dict = json_load['electric_ey']
        bx_dict = json_load['magnetic_bx']
        by_dict = json_load['magnetic_by']
        bz_dict = json_load['magnetic_bz']

        add_run = m.add_run(station, run_number, survey=survey_name)
        add_run.metadata.from_dict(run_dict)
        add_run.write_metadata()

        ex_ts,ey_ts,bx_ts,by_ts,bz_ts = channel_data_extraction(channels)

        ex = m.add_channel(station, run_number, "ex", "electric", ex_ts, survey=survey_name)
        ey = m.add_channel(station, run_number, "ey", "electric", ey_ts, survey=survey_name)
        bx = m.add_channel(station, run_number, "bx", "magnetic", bx_ts, survey=survey_name)
        by = m.add_channel(station, run_number, "by", "magnetic", by_ts, survey=survey_name)
        bz = m.add_channel(station, run_number, "bz", "magnetic", bz_ts, survey=survey_name)

        ex.metadata.from_dict(ex_dict)
        ex.write_metadata()
        ey.metadata.from_dict(ey_dict)
        ey.write_metadata()
        bx.metadata.from_dict(bx_dict)
        bx.write_metadata()
        by.metadata.from_dict(by_dict)
        by.write_metadata()
        bz.metadata.from_dict(bz_dict)
        bz.write_metadata()

### for stations with multiple runs:
    elif len(channels) > len(raw_data_channels):
        sort_files = sorted(channels)
        number_of_channels = len(raw_data_channels)
        split_lists = [sort_files[x:x+number_of_channels] for x in range(0, len(sort_files), number_of_channels)]
        mt_metadata_files = sorted(mt_metadata)
        for i, (group,mt_meta) in enumerate(zip(split_lists,mt_metadata_files)):
            mrun_number = i+1
            run = "{:03d}".format(mrun_number)
            with open(mt_meta, 'r') as json_file:
                json_load = json.load(json_file)

            run_dict = json_load['run']
            ex_dict = json_load['electric_ex']
            ey_dict = json_load['electric_ey']
            bx_dict = json_load['magnetic_bx']
            by_dict = json_load['magnetic_by']
            bz_dict = json_load['magnetic_bz']

            add_run = m.add_run(station, run, survey=survey_name)
            add_run.metadata.from_dict(run_dict)
            add_run.write_metadata()

            ex_ts,ey_ts,bx_ts,by_ts,bz_ts = channel_data_extraction(group)

            ex = m.add_channel(station, run, "ex", "electric", ex_ts, survey=survey_name)
            ey = m.add_channel(station, run, "ey", "electric", ey_ts, survey=survey_name)
            bx = m.add_channel(station, run, "bx", "magnetic", bx_ts, survey=survey_name)
            by = m.add_channel(station, run, "by", "magnetic", by_ts, survey=survey_name)
            bz = m.add_channel(station, run, "bz", "magnetic", bz_ts, survey=survey_name)

            ex.metadata.from_dict(ex_dict)
            ex.write_metadata()
            ey.metadata.from_dict(ey_dict)
            ey.write_metadata()
            bx.metadata.from_dict(bx_dict)
            bx.write_metadata()
            by.metadata.from_dict(by_dict)
            by.write_metadata()
            bz.metadata.from_dict(bz_dict)
            bz.write_metadata()

    elif len(channels) < len(raw_data_channels):
        print('you are likely missing some channels')
        print(station)

    else:
        print('something has gone wrong')

The final step is to create our mth5 files. For this we will create a single mth5 file per station, with stations distributed across the MPI ranks. As each rank works independently, all files can be written in parallel. Additionally, we can add compression to each file because we are not using the Parallel HDF5 library (which doesn’t support compression) that was used in the mth5_in_parallel tutorial. For this example, we will use “gzip” level 4 compression.

[ ]:
### create mth5 output directory and remove existing files

if rank==0:
    make_mth5_dir(mth5_output_directory)
    remove_existing_mth5_files(full_path_to_mth5_files)

comm.Barrier()

### create a single mth5 file per station with compression and associated mt_metadata

for i,station in enumerate(sorted(stations_all)):
    if i%size!=rank:
        continue
    m = MTH5(file_version='0.2.0',shuffle=None,fletcher32=None,compression="gzip",compression_opts=4)
    hdf5_filename = '{}.h5'.format(station)
    h5_fn = mth5_output_directory+'/'+hdf5_filename
    m.open_mth5(h5_fn, "w")
    survey_group = m.add_survey(survey_name)
    with open(survey_json, 'r') as json_file:
        json_load = json.load(json_file)
    survey_dict = json_load['survey']
    survey_group.metadata.from_dict(survey_dict)
    survey_group.write_metadata()
    mt_metadata_file_name = '{}.json'.format(station)
    mt_metadata = [file for file in full_path_to_mt_metadata if file.endswith(mt_metadata_file_name)]
    create_mth5_group_station_run_channel(station,mt_metadata)
    m.close_mth5()


comm.Barrier()

### print total time to run script
if rank==0:
    print('The script took {0} seconds !'.format(time.time()-startTime))
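The gzip settings used above can be spot-checked with h5py. The snippet below is self-contained: it writes a small dataset with the same options and reads them back (the file name is illustrative):

```python
# Verify that gzip level-4 settings round-trip through an HDF5 file.
import h5py
import numpy as np

with h5py.File("compression_check.h5", "w") as f:
    f.create_dataset("ts", data=np.arange(1000, dtype=np.int32),
                     compression="gzip", compression_opts=4)

with h5py.File("compression_check.h5", "r") as f:
    comp = f["ts"].compression        # compression filter name
    opts = f["ts"].compression_opts   # gzip level

print(comp, opts)
```

The same inspection works on the station files produced by this tutorial, since MTH5 passes these keyword arguments through to h5py when creating datasets.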

Putting this all together into a Python script (mth5_onefileperstation.py):

[ ]:
from mpi4py import MPI
import h5py
from mth5.mth5 import MTH5
import numpy as np
from os import path
import os, psutil
import glob
import nc_time_axis
import time

from mt_metadata import timeseries as metadata
from mt_metadata.utils.mttime import MTime
import json

startTime = time.time()

### define MPI comm, rank and size

comm = MPI.COMM_WORLD
rank = MPI.COMM_WORLD.rank
size = MPI.COMM_WORLD.size



### define working directories and file paths

work_dir = '/g/data/.../.../merged_data_all'
mth5_output_directory = '/g/data/.../.../mth5_outdir_single_file_per_station'
full_path_to_mth5_files = sorted(glob.glob(mth5_output_directory+"/*"))
full_path_to_ascii_files = sorted(glob.glob(work_dir+"/*"))
mt_metadata_dir = '/g/data/.../.../mt_metadata_json'
full_path_to_mt_metadata = sorted(glob.glob(mt_metadata_dir+"/*"))
survey_file_name = 'survey.json'
survey_json = mt_metadata_dir+'/'+survey_file_name

### define raw time series data channels

raw_data_channels = ['EX','EY','BX','BY','BZ','TP','ambientTemperature']


### define stations to go into mth5 file

stations_all = ['SA225-2','SA227',   'SA242',  'SA243',  'SA245',
                'SA247',  'SA248',   'SA249',  'SA250',  'SA251',
                'SA252',  'SA26W-2', 'SA270',  'SA271',  'SA272',
                'SA273',  'SA274-2', 'SA274',  'SA275',  'SA276',
                'SA277',  'SA293-2', 'SA294',  'SA295',  'SA296',
                'SA297',  'SA298',   'SA300',  'SA301',  'SA319',
                'SA320',  'SA320-2', 'SA321',  'SA322',  'SA323',
                'SA324',  'SA325-2', 'SA325',  'SA326N', 'SA326S',
                'SA344',  'SA344-2', 'SA345',  'SA346',  'SA347',
                'SA348',  'SA349',   'SA350',  'SA351',             ### 49 single run SA stations
                'WA10',   'WA13',    'WA14',   'WA15',   'WA26',
                'WA27',   'WA29',    'WA30',   'WA31',   'WA42',
                'WA43',   'WA44',    'WA45',   'WA46',   'WA47',
                'WA54',   'WA55',    'WA56',   'WA57',   'WA58',
                'WA60',   'WA61',    'WA62',   'WA63',   'WA64',
                'WA65',   'WA66',    'WA67',   'WA68',   'WA69',
                'WA70',   'WA71',    'WA72',   'WA73',   'WA74',
                'WA75',   'WANT19',  'WANT38', 'WANT45', 'WASA302',
                'WASA327',                          ### 41 single run WA stations
                'SA246',  'SA299',   'SA324-2']     ### 3 stations with multiple runs


### define survey name and run number (for stations with a single run)

survey_name = "AusLAMP_Musgraves"
run_number = "001"


### define functions

def make_mth5_dir(mth5_output_directory):
### creates mth5 output directory if it doesn't already exist
    try:
        os.makedirs(mth5_output_directory)
    except FileExistsError:
        # directory already exists
        print('directory already exists!')
        pass


def remove_existing_mth5_files(mth5_files):
### removes existing mth5 files if they exist
    for mth5_file in mth5_files:
        if path.exists(mth5_file):
            os.unlink(mth5_file)
            print("INFO: Removed existing file {}".format(mth5_file))
        else:
            print("File does not exist")


def channel_data_extraction(channels):
### extracts electromagnetic time series data from concatenated (per run) Earth Data Logger ASCII time series files
    EX = [file for file in channels if file.endswith('EX')]
    EY = [file for file in channels if file.endswith('EY')]
    BX = [file for file in channels if file.endswith('BX')]
    BY = [file for file in channels if file.endswith('BY')]
    BZ = [file for file in channels if file.endswith('BZ')]
    with open(EX[0], 'r') as file:
        EX1 = file.read().splitlines()
        ex_ts = np.array(EX1).astype(np.int32)
    with open(EY[0], 'r') as file:
        EY1 = file.read().splitlines()
        ey_ts = np.array(EY1).astype(np.int32)
    with open(BX[0], 'r') as file:
        BX1 = file.read().splitlines()
        bx_ts = np.array(BX1).astype(np.int32)
    with open(BY[0], 'r') as file:
        BY1 = file.read().splitlines()
        by_ts = np.array(BY1).astype(np.int32)
    with open(BZ[0], 'r') as file:
        BZ1 = file.read().splitlines()
        bz_ts = np.array(BZ1).astype(np.int32)

    return ex_ts, ey_ts, bx_ts, by_ts, bz_ts


def create_mth5_group_station_run_channel(station,mt_metadata):
### creates the mth5 groups, stations, runs and channels for each mth5 file
### and populates mt_metadata from the relevant json file
    with open(mt_metadata[0], 'r') as json_file:
        json_load = json.load(json_file)

    station_dict = json_load['station']
    add_station = m.add_station(station, survey=survey_name)
    add_station.metadata.from_dict(station_dict)
    add_station.write_metadata()

    channels = []
    for file in full_path_to_ascii_files:
        if station in file:
            channels.append(file)
        else:
            continue
### for stations with a single run:
    if len(channels) == len(raw_data_channels):
        run_dict = json_load['run']
        ex_dict = json_load['electric_ex']
        ey_dict = json_load['electric_ey']
        bx_dict = json_load['magnetic_bx']
        by_dict = json_load['magnetic_by']
        bz_dict = json_load['magnetic_bz']

        add_run = m.add_run(station, run_number, survey=survey_name)
        add_run.metadata.from_dict(run_dict)
        add_run.write_metadata()

        ex_ts,ey_ts,bx_ts,by_ts,bz_ts = channel_data_extraction(channels)

        ex = m.add_channel(station, run_number, "ex", "electric", ex_ts, survey=survey_name)
        ey = m.add_channel(station, run_number, "ey", "electric", ey_ts, survey=survey_name)
        bx = m.add_channel(station, run_number, "bx", "magnetic", bx_ts, survey=survey_name)
        by = m.add_channel(station, run_number, "by", "magnetic", by_ts, survey=survey_name)
        bz = m.add_channel(station, run_number, "bz", "magnetic", bz_ts, survey=survey_name)

        ex.metadata.from_dict(ex_dict)
        ex.write_metadata()
        ey.metadata.from_dict(ey_dict)
        ey.write_metadata()
        bx.metadata.from_dict(bx_dict)
        bx.write_metadata()
        by.metadata.from_dict(by_dict)
        by.write_metadata()
        bz.metadata.from_dict(bz_dict)
        bz.write_metadata()

### for stations with multiple runs:
    elif len(channels) > len(raw_data_channels):
        sort_files = sorted(channels)
        number_of_channels = len(raw_data_channels)
        split_lists = [sort_files[x:x+number_of_channels] for x in range(0, len(sort_files), number_of_channels)]
        mt_metadata_files = sorted(mt_metadata)
        for i, (group,mt_meta) in enumerate(zip(split_lists,mt_metadata_files)):
            mrun_number = i+1
            run = "{:03d}".format(mrun_number)
            with open(mt_meta, 'r') as json_file:
                json_load = json.load(json_file)

            run_dict = json_load['run']
            ex_dict = json_load['electric_ex']
            ey_dict = json_load['electric_ey']
            bx_dict = json_load['magnetic_bx']
            by_dict = json_load['magnetic_by']
            bz_dict = json_load['magnetic_bz']

            add_run = m.add_run(station, run, survey=survey_name)
            add_run.metadata.from_dict(run_dict)
            add_run.write_metadata()

            ex_ts,ey_ts,bx_ts,by_ts,bz_ts = channel_data_extraction(group)

            ex = m.add_channel(station, run, "ex", "electric", ex_ts, survey=survey_name)
            ey = m.add_channel(station, run, "ey", "electric", ey_ts, survey=survey_name)
            bx = m.add_channel(station, run, "bx", "magnetic", bx_ts, survey=survey_name)
            by = m.add_channel(station, run, "by", "magnetic", by_ts, survey=survey_name)
            bz = m.add_channel(station, run, "bz", "magnetic", bz_ts, survey=survey_name)

            ex.metadata.from_dict(ex_dict)
            ex.write_metadata()
            ey.metadata.from_dict(ey_dict)
            ey.write_metadata()
            bx.metadata.from_dict(bx_dict)
            bx.write_metadata()
            by.metadata.from_dict(by_dict)
            by.write_metadata()
            bz.metadata.from_dict(bz_dict)
            bz.write_metadata()

    elif len(channels) < len(raw_data_channels):
        print('you are likely missing some channels')
        print(station)

    else:
        print('something has gone wrong')


### create mth5 output directory and remove existing files

if rank==0:
    make_mth5_dir(mth5_output_directory)
    remove_existing_mth5_files(full_path_to_mth5_files)

comm.Barrier()

### create a single mth5 file per station with compression and associated mt_metadata

for i,station in enumerate(sorted(stations_all)):
    if i%size!=rank:
        continue
    m = MTH5(file_version='0.2.0',shuffle=None,fletcher32=None,compression="gzip",compression_opts=4)
    hdf5_filename = '{}.h5'.format(station)
    h5_fn = mth5_output_directory+'/'+hdf5_filename
    m.open_mth5(h5_fn, "w")
    survey_group = m.add_survey(survey_name)
    with open(survey_json, 'r') as json_file:
        json_load = json.load(json_file)
    survey_dict = json_load['survey']
    survey_group.metadata.from_dict(survey_dict)
    survey_group.write_metadata()
    mt_metadata_file_name = '{}.json'.format(station)
    mt_metadata = [file for file in full_path_to_mt_metadata if file.endswith(mt_metadata_file_name)]
    create_mth5_group_station_run_channel(station,mt_metadata)
    m.close_mth5()


comm.Barrier()

### print total time to run script
if rank==0:
    print('The script took {0} seconds !'.format(time.time()-startTime))

Now that we have created our mth5_onefileperstation.py script, we next need to make a job submission script to submit to the Gadi PBSPro scheduler. The job submission script specifies the queue to use and the duration/resources needed for the job.

#!/bin/bash

#PBS -N mth5_onefileperstation
#PBS -q hugemem
#PBS -P fp0
#PBS -l walltime=0:05:00
#PBS -l ncpus=96
#PBS -l mem=900GB
#PBS -l jobfs=10GB
#PBS -l storage=gdata/fp0+gdata/my80+gdata/lm70+gdata/up99

module use /g/data/up99/modulefiles
module load NCI-geophys/22.06

cd ${PBS_O_WORKDIR}

mpirun -np $PBS_NCPUS python3 mth5_onefileperstation.py > ./pbs_job_logs/$PBS_JOBID.log

For our jobscript above (mth5_onefileperstation.sh), we have requested: 1. to use the hugemem queue 2. 96 CPUs (2 nodes) 3. 900GB of memory 4. 5 minutes of walltime

We also need to define the NCI project codes used by the job via the storage directive in our jobscript. The project code up99 is required to make use of the NCI-geophys/22.06 module that contains the Python libraries used in this tutorial.

To submit this jobscript (mth5_onefileperstation.sh) on Gadi:

$ qsub mth5_onefileperstation.sh

This job processes the 93 stations in our survey using 96 MPI ranks (at most one station per rank) and creates the following 93 mth5 files:

SA225-2.h5  SA252.h5    SA293-2.h5  SA320.h5    SA344.h5  WA15.h5  WA47.h5  WA65.h5  WANT19.h5
SA227.h5    SA26W-2.h5  SA294.h5    SA321.h5    SA345.h5  WA26.h5  WA54.h5  WA66.h5  WANT38.h5
SA242.h5    SA270.h5    SA295.h5    SA322.h5    SA346.h5  WA27.h5  WA55.h5  WA67.h5  WANT45.h5
SA243.h5    SA271.h5    SA296.h5    SA323.h5    SA347.h5  WA29.h5  WA56.h5  WA68.h5  WASA302.h5
SA245.h5    SA272.h5    SA297.h5    SA324-2.h5  SA348.h5  WA30.h5  WA57.h5  WA69.h5  WASA327.h5
SA246.h5    SA273.h5    SA298.h5    SA324.h5    SA349.h5  WA31.h5  WA58.h5  WA70.h5
SA247.h5    SA274-2.h5  SA299.h5    SA325-2.h5  SA350.h5  WA42.h5  WA60.h5  WA71.h5
SA248.h5    SA274.h5    SA300.h5    SA325.h5    SA351.h5  WA43.h5  WA61.h5  WA72.h5
SA249.h5    SA275.h5    SA301.h5    SA326N.h5   WA10.h5   WA44.h5  WA62.h5  WA73.h5
SA250.h5    SA276.h5    SA319.h5    SA326S.h5   WA13.h5   WA45.h5  WA63.h5  WA74.h5
SA251.h5    SA277.h5    SA320-2.h5  SA344-2.h5  WA14.h5   WA46.h5  WA64.h5  WA75.h5

Below is a summary of how long mth5_onefileperstation.py took to run for the AusLAMP Musgraves Province survey:

run on: Gadi
number of stations: 93
number of runs: 101
NCPUs used: 96 (2 nodes)
memory used: 791 GB
walltime used: 215 seconds (3m35s)
CPU time used: 18367 seconds (5h06m07s)
service units: 17.20

Concluding remarks

We have generated 93 mth5 files for the 93 stations from the Musgraves survey in 3 minutes and 35 seconds using 2 nodes (96 CPUs) on Gadi. For comparison, if we only used one CPU it would have taken approximately 5 hours to create these files.

In our mth5_in_parallel example, we generated a single mth5 file with all stations, which took approximately 35 minutes of walltime (or ~14 hours of CPU time). The “single file per station” model was much quicker because it did not require the creation of an mth5 skeleton: each process (rank) ran independently.

We were also able to introduce compression into our mth5_onefileperstation.py code, and the total volume of the 93 mth5 files was 14 GB. The mth5_in_parallel.ipynb tutorial followed the “all stations in a single mth5 file” model and, as our version of Parallel HDF5 did not support compression, the total volume of the single mth5 file was 106 GB.

MT Metadata

See mt_metadata documentation for more information on magnetotelluric time series and transfer function metadata standards.

mth5

mth5 package

Subpackages
mth5.clients package
Submodules
mth5.clients.make_mth5 module
Make MTH5

This module provides helper functions to make an MTH5 file from various clients

Supported Clients include:

  • FDSN (through Obspy)

  • Science Base (TODO)

  • NCI - Australia (TODO)

Updated on Wed Aug 25 19:57:00 2021

@author: jpeacock + tronan

class mth5.clients.make_mth5.MakeMTH5(mth5_version='0.2.0', interact=False, save_path=None, **kwargs)[source]

Bases: object

from_fdsn_client(request_df, client='IRIS', **kwargs)[source]

Pull data from an FDSN archive like IRIS. Uses Obspy.Clients.

Parameters
  • request_df (pandas.DataFrame) –

    DataFrame with columns

    • ’network’ –> FDSN Network code

    • ’station’ –> FDSN Station code

    • ’location’ –> FDSN Location code

    • ’channel’ –> FDSN Channel code

    • ’start’ –> Start time YYYY-MM-DDThh:mm:ss

    • ’end’ –> End time YYYY-MM-DDThh:mm:ss

  • client (string, optional) – FDSN client name, defaults to “IRIS”

  • interact (bool) – Boolean to keep the created MTH5 file open or not

Raises
  • AttributeError – If the input DataFrame is not properly formatted, an AttributeError will be raised.

  • ValueError – If the values of the DataFrame are not correct a ValueError will be raised.

Returns

MTH5 file name

Return type

pathlib.Path

Note

If any of the column values are blank, then any matching value will be searched for. For example, if you leave ‘station’ blank, any station within the given start and end time will be returned.
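A request DataFrame following the column layout above might be built like this; the network, station, channel and time values are illustrative only:

```python
# Build a one-row FDSN request DataFrame with the documented columns.
import pandas as pd

request_df = pd.DataFrame(
    {
        "network": ["EM"],
        "station": ["ORF08"],
        "location": [""],
        "channel": ["*F*"],
        "start": ["2015-01-08T00:00:00"],
        "end": ["2015-01-09T00:00:00"],
    }
)
print(request_df.columns.tolist())
```

This DataFrame could then be passed to from_fdsn_client(request_df) on a MakeMTH5 instance.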

from_usgs_geomag(request_df, **kwargs)[source]

Download geomagnetic observatory data from USGS webservices into an MTH5 using a request dataframe or csv file.

  • observatory: Geomagnetic observatory ID

  • type: type of data to get ‘adjusted’

  • start: start date time to request UTC

  • end: end date time to request UTC

  • elements: components to get

  • sampling_period: time between measurements, in seconds

Parameters

request_df (pandas.DataFrame, str or Path if csv file) –

DataFrame with columns

  • ’observatory’ –> Observatory code

  • ’type’ –> data type [ ‘variation’ | ‘adjusted’ | ‘quasi-definitive’ | ‘definitive’ ]

  • ’elements’ –> Elements to get [D, DIST, DST, E, E-E, E-N, F, G, H, SQ, SV, UK1, UK2, UK3, UK4, X, Y, Z]

  • ’sampling_period’ –> sample period [ 1 | 60 | 3600 ]

  • ’start’ –> Start time YYYY-MM-DDThh:mm:ss

  • ’end’ –> End time YYYY-MM-DDThh:mm:ss

Returns

if interact is True an MTH5 object is returned otherwise the path to the file is returned

Return type

Path or mth5.mth5.MTH5
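As a sketch, a geomagnetic request dataframe following the column description above can be built like this; the observatory code and dates are illustrative values:

```python
import pandas as pd

# Illustrative USGS geomag request; 'frn' and the dates are example
# values only.
request_df = pd.DataFrame(
    {
        "observatory": ["frn", "frn"],
        "type": ["adjusted", "adjusted"],
        "elements": [["x", "y"], ["x", "y"]],
        "sampling_period": [1, 1],
        "start": ["2022-01-01T00:00:00", "2022-01-03T00:00:00"],
        "end": ["2022-01-02T00:00:00", "2022-01-04T00:00:00"],
    }
)

# The download itself requires network access:
# m = MakeMTH5(interact=False).from_usgs_geomag(request_df)
```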

from_zen(data_path, sample_rates=[4096, 1024, 256], calibration_path=None, survey_id=None, combine=True, **kwargs)[source]

Create an MTH5 from zen data.

Parameters
  • data_path (str or pathlib.Path) – path to the zen data files

  • sample_rates (list of int, optional) – sample rates to look for, defaults to [4096, 1024, 256]

  • save_path (str or pathlib.Path, optional) – path to save the MTH5 file to, defaults to None

  • calibration_path (str or pathlib.Path, optional) – path to calibration data, defaults to None

Returns

DESCRIPTION

Return type

TYPE

Module contents
class mth5.clients.FDSN(client='IRIS', mth5_version='0.2.0', **kwargs)[source]

Bases: object

build_network_dict(df, client)[source]

Build out a dictionary of networks, keyed by (network_id, start_time). We could return this dict and use it as an auxiliary variable, but it seems easier to just add a column to the df.

Parameters

df (pd.DataFrame) – This is a “request_df”

build_station_dict(df, client, networks_dict)[source]

Given the {network-id, starttime}-keyed dict of networks, we build a station layer below this

Parameters
  • df

  • networks_dict

get_df_from_inventory(inventory)[source]

Create a data frame from an inventory object

Parameters

inventory (obspy.Inventory) – inventory object

Returns

dataframe in proper format

Return type

pandas.DataFrame

get_fdsn_channel_map()[source]
get_inventory_from_df(df, client=None, data=True)[source]

Get an obspy.Inventory object from a pandas.DataFrame.

Developer note (20230806): The nested for-looping here can make debugging complex and lead to redundancies. I propose that we build out a dictionary of networks, keyed by network_id and start_time. It may actually be simpler to just add a column to the request_df that has the network_obj:

networks = {}
networks[network_id] = {}
networks[network_id][start_time_1] = obspy_network_obj
networks[network_id][start_time_2] = obspy_network_obj
...

Then the role of “returned_network” can be replaced by accessing the appropriate element, and the second for-loop can move up by a layer of indentation. Will try to factor this out later.

Parameters
  • df (pandas.DataFrame) –

    DataFrame with columns

    • ’network’ –> FDSN Network code

    • ’station’ –> FDSN Station code

    • ’location’ –> FDSN Location code

    • ’channel’ –> FDSN Channel code

    • ’start’ –> Start time YYYY-MM-DDThh:mm:ss

    • ’end’ –> End time YYYY-MM-DDThh:mm:ss

  • client (string) – FDSN client

  • data (boolean, optional) – True if you want data, False if you want just metadata; defaults to True

Returns

An inventory of requested metadata and data

Return type

obspy.Inventory and obspy.Stream

Note

If any of the column values are blank, then any value will be searched for. For example, if you leave ‘station’ blank, any station within the given start and end time will be returned.

get_run_group(mth5_obj_or_survey, station_id, run_id)[source]

This method is key to merging wrangle_runs_into_containers_v1 and wrangle_runs_into_containers_v2, because a v0.1.0 mth5 object can get a station group with the same method as a v0.2.0 survey_group.

Thus we can replace run_group = m.stations_group.get_station(station_id).add_run(run_id) and run_group = survey_group.stations_group.get_station(station_id).add_run(run_id) with run_group = mth5_obj_or_survey.stations_group.get_station(station_id).add_run(run_id)

Parameters

mth5_obj_or_survey (mth5.mth5.MTH5 or mth5.groups.survey.SurveyGroup)

get_run_list_from_station_id(m, station_id, survey_id=None)[source]

ignored_groups created to address issue #153. This might be better placed closer to the core of mth5.

Parameters
  • m

  • station_id

Returns

run_list

Return type

list of strings

get_station_streams(station_id)[source]

Get streams for a certain station

get_unique_networks_and_stations(df)[source]

Get unique lists of networks, stations, locations, and channels from a given data frame.

[{‘network’: FDSN code, “stations”: [list of stations for network]}]

Parameters

df (pandas.DataFrame) – request data frame

Returns

list of network dictionaries with [{‘network’: FDSN code, “stations”: [list of stations for network]}]

Return type

list
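The grouping this method performs can be sketched with plain pandas; this is a simplified stand-in for the actual implementation, assuming only that the request dataframe has ‘network’ and ‘station’ columns:

```python
import pandas as pd

request_df = pd.DataFrame(
    {
        "network": ["8P", "8P", "EM"],
        "station": ["CAS04", "CAS04", "ORF08"],
    }
)

# One dictionary per network, each listing its unique stations.
network_list = [
    {"network": network, "stations": sorted(group["station"].unique())}
    for network, group in request_df.groupby("network")
]
```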

get_waveforms_from_request_row(client, row)[source]
Parameters
  • client

  • row

make_filename(df)[source]

Make a filename from a data frame of networks and stations

Parameters

df (pandas.DataFrame) – request data frame

Returns

file name as network_01+stations_network_02+stations.h5

Return type

string
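The naming scheme can be sketched as below; the exact separators are an assumption inferred from the pattern network_01+stations_network_02+stations.h5 in the docstring:

```python
# Hypothetical input mirroring the get_unique_networks_and_stations()
# output structure described above.
unique = [
    {"network": "8P", "stations": ["CAS04", "NVR08"]},
    {"network": "EM", "stations": ["ORF08"]},
]

# Join each network with its stations by '_', then join networks by '+'.
filename = "+".join(
    "_".join([entry["network"]] + entry["stations"]) for entry in unique
) + ".h5"
```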

make_mth5_from_fdsn_client(df, path=None, client=None, interact=False)[source]

Make an MTH5 file from an FDSN data center

Parameters
  • df (pandas.DataFrame) –

    DataFrame with columns

    • ’network’ –> FDSN Network code

    • ’station’ –> FDSN Station code

    • ’location’ –> FDSN Location code

    • ’channel’ –> FDSN Channel code

    • ’start’ –> Start time YYYY-MM-DDThh:mm:ss

    • ’end’ –> End time YYYY-MM-DDThh:mm:ss

  • path (string or pathlib.Path, optional) – Path to save MTH5 file to, defaults to None

  • client (string, optional) – FDSN client name, defaults to “IRIS”

Raises

AttributeError – If the input DataFrame is not properly formatted, an AttributeError will be raised.

ValueError – If the values of the DataFrame are not correct, a ValueError will be raised.

Returns

MTH5 file name

Return type

pathlib.Path

Note

If any of the column values are blank, then any value will be searched for. For example, if you leave ‘station’ blank, any station within the given start and end time will be returned.

pack_stream_into_run_group(run_group, run_stream)[source]
property run_list_ne_stream_intervals_message

note about not equal stream intervals

run_timings_match_stream_timing(run_group, stream_start, stream_end)[source]

Checks start and end times in the run. Compares start and end times of runs to start and end times of traces. If they match, runs are packed based on time spans.

Parameters
  • run_group

  • stream_start

  • stream_end

stream_boundaries(streams)[source]

Identify start and end times of streams

Parameters

streams (obspy.core.stream.Stream) –

wrangle_runs_into_containers(m, station_id, survey_group=None)[source]

Note 1: There used to be two separate functions for this, but now there is one; run_group_source is defined as either m or survey_group, depending on whether the file is v0.1.0 or v0.2.0.

Note 2: If/elif/elif/else logic: the strategy is to add the group first. This gets the already filled-in metadata to update the run_ts_obj. Then get streams and add existing metadata.

Parameters
  • m

  • streams

  • station_id

  • survey_group

class mth5.clients.MakeMTH5(mth5_version='0.2.0', interact=False, save_path=None, **kwargs)[source]

Bases: object

from_fdsn_client(request_df, client='IRIS', **kwargs)[source]

Pull data from an FDSN archive such as IRIS. Uses ObsPy clients.

Parameters
  • request_df (pandas.DataFrame) –

    DataFrame with columns

    • ’network’ –> FDSN Network code

    • ’station’ –> FDSN Station code

    • ’location’ –> FDSN Location code

    • ’channel’ –> FDSN Channel code

    • ’start’ –> Start time YYYY-MM-DDThh:mm:ss

    • ’end’ –> End time YYYY-MM-DDThh:mm:ss

  • client (string, optional) – FDSN client name, defaults to “IRIS”

  • interact (bool) – Boolean to keep the created MTH5 file open or not

Raises
  • AttributeError – If the input DataFrame is not properly formatted, an AttributeError will be raised.

  • ValueError – If the values of the DataFrame are not correct, a ValueError will be raised.

Returns

MTH5 file name

Return type

pathlib.Path

Note

If any of the column values are blank, then any value will be searched for. For example, if you leave ‘station’ blank, any station within the given start and end time will be returned.

from_usgs_geomag(request_df, **kwargs)[source]

Download geomagnetic observatory data from USGS webservices into an MTH5 using a request dataframe or csv file.

  • observatory: Geomagnetic observatory ID

  • type: type of data to get, e.g. ‘adjusted’

  • start: start date time to request UTC

  • end: end date time to request UTC

  • elements: components to get

  • sampling_period: seconds between samples

Parameters

request_df (pandas.DataFrame, str or Path if csv file) –

DataFrame with columns

  • ’observatory’ –> Observatory code

  • ’type’ –> data type [ ‘variation’ | ‘adjusted’ | ‘quasi-definitive’ | ‘definitive’ ]

  • ’elements’ –> Elements to get [D, DIST, DST, E, E-E, E-N, F, G, H, SQ, SV, UK1, UK2, UK3, UK4, X, Y, Z]

  • ’sampling_period’ –> sample period [ 1 | 60 | 3600 ]

  • ’start’ –> Start time YYYY-MM-DDThh:mm:ss

  • ’end’ –> End time YYYY-MM-DDThh:mm:ss

Returns

if interact is True an MTH5 object is returned otherwise the path to the file is returned

Return type

Path or mth5.mth5.MTH5

from_zen(data_path, sample_rates=[4096, 1024, 256], calibration_path=None, survey_id=None, combine=True, **kwargs)[source]

Create an MTH5 from zen data.

Parameters
  • data_path (str or pathlib.Path) – path to the zen data files

  • sample_rates (list of int, optional) – sample rates to look for, defaults to [4096, 1024, 256]

  • save_path (str or pathlib.Path, optional) – path to save the MTH5 file to, defaults to None

  • calibration_path (str or pathlib.Path, optional) – path to calibration data, defaults to None

Returns

DESCRIPTION

Return type

TYPE

class mth5.clients.PhoenixClient(data_path, sample_rates=[130, 24000], save_path=None, calibration_path=None)[source]

Bases: object

property calibration_path

Path to calibration data

property data_path

Path to phoenix data

get_run_dict()[source]

Get Run information

Returns

DESCRIPTION

Return type

TYPE

make_mth5_from_phoenix(**kwargs)[source]

Make an MTH5 from Phoenix files. Split into runs, account for filters

Parameters
  • data_path (str or pathlib.Path, optional) – path to the Phoenix data, defaults to None

  • sample_rates (list of int, optional) – sample rates to look for, defaults to None

  • save_path (str or pathlib.Path, optional) – path to save the MTH5 file to, defaults to None

Returns

DESCRIPTION

Return type

TYPE

property sample_rates

sample rates to look for

property save_path

Path to save mth5

class mth5.clients.USGSGeomag(**kwargs)[source]

Bases: object

add_run_id(request_df)[source]

Add run id to request df

Parameters

request_df (pandas.DataFrame) – request dataframe

Returns

request dataframe with a run number added for each unique time window per observatory at each unique sampling period

Return type

pandas.DataFrame
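A minimal sketch of the run-numbering idea using plain pandas; the exact run-id format produced by add_run_id is an assumption here, only the grouping logic follows the description above:

```python
import pandas as pd

request_df = pd.DataFrame(
    {
        "observatory": ["frn", "frn", "frn"],
        "sampling_period": [1, 1, 60],
        "start": ["2022-01-01", "2022-01-03", "2022-01-01"],
    }
)

# Number each time window sequentially within (observatory, sampling_period).
request_df = request_df.sort_values(["observatory", "sampling_period", "start"])
counter = request_df.groupby(["observatory", "sampling_period"]).cumcount() + 1

# Hypothetical run-id format: "sp<period>_<counter>".
request_df["run"] = (
    "sp" + request_df["sampling_period"].astype(str) + "_" + counter.map("{:03d}".format)
)
```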

make_mth5_from_geomag(request_df)[source]

Download geomagnetic observatory data from USGS webservices into an MTH5 using a request dataframe or csv file.

Parameters

request_df (pandas.DataFrame, str or Path if csv file) –

DataFrame with columns

  • ’observatory’ –> Observatory code

  • ’type’ –> data type [ ‘variation’ | ‘adjusted’ | ‘quasi-definitive’ | ‘definitive’ ]

  • ’elements’ –> Elements to get [D, DIST, DST, E, E-E, E-N, F, G, H, SQ, SV, UK1, UK2, UK3, UK4, X, Y, Z]

  • ’sampling_period’ –> sample period [ 1 | 60 | 3600 ]

  • ’start’ –> Start time YYYY-MM-DDThh:mm:ss

  • ’end’ –> End time YYYY-MM-DDThh:mm:ss

Returns

if interact is True an MTH5 object is returned otherwise the path to the file is returned

Return type

Path or mth5.mth5.MTH5

validate_request_df(request_df)[source]

Make sure the input request dataframe has the appropriate columns

Parameters

request_df (pandas.DataFrame) – request dataframe

Returns

valid request dataframe

Return type

pandas.DataFrame
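The column check can be sketched as follows; the REQUIRED_COLUMNS constant and the error message are illustrative, and the real method may additionally coerce types:

```python
import pandas as pd

REQUIRED_COLUMNS = ["observatory", "type", "elements", "sampling_period", "start", "end"]

def validate_request_df(request_df):
    """Raise ValueError if any required column is missing (sketch only)."""
    missing = [col for col in REQUIRED_COLUMNS if col not in request_df.columns]
    if missing:
        raise ValueError(f"request dataframe is missing columns: {missing}")
    return request_df
```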

class mth5.clients.ZenClient(data_path, sample_rates=[4096, 1024, 256], save_path=None, calibration_path=None, **kwargs)[source]

Bases: object

property calibration_path

Path to calibration data

property data_path

Path to zen data

get_run_dict()[source]

Get Run information

Returns

DESCRIPTION

Return type

TYPE

get_survey(station_dict)[source]

get survey name from a dictionary of a single station of runs

Parameters

station_dict (dict) – dictionary of runs for a single station

Returns

survey name

Return type

string

make_mth5_from_zen(survey_id=None, combine=True, **kwargs)[source]

Make an MTH5 from zen files. Split into runs, account for filters

Parameters
  • data_path (str or pathlib.Path, optional) – path to the zen data, defaults to None

  • sample_rates (list of int, optional) – sample rates to look for, defaults to None

  • save_path (str or pathlib.Path, optional) – path to save the MTH5 file to, defaults to None

Returns

DESCRIPTION

Return type

TYPE

property sample_rates

sample rates to look for

property save_path

Path to save mth5

mth5.groups package
Subpackages
mth5.groups.filter_groups package
Submodules
mth5.groups.filter_groups.coefficient_filter_group module

Created on Wed Jun 9 08:58:15 2021

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.groups.filter_groups.coefficient_filter_group.CoefficientGroup(group, **kwargs)[source]

Bases: BaseGroup

Container for Coefficient type filters

add_filter(name, coefficient_metadata)[source]

Add a coefficient Filter

Parameters
  • name (TYPE) – DESCRIPTION

  • coefficient_metadata (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

property filter_dict

Dictionary of available coefficient filters

Returns

DESCRIPTION

Return type

TYPE

from_object(coefficient_object)[source]

make a filter from a mt_metadata.timeseries.filters.CoefficientFilter

Parameters

coefficient_object (mt_metadata.timeseries.filters.CoefficientFilter) – MT metadata CoefficientFilter

get_filter(name)[source]

Get a filter from the name

Parameters

name (string) – name of the filter

Returns

HDF5 group of the coefficient filter

remove_filter()[source]
to_object(name)[source]

make a mt_metadata.timeseries.filters.CoefficientFilter object

Returns

DESCRIPTION

Return type

TYPE

mth5.groups.filter_groups.fap_filter_group module

Created on Wed Jun 9 08:55:16 2021

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.groups.filter_groups.fap_filter_group.FAPGroup(group, **kwargs)[source]

Bases: BaseGroup

Container for fap type filters

add_filter(name, frequency, amplitude, phase, fap_metadata)[source]

create an HDF5 group/dataset from information given.

Parameters
  • name (string) – name of the filter

  • frequency (list, np.ndarray) – frequency array in Hz

  • amplitude (list, np.ndarray) – amplitude array in units of units out

  • phase (list, np.ndarray) – Phase in degrees

  • fap_metadata (dictionary) – other metadata for the filter; see mt_metadata.timeseries.filters.FrequencyResponseTableFilter for details on entries

property filter_dict

Dictionary of available fap filters

Returns

DESCRIPTION

Return type

TYPE

from_object(fap_object)[source]

make a filter from a mt_metadata.timeseries.filters.FrequencyResponseTableFilter

Parameters

fap_object (mt_metadata.timeseries.filters.FrequencyResponseTableFilter) – MT metadata FrequencyResponseTableFilter

get_filter(name)[source]

Get a filter from the name

Parameters

name (string) – name of the filter

Returns

HDF5 group of the fap filter

remove_filter()[source]
to_object(name)[source]

make a mt_metadata.timeseries.filters.FrequencyResponseTableFilter object

Returns

DESCRIPTION

Return type

TYPE

update_filter(fap_object)[source]

update values from fap object

Parameters

fap_object (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

mth5.groups.filter_groups.fir_filter_group module

Created on Wed Jun 9 08:55:16 2021

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.groups.filter_groups.fir_filter_group.FIRGroup(group, **kwargs)[source]

Bases: BaseGroup

Container for fir type filters

add_filter(name, coefficients, fir_metadata)[source]

create an HDF5 group/dataset from information given.

Parameters
  • name (string) – Name of the filter

  • coefficients (np.ndarray) – FIR filter coefficients

  • fir_metadata (dictionary) – metadata dictionary; see mt_metadata.timeseries.filters.FIRFilter for details on entries

property filter_dict

Dictionary of available fir filters

Returns

DESCRIPTION

Return type

TYPE

from_object(fir_object)[source]

make a filter from a mt_metadata.timeseries.filters.FIRFilter

Parameters

fir_object (mt_metadata.timeseries.filters.FIRFilter) – MT metadata FIRFilter

get_filter(name)[source]

Get a filter from the name

Parameters

name (string) – name of the filter

Returns

HDF5 group of the fir filter

remove_filter()[source]
to_object(name)[source]

make a mt_metadata.timeseries.filters.FIRFilter object

Returns

DESCRIPTION

Return type

TYPE

mth5.groups.filter_groups.time_delay_filter_group module

Created on Wed Jun 9 09:01:55 2021

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.groups.filter_groups.time_delay_filter_group.TimeDelayGroup(group, **kwargs)[source]

Bases: BaseGroup

Container for time_delay type filters

add_filter(name, time_delay_metadata)[source]

create an HDF5 group/dataset from information given.

Parameters
  • name (string) – Name of the filter

  • time_delay_metadata (dictionary) – metadata dictionary; see mt_metadata.timeseries.filters.TimeDelayFilter for details on entries

property filter_dict

Dictionary of available time_delay filters

Returns

DESCRIPTION

Return type

TYPE

from_object(time_delay_object)[source]

make a filter from a mt_metadata.timeseries.filters.TimeDelayFilter

Parameters

time_delay_object (mt_metadata.timeseries.filters.TimeDelayFilter) – MT metadata TimeDelayFilter

get_filter(name)[source]

Get a filter from the name

Parameters

name (string) – name of the filter

Returns

HDF5 group of the time_delay filter

remove_filter()[source]
to_object(name)[source]

make a mt_metadata.timeseries.filters.TimeDelayFilter object

Returns

DESCRIPTION

Return type

TYPE

mth5.groups.filter_groups.zpk_filter_group module

Created on Wed Jun 9 08:55:16 2021

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.groups.filter_groups.zpk_filter_group.ZPKGroup(group, **kwargs)[source]

Bases: BaseGroup

Container for ZPK type filters

add_filter(name, poles, zeros, zpk_metadata)[source]

create an HDF5 group/dataset from information given.

Parameters
  • name (string) – Name of the filter

  • poles (np.ndarray(dtype=complex)) – poles of the filter as complex numbers

  • zeros (np.ndarray(dtype=complex)) – zeros of the filter as complex numbers

  • zpk_metadata (dictionary) – metadata dictionary; see mt_metadata.timeseries.filters.PoleZeroFilter for details on entries
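For illustration, the poles and zeros arrays can be built with numpy; the filter values and metadata keys below are hypothetical, not a documented PoleZeroFilter schema:

```python
import numpy as np

# Hypothetical single-pole low-pass description; values are illustrative.
poles = np.array([-2 * np.pi * 10.0 + 0j])  # one pole, corner near 10 Hz
zeros = np.array([], dtype=complex)          # no zeros
zpk_metadata = {
    "name": "example_lowpass",
    "units_in": "volts",
    "units_out": "volts",
}

# zpk_group.add_filter(zpk_metadata["name"], poles, zeros, zpk_metadata)
```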

property filter_dict

Dictionary of available ZPK filters

Returns

DESCRIPTION

Return type

TYPE

from_object(zpk_object)[source]

make a filter from a mt_metadata.timeseries.filters.PoleZeroFilter

Parameters

zpk_object (mt_metadata.timeseries.filters.PoleZeroFilter) – MT metadata PoleZeroFilter

get_filter(name)[source]

Get a filter from the name

Parameters

name (string) – name of the filter

Returns

HDF5 group of the ZPK filter

remove_filter()[source]
to_object(name)[source]

make a mt_metadata.timeseries.filters.PoleZeroFilter object

Returns

DESCRIPTION

Return type

TYPE

Module contents

Import all Group objects

class mth5.groups.filter_groups.CoefficientGroup(group, **kwargs)[source]

Bases: BaseGroup

Container for Coefficient type filters

add_filter(name, coefficient_metadata)[source]

Add a coefficient Filter

Parameters
  • name (TYPE) – DESCRIPTION

  • coefficient_metadata (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

property filter_dict

Dictionary of available coefficient filters

Returns

DESCRIPTION

Return type

TYPE

from_object(coefficient_object)[source]

make a filter from a mt_metadata.timeseries.filters.CoefficientFilter

Parameters

coefficient_object (mt_metadata.timeseries.filters.CoefficientFilter) – MT metadata CoefficientFilter

get_filter(name)[source]

Get a filter from the name

Parameters

name (string) – name of the filter

Returns

HDF5 group of the coefficient filter

remove_filter()[source]
to_object(name)[source]

make a mt_metadata.timeseries.filters.CoefficientFilter object

Returns

DESCRIPTION

Return type

TYPE

class mth5.groups.filter_groups.FAPGroup(group, **kwargs)[source]

Bases: BaseGroup

Container for fap type filters

add_filter(name, frequency, amplitude, phase, fap_metadata)[source]

create an HDF5 group/dataset from information given.

Parameters
  • name (string) – name of the filter

  • frequency (list, np.ndarray) – frequency array in Hz

  • amplitude (list, np.ndarray) – amplitude array in units of units out

  • phase (list, np.ndarray) – Phase in degrees

  • fap_metadata (dictionary) – other metadata for the filter; see mt_metadata.timeseries.filters.FrequencyResponseTableFilter for details on entries

property filter_dict

Dictionary of available fap filters

Returns

DESCRIPTION

Return type

TYPE

from_object(fap_object)[source]

make a filter from a mt_metadata.timeseries.filters.FrequencyResponseTableFilter

Parameters

fap_object (mt_metadata.timeseries.filters.FrequencyResponseTableFilter) – MT metadata FrequencyResponseTableFilter

get_filter(name)[source]

Get a filter from the name

Parameters

name (string) – name of the filter

Returns

HDF5 group of the fap filter

remove_filter()[source]
to_object(name)[source]

make a mt_metadata.timeseries.filters.FrequencyResponseTableFilter object

Returns

DESCRIPTION

Return type

TYPE

update_filter(fap_object)[source]

update values from fap object

Parameters

fap_object (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

class mth5.groups.filter_groups.FIRGroup(group, **kwargs)[source]

Bases: BaseGroup

Container for fir type filters

add_filter(name, coefficients, fir_metadata)[source]

create an HDF5 group/dataset from information given.

Parameters
  • name (string) – Name of the filter

  • coefficients (np.ndarray) – FIR filter coefficients

  • fir_metadata (dictionary) – metadata dictionary; see mt_metadata.timeseries.filters.FIRFilter for details on entries

property filter_dict

Dictionary of available fir filters

Returns

DESCRIPTION

Return type

TYPE

from_object(fir_object)[source]

make a filter from a mt_metadata.timeseries.filters.FIRFilter

Parameters

fir_object (mt_metadata.timeseries.filters.FIRFilter) – MT metadata FIRFilter

get_filter(name)[source]

Get a filter from the name

Parameters

name (string) – name of the filter

Returns

HDF5 group of the fir filter

remove_filter()[source]
to_object(name)[source]

make a mt_metadata.timeseries.filters.FIRFilter object

Returns

DESCRIPTION

Return type

TYPE

class mth5.groups.filter_groups.TimeDelayGroup(group, **kwargs)[source]

Bases: BaseGroup

Container for time_delay type filters

add_filter(name, time_delay_metadata)[source]

create an HDF5 group/dataset from information given.

Parameters
  • name (string) – Name of the filter

  • time_delay_metadata (dictionary) – metadata dictionary; see mt_metadata.timeseries.filters.TimeDelayFilter for details on entries

property filter_dict

Dictionary of available time_delay filters

Returns

DESCRIPTION

Return type

TYPE

from_object(time_delay_object)[source]

make a filter from a mt_metadata.timeseries.filters.TimeDelayFilter

Parameters

time_delay_object (mt_metadata.timeseries.filters.TimeDelayFilter) – MT metadata TimeDelayFilter

get_filter(name)[source]

Get a filter from the name

Parameters

name (string) – name of the filter

Returns

HDF5 group of the time_delay filter

remove_filter()[source]
to_object(name)[source]

make a mt_metadata.timeseries.filters.TimeDelayFilter object

Returns

DESCRIPTION

Return type

TYPE

class mth5.groups.filter_groups.ZPKGroup(group, **kwargs)[source]

Bases: BaseGroup

Container for ZPK type filters

add_filter(name, poles, zeros, zpk_metadata)[source]

create an HDF5 group/dataset from information given.

Parameters
  • name (string) – Name of the filter

  • poles (np.ndarray(dtype=complex)) – poles of the filter as complex numbers

  • zeros (np.ndarray(dtype=complex)) – zeros of the filter as complex numbers

  • zpk_metadata (dictionary) – metadata dictionary; see mt_metadata.timeseries.filters.PoleZeroFilter for details on entries

property filter_dict

Dictionary of available ZPK filters

Returns

DESCRIPTION

Return type

TYPE

from_object(zpk_object)[source]

make a filter from a mt_metadata.timeseries.filters.PoleZeroFilter

Parameters

zpk_object (mt_metadata.timeseries.filters.PoleZeroFilter) – MT metadata PoleZeroFilter

get_filter(name)[source]

Get a filter from the name

Parameters

name (string) – name of the filter

Returns

HDF5 group of the ZPK filter

remove_filter()[source]
to_object(name)[source]

make a mt_metadata.timeseries.filters.PoleZeroFilter object

Returns

DESCRIPTION

Return type

TYPE

Submodules
mth5.groups.base module
Base Group Class

Contains all the base functions that will be used by group classes.

Created on Fri May 29 15:09:48 2020

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.groups.base.BaseGroup(group, group_metadata=None, **kwargs)[source]

Bases: object

Generic object that will have functionality for reading/writing groups, including attributes. To access the hdf5 group directly use the BaseGroup.hdf5_group property.

>>> base = BaseGroup(hdf5_group)
>>> base.hdf5_group.ref
<HDF5 Group Reference>

Note

All attributes should be input into the metadata object so that all input is validated against the metadata standards. If you change attributes in the metadata object, you should run the BaseGroup.write_metadata method. This is a temporary solution; an automatic updater for changed metadata is in the works.

>>> base.metadata.existing_attribute = 'update_existing_attribute'
>>> base.write_metadata()

If you want to add a new attribute this should be done using the metadata.add_base_attribute method.

>>> base.metadata.add_base_attribute('new_attribute',
...                                  'new_attribute_value',
...                                  {'type':str,
...                                   'required':True,
...                                   'style':'free form',
...                                   'description': 'new attribute desc.',
...                                   'units':None,
...                                   'options':[],
...                                   'alias':[],
...                                   'example':'new attribute'})

Includes initializing functions that make a summary table and write metadata.

property dataset_options
property groups_list
initialize_group(**kwargs)[source]

Initialize group by making a summary table and writing metadata

property metadata

Metadata for the Group based on mt_metadata.timeseries

read_metadata()[source]

read metadata from the HDF5 group into metadata object

write_metadata()[source]

Write HDF5 metadata from metadata object.

mth5.groups.estimate_dataset module

Created on Thu Mar 10 09:02:16 2022

@author: jpeacock

class mth5.groups.estimate_dataset.EstimateDataset(dataset, dataset_metadata=None, write_metadata=True, **kwargs)[source]

Bases: object

Holds a statistical estimate

This will hold multi-dimensional statistical estimates for transfer functions.

Parameters
  • dataset (h5py.Dataset) – hdf5 dataset

  • dataset_metadata (mt_metadata.transfer_functions.tf.StatisticalEstimate, optional) – data set metadata, defaults to None

  • write_metadata (boolean, optional) – True to write metadata, defaults to True

  • **kwargs

Raises

MTH5Error – When an estimate is not present, or metadata name does not match the given name

from_numpy(new_estimate)[source]
Returns

a numpy structured array

Return type

numpy.ndarray

Note

data is a builtin to numpy and cannot be used as a name

loads into RAM

from_xarray(data)[source]
Returns

an xarray DataArray with appropriate metadata and the appropriate coordinates based on the metadata.

Return type

xarray.DataArray

Note

Metadata will not be validated if changed in an xarray.

loads from memory

read_metadata()[source]

Read metadata from the HDF5 file into the metadata container, that way it can be validated.

replace_dataset(new_data_array)[source]

replace the entire dataset with a new one, nothing left behind

Parameters

new_data_array (numpy.ndarray) – new data array

to_numpy()[source]
Returns

a numpy structured array

Return type

numpy.ndarray

loads into RAM

to_xarray(period)[source]
Returns

an xarray DataArray with appropriate metadata and the appropriate coordinates.

Return type

xarray.DataArray

Note

Metadata will not be validated if changed in an xarray.

loads from memory

write_metadata()[source]

Write metadata from the metadata container to the HDF5 attrs dictionary.

mth5.groups.experiment module

Created on Wed Dec 23 16:59:45 2020

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.groups.experiment.ExperimentGroup(group, **kwargs)[source]

Bases: BaseGroup

Utility class to hold general information about the experiment and accompanying metadata for an MT experiment.

To access the hdf5 group directly use ExperimentGroup.hdf5_group.

>>> experiment = ExperimentGroup(hdf5_group)
>>> experiment.hdf5_group.ref
<HDF5 Group Reference>

Note

All attributes should be input into the metadata object so that all input is validated against the metadata standards. If you change attributes in the metadata object, you should run the ExperimentGroup.write_metadata() method. This is a temporary solution; an automatic updater for changed metadata is in the works.

>>> experiment.metadata.existing_attribute = 'update_existing_attribute'
>>> experiment.write_metadata()

If you want to add a new attribute this should be done using the metadata.add_base_attribute method.

>>> experiment.metadata.add_base_attribute('new_attribute',
...                                        'new_attribute_value',
...                                        {'type':str,
...                                         'required':True,
...                                         'style':'free form',
...                                         'description': 'new attribute desc.',
...                                         'units':None,
...                                         'options':[],
...                                         'alias':[],
...                                         'example':'new attribute'})

Tip

If you want to add stations, reports, etc. to the experiment this should be done from the MTH5 object. This is to avoid duplication, at least for now.

To look at what the structure of /Experiment looks like:

>>> experiment
/Experiment:
====================
    |- Group: Surveys
    -----------------
    |- Group: Reports
    -----------------
    |- Group: Standards
    -------------------
    |- Group: Stations
    ------------------
property metadata

Overwrite get metadata to include station information

property surveys_group
mth5.groups.filters module

Created on Wed Dec 23 17:08:40 2020

Need to make a group for FAP and FIR filters.

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.groups.filters.FiltersGroup(group, **kwargs)[source]

Bases: BaseGroup

Not implemented yet

add_filter(filter_object)[source]

Add a filter dataset based on type

current types are:
  • zpk –> zeros, poles, gain

  • fap –> frequency look up table

  • time_delay –> time delay filter

  • coefficient –> coefficient filter

Parameters

filter_object (mt_metadata.timeseries.filters) – An MT metadata filter object

property filter_dict
get_filter(name)[source]

Get a filter by name

to_filter_object(name)[source]

return the MT metadata representation of the filter

mth5.groups.master_station_run_channel module
mth5.groups.reports module

Created on Wed Dec 23 17:03:53 2020

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.groups.reports.ReportsGroup(group, **kwargs)[source]

Bases: BaseGroup

Not sure how to handle this yet

add_report(report_name, report_metadata=None, report_data=None)[source]
Parameters
  • report_name (TYPE) – DESCRIPTION

  • report_metadata (TYPE, optional) – DESCRIPTION, defaults to None

  • report_data (TYPE, optional) – DESCRIPTION, defaults to None

Returns

DESCRIPTION

Return type

TYPE

mth5.groups.standards module

Created on Wed Dec 23 17:05:33 2020

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.groups.standards.StandardsGroup(group, **kwargs)[source]

Bases: BaseGroup

The StandardsGroup is a convenience group that stores the metadata standards that were used to make the current file. This is to help a user understand the metadata directly from the file and not have to look up documentation that might not be updated.

The metadata standards are stored in the summary table /Survey/Standards/summary

>>> standards = mth5_obj.standards_group
>>> standards.summary_table
index | attribute | type | required | style | units | description |
options  |  alias |  example
--------------------------------------------------------------------------
get_attribute_information(attribute_name)[source]

get information about an attribute

The attribute name should be in the summary table.

Parameters

attribute_name (string) – attribute name

Returns

prints a description of the attribute

Raises

MTH5TableError – if attribute is not found

>>> standards = mth5_obj.standards_group
>>> standards.get_attribute_information('survey.release_license')
survey.release_license
--------------------------
        type          : string
        required      : True
        style         : controlled vocabulary
        units         :
        description   : How the data can be used. The options are based on
                 Creative Commons licenses. For details visit
                 https://creativecommons.org/licenses/
        options       : CC-0,CC-BY,CC-BY-SA,CC-BY-ND,CC-BY-NC-SA,CC-BY-NC-ND
        alias         :
        example       : CC-0
        default       : CC-0
initialize_group()[source]

Initialize the group by making a summary table that summarizes the metadata standards used to describe the data.

Also, write generic metadata information.

property summary_table
summary_table_from_dict(summary_dict)[source]

Fill summary table from a dictionary that summarizes the metadata for the entire survey.

Parameters

summary_dict (dictionary) – Flattened dictionary of all metadata standards within the survey.
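A flattened dictionary here means nested metadata collapsed into dotted keys. A minimal stdlib sketch of that flattening (the `flatten` helper is illustrative, not part of mth5):

```python
def flatten(d, parent=""):
    """Collapse nested metadata dictionaries into dotted keys,
    e.g. {'survey': {'release_license': 'CC-0'}}
    ->   {'survey.release_license': 'CC-0'}."""
    out = {}
    for key, value in d.items():
        dotted = f"{parent}.{key}" if parent else key
        if isinstance(value, dict):
            out.update(flatten(value, dotted))
        else:
            out[dotted] = value
    return out

flatten({"survey": {"release_license": "CC-0", "id": "MT004"}})
```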

mth5.groups.standards.summarize_metadata_standards()[source]

Summarize metadata standards into a dictionary

mth5.groups.survey module

Created on Wed Dec 23 16:59:45 2020

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.groups.survey.MasterSurveyGroup(group, **kwargs)[source]

Bases: BaseGroup

Utility class to hold information about the surveys within an experiment and accompanying metadata. This class is next level down from Experiment for stations Experiment/Surveys. This class provides methods to add and get surveys.

To access MasterSurveyGroup from an open MTH5 file:

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> surveys = mth5_obj.surveys_group

To check what stations exist

>>> surveys.groups_list
['survey_01', 'survey_02']

To access the hdf5 group directly use SurveyGroup.hdf5_group.

>>> surveys.hdf5_group.ref
<HDF5 Group Reference>

Note

All attributes should be input into the metadata object; that way all input will be validated against the metadata standards. If you change attributes in the metadata object, you should run the SurveyGroup.write_metadata() method. This is a temporary solution; an automatic updater for changed metadata is in the works.

If you want to add a new attribute this should be done using the metadata.add_base_attribute method.

>>> surveys.metadata.add_base_attribute('new_attribute',
...                                     'new_attribute_value',
...                                     {'type':str,
...                                      'required':True,
...                                      'style':'free form',
...                                      'description': 'new attribute desc.',
...                                      'units':None,
...                                      'options':[],
...                                      'alias':[],
...                                      'example':'new attribute'})

To add a survey:

>>> new_survey = surveys.add_survey('new_survey')
>>> surveys
Experiment/Surveys:
====================
    |- Group: new_survey
    ---------------------
        |- Group: Filters
        ------------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
        |- Group: Stations
        ------------------

Add a survey with metadata:

>>> from mth5.metadata import Survey
>>> survey_metadata = Survey()
>>> survey_metadata.id = 'MT004'
>>> survey_metadata.time_period.start = '2020-01-01T12:30:00'
>>> new_survey = surveys.add_survey('Test_01', survey_metadata)
>>> # to look at the metadata
>>> new_survey.metadata
{
    "survey": {
        "acquired_by.author": null,
        "acquired_by.comments": null,
        "id": "MT004",
        ...
        }
}

See also

mth5.metadata for details on how to add metadata from various files and python objects.

To remove a survey:

>>> surveys.remove_survey('new_survey')
>>> surveys
/Survey/Stations:
====================

Note

Deleting a survey is not as simple as del(survey). In HDF5 this does not free up memory, it simply removes the reference to that survey. The common way to get around this is to copy what you want into a new file, or overwrite the survey.

To get a survey:

>>> existing_survey = surveys.get_survey('existing_survey_name')
add_survey(survey_name, survey_metadata=None)[source]
Add a survey with metadata if given with the path:

/Survey/surveys/survey_name

If the survey already exists, that survey is returned and nothing is added.

Parameters
  • survey_name (string) – Name of the survey, should be the same as metadata.id

  • survey_metadata (mth5.metadata.Survey, optional) – Survey metadata container, defaults to None

Returns

A convenience class for the added survey

Return type

mth5.groups.SurveyGroup

To add a survey

>>> new_survey = surveys.add_survey('new_survey')
>>> surveys
Experiment/Surveys:
====================
    |- Group: new_survey
    ---------------------
        |- Group: Filters
        ------------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
        |- Group: Stations
        ------------------
Add a survey with metadata

>>> from mth5.metadata import Survey
>>> survey_metadata = Survey()
>>> survey_metadata.id = 'MT004'
>>> survey_metadata.time_period.start = '2020-01-01T12:30:00'
>>> new_survey = surveys.add_survey('Test_01', survey_metadata)
>>> # to look at the metadata
>>> new_survey.metadata
{
    "survey": {
        "acquired_by.author": null,
        "acquired_by.comments": null,
        "id": "MT004",
        ...
        }
}

See also

mth5.metadata for details on how to add metadata from various files and python objects.

property channel_summary

Summary of all channels in the file.

get_survey(survey_name)[source]

Get a survey with the same name as survey_name

Parameters

survey_name (string) – existing survey name

Returns

convenience survey class

Return type

mth5.groups.SurveyGroup

Raises

MTH5Error – if the survey name is not found.

Example

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> # one option
>>> existing_survey = mth5_obj.get_survey('MT001')
>>> # another option
>>> existing_survey = mth5_obj.experiment_group.surveys_group.get_survey('MT001')
MTH5Error: MT001 does not exist, check survey_list for existing names
remove_survey(survey_name)[source]

Remove a survey from the file.

Note

Deleting a survey is not as simple as del(survey). In HDF5 this does not free up memory, it simply removes the reference to that survey. The common way to get around this is to copy what you want into a new file, or overwrite the survey.

Parameters

survey_name (string) – existing survey name

Example
>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> # one option
>>> mth5_obj.remove_survey('MT001')
>>> # another option
>>> mth5_obj.surveys_group.remove_survey('MT001')
class mth5.groups.survey.SurveyGroup(group, survey_metadata=None, **kwargs)[source]

Bases: BaseGroup

Utility class to hold general information about the survey and accompanying metadata for an MT survey.

To access the hdf5 group directly use SurveyGroup.hdf5_group.

>>> survey = SurveyGroup(hdf5_group)
>>> survey.hdf5_group.ref
<HDF5 Group Reference>

Note

All attributes should be input into the metadata object; that way all input will be validated against the metadata standards. If you change attributes in the metadata object, you should run the SurveyGroup.write_metadata() method. This is a temporary solution; an automatic updater for changed metadata is in the works.

>>> survey.metadata.existing_attribute = 'update_existing_attribute'
>>> survey.write_metadata()

If you want to add a new attribute this should be done using the metadata.add_base_attribute method.

>>> survey.metadata.add_base_attribute('new_attribute',
...                                    'new_attribute_value',
...                                    {'type':str,
...                                     'required':True,
...                                     'style':'free form',
...                                     'description': 'new attribute desc.',
...                                     'units':None,
...                                     'options':[],
...                                     'alias':[],
...                                     'example':'new attribute'})

Tip

If you want to add stations, reports, etc. to the survey, this should be done from the MTH5 object. This is to avoid duplication, at least for now.

To look at what the structure of /Survey looks like:

>>> survey
/Survey:
====================
    |- Group: Filters
    -----------------
        --> Dataset: summary
    -----------------
    |- Group: Reports
    -----------------
        --> Dataset: summary
        -----------------
    |- Group: Standards
    -------------------
        --> Dataset: summary
        -----------------
    |- Group: Stations
    ------------------
        --> Dataset: summary
        -----------------
property filters_group

Convenience property for /Survey/Filters group

initialize_group(**kwargs)[source]

Initialize group by making a summary table and writing metadata

property metadata

Overwrite get metadata to include station information in the survey

property reports_group

Convenience property for /Survey/Reports group

property standards_group

Convenience property for /Survey/Standards group

property stations_group
update_survey_metadata(survey_dict=None)[source]

Update start/end dates and location corners from stations_group.summary_table
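The update amounts to taking extrema over the per-station summaries. A hypothetical stdlib sketch of that aggregation (the function and key names are illustrative, not the actual summary_table columns):

```python
def survey_bounds(station_rows):
    """Derive survey-level start/end dates and location corners
    from per-station summary rows (illustrative key names)."""
    return {
        "start": min(row["start"] for row in station_rows),
        "end": max(row["end"] for row in station_rows),
        "lat_corners": (min(row["lat"] for row in station_rows),
                        max(row["lat"] for row in station_rows)),
        "lon_corners": (min(row["lon"] for row in station_rows),
                        max(row["lon"] for row in station_rows)),
    }

rows = [
    {"start": "2020-01-01", "end": "2020-01-10", "lat": 38.0, "lon": -122.0},
    {"start": "2020-01-05", "end": "2020-02-01", "lat": 39.5, "lon": -120.5},
]
survey_bounds(rows)
```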

mth5.groups.transfer_function module

Created on Thu Mar 10 08:22:33 2022

@author: jpeacock

class mth5.groups.transfer_function.TransferFunctionGroup(group, **kwargs)[source]

Bases: BaseGroup

Object to hold a single transfer function estimation

add_statistical_estimate(estimate_name, estimate_data=None, estimate_metadata=None, max_shape=(None, None, None), chunks=True, **kwargs)[source]

Add a StatisticalEstimate

Parameters

estimate (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

from_tf_object(tf_obj)[source]

Create data sets from a mt_metadata.transfer_function.core.TF object.

Parameters

tf_obj (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

get_estimate(estimate_name)[source]

Get a statistical estimate dataset

has_estimate(estimate)[source]

Check if the group has the given estimate

property period

Get period from hdf5_group[“period”]

Returns

DESCRIPTION

Return type

TYPE

remove_estimate(estimate_name)[source]

remove a statistical estimate

Parameters

estimate_name (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

to_tf_object()[source]

Create a mt_metadata.transfer_function.core.TF object from the estimates in the group

Returns

DESCRIPTION

Return type

TYPE

class mth5.groups.transfer_function.TransferFunctionsGroup(group, **kwargs)[source]

Bases: BaseGroup

Object to hold transfer functions

This is the high-level group; all transfer functions for the station are held here and each one has its own TransferFunctionGroup.

This has add, get, remove_transfer_function.

add_transfer_function(name, tf_object=None)[source]

Add a transfer function to the group

Parameters
  • name (string) – name of the transfer function

  • tf_object (mt_metadata.transfer_function.core.TF) – Transfer Function object

Returns

DESCRIPTION

Return type

TYPE

>>> from mth5.mth5 import MTH5
>>> m = MTH5()
>>> m.open_mth5("example.h5", "a")
>>> station_group = m.get_station("mt01", survey="test")
>>> tf_group = station_group.transfer_functions_group
>>> tf_group.add_transfer_function("mt01_4096", tf_object)
get_tf_object(tf_id)[source]

This is the function you want to use to get a proper mt_metadata.transfer_functions.core.TF object with all the appropriate metadata.

Parameters

tf_id (string) – name of the transfer function to get

Returns

Full transfer function with appropriate metadata

Return type

mt_metadata.transfer_functions.core.TF

>>> from mth5.mth5 import MTH5
>>> m = MTH5()
>>> m.open_mth5("example.h5", "a")
>>> station_group = m.get_station("mt01", survey="test")
>>> tf_group = station_group.transfer_functions_group
>>> tf_object = tf_group.get_tf_object("mt01_4096")
get_transfer_function(tf_id)[source]

Get transfer function from id

Parameters

tf_id (string) – name of transfer function

Returns

Transfer function group

Return type

mth5.groups.TransferFunctionGroup

>>> from mth5.mth5 import MTH5
>>> m = MTH5()
>>> m.open_mth5("example.h5", "a")
>>> station_group = m.get_station("mt01", survey="test")
>>> tf_group = station_group.transfer_functions_group.get_transfer_function("mt01_4096")
remove_transfer_function(tf_id)[source]

Remove a transfer function from the group

Parameters

tf_id (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

>>> from mth5.mth5 import MTH5
>>> m = MTH5()
>>> m.open_mth5("example.h5", "a")
>>> station_group = m.get_station("mt01", survey="test")
>>> tf_group = station_group.transfer_functions_group
>>> tf_group.remove_transfer_function("mt01_4096")
tf_summary(as_dataframe=True)[source]

Summary of all transfer functions in this group

Returns

DESCRIPTION

Return type

TYPE

Module contents

Import all Group objects

class mth5.groups.AuxiliaryDataset(group, **kwargs)[source]

Bases: ChannelDataset

Holds a channel dataset. This is a simple container for the data to make sure that the user has the flexibility to turn the channel into an object they want to deal with.

For now all the numpy type slicing can be used on hdf5_dataset

Parameters
  • dataset (h5py.Dataset) – dataset object for the channel

  • dataset_metadata ([ mth5.metadata.Electric | mth5.metadata.Magnetic | mth5.metadata.Auxiliary ], optional) – metadata container, defaults to None

Raises

MTH5Error – If the dataset is not of the correct type

Utilities will be written to create some common objects like:

  • xarray.DataArray

  • pandas.DataFrame

  • zarr

  • dask.Array

The benefit of these other objects is that they can be indexed by time, and they have much more built-in functionality.

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> run = mth5_obj.stations_group.get_station('MT001').get_run('MT001a')
>>> channel = run.get_channel('Ex')
>>> channel
 Channel Electric:
 -------------------
   component:        Ex
   data type:        electric
   data format:      float32
   data shape:       (4096,)
   start:            1980-01-01T00:00:00+00:00
   end:              1980-01-01T00:00:01+00:00
   sample rate:      4096
class mth5.groups.BaseGroup(group, group_metadata=None, **kwargs)[source]

Bases: object

Generic object that will have functionality for reading/writing groups, including attributes. To access the hdf5 group directly use the BaseGroup.hdf5_group property.

>>> base = BaseGroup(hdf5_group)
>>> base.hdf5_group.ref
<HDF5 Group Reference>

Note

All attributes should be input into the metadata object; that way all input will be validated against the metadata standards. If you change attributes in the metadata object, you should run the BaseGroup.write_metadata method. This is a temporary solution; an automatic updater for changed metadata is in the works.

>>> base.metadata.existing_attribute = 'update_existing_attribute'
>>> base.write_metadata()

If you want to add a new attribute this should be done using the metadata.add_base_attribute method.

>>> base.metadata.add_base_attribute('new_attribute',
...                                  'new_attribute_value',
...                                  {'type':str,
...                                   'required':True,
...                                   'style':'free form',
...                                   'description': 'new attribute desc.',
...                                   'units':None,
...                                   'options':[],
...                                   'alias':[],
...                                   'example':'new attribute'})

Includes initializing functions that make a summary table and write metadata.

property dataset_options
property groups_list
initialize_group(**kwargs)[source]

Initialize group by making a summary table and writing metadata

property metadata

Metadata for the Group based on mt_metadata.timeseries

read_metadata()[source]

read metadata from the HDF5 group into metadata object

write_metadata()[source]

Write HDF5 metadata from metadata object.

class mth5.groups.ChannelDataset(dataset, dataset_metadata=None, write_metadata=True, **kwargs)[source]

Bases: object

Holds a channel dataset. This is a simple container for the data to make sure that the user has the flexibility to turn the channel into an object they want to deal with.

For now all the numpy type slicing can be used on hdf5_dataset

Parameters
  • dataset (h5py.Dataset) – dataset object for the channel

  • dataset_metadata ([ mth5.metadata.Electric | mth5.metadata.Magnetic | mth5.metadata.Auxiliary ], optional) – metadata container, defaults to None

Raises

MTH5Error – If the dataset is not of the correct type

Utilities will be written to create some common objects like:

  • xarray.DataArray

  • pandas.DataFrame

  • zarr

  • dask.Array

The benefit of these other objects is that they can be indexed by time, and they have much more built-in functionality.

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> run = mth5_obj.stations_group.get_station('MT001').get_run('MT001a')
>>> channel = run.get_channel('Ex')
>>> channel
 Channel Electric:
 -------------------
   component:        Ex
   data type:        electric
   data format:      float32
   data shape:       (4096,)
   start:            1980-01-01T00:00:00+00:00
   end:              1980-01-01T00:00:01+00:00
   sample rate:      4096
property channel_entry

channel entry that will go into a full channel summary of the entire survey

property channel_response_filter
property end

return end time based on the data

extend_dataset(new_data_array, start_time, sample_rate, fill=None, max_gap_seconds=1, fill_window=10)[source]

Append data according to how the start time aligns with existing data. If the start time is before the existing start time, the data is prepended; similarly, if the start time is near the end, the data will be appended.

If the start time is within the existing time range, existing data will be replaced with the new data.

If there is a gap between the start or end time of the new data and the existing data, you can either fill the gap with a constant value or an error will be raised, depending on the value of fill.

Parameters
  • new_data_array (numpy.ndarray) – new data array with shape (npts, )

  • start_time (string or mth5.utils.mttime.MTime) – start time of the new data array in UTC

  • sample_rate (float) – Sample rate of the new data array, must match existing sample rate

  • fill (string, None, float, integer) –

    If there is a data gap how do you want to fill the gap:

    • None -> will raise an mth5.utils.exceptions.MTH5Error

    • ’mean’ -> will fill with the mean of each data set within the fill window

    • ’median’ -> will fill with the median of each data set within the fill window

    • value -> can be an integer or float to fill the gap

    • ’nan’ -> will fill the gap with NaN

  • max_gap_seconds (float or integer) – sets a maximum number of seconds the gap can be. Anything over this number will raise a mth5.utils.exceptions.MTH5Error.

  • fill_window (integer) – number of points from the end of each data set to estimate fill value from.

Raises

mth5.utils.exceptions.MTH5Error – if the sample rate does not match or the fill value is not understood

Append Example

>>> ex = mth5_obj.get_channel('MT001', 'MT001a', 'Ex')
>>> ex.n_samples
4096
>>> ex.end
2015-01-08T19:32:09.500000+00:00
>>> t = timeseries.ChannelTS('electric',
...                          data=2 * np.cos(4 * np.pi * .05 *
...                                          np.linspace(0, 4096, num=4096) *
...                                          .01),
...                          channel_metadata={'electric':{
...                              'component': 'ex',
...                              'sample_rate': 8,
...                              'time_period.start': (ex.end + (1)).iso_str}})
>>> ex.extend_dataset(t.ts, t.start, t.sample_rate, fill='median',
...                   max_gap_seconds=2)
2020-07-02T18:02:47 - mth5.groups.Electric.extend_dataset - INFO -
filling data gap with 1.0385180759767025
>>> ex.n_samples
8200
>>> ex.end
2015-01-08T19:40:42.500000+00:00
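The ’mean’/’median’ fill options estimate a constant from points on both sides of the gap. A stdlib sketch of that estimate (the function name and exact windowing are illustrative, not mth5's implementation):

```python
import statistics

def gap_fill_value(existing_tail, new_head, fill="median"):
    """Estimate the constant used to fill a gap, from the last points of
    the existing data and the first points of the new data (the fill window)."""
    window = list(existing_tail) + list(new_head)
    if fill == "median":
        return statistics.median(window)
    if fill == "mean":
        return statistics.fmean(window)
    raise ValueError(f"fill value {fill} not understood")

gap_fill_value([1.0, 2.0, 3.0], [5.0, 6.0], fill="mean")  # 3.4
```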
from_channel_ts(channel_ts_obj, how='replace', fill=None, max_gap_seconds=1, fill_window=10)[source]

fill data set from a mth5.timeseries.ChannelTS object.

Will check for time alignment and metadata.

Parameters
  • channel_ts_obj (mth5.timeseries.ChannelTS) – time series object

  • how

    how the new array will be input to the existing dataset:

    • ’replace’ -> replace the entire dataset, nothing is left over.

    • ’extend’ -> add onto the existing dataset, any overlapping values will be rewritten, if there are gaps between data sets those will be handled depending on the value of fill.

    param fill

    If there is a data gap how do you want to fill the gap:

    • None -> will raise an mth5.utils.exceptions.MTH5Error

    • ’mean’-> will fill with the mean of each data set within the fill window

    • ’median’ -> will fill with the median of each data set within the fill window

    • value -> can be an integer or float to fill the gap

    • ’nan’ -> will fill the gap with NaN

  • max_gap_seconds (float or integer) – sets a maximum number of seconds the gap can be. Anything over this number will raise a mth5.utils.exceptions.MTH5Error.

  • fill_window (integer) – number of points from the end of each data set to estimate fill value from.

from_xarray(data_array, how='replace', fill=None, max_gap_seconds=1, fill_window=10)[source]

fill data set from a xarray.DataArray object.

Will check for time alignment and metadata.

Parameters
  • data_array_obj – Xarray data array

  • how

    how the new array will be input to the existing dataset:

    • ’replace’ -> replace the entire dataset, nothing is left over.

    • ’extend’ -> add onto the existing dataset, any overlapping values will be rewritten, if there are gaps between data sets those will be handled depending on the value of fill.

    param fill

    If there is a data gap how do you want to fill the gap:

    • None -> will raise an mth5.utils.exceptions.MTH5Error

    • ’mean’ -> will fill with the mean of each data set within the fill window

    • ’median’ -> will fill with the median of each data set within the fill window

    • value -> can be an integer or float to fill the gap

    • ’nan’ -> will fill the gap with NaN

  • max_gap_seconds (float or integer) – sets a maximum number of seconds the gap can be. Anything over this number will raise a mth5.utils.exceptions.MTH5Error.

  • fill_window (integer) – number of points from the end of each data set to estimate fill value from.

get_index_from_end_time(given_time)[source]

get the end index value. Add one to be inclusive of the found index.

Parameters

given_time (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

get_index_from_time(given_time)[source]

get the appropriate index for a given time.

Parameters

given_time (string or MTime) – time string

Returns

index value

Return type

int
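Conceptually the index is the elapsed time from the channel start scaled by the sample rate. A stdlib sketch under that assumption (the helper name is illustrative, not the actual implementation):

```python
from datetime import datetime, timezone

def index_from_time(given_time, start, sample_rate):
    """Map a UTC timestamp to the nearest sample index."""
    seconds = (given_time - start).total_seconds()
    return int(round(seconds * sample_rate))

start = datetime(2015, 1, 8, 19, 49, 15, tzinfo=timezone.utc)
index_from_time(datetime(2015, 1, 8, 19, 49, 16, tzinfo=timezone.utc),
                start, 4096)  # one second in at 4096 samples/s -> index 4096
```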

property n_samples
read_metadata()[source]

Read metadata from the HDF5 file into the metadata container, that way it can be validated.

replace_dataset(new_data_array)[source]

replace the entire dataset with a new one, nothing left behind

Parameters

new_data_array (numpy.ndarray) – new data array shape (npts, )

property run_metadata

run metadata

property sample_rate
property start
property station_metadata

station metadata

property survey_id

shortcut to survey group

property survey_metadata

survey metadata

property time_index

Create a time index based on the metadata. This can help when asking for time windows from the data

Returns

DESCRIPTION

Return type

TYPE
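The index is derived from the start time and sample rate in the metadata. A stdlib sketch of that construction (illustrative only; the actual property returns a vectorized index):

```python
from datetime import datetime, timedelta, timezone

def make_time_index(start, sample_rate, n_samples):
    """One timestamp per sample, spaced by 1 / sample_rate seconds."""
    step = timedelta(seconds=1.0 / sample_rate)
    return [start + i * step for i in range(n_samples)]

idx = make_time_index(datetime(1980, 1, 1, tzinfo=timezone.utc), 8, 4)
# consecutive entries are 0.125 s apart at 8 samples/s
```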

time_slice(start, end=None, n_samples=None, return_type='channel_ts')[source]

Get a time slice from the channel and return the appropriate type

  • numpy array with metadata

  • pandas.Dataframe with metadata

  • xarray.DataArray with metadata

  • mth5.timeseries.ChannelTS ‘default’

  • dask.DataFrame with metadata ‘not yet’

Parameters
  • start (string or mth5.utils.mttime.MTime) – start time of the slice

  • end (string or mth5.utils.mttime.MTime, optional) – end time of the slice

  • n_samples (integer, optional) – number of samples to read in

Returns

the correct container for the time series.

Return type

[ xarray.DataArray | pandas.DataFrame | mth5.timeseries.ChannelTS | numpy.ndarray ]

Raises

ValueError – if both end and n_samples are None, or both are given.

Example with number of samples

>>> ex = mth5_obj.get_channel('FL001', 'FL001a', 'Ex')
>>> ex_slice = ex.time_slice("2015-01-08T19:49:15", n_samples=4096)
>>> ex_slice
<xarray.DataArray (time: 4096)>
array([0.93115046, 0.14233688, 0.87917119, ..., 0.26073634, 0.7137319 ,
       0.88154395])
Coordinates:
  * time     (time) datetime64[ns] 2015-01-08T19:49:15 ... 2015-01-08T19:57:46.875000
Attributes:
    ac.end:                      None
    ac.start:                    None
    ...

>>> type(ex_slice)
mth5.timeseries.ChannelTS

# plot the time series
>>> ex_slice.ts.plot()
Example with start and end time

>>> ex_slice = ex.time_slice("2015-01-08T19:49:15",
...                          end_time="2015-01-09T19:49:15")
Raises Example

>>> ex_slice = ex.time_slice("2015-01-08T19:49:15",
...                          end_time="2015-01-09T19:49:15",
...                          n_samples=4096)
ValueError: Must input either end_time or n_samples, not both.
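The exclusivity of end and n_samples follows because either determines the other given the sample rate. A stdlib sketch of the relation (the helper is illustrative, not part of mth5):

```python
from datetime import datetime, timedelta, timezone

def end_from_n_samples(start, sample_rate, n_samples):
    """End time implied by a sample count: the last (inclusive) sample sits
    (n_samples - 1) / sample_rate seconds after the start."""
    return start + timedelta(seconds=(n_samples - 1) / sample_rate)

start = datetime(2015, 1, 8, 19, 49, 15, tzinfo=timezone.utc)
end_from_n_samples(start, 8, 4096)  # 2015-01-08 19:57:46.875000+00:00
```

This matches the end of the 4096-sample slice shown in the example above for a sample rate of 8.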
to_channel_ts()[source]
Returns

a Timeseries with the appropriate time index and metadata

Return type

mth5.timeseries.ChannelTS

loads from memory (nearly half the size of xarray alone, not sure why)

to_dataframe()[source]
Returns

a dataframe where data is stored in the ‘data’ column and attributes are stored in the experimental attrs attribute

Return type

pandas.DataFrame

Note

Metadata will not be validated if changed in an xarray.

loads into RAM

to_numpy()[source]
Returns

a numpy structured array with 2 columns (time, channel_data)

Return type

numpy.core.records

Note

data is a builtin to numpy and cannot be used as a name

loads into RAM
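A minimal numpy sketch of the two-column structured array described above ('channel_data' is used because 'data' is a numpy builtin; the exact dtype is an assumption, not necessarily what mth5 emits):

```python
import numpy as np

n = 4
arr = np.zeros(n, dtype=[("time", "datetime64[ms]"), ("channel_data", "f4")])
arr["time"] = (np.datetime64("1980-01-01T00:00:00")
               + np.arange(n) * np.timedelta64(250, "ms"))  # 4 samples/s
arr["channel_data"] = [0.1, 0.2, 0.3, 0.4]
arr.dtype.names  # ('time', 'channel_data')
```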

to_xarray()[source]
Returns

an xarray DataArray with appropriate metadata and the appropriate time index.

Return type

xarray.DataArray

Note

Metadata will not be validated if changed in an xarray.

loads from memory

write_metadata()[source]

Write metadata from the metadata container to the HDF5 attrs dictionary.

class mth5.groups.ElectricDataset(group, **kwargs)[source]

Bases: ChannelDataset

Holds a channel dataset. This is a simple container for the data to make sure that the user has the flexibility to turn the channel into an object they want to deal with.

For now all the numpy type slicing can be used on hdf5_dataset

Parameters
  • dataset (h5py.Dataset) – dataset object for the channel

  • dataset_metadata ([ mth5.metadata.Electric | mth5.metadata.Magnetic | mth5.metadata.Auxiliary ], optional) – metadata container, defaults to None

Raises

MTH5Error – If the dataset is not of the correct type

Utilities will be written to create some common objects like:

  • xarray.DataArray

  • pandas.DataFrame

  • zarr

  • dask.Array

The benefit of these other objects is that they can be indexed by time, and they have much more built-in functionality.

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> run = mth5_obj.stations_group.get_station('MT001').get_run('MT001a')
>>> channel = run.get_channel('Ex')
>>> channel
 Channel Electric:
 -------------------
   component:        Ex
   data type:        electric
   data format:      float32
   data shape:       (4096,)
   start:            1980-01-01T00:00:00+00:00
   end:              1980-01-01T00:00:01+00:00
   sample rate:      4096
class mth5.groups.EstimateDataset(dataset, dataset_metadata=None, write_metadata=True, **kwargs)[source]

Bases: object

Holds a statistical estimate

This will hold multi-dimensional statistical estimates for transfer functions.

Parameters
  • dataset (h5py.Dataset) – hdf5 dataset

  • dataset_metadata (mt_metadata.transfer_functions.tf.StatisticalEstimate, optional) – data set metadata, defaults to None

Parameters
  • write_metadata (Boolean, optional) – True to write metadata, defaults to True

  • **kwargs

    DESCRIPTION

Raises

MTH5Error – When an estimate is not present, or metadata name does not match the given name

from_numpy(new_estimate)[source]
Returns

a numpy structured array

Return type

numpy.ndarray

Note

data is a builtin to numpy and cannot be used as a name

loads into RAM

from_xarray(data)[source]
Returns

an xarray DataArray with appropriate metadata and the appropriate coordinates base on the metadata.

Return type

xarray.DataArray

Note

Metadata will not be validated if changed in an xarray.

loads from memory

read_metadata()[source]

Read metadata from the HDF5 file into the metadata container, that way it can be validated.

replace_dataset(new_data_array)[source]

replace the entire dataset with a new one, nothing left behind

Parameters

new_data_array (numpy.ndarray) – new data array

to_numpy()[source]
Returns

a numpy structured array with

Return type

numpy.ndarray

loads into RAM

to_xarray(period)[source]
Returns

an xarray DataArray with appropriate metadata and the appropriate coordinates.

Return type

xarray.DataArray

Note

Metadata will not be validated if it is changed in the xarray.

loads from memory

write_metadata()[source]

Write metadata from the metadata container to the HDF5 attrs dictionary.

class mth5.groups.ExperimentGroup(group, **kwargs)[source]

Bases: BaseGroup

Utility class to hold general information about the experiment and accompanying metadata for an MT experiment.

To access the hdf5 group directly use ExperimentGroup.hdf5_group.

>>> experiment = ExperimentGroup(hdf5_group)
>>> experiment.hdf5_group.ref
<HDF5 Group Reference>

Note

All attributes should be input into the metadata object, that way all input will be validated against the metadata standards. If you change attributes in metadata object, you should run the ExperimentGroup.write_metadata() method. This is a temporary solution, working on an automatic updater if metadata is changed.

>>> experiment.metadata.existing_attribute = 'update_existing_attribute'
>>> experiment.write_metadata()

If you want to add a new attribute this should be done using the metadata.add_base_attribute method.

>>> experiment.metadata.add_base_attribute('new_attribute',
>>> ...                                'new_attribute_value',
>>> ...                                {'type':str,
>>> ...                                 'required':True,
>>> ...                                 'style':'free form',
>>> ...                                 'description': 'new attribute desc.',
>>> ...                                 'units':None,
>>> ...                                 'options':[],
>>> ...                                 'alias':[],
>>> ...                                 'example':'new attribute example'})

Tip

If you want to add stations, reports, etc. to the experiment, this should be done from the MTH5 object. This is to avoid duplication, at least for now.

To look at what the structure of /Experiment looks like:

>>> experiment
/Experiment:
====================
    |- Group: Surveys
    -----------------
    |- Group: Reports
    -----------------
    |- Group: Standards
    -------------------
    |- Group: Stations
    ------------------
property metadata

Overwrite get metadata to include station information

property surveys_group
class mth5.groups.FCChannelDataset(dataset, dataset_metadata=None, write_metadata=True, **kwargs)[source]

Bases: object

This will hold a multi-dimensional set of Fourier coefficients.

Columns

  • time

  • frequency [ integer as harmonic index or float ]

  • fc (complex)

  • weight_channel (maybe)

  • weight_band (maybe)

  • weight_time (maybe)

- name
- start time
- end time
- acquisition_sample_rate
- decimated_sample_rate
- window_sample_rate (delta_t within the window)
- units
- [optional] weights or masking
- frequency method (integer * window length / delta_t of window)

Parameters
  • dataset (h5py.Dataset) – hdf5 dataset

  • dataset_metadata (mt_metadata.transfer_functions.tf.StatisticalEstimate, optional) – dataset metadata, defaults to None

Parameters
  • write_metadata (Boolean, optional) – True to write metadata, defaults to True

  • **kwargs – additional keyword arguments

Raises

MTH5Error – When an estimate is not present, or metadata name does not match the given name

property frequency

frequency array dictated by window size and sample rate

Returns

frequency array for this decimation level

Return type

numpy.ndarray

from_numpy(new_estimate)[source]
Returns

a numpy structured array

Return type

numpy.ndarray

Note

"data" is a built-in name in numpy and cannot be used as a field name

loads into RAM

from_xarray(data)[source]
Returns

an xarray DataArray with appropriate metadata and the appropriate coordinates based on the metadata.

Return type

xarray.DataArray

Note

Metadata will not be validated if it is changed in the xarray.

loads from memory

property n_frequencies

number of frequencies (window size)

property n_windows

number of time windows
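The relationship among window size, sample rate, and the frequency axis can be sketched in plain Python. The values below are hypothetical, and a one-sided (rfft-style) spectrum is assumed; the actual properties read these quantities from the dataset metadata.

```python
# Sketch of the frequency axis implied by a window size and decimated
# sample rate.  window_size and sample_rate are hypothetical values.
window_size = 128        # samples per FFT window
sample_rate = 1.0        # samples/second at this decimation level

# one-sided spectrum: harmonics 0 .. window_size // 2
n_frequencies = window_size // 2 + 1

# harmonic k maps to frequency k * sample_rate / window_size
frequencies = [k * sample_rate / window_size for k in range(n_frequencies)]

print(n_frequencies)     # 65
print(frequencies[1])    # 0.0078125  (frequency resolution, 1/128 Hz)
print(frequencies[-1])   # 0.5        (Nyquist, sample_rate / 2)
```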

read_metadata()[source]

Read metadata from the HDF5 file into the metadata container, that way it can be validated.

replace_dataset(new_data_array)[source]

Replace the entire dataset with a new one; the original data are discarded.

Parameters

new_data_array (numpy.ndarray) – new data array

property time

Time array that includes the start of each time window

Returns

time array of window start times

Return type

numpy.ndarray

to_numpy()[source]
Returns

a numpy structured array

Return type

numpy.ndarray

loads into RAM

to_xarray()[source]
Returns

an xarray DataArray with appropriate metadata and the appropriate coordinates.

Return type

xarray.DataArray

Note

Metadata will not be validated if it is changed in the xarray.

loads from memory

write_metadata()[source]

Write metadata from the metadata container to the HDF5 attrs dictionary.

class mth5.groups.FCDecimationGroup(group, decimation_level_metadata=None, **kwargs)[source]

Bases: BaseGroup

Holds a single decimation level

Attributes

  • start time

  • end time

  • channels (list)

  • decimation factor

  • decimation level

  • decimation sample rate

  • method (FFT, wavelet, …)

  • anti alias filter

  • prewhitening type

  • extra_pre_fft_detrend_type

  • recoloring (True | False)

  • harmonics_kept (index values of harmonics kept (list) | ‘all’)

  • window parameters
    • length

    • overlap

    • type

    • type parameters

    • window sample rate (method or property)

  • [optional] masking or weighting information

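How the window length and overlap determine the number of windows at a decimation level can be sketched as follows. The values are illustrative only; the actual count comes from the stored data.

```python
# Number of whole windows obtained from a series of n_samples using a
# sliding window of `length` samples that advances by (length - overlap).
# overlap must be less than length.
def n_windows(n_samples, length, overlap):
    advance = length - overlap
    if n_samples < length:
        return 0
    return (n_samples - length) // advance + 1

# hypothetical example: 4096 samples, 256-sample window, 75% overlap
print(n_windows(4096, 256, 192))   # 61
```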
add_channel(fc_name, fc_data=None, fc_metadata=None, max_shape=(None, None), chunks=True, **kwargs)[source]

Add a set of Fourier coefficients for a single channel at a single decimation level for a processing run.

  • time

  • frequency [ integer as harmonic index or float ]

  • fc (complex)

The input can be

  • a numpy array where the index values for the time, frequency, and

coefficients are supplied in time_key, frequency_key, and channel_key as integer values.

  • a numpy structured array, dataframe, or xarray dataset or dataarray

where channel_key, if not supplied, is assumed to be the same as fc_name.

Dataframe and xarray input have not been fully tested yet.

Weights should be stored as a separate 1D dataset with an index matching the FCs.

  • weight_channel (maybe)

  • weight_band (maybe)

  • weight_time (maybe)

Parameters
  • fc_name (string) – name of the FC dataset

  • fc_metadata (optional) – FC dataset metadata, defaults to None

Returns

the FC channel dataset

Return type

mth5.groups.FCChannelDataset

add_weights(weight_name, weight_data=None, weight_metadata=None, max_shape=(None, None, None), chunks=True, **kwargs)[source]
property channel_summary

Summary of channels in the run.

from_dataframe(df, channel_key, time_key='time', frequency_key='frequency')[source]

assumes channel_key is the coefficient values

Parameters
  • df (pandas.DataFrame) – dataframe containing the Fourier coefficients

  • channel_key (string) – column holding the coefficient values

  • time_key (string, optional) – column holding the time index, defaults to "time"

  • frequency_key (string, optional) – column holding the frequency index, defaults to "frequency"

Returns

DESCRIPTION

Return type

TYPE

from_numpy_array(nd_array, ch_name)[source]

assumes shape of (n_frequencies, n_windows) or (n_channels, n_frequencies, n_windows)

Parameters
  • nd_array (numpy.ndarray) – array of Fourier coefficients with shape (n_frequencies, n_windows) or (n_channels, n_frequencies, n_windows)

  • ch_name (string or list of strings) – name of channel(s)

Returns

DESCRIPTION

Return type

TYPE

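The (n_frequencies, n_windows) layout that from_numpy_array expects can be produced by a plain short-time DFT. The sketch below uses only the standard library with hypothetical window parameters; it is illustrative, not the mth5 implementation.

```python
import cmath

def stft(series, length, overlap):
    """Short-time DFT returning nested lists shaped (n_frequencies, n_windows)
    with a one-sided spectrum (harmonics 0 .. length // 2)."""
    advance = length - overlap  # overlap must be less than length
    windows = [series[s:s + length]
               for s in range(0, len(series) - length + 1, advance)]
    n_freq = length // 2 + 1
    fcs = []
    for k in range(n_freq):          # rows are harmonics
        row = []
        for w in windows:            # columns are time windows
            row.append(sum(x * cmath.exp(-2j * cmath.pi * k * n / length)
                           for n, x in enumerate(w)))
        fcs.append(row)
    return fcs

# hypothetical: 64 samples, 16-sample windows, no overlap -> 4 windows
coeffs = stft([1.0] * 64, 16, 0)
print(len(coeffs), len(coeffs[0]))   # 9 4
```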
from_xarray(data_array)[source]

can input a dataarray or dataset

Parameters

data_array (xarray.DataArray or xarray.Dataset) – input data array or dataset

Returns

DESCRIPTION

Return type

TYPE

get_channel(fc_name)[source]

get an fc dataset

Parameters

fc_name (string) – name of the FC dataset

Returns

the requested FC channel dataset

Return type

mth5.groups.FCChannelDataset

property metadata

Overwrite get metadata to include channel information in the runs

remove_channel(fc_name)[source]

remove an fc dataset

Parameters

fc_name (string) – name of the FC dataset to remove

Returns

DESCRIPTION

Return type

TYPE

to_xarray(channels=None)[source]

Create an xarray dataset from the desired channels. If None, all channels in the decimation level are included.

Parameters

channels (list of strings, optional) – channels to include, defaults to None

Returns

dataset of the requested Fourier coefficient channels

Return type

xarray.Dataset

update_metadata()[source]

update metadata from channels

Returns

DESCRIPTION

Return type

TYPE

class mth5.groups.FCGroup(group, decimation_level_metadata=None, **kwargs)[source]

Bases: BaseGroup

Holds a set of Fourier Coefficients based on a single set of configuration parameters.

Note

FCs must be calibrated; otherwise unexpected results may occur. The FC estimation can always be rerun if the metadata change.

Metadata should include:

  • list of decimation levels

  • start time (earliest)

  • end time (latest)

  • method (fft, wavelet, …)

  • list of channels (all inclusive)

  • list of acquisition runs (maybe)

  • starting sample rate

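The sample rate at each decimation level follows from the starting sample rate and the per-level decimation factors. A minimal sketch with hypothetical values (a factor of 1 means the first level keeps the raw rate):

```python
# Hypothetical cascade: starting sample rate and per-level decimation factors.
starting_sample_rate = 256.0          # samples/second
decimation_factors = [1, 4, 4, 4]

rates = []
rate = starting_sample_rate
for factor in decimation_factors:
    rate = rate / factor
    rates.append(rate)

print(rates)   # [256.0, 64.0, 16.0, 4.0]
```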
add_decimation_level(decimation_level_name, decimation_level_metadata=None)[source]

add a Decimation level

Parameters
  • decimation_level_name (string) – name of the decimation level

  • decimation_level_metadata (optional) – decimation level metadata, defaults to None

Returns

convenience class for the added decimation level

Return type

mth5.groups.FCDecimationGroup

property decimation_level_summary

Summary of decimation levels in the FC group.

get_decimation_level(decimation_level_name)[source]

Get a Decimation Level

Parameters

decimation_level_name (string) – existing decimation level name

Returns

convenience class for the decimation level

Return type

mth5.groups.FCDecimationGroup

property metadata

Overwrite get metadata to include channel information in the runs

remove_decimation_level(decimation_level_name)[source]

Remove decimation level

Parameters

decimation_level_name (string) – existing decimation level name

Returns

DESCRIPTION

Return type

TYPE

supports_aurora_processing_config(processing_config, remote)[source]

This is an "all-or-nothing" check: either every (valid) decimation level in the processing config is available, or all FCs will be built.

Parameters
  • processing_config (aurora.config.metadata.processing.Processing) –

  • remote (bool) –

update_metadata()[source]

update metadata from channels

Returns

DESCRIPTION

Return type

TYPE

class mth5.groups.FiltersGroup(group, **kwargs)[source]

Bases: BaseGroup

Not implemented yet

add_filter(filter_object)[source]

Add a filter dataset based on type

current types are:
  • zpk –> zeros, poles, gain

  • fap –> frequency look up table

  • time_delay –> time delay filter

  • coefficient –> coefficient filter

Parameters

filter_object (mt_metadata.timeseries.filters) – An MT metadata filter object
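The response that a zpk filter represents can be evaluated directly from its zeros, poles, and gain. The sketch below uses only the standard library and is illustrative of the concept, not the mt_metadata implementation; the corner frequency and gain are hypothetical values.

```python
import math

def zpk_response(zeros, poles, gain, frequency):
    """Complex response of a zeros-poles-gain filter at `frequency` (Hz),
    evaluated on the imaginary axis s = 2*pi*j*f (Laplace convention)."""
    s = 2j * math.pi * frequency
    num = 1 + 0j
    for z in zeros:
        num *= (s - z)
    den = 1 + 0j
    for p in poles:
        den *= (s - p)
    return gain * num / den

# single-pole low-pass with a 1 Hz corner, unit gain at DC
pole = -2 * math.pi
h0 = zpk_response([], [pole], 2 * math.pi, 0.0)
h1 = zpk_response([], [pole], 2 * math.pi, 1.0)
print(abs(h0))   # 1.0
print(abs(h1))   # ~0.707 (half power at the corner frequency)
```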

property filter_dict
get_filter(name)[source]

Get a filter by name

to_filter_object(name)[source]

return the MT metadata representation of the filter

class mth5.groups.MagneticDataset(group, **kwargs)[source]

Bases: ChannelDataset

Holds a channel dataset. This is a simple container for the data to make sure that the user has the flexibility to turn the channel into an object they want to deal with.

For now all the numpy type slicing can be used on hdf5_dataset

Parameters
  • dataset (h5py.Dataset) – dataset object for the channel

  • dataset_metadata ([ mth5.metadata.Electric | mth5.metadata.Magnetic | mth5.metadata.Auxiliary ], optional) – metadata container, defaults to None

Raises

MTH5Error – If the dataset is not of the correct type

Utilities will be written to create some common objects like:

  • xarray.DataArray

  • pandas.DataFrame

  • zarr

  • dask.Array

The benefit of these other objects is that they can be indexed by time and have much more built-in functionality.

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> run = mth5_obj.stations_group.get_station('MT001').get_run('MT001a')
>>> channel = run.get_channel('Ex')
>>> channel
 Channel Electric:
 -------------------
   component:        Ex
   data type:        electric
   data format:      float32
   data shape:       (4096,)
   start:            1980-01-01T00:00:00+00:00
   end:              1980-01-01T00:00:01+00:00
   sample rate:      4096
class mth5.groups.MasterFCGroup(group, **kwargs)[source]

Bases: BaseGroup

Master group to hold various Fourier coefficient estimations of time series data. No metadata needed as of yet.

add_fc_group(fc_name, fc_metadata=None)[source]

Add a Fourier Coefficient group

Parameters
  • fc_name (string) – name of the FC group

  • fc_metadata (optional) – FC group metadata, defaults to None

Returns

convenience class for the added FC group

Return type

mth5.groups.FCGroup

property fc_summary

Summary of Fourier coefficient groups in the file

Returns

DESCRIPTION

Return type

TYPE

get_fc_group(fc_name)[source]

Get Fourier Coefficient group

Parameters

fc_name (string) – existing FC group name

Returns

convenience class for the FC group

Return type

mth5.groups.FCGroup

remove_fc_group(fc_name)[source]

Remove an FC group

Parameters

fc_name (string) – existing FC group name

Returns

DESCRIPTION

Return type

TYPE

class mth5.groups.MasterStationGroup(group, **kwargs)[source]

Bases: BaseGroup

Utility class to hold information about the stations within a survey and accompanying metadata. This class is the next level down from Survey: /Survey/Stations. This class provides methods to add and get stations. A summary table of all existing stations is also provided as a convenience look-up table to make searching easier.

To access MasterStationGroup from an open MTH5 file:

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> stations = mth5_obj.stations_group

To check what stations exist

>>> stations.groups_list
['summary', 'MT001', 'MT002', 'MT003']

To access the hdf5 group directly use SurveyGroup.hdf5_group.

>>> stations.hdf5_group.ref
<HDF5 Group Reference>

Note

All attributes should be input into the metadata object, that way all input will be validated against the metadata standards. If you change attributes in metadata object, you should run the SurveyGroup.write_metadata() method. This is a temporary solution, working on an automatic updater if metadata is changed.

>>> stations.metadata.existing_attribute = 'update_existing_attribute'
>>> stations.write_metadata()

If you want to add a new attribute this should be done using the metadata.add_base_attribute method.

>>> stations.metadata.add_base_attribute('new_attribute',
>>> ...                                'new_attribute_value',
>>> ...                                {'type':str,
>>> ...                                 'required':True,
>>> ...                                 'style':'free form',
>>> ...                                 'description': 'new attribute desc.',
>>> ...                                 'units':None,
>>> ...                                 'options':[],
>>> ...                                 'alias':[],
>>> ...                                 'example':'new attribute example'})

To add a station:

>>> new_station = stations.add_station('new_station')
>>> stations
/Survey/Stations:
====================
    --> Dataset: summary
    ......................
    |- Group: new_station
    ---------------------
        --> Dataset: summary
        ......................

Add a station with metadata:

>>> from mth5.metadata import Station
>>> station_metadata = Station()
>>> station_metadata.id = 'MT004'
>>> station_metadata.time_period.start = '2020-01-01T12:30:00'
>>> station_metadata.location.latitude = 40.000
>>> station_metadata.location.longitude = -120.000
>>> new_station = stations.add_station('Test_01', station_metadata)
>>> # to look at the metadata
>>> new_station.metadata
{
    "station": {
        "acquired_by.author": null,
        "acquired_by.comments": null,
        "id": "MT004",
        ...
        }
}

See also

mth5.metadata for details on how to add metadata from various files and python objects.

To remove a station:

>>> stations.remove_station('new_station')
>>> stations
/Survey/Stations:
====================
    --> Dataset: summary
    ......................

Note

Deleting a station is not as simple as del(station). In HDF5 this does not free up memory, it simply removes the reference to that station. The common way to get around this is to copy what you want into a new file, or overwrite the station.

To get a station:

>>> existing_station = stations.get_station('existing_station_name')
>>> existing_station
/Survey/Stations/existing_station_name:
=======================================
    --> Dataset: summary
    ......................
    |- Group: run_01
    ----------------
        --> Dataset: summary
        ......................
        --> Dataset: Ex
        ......................
        --> Dataset: Ey
        ......................
        --> Dataset: Hx
        ......................
        --> Dataset: Hy
        ......................
        --> Dataset: Hz
        ......................

A summary table is provided to make searching easier. The table summarizes all stations within a survey. To see what names are in the summary table:

>>> stations.station_summary
add_station(station_name, station_metadata=None)[source]
Add a station with metadata if given with the path:

/Survey/Stations/station_name

If the station already exists, that station is returned and nothing is added.

Parameters
  • station_name (string) – Name of the station, should be the same as metadata.id

  • station_metadata (mth5.metadata.Station, optional) – Station metadata container, defaults to None

Returns

A convenience class for the added station

Return type

mth5_groups.StationGroup

Example
>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> # one option
>>> stations = mth5_obj.stations_group
>>> new_station = stations.add_station('MT001')
>>> # another option
>>> new_station = mth5_obj.stations_group.add_station('MT001')
get_station(station_name)[source]

Get a station with the same name as station_name

Parameters

station_name (string) – existing station name

Returns

convenience station class

Return type

mth5.mth5_groups.StationGroup

Raises

MTH5Error – if the station name is not found.

Example

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> # one option
>>> stations = mth5_obj.stations_group
>>> existing_station = stations.get_station('MT001')
>>> # another option
>>> existing_station = mth5_obj.stations_group.get_station('MT001')
MTH5Error: MT001 does not exist, check station_list for existing names
remove_station(station_name)[source]

Remove a station from the file.

Note

Deleting a station is not as simple as del(station). In HDF5 this does not free up memory, it simply removes the reference to that station. The common way to get around this is to copy what you want into a new file, or overwrite the station.

Parameters

station_name (string) – existing station name

Example
>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> # one option
>>> stations = mth5_obj.stations_group
>>> stations.remove_station('MT001')
>>> # another option
>>> mth5_obj.stations_group.remove_station('MT001')
property station_summary

Summary of stations in the file

Returns

summary table of all stations

Return type

pandas.DataFrame

class mth5.groups.MasterSurveyGroup(group, **kwargs)[source]

Bases: BaseGroup

Utility class to hold information about the surveys within an experiment and accompanying metadata. This class is the next level down from Experiment: Experiment/Surveys. This class provides methods to add and get surveys.

To access MasterSurveyGroup from an open MTH5 file:

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> surveys = mth5_obj.surveys_group

To check what surveys exist

>>> surveys.groups_list
['survey_01', 'survey_02']

To access the hdf5 group directly use SurveyGroup.hdf5_group.

>>> surveys.hdf5_group.ref
<HDF5 Group Reference>

Note

All attributes should be input into the metadata object, that way all input will be validated against the metadata standards. If you change attributes in metadata object, you should run the SurveyGroup.write_metadata() method. This is a temporary solution, working on an automatic updater if metadata is changed.

If you want to add a new attribute this should be done using the metadata.add_base_attribute method.

>>> surveys.metadata.add_base_attribute('new_attribute',
>>> ...                                'new_attribute_value',
>>> ...                                {'type':str,
>>> ...                                 'required':True,
>>> ...                                 'style':'free form',
>>> ...                                 'description': 'new attribute desc.',
>>> ...                                 'units':None,
>>> ...                                 'options':[],
>>> ...                                 'alias':[],
>>> ...                                 'example':'new attribute example'})

To add a survey:

>>> new_survey = surveys.add_survey('new_survey')
>>> surveys
Experiment/Surveys:
====================
    |- Group: new_survey
    ---------------------
        |- Group: Filters
        ------------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
        |- Group: Stations
        ------------------

Add a survey with metadata:

>>> from mth5.metadata import Survey
>>> survey_metadata = Survey()
>>> survey_metadata.id = 'MT004'
>>> survey_metadata.time_period.start = '2020-01-01T12:30:00'
>>> new_survey = surveys.add_survey('Test_01', survey_metadata)
>>> # to look at the metadata
>>> new_survey.metadata
{
    "survey": {
        "acquired_by.author": null,
        "acquired_by.comments": null,
        "id": "MT004",
        ...
        }
}

See also

mth5.metadata for details on how to add metadata from various files and python objects.

To remove a survey:

>>> surveys.remove_survey('new_survey')
>>> surveys
Experiment/Surveys:
====================

Note

Deleting a survey is not as simple as del(survey). In HDF5 this does not free up memory, it simply removes the reference to that survey. The common way to get around this is to copy what you want into a new file, or overwrite the survey.

To get a survey:

>>> existing_survey = surveys.get_survey('existing_survey_name')
add_survey(survey_name, survey_metadata=None)[source]
Add a survey with metadata if given with the path:

Experiment/Surveys/survey_name

If the survey already exists, that survey is returned and nothing is added.

Parameters
  • survey_name (string) – Name of the survey, should be the same as metadata.id

  • survey_metadata (mth5.metadata.Survey, optional) – Survey metadata container, defaults to None

Returns

A convenience class for the added survey

Return type

mth5_groups.SurveyGroup

To add a survey

>>> new_survey = surveys.add_survey('new_survey')
>>> surveys
Experiment/Surveys:
====================
    |- Group: new_survey
    ---------------------
        |- Group: Filters
        ------------------
        |- Group: Reports
        -----------------
        |- Group: Standards
        -------------------
        |- Group: Stations
        ------------------
Add a survey with metadata

>>> from mth5.metadata import Survey
>>> survey_metadata = Survey()
>>> survey_metadata.id = 'MT004'
>>> survey_metadata.time_period.start = '2020-01-01T12:30:00'
>>> new_survey = surveys.add_survey('Test_01', survey_metadata)
>>> # to look at the metadata
>>> new_survey.metadata
{
    "survey": {
        "acquired_by.author": null,
        "acquired_by.comments": null,
        "id": "MT004",
        ...
        }
}

See also

mth5.metadata for details on how to add metadata from various files and python objects.

property channel_summary

Summary of all channels in the file.

get_survey(survey_name)[source]

Get a survey with the same name as survey_name

Parameters

survey_name (string) – existing survey name

Returns

convenience survey class

Return type

mth5.mth5_groups.SurveyGroup

Raises

MTH5Error – if the survey name is not found.

Example

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> # one option
>>> existing_survey = mth5_obj.get_survey('MT001')
>>> # another option
>>> existing_survey = mth5_obj.experiment_group.surveys_group.get_survey('MT001')
MTH5Error: MT001 does not exist, check survey_list for existing names
remove_survey(survey_name)[source]

Remove a survey from the file.

Note

Deleting a survey is not as simple as del(survey). In HDF5 this does not free up memory, it simply removes the reference to that survey. The common way to get around this is to copy what you want into a new file, or overwrite the survey.

Parameters

survey_name (string) – existing survey name

Example
>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> # one option
>>> mth5_obj.remove_survey('MT001')
>>> # another option
>>> mth5_obj.surveys_group.remove_survey('MT001')
class mth5.groups.ReportsGroup(group, **kwargs)[source]

Bases: BaseGroup

Not sure how to handle this yet

add_report(report_name, report_metadata=None, report_data=None)[source]
Parameters
  • report_name (TYPE) – DESCRIPTION

  • report_metadata (TYPE, optional) – DESCRIPTION, defaults to None

  • report_data (TYPE, optional) – DESCRIPTION, defaults to None

Returns

DESCRIPTION

Return type

TYPE

class mth5.groups.RunGroup(group, run_metadata=None, **kwargs)[source]

Bases: BaseGroup

RunGroup is a utility class to hold information about a single run and accompanying metadata. This class is the next level down from Stations –> /Survey/Stations/station/station{a-z}.

This class provides methods to add and get channels. A summary table of all existing channels in the run is also provided as a convenience look up table to make searching easier.

Parameters
  • group (h5py.Group) – HDF5 group for a run, should have a path /Survey/Stations/station_name/run_name

  • run_metadata (mth5.metadata.Run, optional) – metadata container, defaults to None

Access RunGroup from an open MTH5 file

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> run = mth5_obj.stations_group.get_station('MT001').get_run('MT001a')
Check what channels exist

>>> run.groups_list
['Ex', 'Ey', 'Hx', 'Hy']

To access the hdf5 group directly use RunGroup.hdf5_group

>>> run.hdf5_group.ref
<HDF5 Group Reference>

Note

All attributes should be input into the metadata object, that way all input will be validated against the metadata standards. If you change attributes in the metadata object, you should run the RunGroup.write_metadata() method. This is a temporary solution, working on an automatic updater if metadata is changed.

>>> run.metadata.existing_attribute = 'update_existing_attribute'
>>> run.write_metadata()

If you want to add a new attribute this should be done using the metadata.add_base_attribute method.

>>> run.metadata.add_base_attribute('new_attribute',
>>> ...                                 'new_attribute_value',
>>> ...                                 {'type':str,
>>> ...                                  'required':True,
>>> ...                                  'style':'free form',
>>> ...                                  'description': 'new attribute desc.',
>>> ...                                  'units':None,
>>> ...                                  'options':[],
>>> ...                                  'alias':[],
>>> ...                                  'example':'new attribute example'})
Add a channel

>>> new_channel = run.add_channel('Ex', 'electric',
>>> ...                            data=numpy.random.rand(4096))
>>> run
/Survey/Stations/MT001/MT001a:
=======================================
    --> Dataset: summary
    ......................
    --> Dataset: Ex
    ......................
    --> Dataset: Ey
    ......................
    --> Dataset: Hx
    ......................
    --> Dataset: Hy
    ......................
Add a channel with metadata

>>> from mth5.metadata import Electric
>>> ex_metadata = Electric()
>>> ex_metadata.time_period.start = '2020-01-01T12:30:00'
>>> ex_metadata.time_period.end = '2020-01-03T16:30:00'
>>> new_ex = run.add_channel('Ex', 'electric',
>>> ...                       channel_metadata=ex_metadata)
>>> # to look at the metadata
>>> new_ex.metadata
{
     "electric": {
        "ac.end": 1.2,
        "ac.start": 2.3,
        ...
        }
}

See also

mth5.metadata for details on how to add metadata from various files and python objects.

Remove a channel

>>> run.remove_channel('Ex')
>>> run
/Survey/Stations/MT001/MT001a:
=======================================
    --> Dataset: summary
    ......................
    --> Dataset: Ey
    ......................
    --> Dataset: Hx
    ......................
    --> Dataset: Hy
    ......................

Note

Deleting a channel is not as simple as del(channel). In HDF5 this does not free up memory, it simply removes the reference to that channel. The common way to get around this is to copy what you want into a new file, or overwrite the channel.

Get a channel

>>> existing_ex = run.get_channel('Ex')
>>> existing_ex
Channel Electric:
-------------------
    component:        Ex
    data type:        electric
    data format:      float32
    data shape:       (4096,)
    start:            1980-01-01T00:00:00+00:00
    end:              1980-01-01T00:08:32+00:00
    sample rate:      8
Summary Table

A summary table is provided to make searching easier. The table summarizes all channels within a run. To see what names are in the summary table:

>>> run.summary_table.dtype.descr
[('component', ('|S5', {'h5py_encoding': 'ascii'})),
 ('start', ('|S32', {'h5py_encoding': 'ascii'})),
 ('end', ('|S32', {'h5py_encoding': 'ascii'})),
 ('n_samples', '<i4'),
 ('measurement_type', ('|S12', {'h5py_encoding': 'ascii'})),
 ('units', ('|S25', {'h5py_encoding': 'ascii'})),
 ('hdf5_reference', ('|O', {'ref': h5py.h5r.Reference}))]

Note

When a channel is added an entry is added to the summary table, where the information is pulled from the metadata.

>>> new_run.summary_table
index | component | start | end | n_samples | measurement_type | units |
hdf5_reference
--------------------------------------------------------------------------
-------------
add_channel(channel_name, channel_type, data, channel_dtype='int32', shape=None, max_shape=(None,), chunks=True, channel_metadata=None, **kwargs)[source]

add a channel to the run

Parameters
  • channel_name (string) – name of the channel

  • channel_type (string) – [ electric | magnetic | auxiliary ]

  • shape (tuple, optional) – Set the shape of the array; uses the data if input. This can be useful if you are setting up the file. shape sets the size of the dataset, whereas max_shape sets the maximum shape, which results in a different memory footprint. If you are not sure about the size of the array, use max_shape; if you already know the size and want to start with an array of zeros, use shape. Defaults to None

  • max_shape (tuple, optional) – Absolute max shape of the data to be stored, this means the data can be extended up to the given shape. If None is given then the data can be extended infinitely (or until memory runs out), defaults to (None,)

  • chunks (bool, optional) – Use chunked storage, defaults to True

  • channel_metadata ([ mth5.metadata.Electric | mth5.metadata.Magnetic | mth5.metadata.Auxiliary ], optional) – metadata container, defaults to None

  • **kwargs

    Key word arguments

Raises

MTH5Error – If channel type is not correct

Returns

Channel container

Return type

[ mth5.mth5_groups.ElectricDataset | mth5.mth5_groups.MagneticDataset | mth5.mth5_groups.AuxiliaryDataset ]

>>> new_channel = run.add_channel('Ex', 'electric', None)
>>> new_channel
Channel Electric:
-------------------
    component:        None
    data type:        electric
    data format:      float32
    data shape:       (1,)
    start:            1980-01-01T00:00:00+00:00
    end:              1980-01-01T00:00:00+00:00
    sample rate:      None
property channel_summary

Summary of channels in the run.

from_channel_ts(channel_ts_obj)[source]

create a channel data set from a mth5.timeseries.ChannelTS object and update metadata.

Parameters

channel_ts_obj (mth5.timeseries.ChannelTS) – a single time series object

Returns

new channel dataset

Return type

mth5.groups.ChannelDataset

from_runts(run_ts_obj, **kwargs)[source]

create channel datasets from a mth5.timeseries.RunTS object and update metadata.

:parameter mth5.timeseries.RunTS run_ts_obj: Run object with all the appropriate channels and metadata.

Will create a run group and appropriate channel datasets.

get_channel(channel_name)[source]

Get a channel from an existing name. Returns the appropriate container.

Parameters

channel_name (string) – name of the channel

Returns

Channel container

Return type

[ mth5.groups.ElectricDataset | mth5.groups.MagneticDataset | mth5.groups.AuxiliaryDataset ]

Raises

MTH5Error – If no channel is found

Example

>>> existing_channel = run.get_channel('Ex')
MTH5Error: Ex does not exist, check groups_list for existing names
>>> run.groups_list
['Ey', 'Hx', 'Hz']
>>> existing_channel = run.get_channel('Ey')
>>> existing_channel
Channel Electric:
-------------------
                component:        Ey
        data type:        electric
        data format:      float32
        data shape:       (4096,)
        start:            1980-01-01T00:00:00+00:00
        end:              1980-01-01T00:00:01+00:00
        sample rate:      4096
property metadata

Overwrite get metadata to include channel information in the run

plot(start=None, end=None, n_samples=None)[source]

Produce a simple matplotlib plot using runts

remove_channel(channel_name)[source]

Remove a channel from the run.

Note

Deleting a channel is not as simple as del(channel). In HDF5 this does not free up memory, it simply removes the reference to that channel. The common way to get around this is to copy what you want into a new file, or overwrite the channel.

Parameters

channel_name (string) – existing channel name

Example

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> run = mth5_obj.stations_group.get_station('MT001').get_run('MT001a')
>>> run.remove_channel('Ex')
property station_metadata

station metadata

property survey_metadata

survey metadata

to_runts(start=None, end=None, n_samples=None)[source]

create a mth5.timeseries.RunTS object from channels of the run

Returns

RunTS object built from the channels of the run

Return type

mth5.timeseries.RunTS

update_run_metadata()[source]

Update metadata and table entries to ensure consistency


write_metadata()[source]

Overrides Base.write_metadata to also update the summary table entry. Writes HDF5 metadata from the metadata object.

class mth5.groups.StandardsGroup(group, **kwargs)[source]

Bases: BaseGroup

The StandardsGroup is a convenience group that stores the metadata standards that were used to make the current file. This is to help a user understand the metadata directly from the file and not have to look up documentation that might not be updated.

The metadata standards are stored in the summary table /Survey/Standards/summary

>>> standards = mth5_obj.standards_group
>>> standards.summary_table
index | attribute | type | required | style | units | description | options | alias | example
---------------------------------------------------------------------------------------------
get_attribute_information(attribute_name)[source]

get information about an attribute

The attribute name should be in the summary table.

Parameters

attribute_name (string) – attribute name

Returns

prints a description of the attribute

Raises

MTH5TableError – if attribute is not found

>>> standards = mth5_obj.standards_group
>>> standards.get_attribute_information('survey.release_license')
survey.release_license
--------------------------
        type          : string
        required      : True
        style         : controlled vocabulary
        units         :
        description   : How the data can be used. The options are based on
                 Creative Commons licenses. For details visit
                 https://creativecommons.org/licenses/
        options       : CC-0,CC-BY,CC-BY-SA,CC-BY-ND,CC-BY-NC-SA,CC-BY-NC-ND
        alias         :
        example       : CC-0
        default       : CC-0
initialize_group()[source]

Initialize the group by making a summary table that summarizes the metadata standards used to describe the data.

Also, write generic metadata information.

property summary_table
summary_table_from_dict(summary_dict)[source]

Fill summary table from a dictionary that summarizes the metadata for the entire survey.

Parameters

summary_dict (dictionary) – Flattened dictionary of all metadata standards within the survey.

class mth5.groups.StationGroup(group, station_metadata=None, **kwargs)[source]

Bases: BaseGroup

StationGroup is a utility class to hold information about a single station and accompanying metadata. This class is the next level down from Stations –> /Survey/Stations/station_name.

This class provides methods to add and get runs. A summary table of all existing runs in the station is also provided as a convenience look up table to make searching easier.

Parameters
  • group (h5py.Group) – HDF5 group for a station, should have a path /Survey/Stations/station_name

  • station_metadata (mth5.metadata.Station, optional) – metadata container, defaults to None

Usage

Access StationGroup from an open MTH5 file

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> station = mth5_obj.stations_group.get_station('MT001')
Check what runs exist

>>> station.groups_list
['MT001a', 'MT001b', 'MT001c', 'MT001d']

To access the hdf5 group directly use StationGroup.hdf5_group.

>>> station.hdf5_group.ref
<HDF5 Group Reference>

Note

All attributes should be set on the metadata object so that all input is validated against the metadata standards. If you change attributes in the metadata object, you should run the StationGroup.write_metadata() method. This is a temporary solution; an automatic updater for changed metadata is in the works.

>>> station.metadata.existing_attribute = 'update_existing_attribute'
>>> station.write_metadata()

If you want to add a new attribute this should be done using the metadata.add_base_attribute method.

>>> station.metadata.add_base_attribute('new_attribute',
>>> ...                                 'new_attribute_value',
>>> ...                                 {'type':str,
>>> ...                                  'required':True,
>>> ...                                  'style':'free form',
>>> ...                                  'description': 'new attribute desc.',
>>> ...                                  'units':None,
>>> ...                                  'options':[],
>>> ...                                  'alias':[],
>>> ...                                  'example':'new attribute example'})
To add a run

>>> new_run = station.add_run('MT001e')
>>> new_run
/Survey/Stations/Test_01:
=========================
    |- Group: MT001e
    -----------------
        --> Dataset: summary
        ......................
    --> Dataset: summary
    ......................
Add a run with metadata

>>> from mth5.metadata import Run
>>> run_metadata = Run()
>>> run_metadata.time_period.start = '2020-01-01T12:30:00'
>>> run_metadata.time_period.end = '2020-01-03T16:30:00'
>>> run_metadata.location.latitude = 40.000
>>> run_metadata.location.longitude = -120.000
>>> new_run = station.add_run('Test_01', run_metadata)
>>> # to look at the metadata
>>> new_run.metadata
{
    "run": {
        "acquired_by.author": "new_user",
        "acquired_by.comments": "First time",
        "channels_recorded_auxiliary": ['T'],
        ...
        }
}

See also

mth5.metadata for details on how to add metadata from various files and python objects.

Remove a run

>>> station.remove_run('new_run')
>>> station
/Survey/Stations/Test_01:
=========================
    --> Dataset: summary
    ......................

Note

Deleting a run is not as simple as del(run). In HDF5 this does not free up memory, it simply removes the reference to that run. The common way to get around this is to copy what you want into a new file, or overwrite the run.

Get a run

>>> existing_run = station.get_run('MT001a')
>>> existing_run
/Survey/Stations/MT001/MT001a:
=======================================
    --> Dataset: summary
    ......................
    --> Dataset: Ex
    ......................
    --> Dataset: Ey
    ......................
    --> Dataset: Hx
    ......................
    --> Dataset: Hy
    ......................
    --> Dataset: Hz
    ......................
Summary Table

A summary table is provided to make searching easier. The table summarizes all runs within the station. To see what names are in the summary table:

>>> new_run.summary_table.dtype.descr
[('id', ('|S20', {'h5py_encoding': 'ascii'})),
 ('start', ('|S32', {'h5py_encoding': 'ascii'})),
 ('end', ('|S32', {'h5py_encoding': 'ascii'})),
 ('components', ('|S100', {'h5py_encoding': 'ascii'})),
 ('measurement_type', ('|S12', {'h5py_encoding': 'ascii'})),
 ('sample_rate', '<f8'),
 ('hdf5_reference', ('|O', {'ref': h5py.h5r.Reference}))]
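The structured dtype above can be reproduced with plain numpy. The sketch below mirrors the field widths from the listing but omits the hdf5_reference field, which requires an open h5py file; it is illustrative, not the exact dtype MTH5 constructs.

```python
import numpy as np

# Structured dtype mirroring the run summary table fields above.
summary_dtype = np.dtype(
    [
        ("id", "S20"),
        ("start", "S32"),
        ("end", "S32"),
        ("components", "S100"),
        ("measurement_type", "S12"),
        ("sample_rate", "<f8"),
    ]
)

# One example row; byte strings because HDF5 stores ASCII here.
row = np.array(
    [(b"MT001a", b"2020-01-01T00:00:00", b"2020-01-02T00:00:00",
      b"ex,ey,hx,hy,hz", b"BBMT", 1.0)],
    dtype=summary_dtype,
)
```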

Note

When a run is added an entry is added to the summary table, where the information is pulled from the metadata.

>>> station.summary_table
index | id | start | end | components | measurement_type | sample_rate | hdf5_reference
---------------------------------------------------------------------------------------
add_run(run_name, run_metadata=None)[source]

Add a run to a station.

Parameters
  • run_name (string) – run name, should be id{a-z}

  • run_metadata (mth5.metadata.Run, optional) – metadata container, defaults to None. The metadata is needed to fill the run's entry in the summary table.

property fourier_coefficients_group

Convenience property for /Station/Fourier_Coefficients

get_run(run_name)[source]

get a run from run name

Parameters

run_name (string) – existing run name

Returns

Run object

Return type

mth5.groups.RunGroup

>>> existing_run = station.get_run('MT001')
initialize_group(**kwargs)[source]

Initialize group by making a summary table and writing metadata

locate_run(sample_rate, start)[source]

Locate a run based on sample rate and start time from the summary table

Parameters
  • sample_rate (float) – sample rate in samples/seconds

  • start (string or mth5.utils.mttime.MTime) – start time

Returns

appropriate run name, None if not found

Return type

string or None
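The lookup described above amounts to matching both the sample rate and the start time against summary-table entries. A minimal pure-Python sketch (locate_run and the row dictionaries here are illustrative, not MTH5's internals):

```python
def locate_run(summary_rows, sample_rate, start):
    """Return the run id whose sample rate and start time both match, else None."""
    for row in summary_rows:
        if row["sample_rate"] == sample_rate and row["start"] == start:
            return row["id"]
    return None

# Example summary entries, shaped like the run summary table fields.
rows = [
    {"id": "MT001a", "sample_rate": 1.0, "start": "2020-01-01T00:00:00+00:00"},
    {"id": "MT001b", "sample_rate": 8.0, "start": "2020-01-02T00:00:00+00:00"},
]

locate_run(rows, 8.0, "2020-01-02T00:00:00+00:00")  # 'MT001b'
```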

make_run_name(alphabet=False)[source]

Make a run name that will be the next alphabet letter extracted from the run list. Expects that all runs are labeled as id{a-z}.

Returns

metadata.id + next letter

Return type

string

>>> station.metadata.id = 'MT001'
>>> station.make_run_name()
'MT001a'
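The next-letter scheme can be sketched in plain Python; next_run_name is a hypothetical helper that mimics the behavior shown in the example above, assuming single-letter id{a-z} suffixes.

```python
import string

def next_run_name(station_id, existing_runs):
    """Return station_id plus the next unused letter suffix."""
    letters = [r[len(station_id):] for r in existing_runs
               if r.startswith(station_id)]
    if not letters:
        return station_id + "a"
    # Single a-z suffixes sort lexicographically; take the one after the max.
    last = max(letters)
    return station_id + string.ascii_lowercase[
        string.ascii_lowercase.index(last) + 1]

next_run_name("MT001", ["MT001a", "MT001b"])  # 'MT001c'
```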
property master_station_group

shortcut to master station group

property metadata

Overwrite get metadata to include run information in the station

property name
remove_run(run_name)[source]

Remove a run from the station.

Note

Deleting a run is not as simple as del(run). In HDF5 this does not free up memory, it simply removes the reference to that run. The common way to get around this is to copy what you want into a new file, or overwrite the run.

Parameters

run_name (string) – existing run name

Example
>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> # one option
>>> stations = mth5_obj.stations_group
>>> stations.remove_station('MT001')
>>> # another option
>>> mth5_obj.stations_group.remove_station('MT001')
property run_summary

Summary of runs in the station


property survey_metadata

survey metadata

property transfer_functions_group

Convenience property for /Station/Transfer_Functions

update_station_metadata()[source]

Check metadata from the runs and make sure it matches the station metadata


class mth5.groups.SurveyGroup(group, survey_metadata=None, **kwargs)[source]

Bases: BaseGroup

Utility class that holds general information about the survey and accompanying metadata for an MT survey.

To access the hdf5 group directly use SurveyGroup.hdf5_group.

>>> survey = SurveyGroup(hdf5_group)
>>> survey.hdf5_group.ref
<HDF5 Group Reference>

Note

All attributes should be set on the metadata object so that all input is validated against the metadata standards. If you change attributes in the metadata object, you should run the SurveyGroup.write_metadata() method. This is a temporary solution; an automatic updater for changed metadata is in the works.

>>> survey.metadata.existing_attribute = 'update_existing_attribute'
>>> survey.write_metadata()

If you want to add a new attribute this should be done using the metadata.add_base_attribute method.

>>> survey.metadata.add_base_attribute('new_attribute',
>>> ...                                'new_attribute_value',
>>> ...                                {'type':str,
>>> ...                                 'required':True,
>>> ...                                 'style':'free form',
>>> ...                                 'description': 'new attribute desc.',
>>> ...                                 'units':None,
>>> ...                                 'options':[],
>>> ...                                 'alias':[],
>>> ...                                 'example':'new attribute example'})

Tip

If you want to add stations, reports, etc. to the survey, this should be done from the MTH5 object. This is to avoid duplication, at least for now.

To look at what the structure of /Survey looks like:

>>> survey
/Survey:
====================
    |- Group: Filters
    -----------------
        --> Dataset: summary
    -----------------
    |- Group: Reports
    -----------------
        --> Dataset: summary
        -----------------
    |- Group: Standards
    -------------------
        --> Dataset: summary
        -----------------
    |- Group: Stations
    ------------------
        --> Dataset: summary
        -----------------
property filters_group

Convenience property for /Survey/Filters group

initialize_group(**kwargs)[source]

Initialize group by making a summary table and writing metadata

property metadata

Overwrite get metadata to include station information in the survey

property reports_group

Convenience property for /Survey/Reports group

property standards_group

Convenience property for /Survey/Standards group

property stations_group
update_survey_metadata(survey_dict=None)[source]

Update start/end dates and location corners from stations_group.summary_table.

class mth5.groups.TransferFunctionGroup(group, **kwargs)[source]

Bases: BaseGroup

Object to hold a single transfer function estimation

add_statistical_estimate(estimate_name, estimate_data=None, estimate_metadata=None, max_shape=(None, None, None), chunks=True, **kwargs)[source]

Add a StatisticalEstimate

Parameters
  • estimate_name (string) – name of the statistical estimate

  • estimate_data – data for the estimate, defaults to None

  • estimate_metadata – metadata for the estimate, defaults to None

from_tf_object(tf_obj)[source]

Create data sets from a mt_metadata.transfer_function.core.TF object.

Parameters

tf_obj (mt_metadata.transfer_function.core.TF) – transfer function object to create data sets from

get_estimate(estimate_name)[source]

Get a statistical estimate dataset

has_estimate(estimate)[source]

has estimate

property period

Get period from hdf5_group[“period”]


remove_estimate(estimate_name)[source]

remove a statistical estimate

Parameters

estimate_name (string) – name of the statistical estimate to remove

to_tf_object()[source]

Create a mt_metadata.transfer_function.core.TF object from the estimates in the group

Returns

transfer function object built from the estimates in the group

Return type

mt_metadata.transfer_function.core.TF

class mth5.groups.TransferFunctionsGroup(group, **kwargs)[source]

Bases: BaseGroup

Object to hold transfer functions

This is the high-level group; all transfer functions for the station are held here, and each one has its own TransferFunctionGroup.

This group provides add_transfer_function, get_transfer_function, and remove_transfer_function.

add_transfer_function(name, tf_object=None)[source]

Add a transfer function to the group

Parameters
  • name (string) – name of the transfer function

  • tf_object (mt_metadata.transfer_function.core.TF) – Transfer Function object


>>> from mth5.mth5 import MTH5
>>> m = MTH5()
>>> m.open_mth5("example.h5", "a")
>>> station_group = m.get_station("mt01", survey="test")
>>> tf_group = station_group.transfer_functions_group
>>> tf_group.add_transfer_function("mt01_4096", tf_object)
get_tf_object(tf_id)[source]

This is the function you want to use to get a proper mt_metadata.transfer_functions.core.TF object with all the appropriate metadata.

Parameters

tf_id (string) – name of the transfer function to get

Returns

Full transfer function with appropriate metadata

Return type

mt_metadata.transfer_functions.core.TF

>>> from mth5.mth5 import MTH5
>>> m = MTH5()
>>> m.open_mth5("example.h5", "a")
>>> station_group = m.get_station("mt01", survey="test")
>>> tf_group = station_group.transfer_functions_group
>>> tf_object = tf_group.get_tf_object("mt01_4096")
get_transfer_function(tf_id)[source]

Get transfer function from id

Parameters

tf_id (string) – name of transfer function

Returns

Transfer function group

Return type

mth5.groups.TransferFunctionGroup

>>> from mth5.mth5 import MTH5
>>> m = MTH5()
>>> m.open_mth5("example.h5", "a")
>>> station_group = m.get_station("mt01", survey="test")
>>> tf_group = station_group.transfer_functions_group.get_transfer_function("mt01_4096")
remove_transfer_function(tf_id)[source]

Remove a transfer function from the group

Parameters

tf_id (string) – name of the transfer function to remove

>>> from mth5.mth5 import MTH5
>>> m = MTH5()
>>> m.open_mth5("example.h5", "a")
>>> station_group = m.get_station("mt01", survey="test")
>>> tf_group = station_group.transfer_functions_group
>>> tf_group.remove_transfer_function("mt01_4096")
tf_summary(as_dataframe=True)[source]

Summary of all transfer functions in this group

Returns

summary of all transfer functions in the group

Return type

pandas.DataFrame if as_dataframe is True

mth5.io package
Subpackages
mth5.io.lemi package
Submodules
mth5.io.lemi.lemi424 module

Created on Tue May 11 15:31:31 2021

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.io.lemi.lemi424.LEMI424(fn=None, **kwargs)[source]

Bases: object

Read in a LEMI424 file; this is a placeholder until IRIS finalizes their reader.

Parameters
  • fn (pathlib.Path or string) – full path to LEMI424 file

  • sample_rate (float) – sample rate of the file, default is 1.0

  • chunk_size (integer) – chunk size for pandas to use, does not change reading time much for a single day file. default is 8640

  • file_column_names (list of strings) – column names of the LEMI424 file

  • dtypes (dictionary with keys of column names and values of data types) – data types for each column

  • data_column_names (list of strings) – same as file_column_names with an added date column, which combines the date and time columns.

LEMI424 File Column Names
  • year

  • month

  • day

  • hour

  • minute

  • second

  • bx

  • by

  • bz

  • temperature_e

  • temperature_h

  • e1

  • e2

  • e3

  • e4

  • battery

  • elevation

  • latitude

  • lat_hemisphere

  • longitude

  • lon_hemisphere

  • n_satellites

  • gps_fix

  • time_diff

Data Column Names
  • date

  • bx

  • by

  • bz

  • temperature_e

  • temperature_h

  • e1

  • e2

  • e3

  • e4

  • battery

  • elevation

  • latitude

  • lat_hemisphere

  • longitude

  • lon_hemisphere

  • n_satellites

  • gps_fix

  • time_diff

property data

Data represented as a pandas.DataFrame with data_column names

property elevation

median elevation where data have been collected in the LEMI424 file

property end

end time of data collection in the LEMI424 file

property file_size

size of file in bytes

property fn

full path to LEMI424 file

property gps_lock

has GPS lock

property latitude

median latitude where data have been collected in the LEMI424 file

property longitude

median longitude where data have been collected in the LEMI424 file

property n_samples

number of samples in the file

read(fn=None, fast=True)[source]

Read a LEMI424 file using pandas. The fast way reads the first and last lines to get the start and end times and builds a time index, then reads the data without parsing the date-time columns. It checks that the expected number of points is present; if not, it falls back to the slower method, which uses the date-time parser to ensure any time gaps are respected.

Parameters
  • fn (string or pathlib.Path, optional) – full path to file, defaults to None. Uses LEMI424.fn if not provided

  • fast – read the fast way (True) or not (False)

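The fast-read consistency check described above can be sketched as follows: given the start time, end time, and sample rate taken from the first and last lines, the expected row count is fixed. expected_samples is a hypothetical helper, not part of mth5.

```python
def expected_samples(start, end, sample_rate):
    """start/end as POSIX seconds; inclusive expected row count."""
    return int((end - start) * sample_rate) + 1

# One day of 1 Hz LEMI424 data, first sample at t=0, last at t=86399 s:
expected_samples(0, 86399, 1.0)  # 86400
```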

read_metadata()[source]

Read only first and last rows to get important metadata to use in the collection.

property run_metadata

run metadata as mt_metadata.timeseries.Run

property start

start time of data collection in the LEMI424 file

property station_metadata

station metadata as mt_metadata.timeseries.Station

to_run_ts(fn=None, e_channels=['e1', 'e2'])[source]

Create a mth5.timeseries.RunTS object from the data

Parameters
  • fn (string or pathlib.Path, optional) – full path to file, defaults to None. Will use LEMI424.fn if None.

  • e_channels (list of strings, optional) – columns for the electric channels to use, defaults to [“e1”, “e2”]

Returns

RunTS object

Return type

mth5.timeseries.RunTS

mth5.io.lemi.lemi424.lemi_date_parser(year, month, day, hour, minute, second)[source]

convenience function to combine the date-time columns that are output by lemi into a single column

Assumes UTC

Parameters
  • year (int) – year

  • month (int) – month

  • day (int) – day of the month

  • hour (int) – hour in 24 hr format

  • minute (int) – minutes in the hour

  • second (int) – seconds in the minute

Returns

date time as a single column

Return type

pandas.DateTime
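A minimal sketch of the column combination, using the standard library rather than pandas; combine_lemi_datetime is a hypothetical scalar analogue of the column-wise parser, assuming UTC as stated above.

```python
from datetime import datetime, timezone

def combine_lemi_datetime(year, month, day, hour, minute, second):
    """Combine LEMI424's separate date/time columns into one UTC timestamp."""
    return datetime(year, month, day, hour, minute, second,
                    tzinfo=timezone.utc)

combine_lemi_datetime(2021, 5, 11, 15, 31, 31).isoformat()
# '2021-05-11T15:31:31+00:00'
```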

mth5.io.lemi.lemi424.lemi_hemisphere_parser(hemisphere)[source]

convert a hemisphere letter into a sign, -1 or +1. Assumes the prime meridian is at 0 degrees.

Parameters

hemisphere (string) – hemisphere string [ ‘N’ | ‘S’ | ‘E’ | ‘W’]

Returns

unity with a sign for the given hemisphere

Return type

signed integer
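The sign convention can be sketched as below; hemisphere_sign is an illustrative helper, not the library function.

```python
def hemisphere_sign(hemisphere):
    """South and west map to -1, north and east to +1."""
    return -1 if hemisphere.upper() in ("S", "W") else 1

[hemisphere_sign(h) for h in "NSEW"]  # [1, -1, 1, -1]
```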

mth5.io.lemi.lemi424.lemi_position_parser(position)[source]

convenience function to parse the location strings into a decimal float Uses the hemisphere for the sign.

Note

the format of the location is odd in that it is multiplied by 100 within the LEMI to provide a single floating point value that includes the degrees and decimal degrees –> {degrees}{degrees[mm.ss]}. For example 40.50166 would be represented as 4030.1.

Parameters

position (float) – packed position value from the LEMI file

Returns

position in decimal degrees

Return type

float
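The packed format described in the note can be decoded as below; decode_lemi_position is an illustrative stand-in for lemi_position_parser (the sign from the hemisphere would be applied separately).

```python
def decode_lemi_position(position):
    """Decode {degrees}{minutes.decimal_minutes} into decimal degrees.

    e.g. 4030.1 -> 40 degrees, 30.1 minutes -> 40.50167 decimal degrees.
    """
    degrees = int(position // 100)
    minutes = position - degrees * 100
    return degrees + minutes / 60.0

round(decode_lemi_position(4030.1), 5)  # 40.50167
```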

mth5.io.lemi.lemi424.read_lemi424(fn, e_channels=['e1', 'e2'], fast=True)[source]

Read a LEMI 424 TXT file.

Parameters
  • fn (string or Path) – input file name

  • e_channels (list of strings, optional) – a list of electric channels to read, defaults to [“e1”, “e2”]

Returns

A RunTS object with appropriate metadata

Return type

mth5.timeseries.RunTS

mth5.io.lemi.lemi_collection module
LEMI 424 Collection

Collection of TXT files combined into runs

Created on Wed Aug 31 10:32:44 2022

@author: jpeacock

class mth5.io.lemi.lemi_collection.LEMICollection(file_path=None, **kwargs)[source]

Bases: Collection

Collection of LEMI 424 files into runs based on start and end times. Will assign the run name as ‘sr1_{index:0{zeros}}’ –> ‘sr1_0001’ for zeros = 4.

Parameters
  • file_path (string or :class`pathlib.Path`) – full path to single station LEMI424 directory

  • file_ext (string) – extension of LEMI424 files, default is ‘txt’

  • station_id (string) – station id

  • survey_id (string) – survey id

Note

This class assumes that the given file path contains a single LEMI station. If you want to do multiple stations merge the returned data frames.

Note

LEMI data comes with little metadata about the station or survey, therefore you should assign station_id and survey_id.

>>> from mth5.io.lemi import LEMICollection
>>> lc = LEMICollection(r"/path/to/single/lemi/station")
>>> lc.station_id = "mt001"
>>> lc.survey_id = "test_survey"
>>> run_dict = lc.get_runs(1)
assign_run_names(df, zeros=4)[source]

Assign run names based on start and end times, checks if a file has the same start time as the last end time.

Run names are assigned as sr{sample_rate}_{run_number:0{zeros}}.

Parameters
  • df (pandas.DataFrame) – Dataframe with the appropriate columns

  • zeros (int, optional) – number of zeros in run name, defaults to 4

Returns

Dataframe with run names

Return type

pandas.DataFrame
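The naming convention sr{sample_rate}_{run_number:0{zeros}} can be sketched directly with an f-string; make_lemi_run_name is a hypothetical helper, not the library method.

```python
def make_lemi_run_name(sample_rate, run_number, zeros=4):
    """Format a run name with a zero-padded run number."""
    return f"sr{sample_rate}_{run_number:0{zeros}}"

make_lemi_run_name(1, 1)  # 'sr1_0001'
```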

to_dataframe(sample_rates=[1], run_name_zeros=4, calibration_path=None)[source]

Create a data frame of each TXT file in a given directory.

Note

This assumes the given directory contains a single station

Parameters
  • sample_rates (int or list, optional) – sample rate to get; will always be 1 for LEMI data, defaults to [1]

  • run_name_zeros (int, optional) – number of zeros to assign to the run name, defaults to 4

  • calibration_path (string or Path, optional) – path to calibration files, defaults to None

Returns

Dataframe with information of each TXT file in the given directory.

Return type

pandas.DataFrame

Example
>>> from mth5.io.lemi import LEMICollection
>>> lc = LEMICollection("/path/to/single/lemi/station")
>>> lemi_df = lc.to_dataframe()
Module contents
class mth5.io.lemi.LEMI424(fn=None, **kwargs)[source]

Bases: object

Read in a LEMI424 file; this is a placeholder until IRIS finalizes their reader.

Parameters
  • fn (pathlib.Path or string) – full path to LEMI424 file

  • sample_rate (float) – sample rate of the file, default is 1.0

  • chunk_size (integer) – chunk size for pandas to use, does not change reading time much for a single day file. default is 8640

  • file_column_names (list of strings) – column names of the LEMI424 file

  • dtypes (dictionary with keys of column names and values of data types) – data types for each column

  • data_column_names (list of strings) – same as file_column_names with an added date column, which combines the date and time columns.

LEMI424 File Column Names
  • year

  • month

  • day

  • hour

  • minute

  • second

  • bx

  • by

  • bz

  • temperature_e

  • temperature_h

  • e1

  • e2

  • e3

  • e4

  • battery

  • elevation

  • latitude

  • lat_hemisphere

  • longitude

  • lon_hemisphere

  • n_satellites

  • gps_fix

  • time_diff

Data Column Names
  • date

  • bx

  • by

  • bz

  • temperature_e

  • temperature_h

  • e1

  • e2

  • e3

  • e4

  • battery

  • elevation

  • latitude

  • lat_hemisphere

  • longitude

  • lon_hemisphere

  • n_satellites

  • gps_fix

  • time_diff

property data

Data represented as a pandas.DataFrame with data_column names

property elevation

median elevation where data have been collected in the LEMI424 file

property end

end time of data collection in the LEMI424 file

property file_size

size of file in bytes

property fn

full path to LEMI424 file

property gps_lock

has GPS lock

property latitude

median latitude where data have been collected in the LEMI424 file

property longitude

median longitude where data have been collected in the LEMI424 file

property n_samples

number of samples in the file

read(fn=None, fast=True)[source]

Read a LEMI424 file using pandas. The fast way reads the first and last lines to get the start and end times and builds a time index, then reads the data without parsing the date-time columns. It checks that the expected number of points is present; if not, it falls back to the slower method, which uses the date-time parser to ensure any time gaps are respected.

Parameters
  • fn (string or pathlib.Path, optional) – full path to file, defaults to None. Uses LEMI424.fn if not provided

  • fast – read the fast way (True) or not (False)


read_metadata()[source]

Read only first and last rows to get important metadata to use in the collection.

property run_metadata

run metadata as mt_metadata.timeseries.Run

property start

start time of data collection in the LEMI424 file

property station_metadata

station metadata as mt_metadata.timeseries.Station

to_run_ts(fn=None, e_channels=['e1', 'e2'])[source]

Create a mth5.timeseries.RunTS object from the data

Parameters
  • fn (string or pathlib.Path, optional) – full path to file, defaults to None. Will use LEMI424.fn if None.

  • e_channels (list of strings, optional) – columns for the electric channels to use, defaults to [“e1”, “e2”]

Returns

RunTS object

Return type

mth5.timeseries.RunTS

class mth5.io.lemi.LEMICollection(file_path=None, **kwargs)[source]

Bases: Collection

Collection of LEMI 424 files into runs based on start and end times. Will assign the run name as ‘sr1_{index:0{zeros}}’ –> ‘sr1_0001’ for zeros = 4.

Parameters
  • file_path (string or :class`pathlib.Path`) – full path to single station LEMI424 directory

  • file_ext (string) – extension of LEMI424 files, default is ‘txt’

  • station_id (string) – station id

  • survey_id (string) – survey id

Note

This class assumes that the given file path contains a single LEMI station. If you want to do multiple stations merge the returned data frames.

Note

LEMI data comes with little metadata about the station or survey, therefore you should assign station_id and survey_id.

>>> from mth5.io.lemi import LEMICollection
>>> lc = LEMICollection(r"/path/to/single/lemi/station")
>>> lc.station_id = "mt001"
>>> lc.survey_id = "test_survey"
>>> run_dict = lc.get_runs(1)
assign_run_names(df, zeros=4)[source]

Assign run names based on start and end times, checks if a file has the same start time as the last end time.

Run names are assigned as sr{sample_rate}_{run_number:0{zeros}}.

Parameters
  • df (pandas.DataFrame) – Dataframe with the appropriate columns

  • zeros (int, optional) – number of zeros in run name, defaults to 4

Returns

Dataframe with run names

Return type

pandas.DataFrame

to_dataframe(sample_rates=[1], run_name_zeros=4, calibration_path=None)[source]

Create a data frame of each TXT file in a given directory.

Note

This assumes the given directory contains a single station

Parameters
  • sample_rates (int or list, optional) – sample rate to get; will always be 1 for LEMI data, defaults to [1]

  • run_name_zeros (int, optional) – number of zeros to assign to the run name, defaults to 4

  • calibration_path (string or Path, optional) – path to calibration files, defaults to None

Returns

Dataframe with information of each TXT file in the given directory.

Return type

pandas.DataFrame

Example
>>> from mth5.io.lemi import LEMICollection
>>> lc = LEMICollection("/path/to/single/lemi/station")
>>> lemi_df = lc.to_dataframe()
mth5.io.lemi.read_lemi424(fn, e_channels=['e1', 'e2'], fast=True)[source]

Read a LEMI 424 TXT file.

Parameters
  • fn (string or Path) – input file name

  • e_channels (list of strings, optional) – a list of electric channels to read, defaults to [“e1”, “e2”]

Returns

A RunTS object with appropriate metadata

Return type

mth5.timeseries.RunTS

mth5.io.miniseed package
Submodules
mth5.io.miniseed.miniseed module

Created on Wed Sep 30 10:20:12 2020

author

Jared Peacock

license

MIT

mth5.io.miniseed.miniseed.read_miniseed(fn)[source]

Read a miniseed file into a mth5.timeseries.RunTS object. Uses Obspy to read the miniseed.

Parameters

fn (string) – full path to the miniseed file

Returns

RunTS object

Return type

mth5.timeseries.RunTS

Module contents
mth5.io.miniseed.read_miniseed(fn)[source]

Read a miniseed file into a mth5.timeseries.RunTS object. Uses ObsPy to read the miniseed file.

Parameters

fn (string) – full path to the miniseed file

Returns

RunTS object

Return type

mth5.timeseries.RunTS

mth5.io.nims package
Submodules
mth5.io.nims.gps module

Created on Thu Sep 1 11:43:56 2022

@author: jpeacock

class mth5.io.nims.gps.GPS(gps_string, index=0)[source]

Bases: object

class to parse GPS stamp from the NIMS

Depending on the type of stamp, different attributes will be filled.

GPRMC has full date and time information and declination; GPGGA has elevation data.

Note

GPGGA date is set to 1980-01-01 so that the time can be estimated. Should use GPRMC for accurate date/time information.

property declination

geomagnetic declination in degrees from north

property elevation

elevation in meters

property fix

GPS fixed

property gps_type

GPRMC or GPGGA

property latitude

Latitude in decimal degrees, WGS84

property longitude

Longitude in decimal degrees, WGS84

parse_gps_string(gps_string)[source]

Parse a raw gps string from the NIMS and set appropriate attributes. GPS string will first be validated, then parsed.

Parameters

gps_string (string) – raw GPS string to be parsed

property time_stamp

return a datetime object of the time stamp

validate_gps_list(gps_list)[source]

check to make sure the gps stamp is the correct format, checks each element for the proper format

Parameters

gps_list (list) – a parsed gps string from a NIMS

Raises

mth5.io.nims.GPSError if anything is wrong.

validate_gps_string(gps_string)[source]

make sure the string is valid, remove any binary numbers and find the end of the string as ‘*’

Parameters

gps_string (string) – raw GPS string to be validated

Returns

validated string or None if there is something wrong
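The validation logic described above (strip binary bytes, terminate at '*') can be sketched standalone. This is an illustration of the idea, not the actual mth5 implementation, and the sample stamp is hypothetical.

```python
def validate_gps_string(gps_string):
    # Drop non-printable (binary) characters, then truncate at the '*'
    # terminator; return None if no terminator is found.
    cleaned = "".join(c for c in gps_string if c.isprintable())
    star = cleaned.find("*")
    if star == -1:
        return None
    return cleaned[:star]

# A GPRMC-style stamp with a stray binary byte and a trailing checksum.
stamp = "$GPRMC,161642,A,3443.6088,N,11543.5350,W\x00*6A"
print(validate_gps_string(stamp))
```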

exception mth5.io.nims.gps.GPSError[source]

Bases: Exception

mth5.io.nims.header module

Created on Thu Sep 1 12:57:32 2022

@author: jpeacock

exception mth5.io.nims.header.NIMSError[source]

Bases: Exception

class mth5.io.nims.header.NIMSHeader(fn=None)[source]

Bases: object

class to hold the NIMS header information.

A typical header looks like

'''
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>user field>>>>>>>>>>>>>>>>>>>>>>>>>>>>
SITE NAME: Budwieser Spring
STATE/PROVINCE: CA
COUNTRY: USA
>>> The following code in double quotes is REQUIRED to start the NIMS <<
>>> The next 3 lines contain values required for processing <<<<<<<<<<<<
>>> The lines after that are optional <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
"300b"  <-- 2CHAR EXPERIMENT CODE + 3 CHAR SITE CODE + RUN LETTER
1105-3; 1305-3  <-- SYSTEM BOX I.D.; MAG HEAD ID (if different)
106  0 <-- N-S Ex WIRE LENGTH (m); HEADING (deg E mag N)
109  90 <-- E-W Ey WIRE LENGTH (m); HEADING (deg E mag N)
1         <-- N ELECTRODE ID
3         <-- E ELECTRODE ID
2         <-- S ELECTRODE ID
4         <-- W ELECTRODE ID
Cu        <-- GROUND ELECTRODE INFO
GPS INFO: 01/10/19 16:16:42 1616.7000 3443.6088 115.7350 W 946.6
OPERATOR: KP
COMMENT: N/S CRS: .95/.96 DCV: 3.5 ACV:1
E/W CRS: .85/.86 DCV: 1.5 ACV: 1
Redeployed site for run b b/c possible animal disturbance
'''
property file_size

Size of the file

property fn

Full path to NIMS file

parse_header_dict(header_dict=None)[source]

parse the header dictionary into something useful

read_header(fn=None)[source]

read header information

Parameters

fn (string or pathlib.Path) – full path to file to read

Raises

mth5.io.nims.NIMSError if something is not right.

property station

Station ID

mth5.io.nims.nims module
NIMS
  • deals with reading in NIMS DATA.BIN files

This is a translation from Matlab codes written and edited by:
  • Anna Kelbert

  • Paul Bedrosian

  • Esteban Bowles-Martinez

  • Possibly others.

I’ve tested it against a version, and it matches. The data/GPS gaps I still don’t understand, so for now the time series is simply made continuous and the number of missing seconds is clipped from the end of the time series.

Note

This only works for 8 Hz data for now.

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.io.nims.nims.NIMS(fn=None)[source]

Bases: NIMSHeader

NIMS Class will read in a NIMS DATA.BIN file.

A fast way to read the binary files is to first read in the GPS strings (the third byte in each block, as a character) and parse those into valid GPS stamps.

Then read in the entire data set as unsigned 8 bit integers and reshape the data to be n seconds x block size. Then parse that array into the status information and data.

I only have a limited number of .BIN files to test, so this will likely break if there are issues such as data gaps. This has been tested against the MATLAB program loadNIMS by Anna Kelbert, and they match for all the .bin files I have. If something looks weird, check it against that program.

Warning

Currently Only 8 Hz data is supported

align_data(data_array, stamps)[source]

Need to match up the first good GPS stamp with the data

Do this by using the first GPS stamp and assuming that the time from the first time stamp to the start is the index value.

put the data into a pandas data frame that is indexed by time

Parameters
  • data_array (array) – structure array with columns for each component [hx, hy, hz, ex, ey]

  • stamps (list) – list of GPS stamps [[status_index, [GPRMC, GPGGA]]]

Returns

pandas DataFrame with columns for each component, indexed by time starting at the start time.

Note

Data gaps are squeezed out because it is not yet clear what a gap actually represents.

property box_temperature

data logger temperature, sampled at 1 second

check_timing(stamps)[source]

make sure that there are the correct number of seconds in between the first and last GPS GPRMC stamps

Parameters

stamps (list) – list of GPS stamps [[status_index, [GPRMC, GPGGA]]]

Returns

[ True | False ] if data is valid or not.

Returns

gap index locations

Note

currently it is assumed that if a data gap occurs the data can be squeezed to remove them. Probably a more elegant way of doing it.

property declination

median declination value from all the GPS stamps, in degrees from north

Only get from the first stamp within the sets

property elevation

median elevation value from all the GPS stamps in decimal degrees WGS84

Only get from the first stamp within the sets

property end_time

end time is estimated from the start time plus the duration given by the number of samples.

property ex

EX

property ex_metadata
property ey

EY

property ey_metadata
find_sequence(data_array, block_sequence=None)[source]

find a sequence in a given array

Parameters
  • data_array (array) – array of the data with shape [n, m] where n is the number of seconds recorded m is the block length for a given sampling rate.

  • block_sequence (list) – sequence pattern to locate default is [1, 131] the start of a data block.

Returns

array of index locations where the sequence is found.
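The block-start search can be sketched in plain Python; the real method operates on numpy arrays, but the idea is a row-wise prefix match against the [1, 131] sequence.

```python
def find_sequence(data_array, block_sequence=(1, 131)):
    # Return indices of rows whose first values match block_sequence.
    n = len(block_sequence)
    return [
        i for i, row in enumerate(data_array)
        if tuple(row[:n]) == tuple(block_sequence)
    ]

blocks = [
    [7, 99, 0, 0],     # garbage before the first data block
    [1, 131, 42, 43],  # start of a data block
    [1, 131, 44, 45],
]
print(find_sequence(blocks))  # -> [1, 2]
```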

get_channel_response(channel, dipole_length=1)[source]

Get the channel response for a given channel

Parameters
  • channel (string) – channel name

  • dipole_length (float, optional) – dipole length in meters, defaults to 1

Returns

channel response for the given channel

get_stamps(nims_string)[source]

get a list of valid GPS strings and match synchronous GPRMC with GPGGA stamps if possible.

Parameters

nims_string (str) – raw GPS string output by NIMS

has_data()[source]
property hx

HX

property hx_metadata
property hy

HY

property hy_metadata
property hz

HZ

property hz_metadata
property latitude

median latitude value from all the GPS stamps in decimal degrees WGS84

Only get from the GPRMC stamp as they should be duplicates

property longitude

median longitude value from all the GPS stamps in decimal degrees WGS84

Only get from the first stamp within the sets

make_dt_index(start_time, sample_rate, stop_time=None, n_samples=None)[source]

make time index array

Note

date-time format should be YYYY-MM-DDThh:mm:ss.ms UTC

Parameters
  • start_time (string) – start time

  • stop_time (string, optional) – stop time

  • sample_rate (float) – sample_rate in samples/second
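A standard-library sketch of the time-index construction; the real method also accepts a stop time, but only n_samples is handled here.

```python
from datetime import datetime, timedelta

def make_dt_index(start_time, sample_rate, n_samples):
    # Timestamps spaced at 1 / sample_rate seconds from start_time.
    start = datetime.fromisoformat(start_time)
    step = timedelta(seconds=1.0 / sample_rate)
    return [start + i * step for i in range(n_samples)]

index = make_dt_index("2020-01-01T00:00:00", 8, 16)  # 2 seconds of 8 Hz data
print(index[0], index[-1])
```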

match_status_with_gps_stamps(status_array, gps_list)[source]

Match the index values from the status array with the index values of the GPS stamps. There appears to be a bit of wiggle room between when the lock is recorded and the stamp was actually recorded. This is typically 1 second and sometimes 2.

Parameters
  • status_array (array) – array of status values from each data block

  • gps_list (list) – list of valid GPS stamps [[GPRMC, GPGGA], …]

Note

I think there is a 2 second gap between the lock and the first stamp character.

property n_samples
read_nims(fn=None)[source]

Read NIMS DATA.BIN file.

  1. Read in the header information and store it as attributes with the same names as in the header file.

  2. Locate the beginning of the data blocks by looking for the first [1, 131, …] combination. Anything before that is cut out.

  3. Make sure the data is a multiple of the block length; if the data is longer, the extra bits are cut off.

  4. Read in the GPS data (3rd byte of each block) as characters. Parse those into valid GPS stamps with the index locations of where the ‘$’ was found.

  5. Read in the data as unsigned 8-bit integers and reshape the array into [N, data_block_length]. Parse this array into the status information and the data.

  6. Remove duplicate blocks by removing the first of the duplicates, as suggested by Anna and Paul.

  7. Match the GPS locks from the status with valid GPS stamps.

  8. Check that there is the correct number of seconds between the first and last GPS stamp. The extra seconds are cut off from the end of the time series. It is not clear whether this is the best way to accommodate gaps in the data.

Note

The data and information array returned have the duplicates removed and the sequence reset to be monotonic.

Parameters

fn (str) – full path to DATA.BIN file

Example

>>> from mth5.io import nims
>>> n = nims.NIMS(r"/home/mt_data/nims/mt001.bin")
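Steps 3 and 5 above (clip to a whole number of blocks, then reshape into one row per second) can be sketched without numpy; the real reader does this on a uint8 array.

```python
def blocks_from_stream(raw, block_length):
    # Trim trailing bytes that do not fill a whole block, then split the
    # flat stream into rows of one block each.
    n_blocks = len(raw) // block_length
    raw = raw[: n_blocks * block_length]
    return [raw[i * block_length:(i + 1) * block_length] for i in range(n_blocks)]

stream = list(range(10))              # 10 bytes, block length of 4
print(blocks_from_stream(stream, 4))  # -> [[0, 1, 2, 3], [4, 5, 6, 7]]
```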
remove_duplicates(info_array, data_array)[source]

remove duplicate blocks, removing the first duplicate as suggested by Paul and Anna. Checks to make sure that the mag data are identical for the duplicate blocks. Removes the blocks from the information and data arrays and returns the reduced arrays. This should sync up the timing of GPS stamps and index values.

Parameters
  • info_array (np.array) – structured array of block information

  • data_array (np.array) – structured array of the data

Returns

reduced information array

Returns

reduced data array

Returns

index of duplicates in raw data

property run_metadata

Run metadata

property start_time

start time is the first good GPS time stamp minus the seconds to the beginning of the time series.

property station_metadata

Station metadata from nims file

to_runts(calibrate=False)[source]

Get xarray for run

unwrap_sequence(sequence)[source]

Unwrap the sequence into sequential numbers instead of values modulo 256; sets the first number to 0.

Parameters

sequence (list) – sequence of byte values

Returns

unwrapped number of counts
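The modulo-256 unwrapping can be sketched as follows; each wrap from 255 back to 0 adds 256 to the running count, and the first value is shifted to 0.

```python
def unwrap_sequence(sequence):
    # Accumulate 256 at every wrap, then shift so the sequence starts at 0.
    unwrapped = []
    offset = 0
    for i, value in enumerate(sequence):
        if i > 0 and value < sequence[i - 1]:
            offset += 256
        unwrapped.append(value + offset)
    return [v - unwrapped[0] for v in unwrapped]

print(unwrap_sequence([253, 254, 255, 0, 1, 2]))  # -> [0, 1, 2, 3, 4, 5]
```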

mth5.io.nims.nims.read_nims(fn)[source]
Parameters

fn (string or Path) – full path to a NIMS DATA.BIN file

Returns

RunTS object containing the NIMS data

Return type

mth5.timeseries.RunTS

mth5.io.nims.nims_collection module
NIMS Collection

Collection of NIMS DATA.BIN files combined into runs

Created on Wed Aug 31 10:32:44 2022

@author: jpeacock

class mth5.io.nims.nims_collection.NIMSCollection(file_path=None, **kwargs)[source]

Bases: Collection

Collection of NIMS files into runs.

>>> from mth5.io.nims import NIMSCollection
>>> nc = NIMSCollection(r"/path/to/single/nims/station")
>>> nc.station_id = "mt001"
>>> nc.survey_id = "test_survey"
>>> run_dict = nc.get_runs(1)
assign_run_names(df, zeros=2)[source]

Assign run names assuming a row represents single station

Run names are assigned as sr{sample_rate}_{run_number:0{zeros}}.

Parameters
  • df (pandas.DataFrame) – Dataframe with the appropriate columns

  • zeros (int, optional) – number of zeros in run name, defaults to 2

Returns

Dataframe with run names

Return type

pandas.DataFrame

to_dataframe(sample_rates=[1], run_name_zeros=2, calibration_path=None)[source]

Create a data frame of each DATA.BIN file in a given directory.

Note

This assumes the given directory contains a single station

Parameters
  • sample_rates (int or list, optional) – sample rates to get, defaults to [1]

  • run_name_zeros (int, optional) – number of zeros to assign to the run name, defaults to 2

  • calibration_path (string or Path, optional) – path to calibration files, defaults to None

Returns

Dataframe with information of each DATA.BIN file in the given directory.

Return type

pandas.DataFrame

Example
>>> from mth5.io.nims import NIMSCollection
>>> nc = NIMSCollection("/path/to/single/nims/station")
>>> nims_df = nc.to_dataframe()
mth5.io.nims.response_filters module

Created on Fri Sep 2 13:50:51 2022

@author: jpeacock

class mth5.io.nims.response_filters.Response(system_id=None, **kwargs)[source]

Bases: object

Common NIMS response filters for electric and magnetic channels

dipole_filter(length)[source]

Make a dipole filter

Parameters

length (float) – dipole length in meters

Returns

DESCRIPTION

Return type

TYPE

property electric_conversion

electric channel conversion from counts to volts

property electric_high_pass_hp

1-pole high pass filter for 1 Hz ('hp' hardware) instruments

property electric_high_pass_pc

1-pole high pass filter for 8 Hz ('pc' hardware) instruments

property electric_low_pass

5-pole electric low pass filter

property electric_physical_units

DESCRIPTION :rtype: TYPE

Type

return

get_channel_response(channel, dipole_length=1)[source]

Get the full channel response filter

Parameters
  • channel (string) – channel name

  • dipole_length (float, optional) – dipole length in meters, defaults to 1

get_electric_high_pass(hardware='pc')[source]

get the electric high pass filter based on the hardware

property magnetic_conversion

DESCRIPTION :rtype: TYPE

Type

return

property magnetic_low_pass

Low pass 3 pole filter

Returns

DESCRIPTION

Return type

TYPE

exception mth5.io.nims.response_filters.ResponseError[source]

Bases: Exception

Module contents
class mth5.io.nims.GPS(gps_string, index=0)[source]

Bases: object

class to parse GPS stamp from the NIMS

Depending on the type of stamp, different attributes will be filled.

GPRMC has full date and time information and declination; GPGGA has elevation data.

Note

GPGGA date is set to 1980-01-01 so that the time can be estimated. Should use GPRMC for accurate date/time information.

property declination

geomagnetic declination in degrees from north

property elevation

elevation in meters

property fix

GPS fixed

property gps_type

GPRMC or GPGGA

property latitude

Latitude in decimal degrees, WGS84

property longitude

Longitude in decimal degrees, WGS84

parse_gps_string(gps_string)[source]

Parse a raw gps string from the NIMS and set appropriate attributes. GPS string will first be validated, then parsed.

Parameters

gps_string (string) – raw GPS string to be parsed

property time_stamp

return a datetime object of the time stamp

validate_gps_list(gps_list)[source]

check to make sure the gps stamp is the correct format, checks each element for the proper format

Parameters

gps_list (list) – a parsed gps string from a NIMS

Raises

mth5.io.nims.GPSError if anything is wrong.

validate_gps_string(gps_string)[source]

make sure the string is valid, remove any binary numbers and find the end of the string as ‘*’

Parameters

gps_string (string) – raw GPS string to be validated

Returns

validated string or None if there is something wrong

exception mth5.io.nims.GPSError[source]

Bases: Exception

class mth5.io.nims.NIMS(fn=None)[source]

Bases: NIMSHeader

NIMS Class will read in a NIMS DATA.BIN file.

A fast way to read the binary files is to first read in the GPS strings (the third byte in each block, as a character) and parse those into valid GPS stamps.

Then read in the entire data set as unsigned 8 bit integers and reshape the data to be n seconds x block size. Then parse that array into the status information and data.

I only have a limited number of .BIN files to test, so this will likely break if there are issues such as data gaps. This has been tested against the MATLAB program loadNIMS by Anna Kelbert, and they match for all the .bin files I have. If something looks weird, check it against that program.

Warning

Currently Only 8 Hz data is supported

align_data(data_array, stamps)[source]

Need to match up the first good GPS stamp with the data

Do this by using the first GPS stamp and assuming that the time from the first time stamp to the start is the index value.

put the data into a pandas data frame that is indexed by time

Parameters
  • data_array (array) – structure array with columns for each component [hx, hy, hz, ex, ey]

  • stamps (list) – list of GPS stamps [[status_index, [GPRMC, GPGGA]]]

Returns

pandas DataFrame with columns for each component, indexed by time starting at the start time.

Note

Data gaps are squeezed out because it is not yet clear what a gap actually represents.

property box_temperature

data logger temperature, sampled at 1 second

check_timing(stamps)[source]

make sure that there are the correct number of seconds in between the first and last GPS GPRMC stamps

Parameters

stamps (list) – list of GPS stamps [[status_index, [GPRMC, GPGGA]]]

Returns

[ True | False ] if data is valid or not.

Returns

gap index locations

Note

currently it is assumed that if a data gap occurs the data can be squeezed to remove them. Probably a more elegant way of doing it.

property declination

median declination value from all the GPS stamps, in degrees from north

Only get from the first stamp within the sets

property elevation

median elevation value from all the GPS stamps in decimal degrees WGS84

Only get from the first stamp within the sets

property end_time

end time is estimated from the start time plus the duration given by the number of samples.

property ex

EX

property ex_metadata
property ey

EY

property ey_metadata
find_sequence(data_array, block_sequence=None)[source]

find a sequence in a given array

Parameters
  • data_array (array) – array of the data with shape [n, m] where n is the number of seconds recorded m is the block length for a given sampling rate.

  • block_sequence (list) – sequence pattern to locate default is [1, 131] the start of a data block.

Returns

array of index locations where the sequence is found.

get_channel_response(channel, dipole_length=1)[source]

Get the channel response for a given channel

Parameters
  • channel (string) – channel name

  • dipole_length (float, optional) – dipole length in meters, defaults to 1

Returns

channel response for the given channel

get_stamps(nims_string)[source]

get a list of valid GPS strings and match synchronous GPRMC with GPGGA stamps if possible.

Parameters

nims_string (str) – raw GPS string output by NIMS

has_data()[source]
property hx

HX

property hx_metadata
property hy

HY

property hy_metadata
property hz

HZ

property hz_metadata
property latitude

median latitude value from all the GPS stamps in decimal degrees WGS84

Only get from the GPRMC stamp as they should be duplicates

property longitude

median longitude value from all the GPS stamps in decimal degrees WGS84

Only get from the first stamp within the sets

make_dt_index(start_time, sample_rate, stop_time=None, n_samples=None)[source]

make time index array

Note

date-time format should be YYYY-MM-DDThh:mm:ss.ms UTC

Parameters
  • start_time (string) – start time

  • stop_time (string, optional) – stop time

  • sample_rate (float) – sample_rate in samples/second

match_status_with_gps_stamps(status_array, gps_list)[source]

Match the index values from the status array with the index values of the GPS stamps. There appears to be a bit of wiggle room between when the lock is recorded and the stamp was actually recorded. This is typically 1 second and sometimes 2.

Parameters
  • status_array (array) – array of status values from each data block

  • gps_list (list) – list of valid GPS stamps [[GPRMC, GPGGA], …]

Note

I think there is a 2 second gap between the lock and the first stamp character.

property n_samples
read_nims(fn=None)[source]

Read NIMS DATA.BIN file.

  1. Read in the header information and store it as attributes with the same names as in the header file.

  2. Locate the beginning of the data blocks by looking for the first [1, 131, …] combination. Anything before that is cut out.

  3. Make sure the data is a multiple of the block length; if the data is longer, the extra bits are cut off.

  4. Read in the GPS data (3rd byte of each block) as characters. Parse those into valid GPS stamps with the index locations of where the ‘$’ was found.

  5. Read in the data as unsigned 8-bit integers and reshape the array into [N, data_block_length]. Parse this array into the status information and the data.

  6. Remove duplicate blocks by removing the first of the duplicates, as suggested by Anna and Paul.

  7. Match the GPS locks from the status with valid GPS stamps.

  8. Check that there is the correct number of seconds between the first and last GPS stamp. The extra seconds are cut off from the end of the time series. It is not clear whether this is the best way to accommodate gaps in the data.

Note

The data and information array returned have the duplicates removed and the sequence reset to be monotonic.

Parameters

fn (str) – full path to DATA.BIN file

Example

>>> from mth5.io import nims
>>> n = nims.NIMS(r"/home/mt_data/nims/mt001.bin")
remove_duplicates(info_array, data_array)[source]

remove duplicate blocks, removing the first duplicate as suggested by Paul and Anna. Checks to make sure that the mag data are identical for the duplicate blocks. Removes the blocks from the information and data arrays and returns the reduced arrays. This should sync up the timing of GPS stamps and index values.

Parameters
  • info_array (np.array) – structured array of block information

  • data_array (np.array) – structured array of the data

Returns

reduced information array

Returns

reduced data array

Returns

index of duplicates in raw data

property run_metadata

Run metadata

property start_time

start time is the first good GPS time stamp minus the seconds to the beginning of the time series.

property station_metadata

Station metadata from nims file

to_runts(calibrate=False)[source]

Get xarray for run

unwrap_sequence(sequence)[source]

Unwrap the sequence into sequential numbers instead of values modulo 256; sets the first number to 0.

Parameters

sequence (list) – sequence of byte values

Returns

unwrapped number of counts

class mth5.io.nims.NIMSCollection(file_path=None, **kwargs)[source]

Bases: Collection

Collection of NIMS files into runs.

>>> from mth5.io.nims import NIMSCollection
>>> nc = NIMSCollection(r"/path/to/single/nims/station")
>>> nc.station_id = "mt001"
>>> nc.survey_id = "test_survey"
>>> run_dict = nc.get_runs(1)
assign_run_names(df, zeros=2)[source]

Assign run names assuming a row represents single station

Run names are assigned as sr{sample_rate}_{run_number:0{zeros}}.

Parameters
  • df (pandas.DataFrame) – Dataframe with the appropriate columns

  • zeros (int, optional) – number of zeros in run name, defaults to 2

Returns

Dataframe with run names

Return type

pandas.DataFrame

to_dataframe(sample_rates=[1], run_name_zeros=2, calibration_path=None)[source]

Create a data frame of each DATA.BIN file in a given directory.

Note

This assumes the given directory contains a single station

Parameters
  • sample_rates (int or list, optional) – sample rates to get, defaults to [1]

  • run_name_zeros (int, optional) – number of zeros to assign to the run name, defaults to 2

  • calibration_path (string or Path, optional) – path to calibration files, defaults to None

Returns

Dataframe with information of each DATA.BIN file in the given directory.

Return type

pandas.DataFrame

Example
>>> from mth5.io.nims import NIMSCollection
>>> nc = NIMSCollection("/path/to/single/nims/station")
>>> nims_df = nc.to_dataframe()
class mth5.io.nims.NIMSHeader(fn=None)[source]

Bases: object

class to hold the NIMS header information.

A typical header looks like

'''
>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
>>>user field>>>>>>>>>>>>>>>>>>>>>>>>>>>>
SITE NAME: Budwieser Spring
STATE/PROVINCE: CA
COUNTRY: USA
>>> The following code in double quotes is REQUIRED to start the NIMS <<
>>> The next 3 lines contain values required for processing <<<<<<<<<<<<
>>> The lines after that are optional <<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<
"300b"  <-- 2CHAR EXPERIMENT CODE + 3 CHAR SITE CODE + RUN LETTER
1105-3; 1305-3  <-- SYSTEM BOX I.D.; MAG HEAD ID (if different)
106  0 <-- N-S Ex WIRE LENGTH (m); HEADING (deg E mag N)
109  90 <-- E-W Ey WIRE LENGTH (m); HEADING (deg E mag N)
1         <-- N ELECTRODE ID
3         <-- E ELECTRODE ID
2         <-- S ELECTRODE ID
4         <-- W ELECTRODE ID
Cu        <-- GROUND ELECTRODE INFO
GPS INFO: 01/10/19 16:16:42 1616.7000 3443.6088 115.7350 W 946.6
OPERATOR: KP
COMMENT: N/S CRS: .95/.96 DCV: 3.5 ACV:1
E/W CRS: .85/.86 DCV: 1.5 ACV: 1
Redeployed site for run b b/c possible animal disturbance
'''
property file_size

Size of the file

property fn

Full path to NIMS file

parse_header_dict(header_dict=None)[source]

parse the header dictionary into something useful

read_header(fn=None)[source]

read header information

Parameters

fn (string or pathlib.Path) – full path to file to read

Raises

mth5.io.nims.NIMSError if something is not right.

property station

Station ID

class mth5.io.nims.Response(system_id=None, **kwargs)[source]

Bases: object

Common NIMS response filters for electric and magnetic channels

dipole_filter(length)[source]

Make a dipole filter

Parameters

length (float) – dipole length in meters

Returns

DESCRIPTION

Return type

TYPE

property electric_conversion

electric channel conversion from counts to volts

property electric_high_pass_hp

1-pole high pass filter for 1 Hz ('hp' hardware) instruments

property electric_high_pass_pc

1-pole high pass filter for 8 Hz ('pc' hardware) instruments

property electric_low_pass

5-pole electric low pass filter

property electric_physical_units

DESCRIPTION :rtype: TYPE

Type

return

get_channel_response(channel, dipole_length=1)[source]

Get the full channel response filter

Parameters
  • channel (string) – channel name

  • dipole_length (float, optional) – dipole length in meters, defaults to 1

get_electric_high_pass(hardware='pc')[source]

get the electric high pass filter based on the hardware

property magnetic_conversion

DESCRIPTION :rtype: TYPE

Type

return

property magnetic_low_pass

Low pass 3 pole filter

Returns

DESCRIPTION

Return type

TYPE

mth5.io.nims.read_nims(fn)[source]
Parameters

fn (string or Path) – full path to a NIMS DATA.BIN file

Returns

RunTS object containing the NIMS data

Return type

mth5.timeseries.RunTS

mth5.io.phoenix package
Subpackages
mth5.io.phoenix.readers package
Subpackages
mth5.io.phoenix.readers.contiguous package
Submodules
mth5.io.phoenix.readers.contiguous.decimated_continuous_reader module

Module to read and parse native Phoenix Geophysics data formats of the MTU-5C Family.

This module implements streamed readers for decimated continuous time series formats of the MTU-5C family.

author

Jorge Torres-Solis

Revised 2022 by J. Peacock

class mth5.io.phoenix.readers.contiguous.decimated_continuous_reader.DecimatedContinuousReader(path, num_files=1, report_hw_sat=False, **kwargs)[source]

Bases: TSReaderBase

Class to create a streamer for continuous decimated time series, i.e. ‘td_150’, ‘td_30’

These files have no sub header information.

read()[source]

Read in the full data from the file given.

Returns

single channel data array

Return type

numpy.ndarray

read_sequence(start=0, end=None)[source]

Read a sequence of files

Parameters
  • start (integer, optional) – starting index in the sequence, defaults to 0

  • end (integer, optional) – ending index in the sequence to read, defaults to None

Returns

data within the given sequence range

Return type

numpy.ndarray

property segment_end_time

estimate end time

The first sequence starts 1 second later than the set start time due to initiation within the data logger

Returns

estimated end time from number of samples

Return type

mt_metadata.utils.mttime.MTime

property segment_start_time

estimate the segment start time based on sequence number

The first sequence starts 1 second later than the set start time due to initiation within the data logger

Returns

start time of the recording

Return type

mt_metadata.utils.mttime.MTime

property sequence_end
property sequence_start
to_channel_ts(rxcal_fn=None, scal_fn=None)[source]

convert to a ChannelTS object

Returns

ChannelTS object with data and metadata

Return type

mth5.timeseries.ChannelTS

Module contents
class mth5.io.phoenix.readers.contiguous.DecimatedContinuousReader(path, num_files=1, report_hw_sat=False, **kwargs)[source]

Bases: TSReaderBase

Class to create a streamer for continuous decimated time series, i.e. ‘td_150’, ‘td_30’

These files have no sub header information.

read()[source]

Read in the full data from the file given.

Returns

single channel data array

Return type

numpy.ndarray

read_sequence(start=0, end=None)[source]

Read a sequence of files

Parameters
  • start (integer, optional) – starting index in the sequence, defaults to 0

  • end (integer, optional) – ending index in the sequence to read, defaults to None

Returns

data within the given sequence range

Return type

numpy.ndarray

property segment_end_time

estimate end time

The first sequence starts 1 second later than the set start time due to initiation within the data logger

Returns

estimated end time from number of samples

Return type

mt_metadata.utils.mttime.MTime

property segment_start_time

estimate the segment start time based on sequence number

The first sequence starts 1 second later than the set start time due to initiation within the data logger

Returns

start time of the recording

Return type

mt_metadata.utils.mttime.MTime

property sequence_end
property sequence_start
to_channel_ts(rxcal_fn=None, scal_fn=None)[source]

convert to a ChannelTS object

Returns

ChannelTS object with data and metadata

Return type

mth5.timeseries.ChannelTS

mth5.io.phoenix.readers.native package
Submodules
mth5.io.phoenix.readers.native.native_reader module

Module to read and parse native Phoenix Geophysics data formats of the MTU-5C Family.

This module implements Streamed readers for segmented-decimated time series formats of the MTU-5C family.

author

Jorge Torres-Solis

Revised 2022 by J. Peacock

class mth5.io.phoenix.readers.native.native_reader.NativeReader(path, num_files=1, scale_to=1, header_length=128, last_frame=0, ad_plus_minus_range=5.0, channel_type='E', report_hw_sat=False, **kwargs)[source]

Bases: TSReaderBase

Native sampling rate ‘Raw’ time series reader class, these are the .bin files. They are formatted with a header of 128 bytes then frames of 64.

Each frame is 20 × 3-byte (24-bit) data points followed by a 4-byte footer.
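The frame layout (20 three-byte samples plus a 4-byte footer, 64 bytes total) can be decoded with the standard library as below. The 24-bit byte order is assumed little-endian here for illustration; check the actual reader before relying on this for real Phoenix data.

```python
import struct

FRAME_SIZE = 64
SAMPLES_PER_FRAME = 20

def decode_frame(frame):
    # 20 x 3-byte samples followed by one 4-byte footer (64 bytes total).
    # The byte order is an assumption for this sketch.
    assert len(frame) == FRAME_SIZE
    samples = [
        int.from_bytes(frame[i * 3:(i + 1) * 3], "little", signed=True)
        for i in range(SAMPLES_PER_FRAME)
    ]
    footer = struct.unpack("<I", frame[SAMPLES_PER_FRAME * 3:])[0]
    return samples, footer

# Synthetic frame: twenty 24-bit samples of value 1 and a zero footer.
frame = b"\x01\x00\x00" * SAMPLES_PER_FRAME + b"\x00\x00\x00\x00"
samples, footer = decode_frame(frame)
print(samples[:3], footer)  # -> [1, 1, 1] 0
```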

property npts_per_frame
read()[source]

Read the full data file.

Note

This uses numpy.lib.stride_tricks.as_strided which can be unstable if the bytes are not the correct length. See notes by numpy.

Got this solution from: https://stackoverflow.com/questions/12080279/how-do-i-create-a-numpy-dtype-that-includes-24-bit-integers?msclkid=3398046ecd6511ec9a37394f28c5aaba

Returns

scaled data and footer

Return type

tuple (data, footer)
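The frame layout described above (20 x 3-byte samples plus a 4-byte footer per 64-byte frame) can be decoded with plain numpy. This is an illustrative sketch, not the package's internal as_strided implementation, and the little-endian byte order is an assumption:

```python
import numpy as np

FRAME_SIZE = 64          # bytes per frame
SAMPLES_PER_FRAME = 20   # 3-byte (24-bit) samples per frame
FOOTER_SIZE = 4          # trailing footer bytes per frame
assert SAMPLES_PER_FRAME * 3 + FOOTER_SIZE == FRAME_SIZE

def decode_frame(frame: bytes):
    """Decode one 64-byte frame into 20 signed 24-bit samples and its footer."""
    raw = np.frombuffer(frame[: SAMPLES_PER_FRAME * 3], dtype=np.uint8)
    b = raw.reshape(SAMPLES_PER_FRAME, 3).astype(np.int32)
    value = b[:, 0] | (b[:, 1] << 8) | (b[:, 2] << 16)            # assemble 24-bit words
    value = np.where(value >= 1 << 23, value - (1 << 24), value)  # sign-extend
    footer = int.from_bytes(frame[-FOOTER_SIZE:], "little")
    return value, footer

# Synthetic frame: samples 0..18 plus one negative sample, footer of 1.
samples = list(range(19)) + [-1]
frame = b"".join(s.to_bytes(3, "little", signed=True) for s in samples)
frame += (1).to_bytes(4, "little")
data, footer = decode_frame(frame)
```

The negative sample exercises the sign extension from 24 to 32 bits.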

read_frames(num_frames)[source]

Read the given number of frames from the data.

Note

The seek position is not reset, so successive calls continue reading from the stream's current position (tell).

Parameters

num_frames (integer) – Number of frames to read

Returns

Scaled data from the given number of frames

Return type

np.ndarray(dtype=float)
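The stateful behavior noted above can be mimicked with any file-like stream: each call continues from the position the previous call left off. A minimal sketch using an in-memory stream of zero-filled stand-in frames:

```python
import io

FRAME_SIZE = 64  # bytes per frame in the native .bin format

# Stand-in stream of 4 zero-filled frames (real frames hold 24-bit data).
stream = io.BytesIO(bytes(4 * FRAME_SIZE))

def read_frames(stream, num_frames):
    """Read num_frames frames from the stream's current position."""
    return stream.read(num_frames * FRAME_SIZE)

first = read_frames(stream, 2)   # frames 0-1
second = read_frames(stream, 2)  # frames 2-3: the position carried over
assert stream.tell() == 4 * FRAME_SIZE  # seek was never reset
```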

read_sequence(start=0, end=None)[source]

Read sequence of files into a single array

Parameters
  • start (integer, optional) – sequence start, defaults to 0

  • end (integer, optional) – sequence end, defaults to None

Returns

scaled data

Return type

np.ndarray(dtype=float32)

Returns

footer

Return type

np.ndarray(dtype=int32)

skip_frames(num_frames)[source]

Skip frames of the stream

Parameters

num_frames (integer) – number of frames to skip

Returns

True if the end of the file has been reached

Return type

boolean

to_channel_ts(rxcal_fn=None, scal_fn=None)[source]

convert to a ChannelTS object

Returns

ChannelTS object containing the data and metadata

Return type

mth5.timeseries.ChannelTS

Module contents
class mth5.io.phoenix.readers.native.NativeReader(path, num_files=1, scale_to=1, header_length=128, last_frame=0, ad_plus_minus_range=5.0, channel_type='E', report_hw_sat=False, **kwargs)[source]

Bases: TSReaderBase

Native sampling rate ‘Raw’ time series reader class for the .bin files. These are formatted with a 128-byte header followed by 64-byte frames.

Each frame is 20 x 3-byte (24-bit) data points followed by a 4-byte footer.

property npts_per_frame
read()[source]

Read the full data file.

Note

This uses numpy.lib.stride_tricks.as_strided, which can be unstable if the byte buffer is not the correct length; see the numpy documentation for as_strided.

Got this solution from: https://stackoverflow.com/questions/12080279/how-do-i-create-a-numpy-dtype-that-includes-24-bit-integers?msclkid=3398046ecd6511ec9a37394f28c5aaba

Returns

scaled data and footer

Return type

tuple (data, footer)

read_frames(num_frames)[source]

Read the given number of frames from the data.

Note

The seek position is not reset, so successive calls continue reading from the stream's current position (tell).

Parameters

num_frames (integer) – Number of frames to read

Returns

Scaled data from the given number of frames

Return type

np.ndarray(dtype=float)

read_sequence(start=0, end=None)[source]

Read sequence of files into a single array

Parameters
  • start (integer, optional) – sequence start, defaults to 0

  • end (integer, optional) – sequence end, defaults to None

Returns

scaled data

Return type

np.ndarray(dtype=float32)

Returns

footer

Return type

np.ndarray(dtype=int32)

skip_frames(num_frames)[source]

Skip frames of the stream

Parameters

num_frames (integer) – number of frames to skip

Returns

True if the end of the file has been reached

Return type

boolean

to_channel_ts(rxcal_fn=None, scal_fn=None)[source]

convert to a ChannelTS object

Returns

ChannelTS object containing the data and metadata

Return type

mth5.timeseries.ChannelTS

mth5.io.phoenix.readers.segmented package
Submodules
mth5.io.phoenix.readers.segmented.decimated_segmented_reader module

Module to read and parse native Phoenix Geophysics data formats of the MTU-5C Family

This module implements streamed readers for segmented-decimated time series formats of the MTU-5C family.

author

Jorge Torres-Solis

Revised 2022 by J. Peacock

class mth5.io.phoenix.readers.segmented.decimated_segmented_reader.DecimatedSegmentCollection(path, num_files=1, report_hw_sat=False, **kwargs)[source]

Bases: TSReaderBase

Class to create a streamer for segmented decimated time series, i.e. ‘td_24k’

These files have a sub header

read_segments(metadata_only=False)[source]

Read the whole file in

Returns

DESCRIPTION

Return type

TYPE

to_channel_ts(rxcal_fn=None, scal_fn=None)[source]

convert to a ChannelTS object

Returns

ChannelTS object containing the data and metadata

Return type

mth5.timeseries.ChannelTS

class mth5.io.phoenix.readers.segmented.decimated_segmented_reader.DecimatedSegmentedReader(path, num_files=1, report_hw_sat=False, **kwargs)[source]

Bases: TSReaderBase

Class to create a streamer for segmented decimated time series, i.e. ‘td_24k’

These files have a sub header

read_segment(metadata_only=False)[source]

Read in a single segment

Parameters

metadata_only (boolean, optional) – read only the segment subheader metadata, defaults to False

Returns

DESCRIPTION

Return type

TYPE

to_channel_ts(rxcal_fn=None, scal_fn=None)[source]

convert to a ChannelTS object

Returns

ChannelTS object containing the data and metadata

Return type

mth5.timeseries.ChannelTS

class mth5.io.phoenix.readers.segmented.decimated_segmented_reader.Segment(stream, **kwargs)[source]

Bases: SubHeader

A segment class to hold a single segment

read_segment(metadata_only=False)[source]

Read in the whole segment

Returns

DESCRIPTION

Return type

TYPE

property segment_end_time

estimate end time

Returns

estimated end time from the number of samples

Return type

mt_metadata.utils.mttime.MTime

property segment_start_time

estimate the segment start time based on sequence number

Returns

start time of the segment

Return type

mt_metadata.utils.mttime.MTime

class mth5.io.phoenix.readers.segmented.decimated_segmented_reader.SubHeader(**kwargs)[source]

Bases: object

Class for subheader of segmented files

property gps_time_stamp

GPS time stamp in UTC

property missing_count
property n_samples
property saturation_count
unpack_header(stream)[source]
property value_max
property value_mean
property value_min
Module contents
class mth5.io.phoenix.readers.segmented.DecimatedSegmentedReader(path, num_files=1, report_hw_sat=False, **kwargs)[source]

Bases: TSReaderBase

Class to create a streamer for segmented decimated time series, i.e. ‘td_24k’

These files have a sub header

read_segment(metadata_only=False)[source]

Read in a single segment

Parameters

metadata_only (boolean, optional) – read only the segment subheader metadata, defaults to False

Returns

DESCRIPTION

Return type

TYPE

to_channel_ts(rxcal_fn=None, scal_fn=None)[source]

convert to a ChannelTS object

Returns

ChannelTS object containing the data and metadata

Return type

mth5.timeseries.ChannelTS

Submodules
mth5.io.phoenix.readers.base module

Module to read and parse native Phoenix Geophysics data formats of the MTU-5C Family.

This module implements streamed readers for segmented-decimated, continuous-decimated, and native sampling rate time series formats of the MTU-5C family.

author

Jorge Torres-Solis

Revised 2022 by J. Peacock

class mth5.io.phoenix.readers.base.TSReaderBase(path, num_files=1, header_length=128, report_hw_sat=False, **kwargs)[source]

Bases: Header

Generic reader that all other readers will inherit

property base_dir

parent directory of the file

Return type

pathlib.Path

property base_path

full path of the file

Return type

pathlib.Path

property channel_metadata

Channel metadata updated from recmeta

Returns

DESCRIPTION

Return type

TYPE

close()[source]

Close the file

property config_file_path
property file_extension

file extension

Return type

string

property file_name

name of the file

Return type

string

property file_size

file size in bytes

Return type

integer

get_channel_response_filter(rxcal_fn=None, scal_fn=None)[source]

Get the channel response filter

Parameters
  • rxcal_fn (string or pathlib.Path, optional) – path to the rxcal.json receiver calibration file, defaults to None

  • scal_fn (string or pathlib.Path, optional) – path to the sensor calibration file, defaults to None

Returns

DESCRIPTION

Return type

TYPE

get_config_object()[source]

Read a config file into an object.

Returns

DESCRIPTION

Return type

TYPE

get_dipole_filter()[source]
Returns

DESCRIPTION

Return type

TYPE

get_lowpass_filter_name()[source]

Get the lowpass filter used by the receiver pre-decimation.

Returns

DESCRIPTION

Return type

TYPE

get_receiver_lowpass_filter(rxcal_fn)[source]

Get the receiver lowpass filter from the rxcal.json file.

Parameters

rxcal_fn (string or pathlib.Path) – path to the rxcal.json receiver calibration file

Returns

DESCRIPTION

Return type

TYPE

get_receiver_metadata_object()[source]

Read recmeta.json into an object

Returns

DESCRIPTION

Return type

TYPE

get_sensor_filter(scal_fn)[source]
Parameters

scal_fn (string or pathlib.Path) – path to the sensor calibration file

Returns

DESCRIPTION

Return type

TYPE

get_v_to_mv_filter()[source]

The data units are volts; this filter converts them to millivolts.

property instrument_id

instrument ID

Return type

string

property max_samples

Max number of samples in a file which is:

(total number of bytes - header length) / frame size * n samples per frame

Returns

max number of samples in a file

Return type

int
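The max_samples formula can be checked with a worked example (the file size below is hypothetical; the header, frame, and samples-per-frame constants match the reader defaults documented above):

```python
# Worked example of the max_samples formula for a native-rate .bin file.
# The file size is hypothetical; the constants are the reader defaults.
HEADER_LENGTH = 128      # bytes
FRAME_SIZE = 64          # bytes per frame
SAMPLES_PER_FRAME = 20   # 24-bit samples per frame

file_size = HEADER_LENGTH + 1000 * FRAME_SIZE  # header plus 1000 frames

max_samples = (file_size - HEADER_LENGTH) // FRAME_SIZE * SAMPLES_PER_FRAME
print(max_samples)  # 20000
```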

open_file_seq(file_seq_num=None)[source]

Open the file in the sequence given the sequence number.

Parameters

file_seq_num (integer, optional) – sequence number to open, defaults to None

Returns

True if the file is now open, False if it is not

Return type

boolean

open_next()[source]

Open the next file in the sequence.

Returns

True if the next file is now open, False if it is not

Return type

boolean

property recmeta_file_path
property run_metadata

Run metadata updated from recmeta

Returns

DESCRIPTION

Return type

TYPE

property seq

sequence number of the file

Return type

int

property sequence_list

get all the files in the sequence sorted by sequence number

property station_metadata

station metadata updated from recmeta

Returns

DESCRIPTION

Return type

TYPE

update_channel_map_from_recmeta()[source]
mth5.io.phoenix.readers.header module

Adapted from the TimeSeries reader, making all attributes properties for easier reading and testing.

Module to read and parse native Phoenix Geophysics data formats of the MTU-5C Family

This module implements streamed readers for segmented-decimated, continuous-decimated, and native sampling rate time series formats of the MTU-5C family.

author

Jorge Torres-Solis

Revised 2022 by J. Peacock

class mth5.io.phoenix.readers.header.Header(**kwargs)[source]

Bases: object

The header is 128 bytes with a specific format. This reads in the 128 bytes and provides properties to read each attribute of the header in the correct way.

property attenuator_gain
property battery_voltage_v
property board_model_main
property board_model_revision
property bytes_per_sample
property ch_board_model
property ch_board_serial
property ch_firmware
property channel_id
property channel_main_gain
property channel_type
property decimation_node_id
property detected_channel_type
property file_sequence
property file_type
property file_version
property frag_period
property frame_rollover_count
property frame_size
property frame_size_bytes
property future1
property future2
get_channel_metadata()[source]

translate header metadata to channel metadata

get_run_metadata()[source]

translate to run metadata

Returns

DESCRIPTION

Return type

TYPE

get_station_metadata()[source]

translate to station metadata

property gps_elevation
property gps_horizontal_accuracy

In millimeters

property gps_lat
property gps_long
property gps_vertical_accuracy

In millimeters

property hardware_configuration
property header_length
property instrument_serial_number
property instrument_type
property intrinsic_circuitry_gain

This property adjusts the intrinsic circuitry gain based on the sensor range configuration in the configuration fingerprint.

For this, we consider that the electric channel, the calibration path, and H-legacy sensors all go through a 1/4 gain stage and then get a virtual x2 gain from the single-ended-to-differential stage before the A/D. For newer (differential) sensors there is only a 1/2 gain stage instead of the 1/4 gain stage.

Therefore, for the electric, calibration, and legacy sensor cases the circuitry gain is 1/2, while for newer sensors it is 1.

Note

Circuitry Gain not directly configurable by the user
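The gain rules above can be condensed into a small illustrative function; this is a sketch of the stated rules, not the package's internal code, and the argument names are hypothetical:

```python
def intrinsic_circuitry_gain(channel_type: str, differential_sensor: bool = False) -> float:
    """Illustrative sketch of the gain rules; argument names are hypothetical.

    Electric channels, the calibration path, and H-legacy sensors pass
    through a 1/4 gain stage plus a virtual x2 gain (net 1/2); newer
    differential sensors have a 1/2 gain stage instead (net 1).
    """
    if channel_type == "E" or not differential_sensor:
        return 0.25 * 2.0  # 1/2
    return 0.5 * 2.0       # 1

print(intrinsic_circuitry_gain("E"))                            # 0.5
print(intrinsic_circuitry_gain("H", differential_sensor=True))  # 1.0
```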

property lp_frequency
property max_signal
property min_signal
property missing_frames
property preamp_gain
property recording_id
property recording_start_time

The actual data recording starts 1 second after the set start time. This is caused by the data logger starting up and initializing filters. This is accounted for in the segment start time.

See https://github.com/kujaku11/PhoenixGeoPy/tree/main/Docs for more information.

The time recorded is GPS time.

Returns

recording start time in GPS time

Return type

mt_metadata.utils.mttime.MTime

property sample_rate
property sample_rate_base
property sample_rate_exp
property saturated_frames
property timing_flags
property timing_sat_count
property timing_stability
property timing_status
property total_circuitry_gain
property total_selectable_gain
unpack_header(stream)[source]
mth5.io.phoenix.readers.phx_json module
Module contents
class mth5.io.phoenix.readers.DecimatedContinuousReader(path, num_files=1, report_hw_sat=False, **kwargs)[source]

Bases: TSReaderBase

Class to create a streamer for continuous decimated time series, i.e. ‘td_150’, ‘td_30’

These files have no sub header information.

read()[source]

Read in the full data from the file given.

Returns

single channel data array

Return type

numpy.ndarray

read_sequence(start=0, end=None)[source]

Read a sequence of files

Parameters
  • start (integer, optional) – starting index in the sequence, defaults to 0

  • end (integer, optional) – ending index in the sequence to read, defaults to None

Returns

data within the given sequence range

Return type

numpy.ndarray

property segment_end_time

estimate end time

The first sequence starts 1 second later than the set start time due to initiation within the data logger

Returns

estimated end time from number of samples

Return type

mt_metadata.utils.mttime.MTime

property segment_start_time

estimate the segment start time based on sequence number

The first sequence starts 1 second later than the set start time due to initiation within the data logger

Returns

start time of the recording

Return type

mt_metadata.utils.mttime.MTime

property sequence_end
property sequence_start
to_channel_ts(rxcal_fn=None, scal_fn=None)[source]

convert to a ChannelTS object

Returns

ChannelTS object containing the data and metadata

Return type

mth5.timeseries.ChannelTS

class mth5.io.phoenix.readers.DecimatedSegmentedReader(path, num_files=1, report_hw_sat=False, **kwargs)[source]

Bases: TSReaderBase

Class to create a streamer for segmented decimated time series, i.e. ‘td_24k’

These files have a sub header

read_segment(metadata_only=False)[source]

Read in a single segment

Parameters

metadata_only (boolean, optional) – read only the segment subheader metadata, defaults to False

Returns

DESCRIPTION

Return type

TYPE

to_channel_ts(rxcal_fn=None, scal_fn=None)[source]

convert to a ChannelTS object

Returns

ChannelTS object containing the data and metadata

Return type

mth5.timeseries.ChannelTS

class mth5.io.phoenix.readers.Header(**kwargs)[source]

Bases: object

The header is 128 bytes with a specific format. This reads in the 128 bytes and provides properties to read each attribute of the header in the correct way.

property attenuator_gain
property battery_voltage_v
property board_model_main
property board_model_revision
property bytes_per_sample
property ch_board_model
property ch_board_serial
property ch_firmware
property channel_id
property channel_main_gain
property channel_type
property decimation_node_id
property detected_channel_type
property file_sequence
property file_type
property file_version
property frag_period
property frame_rollover_count
property frame_size
property frame_size_bytes
property future1
property future2
get_channel_metadata()[source]

translate header metadata to channel metadata

get_run_metadata()[source]

translate to run metadata

Returns

DESCRIPTION

Return type

TYPE

get_station_metadata()[source]

translate to station metadata

property gps_elevation
property gps_horizontal_accuracy

In millimeters

property gps_lat
property gps_long
property gps_vertical_accuracy

In millimeters

property hardware_configuration
property header_length
property instrument_serial_number
property instrument_type
property intrinsic_circuitry_gain

This property adjusts the intrinsic circuitry gain based on the sensor range configuration in the configuration fingerprint.

For this, we consider that the electric channel, the calibration path, and H-legacy sensors all go through a 1/4 gain stage and then get a virtual x2 gain from the single-ended-to-differential stage before the A/D. For newer (differential) sensors there is only a 1/2 gain stage instead of the 1/4 gain stage.

Therefore, for the electric, calibration, and legacy sensor cases the circuitry gain is 1/2, while for newer sensors it is 1.

Note

Circuitry Gain not directly configurable by the user

property lp_frequency
property max_signal
property min_signal
property missing_frames
property preamp_gain
property recording_id
property recording_start_time

The actual data recording starts 1 second after the set start time. This is caused by the data logger starting up and initializing filters. This is accounted for in the segment start time.

See https://github.com/kujaku11/PhoenixGeoPy/tree/main/Docs for more information.

The time recorded is GPS time.

Returns

recording start time in GPS time

Return type

mt_metadata.utils.mttime.MTime

property sample_rate
property sample_rate_base
property sample_rate_exp
property saturated_frames
property timing_flags
property timing_sat_count
property timing_stability
property timing_status
property total_circuitry_gain
property total_selectable_gain
unpack_header(stream)[source]
class mth5.io.phoenix.readers.NativeReader(path, num_files=1, scale_to=1, header_length=128, last_frame=0, ad_plus_minus_range=5.0, channel_type='E', report_hw_sat=False, **kwargs)[source]

Bases: TSReaderBase

Native sampling rate ‘Raw’ time series reader class for the .bin files. These are formatted with a 128-byte header followed by 64-byte frames.

Each frame is 20 x 3-byte (24-bit) data points followed by a 4-byte footer.

property npts_per_frame
read()[source]

Read the full data file.

Note

This uses numpy.lib.stride_tricks.as_strided, which can be unstable if the byte buffer is not the correct length; see the numpy documentation for as_strided.

Got this solution from: https://stackoverflow.com/questions/12080279/how-do-i-create-a-numpy-dtype-that-includes-24-bit-integers?msclkid=3398046ecd6511ec9a37394f28c5aaba

Returns

scaled data and footer

Return type

tuple (data, footer)

read_frames(num_frames)[source]

Read the given number of frames from the data.

Note

The seek position is not reset, so successive calls continue reading from the stream's current position (tell).

Parameters

num_frames (integer) – Number of frames to read

Returns

Scaled data from the given number of frames

Return type

np.ndarray(dtype=float)

read_sequence(start=0, end=None)[source]

Read sequence of files into a single array

Parameters
  • start (integer, optional) – sequence start, defaults to 0

  • end (integer, optional) – sequence end, defaults to None

Returns

scaled data

Return type

np.ndarray(dtype=float32)

Returns

footer

Return type

np.ndarray(dtype=int32)

skip_frames(num_frames)[source]

Skip frames of the stream

Parameters

num_frames (integer) – number of frames to skip

Returns

True if the end of the file has been reached

Return type

boolean

to_channel_ts(rxcal_fn=None, scal_fn=None)[source]

convert to a ChannelTS object

Returns

ChannelTS object containing the data and metadata

Return type

mth5.timeseries.ChannelTS

class mth5.io.phoenix.readers.PhoenixCalibration(cal_fn=None, **kwargs)[source]

Bases: object

property base_filter_name
property cal_fn
property calibration_date
get_filter(channel, filter_name)[source]

get the lowpass filter for the given channel and lowpass value

Parameters
  • channel (TYPE) – DESCRIPTION

  • filter_name (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

get_filter_lp_name(channel, max_freq)[source]

get the filter name as

{instrument_model}_{instrument_type}_{inst_serial}_{channel}_{max_freq}_lp

Parameters
  • channel (TYPE) – DESCRIPTION

  • max_freq (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

get_filter_sensor_name(sensor)[source]

get the filter name as

{instrument_model}_{instrument_type}_{inst_serial}_{sensor}

Parameters

sensor (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

get_max_freq(freq)[source]

Name the filter {ch}_{max_freq}_lp_.

Parameters

freq (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

read(cal_fn=None)[source]
Parameters

cal_fn (TYPE, optional) – DESCRIPTION, defaults to None

Returns

DESCRIPTION

Return type

TYPE

class mth5.io.phoenix.readers.PhoenixConfig(fn=None, **kwargs)[source]

Bases: object

A container for the config.json file used to control the recording

property auto_power_enabled
property config
property empower_version
property fn
has_obj()[source]
property mtc150_reset
property network
read(fn=None)[source]

read a config.json file that is in the Phoenix format

Parameters

fn (TYPE, optional) – DESCRIPTION, defaults to None

Returns

DESCRIPTION

Return type

TYPE

property receiver
property schedule
station_metadata()[source]
property surveyTechnique
property timezone
property timezone_offset
property version
class mth5.io.phoenix.readers.PhoenixReceiverMetadata(fn=None, **kwargs)[source]

Bases: object

A container for the recmeta.json file used to control the recording

property channel_map
property e1_metadata
property e2_metadata
property fn
get_ch_index(tag)[source]
get_ch_metadata(index)[source]

get channel metadata from index

get_ch_tag(index)[source]
property h1_metadata
property h2_metadata
property h3_metadata
property h4_metadata
property h5_metadata
property h6_metadata
has_obj()[source]
property instrument_id
property lp_filter_base_name
read(fn=None)[source]

read a recmeta.json file that is in the Phoenix format

Parameters

fn (TYPE, optional) – DESCRIPTION, defaults to None

Returns

DESCRIPTION

Return type

TYPE

property run_metadata
property station_metadata
property survey_metadata
class mth5.io.phoenix.readers.TSReaderBase(path, num_files=1, header_length=128, report_hw_sat=False, **kwargs)[source]

Bases: Header

Generic reader that all other readers will inherit

property base_dir

parent directory of the file

Return type

pathlib.Path

property base_path

full path of the file

Return type

pathlib.Path

property channel_metadata

Channel metadata updated from recmeta

Returns

DESCRIPTION

Return type

TYPE

close()[source]

Close the file

property config_file_path
property file_extension

file extension

Return type

string

property file_name

name of the file

Return type

string

property file_size

file size in bytes

Return type

integer

get_channel_response_filter(rxcal_fn=None, scal_fn=None)[source]

Get the channel response filter

Parameters
  • rxcal_fn (string or pathlib.Path, optional) – path to the rxcal.json receiver calibration file, defaults to None

  • scal_fn (string or pathlib.Path, optional) – path to the sensor calibration file, defaults to None

Returns

DESCRIPTION

Return type

TYPE

get_config_object()[source]

Read a config file into an object.

Returns

DESCRIPTION

Return type

TYPE

get_dipole_filter()[source]
Returns

DESCRIPTION

Return type

TYPE

get_lowpass_filter_name()[source]

Get the lowpass filter used by the receiver pre-decimation.

Returns

DESCRIPTION

Return type

TYPE

get_receiver_lowpass_filter(rxcal_fn)[source]

Get the receiver lowpass filter from the rxcal.json file.

Parameters

rxcal_fn (string or pathlib.Path) – path to the rxcal.json receiver calibration file

Returns

DESCRIPTION

Return type

TYPE

get_receiver_metadata_object()[source]

Read recmeta.json into an object

Returns

DESCRIPTION

Return type

TYPE

get_sensor_filter(scal_fn)[source]
Parameters

scal_fn (string or pathlib.Path) – path to the sensor calibration file

Returns

DESCRIPTION

Return type

TYPE

get_v_to_mv_filter()[source]

The data units are volts; this filter converts them to millivolts.

property instrument_id

instrument ID

Return type

string

property max_samples

Max number of samples in a file which is:

(total number of bytes - header length) / frame size * n samples per frame

Returns

max number of samples in a file

Return type

int

open_file_seq(file_seq_num=None)[source]

Open the file in the sequence given the sequence number.

Parameters

file_seq_num (integer, optional) – sequence number to open, defaults to None

Returns

True if the file is now open, False if it is not

Return type

boolean

open_next()[source]

Open the next file in the sequence.

Returns

True if the next file is now open, False if it is not

Return type

boolean

property recmeta_file_path
property run_metadata

Run metadata updated from recmeta

Returns

DESCRIPTION

Return type

TYPE

property seq

sequence number of the file

Return type

int

property sequence_list

get all the files in the sequence sorted by sequence number

property station_metadata

station metadata updated from recmeta

Returns

DESCRIPTION

Return type

TYPE

update_channel_map_from_recmeta()[source]
Submodules
mth5.io.phoenix.phoenix_collection module

Phoenix file collection

Created on Thu Aug 4 16:48:47 2022

@author: jpeacock

class mth5.io.phoenix.phoenix_collection.PhoenixCollection(file_path=None, **kwargs)[source]

Bases: Collection

A class to collect the various files in a Phoenix file system and try to organize them into runs.

assign_run_names(df, zeros=4)[source]

Assign run names by looping through start times.

For continuous data a single run is assigned as long as the start and end times of each file align. If there is a break, a new run name is assigned.

For segmented data a new run name is assigned to each segment

Parameters
  • df (pandas.DataFrame) – Dataframe returned by to_dataframe method

  • zeros (integer, optional) – Number of zeros in the run name, defaults to 4

Returns

Dataframe with run names

Return type

pandas.DataFrame
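The zero-padded run numbering controlled by zeros can be sketched as follows (the sr{sample_rate}_ prefix is an assumed naming pattern for illustration, not the package's exact format):

```python
# Illustrative run naming with zero padding (zeros=4), e.g. "sr150_0001".
# The prefix is an assumption for illustration only.
def run_name(sample_rate: int, run_number: int, zeros: int = 4) -> str:
    return f"sr{sample_rate}_{run_number:0{zeros}d}"

print(run_name(150, 1))      # sr150_0001
print(run_name(24000, 12))   # sr24000_0012
```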

get_runs(sample_rates, run_name_zeros=4, calibration_path=None)[source]

Get a list of runs contained within the given folder. First a dataframe is built, from which the runs are extracted.

For continuous data all you need is the first file in the sequence; the reader will read in the entire sequence.

For segmented data it will only read in the given segment, which is slightly different from the original reader.

Parameters
  • sample_rates (list of integers, optional) – list of sample rates to read, defaults to [150, 24000]

  • run_name_zeros (integer, optional) – Number of zeros in the run name, defaults to 4

Returns

List of run dataframes with only the first block of files

Return type

OrderedDict

Example
>>> from mth5.io.phoenix import PhoenixCollection
>>> phx_collection = PhoenixCollection(r"/path/to/station")
>>> run_dict = phx_collection.get_runs(sample_rates=[150, 24000])
to_dataframe(sample_rates=[150, 24000], run_name_zeros=4, calibration_path=None)[source]

Get a dataframe of all the files in a given directory with given columns. Loop over station folders.

Parameters
  • sample_rates (list of integers, optional) – list of sample rates to read, defaults to [150, 24000]

  • run_name_zeros (integer, optional) – Number of zeros in the run name, defaults to 4

Returns

Dataframe with each row representing a single file

Return type

pandas.DataFrame

mth5.io.phoenix.read module

Created on Fri May 6 12:39:34 2022

@author: jpeacock

mth5.io.phoenix.read.open_phoenix(file_name, **kwargs)[source]

Open the given file in the appropriate reader container based on the file extension.

Parameters

file_name (string or pathlib.Path) – full path to file to open

Returns

The appropriate container based on file extension

Return type

mth5.io.phoenix.read.read_phoenix(file_name, **kwargs)[source]

Read a Phoenix file into a ChannelTS object.

Parameters

file_name (string or pathlib.Path) – full path to the file to read

Returns

ChannelTS object containing the data and metadata

Return type

mth5.timeseries.ChannelTS
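The extension-based dispatch that open_phoenix performs can be sketched as a lookup table; the mapping of .td_* extensions to reader classes below is an assumption based on the class docstrings above, not the package's exact table:

```python
from pathlib import Path

# Illustrative extension -> reader dispatch, mirroring open_phoenix.
READERS = {
    ".bin": "NativeReader",                  # native sampling rate files
    ".td_24k": "DecimatedSegmentedReader",   # segmented decimated
    ".td_150": "DecimatedContinuousReader",  # continuous decimated
    ".td_30": "DecimatedContinuousReader",
}

def pick_reader(file_name) -> str:
    suffix = Path(file_name).suffix
    try:
        return READERS[suffix]
    except KeyError:
        raise ValueError(f"unsupported Phoenix file extension: {suffix}")

print(pick_reader("10128_2022-05-01-000000.td_150"))  # DecimatedContinuousReader
```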

Module contents
class mth5.io.phoenix.PhoenixCollection(file_path=None, **kwargs)[source]

Bases: Collection

A class to collect the various files in a Phoenix file system and try to organize them into runs.

assign_run_names(df, zeros=4)[source]

Assign run names by looping through start times.

For continuous data a single run is assigned as long as the start and end times of each file align. If there is a break, a new run name is assigned.

For segmented data a new run name is assigned to each segment

Parameters
  • df (pandas.DataFrame) – Dataframe returned by to_dataframe method

  • zeros (integer, optional) – Number of zeros in the run name, defaults to 4

Returns

Dataframe with run names

Return type

pandas.DataFrame

get_runs(sample_rates, run_name_zeros=4, calibration_path=None)[source]

Get a list of runs contained within the given folder. First a dataframe is built, from which the runs are extracted.

For continuous data all you need is the first file in the sequence; the reader will read in the entire sequence.

For segmented data it will only read in the given segment, which is slightly different from the original reader.

Parameters
  • sample_rates (list of integers, optional) – list of sample rates to read, defaults to [150, 24000]

  • run_name_zeros (integer, optional) – Number of zeros in the run name, defaults to 4

Returns

List of run dataframes with only the first block of files

Return type

OrderedDict

Example
>>> from mth5.io.phoenix import PhoenixCollection
>>> phx_collection = PhoenixCollection(r"/path/to/station")
>>> run_dict = phx_collection.get_runs(sample_rates=[150, 24000])
to_dataframe(sample_rates=[150, 24000], run_name_zeros=4, calibration_path=None)[source]

Get a dataframe of all the files in a given directory with given columns. Loop over station folders.

Parameters
  • sample_rates (list of integers, optional) – list of sample rates to read, defaults to [150, 24000]

  • run_name_zeros (integer, optional) – Number of zeros in the run name, defaults to 4

Returns

Dataframe with each row representing a single file

Return type

pandas.DataFrame

class mth5.io.phoenix.PhoenixConfig(fn=None, **kwargs)[source]

Bases: object

A container for the config.json file used to control the recording

property auto_power_enabled
property config
property empower_version
property fn
has_obj()[source]
property mtc150_reset
property network
read(fn=None)[source]

read a config.json file that is in the Phoenix format

Parameters

fn (TYPE, optional) – DESCRIPTION, defaults to None

Returns

DESCRIPTION

Return type

TYPE

property receiver
property schedule
station_metadata()[source]
property surveyTechnique
property timezone
property timezone_offset
property version
class mth5.io.phoenix.PhoenixReceiverMetadata(fn=None, **kwargs)[source]

Bases: object

A container for the recmeta.json file used to control the recording

property channel_map
property e1_metadata
property e2_metadata
property fn
get_ch_index(tag)[source]
get_ch_metadata(index)[source]

get channel metadata from index

get_ch_tag(index)[source]
property h1_metadata
property h2_metadata
property h3_metadata
property h4_metadata
property h5_metadata
property h6_metadata
has_obj()[source]
property instrument_id
property lp_filter_base_name
read(fn=None)[source]

read a recmeta.json file that is in the Phoenix format

Parameters

fn (TYPE, optional) – DESCRIPTION, defaults to None

Returns

DESCRIPTION

Return type

TYPE

property run_metadata
property station_metadata
property survey_metadata
mth5.io.phoenix.open_phoenix(file_name, **kwargs)[source]

Open the given file in the appropriate reader container based on the file extension.

Parameters

file_name (string or pathlib.Path) – full path to file to open

Returns

The appropriate container based on file extension

Return type

mth5.io.phoenix.read_phoenix(file_name, **kwargs)[source]

Read a Phoenix file into a ChannelTS object.

Parameters

file_name (string or pathlib.Path) – full path to the file to read

Returns

ChannelTS object containing the data and metadata

Return type

mth5.timeseries.ChannelTS

mth5.io.usgs_ascii package
Submodules
mth5.io.usgs_ascii.usgs_ascii module

Created on Thu Aug 27 16:54:09 2020

author

Jared Peacock

license

MIT

class mth5.io.usgs_ascii.usgs_ascii.USGSascii(fn=None, **kwargs)[source]

Bases: AsciiMetadata

Read and write USGS ascii formatted time series.

Attributes

Description

ts

Pandas dataframe holding the time series data

fn

Full path to .asc file

station_dir

Full path to station directory

meta_notes

Notes of how the station was collected

Example
>>> zc = Z3DCollection()
>>> fn_list = zc.get_time_blocks(z3d_path)
>>> zm = USGSascii()
>>> zm.SurveyID = 'iMUSH'
>>> zm.get_z3d_db(fn_list[0])
>>> zm.read_mtft24_cfg()
>>> zm.CoordinateSystem = 'Geomagnetic North'
>>> zm.SurveyID = 'MT'
>>> zm.write_asc_file(str_fmt='%15.7e')
>>> zm.write_station_info_metadata()
property ex
property ey
property hx

HX

property hy
property hz
read(fn=None)[source]

Read in a USGS ascii file and fill attributes accordingly.

Parameters

fn (string) – full path to .asc file to be read in

to_run_ts()[source]

Get xarray for run

write(save_fn=None, chunk_size=1024, str_fmt='%15.7e', full=True, compress=False, save_dir=None, compress_type='zip', convert_electrics=True)[source]

Write an ascii file in the USGS ascii format.

Parameters
  • save_fn (string) – full path to file name to save the merged ascii to

  • chunk_size (int) – chunk size to write the file in blocks; larger numbers are typically slower.

  • str_fmt (string) – format of the data as written

  • full (boolean [ True | False ]) – write out the complete file, mostly for testing.

  • compress (boolean [ True | False ]) – compress file

  • compress_type (string [ zip | gzip ]) – compress file using zip or gzip

mth5.io.usgs_ascii.usgs_ascii.read_ascii(fn)[source]

read USGS ASCII formatted file

Parameters

fn (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

Module contents
class mth5.io.usgs_ascii.AsciiMetadata(fn=None, **kwargs)[source]

Bases: object

Container for all the important metadata in a USGS ascii file.

Attributes

Description

survey_id

Survey name

site_id

Site name

run_id

Run number

site_latitude

Site latitude in decimal degrees WGS84

site_longitude

Site longitude in decimal degrees WGS84

site_elevation

Site elevation in meters according to the national map

start

Start time of station YYYY-MM-DDThh:mm:ss UTC

end

Stop time of station YYYY-MM-DDThh:mm:ss UTC

sample_rate

Sampling rate samples/second

n_samples

Number of samples

n_channels

Number of channels

coordinate_system

[ Geographic North | Geomagnetic North ]

chn_settings

Channel settings, see below

missing_data_flag

Missing data value

Chn_settings

Keys

Description

ChnNum

site_id+channel number

ChnID

Component [ ex | ey | hx | hy | hz ]

InstrumentID

Data logger + sensor number

Azimuth

Setup angle of component in degrees relative to coordinate_system

Dipole_Length

Dipole length in meters

property elevation

get elevation from national map

property end
property file_size
property fn
get_component_info(comp)[source]
Parameters

comp (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

property latitude
property longitude
property n_channels
property n_samples
read_metadata(fn=None, meta_lines=None)[source]

Read in metadata from a raw string or file and populate all metadata attributes.

Parameters
  • fn (string) – full path to USGS ascii file

  • meta_lines (list) – lines of metadata to read

property run_id
property run_metadata
property sample_rate
property site_id
property start
property station_metadata
property survey_id
property survey_metadata
write_metadata(chn_list=['Ex', 'Ey', 'Hx', 'Hy', 'Hz'])[source]

Write out metadata in the format of USGS ascii.

Returns

list of metadata lines.

Note

meant to use '\n'.join(lines) to write out to a file.

class mth5.io.usgs_ascii.USGSascii(fn=None, **kwargs)[source]

Bases: AsciiMetadata

Read and write USGS ascii formatted time series.

Attributes

Description

ts

Pandas dataframe holding the time series data

fn

Full path to .asc file

station_dir

Full path to station directory

meta_notes

Notes of how the station was collected

Example
>>> zc = Z3DCollection()
>>> fn_list = zc.get_time_blocks(z3d_path)
>>> zm = USGSasc()
>>> zm.SurveyID = 'iMUSH'
>>> zm.get_z3d_db(fn_list[0])
>>> zm.read_mtft24_cfg()
>>> zm.CoordinateSystem = 'Geomagnetic North'
>>> zm.SurveyID = 'MT'
>>> zm.write_asc_file(str_fmt='%15.7e')
>>> zm.write_station_info_metadata()
property ex
property ey
property hx

HX

property hy
property hz
read(fn=None)[source]

Read in a USGS ascii file and fill attributes accordingly.

Parameters

fn (string) – full path to .asc file to be read in

to_run_ts()[source]

Get xarray for run

write(save_fn=None, chunk_size=1024, str_fmt='%15.7e', full=True, compress=False, save_dir=None, compress_type='zip', convert_electrics=True)[source]

Write an ascii file in the USGS ascii format.

Parameters
  • save_fn (string) – full path to file name to save the merged ascii to

  • chunk_size (int) – chunk size to write the file in blocks; larger numbers are typically slower.

  • str_fmt (string) – format of the data as written

  • full (boolean [ True | False ]) – write out the complete file, mostly for testing.

  • compress (boolean [ True | False ]) – compress file

  • compress_type (string [ zip | gzip ]) – compress file using zip or gzip

class mth5.io.usgs_ascii.USGSasciiCollection(file_path=None, **kwargs)[source]

Bases: Collection

Collection of USGS ASCII files.

>>> from mth5.io.usgs_ascii import USGSasciiCollection
>>> lc = USGSasciiCollection(r"/path/to/ascii/files")
>>> run_dict = lc.get_runs(1)
assign_run_names(df, zeros=4)[source]

Assign run names based on start and end times; checks if a file has the same start time as the previous file's end time.

Run names are assigned as sr{sample_rate}_{run_number:0{zeros}}, only if a run name is not already assigned.

Parameters
  • df (pandas.DataFrame) – Dataframe with the appropriate columns

  • zeros (int, optional) – number of zeros in run name, defaults to 4

Returns

Dataframe with run names

Return type

pandas.DataFrame
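The run-name convention above can be reproduced with a plain format string; a minimal sketch (zeros controls the zero padding, as in the method signature):

```python
def make_run_name(sample_rate, run_number, zeros=4):
    """Build a run name as sr{sample_rate}_{run_number:0{zeros}}."""
    return f"sr{sample_rate}_{run_number:0{zeros}}"

# e.g. make_run_name(4, 1) -> "sr4_0001"
```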

to_dataframe(sample_rates=[4], run_name_zeros=4, calibration_path=None)[source]

Create a data frame of each TXT file in a given directory.

Note

If a run name is already present it will not be overwritten

Parameters
  • sample_rates (int or list, optional) – sample rate to get, defaults to [4]

  • run_name_zeros (int, optional) – number of zeros to assign to the run name, defaults to 4

  • calibration_path (string or Path, optional) – path to calibration files, defaults to None

Returns

Dataframe with information of each TXT file in the given directory.

Return type

pandas.DataFrame

Example
>>> from mth5.io.usgs_ascii import USGSasciiCollection
>>> lc = USGSasciiCollection("/path/to/ascii/files")
>>> ascii_df = lc.to_dataframe()
mth5.io.usgs_ascii.read_ascii(fn)[source]

read USGS ASCII formatted file

Parameters

fn (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

mth5.io.zen package
Submodules
mth5.io.zen.coil_response module

Read an amtant.cal file provided by Zonge.

Apparently, the file includes the 6th and 8th harmonic of the given frequency, which is a fancy way of saying f x 6 and f x 8.
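In other words, for each fundamental frequency f listed in the file, the calibration covers f * 6 and f * 8. A trivial sketch of that expansion:

```python
def harmonic_frequencies(fundamentals, harmonics=(6, 8)):
    """Expand each fundamental frequency into its calibrated harmonics."""
    return [f * h for f in fundamentals for h in harmonics]
```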

class mth5.io.zen.coil_response.CoilResponse(calibration_file=None, angular_frequency=False)[source]

Bases: object

property calibration_file
extrapolate(fap)[source]

Extrapolate assuming log-linear relationship
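A log-linear relationship means the amplitude is a straight line in log-log space; a sketch of that assumption with the standard library (illustration only — the mth5 method operates on a frequency-amplitude-phase (fap) filter object):

```python
import math

def extrapolate_amplitude(f1, a1, f2, a2, f_new):
    """Extrapolate amplitude at f_new assuming amplitude is linear in log-log space."""
    slope = (math.log10(a2) - math.log10(a1)) / (math.log10(f2) - math.log10(f1))
    return 10 ** (math.log10(a1) + slope * (math.log10(f_new) - math.log10(f1)))
```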

file_exists()[source]

Check to make sure the file exists

Returns

True if it does, False if it does not

Return type

boolean

get_coil_response_fap(coil_number, extrapolate=True)[source]

Read an amtant.cal file provided by Zonge.

Apparently, the file includes the 6th and 8th harmonic of the given frequency, which is a fancy way of saying f * 6 and f * 8.

Parameters

coil_number (int or string) – ANT4 4 digit serial number

Returns

Frequency look up table

Return type

mt_metadata.timeseries.filters.FrequencyResponseTableFilter

has_coil_number(coil_number)[source]

Test if coil number is in the antenna file

Parameters

coil_number (int or string) – ANT4 serial number

Returns

True if the coil is found, False if it is not

Return type

boolean

read_antenna_file(antenna_calibration_file=None)[source]

Read in the antenna file to get frequency, amplitude, and phase of the proper harmonics (6, 8)

Note

Phase is measured in milli-radians and will be converted to radians.

Parameters

antenna_calibration_file (string or Path) – path to antenna.cal file provided by Zonge

mth5.io.zen.z3d_collection module
Z3DCollection

An object to hold Z3D file information to make processing easier.

Created on Sat Apr 4 12:40:40 2020

@author: peacock

class mth5.io.zen.z3d_collection.Z3DCollection(file_path=None, **kwargs)[source]

Bases: Collection

An object to deal with a collection of Z3D files. Metadata and information are contained within Pandas DataFrames for easy searching.

assign_run_names(df, zeros=3)[source]
Returns

Dataframe with run names

Return type

pandas.DataFrame

get_calibrations(antenna_calibration_file)[source]

Get coil calibrations from the antenna.cal file

Parameters

antenna_calibration_file (string or Path) – path to the antenna.cal file

Returns

DESCRIPTION

Return type

TYPE

to_dataframe(sample_rates=[256, 4096], run_name_zeros=4, calibration_path=None)[source]

Get general z3d information and put information in a dataframe

Parameters

z3d_fn_list (list) – List of files Paths to z3d files

Returns

Dataframe of z3d information

Return type

Pandas.DataFrame

Example
>>> zc_obj = zc.Z3DCollection(r"/home/z3d_files")
>>> z3d_fn_list = zc.get_z3d_fn_list()
>>> z3d_df = zc.get_z3d_info(z3d_fn_list)
>>> # write dataframe to a file to use later
>>> z3d_df.to_csv(r"/home/z3d_files/z3d_info.csv")
mth5.io.zen.z3d_header module
Zen Header
  • Tools for reading and writing files for Zen and processing software

  • Tools for copying data from SD cards

  • Tools for copying schedules to SD cards

Created on Tue Jun 11 10:53:23 2013 Updated August 2020 (JP)

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.io.zen.z3d_header.Z3DHeader(fn=None, fid=None, **kwargs)[source]

Bases: object

Read in the header information of a Z3D file and make each metadata entry an attribute.

Parameters
  • fn (string or pathlib.Path) – full path to Z3D file

  • fid (file) – file object ex. open(Z3Dfile, ‘rb’)

Attributes

Definition

_header_len

length of header in bits (512)

ad_gain

gain of channel

ad_rate

sampling rate in Hz

alt

altitude of the station (not reliable)

attenchannelsmask

not sure

box_number

ZEN box number

box_serial

ZEN box serial number

channel

channel number of the file

channelserial

serial number of the channel board

duty

duty cycle of the transmitter

fpga_buildnum

build number of one of the boards

gpsweek

GPS week

header_str

full header string

lat

latitude of station

logterminal

not sure

long

longitude of the station

main_hex_buildnum

build number of the ZEN box in hexadecimal

numsats

number of GPS satellites

period

period of the transmitter

tx_duty

transmitter duty cycle

tx_freq

transmitter frequency

version

version of the firmware

Example
>>> import mtpy.usgs.zen as zen
>>> Z3Dfn = r"/home/mt/mt01/mt01_20150522_080000_256_EX.Z3D"
>>> header_obj = zen.Z3DHeader()
>>> header_obj.read_header()
convert_value(key_string, value_string)[source]

convert the value to the appropriate units given the key

property data_logger

Data logger name as ZEN{box_number}

read_header(fn=None, fid=None)[source]

Read the header information into appropriate attributes

Parameters
  • fn (string or pathlib.Path) – full path to Z3D file

  • fid (file) – file object ex. open(Z3Dfile, ‘rb’)

Example

>>> import mtpy.usgs.zen as zen
>>> Z3Dfn = r"/home/mt/mt01/mt01_20150522_080000_256_EX.Z3D"
>>> header_obj = zen.Z3DHeader()
>>> header_obj.read_header()
mth5.io.zen.z3d_metadata module

Created on Wed Aug 24 11:35:59 2022

@author: jpeacock

class mth5.io.zen.z3d_metadata.Z3DMetadata(fn=None, fid=None, **kwargs)[source]

Bases: object

Will read in the metadata information of a Z3D file and make each metadata entry an attribute. The attributes are left in capitalization of the Z3D file.

Parameters
  • fn (string or pathlib.Path) – full path to Z3D file

  • fid (file) – file object ex. open(Z3Dfile, ‘rb’)

Attributes

Definition

_header_length

length of header in bits (512)

_metadata_length

length of metadata blocks (512)

_schedule_metadata_len

length of schedule meta data (512)

board_cal

board calibration np.ndarray()

cal_ant

antenna calibration

cal_board

board calibration

cal_ver

calibration version

ch_azimuth

channel azimuth

ch_cmp

channel component

ch_length

channel length (or # of coil)

ch_number

channel number on the ZEN board

ch_xyz1

channel xyz location (not sure)

ch_xyz2

channel xyz location (not sure)

coil_cal

coil calibration np.ndarray (freq, amp, phase)

fid

file object

find_metadata

boolean of finding metadata

fn

full path to Z3D file

gdp_operator

operator of the survey

gdp_progver

program version

job_by

job performed by

job_for

job for

job_name

job name

job_number

job number

m_tell

location in the file where the last metadata block was found.

rx_aspace

electrode spacing

rx_sspace

not sure

rx_xazimuth

x azimuth of electrode

rx_xyz0

not sure

rx_yazimuth

y azimuth of electrode

survey_type

type of survey

unit_length

length units (m)

Example
>>> import mtpy.usgs.zen as zen
>>> Z3Dfn = r"/home/mt/mt01/mt01_20150522_080000_256_EX.Z3D"
>>> header_obj = zen.Z3DMetadata()
>>> header_obj.read_metadata()
read_metadata(fn=None, fid=None)[source]

read meta data

Parameters
  • fn (string) – full path to file, optional if already initialized.

  • fid (file) – open file object, optional if already initialized.

mth5.io.zen.z3d_schedule module

Created on Wed Aug 24 11:24:57 2022

@author: jpeacock

class mth5.io.zen.z3d_schedule.Z3DSchedule(fn=None, fid=None, **kwargs)[source]

Bases: object

Will read in the schedule information of a Z3D file and make each metadata entry an attribute. The attributes are left in capitalization of the Z3D file.

Parameters
  • fn (string or pathlib.Path) – full path to Z3D file

  • fid (file) – file object ex. open(Z3Dfile, ‘rb’)

Attributes

Definition

AutoGain

Auto gain for the channel

Comment

Any comments for the schedule

Date

Date of when the schedule action was started YYYY-MM-DD

Duty

Duty cycle of the transmitter

FFTStacks

FFT stacks from the transmitter

Filename

Name of the file that the ZEN gives it

Gain

Gain of the channel

Log

Log the data [ Y | N ]

NewFile

Create a new file [ Y | N ]

Period

Period of the transmitter

RadioOn

Turn on the radio [ Y | N ]

SR

Sampling Rate in Hz

SamplesPerAcq

Samples per acquisition for the transmitter

Sleep

Set the box to sleep [ Y | N ]

Sync

Sync with GPS [ Y | N ]

Time

Time the schedule action started HH:MM:SS (GPS time)

_header_len

length of header in bits (512)

_schedule_metadata_len

length of schedule metadata in bits (512)

fid

file object of the file

fn

file name to read in

meta_string

string of the schedule

Example
>>> import mtpy.usgs.zen as zen
>>> Z3Dfn = r"/home/mt/mt01/mt01_20150522_080000_256_EX.Z3D"
>>> header_obj = zen.Z3DSchedule()
>>> header_obj.read_schedule()
read_schedule(fn=None, fid=None)[source]

read meta data string

mth5.io.zen.zen module
Zen
  • Tools for reading and writing files for Zen and processing software

  • Tools for copying data from SD cards

  • Tools for copying schedules to SD cards

Created on Tue Jun 11 10:53:23 2013 Updated August 2020 (JP)

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.io.zen.zen.Z3D(fn=None, **kwargs)[source]

Bases: object

Deals with the raw Z3D files output by zen.

Parameters

fn (string) – full path to .Z3D file to be read in

Attributes

Description

Default Value

_block_len

length of data block to read in as chunks for faster reading

65536

_counts_to_mv_conversion

conversion factor to convert counts to mv

9.53674316406e-10

_gps_bytes

number of bytes for a gps stamp

16

_gps_dtype

data type for a gps stamp

see below

_gps_epoch

starting date of GPS time format is a tuple

(1980, 1, 6, 0,

0, 0, -1, -1, 0)

_gps_f0

first gps flag in raw binary

_gps_f1

second gps flag in raw binary

_gps_flag_0

first gps flag as an int32

2147483647

_gps_flag_1

second gps flag as an int32

-2147483648

_gps_stamp_length

bit length of gps stamp

64

_leap_seconds

leap seconds, difference between UTC time and GPS time. GPS time is ahead by this much

16

_week_len

week length in seconds

604800

df

sampling rate of the data

256

fn

Z3D file name

None

gps_flag

full gps flag

_gps_f0+_gps_f1

gps_stamps

np.ndarray of gps stamps

None

header

Z3DHeader object

Z3DHeader

metadata

Z3DMetadata

Z3DMetadata

schedule

Z3DSchedule

Z3DSchedule

time_series

np.ndarray(len_data)

None

units

units in which the data is in

counts

zen_schedule

time when zen was set to run

None

  • gps_dtype is formatted as np.dtype([(‘flag0’, np.int32),

    (‘flag1’, np.int32), (‘time’, np.int32), (‘lat’, np.float64), (‘lon’, np.float64), (‘num_sat’, np.int32), (‘gps_sens’, np.int32), (‘temperature’, np.float32), (‘voltage’, np.float32), (‘num_fpga’, np.int32), (‘num_adc’, np.int32), (‘pps_count’, np.int32), (‘dac_tune’, np.int32), (‘block_len’, np.int32)])

Example
>>> import mtpy.usgs.zen as zen
>>> zt = zen.Zen3D(r"/home/mt/mt00/mt00_20150522_080000_256_EX.Z3D")
>>> zt.read_z3d()
>>> ------- Reading /home/mt/mt00/mt00_20150522_080000_256_EX.Z3D -----
    --> Reading data took: 0.322 seconds
    Scheduled time was 2015-05-22,08:00:16 (GPS time)
    1st good stamp was 2015-05-22,08:00:18 (GPS time)
    difference of 2.00 seconds
    found 6418 GPS time stamps
    found 1642752 data points
>>> zt.plot_time_series()
property azimuth

azimuth of instrument setup

property channel_metadata

Channel metadata

property channel_number
property channel_response
check_start_time()[source]

check to make sure the scheduled start time is similar to the first good gps stamp

property coil_number

coil number

property coil_response

Make the coil response into a FAP filter

Phase must be in radians

property component

channel

convert_counts_to_mv(data)[source]

convert the time series from counts to millivolts

convert_gps_time()[source]

convert gps time integer to relative seconds from gps_week

convert_mv_to_counts(data)[source]

convert millivolts to counts assuming no other scaling has been applied
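The two conversions are inverse scalings by the fixed factor listed in the attribute table (_counts_to_mv_conversion = 9.53674316406e-10). A minimal sketch, assuming (as the attribute table suggests) that multiplying counts by the factor yields millivolts:

```python
# Conversion factor from the Z3D attribute table.
COUNTS_TO_MV = 9.53674316406e-10

def counts_to_mv(counts):
    """Convert raw A/D counts to millivolts."""
    return counts * COUNTS_TO_MV

def mv_to_counts(mv):
    """Convert millivolts back to counts, assuming no other scaling was applied."""
    return mv / COUNTS_TO_MV
```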

property counts2mv_filter

Create a counts2mv coefficient filter

Note

Needs to be 1/channel factor because we divided the instrument response from the data.

property dipole_filter
property dipole_length

dipole length

property elevation

elevation in meters

property end
property file_size
property fn
get_UTC_date_time(gps_week, gps_time)[source]

get the actual date and time of measurement as UTC.

Parameters
  • gps_week (int) – integer value of gps_week that the data was collected

  • gps_time (int) – number of seconds from beginning of gps_week

Returns

mth5.utils.mttime.MTime

get_gps_stamp_index(ts_data, old_version=False)[source]

locate the time stamps in a given time series.

Looks for gps_flag_0 first, if the file is newer, then makes sure the next value is gps_flag_1

Returns

list of gps stamp indices
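The stamp search described above can be sketched on a plain list of int32 values (the real method works on the raw Z3D data stream): scan for _gps_flag_0 (2147483647) and, for newer files, require the next value to be _gps_flag_1 (-2147483648).

```python
GPS_FLAG_0 = 2147483647   # first gps flag as an int32
GPS_FLAG_1 = -2147483648  # second gps flag as an int32

def gps_stamp_indices(ts_data, old_version=False):
    """Return indices in ts_data where a GPS stamp begins."""
    indices = []
    for i, value in enumerate(ts_data):
        if value == GPS_FLAG_0:
            # Older files only wrote the first flag; newer files write both.
            if old_version or (i + 1 < len(ts_data) and ts_data[i + 1] == GPS_FLAG_1):
                indices.append(i)
    return indices
```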

get_gps_time(gps_int, gps_week=0)[source]

from the gps integer get the time in seconds.

Parameters
  • gps_int (int) – integer from the gps time stamp line

  • gps_week (int) – relative gps week; if the number of seconds is larger than a week, a week is subtracted from the seconds and gps_week is incremented by 1

Returns

gps_time as number of seconds from the beginning of the relative gps week.
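Using the constants from the attribute table (_gps_epoch 1980-01-06, _week_len 604800 s, _leap_seconds 16), the UTC conversion can be sketched as follows (illustration only; the mth5 method returns an MTime object):

```python
from datetime import datetime, timedelta, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)  # start of GPS time
WEEK_LEN = 604800      # seconds per GPS week
LEAP_SECONDS = 16      # GPS time is ahead of UTC by this much

def gps_to_utc(gps_week, gps_seconds):
    """Convert a GPS week and seconds-into-week to a UTC datetime."""
    return GPS_EPOCH + timedelta(seconds=gps_week * WEEK_LEN + gps_seconds - LEAP_SECONDS)
```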

property latitude

latitude in decimal degrees

property longitude

longitude in decimal degrees

property n_samples
read_all_info()[source]

Read header, schedule, and metadata

read_z3d(z3d_fn=None)[source]

read in z3d file and populate attributes accordingly

  1. Read in the entire file as chunks as np.int32.

  2. Extract the gps stamps and convert accordingly. Check to make sure gps time stamps are 1 second apart and incrementing as well as checking the number of data points between stamps is the same as the sampling rate.

  3. Converts gps_stamps[‘time’] to seconds relative to header.gps_week

    Note we skip the first two gps stamps because there is something wrong with the data there due to some type of buffering. Therefore the first GPS time is when the time series starts, so you will notice that gps_stamps[0][‘block_len’] = 0, this is because there is nothing previous to this time stamp and so the ‘block_len’ measures backwards from the corresponding time index.

  4. Put the data chunks into Pandas data frame that is indexed by time

Example

>>> from mth5.io import zen
>>> z_obj = zen.Z3D(r"home/mt_data/zen/mt001.z3d")
>>> z_obj.read_z3d()
property run_metadata

Run metadata

property sample_rate

sampling rate

property start
property station

station name

property station_metadata

station metadata

to_channelts()[source]

fill time series object

trim_data()[source]

apparently need to skip the first 2 seconds of data because of something to do with the SD buffer

This method will be deprecated after field testing
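Skipping the first two seconds amounts to dropping 2 * sample_rate samples and shifting the start time accordingly; a minimal sketch:

```python
def trim_first_seconds(data, sample_rate, seconds=2):
    """Drop the first `seconds` worth of samples from a time series."""
    return data[int(seconds * sample_rate):]
```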

validate_gps_time()[source]

make sure each time stamp is 1 second apart

validate_time_blocks()[source]

validate gps time stamps and make sure each block is the proper length

property zen_response

Zen response; not sure the full calibration comes directly from the Z3D file, so skipping for now. Will have to read a Zen##.cal file to get the full calibration. This shouldn't be a big issue because the response should be roughly the same for all channels, and since the TF computes a ratio they will cancel out. Though we should look into this further if just looking at calibrated time series.

property zen_schedule

zen schedule data and time

exception mth5.io.zen.zen.ZenGPSError[source]

Bases: Exception

error for gps timing

exception mth5.io.zen.zen.ZenInputFileError[source]

Bases: Exception

error for input files

exception mth5.io.zen.zen.ZenSamplingRateError[source]

Bases: Exception

error for different sampling rates

mth5.io.zen.zen.read_z3d(fn, calibration_fn=None, logger_file_handler=None)[source]

generic tool to read z3d file

Module contents
class mth5.io.zen.CoilResponse(calibration_file=None, angular_frequency=False)[source]

Bases: object

property calibration_file
extrapolate(fap)[source]

Extrapolate assuming log-linear relationship

file_exists()[source]

Check to make sure the file exists

Returns

True if it does, False if it does not

Return type

boolean

get_coil_response_fap(coil_number, extrapolate=True)[source]

Read an amtant.cal file provided by Zonge.

Apparently, the file includes the 6th and 8th harmonic of the given frequency, which is a fancy way of saying f * 6 and f * 8.

Parameters

coil_number (int or string) – ANT4 4 digit serial number

Returns

Frequency look up table

Return type

mt_metadata.timeseries.filters.FrequencyResponseTableFilter

has_coil_number(coil_number)[source]

Test if coil number is in the antenna file

Parameters

coil_number (int or string) – ANT4 serial number

Returns

True if the coil is found, False if it is not

Return type

boolean

read_antenna_file(antenna_calibration_file=None)[source]

Read in the antenna file to get frequency, amplitude, and phase of the proper harmonics (6, 8)

Note

Phase is measured in milli-radians and will be converted to radians.

Parameters

antenna_calibration_file (string or Path) – path to antenna.cal file provided by Zonge

class mth5.io.zen.Z3D(fn=None, **kwargs)[source]

Bases: object

Deals with the raw Z3D files output by zen.

Parameters

fn (string) – full path to .Z3D file to be read in

Attributes

Description

Default Value

_block_len

length of data block to read in as chunks for faster reading

65536

_counts_to_mv_conversion

conversion factor to convert counts to mv

9.53674316406e-10

_gps_bytes

number of bytes for a gps stamp

16

_gps_dtype

data type for a gps stamp

see below

_gps_epoch

starting date of GPS time format is a tuple

(1980, 1, 6, 0,

0, 0, -1, -1, 0)

_gps_f0

first gps flag in raw binary

_gps_f1

second gps flag in raw binary

_gps_flag_0

first gps flag as an int32

2147483647

_gps_flag_1

second gps flag as an int32

-2147483648

_gps_stamp_length

bit length of gps stamp

64

_leap_seconds

leap seconds, difference between UTC time and GPS time. GPS time is ahead by this much

16

_week_len

week length in seconds

604800

df

sampling rate of the data

256

fn

Z3D file name

None

gps_flag

full gps flag

_gps_f0+_gps_f1

gps_stamps

np.ndarray of gps stamps

None

header

Z3DHeader object

Z3DHeader

metadata

Z3DMetadata

Z3DMetadata

schedule

Z3DSchedule

Z3DSchedule

time_series

np.ndarray(len_data)

None

units

units in which the data is in

counts

zen_schedule

time when zen was set to run

None

  • gps_dtype is formatted as np.dtype([(‘flag0’, np.int32),

    (‘flag1’, np.int32), (‘time’, np.int32), (‘lat’, np.float64), (‘lon’, np.float64), (‘num_sat’, np.int32), (‘gps_sens’, np.int32), (‘temperature’, np.float32), (‘voltage’, np.float32), (‘num_fpga’, np.int32), (‘num_adc’, np.int32), (‘pps_count’, np.int32), (‘dac_tune’, np.int32), (‘block_len’, np.int32)])

Example
>>> import mtpy.usgs.zen as zen
>>> zt = zen.Zen3D(r"/home/mt/mt00/mt00_20150522_080000_256_EX.Z3D")
>>> zt.read_z3d()
>>> ------- Reading /home/mt/mt00/mt00_20150522_080000_256_EX.Z3D -----
    --> Reading data took: 0.322 seconds
    Scheduled time was 2015-05-22,08:00:16 (GPS time)
    1st good stamp was 2015-05-22,08:00:18 (GPS time)
    difference of 2.00 seconds
    found 6418 GPS time stamps
    found 1642752 data points
>>> zt.plot_time_series()
property azimuth

azimuth of instrument setup

property channel_metadata

Channel metadata

property channel_number
property channel_response
check_start_time()[source]

check to make sure the scheduled start time is similar to the first good gps stamp

property coil_number

coil number

property coil_response

Make the coil response into a FAP filter

Phase must be in radians

property component

channel

convert_counts_to_mv(data)[source]

convert the time series from counts to millivolts

convert_gps_time()[source]

convert gps time integer to relative seconds from gps_week

convert_mv_to_counts(data)[source]

convert millivolts to counts assuming no other scaling has been applied

property counts2mv_filter

Create a counts2mv coefficient filter

Note

Needs to be 1/channel factor because we divided the instrument response from the data.

property dipole_filter
property dipole_length

dipole length

property elevation

elevation in meters

property end
property file_size
property fn
get_UTC_date_time(gps_week, gps_time)[source]

get the actual date and time of measurement as UTC.

Parameters
  • gps_week (int) – integer value of gps_week that the data was collected

  • gps_time (int) – number of seconds from beginning of gps_week

Returns

mth5.utils.mttime.MTime

get_gps_stamp_index(ts_data, old_version=False)[source]

locate the time stamps in a given time series.

Looks for gps_flag_0 first, if the file is newer, then makes sure the next value is gps_flag_1

Returns

list of gps stamp indices

get_gps_time(gps_int, gps_week=0)[source]

from the gps integer get the time in seconds.

Parameters
  • gps_int (int) – integer from the gps time stamp line

  • gps_week (int) – relative gps week; if the number of seconds is larger than a week, a week is subtracted from the seconds and gps_week is incremented by 1

Returns

gps_time as number of seconds from the beginning of the relative gps week.

property latitude

latitude in decimal degrees

property longitude

longitude in decimal degrees

property n_samples
read_all_info()[source]

Read header, schedule, and metadata

read_z3d(z3d_fn=None)[source]

read in z3d file and populate attributes accordingly

  1. Read in the entire file as chunks as np.int32.

  2. Extract the gps stamps and convert accordingly. Check to make sure gps time stamps are 1 second apart and incrementing as well as checking the number of data points between stamps is the same as the sampling rate.

  3. Converts gps_stamps[‘time’] to seconds relative to header.gps_week

    Note we skip the first two gps stamps because there is something wrong with the data there due to some type of buffering. Therefore the first GPS time is when the time series starts, so you will notice that gps_stamps[0][‘block_len’] = 0, this is because there is nothing previous to this time stamp and so the ‘block_len’ measures backwards from the corresponding time index.

  4. Put the data chunks into Pandas data frame that is indexed by time

Example

>>> from mth5.io import zen
>>> z_obj = zen.Z3D(r"home/mt_data/zen/mt001.z3d")
>>> z_obj.read_z3d()
property run_metadata

Run metadata

property sample_rate

sampling rate

property start
property station

station name

property station_metadata

station metadata

to_channelts()[source]

fill time series object

trim_data()[source]

apparently need to skip the first 2 seconds of data because of something to do with the SD buffer

This method will be deprecated after field testing

validate_gps_time()[source]

make sure each time stamp is 1 second apart

validate_time_blocks()[source]

validate gps time stamps and make sure each block is the proper length

property zen_response

Zen response; not sure the full calibration comes directly from the Z3D file, so skipping for now. Will have to read a Zen##.cal file to get the full calibration. This shouldn't be a big issue because the response should be roughly the same for all channels, and since the TF computes a ratio they will cancel out. Though we should look into this further if just looking at calibrated time series.

property zen_schedule

zen schedule data and time

class mth5.io.zen.Z3DCollection(file_path=None, **kwargs)[source]

Bases: Collection

An object to deal with a collection of Z3D files. Metadata and information are contained within Pandas DataFrames for easy searching.

assign_run_names(df, zeros=3)[source]
Returns

Dataframe with run names

Return type

pandas.DataFrame

get_calibrations(antenna_calibration_file)[source]

Get coil calibrations from the antenna.cal file

Parameters

antenna_calibration_file (string or Path) – path to the antenna.cal file

Returns

DESCRIPTION

Return type

TYPE

to_dataframe(sample_rates=[256, 4096], run_name_zeros=4, calibration_path=None)[source]

Get general z3d information and put information in a dataframe

Parameters

z3d_fn_list (list) – List of files Paths to z3d files

Returns

Dataframe of z3d information

Return type

Pandas.DataFrame

Example
>>> zc_obj = zc.Z3DCollection(r"/home/z3d_files")
>>> z3d_fn_list = zc.get_z3d_fn_list()
>>> z3d_df = zc.get_z3d_info(z3d_fn_list)
>>> # write dataframe to a file to use later
>>> z3d_df.to_csv(r"/home/z3d_files/z3d_info.csv")
class mth5.io.zen.Z3DHeader(fn=None, fid=None, **kwargs)[source]

Bases: object

Read in the header information of a Z3D file and make each metadata entry an attribute.

Parameters
  • fn (string or pathlib.Path) – full path to Z3D file

  • fid (file) – file object ex. open(Z3Dfile, ‘rb’)

Attributes

Definition

_header_len

length of header in bits (512)

ad_gain

gain of channel

ad_rate

sampling rate in Hz

alt

altitude of the station (not reliable)

attenchannelsmask

not sure

box_number

ZEN box number

box_serial

ZEN box serial number

channel

channel number of the file

channelserial

serial number of the channel board

duty

duty cycle of the transmitter

fpga_buildnum

build number of one of the boards

gpsweek

GPS week

header_str

full header string

lat

latitude of station

logterminal

not sure

long

longitude of the station

main_hex_buildnum

build number of the ZEN box in hexadecimal

numsats

number of GPS satellites

period

period of the transmitter

tx_duty

transmitter duty cycle

tx_freq

transmitter frequency

version

version of the firmware

Example
>>> import mtpy.usgs.zen as zen
>>> Z3Dfn = r"/home/mt/mt01/mt01_20150522_080000_256_EX.Z3D"
>>> header_obj = zen.Z3DHeader()
>>> header_obj.read_header()
convert_value(key_string, value_string)[source]

convert the value to the appropriate units given the key

property data_logger

Data logger name as ZEN{box_number}

read_header(fn=None, fid=None)[source]

Read the header information into appropriate attributes

Parameters
  • fn (string or pathlib.Path) – full path to Z3D file

  • fid (file) – file object ex. open(Z3Dfile, ‘rb’)

Example

>>> import mtpy.usgs.zen as zen
>>> Z3Dfn = r"/home/mt/mt01/mt01_20150522_080000_256_EX.Z3D"
>>> header_obj = zen.Z3DHeader()
>>> header_obj.read_header()
class mth5.io.zen.Z3DMetadata(fn=None, fid=None, **kwargs)[source]

Bases: object

Will read in the metadata information of a Z3D file and make each metadata entry an attribute. The attributes retain the capitalization used in the Z3D file.

Parameters
  • fn (string or pathlib.Path) – full path to Z3D file

  • fid (file) – file object ex. open(Z3Dfile, ‘rb’)

Attributes

Definition

_header_length

length of header in bits (512)

_metadata_length

length of metadata blocks (512)

_schedule_metadata_len

length of schedule meta data (512)

board_cal

board calibration np.ndarray()

cal_ant

antenna calibration

cal_board

board calibration

cal_ver

calibration version

ch_azimuth

channel azimuth

ch_cmp

channel component

ch_length

channel length (or # of coil)

ch_number

channel number on the ZEN board

ch_xyz1

channel xyz location (not sure)

ch_xyz2

channel xyz location (not sure)

coil_cal

coil calibration np.ndarray (freq, amp, phase)

fid

file object

find_metadata

boolean of finding metadata

fn

full path to Z3D file

gdp_operator

operator of the survey

gdp_progver

program version

job_by

job performed by

job_for

job for

job_name

job name

job_number

job number

m_tell

location in the file where the last metadata block was found.

rx_aspace

electrode spacing

rx_sspace

not sure

rx_xazimuth

x azimuth of electrode

rx_xyz0

not sure

rx_yazimuth

y azimuth of electrode

survey_type

type of survey

unit_length

length units (m)

Example
>>> from mth5.io import zen
>>> Z3Dfn = r"/home/mt/mt01/mt01_20150522_080000_256_EX.Z3D"
>>> metadata_obj = zen.Z3DMetadata(fn=Z3Dfn)
>>> metadata_obj.read_metadata()
read_metadata(fn=None, fid=None)[source]

read meta data

Parameters
  • fn (string) – full path to file, optional if already initialized.

  • fid (file) – open file object, optional if already initialized.

class mth5.io.zen.Z3DSchedule(fn=None, fid=None, **kwargs)[source]

Bases: object

Will read in the schedule information of a Z3D file and make each metadata entry an attribute. The attributes retain the capitalization used in the Z3D file.

Parameters
  • fn (string or pathlib.Path) – full path to Z3D file

  • fid (file) – file object ex. open(Z3Dfile, ‘rb’)

Attributes

Definition

AutoGain

Auto gain for the channel

Comment

Any comments for the schedule

Date

Date of when the schedule action was started YYYY-MM-DD

Duty

Duty cycle of the transmitter

FFTStacks

FFT stacks from the transmitter

Filename

Name of the file that the ZEN gives it

Gain

Gain of the channel

Log

Log the data [ Y | N ]

NewFile

Create a new file [ Y | N ]

Period

Period of the transmitter

RadioOn

Turn on the radio [ Y | N ]

SR

Sampling Rate in Hz

SamplesPerAcq

Samples per acquisition for transmitter

Sleep

Set the box to sleep [ Y | N ]

Sync

Sync with GPS [ Y | N ]

Time

Time the schedule action started HH:MM:SS (GPS time)

_header_len

length of header in bits (512)

_schedule_metadata_len

length of schedule metadata in bits (512)

fid

file object of the file

fn

file name to read in

meta_string

string of the schedule

Example
>>> from mth5.io import zen
>>> Z3Dfn = r"/home/mt/mt01/mt01_20150522_080000_256_EX.Z3D"
>>> schedule_obj = zen.Z3DSchedule(fn=Z3Dfn)
>>> schedule_obj.read_schedule()
read_schedule(fn=None, fid=None)[source]

read meta data string

mth5.io.zen.read_z3d(fn, calibration_fn=None, logger_file_handler=None)[source]

generic tool to read z3d file
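Example (the file path is illustrative; following the reader convention described in mth5.io.reader, read_z3d should return a time series object):

>>> from mth5.io.zen import read_z3d
>>> ch_obj = read_z3d(r"/home/mt/mt01/mt01_20150522_080000_256_EX.Z3D")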

Submodules
mth5.io.collection module

Phoenix file collection

Created on Thu Aug 4 16:48:47 2022

@author: jpeacock

class mth5.io.collection.Collection(file_path=None, **kwargs)[source]

Bases: object

A general collection class to keep track of files with methods to create runs and run ids.

assign_run_names()[source]
Returns

DESCRIPTION

Return type

TYPE

property file_path

Path object to file directory

get_empty_entry_dict()[source]
Returns

an empty dictionary with the proper keys for an entry into a dataframe

Return type

dict

get_files(extension)[source]

Get files with the given extension. Uses pathlib.Path.rglob, so it finds all files within file_path by searching all sub-directories.

Parameters

extension (string or list) – file extension(s)

Returns

list of files in the file_path with the given extensions

Return type

list of Path objects
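Since get_files is documented as an rglob-based recursive search, its behavior can be sketched with the standard library alone (the helper name and directory layout below are illustrative, not part of mth5):

```python
from pathlib import Path
import tempfile

def get_files_sketch(file_path, extension):
    """Recursively collect files matching one or more extensions,
    mirroring the documented rglob-based behavior of get_files."""
    if isinstance(extension, str):
        extension = [extension]
    matches = []
    for ext in extension:
        matches.extend(Path(file_path).rglob(f"*.{ext}"))
    return sorted(matches)

# demonstrate on a throwaway directory tree
with tempfile.TemporaryDirectory() as tmp:
    station_dir = Path(tmp) / "station_01"
    station_dir.mkdir()
    (station_dir / "mt01.z3d").touch()
    (Path(tmp) / "notes.txt").touch()
    found = get_files_sketch(tmp, ["z3d", "txt"])
```

Note that files in nested sub-directories are found without any extra configuration, which is why get_files only needs the top-level file_path.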

get_runs(sample_rates, run_name_zeros=4, calibration_path=None)[source]

Get a list of runs contained within the given folder. First the dataframe will be developed from which the runs are extracted.

For continuous data all you need is the first file in the sequence. The reader will read in the entire sequence.

For segmented data it will only read in the given segment, which is slightly different from the original reader.

Parameters
  • sample_rates – list of sample rates to read, defaults to [150, 24000]

  • run_name_zeros (integer, optional) – Number of zeros in the run name, defaults to 4

Returns

List of run dataframes with only the first block of files

Return type

collections.OrderedDict

Example
>>> from mth5.io.phoenix import PhoenixCollection
>>> phx_collection = PhoenixCollection(r"/path/to/station")
>>> run_dict = phx_collection.get_runs(sample_rates=[150, 24000])
to_dataframe()[source]

Get a data frame of the file summary with column names:

  • survey: survey id

  • station: station id

  • run: run id

  • start: start time UTC

  • end: end time UTC

  • channel_id: channel id or list of channel id’s in file

  • component: channel component or list of components in file

  • fn: path to file

  • sample_rate: sample rate in samples per second

  • file_size: file size in bytes

  • n_samples: number of samples in file

  • sequence_number: sequence number of the file

  • instrument_id: instrument id

  • calibration_fn: calibration file

Returns

summary table of file names and their metadata

Return type

pandas.DataFrame

mth5.io.reader module

This is a utility function to get the appropriate reader for a given file type and return the appropriate object of mth5.timeseries

This is set up to act like a plugin system, though it is a simplified hack because there was not time to set it up properly as a true plugin architecture.

If you are writing your own reader you need the following structure:

  • Class object that will read the given file

  • a reader function that is read_{file_type}, for instance read_nims

  • the return value is an mth5.timeseries.MTTS or mth5.timeseries.RunTS object plus any extra metadata in the form of a dictionary with keys as {level.attribute}.

class NewFile:
    def __init__(self, fn):
        self.fn = fn

    def read_header(self):
        # parse the file header and return its information
        return header_information

    def read_newfile(self):
        # read each channel in as a time series object
        ex, ey, hx, hy, hz = read_in_channels_as_MTTS
        return RunTS([ex, ey, hx, hy, hz])

def read_newfile(fn):
    new_file_obj = NewFile(fn)
    run_obj = new_file_obj.read_newfile()

    return run_obj, extra_metadata

Then add your reader to the reader dictionary so that those files can be read.
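The dispatch itself amounts to a mapping from file extension to reader function; the registry name and structure below are illustrative, not the exact internals of mth5.io.reader:

```python
def read_newfile(fn):
    """Hypothetical reader following the read_{file_type} convention."""
    run_obj = f"run object from {fn}"        # stand-in for a RunTS object
    extra_metadata = {"station.id": "mt01"}  # keys follow {level.attribute}
    return run_obj, extra_metadata

# illustrative reader registry: file extension -> reader function
readers = {"newfile": read_newfile}

def get_reader(extension):
    """Return the reader function registered for a file extension."""
    try:
        return readers[extension]
    except KeyError:
        raise ValueError(f"No reader registered for '.{extension}' files")

reader = get_reader("newfile")
run_obj, extra_metadata = reader("mt01.newfile")
```

With this shape, adding support for a new file type is one dictionary entry plus one function, which is the registration step described above.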

See also

Existing readers for some guidance found in mth5.io

Created on Wed Aug 26 10:32:45 2020

author

Jared Peacock

license

MIT

mth5.io.reader.get_reader(extension)[source]

get the proper reader for file extension

Parameters

extension (string) – file extension

Returns

the correct function to read the file

Return type

function

mth5.io.reader.read_file(fn, file_type=None, **kwargs)[source]

This is the universal reader for MT time series. This will pick out the proper reader given the file type or extension. Keyword arguments will depend on the reader and file type.

Parameters
  • fn (string or pathlib.Path) – full path to file

  • file_type (string) – a specific file type if the extension is ambiguous.

Returns

channel or run time series object

Return type

mth5.timeseries.MTTS or mth5.timeseries.RunTS
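Example (the path and file type are illustrative):

>>> from mth5.io.reader import read_file
>>> run_obj = read_file(r"/home/mt/mt01/data_file.bin", file_type="nims")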

Module contents
class mth5.io.Collection(file_path=None, **kwargs)[source]

Bases: object

A general collection class to keep track of files with methods to create runs and run ids.

assign_run_names()[source]
Returns

DESCRIPTION

Return type

TYPE

property file_path

Path object to file directory

get_empty_entry_dict()[source]
Returns

an empty dictionary with the proper keys for an entry into a dataframe

Return type

dict

get_files(extension)[source]

Get files with the given extension. Uses pathlib.Path.rglob, so it finds all files within file_path by searching all sub-directories.

Parameters

extension (string or list) – file extension(s)

Returns

list of files in the file_path with the given extensions

Return type

list of Path objects

get_runs(sample_rates, run_name_zeros=4, calibration_path=None)[source]

Get a list of runs contained within the given folder. First the dataframe will be developed from which the runs are extracted.

For continuous data all you need is the first file in the sequence. The reader will read in the entire sequence.

For segmented data it will only read in the given segment, which is slightly different from the original reader.

Parameters
  • sample_rates – list of sample rates to read, defaults to [150, 24000]

  • run_name_zeros (integer, optional) – Number of zeros in the run name, defaults to 4

Returns

List of run dataframes with only the first block of files

Return type

collections.OrderedDict

Example
>>> from mth5.io.phoenix import PhoenixCollection
>>> phx_collection = PhoenixCollection(r"/path/to/station")
>>> run_dict = phx_collection.get_runs(sample_rates=[150, 24000])
to_dataframe()[source]

Get a data frame of the file summary with column names:

  • survey: survey id

  • station: station id

  • run: run id

  • start: start time UTC

  • end: end time UTC

  • channel_id: channel id or list of channel id’s in file

  • component: channel component or list of components in file

  • fn: path to file

  • sample_rate: sample rate in samples per second

  • file_size: file size in bytes

  • n_samples: number of samples in file

  • sequence_number: sequence number of the file

  • instrument_id: instrument id

  • calibration_fn: calibration file

Returns

summary table of file names and their metadata

Return type

pandas.DataFrame

mth5.io.read_file(fn, file_type=None, **kwargs)[source]

This is the universal reader for MT time series. This will pick out the proper reader given the file type or extension. Keyword arguments will depend on the reader and file type.

Parameters
  • fn (string or pathlib.Path) – full path to file

  • file_type (string) – a specific file type if the extension is ambiguous.

Returns

channel or run time series object

Return type

mth5.timeseries.MTTS or mth5.timeseries.RunTS

mth5.tables package
Submodules
mth5.tables.channel_table module

Created on Wed Mar 23 14:09:38 2022

@author: jpeacock

class mth5.tables.channel_table.ChannelSummaryTable(hdf5_dataset)[source]

Bases: MTH5Table

Object to hold the channel summary and provide some convenience functions like fill, to_dataframe …

summarize()[source]
Returns

DESCRIPTION

Return type

TYPE

to_dataframe()[source]

Create a pandas DataFrame from the table for easier querying.

Returns

Channel Summary

Return type

pandas.DataFrame

mth5.tables.mth5_table module

Created on Wed Dec 23 16:53:55 2020

author

Jared Peacock

license

MIT

class mth5.tables.mth5_table.MTH5Table(hdf5_dataset)[source]

Bases: object

This table uses basic NumPy operations under the hood, so only simple actions are supported. If a user wants something more sophisticated for querying they should convert to a pandas DataFrame, keeping in mind that entries are then more difficult to change and data types need to be kept track of.

add_row(row, index=None)[source]

Add a row to the table.

row must be of the same data type as the table

Parameters
  • row (TYPE) – row entry for the table

  • index (integer, if None is given then the row is added to the end of the array) – index of row to add

Returns

index of the row added

Return type

integer

check_dtypes(other_dtype)[source]

Check to make sure datatypes match

clear_table()[source]

Clear the table.

Basically delete the underlying dataset and start over with an empty one.

property dtype
locate(column, value, test='eq')[source]

Locate the row indices where the given column passes the given test against value.

Parameters
  • column (string) – name of the column to test

  • value – value to test against; for ‘be’ or ‘bt’ give a list of 2 values

  • test (string) – type of test
    * ‘eq’: equals
    * ‘lt’: less than
    * ‘le’: less than or equal to
    * ‘gt’: greater than
    * ‘ge’: greater than or equal to
    * ‘be’: between or equal to
    * ‘bt’: between

Returns

indices of the rows where the test is satisfied
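The comparison semantics can be sketched in plain Python (this mirrors the documented test codes; the real method operates on the HDF5-backed numpy table, and the helper name here is illustrative):

```python
# illustrative comparison table mirroring the documented test codes
tests = {
    "eq": lambda v, value: v == value,
    "lt": lambda v, value: v < value,
    "le": lambda v, value: v <= value,
    "gt": lambda v, value: v > value,
    "ge": lambda v, value: v >= value,
    "be": lambda v, value: value[0] <= v <= value[1],  # between or equal to
    "bt": lambda v, value: value[0] < v < value[1],    # strictly between
}

def locate_sketch(rows, column, value, test="eq"):
    """Return the indices of rows whose column passes the given test."""
    return [i for i, row in enumerate(rows) if tests[test](row[column], value)]

rows = [{"sample_rate": 1}, {"sample_rate": 8}, {"sample_rate": 256}]
equal_rows = locate_sketch(rows, "sample_rate", 8)            # 'eq' test
between_rows = locate_sketch(rows, "sample_rate", [1, 256], "bt")
```

Note the difference between ‘be’ (endpoints included) and ‘bt’ (strictly between), which is why ‘be’ and ‘bt’ take a 2-value list.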

property nrows
remove_row(index)[source]

Remove a row

Note

that there is no index value stored within the array, so the indexing is computed on the fly. A user should use the HDF5 reference instead of the index number; that is the safest and most robust method.

Parameters

index (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

This isn’t as easy as just deleting an element. The element needs to be deleted from the weakly referenced array and then the summary table dataset set to the new array.

So set to a null array for now until a more clever option is found.

property shape
to_dataframe()[source]

Convert the table into a pandas.DataFrame object.

Returns

convert table into a pandas.DataFrame with the appropriate data types.

Return type

pandas.DataFrame

update_row(entry)[source]

Update an entry by first locating the index and then rewriting the entry.

Parameters

entry (np.ndarray) – numpy array with same datatype as the table

Returns

row index.

This doesn’t currently work because you cannot test for hdf5_reference; use add_row and locate by index instead.

mth5.tables.tf_table module

Created on Wed Mar 23 14:09:38 2022

@author: jpeacock

class mth5.tables.tf_table.TFSummaryTable(hdf5_dataset)[source]

Bases: MTH5Table

Object to hold the transfer function summary and provide some convenience functions like fill, to_dataframe …

summarize()[source]
Returns

DESCRIPTION

Return type

TYPE

to_dataframe()[source]

Create a pandas DataFrame from the table for easier querying.

Returns

Channel Summary

Return type

pandas.DataFrame

Module contents
class mth5.tables.ChannelSummaryTable(hdf5_dataset)[source]

Bases: MTH5Table

Object to hold the channel summary and provide some convenience functions like fill, to_dataframe …

summarize()[source]
Returns

DESCRIPTION

Return type

TYPE

to_dataframe()[source]

Create a pandas DataFrame from the table for easier querying.

Returns

Channel Summary

Return type

pandas.DataFrame

class mth5.tables.MTH5Table(hdf5_dataset)[source]

Bases: object

This table uses basic NumPy operations under the hood, so only simple actions are supported. If a user wants something more sophisticated for querying they should convert to a pandas DataFrame, keeping in mind that entries are then more difficult to change and data types need to be kept track of.

add_row(row, index=None)[source]

Add a row to the table.

row must be of the same data type as the table

Parameters
  • row (TYPE) – row entry for the table

  • index (integer, if None is given then the row is added to the end of the array) – index of row to add

Returns

index of the row added

Return type

integer

check_dtypes(other_dtype)[source]

Check to make sure datatypes match

clear_table()[source]

Clear the table.

Basically delete the underlying dataset and start over with an empty one.

property dtype
locate(column, value, test='eq')[source]

locate index where column is equal to value :param column: DESCRIPTION :type column: TYPE :param value: DESCRIPTION :type value: TYPE :type test: type of test to try * ‘eq’: equals * ‘lt’: less than * ‘le’: less than or equal to * ‘gt’: greater than * ‘ge’: greater than or equal to. * ‘be’: between or equal to * ‘bt’: between

If be or bt input value as a list of 2 values

Returns

DESCRIPTION

Return type

TYPE

property nrows
remove_row(index)[source]

Remove a row

Note

that there is no index value stored within the array, so the indexing is computed on the fly. A user should use the HDF5 reference instead of the index number; that is the safest and most robust method.

Parameters

index (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

This isn’t as easy as just deleting an element. The element needs to be deleted from the weakly referenced array and then the summary table dataset set to the new array.

So set to a null array for now until a more clever option is found.

property shape
to_dataframe()[source]

Convert the table into a pandas.DataFrame object.

Returns

convert table into a pandas.DataFrame with the appropriate data types.

Return type

pandas.DataFrame

update_row(entry)[source]

Update an entry by first locating the index and then rewriting the entry.

Parameters

entry (np.ndarray) – numpy array with same datatype as the table

Returns

row index.

This doesn’t currently work because you cannot test for hdf5_reference; use add_row and locate by index instead.

class mth5.tables.TFSummaryTable(hdf5_dataset)[source]

Bases: MTH5Table

Object to hold the transfer function summary and provide some convenience functions like fill, to_dataframe …

summarize()[source]
Returns

DESCRIPTION

Return type

TYPE

to_dataframe()[source]

Create a pandas DataFrame from the table for easier querying.

Returns

Channel Summary

Return type

pandas.DataFrame

mth5.timeseries package
Submodules
mth5.timeseries.channel_ts module

For the lists and arrays that go into the metadata, it seems easiest to convert all lists to strings and then convert them back when read in.

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.timeseries.channel_ts.ChannelTS(channel_type='auxiliary', data=None, channel_metadata=None, station_metadata=None, run_metadata=None, survey_metadata=None, **kwargs)[source]

Bases: object

Note

Assumes equally spaced samples from the start time.

The time series is stored in an xarray.Dataset that has coordinates of time and is a 1-D array labeled ‘data’. The xarray.Dataset can be accessed and set from the ts. The data is stored in ‘ts.data’ and the time index is a coordinate of ts.

The time coordinate is made from the start time, sample rate and number of samples. Currently, End time is a derived property and cannot be set.

Channel time series object is based on xarray and mth5.metadata therefore any type of interpolation, resampling, groupby, etc can be done using xarray methods.

There are 3 metadata classes that hold important metadata

  • mth5.metadata.Station holds information about the station

  • mth5.metadata.Run holds information about the run the channel belongs to.

  • mth5.metadata.Channel holds information specific to the channel.

This way a single channel will hold all information needed to represent the channel.

Example
>>> from mth5.timeseries import ChannelTS
>>> ts_obj = ChannelTS('auxiliary')
>>> ts_obj.sample_rate = 8
>>> ts_obj.start = '2020-01-01T12:00:00+00:00'
>>> ts_obj.ts = range(4096)
>>> ts_obj.station_metadata.id = 'MT001'
>>> ts_obj.run_metadata.id = 'MT001a'
>>> ts_obj.component = 'temperature'
>>> print(ts_obj)
        Station      = MT001
        Run          = MT001a
        Channel Type = auxiliary
        Component    = temperature
        Sample Rate  = 8.0
        Start        = 2020-01-01T12:00:00+00:00
        End          = 2020-01-01T12:08:31.875000+00:00
        N Samples    = 4096
>>> p = ts_obj.ts.plot()
property channel_metadata

channel metadata

property channel_response_filter

Full channel response filter

Returns

full channel response filter

Return type

mt_metadata.timeseries.filters.ChannelResponseFilter

property channel_type

Channel Type

property component
copy(data=True)[source]

Make a copy of the ChannelTS object with or without data.

Parameters

data (boolean) – include data in the copy (True) or not (False)

Returns

Copy of the channel

Return type

mth5.timeseries.ChannelTS

decimate(new_sample_rate, inplace=False, max_decimation=8)[source]

decimate the data by using scipy.signal.decimate

Parameters
  • new_sample_rate (float) – sample rate to decimate the data to

  • inplace (boolean, optional) – decimate in place, defaults to False

When decimated in place, ts.data is refilled with the decimated data and sample_rate is replaced.
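Example (ts_obj is the 8 samples/second ChannelTS built in the class example above; the target rate is illustrative):

>>> ts_1hz = ts_obj.decimate(1)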

property end

MTime object

from_obspy_trace(obspy_trace)[source]

Fill data from an obspy.core.Trace

Parameters

obspy_trace (obspy.core.trace) – Obspy trace object

get_slice(start, end=None, n_samples=None)[source]

Get a slice from the time series given a start and end time.

Looks for >= start & <= end

Uses loc to be exact with milliseconds

Parameters
  • start (string, MTime) – start time of the slice

  • end (string, MTime) – end time of the slice

  • n_samples (integer) – number of samples to get after the start time

Returns

slice of the channel requested

Return type

ChannelTS
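The documented inclusive selection (>= start and <= end) can be sketched over a plain time index; the helper name, times, and data below are illustrative, not part of mth5:

```python
from datetime import datetime, timedelta

def get_slice_sketch(times, data, start, end):
    """Keep samples with start <= t <= end, mirroring the documented
    inclusive (>= start & <= end) selection."""
    return [(t, d) for t, d in zip(times, data) if start <= t <= end]

# ten samples, one second apart
t0 = datetime(2020, 1, 1, 12, 0, 0)
times = [t0 + timedelta(seconds=i) for i in range(10)]
data = list(range(10))

# samples at seconds 2, 3, 4, and 5 fall inside the window
window = get_slice_sketch(
    times, data, t0 + timedelta(seconds=2), t0 + timedelta(seconds=5)
)
```

Because both endpoints are included, a 2-to-5 second window over 1 Hz data returns four samples, not three.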

has_data()[source]

check to see if there is an index in the time series

merge(other, gap_method='slinear', new_sample_rate=None, resample_method='poly')[source]

Merge two channels, or a list of channels, together using the following steps:

  1. xr.combine_by_coords([original, other])

  2. compute a monotonic time index

  3. reindex(new_time_index, method=gap_method)

If you want a different method or more control, use the xarray methods directly.

Parameters

other (mth5.timeseries.ChannelTS) – Another channel

Raises
  • TypeError – If input is not a ChannelTS

  • ValueError – if the components are different

Returns

Combined channel with monotonic time index and same metadata

Return type

mth5.timeseries.ChannelTS

property n_samples

number of samples

plot()[source]

Simple plot of the data

Returns

figure object

Return type

matplotlib.figure

plot_spectra(spectra_type='welch', window_length=4096, **kwargs)[source]
Parameters
  • spectra_type (string, optional) – spectra type, defaults to “welch”

  • window_length (int, optional) – window length of the welch method should be a power of 2, defaults to 2 ** 12

  • **kwargs

    DESCRIPTION

remove_instrument_response(**kwargs)[source]

Remove instrument response from the given channel response filter

The order of operations is important (if applied):

  1. detrend

  2. zero mean

  3. zero pad

  4. time window

  5. frequency window

  6. remove response

  7. undo time window

  8. bandpass

kwargs

Parameters
  • plot (boolean, default True) – plot the calibration process [ False | True ]

  • detrend (boolean, default True) – remove a linear trend from the time series

  • zero_mean (boolean, default True) – remove the mean of the time series

  • zero_pad (boolean, default True) – pad the time series to the next power of 2 for efficiency

  • t_window (string, default None) – time domain window name; see scipy.signal.windows for options

  • t_window_params (dictionary) – time domain window parameters; available parameters can be found in scipy.signal.windows

  • f_window (string, default None) – frequency domain window name; see scipy.signal.windows for options

  • f_window_params (dictionary) – frequency domain window parameters; available parameters can be found in scipy.signal.windows

  • bandpass (dictionary) – bandpass frequency and order {“low”:, “high”:, “order”:,}
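The zero_pad step pads the series to the next power of 2; a stdlib sketch of that length computation (the function name is illustrative, not part of mth5):

```python
def next_power_of_2(n):
    """Smallest power of 2 greater than or equal to n, used to size
    the zero padding before the FFT."""
    return 1 if n <= 1 else 2 ** (n - 1).bit_length()

# a 4096-sample trace is already a power of 2; 5000 samples pad up to 8192
padded_lengths = [next_power_of_2(n) for n in (4096, 5000)]
```

Padding to a power of 2 keeps the FFT at its most efficient size, which is the stated purpose of the zero_pad option.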

resample_poly(new_sample_rate, pad_type='mean', inplace=False)[source]

Use scipy.signal.resample_poly to resample data while using an FIR filter to remove aliasing.

Parameters
  • new_sample_rate (float) – sample rate to resample the data to

  • pad_type (string, optional) – type of padding applied before filtering, defaults to “mean”

Returns

resampled channel if inplace is False

Return type

mth5.timeseries.ChannelTS

property run_metadata

run metadata

property sample_interval

Sample interval = 1 / sample_rate

Returns

sample interval as time distance between time samples

Return type

float

property sample_rate

sample rate in samples/second

property start

MTime object

property station_metadata

station metadata

property survey_metadata

survey metadata

property time_index

time index as a numpy array of dtype np.datetime64[ns]

Returns

array of the time index

Return type

np.ndarray(dtype=np.datetime64[ns])

to_obspy_trace()[source]

Convert the time series to an obspy.core.trace.Trace object. This will be helpful for converting between data pulled from IRIS and data going into IRIS.

Returns

obspy trace of the channel time series

Return type

obspy.core.trace.Trace

to_xarray()[source]

Returns an xarray.DataArray of the channel time series; this way metadata from the metadata class is updated upon return.

Returns

xarray.DataArray of the channel time series

Return type

xarray.DataArray

>>> import numpy as np
>>> from mth5.timeseries import ChannelTS
>>> ex = ChannelTS("electric")
>>> ex.start = "2020-01-01T12:00:00"
>>> ex.sample_rate = 16
>>> ex.ts = np.random.rand(4096)
property ts

Time series as a numpy array

welch_spectra(window_length=4096, **kwargs)[source]

Compute the Welch spectra of the time series.

Parameters
  • window_length (integer) – window length in samples, should be a power of 2

  • **kwargs

    keyword arguments passed to the spectral estimator

Returns

frequency array and spectral estimate

mth5.timeseries.run_ts module

For the lists and arrays that go into the metadata, it seems easiest to convert all lists to strings and then convert them back when read in.

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.timeseries.run_ts.RunTS(array_list=None, run_metadata=None, station_metadata=None, survey_metadata=None)[source]

Bases: object

holds all run ts in one aligned array

components –> {‘ex’: ex_xarray, ‘ey’: ey_xarray}

ToDo, have a single Survey object under the hood and properties to other metadata objects for get/set.

add_channel(channel)[source]

Add a channel to the dataset, can be an xarray.DataArray or mth5.timeseries.ChannelTS object.

The coordinates and dimensions need to be the same as the existing dataset, namely the coordinate is time. If the added channel’s dimensions are larger than the existing dataset, the channel will be clipped to the dimensions of the existing dataset.

If the start time is not the same nan’s will be placed at locations where the timing does not match the current start time. This is a feature of xarray.

Parameters

channel (xarray.DataArray or mth5.timeseries.ChannelTS) – a channel xarray or ChannelTS to add to the run

calibrate(**kwargs)[source]

Calibrate the data according to the filters in each channel.

Returns

calibrated run

Return type

mth5.timeseries.RunTS

property channels

List of channel names in dataset

copy(data=True)[source]
Make a copy of the RunTS object with or without data.

Parameters

data (boolean, optional) – include data in the copy (True) or not (False), defaults to True

Returns

copy of the run

Return type

mth5.timeseries.RunTS

property dataset

xarray.Dataset

decimate(new_sample_rate, inplace=False, max_decimation=8)[source]

decimate data to new sample rate.

Parameters
  • new_sample_rate (float) – sample rate to decimate the data to

  • inplace (boolean, optional) – decimate in place, defaults to False

Returns

decimated run if inplace is False

Return type

mth5.timeseries.RunTS

property end

End time UTC

property filters

Dictionary of filters used by the channels

from_obspy_stream(obspy_stream, run_metadata=None)[source]

Get a run from an obspy.core.stream which is a list of obspy.core.Trace objects.

Parameters

obspy_stream (obspy.core.Stream) – Obspy Stream object

get_slice(start, end=None, n_samples=None)[source]
Get a slice of the run between a start and end time for all channels.

Parameters
  • start (string or MTime) – start time of the slice

  • end (string or MTime, optional) – end time of the slice, defaults to None

  • n_samples (integer, optional) – number of samples to get after the start time, defaults to None

Raises

ValueError – if the slice cannot be made from the given inputs

Returns

slice of the run requested

Return type

mth5.timeseries.RunTS

has_data()[source]

check to see if there is data

merge(other, gap_method='slinear', new_sample_rate=None, resample_method='poly')[source]

Merge two runs, or a list of runs, together using the following steps:

  1. xr.combine_by_coords([original, other])

  2. compute a monotonic time index

  3. reindex(new_time_index, method=gap_method)

If you want a different method or more control, use the xarray methods directly.

Parameters

other (mth5.timeseries.RunTS) – Another run

Raises
  • TypeError – If input is not a RunTS

  • ValueError – if the components are different

Returns

Combined run with monotonic time index and same metadata

Return type

mth5.timeseries.RunTS
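Example (run_a, run_b, and run_c are hypothetical RunTS objects with the same components):

>>> combined = run_a.merge(run_b)
>>> combined = run_a.merge([run_b, run_c], new_sample_rate=1)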

plot(color_map={'ex': (1, 0.2, 0.2), 'ey': (1, 0.5, 0), 'hx': (0, 0.5, 1), 'hy': (0.5, 0.2, 1), 'hz': (0.2, 1, 1)}, channel_order=None)[source]

Plot the time series; this will probably be slow for large data sets.

plot_spectra(spectra_type='welch', color_map={'ex': (1, 0.2, 0.2), 'ey': (1, 0.5, 0), 'hx': (0, 0.5, 1), 'hy': (0.5, 0.2, 1), 'hz': (0.2, 1, 1)}, **kwargs)[source]

Plot spectra using spectra type, only ‘welch’ is supported now.

Parameters
  • spectra_type (string, optional) – spectra type, defaults to “welch”

  • color_map (dictionary, optional) – colors of channels, defaults to { “ex”: (1, 0.2, 0.2), “ey”: (1, 0.5, 0), “hx”: (0, 0.5, 1), “hy”: (0.5, 0.2, 1), “hz”: (0.2, 1, 1), }

  • **kwargs

    key words for the spectra type

resample(new_sample_rate, inplace=False)[source]

Resample data to a new sample rate.

Parameters
  • new_sample_rate (float) – sample rate to resample the data to

  • inplace (boolean, optional) – resample in place, defaults to False

Returns

resampled run if inplace is False

Return type

mth5.timeseries.RunTS

resample_poly(new_sample_rate, pad_type='mean', inplace=False)[source]

Use scipy.signal.resample_poly to resample data while using an FIR filter to remove aliasing.

Parameters
  • new_sample_rate (float) – sample rate to resample the data to

  • pad_type (string, optional) – type of padding applied before filtering, defaults to “mean”

Returns

resampled run if inplace is False

Return type

mth5.timeseries.RunTS

property run_metadata

run metadata

property sample_interval

Sample interval = 1 / sample_rate

Returns

sample interval as time distance between samples

Return type

float

property sample_rate

Sample rate; if data is present this is estimated from the median time difference between samples, otherwise the metadata sample rate is returned.

set_dataset(array_list, align_type='outer')[source]
Parameters
  • array_list (list of mth5.timeseries.ChannelTS objects) – list of xarrays

  • align_type (string) – how the different times will be aligned
    * ’outer’: use the union of object indexes
    * ’inner’: use the intersection of object indexes
    * ’left’: use indexes from the first object with each dimension
    * ’right’: use indexes from the last object with each dimension
    * ’exact’: instead of aligning, raise ValueError when indexes to be aligned are not equal
    * ’override’: if indexes are of same size, rewrite indexes to be those of the first object with that dimension; indexes for the same dimension must have the same size in all objects

property start

Start time UTC

property station_metadata

station metadata

property summarize_metadata

Get a summary of all the metadata

Returns

A summary of all channel metadata in one place

Return type

dictionary

property survey_metadata

survey metadata

to_obspy_stream()[source]

convert time series to an obspy.core.Stream which is like a list of obspy.core.Trace objects.

Returns

An Obspy Stream object from the time series data

Return type

obspy.core.Stream

validate_metadata()[source]

Check to make sure that the metadata matches what is in the data set.

updates metadata from the data.

Check the start and end times, channels recorded

mth5.timeseries.ts_filters module

time series filters

class mth5.timeseries.ts_filters.RemoveInstrumentResponse(ts, time_array, sample_interval, channel_response_filter, **kwargs)[source]

Bases: object

Remove instrument response from the given channel response filter

The order of operations is important (if applied):

  1. detrend

  2. zero mean

  3. zero pad

  4. time window

  5. frequency window

  6. remove response

  7. undo time window

  8. bandpass

Parameters
  • ts (np.ndarray((N,) , dtype=float)) – time series data to remove response from

  • time_array (np.ndarray((N,) , dtype=np.datetime64[ns])) – time index that corresponds to the time series

  • sample_interval (float) – seconds per sample (time interval between samples)

  • channel_response_filter (mt_metadata.timeseries.filters.ChannelResponseFilter) – channel response filter with all filters included to convert from counts to physical units

kwargs

Parameters
  • plot (boolean, default True) – to plot the calibration process [ False | True ]

  • detrend (boolean, default True) – Remove linear trend of the time series

  • zero_mean (boolean, default True) – Remove the mean of the time series

  • zero_pad (boolean, default True) – pad the time series to the next power of 2 for efficiency

  • t_window (string, default None) – Time domain window name; see scipy.signal.windows for options

  • t_window_params (dictionary) – Time domain window parameters; parameters can be found in scipy.signal.windows

  • f_window (string, default None) – Frequency domain window name; see scipy.signal.windows for options

  • f_window_params (dictionary) – Frequency domain window parameters; parameters can be found in scipy.signal.windows

  • bandpass (dictionary) – bandpass frequency and order {"low":, "high":, "order":,}
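The numbered recipe above can be sketched in plain NumPy (an illustrative sketch only, not the MTH5 implementation; `calibrate_sketch` is a hypothetical name, and a flat `response` array stands in for a real channel response filter):

```python
import numpy as np

def calibrate_sketch(ts, response):
    """Illustrative calibration: detrend, zero mean, zero pad, divide response."""
    n = ts.size
    # 1. detrend: remove the least-squares straight-line fit
    t = np.arange(n)
    slope, intercept = np.polyfit(t, ts, 1)
    ts = ts - (slope * t + intercept)
    # 2. zero mean
    ts = ts - ts.mean()
    # 3. zero pad to the next power of 2 for FFT efficiency
    n_pad = 2 ** int(np.ceil(np.log2(n)))
    ts = np.pad(ts, (0, n_pad - n))
    # 6. remove response in the frequency domain (the "divide" operation)
    spectra = np.fft.rfft(ts)
    calibrated = np.fft.irfft(spectra / response, n=n_pad)
    return calibrated[:n]
```

With a flat (all-ones) response the output is just the detrended, zero-mean input, which makes the sketch easy to sanity-check.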

apply_bandpass(ts)[source]

apply a bandpass filter to the calibrated data

Parameters

ts (np.ndarray) – calibrated time series

Returns

bandpassed time series

Return type

np.ndarray

apply_detrend(ts)[source]

Detrend time series using scipy.detrend(‘linear’)

Parameters

ts (np.ndarray) – input time series

Returns

detrended time series

Return type

np.ndarray
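scipy.detrend('linear') subtracts the least-squares straight-line fit; a NumPy-only sketch of the same operation (illustrative, `linear_detrend` is a hypothetical name):

```python
import numpy as np

def linear_detrend(ts):
    """Subtract the least-squares straight-line fit from a 1-D series."""
    t = np.arange(ts.size)
    slope, intercept = np.polyfit(t, ts, 1)
    return ts - (slope * t + intercept)
```

A pure ramp detrends to (numerically) zero.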

apply_f_window(data)[source]

Apply a frequency domain window. Get the available windows from scipy.signal.windows

Need to create a window twice the size of the input because we are only taking the rfft which gives just half the spectra and then take only half the window

Parameters

data (np.ndarray) – input spectra

Returns

windowed spectra

Return type

np.ndarray

apply_t_window(ts)[source]

Apply a window in the time domain. Get the available windows from scipy.signal.windows

Parameters

ts (np.ndarray) – input time series

Returns

windowed time series

Return type

np.ndarray

apply_zero_mean(ts)[source]

Remove the mean from the time series

Parameters

ts (np.ndarray) – input time series

Returns

zero mean time series

Return type

np.ndarray

apply_zero_pad(ts)[source]

zero pad to a power of 2 at the end of the time series to make the FFT more efficient

Parameters

ts (np.ndarray) – input time series

Returns

zero padded time series

Return type

np.ndarray

static get_window(window, window_params, size)[source]

Get window from scipy.signal

Parameters
  • window (string) – name of the window

  • window_params (dictionary) – dictionary of window parameters

  • size (integer) – number of points in the window

Returns

window function

Return type

np.ndarray from scipy.signal.windows

remove_instrument_response(operation='divide')[source]

Remove instrument response following the recipe provided

Returns

calibrated time series

Return type

np.ndarray

mth5.timeseries.ts_filters.adaptive_notch_filter(bx, df=100, notches=[50, 100], notchradius=0.5, freqrad=0.9, rp=0.1, dbstop_limit=5.0)[source]
Parameters
  • bx (np.ndarray) – time series to filter

  • df (float, optional) – sample rate in samples per second, defaults to 100

  • notches (list, optional) – list of frequencies to locate notches at in Hz, defaults to [50, 100]

  • notchradius (float, optional) – notch radius, defaults to 0.5

  • freqrad (float, optional) – radius to search for a peak at the notch frequency, defaults to 0.9

  • rp (float, optional) – ripple of Chebyshev type 1 filter, lower numbers means less ripples, defaults to 0.1

  • dbstop_limit (float, optional) – limits the difference between the peak at the notch and surrounding spectra. Any difference above dbstop_limit will be filtered, anything less will not, defaults to 5.0

Returns

notch filtered data

Return type

np.ndarray

Returns

list of notch frequencies

Return type

list

Example

>>> import numpy as np
>>> import RemovePeriodicNoise_Kate as rmp
>>> # make a variable for the file to load in
>>> fn = r"/home/MT/mt01_20130101_000000.BX"
>>> # load in file; if the time series is not an ascii file you
>>> # might need to add keywords to np.loadtxt or use another
>>> # method to read in the file
>>> bx = np.loadtxt(fn)
>>> # create a list of frequencies to filter out
>>> freq_notches = [50, 150, 200]
>>> # filter data
>>> bx_filt, filt_lst = rmp.adaptiveNotchFilter(bx, df=100.,
...                                             notches=freq_notches)
>>> # save the filtered data into a file
>>> np.savetxt(r"/home/MT/Filtered/mt01_20130101_000000.BX", bx_filt)

Notes

Most of the time the default parameters work well; the only things you may need to change are the notches and perhaps the radius. Test it out on a few time series to find the optimum parameters, then loop over all your time series data. Something like

>>> import os
>>> import numpy as np
>>> dirpath = r"/home/MT"
>>> # make a directory to save filtered time series
>>> save_path = r"/home/MT/Filtered"
>>> if not os.path.exists(save_path):
...     os.mkdir(save_path)
>>> for fn in os.listdir(dirpath):
...     bx = np.loadtxt(os.path.join(dirpath, fn))
...     bx_filt, filt_lst = rmp.adaptiveNotchFilter(bx, df=100.,
...                                                 notches=freq_notches)
...     np.savetxt(os.path.join(save_path, fn), bx_filt)
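For a single fixed notch, SciPy's iirnotch offers a simpler alternative (an illustrative sketch, not the adaptive search above; `notch_sketch` is a hypothetical name):

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

def notch_sketch(x, fs, f0, q=30.0):
    """Remove one narrow-band tone at f0 Hz with a zero-phase IIR notch."""
    b, a = iirnotch(f0, q, fs=fs)
    return filtfilt(b, a, x)  # forward-backward filtering avoids phase shift
```

Filtering a pure 50 Hz tone should leave near-zero residual away from the edges.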
mth5.timeseries.ts_filters.butter_bandpass(lowcut, highcut, fs, order=5)[source]

Butterworth bandpass filter using scipy.signal

Parameters
  • lowcut (float) – low cut frequency in Hz

  • highcut (float) – high cut frequency in Hz

  • fs (float) – Sample rate

  • order (int, optional) – Butterworth order, defaults to 5

Returns

SOS scipy.signal format

Return type

np.ndarray of second-order sections (sos), as returned by scipy.signal.butter with output='sos'
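The same design can be sketched directly with SciPy (illustrative; `butter_bandpass_sketch` is a hypothetical name mirroring the parameters above):

```python
from scipy.signal import butter

def butter_bandpass_sketch(lowcut, highcut, fs, order=5):
    """Design a Butterworth bandpass as second-order sections (sos)."""
    nyq = 0.5 * fs  # Nyquist frequency
    return butter(order, [lowcut / nyq, highcut / nyq],
                  btype="band", output="sos")
```

The returned array can be applied with scipy.signal.sosfiltfilt for zero-phase filtering.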

mth5.timeseries.ts_filters.butter_bandpass_filter(data, lowcut, highcut, fs, order=5)[source]
Parameters
  • data (np.ndarray) – 1D time series data

  • lowcut (float) – low cut frequency in Hz

  • highcut (float) – high cut frequency in Hz

  • fs (float) – Sample rate

  • order (int, optional) – Butterworth order, defaults to 5

Returns

filtered data

Return type

np.ndarray

mth5.timeseries.ts_filters.low_pass(data, low_pass_freq, cutoff_freq, sampling_rate)[source]
Parameters
  • data (np.ndarray) – 1D time series data

  • low_pass_freq (float) – low pass frequency in Hz

  • cutoff_freq (float) – cut off frequency in Hz

  • sampling_rate (float) – Sample rate in samples per second

Returns

lowpass filtered data

Return type

np.ndarray

mth5.timeseries.ts_filters.zero_pad(input_array, power=2, pad_fill=0)[source]
Parameters
  • input_array (np.ndarray) – 1D array

  • power (int, optional) – base power to pad to, defaults to 2 which is optimal for the FFT

  • pad_fill (float, optional) – fill value for padded values, defaults to 0

Returns

zero padded array

Return type

np.ndarray
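A minimal NumPy sketch of padding to the next power of 2 (illustrative; the documented function also accepts other base powers):

```python
import numpy as np

def zero_pad_sketch(x, pad_fill=0.0):
    """Pad a 1-D array with pad_fill up to the next power of 2."""
    n_pad = 2 ** int(np.ceil(np.log2(x.size)))
    return np.pad(x, (0, n_pad - x.size), constant_values=pad_fill)
```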

Module contents
class mth5.timeseries.ChannelTS(channel_type='auxiliary', data=None, channel_metadata=None, station_metadata=None, run_metadata=None, survey_metadata=None, **kwargs)[source]

Bases: object

Note

Assumes equally spaced samples from the start time.

The time series is stored in an xarray.Dataset that has coordinates of time and is a 1-D array labeled ‘data’. The xarray.Dataset can be accessed and set from the ts. The data is stored in ‘ts.data’ and the time index is a coordinate of ts.

The time coordinate is made from the start time, sample rate and number of samples. Currently, End time is a derived property and cannot be set.

Channel time series object is based on xarray and mth5.metadata therefore any type of interpolation, resampling, groupby, etc can be done using xarray methods.

There are 3 metadata classes that hold important metadata

  • mth5.metadata.Station holds information about the station

  • mth5.metadata.Run holds information about the run the channel belongs to.

  • mth5.metadata.Channel holds information specific to the channel.

This way a single channel will hold all information needed to represent the channel.

Rubric
>>> from mth5.timeseries import ChannelTS
>>> ts_obj = ChannelTS('auxiliary')
>>> ts_obj.sample_rate = 8
>>> ts_obj.start = '2020-01-01T12:00:00+00:00'
>>> ts_obj.ts = range(4096)
>>> ts_obj.station_metadata.id = 'MT001'
>>> ts_obj.run_metadata.id = 'MT001a'
>>> ts_obj.component = 'temperature'
>>> print(ts_obj)
        Station      = MT001
        Run          = MT001a
        Channel Type = auxiliary
    Component    = temperature
        Sample Rate  = 8.0
        Start        = 2020-01-01T12:00:00+00:00
        End          = 2020-01-01T12:08:31.875000+00:00
        N Samples    = 4096
>>> p = ts_obj.ts.plot()
property channel_metadata

channel metadata

property channel_response_filter

Full channel response filter

Returns

full channel response filter

Return type

mt_metadata.timeseries.filters.ChannelResponseFilter

property channel_type

Channel Type

property component
copy(data=True)[source]

Make a copy of the ChannelTS object with or without data.

Parameters

data (boolean) – include data in the copy (True) or not (False)

Returns

Copy of the channel

Return type

mth5.timeseries.ChannelTS

decimate(new_sample_rate, inplace=False, max_decimation=8)[source]

decimate the data by using scipy.signal.decimate

Parameters

new_sample_rate (float) – new sample rate to decimate the data to

  • refills ts.data with decimated data and replaces sample_rate

property end

MTime object

from_obspy_trace(obspy_trace)[source]

Fill data from an obspy.core.Trace

Parameters

obspy_trace (obspy.core.trace) – Obspy trace object

get_slice(start, end=None, n_samples=None)[source]

Get a slice from the time series given a start and end time.

Looks for >= start & <= end

Uses loc to be exact with milliseconds

Parameters
  • start (string, MTime) – start time of the slice

  • end (string, MTime) – end time of the slice

  • n_samples (integer) – number of sample to get after start time

Returns

slice of the channel requested

Return type

ChannelTS
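The `>= start & <= end` selection can be illustrated with a plain NumPy datetime index (a sketch of the indexing logic only, not the ChannelTS API; `slice_by_time` is a hypothetical name):

```python
import numpy as np

def slice_by_time(times, data, start, end):
    """Select samples with start <= t <= end (inclusive on both ends)."""
    start = np.datetime64(start)
    end = np.datetime64(end)
    mask = (times >= start) & (times <= end)
    return times[mask], data[mask]

# build a time index from start time, sample rate and number of samples
sr = 8.0  # samples per second
t0 = np.datetime64("2020-01-01T12:00:00")
times = t0 + (np.arange(32) * (1e9 / sr)).astype("timedelta64[ns]")
```

Slicing one full second at 8 samples/second yields 9 samples because both endpoints are included.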

has_data()[source]

check to see if there is an index in the time series

merge(other, gap_method='slinear', new_sample_rate=None, resample_method='poly')[source]

Merge two channels or a list of channels together in the following steps

  1. xr.combine_by_coords([original, other])

  2. compute monotonic time index

  3. reindex(new_time_index, method=gap_method)

If you want a different method or more control use merge

Parameters

other (mth5.timeseries.ChannelTS) – Another channel

Raises
  • TypeError – If input is not a ChannelTS

  • ValueError – if the components are different

Returns

Combined channel with monotonic time index and same metadata

Return type

mth5.timeseries.ChannelTS

property n_samples

number of samples

plot()[source]

Simple plot of the data

Returns

figure object

Return type

matplotlib.figure

plot_spectra(spectra_type='welch', window_length=4096, **kwargs)[source]
Parameters
  • spectra_type (string, optional) – spectra type, defaults to “welch”

  • window_length (int, optional) – window length of the welch method should be a power of 2, defaults to 2 ** 12

  • **kwargs

    key words for the spectra type

remove_instrument_response(**kwargs)[source]

Remove instrument response from the given channel response filter

The order of operations is important (if applied):

  1. detrend

  2. zero mean

  3. zero pad

  4. time window

  5. frequency window

  6. remove response

  7. undo time window

  8. bandpass

kwargs

Parameters
  • plot (boolean, default True) – whether to plot the calibration process [ False | True ]

  • detrend (boolean, default True) – Remove linear trend of the time series

  • zero_mean (boolean, default True) – Remove the mean of the time series

  • zero_pad (boolean, default True) – pad the time series to the next power of 2 for efficiency

  • t_window (string, default None) – Time domain window name; see scipy.signal.windows for options

  • t_window_params (dictionary) – Time domain window parameters; parameters can be found in scipy.signal.windows

  • f_window (string, default None) – Frequency domain window name; see scipy.signal.windows for options

  • f_window_params (dictionary) – Frequency domain window parameters; parameters can be found in scipy.signal.windows

  • bandpass (dictionary) – bandpass frequency and order {"low":, "high":, "order":,}

resample_poly(new_sample_rate, pad_type='mean', inplace=False)[source]

Use scipy.signal.resample_poly to resample data while using an FIR filter to remove aliasing.

Parameters
  • new_sample_rate (float) – new sample rate in samples per second

  • pad_type (string, optional) – padding type used by scipy.signal.resample_poly, defaults to "mean"

  • inplace (boolean, optional) – resample in place, defaults to False

Returns

resampled data

Return type

mth5.timeseries.ChannelTS

property run_metadata

run metadata

property sample_interval

Sample interval = 1 / sample_rate

Returns

sample interval as time distance between time samples

Return type

float

property sample_rate

sample rate in samples/second

property start

MTime object

property station_metadata

station metadata

property survey_metadata

survey metadata

property time_index

time index as a numpy array with dtype np.datetime64[ns]

Returns

array of the time index

Return type

np.ndarray(dtype=np.datetime64[ns])

to_obspy_trace()[source]

Convert the time series to an obspy.core.trace.Trace object. This will be helpful for converting between data pulled from IRIS and data going into IRIS.

Returns

An Obspy Trace object from the time series data

Return type

obspy.core.trace.Trace

to_xarray()[source]

Returns an xarray.DataArray object of the channel time series; this way metadata from the metadata class is updated upon return.

Returns

an xarray.DataArray object of the channel time series

Return type

xarray.DataArray

>>> import numpy as np
>>> from mth5.timeseries import ChannelTS
>>> ex = ChannelTS("electric")
>>> ex.start = "2020-01-01T12:00:00"
>>> ex.sample_rate = 16
>>> ex.ts = np.random.rand(4096)
property ts

Time series as a numpy array

welch_spectra(window_length=4096, **kwargs)[source]

get welch spectra

Parameters
  • window_length (integer, optional) – window length in samples, should be a power of 2, defaults to 4096

  • **kwargs

    key words for scipy.signal.welch

Returns

frequencies and power spectral density estimates

Return type

np.ndarray, np.ndarray
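Welch spectra can be reproduced directly with SciPy (an illustrative sketch; MTH5's defaults and return values may differ, and `welch_sketch` is a hypothetical name):

```python
from scipy.signal import welch

def welch_sketch(ts, sample_rate, window_length=4096):
    """Power spectral density via Welch's method of averaged periodograms."""
    freqs, psd = welch(ts, fs=sample_rate,
                       nperseg=min(window_length, ts.size))
    return freqs, psd
```

For a one-sided spectrum the frequency axis has nperseg // 2 + 1 points, from DC to the Nyquist frequency.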

class mth5.timeseries.RunTS(array_list=None, run_metadata=None, station_metadata=None, survey_metadata=None)[source]

Bases: object

holds all run ts in one aligned array

components –> {‘ex’: ex_xarray, ‘ey’: ey_xarray}

ToDo: have a single Survey object under the hood, with properties to other metadata objects for get/set.

add_channel(channel)[source]

Add a channel to the dataset, can be an xarray.DataArray or mth5.timeseries.ChannelTS object.

The coordinates and dimensions must be the same as the existing dataset, namely the coordinate is time. If the dimensions of the added channel are larger than the existing dataset, the added channel will be clipped to the dimensions of the existing dataset.

If the start time is not the same nan’s will be placed at locations where the timing does not match the current start time. This is a feature of xarray.

Parameters

channel (xarray.DataArray or mth5.timeseries.ChannelTS) – a channel xarray or ChannelTS to add to the run

calibrate(**kwargs)[source]

Calibrate the data according to the filters in each channel.

Returns

calibrated run

Return type

mth5.timeseries.RunTS

property channels

List of channel names in dataset

copy(data=True)[source]
Parameters

data (boolean, optional) – include data in the copy (True) or not (False), defaults to True

Returns

Copy of the run

Return type

mth5.timeseries.RunTS

property dataset

xarray.Dataset

decimate(new_sample_rate, inplace=False, max_decimation=8)[source]

decimate data to new sample rate.

Parameters
  • new_sample_rate (float) – new sample rate in samples per second

  • inplace (boolean, optional) – decimate in place, defaults to False

Returns

decimated data

Return type

mth5.timeseries.RunTS

property end

End time UTC

property filters

Dictionary of filters used by the channels

from_obspy_stream(obspy_stream, run_metadata=None)[source]

Get a run from an obspy.core.stream which is a list of obspy.core.Trace objects.

Parameters

obspy_stream (obspy.core.Stream) – Obspy Stream object

get_slice(start, end=None, n_samples=None)[source]
Parameters
  • start (TYPE) – DESCRIPTION

  • end (TYPE, optional) – DESCRIPTION, defaults to None

  • n_samples (TYPE, optional) – DESCRIPTION, defaults to None

Raises

ValueError – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

has_data()[source]

check to see if there is data

merge(other, gap_method='slinear', new_sample_rate=None, resample_method='poly')[source]

Merge two runs or a list of runs together in the following steps

  1. xr.combine_by_coords([original, other])

  2. compute monotonic time index

  3. reindex(new_time_index, method=gap_method)

If you want a different method or more control use merge

Parameters

other (mth5.timeseries.RunTS) – Another run

Raises
  • TypeError – If input is not a RunTS

  • ValueError – if the components are different

Returns

Combined run with monotonic time index and same metadata

Return type

mth5.timeseries.RunTS

plot(color_map={'ex': (1, 0.2, 0.2), 'ey': (1, 0.5, 0), 'hx': (0, 0.5, 1), 'hy': (0.5, 0.2, 1), 'hz': (0.2, 1, 1)}, channel_order=None)[source]

Plot the time series; probably slow for large data sets.

plot_spectra(spectra_type='welch', color_map={'ex': (1, 0.2, 0.2), 'ey': (1, 0.5, 0), 'hx': (0, 0.5, 1), 'hy': (0.5, 0.2, 1), 'hz': (0.2, 1, 1)}, **kwargs)[source]

Plot spectra using spectra type, only ‘welch’ is supported now.

Parameters
  • spectra_type (string, optional) – spectra type, defaults to “welch”

  • color_map (dictionary, optional) – colors of channels, defaults to { “ex”: (1, 0.2, 0.2), “ey”: (1, 0.5, 0), “hx”: (0, 0.5, 1), “hy”: (0.5, 0.2, 1), “hz”: (0.2, 1, 1), }

  • **kwargs

    key words for the spectra type

resample(new_sample_rate, inplace=False)[source]

Resample data to a new sample rate.

Parameters
  • new_sample_rate (float) – new sample rate in samples per second

  • inplace (boolean, optional) – resample in place, defaults to False

Returns

resampled data

Return type

mth5.timeseries.RunTS

resample_poly(new_sample_rate, pad_type='mean', inplace=False)[source]

Use scipy.signal.resample_poly to resample data while using an FIR filter to remove aliasing.

Parameters
  • new_sample_rate (float) – new sample rate in samples per second

  • pad_type (string, optional) – padding type used by scipy.signal.resample_poly, defaults to "mean"

  • inplace (boolean, optional) – resample in place, defaults to False

Returns

resampled data

Return type

mth5.timeseries.RunTS

property run_metadata

run metadata

property sample_interval

Sample interval = 1 / sample_rate

Returns

sample interval in seconds between time samples

Return type

float

property sample_rate

Sample rate in samples per second; this is estimated from the median time difference between samples if data is present. Otherwise the metadata sample rate is returned.
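Estimating the sample rate from the median time difference can be sketched as follows (illustrative, assuming a nanosecond-resolution datetime index; `estimate_sample_rate` is a hypothetical name):

```python
import numpy as np

def estimate_sample_rate(time_index):
    """Sample rate = 1 / median spacing of the time index (in seconds)."""
    dt_ns = np.median(np.diff(time_index).astype("timedelta64[ns]")
                      .astype(np.int64))
    return 1e9 / dt_ns  # nanoseconds per sample -> samples per second
```

Using the median makes the estimate robust to a few irregular gaps in the index.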

set_dataset(array_list, align_type='outer')[source]
Parameters
  • array_list (list of mth5.timeseries.ChannelTS objects) – list of xarrays

  • align_type (string) – how the different times will be aligned * ’outer’: use the union of object indexes * ’inner’: use the intersection of object indexes * ’left’: use indexes from the first object with each dimension * ’right’: use indexes from the last object with each dimension * ’exact’: instead of aligning, raise ValueError when indexes to be aligned are not equal * ’override’: if indexes are of same size, rewrite indexes to be those of the first object with that dimension. Indexes for the same dimension must have the same size in all objects.
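The default 'outer' alignment keeps the union of the time indexes and fills missing samples with NaN, as xarray does; a NumPy sketch under that assumption (`align_outer` is a hypothetical name):

```python
import numpy as np

def align_outer(times_a, a, times_b, b):
    """Union of two time indexes; samples missing from either become NaN."""
    union = np.union1d(times_a, times_b)  # sorted union of both indexes
    out_a = np.full(union.size, np.nan)
    out_b = np.full(union.size, np.nan)
    out_a[np.searchsorted(union, times_a)] = a
    out_b[np.searchsorted(union, times_b)] = b
    return union, out_a, out_b
```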

property start

Start time UTC

property station_metadata

station metadata

property summarize_metadata

Get a summary of all the metadata

Returns

A summary of all channel metadata in one place

Return type

dictionary

property survey_metadata

survey metadata

to_obspy_stream()[source]

convert time series to an obspy.core.Stream which is like a list of obspy.core.Trace objects.

Returns

An Obspy Stream object from the time series data

Return type

obspy.core.Stream

validate_metadata()[source]

Check to make sure that the metadata matches what is in the data set.

Updates metadata from the data.

Checks the start and end times and the channels recorded.

mth5.utils package
Submodules
mth5.utils.exceptions module

Exceptions raised by MTH5

Created on Wed May 13 19:07:21 2020

@author: jpeacock

exception mth5.utils.exceptions.MTH5Error[source]

Bases: Exception

exception mth5.utils.exceptions.MTH5TableError[source]

Bases: Exception

exception mth5.utils.exceptions.MTSchemaError[source]

Bases: Exception

exception mth5.utils.exceptions.MTTSError[source]

Bases: Exception

exception mth5.utils.exceptions.MTTimeError[source]

Bases: Exception

mth5.utils.fdsn_tools module

Tools for FDSN standards

Created on Wed Sep 30 11:47:01 2020

author

Jared Peacock

license

MIT

mth5.utils.fdsn_tools.get_location_code(channel_obj)[source]

Get the location code given the components and channel number

Parameters

channel_obj (Channel) – Channel object

Returns

2 character location code

Return type

string

mth5.utils.fdsn_tools.get_measurement_code(measurement)[source]

get SEED sensor code given the measurement type

Parameters

measurement (string) – measurement type, e.g. * temperature * electric * magnetic

Returns

single character SEED sensor code; if the measurement type has not been defined yet, 'Y' is returned.

Return type

string

mth5.utils.fdsn_tools.get_orientation_code(azimuth, orientation='horizontal')[source]

Get orientation code given angle and orientation. This is a general code and the true azimuth is stored in channel

Parameters

azimuth (float) – angle assuming 0 is north, 90 is east, 0 is vertical down

Returns

single character SEED orientation code

Return type

string

mth5.utils.fdsn_tools.get_period_code(sample_rate)[source]

Get the SEED sampling rate code given a sample rate

Parameters

sample_rate (float) – sample rate in samples per second

Returns

single character SEED sampling code

Return type

string

mth5.utils.fdsn_tools.make_channel_code(channel_obj)[source]

Make the 3 character SEED channel code

Parameters

channel_obj (Channel) – Channel metadata

Returns

3 character channel code

Type

string

mth5.utils.fdsn_tools.make_mt_channel(code_dict, angle_tol=15)[source]
Parameters

code_dict (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

mth5.utils.fdsn_tools.read_channel_code(channel_code)[source]

read FDSN channel code

Parameters

channel_code (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

mth5.utils.helpers module
mth5.utils.helpers.get_compare_dict(input_dict)[source]

Helper function for removing 2 added attributes to metadata

  • hdf5_reference

  • mth5_type

Parameters

input_dict (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

mth5.utils.helpers.initialize_mth5(h5_path, mode='a', file_version='0.1.0')[source]

MTH5 initializer for the case of writing files.

Parameters
  • h5_path (string or pathlib.Path) – path to file

  • mode (string) –

    how to open the file, options are

    • ”r”: read

    • ”w”: write

    • ”a”: append

Returns

mth5 object

Return type

mth5.MTH5

mth5.utils.helpers.read_back_data(mth5_path, station_id, run_id, survey=None, close_mth5=True, return_objects=[])[source]

Testing helper function, used to confirm that the h5 file can be accessed and that the data size is as expected.

Parameters
  • mth5_path (Path or string) – the full path to the mth5 file that this method is going to try to read

  • station_id (string) – the label for the station, e.g. “PKD”

  • run_id (string) – The label for the run to read. e.g. “001”

  • survey (string) – The label for the survey associated with the run to read.

  • close_mth5 (bool) – Whether or not to close the mth5 object after reading

  • return_objects (list of strings) – specifies what, if anything, to return; allowed values: ["run", "run_ts"]

Returns

run object (mth5.groups.RunGroup) and/or run time series (mth5.timeseries.RunTS), depending on return_objects

mth5.utils.mth5_logger module
Module contents
Submodules
mth5.helpers module

Helper functions for HDF5

Created on Tue Jun 2 12:37:50 2020

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

mth5.helpers.close_open_files()[source]
mth5.helpers.from_numpy_type(value)[source]

Need to make the attributes friendly with Numpy and HDF5.

For numbers and bool this is straight forward they are automatically mapped in h5py to a numpy type.

But for strings this can be a challenge, especially a list of strings.

HDF5 should only deal with ASCII characters or Unicode. No binary data is allowed.

mth5.helpers.get_tree(parent)[source]

Simple function to recursively print the contents of an hdf5 group.

Parameters

parent (h5py.Group) – HDF5 (sub-)tree to print

mth5.helpers.inherit_doc_string(cls)[source]
mth5.helpers.recursive_hdf5_tree(group, lines=[])[source]
mth5.helpers.to_numpy_type(value)[source]

Need to make the attributes friendly with Numpy and HDF5.

For numbers and bool this is straight forward they are automatically mapped in h5py to a numpy type.

But for strings this can be a challenge, especially a list of strings.

HDF5 should only deal with ASCII characters or Unicode. No binary data is allowed.
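One common way to make mixed Python attributes HDF5-friendly is sketched below (an illustration of the general idea, not the MTH5 source; `to_numpy_type_sketch` is a hypothetical name):

```python
import numpy as np

def to_numpy_type_sketch(value):
    """Coerce common Python values to types h5py can store as attributes."""
    if isinstance(value, bool):       # bool before int: bool subclasses int
        return np.bool_(value)
    if isinstance(value, int):
        return np.int64(value)
    if isinstance(value, float):
        return np.float64(value)
    if isinstance(value, str):
        return value  # h5py stores str as variable-length UTF-8
    if isinstance(value, (list, tuple)) and all(isinstance(v, str) for v in value):
        # a list of strings flattened to one string avoids ragged attributes
        return ",".join(value)
    return value
```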

mth5.helpers.validate_compression(compression, level)[source]

validate that the input compression is supported.

Parameters
  • compression (string, [ 'lzf' | 'gzip' | 'szip' | None ]) – type of lossless compression

  • level (string for 'szip' or int for 'gzip') – compression level if supported

Returns

compression type

Return type

string

Returns

compression level

Return type

string for 'szip' or int for 'gzip'

Raises
  • ValueError – if compression or level are not supported

  • TypeError – if compression level is not the correct type
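The documented validation rules can be sketched in plain Python (a sketch of the behaviour as described above, not the MTH5 source; `validate_compression_sketch` is a hypothetical name):

```python
def validate_compression_sketch(compression, level):
    """Check that a compression type/level pair follows the documented rules."""
    if compression is None:
        return None, None
    if compression not in ("lzf", "gzip", "szip"):
        raise ValueError(f"compression {compression!r} is not supported")
    if compression == "gzip":
        # gzip levels are integers from 0 to 9
        if not isinstance(level, int) or not 0 <= level <= 9:
            raise ValueError("gzip compression_opts must be an int from 0 to 9")
    return compression, level
```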

mth5.helpers.validate_name(name, pattern=None)[source]

Validate name

Parameters
  • name (TYPE) – DESCRIPTION

  • pattern (TYPE, optional) – DESCRIPTION, defaults to None

Returns

DESCRIPTION

Return type

TYPE

mth5.mth5 module
MTH5

MTH5 deals with reading and writing an MTH5 file, which is an HDF5 file developed for magnetotelluric (MT) data. The code is based on h5py and therefore numpy. This is the simplest approach, and we are not dealing with tables of data large enough to warrant using PyTables.

Created on Sun Dec 9 20:50:41 2018

copyright

Jared Peacock (jpeacock@usgs.gov)

license

MIT

class mth5.mth5.MTH5(filename=None, compression='gzip', compression_opts=4, shuffle=True, fletcher32=True, data_level=1, file_version='0.2.0')[source]

Bases: object

MTH5 is the main container for the HDF5 file format developed for MT data

It uses the metadata standards developed by the IRIS PASSCAL software group and defined in the metadata documentation.

MTH5 is built with h5py and therefore numpy. The structure follows the different levels of MT data collection:

For version 0.1.0:

  • Survey

    • Reports

    • Standards

    • Filters

    • Stations

      • Run

        • Channel

For version 0.2.0:

  • Experiment

    • Reports

    • Standards

    • Surveys

      • Reports

      • Standards

      • Filters

      • Stations

        • Run

          • Channel

All timeseries data are stored as individual channels with the appropriate metadata defined for the given channel, i.e. electric, magnetic, auxiliary.

Each level is represented as an mth5 group class object which has methods to add, remove, and get a group from the level below. Each group has a metadata attribute that is the appropriate metadata class object. For instance the SurveyGroup has an attribute metadata that is a mth5.metadata.Survey object. Metadata is stored in the HDF5 group attributes as (key, value) pairs.

All groups are represented by their structure tree and can be shown at any time from the command line.

Each level has a summary array of the contents of the levels below to hopefully make searching easier.

Parameters
  • filename (string or pathlib.Path) – name of the to be or existing file

  • compression

    compression type. Supported lossless compressions are

    • ’lzf’ - Available with every installation of h5py

      (C source code also available). Low to moderate compression, very fast. No options.

    • ’gzip’ - Available with every installation of HDF5,

      so it’s best where portability is required. Good compression, moderate speed. compression_opts sets the compression level and may be an integer from 0 to 9, default is 3.

    • ’szip’ - Patent-encumbered filter used in the NASA

      community. Not available with all installations of HDF5 due to legal reasons. Consult the HDF5 docs for filter options.

  • compression_opts (string or int depending on compression type) – compression options, see above

  • shuffle (boolean) – Block-oriented compressors like GZIP or LZF work better when presented with runs of similar values. Enabling the shuffle filter rearranges the bytes in the chunk and may improve compression ratio. No significant speed penalty, lossless.

  • fletcher32 (boolean) – Adds a checksum to each chunk to detect data corruption. Attempts to read corrupted chunks will fail with an error. No significant speed penalty. Obviously shouldn’t be used with lossy compression filters.

  • data_level (integer, defaults to 1) –

    level the data are stored following levels defined by NASA ESDS

    • 0 - Raw data

    • 1 - Raw data with response information and full metadata

    • 2 - Derived product, raw data has been manipulated

  • file_version (string, optional) – Version of the file [ ‘0.1.0’ | ‘0.2.0’ ], defaults to “0.2.0”

Usage

  • Open a new file and show initialized file

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5(file_version='0.1.0')
>>> # Have a look at the dataset options
>>> mth5.dataset_options
{'compression': 'gzip',
 'compression_opts': 3,
 'shuffle': True,
 'fletcher32': True}
>>> mth5_obj.open_mth5(r"/home/mtdata/mt01.mth5", 'w')
>>> mth5_obj
/:
====================
    |- Group: Survey
    ----------------
        |- Group: Filters
        -----------------
            --> Dataset: summary
            ......................
        |- Group: Reports
        -----------------
            --> Dataset: summary
            ......................
        |- Group: Standards
        -------------------
            --> Dataset: summary
            ......................
        |- Group: Stations
        ------------------
            --> Dataset: summary
            ......................
  • Add metadata for survey from a dictionary

>>> survey_dict = {'survey':{'acquired_by': 'me', 'archive_id': 'MTCND'}}
>>> survey = mth5_obj.survey_group
>>> survey.metadata.from_dict(survey_dict)
>>> survey.metadata
{
"survey": {
    "acquired_by.author": "me",
    "acquired_by.comments": null,
    "archive_id": "MTCND"
    ...}
}
  • Add a station from the convenience function

>>> station = mth5_obj.add_station('MT001')
>>> mth5_obj
/:
====================
    |- Group: Survey
    ----------------
        |- Group: Filters
        -----------------
            --> Dataset: summary
            ......................
        |- Group: Reports
        -----------------
            --> Dataset: summary
            ......................
        |- Group: Standards
        -------------------
            --> Dataset: summary
            ......................
        |- Group: Stations
        ------------------
            |- Group: MT001
            ---------------
                --> Dataset: summary
                ......................
            --> Dataset: summary
            ......................
>>> station
/Survey/Stations/MT001:
====================
    --> Dataset: summary
    ......................
>>> data.schedule_01.ex[0:10] = np.nan
>>> data.calibration_hx[...] = np.logspace(-4, 4, 20)

Note

if replacing an entire array with a new one you need to use […] otherwise the data will not be updated.

Warning

You can only replace entire arrays with arrays of the same size. Otherwise you need to delete the existing data and make a new dataset.

add_channel(station_name, run_name, channel_name, channel_type, data, channel_dtype='int32', max_shape=(None,), chunks=True, channel_metadata=None, survey=None)[source]

Convenience function to add a channel using mth5.stations_group.get_station().get_run().add_channel()

add a channel to a given run for a given station

Parameters
  • station_name (string) – existing station name

  • run_name (string) – existing run name

  • channel_name (string) – name of the channel

  • channel_type (string) – [ electric | magnetic | auxiliary ]

  • channel_metadata ([ mth5.metadata.Electric | mth5.metadata.Magnetic | mth5.metadata.Auxiliary ], optional) – metadata container, defaults to None

  • survey (string) – existing survey name, needed for file version >= 0.2.0

Raises

MTH5Error – If channel type is not correct

Returns

Channel container

Return type

[ mth5.mth5_groups.ElectricDataset | mth5.mth5_groups.MagneticDataset | mth5.mth5_groups.AuxiliaryDataset ]

Example

>>> new_channel = mth5_obj.add_channel('MT001', 'MT001a', 'Ex',
>>> ...                                'electric', None)
>>> new_channel
Channel Electric:
-------------------
                component:        None
        data type:        electric
        data format:      float32
        data shape:       (1,)
        start:            1980-01-01T00:00:00+00:00
        end:              1980-01-01T00:00:00+00:00
        sample rate:      None
add_run(station_name, run_name, run_metadata=None, survey=None)[source]

Convenience function to add a run using mth5.stations_group.get_station(station_name).add_run()

Add a run to a given station.

Parameters
  • station_name (string) – existing station name

  • run_name (string) – run name, should be archive_id{a-z}

  • survey (string) – existing survey name, needed for file version >= 0.2.0

  • run_metadata (mth5.metadata.Run, optional) – metadata container, defaults to None

Example

>>> new_run = mth5_obj.add_run('MT001', 'MT001a')
add_station(station_name, station_metadata=None, survey=None)[source]

Convenience function to add a station using mth5.stations_group.add_station

Add a station with metadata if given with the path [v0.1.0]:

/Survey/Stations/station_name

Add a station with metadata if given with the path [v0.2.0]:

/Experiment/Surveys/survey/Stations/station_name

If the station already exists, will return that station and nothing is added.

Parameters
  • station_name (string) – Name of the station, should be the same as metadata.archive_id

  • station_metadata (mth5.metadata.Station, optional) – Station metadata container, defaults to None

  • survey (string) – existing survey name, needed for file version >= 0.2.0

Returns

A convenience class for the added station

Return type

mth5_groups.StationGroup

Example

>>> new_station = mth5_obj.add_station('MT001')
add_survey(survey_name, survey_metadata=None)[source]
Add a survey with metadata if given with the path:

/Experiment/Surveys/survey_name

If the survey already exists, will return that survey and nothing is added.

Parameters
  • survey_name (string) – Name of the survey, should be the same as metadata.id

  • survey_metadata (mth5.metadata.Survey, optional) – survey metadata container, defaults to None

Returns

A convenience class for the added survey

Return type

mth5_groups.SurveyGroup

Example
>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> # one option
>>> new_survey = mth5_obj.add_survey('MT001')
>>> # another option
>>> new_survey = mth5_obj.experiment_group.surveys_group.add_survey('MT001')
add_transfer_function(tf_object)[source]

Add a transfer function

Parameters

tf_object (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

property channel_summary

return a dataframe of channels
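
Because the summary is a pandas DataFrame, it can be filtered with ordinary boolean indexing. A sketch using a hand-built stand-in frame; the column names here are illustrative, not the exact MTH5 schema:

```python
import pandas as pd

# Stand-in for mth5_obj.channel_summary; columns are hypothetical.
summary = pd.DataFrame(
    {
        "station": ["MT001", "MT001", "MT002"],
        "run": ["MT001a", "MT001a", "MT002a"],
        "component": ["ex", "hx", "ex"],
        "sample_rate": [4096.0, 4096.0, 256.0],
    }
)

# e.g. find every ex channel sampled faster than 1 kHz
fast_ex = summary[(summary.component == "ex") & (summary.sample_rate > 1000.0)]
print(list(fast_ex.station))  # ['MT001']
```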

close_mth5()[source]

close mth5 file to make sure everything is flushed to the file

property data_level

data level

property dataset_options

summary of dataset options

property experiment_group

Convenience property for /Experiment group

property file_attributes
property file_type

File Type should be MTH5

property file_version

mth5 file version

property filename

file name of the hdf5 file

property filters_group

Convenience property for /Survey/Filters group

from_experiment(experiment, survey_index=0, update=False)[source]

Fill out an MTH5 from an mt_metadata.timeseries.Experiment object given a survey_id

Parameters
  • experiment (mt_metadata.timeseries.Experiment) – Experiment metadata

  • survey_index (int, defaults to 0) – Index of the survey to write

from_reference(h5_reference)[source]

Get an HDF5 group, dataset, etc from a reference

Parameters

h5_reference (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE
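
The mechanism behind from_reference() is the standard HDF5 object reference: a stored reference can be dereferenced back to the group or dataset it points at. A sketch using h5py directly (no MTH5 objects, and the path below is made up for illustration):

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), "demo.h5")

with h5py.File(path, "w") as f:
    dset = f.create_dataset("Survey/Stations/MT001/MT001a/ex", data=np.arange(4))
    ref = dset.ref  # an HDF5 object reference to the dataset
    obj = f[ref]    # dereference: returns the same dataset
    name = obj.name
print(name)  # /Survey/Stations/MT001/MT001a/ex
```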

get_channel(station_name, run_name, channel_name, survey=None)[source]

Convenience function to get a channel using mth5.stations_group.get_station().get_run().get_channel()

Get a channel from an existing name. Returns the appropriate container.

Parameters
  • station_name (string) – existing station name

  • run_name (string) – existing run name

  • channel_name (string) – name of the channel

  • survey (string) – existing survey name, needed for file version >= 0.2.0

Returns

Channel container

Return type

[ mth5.mth5_groups.ElectricDataset | mth5.mth5_groups.MagneticDataset | mth5.mth5_groups.AuxiliaryDataset ]

Raises

MTH5Error – If no channel is found

Example

>>> existing_channel = mth5_obj.get_channel(station_name,
>>> ...                                     run_name,
>>> ...                                     channel_name)
>>> existing_channel
Channel Electric:
-------------------
                component:        Ex
        data type:        electric
        data format:      float32
        data shape:       (4096,)
        start:            1980-01-01T00:00:00+00:00
        end:              1980-01-01T00:00:01+00:00
        sample rate:      4096
get_run(station_name, run_name, survey=None)[source]

Convenience function to get a run using mth5.stations_group.get_station(station_name).get_run()

get a run from run name for a given station

Parameters
  • station_name (string) – existing station name

  • run_name (string) – existing run name

  • survey (string) – existing survey name, needed for file version >= 0.2.0

Returns

Run object

Return type

mth5.mth5_groups.RunGroup

Example

>>> existing_run = mth5_obj.get_run('MT001', 'MT001a')
get_station(station_name, survey=None)[source]

Convenience function to get a station using mth5.stations_group.get_station()

Get a station with the same name as station_name

Parameters
  • station_name (string) – existing station name

  • survey (string) – existing survey name, needed for file version >= 0.2.0

Returns

convenience station class

Return type

mth5.mth5_groups.StationGroup

Raises

MTH5Error – if the station name is not found.

Example

>>> existing_station = mth5_obj.get_station('MT001')
MTH5Error: MT001 does not exist, check station_list for existing names
get_survey(survey_name)[source]

Get a survey with the same name as survey_name

Parameters

survey_name (string) – existing survey name

Returns

convenience survey class

Return type

mth5.mth5_groups.SurveyGroup

Raises

MTH5Error – if the survey name is not found.

Example

>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> # one option
>>> existing_survey = mth5_obj.get_survey('MT001')
>>> # another option
>>> existing_survey = mth5_obj.experiment_group.surveys_group.get_survey('MT001')
MTH5Error: MT001 does not exist, check groups_list for existing names
get_transfer_function(station_id, tf_id, survey=None)[source]

Get a transfer function

Parameters
  • survey_id (TYPE) – DESCRIPTION

  • station_id (TYPE) – DESCRIPTION

  • tf_id (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

h5_is_read()[source]

check to see if the hdf5 file is open and readable

Returns

True if readable, False if not

Return type

Boolean

h5_is_write()[source]

check to see if the hdf5 file is open and writeable

Returns

True if writeable, False if not

Return type

Boolean

has_group(group_name)[source]

Check to see if the group name exists

open_mth5(filename=None, mode='a')[source]

open an mth5 file

Returns

Survey Group

Type

groups.SurveyGroup

Example

>>> from mth5 import mth5
>>> mth5_object = mth5.MTH5(file_version='0.1.0')
>>> survey_object = mth5_object.open_mth5('Test.mth5', 'w')
>>> from mth5 import mth5
>>> mth5_object = mth5.MTH5()
>>> survey_object = mth5_object.open_mth5('Test.mth5', 'w')
>>> mth5_object.file_version
'0.2.0'
remove_channel(station_name, run_name, channel_name, survey=None)[source]

Convenience function to remove a channel using mth5.stations_group.get_station().get_run().remove_channel()

Remove a channel from a given run and station.

Note

Deleting a channel is not as simple as del(channel). In HDF5 this does not free up memory, it simply removes the reference to that channel. The common way to get around this is to copy what you want into a new file, or overwrite the channel.

Parameters
  • station_name (string) – existing station name

  • run_name (string) – existing run name

  • channel_name (string) – existing station name

  • survey (string) – existing survey name, needed for file version >= 0.2.0

Example

>>> mth5_obj.remove_channel('MT001', 'MT001a', 'Ex')
remove_run(station_name, run_name, survey=None)[source]

Remove a run from the station.

Note

Deleting a run is not as simple as del(run). In HDF5 this does not free up memory, it simply removes the reference to that run. The common way to get around this is to copy what you want into a new file, or overwrite the run.

Parameters
  • station_name (string) – existing station name

  • run_name (string) – existing run name

  • survey (string) – existing survey name, needed for file version >= 0.2.0

Example

>>> mth5_obj.remove_run('MT001', 'MT001a')
remove_station(station_name, survey=None)[source]

Convenience function to remove a station using mth5.stations_group.remove_station()

Remove a station from the file.

Note

Deleting a station is not as simple as del(station). In HDF5 this does not free up memory, it simply removes the reference to that station. The common way to get around this is to copy what you want into a new file, or overwrite the station.

Parameters
  • station_name (string) – existing station name

  • survey (string) – existing survey name, needed for file version >= 0.2.0

Example

>>> mth5_obj.remove_station('MT001')
remove_survey(survey_name)[source]

Remove a survey from the file.

Note

Deleting a survey is not as simple as del(survey). In HDF5 this does not free up memory, it simply removes the reference to that survey. The common way to get around this is to copy what you want into a new file, or overwrite the survey.

Parameters

survey_name (string) – existing survey name

Example
>>> from mth5 import mth5
>>> mth5_obj = mth5.MTH5()
>>> mth5_obj.open_mth5(r"/test.mth5", mode='a')
>>> # one option
>>> mth5_obj.remove_survey('MT001')
>>> # another option
>>> mth5_obj.experiment_group.surveys_group.remove_survey('MT001')
remove_transfer_function(station_id, tf_id, survey=None)[source]

remove a transfer function

Parameters
  • survey_id (TYPE) – DESCRIPTION

  • station_id (TYPE) – DESCRIPTION

  • tf_id (TYPE) – DESCRIPTION

Returns

DESCRIPTION

Return type

TYPE

property reports_group

Convenience property for /Survey/Reports group

property software_name

software name that wrote the file

property standards_group

Convenience property for /Standards group

property station_list

list of existing station names

property stations_group

Convenience property for /Survey/Stations group

property survey_group

Convenience property for /Survey group

property surveys_group

Convenience property for /Surveys group

property tf_summary

return a dataframe of transfer functions

to_experiment()[source]

Create an mt_metadata.timeseries.Experiment object from the metadata contained in the MTH5 file.

Returns

mt_metadata.timeseries.Experiment

validate_file()[source]

Validate an open mth5 file

will test the attribute values and group names

Returns

Boolean [ True = valid, False = not valid]

Return type

Boolean

Module contents

Top-level package for MTH5.

Indices and tables