The key/value pairs found in the config file become attributes of the class instance after initialization. At a minimum, there should be a storage_root attribute pointing to where this package stores its data.

              Type  Default  Details
config_path   str   None     str or pathlib.Path

The Config() object
+
The config module instantiates a config object from the Config class. Its attributes can be used to access several aspects relevant to the configuration of planetarypy. Using an object approach enables easy growth and nesting over time.
Use input() to ask the user for the storage_root path.

The path will be stored in the TOML dict and saved into the existing config file at Class.path, either the default one or the one given during init. The storage_root attribute is set as well.
Return configured data levels available for an instrument.
+
This currently simply points to the indexes, assuming that everything that has an index is also its own data level. In case it ever is not, we can add more here.
Downloading collection...
File exists. Use `overwrite=True` to download fresh.
This is performing the ISIS import and calibration in parallel:
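Such a parallel run can be sketched with the standard library's process pool (a sketch only: import_and_calibrate is a placeholder for the actual ISIS calls, which are not shown in this document):

```python
from concurrent.futures import ProcessPoolExecutor

def import_and_calibrate(path):
    # Placeholder for the real per-file pipeline, i.e. the ISIS import step
    # followed by calibration (assumption: the actual commands differ).
    return path

def process_in_parallel(paths, max_workers=4):
    # fan the per-file pipeline out over several worker processes
    with ProcessPoolExecutor(max_workers=max_workers) as executor:
        return list(executor.map(import_and_calibrate, paths))
```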
Correct wrong scale_factor in PolarStereographic data.
+
Some PolarStereographic data have a 0 as a scale_factor in the projection (mostly MOLA), which is being corrected here. TODO: Check for being PolarStereographic before doing this!
Beware that this function calculates trigonometric angles. If the points come from an image that has (0, 0) in the upper left, the angles increase clockwise. That is why, for example, for a HiRISE image the return value of this function matches the angle-rotation definition for HiRISE data.
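The clockwise effect is easy to see with a plain atan2 call in image coordinates (y growing downward):

```python
import math

# A point one pixel right and one pixel *down* from the origin of an image
# with (0, 0) in the upper left:
angle = math.degrees(math.atan2(1, 1))  # roughly 45 degrees
```

In a mathematical (y-up) frame the same point would sit at -45 degrees; with image coordinates the positive angle means a clockwise rotation from the +x axis.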
Determine paths and URLs for HiRISE RDR products (also EXTRAS).

We use the PDS definition of PRODUCT_ID here, e.g. PSP_003092_0985_RED.

The attributes jp2_path and label_path get you the official RDR mosaic product, with kind steering whether you get the COLOR or the RED product. All other properties go to the RDR/EXTRAS folder. The "PDS" part of the path is handled in the OBSID class.

            Type  Default  Details
initstr     str            PRODUCT_ID string, e.g. PSP_003092_0985_RED
check_url   bool  True     for performance, the user might not want the URL check

for i in range(10):
    for ch in [0, 1]:
        spid = f"{obsid}{i}_{ch}"
        try:
            SOURCE_PRODUCT(spid).download()
        except ConnectionError:
            pass

File exists. Use `overwrite=True` to download fresh.

/tmp/ipykernel_1172729/2293453911.py:88: UserWarning: https://hirise-pds.lpl.arizona.edu/PDS/EDR/ESP/ORB_027000_027099/ESP_027021_1525/ESP_027021_1525_RED9_0.IMG does not exist on the server.
  warnings.warn(f"{u} does not exist on the server.")
/tmp/ipykernel_1172729/2293453911.py:88: UserWarning: https://hirise-pds.lpl.arizona.edu/PDS/EDR/ESP/ORB_027000_027099/ESP_027021_1525/ESP_027021_1525_RED9_1.IMG does not exist on the server.
  warnings.warn(f"{u} does not exist on the server.")
This class manages one index, identified by a dotted key, e.g. cassini.iss.ring_summary

index = MTRDRIndex(url=url)

index.download()

Downloaded /home/ayek72/mnt/slowdata/planetarypy/missions/mro/crism/indexes/mtrdr/mtrdr0705_index.lbl and /home/ayek72/mnt/slowdata/planetarypy/missions/mro/crism/indexes/mtrdr/mtrdr0705_index.tab
Done.
The LROC index is not at a fixed URL, so it needs to be determined dynamically.
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
EDR vs RDR etc.

I cannot yet distinguish between different data levels for dynamic index URLs, as would be required for LROC, so for now I only implement the EDR index, pretending it's the only one!
Support working with label files of PDS Index tables.

            Type   Details
labelpath   Union  Path to the label file for a PDS index file. The actual table should reside in the same folder to be automatically parsed when calling the read_index_data method.

For a given dataframe, find the columns that are of mixed type.

Tool to help with the performance warning when trying to save a pandas DataFrame as HDF. When a column changes datatype somewhere, pickling occurs, which slows down reading of the HDF file.

          Type       Default  Details
df        DataFrame           Dataframe to be searched for mixed data types
fix       bool       True     Switch to control if NaN values in these problem columns should be replaced by the string 'UNKNOWN'
Returns   list                List of column names that have datatype changes within themselves

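The idea behind such a check can be sketched like this (an illustration only, not the package's actual implementation):

```python
import numpy as np
import pandas as pd

def find_mixed_type_cols(df, fix=True):
    # a column is "mixed" when its values span more than one Python type,
    # e.g. strings plus the float NaN
    mixed = [col for col in df.columns if df[col].map(type).nunique() > 1]
    if fix:
        for col in mixed:
            df[col] = df[col].fillna("UNKNOWN")
    return mixed

df = pd.DataFrame({"clean": [1, 2], "mixed": ["a", np.nan]})
print(find_mixed_type_cols(df))  # → ['mixed']
```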
The HiRISE EDRCUMINDEX has some broken lines where the SCAN_EXPOSURE_DURATION is of format F10.4 instead of the defined F9.4. This function simply replaces those instances with one less decimal digit, so 20000.0000 becomes 20000.000.
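One way to implement that repair (a sketch: the function name and exact pattern here are illustrative, not the package's code):

```python
import re

def fix_scan_exposure_duration(line: str) -> str:
    # Shrink 10-character F10.4 values (five integer digits plus four
    # decimals) by dropping the last fractional digit, so they fit the
    # declared F9.4 field width.
    return re.sub(r"\b(\d{5}\.\d{3})\d\b", r"\1", line)

print(fix_scan_exposure_duration("20000.0000"))  # → 20000.000
```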
Receive the list of supported datasets for automatic retrieval of archived SPICE kernels.

The supported datasets are tabulated at NAIF: https://naif.jpl.nasa.gov/naif/data_archived.html
+
+
Receive the list of required SPICE kernels for a given mission and time range.

Automatic download of kernels for a given mission and time range, either into a given location or the planetarypy local archive.

As always in planetarypy, the general design philosophy is to first develop a management class that gives the user full control over all the details, and then add easy-to-use functions for the end user that do the most frequently used things in one go. (See section "User Functions".)
+
+
+
Identifying and downloading kernel sets
+
One repeating task for SPICE calculations is the identification and retrieval of all SPICE kernels for a mission for a given time interval.
+
The folks at NAIF offer a “Subset” feature at their servers. Here we set up a table of the currently supported datasets:
Now we build a management class wrapping the Perl script at the URL below for accessing subsets of these datasets.
+
First, the basic URLs we will use:
+
The Perl script subsetds.pl (the name at the end of the BASE_URL) requires as input:
+
+
the dataset name
+
start and stop of the time interval
+
a constant named “Subset” to identify the action for this Perl script
+
+
We can assemble these parameters into a payload dictionary for the requests.get call, and we manage different potential actions on the zipfile with a Subsetter class that only requires the mission identifier, start, and stop as parameters.
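Under those assumptions, the payload could be assembled like this (the parameter names are illustrative guesses, not the documented interface of the NAIF service):

```python
SUBSET_URL = "https://naif.jpl.nasa.gov/cgi-bin/subsetds.pl"  # assumed location

def build_payload(dataset, start, stop):
    # the three required inputs plus the constant "Subset" action
    return {"dataset": dataset, "start": start, "stop": stop, "action": "Subset"}

payload = build_payload("co-s_j_e_v-spice-6-v1.0", "2011-02-13", "2011-02-14")
# r = requests.get(SUBSET_URL, params=payload)  # response would be a zipfile
```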
KPL/MK

   This meta-kernel lists a subset of kernels from the meta-kernel
   cas_2011_v17.tm provided in the CO-S/J/E/V-SPICE-6-V1.0 SPICE PDS3 archive,
   covering the whole or a part of the customer requested time period
   from 2011-02-13T00:00:00.000 to 2011-02-14T00:00:00.000.

   The documentation describing these kernels can be found in the
   complete CO-S/J/E/V-SPICE-6-V1.0 SPICE PDS3 archive available at this URL

   https://naif.jpl.nasa.gov/pub/naif/pds/data/co-s_j_e_v-spice-6-v1.0/cosp_1000

   To use this meta-kernel users may need to modify the value of the
   PATH_VALUES keyword to point to the actual location of the archive's
   ``data'' directory on their system. Replacing ``/'' with ``\''
   and converting line terminators to the format native to the user's
   system may also be required if this meta-kernel is to be used on a
   non-UNIX workstation.

   This meta-kernel was created by the NAIF node's SPICE PDS archive
   subsetting service version 2.1 on Mon Jun 19 08:13:34 PDT 2023.


   \begindata

   PATH_VALUES = (
      '/home/ayek72/mnt/slowdata/planetarypy/spice_kernels/cassini'
   )

   PATH_SYMBOLS = (
      'KERNELS'
   )

   KERNELS_TO_LOAD = (
      '$KERNELS/lsk/naif0012.tls'
      '$KERNELS/pck/pck00010.tpc'
      '$KERNELS/fk/cas_rocks_v18.tf'
      '$KERNELS/fk/cas_mimi_v202.tf'
      '$KERNELS/fk/cas_dyn_v03.tf'
      '$KERNELS/fk/cas_v41.tf'
      '$KERNELS/ik/cas_caps_v03.ti'
      '$KERNELS/ik/cas_cda_v01.ti'
      '$KERNELS/ik/cas_cirs_v09.ti'
      '$KERNELS/ik/cas_inms_v02.ti'
      '$KERNELS/ik/cas_iss_v10.ti'
      '$KERNELS/ik/cas_mag_v01.ti'
      '$KERNELS/ik/cas_mimi_v11.ti'
      '$KERNELS/ik/cas_radar_v11.ti'
      '$KERNELS/ik/cas_rpws_v01.ti'
      '$KERNELS/ik/cas_rss_v03.ti'
      '$KERNELS/ik/cas_sru_v02.ti'
      '$KERNELS/ik/cas_uvis_v06.ti'
      '$KERNELS/ik/cas_vims_v06.ti'
      '$KERNELS/sclk/cas00172.tsc'
      '$KERNELS/spk/180927AP_RE_90165_18018.bsp'
      '$KERNELS/spk/140809BP_IRRE_00256_25017.bsp'
      '$KERNELS/spk/110504R_SCPSE_11041_11093.bsp'
      '$KERNELS/ck/11001_12001pa_gapfill_v14.bc'
      '$KERNELS/ck/11017_11066py_as_flown.bc'
      '$KERNELS/ck/11044_11049ra.bc'
      '$KERNELS/ck/cas_cda_20120517.bc'
      '$KERNELS/ck/cas_lemms_05109_20001_v2.bc'
   )

   \begintext

The metakernel is correctly adapted; however, for these tests I didn't download the kernels again.
+
+
!cat {mkpath}
+
+
KPL/MK

   This meta-kernel lists a subset of kernels from the meta-kernel
   cas_2011_v17.tm provided in the CO-S/J/E/V-SPICE-6-V1.0 SPICE PDS3 archive,
   covering the whole or a part of the customer requested time period
   from 2011-02-13T00:00:00.000 to 2011-02-14T00:00:00.000.

   The documentation describing these kernels can be found in the
   complete CO-S/J/E/V-SPICE-6-V1.0 SPICE PDS3 archive available at this URL

   https://naif.jpl.nasa.gov/pub/naif/pds/data/co-s_j_e_v-spice-6-v1.0/cosp_1000

   To use this meta-kernel users may need to modify the value of the
   PATH_VALUES keyword to point to the actual location of the archive's
   ``data'' directory on their system. Replacing ``/'' with ``\''
   and converting line terminators to the format native to the user's
   system may also be required if this meta-kernel is to be used on a
   non-UNIX workstation.

   This meta-kernel was created by the NAIF node's SPICE PDS archive
   subsetting service version 2.1 on Mon Jun 19 08:13:34 PDT 2023.


   \begindata

   PATH_VALUES = (
      '.'
   )

   PATH_SYMBOLS = (
      'KERNELS'
   )

   KERNELS_TO_LOAD = (
      '$KERNELS/lsk/naif0012.tls'
      '$KERNELS/pck/pck00010.tpc'
      '$KERNELS/fk/cas_rocks_v18.tf'
      '$KERNELS/fk/cas_mimi_v202.tf'
      '$KERNELS/fk/cas_dyn_v03.tf'
      '$KERNELS/fk/cas_v41.tf'
      '$KERNELS/ik/cas_caps_v03.ti'
      '$KERNELS/ik/cas_cda_v01.ti'
      '$KERNELS/ik/cas_cirs_v09.ti'
      '$KERNELS/ik/cas_inms_v02.ti'
      '$KERNELS/ik/cas_iss_v10.ti'
      '$KERNELS/ik/cas_mag_v01.ti'
      '$KERNELS/ik/cas_mimi_v11.ti'
      '$KERNELS/ik/cas_radar_v11.ti'
      '$KERNELS/ik/cas_rpws_v01.ti'
      '$KERNELS/ik/cas_rss_v03.ti'
      '$KERNELS/ik/cas_sru_v02.ti'
      '$KERNELS/ik/cas_uvis_v06.ti'
      '$KERNELS/ik/cas_vims_v06.ti'
      '$KERNELS/sclk/cas00172.tsc'
      '$KERNELS/spk/180927AP_RE_90165_18018.bsp'
      '$KERNELS/spk/140809BP_IRRE_00256_25017.bsp'
      '$KERNELS/spk/110504R_SCPSE_11041_11093.bsp'
      '$KERNELS/ck/11001_12001pa_gapfill_v14.bc'
      '$KERNELS/ck/11017_11066py_as_flown.bc'
      '$KERNELS/ck/11044_11049ra.bc'
      '$KERNELS/ck/cas_cda_20120517.bc'
      '$KERNELS/ck/cas_lemms_05109_20001_v2.bc'
   )

   \begintext
+
+
Doing bc
Half time: 2020-11-25 00:00:00.000
Found 41 kernels for bc
Doing cassini
Half time: 2007-09-30 12:00:01.000
Found 32 kernels for cassini
Doing clementine
Half time: 1994-03-17 12:00:00.000
Found 19 kernels for clementine
Doing dart
Half time: 2035-12-06 00:00:00.000
Found 14 kernels for dart
Doing dawn
Half time: 2013-04-14 00:00:00.000
Found 21 kernels for dawn
Doing di
Half time: 2005-04-26 12:00:00.000
Found 16 kernels for di
Doing ds1
Half time: 2000-05-21 11:59:59.500
Found 11 kernels for ds1
Doing epoxi
Half time: 2008-05-27 00:00:00.000
Found 12 kernels for epoxi
Doing em16
Half time: 2019-08-07 23:59:59.500
Found 27 kernels for em16
Doing grail
Half time: 2012-04-29 00:00:00.500
Found 20 kernels for grail
Doing hayabusa
Half time: 2005-10-15 12:00:00.000
Found 15 kernels for hayabusa
Doing insight
Half time: 2020-08-24 12:00:00.000
Found 21 kernels for insight
Doing juno
Half time: 2017-02-08 23:59:58.500
Found 25 kernels for juno
Doing ladee
Half time: 2031-11-04 11:59:59.000
Found 12 kernels for ladee
Doing lro
Half time: 2016-04-30 23:59:59.500
Found 24 kernels for lro
Doing maven
Half time: 2018-07-25 11:59:59.000
Found 20 kernels for maven
Doing opportunity
Half time: 2010-12-23 00:00:00.500
Found 23 kernels for opportunity
Doing spirit
Half time: 2006-11-20 12:00:00.000
Found 24 kernels for spirit
Doing messenger
Half time: 2009-12-15 23:59:59.500
Found 18 kernels for messenger
Doing mars2020
Half time: 2021-10-13 12:00:00.000
Found 17 kernels for mars2020
Doing mex
Half time: 2013-03-01 11:59:59.500
Found 39 kernels for mex
Doing mgs
Half time: 2001-11-03 23:59:59.500
Found 20 kernels for mgs
Doing ody
Half time: 2012-01-03 12:00:00.500
Found 14 kernels for ody
Doing mro
Half time: 2014-06-06 11:59:59.500
Found 20 kernels for mro
Doing msl
Half time: 2017-05-16 23:59:58.500
Found 50 kernels for msl
Doing near
Half time: 1998-10-14 12:00:00.000
Found 15 kernels for near
Doing nh
Half time: 2013-01-08 12:00:00.000
Found 28 kernels for nh
Doing orex
Half time: 2018-11-30 23:59:59.500
+
+
+
warnings.warn('ERFA function "{}" yielded {}'.format(func_name, wmsg),
+
+
+
OSError: SPICE Server request returned status code: {r.status_code}
+
+
+
+
NOTE: Any ErfaWarnings above are caused by the LADEE mission using a kernel that runs up to 2050; the astropy Time module warns about potential precision issues regarding leapseconds that are still unknown and will be introduced in the future.
+
+
+
+
Generic kernel management
+
There are a few generic kernels that are required for basic illumination calculations as supported by this package.
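For illustration, kernels of this kind live in NAIF's generic_kernels tree (the exact set the package manages may differ from this hand-picked list):

```python
NAIF_URL = "https://naif.jpl.nasa.gov/pub/naif/generic_kernels"

# typical generic kernels for basic illumination calculations (illustrative)
GENERIC_KERNELS = [
    "lsk/naif0012.tls",       # leapseconds kernel
    "pck/pck00010.tpc",       # planetary constants (radii, orientations)
    "spk/planets/de430.bsp",  # planetary ephemeris
]

urls = [f"{NAIF_URL}/{kernel}" for kernel in GENERIC_KERNELS]
```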
General utilities. These should probably be split up into utils.time and utils.download.
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
Time format strings
+
First, we define the different format strings these utils convert from and to.
+
An identifier with xxx_dt_format_xxx in its name signifies a full datetime format as compared to dates only.
+
+
NASA date to datetime and ISO
+
What we call a NASA date is the often-used YYYY-JJJ-based format in the Planetary Data System, identifying dates via the running number of the day in the year, e.g. "2010-240".
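Since Python's strptime understands the day-of-year via the %j code, the conversion is a one-liner in each direction:

```python
from datetime import datetime

dt = datetime.strptime("2010-240", "%Y-%j")
print(dt.date().isoformat())  # → 2010-08-28
print(dt.strftime("%Y-%j"))   # → 2010-240
```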
Improved urlretrieve with progress bar, timeout and chunker.

This downloader has a built-in progress bar using tqdm, and by using the requests package it improves on standard urllib behavior by adding time-out capability.
+
I tested different chunk_sizes and most of the time 128 was actually fastest, YMMV.
+
Inspired by https://stackoverflow.com/a/61575758/680232
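The described behavior boils down to a pattern like this (a sketch of the idea, not necessarily the packaged implementation):

```python
import requests
from tqdm import tqdm

def url_retrieve(url, outfile, chunk_size=128, timeout=10):
    """Download url to outfile with a progress bar and a timeout."""
    r = requests.get(url, stream=True, timeout=timeout)
    r.raise_for_status()
    total = int(r.headers.get("content-length", 0))
    with open(outfile, "wb") as f, tqdm(
        total=total, unit="B", unit_scale=True, desc=str(outfile)
    ) as progressbar:
        for chunk in r.iter_content(chunk_size=chunk_size):
            f.write(chunk)
            progressbar.update(len(chunk))
```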
Inside these docs the package will be called PLPY for brevity.
+
A standard Python import could be: plp or plpy
+
+
Because the last p in plp can be pronounced out, we consider these equivalent for human conversation and pronounce both "plippy".
+
+
+
+
+
General scope
+
First and foremost this package should provide support in working with planetary science data.
+
By working we mean:
+
+
locating
+
retrieving
+
reading
+
further processing
+
+
of data.
+
+
Locating
+
This library manages, via its PDS tools, multiple PDS3 index files per instrument that can be used to identify data of interest. These index files are automatically downloaded and converted to the very performant (and cloud-ready) parquet file format. Note that parquet is able to store advanced datatypes like nan-capable integers and full datetime objects, as opposed to HDF5.
+
+
+
Retrieving
+
The interface for getting data is a path-retrieving function based on a PDS product-id. If that product-id is available locally, the path will be returned. If it is not, it will first be downloaded, stored in a systematic fashion organized by mission and instrument, and then the local path will be returned.
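The described logic amounts to the following pattern (all names and the folder layout here are hypothetical stand-ins for the package's real ones):

```python
import tempfile
from pathlib import Path

def download(product_id, target: Path):
    # stand-in for the real downloader; here it just creates an empty file
    target.parent.mkdir(parents=True, exist_ok=True)
    target.touch()

def get_path(product_id, storage_root) -> Path:
    # return the local path; download first if the product is not there yet
    local = Path(storage_root) / "mission" / "instrument" / f"{product_id}.IMG"
    if not local.exists():
        download(product_id, local)
    return local

root = tempfile.mkdtemp()
path = get_path("PSP_003092_0985_RED", root)
print(path.exists())  # → True
```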
+
+
+
Reading
+
For now, the library only returns the path to the object and the user needs to sort out the reading process. A recently funded NASA project, the Planetary Data Reader, will be integrated here, so that basic reading into memory can be provided.
+
As such, we anticipate two classes of reading support:

1. basic reading into numpy and/or xarray
2. added reader functionality like basic plots and basic geospatial processing, as supported by interested parties
+
There will exist larger other packages that focus on working with a given instrument’s data, in which case that package could become an affiliated package with the planetarypy GitHub organization, if so desired.
+
+
+
Further processing
+
In the future, additional frequently used procedures will be added to this library, e.g.:

frequently used GDAL/rasterio procedures

frequently used SPICE operations, like surface illumination on a given body
+
+
+
+
PDS tools
+
Look at the Apps docs to see what pds.apps exist for easily getting PDS indexes. The find_index app is specifically useful when you don’t know what index files exist.
+
So far, the following indexes are supported (but not necessarily all tested within PLPY):
+
+
Cassini
+
+
ISS (all)
+
UVIS (all)
+
+
MRO
+
+
CTX EDR
+
HiRISE
+
+
EDR, RDR, DTM
+
+
The EDR index has a bug (as delivered by the team; reported), for which I need to activate an existing fix.
+
+
+
+
LRO
+
+
Diviner (DLRE)
+
+
EDR, RDR
+
+
LOLA
+
+
EDR, RDR
+
+
+
+
+
More indexes
+
More indexes of other instruments can be easily added by following the existing structure of what has been copied into your config at ~/.planetarypy_config.toml.
+
Please consider submitting a pull request for adding further PDS index files into the config file at its source: https://github.com/michaelaye/nbplanetary/blob/master/planetarypy/data/planetarypy_config.toml
+
+
+
+
Utils
+
Find tools in Utils for working with NASA timestamps and a well-working URL download function, url_retrieve, among other things.
+
+
+
Experiment/Instrument Specific
+
So far, planetarypy supports CTX EDR and HiRISE RGB.NOMAP data. Look at the CTX and HiRISE pages for descriptions of classes for working with these data.
+
+
+
Bug reports
+
Please submit bug reports at https://github.com/michaelaye/nbplanetary/issues