From c5613a942d7e79089865f4f5837313bf6abe3461 Mon Sep 17 00:00:00 2001
From: Penny How
Date: Wed, 5 Apr 2023 10:12:08 -0200
Subject: [PATCH] Docs updated with new links and modules (#119)

---
 docs/guide_user.rst        | 16 +++++++---------
 docs/technical_data.rst    |  2 +-
 docs/technical_process.rst | 13 ++++++-------
 3 files changed, 14 insertions(+), 17 deletions(-)

diff --git a/docs/guide_user.rst b/docs/guide_user.rst
index 6f1998b1..6ebcf39c 100644
--- a/docs/guide_user.rst
+++ b/docs/guide_user.rst
@@ -15,15 +15,14 @@ These can be processed from Level 0 to a Level 3 data product as an ``AWS`` obje
 
 .. code:: python
 
-    from pypromice.process.aws import AWS
+    from pypromice.process import AWS
 
     # Define input paths
     config = "src/pypromice/test/test_config1.toml"
     inpath = "src/pypromice/test/"
-    vari = "src/pypromice/process/variables.csv"
 
     # Initiate and process
-    a = AWS(config, inpath, var_file=vari)
+    a = AWS(config, inpath)
     a.process()
 
     # Get Level 3
@@ -33,15 +32,14 @@ All processing steps are executed in ``AWS.process``. These can also be broken d
 
 .. code:: python
 
-    from pypromice.process.aws import AWS
+    from pypromice.process import AWS
 
     # Define input paths
     config = "src/pypromice/test/test_config2.toml"
     inpath = "src/pypromice/test/"
-    vari = "src/pypromice/process/variables.csv"
 
     # Initiate
-    a = AWS(config, inpath, var_file=vari)
+    a = AWS(config, inpath)
 
     # Process to Level 1
     a.getL1()
@@ -65,7 +63,7 @@ The Level 0 to Level 3 processing can also be executed from a CLI using the ``ge
 
 .. code:: console
 
-    $ getL3 -v src/pypromice/process/variables.csv -m src/pypromice/process/metadata.csv -c src/pypromice/test/test_config1.toml -i src/pypromice/test -o src/pypromice/test
+    $ getL3 -c src/pypromice/test/test_config1.toml -i src/pypromice/test -o src/pypromice/test
 
 
 Loading PROMICE data
@@ -98,7 +96,7 @@ AWS data can be downloaded to file with pypromice. Open up a CLI and use the ``g
 
 .. code:: console
 
-    $ getData -n KPC_U
+    $ getData -n KPC_U_hour.csv
 
 Files are downloaded to the current directory as a CSV formatted file. Use the
 ``-h`` help flag to explore further input variables.
@@ -135,5 +133,5 @@ If you would rather handle the AWS data as an ``xarray.Dataset`` object then the
 
 .. code:: python
 
-    ds = xr.Dataset.from_dataframe(df)
+    ds = xr.Dataset.from_dataframe(df)
 
diff --git a/docs/technical_data.rst b/docs/technical_data.rst
index 14a0dc4e..1b6a9f0f 100644
--- a/docs/technical_data.rst
+++ b/docs/technical_data.rst
@@ -17,7 +17,7 @@ Level 0 is raw, untouched data in one of three formats:
 Level 1
 =======
 - [X] Engineering units (e.g. current or volts) converted to physical units (e.g. temperature or wind speed)
-- [ ] Invalid/bad/suspicious data flagged
+- [X] Invalid/bad/suspicious data flagged
 - [X] Multiple data files merged into one time series per station
 
 
diff --git a/docs/technical_process.rst b/docs/technical_process.rst
index c25263ce..24e56bfa 100644
--- a/docs/technical_process.rst
+++ b/docs/technical_process.rst
@@ -22,8 +22,8 @@ Payload decoder
 
 ``PayloadFormat`` handles the message types and decoding templates. These can be imported from file, with two default CSV files provided with pypromice - payload_formatter.csv_ and payload_type.csv_.
 
-.. _payload_formatter.csv: https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/main/src/pypromice/payload_formats.csv
-.. _payload_type.csv: https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/main/src/pypromice/payload_types.csv
+.. _payload_formatter.csv: https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/main/src/pypromice/tx/payload_formats.csv
+.. _payload_type.csv: https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/main/src/pypromice/tx/payload_types.csv
 
 
 Payload processing
@@ -38,8 +38,7 @@ The following function can be executed from a CLI to fetch ``L0`` transmission m
 
 .. code:: console
 
     $ getL0tx -a accounts.ini -p credentials.ini -c tx/config
-    -f payload_formats.csv -t payload_types.csv -u last_aws_uid.ini
-    -o tx
+    -u last_aws_uid.ini -o tx
 
 .. note::
@@ -55,13 +54,13 @@ To process from L0>>L3, the following function can be executed in a CLI.
 
 .. code:: console
 
-    $ getL3 -v variables.csv -m metadata.csv -c config/KPC_L.toml -i . -o ../../aws-l3/tx"
+    $ getL3 -c config/KPC_L.toml -i . -o ../../aws-l3/tx
 
 And in parallel through all configuration .toml files ``$imei_list``
 
 .. code:: console
 
-    $ parallel --bar "getL3 -v variables.csv -m metadata.csv -c ./{} -i . -o ../../aws-l3/tx" ::: $(ls $imei_list)
+    $ parallel --bar "getL3 -c ./{} -i . -o ../../aws-l3/tx" ::: $(ls $imei_list)
 
 
 Station configuration
@@ -112,4 +111,4 @@ The TOML config file has the following expectations and behaviors:
 Be aware the column names should follow those defined in the variables look-up table found here_.
 Any column names provided that are not in this look-up table will be passed through the processing untouched.
 
-.. _here: https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/main/src/pypromice/variables.csv
+.. _here: https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/main/src/pypromice/process/variables.csv
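Editor's note (not part of the patch): the ``xr.Dataset.from_dataframe(df)`` line touched in ``guide_user.rst`` can be exercised standalone. The sketch below builds a small time-indexed ``pandas.DataFrame`` as a stand-in for AWS data downloaded with ``getData``; the column names ``p_u`` and ``t_u`` are illustrative assumptions, not taken from the patch.

```python
import pandas as pd
import xarray as xr

# Hypothetical stand-in for a downloaded PROMICE CSV: a small
# time-indexed DataFrame (column names are illustrative only).
df = pd.DataFrame(
    {"p_u": [870.2, 869.8, 870.5], "t_u": [-21.3, -20.9, -21.1]},
    index=pd.date_range("2023-04-05", periods=3, freq="h"),
)
df.index.name = "time"

# Convert to an xarray.Dataset, as shown in the user guide;
# the named index becomes the "time" dimension.
ds = xr.Dataset.from_dataframe(df)
print(ds.sizes["time"])  # 3
```

The named index is what gives the resulting ``Dataset`` a usable ``time`` dimension; an unnamed index would come through as a generic ``index`` dimension instead.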