Docs updated with new links and modules (#119)
PennyHow authored Apr 5, 2023
1 parent 350c19e commit c5613a9
Showing 3 changed files with 14 additions and 17 deletions.
16 changes: 7 additions & 9 deletions docs/guide_user.rst
@@ -15,15 +15,14 @@ These can be processed from Level 0 to a Level 3 data product as an ``AWS`` object

 .. code:: python

-    from pypromice.process.aws import AWS
+    from pypromice.process import AWS

     # Define input paths
     config = "src/pypromice/test/test_config1.toml"
     inpath = "src/pypromice/test/"
-    vari = "src/pypromice/process/variables.csv"

     # Initiate and process
-    a = AWS(config, inpath, var_file=vari)
+    a = AWS(config, inpath)
     a.process()

     # Get Level 3
@@ -33,15 +32,14 @@ All processing steps are executed in ``AWS.process``. These can also be broken down

 .. code:: python

-    from pypromice.process.aws import AWS
+    from pypromice.process import AWS

     # Define input paths
     config = "src/pypromice/test/test_config2.toml"
     inpath = "src/pypromice/test/"
-    vari = "src/pypromice/process/variables.csv"

     # Initiate
-    a = AWS(config, inpath, var_file=vari)
+    a = AWS(config, inpath)

     # Process to Level 1
     a.getL1()
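The hunk is truncated here, so the remaining step-wise calls are not shown. As a minimal editorial sketch (not part of the diff), assuming the ``AWS`` object continues the pattern with ``getL2`` and ``getL3`` methods and exposes the result as an ``L3`` attribute, all of which are assumptions extrapolated from the ``getL1`` call above:

.. code:: python

    # Editorial sketch, not from the commit. getL2(), getL3() and the L3
    # attribute are assumptions mirroring the getL1() call shown above.
    a.getL2()   # Process Level 1 to Level 2
    a.getL3()   # Process Level 2 to Level 3
    ds = a.L3   # Assumed xarray.Dataset holding the Level 3 product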
@@ -65,7 +63,7 @@ The Level 0 to Level 3 processing can also be executed from a CLI using the ``getL3``

 .. code:: console

-    $ getL3 -v src/pypromice/process/variables.csv -m src/pypromice/process/metadata.csv -c src/pypromice/test/test_config1.toml -i src/pypromice/test -o src/pypromice/test
+    $ getL3 -c src/pypromice/test/test_config1.toml -i src/pypromice/test -o src/pypromice/test

     Loading PROMICE data
@@ -98,7 +96,7 @@ AWS data can be downloaded to file with pypromice. Open up a CLI and use the ``getData``

 .. code:: console

-    $ getData -n KPC_U
+    $ getData -n KPC_U_hour.csv

 Files are downloaded to the current directory as a CSV-formatted file. Use the ``-h`` help flag to explore further input variables.

@@ -135,5 +133,5 @@ If you would rather handle the AWS data as an ``xarray.Dataset`` object then the

 .. code:: python

-    ds = xr.Dataset.from_dataframe(df)
+    ds = xr.Dataset.from_dataframe(df)
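For readers following along outside the diff, a minimal end-to-end sketch of this DataFrame-to-Dataset workflow, assuming a ``getData`` download named ``KPC_U_hour.csv`` with a ``time`` column (both the filename and the column name are assumptions about the file layout):

.. code:: python

    import pandas as pd
    import xarray as xr

    # Load the CSV produced by getData; "time" as the index column is an
    # assumption about the file layout
    df = pd.read_csv("KPC_U_hour.csv", index_col="time", parse_dates=True)

    # Convert the indexed DataFrame to an xarray Dataset, as in the docs
    ds = xr.Dataset.from_dataframe(df)
    print(ds)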
2 changes: 1 addition & 1 deletion docs/technical_data.rst
@@ -17,7 +17,7 @@ Level 0 is raw, untouched data in one of three formats:

 Level 1
 =======

 - [X] Engineering units (e.g. current or volts) converted to physical units (e.g. temperature or wind speed)
-- [ ] Invalid/bad/suspicious data flagged
+- [X] Invalid/bad/suspicious data flagged
 - [X] Multiple data files merged into one time series per station


13 changes: 6 additions & 7 deletions docs/technical_process.rst
@@ -22,8 +22,8 @@ Payload decoder

``PayloadFormat`` handles the message types and decoding templates. These can be imported from file, with two default CSV files provided with pypromice - payload_formatter.csv_ and payload_type.csv_.

-.. _payload_formatter.csv: https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/main/src/pypromice/payload_formats.csv
-.. _payload_type.csv: https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/main/src/pypromice/payload_types.csv
+.. _payload_formatter.csv: https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/main/src/pypromice/tx/payload_formats.csv
+.. _payload_type.csv: https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/main/src/pypromice/tx/payload_types.csv
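As a quick, stand-alone way to inspect those default templates, a small editorial sketch using only the Python standard library; the raw URLs are derived from the updated blob links above, and no pypromice API is assumed:

.. code:: python

    import csv
    import urllib.request

    # Raw counterparts of the payload format/type links above
    BASE = ("https://raw.githubusercontent.com/GEUS-Glaciology-and-Climate/"
            "pypromice/main/src/pypromice/tx/")

    for name in ("payload_formats.csv", "payload_types.csv"):
        with urllib.request.urlopen(BASE + name) as resp:
            rows = list(csv.reader(resp.read().decode().splitlines()))
        print(name, "header:", rows[0])      # column names
        print(name, "rows:", len(rows) - 1)  # number of entries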


Payload processing
@@ -38,8 +38,7 @@ The following function can be executed from a CLI to fetch ``L0`` transmission messages

 .. code:: console

     $ getL0tx -a accounts.ini -p credentials.ini -c tx/config
-    -f payload_formats.csv -t payload_types.csv -u last_aws_uid.ini
-    -o tx
+    -u last_aws_uid.ini -o tx
.. note::

@@ -55,13 +54,13 @@ To process from L0>>L3, the following command can be executed in a CLI.

 .. code:: console

-    $ getL3 -v variables.csv -m metadata.csv -c config/KPC_L.toml -i . -o ../../aws-l3/tx
+    $ getL3 -c config/KPC_L.toml -i . -o ../../aws-l3/tx

 And in parallel through all configuration .toml files in ``$imei_list``

 .. code:: console

-    $ parallel --bar "getL3 -v variables.csv -m metadata.csv -c ./{} -i . -o ../../aws-l3/tx" ::: $(ls $imei_list)
+    $ parallel --bar "getL3 -c ./{} -i . -o ../../aws-l3/tx" ::: $(ls $imei_list)
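For environments without GNU ``parallel``, a rough Python equivalent of the same fan-out; it assumes ``getL3`` is on the ``PATH`` and that the station ``.toml`` files sit in a ``config`` directory, as in the examples above:

.. code:: python

    import subprocess
    from concurrent.futures import ThreadPoolExecutor
    from pathlib import Path

    def run_getl3(cfg: Path) -> int:
        # Invoke the getL3 CLI for one station config, mirroring the
        # parallel example above
        cmd = ["getL3", "-c", str(cfg), "-i", ".", "-o", "../../aws-l3/tx"]
        return subprocess.run(cmd).returncode

    # One getL3 run per station .toml, a few at a time
    configs = sorted(Path("config").glob("*.toml"))
    with ThreadPoolExecutor(max_workers=4) as pool:
        codes = list(pool.map(run_getl3, configs))
    print("failures:", sum(c != 0 for c in codes))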
Station configuration
@@ -112,4 +111,4 @@ The TOML config file has the following expectations and behaviors:

Be aware the column names should follow those defined in the variables look-up table found here_. Any column names provided that are not in this look-up table will be passed through the processing untouched.

-.. _here: https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/main/src/pypromice/variables.csv
+.. _here: https://github.com/GEUS-Glaciology-and-Climate/pypromice/blob/main/src/pypromice/process/variables.csv
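To sanity-check a station config against that look-up table, a small editorial sketch; it assumes the config lists its column names under a ``columns`` key (the key name is an assumption) and fetches the raw counterpart of the variables.csv link above:

.. code:: python

    import csv
    import tomllib  # Python 3.11+
    import urllib.request

    VARS_URL = ("https://raw.githubusercontent.com/GEUS-Glaciology-and-Climate/"
                "pypromice/main/src/pypromice/process/variables.csv")

    # Variable names from the look-up table; the name is assumed to sit in
    # the first CSV column
    with urllib.request.urlopen(VARS_URL) as resp:
        known = {row[0] for row in csv.reader(resp.read().decode().splitlines())}

    # Column names from a station config; the "columns" key is an assumption
    with open("config/KPC_L.toml", "rb") as f:
        cfg = tomllib.load(f)
    unknown = [c for c in cfg.get("columns", []) if c not in known]
    print("passed through untouched:", unknown)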
