
Install the environment


This wiki will guide you through the installation of this fork of the pyposeidon package.

After installing the pyposeidon libraries, you'll be able to:

  • mesh automatically, using:
    • jigsaw
    • gmsh
    • oceanmesh
  • model global surface elevation, using:
    • TELEMAC
    • SCHISM

Install the binaries: quick setup

  1. Clone this fork of pyposeidon
git clone https://github.com/tomsail/pyPoseidon.git
  2. Delete the poetry.lock
rm poetry.lock
  3. Create a new environment containing only the binaries
mamba env create -n name_env -f environments/binary-p3.11.yml
  4. Create an environment for the custom package and install the remaining libraries
python -mvenv .venv
source .venv/bin/activate
poetry install
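
As an optional sanity check (this assumes name_env is activated so that the binary dependencies are available), you can verify that the package imports:

mamba activate name_env
source .venv/bin/activate
python -c "import pyposeidon"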

The whole procedure -- the long story

Rationale

The idea, in a larger scope (i.e. as for any other dev environment), is to end up with 3 nested environments:

  1. The outer environment is called base and is managed by conda. This one contains basic utilities that you would expect to find in your OS. There can be different reasons why we need this base environment:
    • we don't have these utilities on Windows
    • we don't have sudo permissions on our Linux machine/VM
  2. The middle environment is called "${PROJ_NAME}" and is also managed by conda. This one should be "stacked" on top of the base environment and contains the binary dependencies of the specific project (including python!)
  3. The inner environment is a python virtual environment (.venv) created with the python of the middle environment. All the python dependencies go here.
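
In shell terms, a fully activated session therefore looks roughly like this (the individual commands are detailed step by step below):

conda activate base                                            # outer: basic OS-level utilities
conda activate --stack "${PROJ_NAME}"_env                      # middle: binary dependencies + python
source /scratch/$(whoami)/"${PROJ_NAME}"/.venv/bin/activate    # inner: python dependencies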

Why do we need 3 environments and not one?

Because mixing package managers is a recipe for disaster. Unless you really know what you are doing, I suggest avoiding it; otherwise you will end up dealing with non-reproducible and broken environments.

Why do we need the inner virtualenv?

Because conda is not that flexible when it comes to development. First of all, not everything is available on conda-forge. PyPI has 10x more packages than conda-forge. Arguably all the major ones are on conda-forge, but still, why limit yourself when you don't have to? Another example is this: let's say that you need package X, and that package X has a bug which has been fixed on master but has not been released yet. What do you do?

  a. You wait for the release.
  b. You pip install git+https://github.com/.../X.git. But wait, I thought we agreed that mixing package managers is bad. If you do it anyway, you have just broken your middle environment.

The same applies to trying out some code from a PR, etc.
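
For illustration, installing such an unreleased fix into the inner virtualenv could look like this (the repository path and PR number below are placeholders, not real references):

# install straight from the default branch of a hypothetical repository
pip install "git+https://github.com/some-org/X.git"
# or try the code of an open PR via the ref GitHub exposes for it (placeholder PR number)
pip install "git+https://github.com/some-org/X.git@refs/pull/123/head"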

Base environment

Not needed for Linux, although you should still install pipx and the associated libraries.

Rationale

  1. Have a clear separation of the base-specific customizations -- achieved via .bashrc.base.
  2. Ditch the built-in conda installation, which is outdated and has its own base environment that is not editable by normal users.
  3. Use micromamba, which is currently the "best" way of installing/using conda.
  4. Set up micromamba to install packages/environments under /scratch instead of $HOME. The problem with $HOME is that it is on a network filesystem and unpacking files can be slow. The downside is that there are no backups on scratch, but IMHO environments should be treated as volatile and we should try really hard to make our dev environments reproducible anyhow (hint: use git; there are no excuses not to).
  5. Auto-activate the base conda env, in order to ensure that we have all the basic utilities we need installed AND a recent python version (i.e. 3.11).
  6. Use pipx to install python utilities in isolated virtual environments. The reason we don't add them to the base conda environment is that we want to keep it as lightweight as possible in order to make updates easier.

Steps

  1. Create ~/.bashrc.base and make sure that it is sourced at the end of .bashrc
touch ~/.bashrc.base
echo 'source ~/.bashrc.base' >> ~/.bashrc
  2. Create ~/.local/bin and add it to PATH
mkdir -p ~/.local/bin
cat << 'EOF' >> ~/.bashrc.base
# Start PATH
PATH=~/.local/bin:"${PATH}"
export PATH
# End PATH

EOF
  3. Install micromamba
wget --output-document ~/.local/bin/micromamba https://github.com/mamba-org/micromamba-releases/releases/download/1.5.1-1/micromamba-linux-64
chmod +x ~/.local/bin/micromamba
  4. Set up a minimal ~/.condarc
cat << EOF > ~/.condarc
auto_activate_base: false
auto_stack: false
channel_priority: strict
channels:
  - tomsail
  - gbrey
  - conda-forge
EOF
  5. Set up conda
cat << 'EOF' >> ~/.bashrc.base
# Start micromamba
MAMBA_EXE=$(which micromamba)
if [[ -d /scratch ]]; then
    mkdir -p /scratch/$(whoami)
    MAMBA_ROOT_PREFIX="${MAMBA_ROOT_PREFIX:-/scratch/$(whoami)/.micromamba}"
else
    MAMBA_ROOT_PREFIX="${MAMBA_ROOT_PREFIX:-"${HOME}"/.micromamba}"
fi
echo "mamba exe at   : ${MAMBA_EXE}"
echo "mamba prefix at: ${MAMBA_ROOT_PREFIX}"
mkdir -p "${MAMBA_ROOT_PREFIX}"
export MAMBA_EXE
export MAMBA_ROOT_PREFIX
eval "$(micromamba shell hook)"
# Since micromamba is not installing the conda and the mamba binaries,
# let's pretend that we have them!
conda() { micromamba "${@}"; }
mamba() { micromamba "${@}"; }
# End micromamba

EOF
  6. Start a new shell
bash
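
Optionally, confirm that the micromamba hook and the wrapper functions defined above are in place:

micromamba --version
type conda   # should report that conda is a shell function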
  7. Create the base conda environment. Add/remove packages as necessary. Do note that here we install stuff that we would normally install using apt.

Note: I install eccodes here because it is a dependency for inspectds which we will install later using pipx.

conda create \
    --yes \
    --name base \
    python=3.11 \
    bat \
    direnv \
    eccodes \
    stow \
    ncdu \
    tree \
    vim \
;
  8. Auto-activate the base environment
echo 'conda activate base' >> ~/.bashrc.base
  9. Start a new shell.
bash
  10. Install pipx

We override the default PIPX_HOME in order to force pipx to create the virtualenvs under /scratch instead of $HOME, because the latency under $HOME is really bad.

pip install --user pipx
mkdir -p /scratch/$(whoami)/.pipx/
echo "export PIPX_HOME=/scratch/$(whoami)/.pipx/" >> ~/.bashrc.base
  11. Install pure-python utilities
pipx install poetry
pipx install pre-commit
pipx install azure-cli
pipx install 'inspectds[grib]'

Middle and inner environments

Steps

For each project that you work on, you go to /scratch:

mkdir -p /scratch/$(whoami)
cd /scratch/$(whoami)

you create a project directory:

export PROJ_NAME="my_project"
mkdir -p /scratch/$(whoami)/"${PROJ_NAME}"
cd /scratch/$(whoami)/"${PROJ_NAME}"

you clone the repo of your code or create a new repo:

git clone https://github.com/...

# OR

git init

you create the middle conda environment where you install the binary dependencies of your project (add any further binary packages your project needs to the list):

conda create \
    --name "${PROJ_NAME}"_env \
    python=3.11 \
    gdal \
    proj \
    geos \
    eccodes \
;

You activate the conda env by stacking it on top of the base env:

conda activate --stack "${PROJ_NAME}"_env

Using the python of the "${PROJ_NAME}"_env environment, you create a virtual environment and you install the python dependencies there:

python -mvenv .venv
source .venv/bin/activate
# Followed by
poetry install
# OR
pip install -r requirements.txt

⚠️ error concerning poetry:

[org.freedesktop.DBus.Error.Spawn.ExecFailed] ('Failed to execute program org.freedesktop.secrets: Operation not permitted',)

the fix is the following:

export PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring

see source
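
If you want this workaround to persist across shells, one option (assuming the ~/.bashrc.base layout from this guide) is to append it there:

echo 'export PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring' >> ~/.bashrc.base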

Automate Env activation

The full steps for activating the environment are:

conda activate base
conda activate --stack "${PROJ_NAME}"_env
source /scratch/$(whoami)/"${PROJ_NAME}"/.venv/bin/activate

Obviously, typing all that is a PITA. I suggest automating this activation, either by placing it in a file which you can source or by using direnv, which should already be installed in the base environment.

We recommend using direnv.
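
As a minimal sketch (assuming the micromamba setup from this guide; the project name my_project is a placeholder), an .envrc at the root of the project directory could look like this, and direnv would then activate everything whenever you cd into the directory:

# .envrc -- minimal sketch; adjust the environment names to your project
export MAMBA_EXE="$(which micromamba)"
export MAMBA_ROOT_PREFIX="/scratch/$(whoami)/.micromamba"
eval "$(micromamba shell hook)"
micromamba activate base
micromamba activate --stack my_project_env
source .venv/bin/activate

After creating the file, run direnv allow once inside the project directory to authorize it.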


Note: This guide was initially written by @pmav99 for dealing with restricted Linux environments (without sudo permissions).
