Install the environment
This wiki will guide you through the installation of this fork of the pyposeidon package.
After installing the pyposeidon libraries, you will be able to:
- mesh automatically using:
- jigsaw
- gmsh
- oceanmesh
- model global surface elevation using:
- TELEMAC
- SCHISM
- clone this fork of pyposeidon
git clone https://github.com/tomsail/pyPoseidon.git
- delete the poetry.lock
rm poetry.lock
- create a new environment containing only the binaries
mamba env create -n name_env -f environments/binary-p3.11.yml
- create an environment for the custom package and install the remaining libraries
python -mvenv .venv
source .venv/bin/activate
poetry install
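To check that the installation worked, you can try importing the package from inside the nested environments (a minimal sanity check, using the environment name from the command above):
mamba activate name_env   # or: conda activate name_env
source .venv/bin/activate # from inside the cloned repository
python -c "import pyposeidon; print(pyposeidon.__file__)"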
The idea in a larger scope (i.e. as for any other dev environment) is to end up with 3 nested environments (see the sketch after this list):
- The outer environment is called `base` and is managed by conda. This one contains basic utilities that you would expect to find in your OS. The reason we need this is that we don't have these utilities on Windows.
- The middle environment is called `"${PROJ_NAME}"_env` and is also managed by conda. This one should be "stacked" on top of the `base` environment and contains the binary dependencies of the specific project (including python!).
- The inner environment is a python virtual environment (`.venv`) created with the python of the middle environment. All the python dependencies go here.
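In practice, activating the stack mirrors the nesting. A minimal illustration, assuming the naming conventions used later in this guide:
conda activate base                        # outer: OS-level utilities
conda activate --stack "${PROJ_NAME}"_env  # middle: binary dependencies of the project
source .venv/bin/activate                  # inner: all the python dependencies
which python                               # should now point to .venv/bin/python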
Why do we need 3 environments and not one?
Because mixing package managers is a recipe for disaster. Unless you really know what you are doing, I suggest avoiding it; otherwise you risk ending up with non-reproducible and broken environments.
Why do we need the inner virtualenv?
Because conda is not that flexible when it comes to development. First of all, not everything is available on conda-forge. PyPI has 10x more packages than conda-forge. Arguably all the major ones are on conda-forge, but still, why limit yourself when you don't have to? Another example is this: let's say that you need package X, and let's say that package X has a bug which has been fixed on master but has not been released. What do you do?
a. You wait for the release
b. You pip install git+https://github.com/.../X.git. But wait, I thought we agreed that mixing package managers is bad. If you do it anyhow, you just broke your middle environment.
The same goes for trying out code from a PR, etc.
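This is exactly what the inner virtualenv is for: you can install the unreleased fix there without touching the conda environments. A minimal sketch (the repository URL and branch are placeholders):
source .venv/bin/activate
# installs into the inner venv only; the conda environments stay intact
pip install "git+https://github.com/SOME_ORG/X.git@master"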
Note: this setup is not needed on Linux, although you should still install pipx and the associated libraries.
- Have a clear separation of the base specific customizations -- achieved via `.bashrc.base` (a minimal sketch follows this list).
- Ditch the built-in conda installation, which is outdated and has its own `base` environment that is not editable by normal users.
- Use micromamba, which is currently the "best" way of installing/using conda.
- Setup micromamba to install packages/environments under `/scratch` instead of `$HOME`. The problem with `$HOME` is that it is under a network filesystem and unpacking files can be slow. The downside is that there are no backups on scratch, but IMHO environments should be treated as volatile and we should try really hard to make our dev environments reproducible anyhow (hint: use `git`; there are no excuses not to).
- Auto activate the `base` conda env, in order to ensure that we have all the basic utilities we need installed AND a recent python version (i.e. 3.11).
- Use `pipx` to install python utilities in isolated virtual environments. The reason we don't add them to the `base` conda environment is that we want to keep it as lightweight as possible in order to make updates easier.
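The guide does not spell out what goes into `.bashrc.base`; one possible layout (purely an assumption, for illustration) is to keep the base specific tweaks in that file and source it from `~/.bashrc`:
# in ~/.bashrc
[ -f ~/.bashrc.base ] && source ~/.bashrc.base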
- Create `~/.local/bin` and add it to `PATH`
mkdir -p ~/.local/bin
cat << 'EOF' >> ~/.bashrc
# Start PATH
PATH=~/.local/bin:"${PATH}"
export PATH
# End PATH
EOF
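To confirm that the new `PATH` entry is picked up (a quick check):
source ~/.bashrc
echo "${PATH}" | tr ':' '\n' | grep 'local/bin'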
- Install micromamba
wget --output-document ~/.local/bin/micromamba https://github.com/mamba-org/micromamba-releases/releases/download/1.5.1-1/micromamba-linux-64
chmod +x ~/.local/bin/micromamba
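A quick check that the binary works:
~/.local/bin/micromamba --version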
- Setup a minimal `~/.condarc`
cat << EOF > ~/.condarc
auto_activate_base: false
auto_stack: false
channel_priority: strict
channels:
- tomsail
- gbrey
- conda-forge
EOF
- Setup conda
cat << 'EOF' >> ~/.bashrc
# Start micromamba
MAMBA_EXE=$(which micromamba)
if [[ -d /scratch ]]; then
    mkdir -p /scratch/$(whoami)
    MAMBA_ROOT_PREFIX="${MAMBA_ROOT_PREFIX:-/scratch/$(whoami)/.micromamba}"
else
    MAMBA_ROOT_PREFIX="${MAMBA_ROOT_PREFIX:-"${HOME}"/.micromamba}"
fi
echo "mamba exe at : ${MAMBA_EXE}"
echo "mamba prefix at: ${MAMBA_ROOT_PREFIX}"
mkdir -p "${MAMBA_ROOT_PREFIX}"
export MAMBA_EXE
export MAMBA_ROOT_PREFIX
eval "$(micromamba shell hook)"
# Since micromamba is not installing the conda and the mamba binaries,
# let's pretend that we have them!
conda() { micromamba "${@}"; }
mamba() { micromamba "${@}"; }
# End micromamba
EOF
- Start a new shell
bash
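At this point the shell hook and the wrapper functions defined above should be in place (a quick check):
type conda                     # should report that conda is a function wrapping micromamba
echo "${MAMBA_ROOT_PREFIX}"    # should point to /scratch/<user>/.micromamba (or ~/.micromamba)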
- Create the `base` conda environment. Add/remove packages as necessary. Do note that here we install stuff that we would normally install using `apt`.
Note: I install `eccodes` here because it is a dependency for `inspectds`, which we will install later using `pipx`.
conda create \
--yes \
--name base \
python=3.11 \
bat \
direnv \
eccodes \
stow \
ncdu \
tree \
vim \
;
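You can verify that the environment resolved to the expected python (a quick check):
conda activate base
python --version    # expected: Python 3.11.x
which python        # should live under the micromamba root prefix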
- Auto activate the base environment
echo 'conda activate base' >> ~/.bashrc
- Start a new shell.
bash
- Install pipx
We override the default `PIPX_HOME` in order to force pipx to create the virtualenvs under `/scratch` instead of `$HOME`, because the latency under home is really bad.
pip install --user pipx
mkdir -p /scratch/$(whoami)/.pipx/
echo "export PIPX_HOME=/scratch/$(whoami)/.pipx/" >> ~/.bashrc
- Start a new shell (or `source ~/.bashrc`) so that `PIPX_HOME` takes effect, then install the pure-python utilities
pipx install poetry
pipx install pre-commit
pipx install azure-cli
pipx install 'inspectds[grib]'
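To confirm that the utilities ended up in isolated environments under the new `PIPX_HOME` (a quick check):
pipx list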
For each project that you work on, you go to `/scratch`:
mkdir -p /scratch/$(whoami)
cd /scratch/$(whoami)
you create a project directory:
export PROJ_NAME="my_project"
mkdir -p /scratch/$(whoami)/"${PROJ_NAME}"
cd /scratch/$(whoami)/"${PROJ_NAME}"
you clone the repo of your code or create a new repo:
git clone https://github.com/...
# OR
git init
you create the middle conda environment where you install the binary dependencies of your project:
conda create \
--name "${PROJ_NAME}"_env \
python=3.11 \
gdal \
proj \
geos \
eccodes \
;
# add any other binary dependencies your project needs
You activate the conda env by stacking it on top of the `base` env:
conda activate --stack "${PROJ_NAME}"_env
Using the python of the `"${PROJ_NAME}"_env` environment, you create a virtual environment and you install the python dependencies there:
python -mvenv .venv
source .venv/bin/activate
# Followed by
poetry install
# OR
pip install -r requirements.txt
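If everything is wired up correctly, python now comes from the inner venv while the binary libraries come from the stacked conda environments (a quick check):
which python              # should point to .venv/bin/python
echo "${CONDA_PREFIX}"    # should point to the "${PROJ_NAME}"_env prefix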
If you get an error like the following:
[org.freedesktop.DBus.Error.Spawn.ExecFailed] ('Failed to execute program org.freedesktop.secrets: Operation not permitted',)
the fix is the following:
export PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring
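To avoid re-exporting it in every new shell, you can persist the variable (a suggestion, not part of the original steps):
echo 'export PYTHON_KEYRING_BACKEND=keyring.backends.null.Keyring' >> ~/.bashrc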
The full steps for activating the environment are:
conda activate base
conda activate --stack "${PROJ_NAME}"_env
source /scratch/$(whoami)/"${PROJ_NAME}"/.venv/bin/activate
Obviously typing all that is a PITA. I suggest automating this activation either by placing it in a file which you can source, or by using `direnv`, which should already be installed in the `base` environment. We recommend using `direnv`.
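For example, you can keep the activation steps in a small file at the root of the project and source it (a minimal sketch; the file name `activate.sh` and the project name are placeholders):
# activate.sh -- source this file, do not execute it
export PROJ_NAME="my_project"
conda activate base
conda activate --stack "${PROJ_NAME}"_env
source /scratch/$(whoami)/"${PROJ_NAME}"/.venv/bin/activate
With `direnv` the same lines can go into an `.envrc`, but note that direnv evaluates `.envrc` in a non-interactive bash, so the micromamba shell hook has to be set up there as well.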
Note: This guide was initially written by @pmav99 for dealing with restricted Linux environments (without sudo permissions).