docs: documenting using pymapdl on clusters #3466

Merged
Changes from 17 commits
Commits
43 commits
ce11a7e
feat: adding env vars needed for multinode
germa89 Oct 4, 2024
61ad61b
feat: adding env vars needed for multinode
germa89 Oct 4, 2024
b50eeb6
Merge branch 'feat/passing-tight-integration-env-vars-to-MAPDL' of ht…
germa89 Oct 4, 2024
e9b91d4
feat: renaming hpc detection argument
germa89 Oct 7, 2024
c714d39
docs: adding documentation
germa89 Oct 7, 2024
492345b
chore: adding changelog file 3466.documentation.md
pyansys-ci-bot Oct 7, 2024
a289dab
feat: adding env vars needed for multinode
germa89 Oct 4, 2024
604bbf8
feat: renaming hpc detection argument
germa89 Oct 7, 2024
1d29651
docs: adding documentation
germa89 Oct 7, 2024
96929a8
chore: adding changelog file 3466.documentation.md
pyansys-ci-bot Oct 7, 2024
9b8e0e9
Merge branch 'feat/passing-tight-integration-env-vars-to-MAPDL' of ht…
germa89 Oct 7, 2024
6ab1d65
fix: vale issues
germa89 Oct 7, 2024
e45d2e5
chore: To fix sphinx build
germa89 Oct 7, 2024
bb2b90a
docs: expanding a bit troubleshooting advices and small format fix
germa89 Oct 7, 2024
330f33c
docs: fix vale
germa89 Oct 7, 2024
26f6dbd
Merge branch 'feat/passing-tight-integration-env-vars-to-MAPDL' of ht…
germa89 Oct 7, 2024
ac54f2c
fix: nproc tests
germa89 Oct 7, 2024
6985ee4
feat: adding env vars needed for multinode
germa89 Oct 4, 2024
03a05e6
feat: renaming hpc detection argument
germa89 Oct 7, 2024
d9e3b0d
docs: adding documentation
germa89 Oct 7, 2024
34bcfc4
chore: adding changelog file 3466.documentation.md
pyansys-ci-bot Oct 7, 2024
3bc1cc6
fix: vale issues
germa89 Oct 7, 2024
0f1606b
docs: fix vale
germa89 Oct 7, 2024
89552c9
docs: expanding a bit troubleshooting advices and small format fix
germa89 Oct 7, 2024
c3c6506
fix: nproc tests
germa89 Oct 7, 2024
db963c4
revert: "chore: To fix sphinx build"
germa89 Oct 7, 2024
7b386d0
chore: Merge branch 'feat/passing-tight-integration-env-vars-to-MAPDL…
germa89 Oct 7, 2024
1e31519
docs: clarifying where everything is running.
germa89 Oct 7, 2024
f8177a1
Merge branch 'main' into feat/passing-tight-integration-env-vars-to-M…
germa89 Oct 7, 2024
5c7967c
docs: expanding bash example
germa89 Oct 8, 2024
880a6b8
tests: fix
germa89 Oct 15, 2024
3cd005c
chore: merge remote-tracking branch 'origin/main' into feat/passing-t…
germa89 Oct 17, 2024
7514c31
docs: adding `PYMAPDL_NPROC` to env var section
germa89 Oct 17, 2024
fdf00d1
docs: fix vale issue
germa89 Oct 17, 2024
4aa477d
docs: fix vale issue
germa89 Oct 17, 2024
4dadc1d
fix: replacing env var name
germa89 Oct 17, 2024
e90a9cb
Merge branch 'main' into feat/passing-tight-integration-env-vars-to-M…
germa89 Oct 21, 2024
60bf932
fix: unit tests
germa89 Oct 21, 2024
d027edd
chore: adding changelog file 3466.documentation.md [dependabot-skip]
pyansys-ci-bot Oct 21, 2024
0bb2f81
Apply suggestions from code review
germa89 Oct 23, 2024
4231a2e
docs: apply suggestions from code review made by Kathy
germa89 Oct 23, 2024
300446e
docs: adding Kathy suggestion.
germa89 Oct 23, 2024
d23db7b
Merge branch 'feat/passing-tight-integration-env-vars-to-MAPDL' of ht…
germa89 Oct 23, 2024
1 change: 1 addition & 0 deletions .github/workflows/ci.yml
@@ -34,6 +34,7 @@ env:
MAPDL_PACKAGE: ghcr.io/ansys/mapdl
ON_CI: True
PYTEST_ARGUMENTS: '-vvv -ra --durations=10 --maxfail=3 --reruns 3 --reruns-delay 4 --cov=ansys.mapdl.core --cov-report=html'
BUILD_CHEATSHEET: True

# Following env vars when changed will "reset" the mentioned cache,
# by changing the cache file name. It is rendered as ...-v%RESET_XXX%-...
1 change: 1 addition & 0 deletions doc/changelog.d/3466.documentation.md
@@ -0,0 +1 @@
feat: passing tight integration env vars to mapdl
1 change: 1 addition & 0 deletions doc/changelog.d/3468.fixed.md
@@ -0,0 +1 @@
fix: add ``build cheatsheet`` as env variable within doc-build
10 changes: 7 additions & 3 deletions doc/source/conf.py
@@ -311,13 +311,17 @@
"json_url": f"https://{cname}/versions.json",
"version_match": switcher_version,
},
"cheatsheet": {
}

BUILD_CHEATSHEET = os.environ.get("BUILD_CHEATSHEET", "true").lower() == "true"

if BUILD_CHEATSHEET:
html_theme_options["cheatsheet"] = {
"file": "cheat_sheet/cheat_sheet.qmd",
"title": "PyMAPDL cheat sheet",
"version": f"v{version}",
"pages": ["getting_started/learning"],
},
}
}

html_context = {
"display_github": True, # Integrate GitHub
2 changes: 1 addition & 1 deletion doc/source/examples/extended_examples/hpc/hpc_ml_ga.rst
@@ -251,7 +251,7 @@ this script.

If you have problems when creating the virtual environment
or accessing it from the compute nodes,
see :ref:`ref_hpc_pymapdl_job`.
see :ref:`ref_hpc_troubleshooting`.

3. Install the requirements for this example from the
:download:`requirements.txt <requirements.txt>` file.
169 changes: 127 additions & 42 deletions doc/source/user_guide/hpc/pymapdl.rst
@@ -1,84 +1,169 @@
.. _ref_hpc_pymapdl:

.. _ref_hpc_pymapdl_job:

=============================
PyMAPDL on SLURM HPC clusters
=============================
=======================
PyMAPDL on HPC Clusters
=======================

.. _ref_hpc_pymapdl_job:

Submit a PyMAPDL job
====================
Introduction
============

To submit a PyMAPDL job, you must create two files:
PyMAPDL communicates with MAPDL using the gRPC protocol.
This protocol offers many advantages and features. For more information,
see :ref:`ref_project_page`.
One of these features is that the PyMAPDL and MAPDL processes are not
required to run on the same machine.
This possibility opens the door to many configurations, depending
on whether or not you run both of them on the HPC compute nodes.
Additionally, you might be able to interact with them (``interactive`` mode)
or not (``batch`` mode).

- Python script with the PyMAPDL code
- Bash script that activates the virtual environment and calls the Python script
Currently, the supported configurations are:

* :ref:`ref_pymapdl_batch_in_cluster_hpc`


Since v0.68.5, PyMAPDL can take advantage of the tight integration
between the scheduler and MAPDL to read the job configuration and
launch an MAPDL instance that can use all the resources allocated
to that job.
For instance, if a SLURM job has allocated 8 nodes with 4 cores each,
then PyMAPDL launches an MAPDL instance that uses 32 cores
spanning those 8 nodes.
This behavior can be turned off by setting the environment variable
:envvar:`PYMAPDL_ON_SLURM` or by passing the argument ``detect_HPC=False``
to :func:`launch_mapdl() <ansys.mapdl.core.launcher.launch_mapdl>`.
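The detection described above can be sketched from SLURM's standard environment variables. The helper below is hypothetical and for illustration only (PyMAPDL's actual logic inside ``launch_mapdl()`` is more involved), but it shows how the total core count follows from the job allocation:

```python
import os


def slurm_total_cores(env=None):
    """Total cores allocated to a SLURM job, computed from the standard
    SLURM_JOB_NUM_NODES and SLURM_CPUS_ON_NODE environment variables.

    Hypothetical helper for illustration; not part of PyMAPDL's API.
    """
    env = os.environ if env is None else env
    nodes = env.get("SLURM_JOB_NUM_NODES")
    cpus_per_node = env.get("SLURM_CPUS_ON_NODE")
    if nodes and cpus_per_node:
        return int(nodes) * int(cpus_per_node)
    return None  # not running inside a SLURM job


# The example from the text: 8 nodes with 4 cores each.
print(slurm_total_cores({"SLURM_JOB_NUM_NODES": "8", "SLURM_CPUS_ON_NODE": "4"}))  # -> 32
```

Inside a real SLURM job, calling ``slurm_total_cores()`` with no argument would read the variables that the scheduler exports into the job environment.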


.. _ref_pymapdl_batch_in_cluster_hpc:

Submit a PyMAPDL batch job to the cluster from the entrypoint node
==================================================================

Many HPC clusters allow their users to log in to a machine using
``ssh``, ``vnc``, ``rdp``, or similar technologies and submit a job
to the cluster from there.
This entrypoint machine, sometimes known as the *head node* or *entrypoint node*,
might be a virtual machine (VDI/VM).

In such cases, once the Python virtual environment with PyMAPDL is
set up and accessible to all the compute nodes, launching a
PyMAPDL job is straightforward using the ``sbatch`` command.
No changes are needed to a PyMAPDL script to run it on a SLURM cluster.

First, activate the virtual environment in the current terminal.

.. code-block:: console

user@entrypoint-machine:~$ export VENV_PATH=/my/path/to/the/venv
user@entrypoint-machine:~$ source $VENV_PATH/bin/activate

**Python script:** ``pymapdl_script.py``
Once the virtual environment has been activated, you can launch any Python
script that has the proper Python shebang (``#!/usr/bin/env python3``).

For instance, to launch the following Python script ``main.py``:

.. code-block:: python
:caption: main.py

#!/usr/bin/env python3

from ansys.mapdl.core import launch_mapdl

# Number of processors must be lower than the
# number of CPUs allocated for the job.
mapdl = launch_mapdl(nproc=10)
mapdl = launch_mapdl(run_location="/home/ubuntu/tmp/tmp/mapdl", loglevel="debug")

mapdl.prep7()
n_proc = mapdl.get_value("ACTIVE", 0, "NUMCPU")
print(f"Number of CPUs: {n_proc}")
print(mapdl.prep7())
print(f'Number of CPU: {mapdl.get_value("ACTIVE", 0, "NUMCPU")}')

mapdl.exit()

You can then run it in your console:

**Bash script:** ``job.sh``

.. code-block:: bash
.. code-block:: console

source /home/user/.venv/bin/activate
python pymapdl_script.py
(venv) user@entrypoint-machine:~$ sbatch main.py

To start the simulation, you use this code:
Alternatively, you can remove the shebang from the Python file and
call the Python executable explicitly:

.. code-block:: console

user@machine:~$ srun job.sh
(venv) user@entrypoint-machine:~$ sbatch --wrap="python main.py"

Additionally, you can change the number of cores used by your
job by setting the :envvar:`PYMAPDL_NPROC` environment variable to the desired value.

.. code-block:: console

(venv) user@entrypoint-machine:~$ PYMAPDL_NPROC=4 sbatch main.py
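The override above can be pictured with a small sketch. The helper name ``resolve_nproc`` is hypothetical; PyMAPDL performs a similar environment-variable lookup internally when deciding how many cores to request:

```python
import os


def resolve_nproc(default=2):
    """Return the core count, preferring the PYMAPDL_NPROC environment
    variable when it is set.

    Hypothetical helper for illustration; not part of PyMAPDL's API.
    """
    value = os.environ.get("PYMAPDL_NPROC")
    return int(value) if value else default


# Mimics what `PYMAPDL_NPROC=4 sbatch main.py` does for the job environment.
os.environ["PYMAPDL_NPROC"] = "4"
print(resolve_nproc())  # -> 4
```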

The bash script allows you to customize the environment before running the Python script.
This bash script performs such tasks as creating environment variables, moving to
different directories, and printing to ensure your configuration is correct. However,
this bash script is not mandatory.
You can avoid having the ``job.sh`` bash script if the virtual environment is activated
and you pass all the environment variables to the job:
You can also add ``sbatch`` options to the command:

.. code-block:: console

user@machine:~$ source /home/user/.venv/bin/activate
(.venv) user@machine:~$ srun python pymapdl_script.py --export=ALL
(venv) user@entrypoint-machine:~$ PYMAPDL_NPROC=4 sbatch main.py


The ``--export=ALL`` argument might not be needed, depending on the cluster configuration.
Furthermore, you can omit the Python call in the preceding command if you include the
Python shebang (``#!/usr/bin/python3``) in the first line of the ``pymapdl_script.py`` script.
For instance, to launch a PyMAPDL job that starts a four-core MAPDL instance
on a 10-CPU SLURM job, you can use:

.. code-block:: console

user@machine:~$ source /home/user/.venv/bin/activate
(.venv) user@machine:~$ srun pymapdl_script.py --export=ALL
(venv) user@entrypoint-machine:~$ PYMAPDL_NPROC=4 sbatch --partition=qsmall --nodes=10 --ntasks-per-node=1 main.py


Using a submission script
-------------------------

If you need to customize your job further, you can create a SLURM
submission script to submit a PyMAPDL job.
In this case, you must create two files:

- Python script with the PyMAPDL code
- Bash script that activates the virtual environment and calls the
Python script.

.. code-block:: python
:caption: main.py

from ansys.mapdl.core import launch_mapdl

# Number of processors must be lower than the
# number of CPU allocated for the job.
mapdl = launch_mapdl(nproc=10)

mapdl.prep7()
n_proc = mapdl.get_value("ACTIVE", 0, "NUMCPU")
print(f"Number of CPU: {n_proc}")

mapdl.exit()

If you prefer to run the job in the background, you can use the ``sbatch``
command instead of the ``srun`` command. However, in this case, the bash script is needed:

.. code-block:: bash
:caption: job.sh

source /home/user/.venv/bin/activate
python main.py

To start the simulation, you use this code:

.. code-block:: console

user@machine:~$ sbatch job.sh
Submitted batch job 1

Here is the expected output of the job:
In this case, the Python virtual environment does not need to be activated
before submission since it is activated later in the script.

The expected output of the job is:

.. code-block:: text

Number of CPUs: 10.0
Number of CPU: 10.0


The bash script allows you to customize the environment before running the
Python script.
This bash script performs tasks such as creating environment variables,
moving files to different directories, and printing to ensure your
configuration is correct.
49 changes: 32 additions & 17 deletions doc/source/user_guide/hpc/settings.rst
@@ -7,14 +7,16 @@ Setting PyMAPDL
Requirements
============

Using PyMAPDL in an HPC environment managed by SLURM scheduler has certain requirements:
Using PyMAPDL in an HPC environment managed by SLURM scheduler has certain
requirements:

* **An Ansys installation must be accessible from all the compute nodes**.
This normally implies that the ``ANSYS`` installation directory is in a
shared drive or directory. Your HPC cluster administrator
should provide you with the path to the ``ANSYS`` directory.

* **A compatible Python installation must be accessible from all the compute nodes**.
* **A compatible Python installation must be accessible from all the compute
nodes**.
For compatible Python versions, see :ref:`ref_pymapdl_installation`.

Additionally, you must perform a few key steps to ensure efficient job
@@ -23,8 +25,8 @@ execution and resource utilization. Subsequent topics describe these steps.
Check the Python installation
=============================

The PyMAPDL Python package (``ansys-mapdl-core``) must be installed in a virtual
environment that is accessible from the compute nodes.
The PyMAPDL Python package (``ansys-mapdl-core``) must be installed in
a virtual environment that is accessible from the compute nodes.

To see where your Python distribution is installed, use this code:

@@ -40,9 +42,10 @@ To print the version of Python you have available, use this code:
user@machine:~$ python3 --version
Python 3.9.16

You should be aware that your machine might have installed other Python versions.
To find out if those installations are already in the ``PATH`` environment variable,
you can press the **Tab** key to use autocomplete:
You should be aware that your machine might have other Python versions
installed.
To find out if those installations are already in the ``PATH`` environment
variable, you can press the **Tab** key to use autocomplete:

.. code-block:: console

@@ -55,23 +58,34 @@ you can press the **Tab** key to use autocomplete:
You should use a Python version that is compatible with PyMAPDL.
For more information, see :ref:`ref_pymapdl_installation`.

The ``which`` command returns the path where the Python executable is installed.
You can use that executable to create your own Python virtual environment in a directory
that is accessible from all the compute nodes.
For most HPC clusters, the ``/home/$user`` directory is generally available to all nodes.
You can then create the virtual environment in the ``/home/user/.venv`` directory:
.. warning::

Contact your cluster administrator if you cannot find a Python version
compatible with PyMAPDL.


The ``which`` command returns the path where the Python executable is
installed.
You can use that executable to create your own Python virtual environment
in a directory that is accessible from all the compute nodes.
For most HPC clusters, the ``/home/$user`` directory is generally available
to all nodes.
You can then create the virtual environment in the ``/home/user/.venv``
directory:

.. code-block:: console

user@machine:~$ python3 -m venv /home/user/.venv

After activating the virtual environment, you can install PyMAPDL.
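Before installing, you can confirm that the interpreter you are running is the one from the shared virtual environment. This quick check is a generic Python sketch, not a PyMAPDL utility:

```python
import sys


def in_virtualenv():
    """Return True when the running interpreter belongs to a virtual
    environment (its prefix differs from the base installation's)."""
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)


print(sys.executable)   # should point inside /home/user/.venv when activated
print(in_virtualenv())
```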

.. _ref_install_pymapdl_on_hpc:

Install PyMAPDL
===============

To install PyMAPDL on the activated virtual environment, run the following commands:
To install PyMAPDL on the activated virtual environment, run the following
commands:

.. code-block:: console

@@ -107,14 +121,15 @@ then you can run that script using:

user@machine:~$ srun test.sh

This command might take a minute or two to complete, depending on the amount of free
resources available in the cluster.
This command might take a minute or two to complete, depending on the amount of
free resources available in the cluster.
On the console, you should see this output:

.. code-block:: text

Testing Python!
PyMAPDL version 0.68.1 was successfully imported.

If you see an error in the output, see :ref:`ref_hpc_troubleshooting`, especially
:ref:`ref_python_venv_not_accesible`.
If you see an error in the output, see :ref:`ref_hpc_troubleshooting`,
especially :ref:`ref_python_venv_not_accesible`.
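The check that the test script performs can also be done from Python itself. This snippet is a generic sketch (not part of PyMAPDL) that reports whether a package is importable without raising an exception:

```python
import importlib.util


def check_package(name):
    """Report whether ``name`` can be imported in this environment.

    Generic sketch of the check that the test script performs.
    """
    try:
        found = importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:  # a parent package is missing entirely
        found = False
    if found:
        return f"{name} imported successfully."
    return f"{name} is NOT importable; see the troubleshooting section."


print(check_package("ansys.mapdl.core"))
```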
