The AGILAB project explores AI for engineering. It is designed to help engineers quickly experiment with AI-driven methods.

Badges: PyPI version, supported Python versions, BSD 3-Clause license, PyPI downloads, CI, codecov, GitHub stars, black, docs, ORCID

AGILAB Open Source Project

AGILAB, released under the BSD 3-Clause license, is an MLOps toolchain for engineering, powered by the OpenAI API, local GPT-OSS, local Mistral-Instruct, and MLflow. It helps you move from notebooks to production with CLI tooling, optional IDE run configurations, and packaged workers. IDE integrations remain available for teams that rely on them, but they are no longer required.

Docs publishing

  • The static site is committed under docs/html and deployed by GitHub Pages directly (no Sphinx build in CI).
  • Preferred path: run docs/gen_docs.sh. It runs a Sphinx build if a Sphinx config exists; otherwise it syncs src/agilab/resources/help/ into docs/html and ensures an index.html is present.
  • CI will deploy the committed docs/html; if it’s empty, the workflow falls back to copying from src/agilab/resources/help/. See documentation.

See also: CHANGELOG.md for recent changes.

Audience profiles

  • End users install and launch packaged apps with uvx or the generated shell wrappers in tools/run_configs/; no repository checkout or IDE is required.
  • Developers clone this repository to build apps, regenerate run configurations (python3 tools/generate_runconfig_scripts.py), and extend the framework.

Install and Execution for end users

Quick run (no setup):

uvx -p 3.13 agilab

Note: This uvx invocation is meant for demos or smoke tests. Any changes you make inside the cached package will be overwritten on the next run. For development, clone the repository or use a virtual environment.

Offline assistant (GPT-OSS)

Prefer to stay offline? Start a local GPT-OSS responses server and switch the “Assistant engine” selector (in the Experiment page sidebar) to GPT-OSS (local):

python -m pip install "agilab[offline]"
python -m pip install transformers torch accelerate  # installs the local backends used by GPT-OSS
python -m gpt_oss.responses_api.serve --inference-backend transformers --checkpoint gpt2 --port 8000

Update the endpoint field if you expose the server on a different port. The sidebar now lets you pick the GPT-OSS backend (stub, transformers, ollama, …), checkpoint, and extra launch flags; the app persists the values in env.envars. The default stub backend is only a connectivity check and returns canned responses—switch to transformers (or another real backend) to receive model-generated code. When GPT-OSS is installed and the endpoint targets localhost, the sidebar automatically launches a local server the first time you switch to GPT-OSS (local) using the configured backend.
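
If you want to call the local server from your own scripts, the snippet below is a minimal sketch, assuming the GPT-OSS responses server exposes an OpenAI-compatible Responses endpoint under /v1 on the port chosen above; the base URL, placeholder API key, and example prompt are illustrative, not part of AGILAB's API.

# Minimal sketch: query a local GPT-OSS responses server with the OpenAI Python client.
# Assumes the server started above is reachable at http://localhost:8000/v1 and speaks
# the OpenAI Responses API; adjust base_url and model to match your launch flags.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local endpoint, no OpenAI account needed
    api_key="not-needed",                 # placeholder; local servers typically ignore the key
)

response = client.responses.create(
    model="gpt2",  # the --checkpoint passed to gpt_oss.responses_api.serve
    input="Write a one-line pandas snippet that counts rows per category.",
)
print(response.output_text)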

Managed workspace (project folder):

mkdir agi-space && cd agi-space
uv init --bare --no-workspace
uv add agilab
uv run agilab

CLI wrappers for run configurations

Every IDE run configuration now has a matching shell script under tools/run_configs/. Regenerate them at any time with:

python3 tools/generate_runconfig_scripts.py

The generator groups scripts under tools/run_configs/<group>/ (agilab, apps, components). Each wrapper exports the same environment variables, switches to the correct working directory, and executes the underlying uv command—no IDE required.

Install for developers

Linux and macOS
git clone https://github.com/ThalesGroup/agilab
cd agilab
./install.sh --openai-api-key "sk-your-api-key" --cluster-ssh-credentials "username[:password]"
Windows
git clone https://github.com/ThalesGroup/agilab
cd agilab
powershell.exe -ExecutionPolicy Bypass -File .\install.ps1 --openai-api-key "sk-your-api-key"

AGILab Execution

Linux, macOS, and Windows:

cd agilab/src/agilab
uv run agilab

Credits

  • Portions of src/agilab/apps-pages/view_barycentric/src/view_barycentric/view_barycentric.py are adapted from Jean-Luc Parouty’s “barviz / barviz-mod” project (MIT License). See NOTICE and LICENSES/LICENSE-MIT-barviz-mod for details.

Notes for developers

  • AgiEnv is a singleton. Use instance attributes (env.apps_dir, env.logger, etc.). Class attribute reads (e.g., AgiEnv.apps_dir) proxy to the singleton when initialised; methods/properties are not shadowed. A few helpers are pre‑init safe (AgiEnv.set_env_var, AgiEnv.read_agilab_path, AgiEnv._build_env, AgiEnv.log_info). See the Python sketch at the end of these notes.

  • Environment flags (these replace the legacy install_type):

    • env.is_source_env: true when running from a source checkout.
    • env.is_worker_env: true in worker-only contexts (e.g., wenv/*_worker).
    • env.is_local_worker: helper flag for home‑scoped worker layouts.
  • App constructors (templates + flight_project) ignore unknown kwargs when constructing their Pydantic Args models. This preserves strict validation while making constructors resilient to incidental extras. Configure verbosity via AgiEnv(verbose=…) or logging, not via app Args. A sketch of this pattern appears after these notes.
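
The access rules above can be summarised in a short Python sketch. It assumes a source checkout where AgiEnv(verbose=…) is enough to initialise the singleton; the import path is an assumption, and the printed values are illustrative.

# Sketch only: illustrates the AgiEnv access rules described in these notes.
# The import path and constructor arguments are assumptions; adapt them to your checkout.
from agi_env import AgiEnv  # assumed import path for the AgiEnv singleton

# Pre-init safe helpers can be called before the singleton exists
# (per the notes: AgiEnv.set_env_var, AgiEnv.read_agilab_path, AgiEnv.log_info, ...).
agilab_path = AgiEnv.read_agilab_path()

# Initialise the singleton; verbosity is configured here, not through app Args.
env = AgiEnv(verbose=1)

# Prefer instance attributes once the singleton is initialised.
env.logger.info("apps dir: %s", env.apps_dir)

# Environment flags replace the legacy install_type switch.
if env.is_source_env:
    print("running from a source checkout")
elif env.is_worker_env:
    print("running inside a worker-only context (e.g. wenv/*_worker)")

# Class attribute reads proxy to the singleton after initialisation,
# so AgiEnv.apps_dir and env.apps_dir resolve to the same value.
assert AgiEnv.apps_dir == env.apps_dir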

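The kwargs-tolerant constructor behaviour can be pictured with a small Pydantic sketch. FlightArgs and FlightApp below are hypothetical stand-ins, not AGILAB classes; they only show how unknown kwargs can be dropped before strict validation runs.

# Illustrative pattern only: strict Args validation plus a constructor that ignores extras.
# FlightArgs and FlightApp are hypothetical stand-ins for an app's Args model and wrapper.
from pydantic import BaseModel, ConfigDict

class FlightArgs(BaseModel):
    model_config = ConfigDict(extra="forbid")  # strict: unknown fields are rejected
    data_source: str = "file"
    nfile: int = 1

class FlightApp:
    def __init__(self, **kwargs):
        # Keep only the kwargs that FlightArgs declares; incidental extras are dropped,
        # so the remaining fields still go through strict Pydantic validation.
        known = {k: v for k, v in kwargs.items() if k in FlightArgs.model_fields}
        self.args = FlightArgs(**known)

# 'verbose' is silently ignored here; configure verbosity via AgiEnv(verbose=...) instead.
app = FlightApp(data_source="file", nfile=2, verbose=3)
print(app.args)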