Merged

18 commits
e490d0b
Add config.yml to .gitignore to prevent tracking of configuration files
wli51 Sep 18, 2025
6f818e1
Add config.yml.template
wli51 Sep 18, 2025
9965e3b
Add data wrangling script for DepMap PRISM dataset preprocessing
wli51 Sep 18, 2025
96565ea
Add test for wrangle_depmap_prism script execution
wli51 Sep 18, 2025
aba3db4
Add processed_depmap_prism_ic50.csv to .gitignore to prevent tracking…
wli51 Sep 18, 2025
c14df83
Add VSCode settings for Python environment and Jupyter notebook confi…
wli51 Sep 29, 2025
3433e20
Add utility function for detecting if running in notebook, make a sep…
wli51 Sep 29, 2025
f2f9d0d
Elevate IN_NOTEBOOK detection to a utils module of an importable sub…
wli51 Sep 29, 2025
ace7201
Move notebook under subdir notebook
wli51 Sep 29, 2025
cfe2bdd
Update default markers in pathing utility to include .env and LICENSE
wli51 Sep 29, 2025
8764ad9
Add pyproject.toml for nbutils package configuration
wli51 Sep 29, 2025
2127824
Add script to convert Jupyter notebooks to Python scripts
wli51 Sep 29, 2025
929dbf7
Update README.md with project setup instructions for notebook and scr…
wli51 Sep 29, 2025
1f4c740
Use external json schema for config yml validation
wli51 Sep 30, 2025
cca528e
Make data wrangling notebook always save plots and only skip showing …
wli51 Sep 30, 2025
6e5ebd9
Refactor test script path and improve assertion error message formatting
wli51 Oct 2, 2025
39e4e45
Update README.md with configuration requirements and analysis noteboo…
wli51 Oct 2, 2025
3791ad5
Update analysis README.md to rename notebook and improve formatting
wli51 Oct 2, 2025
6 changes: 6 additions & 0 deletions .gitignore
@@ -1,3 +1,9 @@
# Big preprocessing output
/data/processed/processed_depmap_prism_ic50.csv

# Actual config.yml
config.yml
> **Review comment:** Consider adding this file's details to the readme.


# Byte-compiled / optimized / DLL files
__pycache__/
*.py[codz]
9 changes: 9 additions & 0 deletions .vscode/settings.json
@@ -0,0 +1,9 @@
// .vscode/settings.json
{
  "python.envFile": "${workspaceFolder}/.env",
  "python.analysis.extraPaths": [
    "agentic_system/src",
    "analysis/src"
  ],
  "jupyter.notebookFileRoot": "${workspaceFolder}"
}
49 changes: 46 additions & 3 deletions analysis/README.md
@@ -7,12 +7,55 @@ agentic system defined in [`agentic_system/`](../agentic_system/).

## Experiment/Analysis Overview

### Configuration

All notebooks require a `config.yml` file at the project root.
This file specifies data paths and API credentials needed for the analysis.

Please refer to `config.yml.template` for guidance on how the file should be formatted.
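
For illustration only, a template along these lines could declare the data paths and credentials. Every key below is hypothetical — defer to the actual `config.yml.template` in the repo:

```yaml
# Hypothetical sketch of a config.yml.template — copy to config.yml and fill in.
data:
  raw_dir: data/raw
  processed_dir: data/processed
api:
  api_key: "YOUR_API_KEY_HERE"
```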

### `0.data_wrangling`
- `0.1.WRANGLE_DEPMAP_PRISM_DATA.ipynb`

Preprocesses the raw DepMap PRISM secondary drug repurposing dataset to produce a clean, deduplicated table of drug-cell line-IC50 values.
The script handles deduplication of overlapping entries between the HTS002 and MTS010 screens, prioritizing MTS010 results and highest-quality curve fits (r²).

**Output:** `data/processed/processed_depmap_prism_ic50.csv` - cleaned dataset with unique (cell line, drug) combinations ready for downstream analysis.
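
The deduplication rule described above (prefer MTS010 over HTS002, then the best curve fit) can be sketched with pandas. Column names such as `screen_id`, `r2`, and `ic50` are assumptions for illustration, not the actual DepMap PRISM schema:

```python
import pandas as pd

# Toy rows mimicking overlapping HTS002/MTS010 entries (schema assumed).
df = pd.DataFrame({
    "cell_line": ["A", "A", "B", "B"],
    "drug":      ["d1", "d1", "d2", "d2"],
    "screen_id": ["HTS002", "MTS010", "HTS002", "HTS002"],
    "r2":        [0.90, 0.85, 0.70, 0.95],
    "ic50":      [1.2, 1.1, 3.4, 3.0],
})

# Rank MTS010 above HTS002, then by curve-fit quality (r^2), and keep the
# top-ranked row per (cell line, drug) pair.
df["screen_priority"] = (df["screen_id"] == "MTS010").astype(int)
deduped = (
    df.sort_values(["screen_priority", "r2"], ascending=False)
      .drop_duplicates(subset=["cell_line", "drug"], keep="first")
      .drop(columns="screen_priority")
)
```

Here the (A, d1) pair keeps its MTS010 row even though the HTS002 fit has a higher r², because screen priority is sorted first.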

---

## Usage

1. Make sure you’ve installed the infrastructure package (from repo root):

> ⚙️ **Project Setup Note**
>
> To run the analysis notebooks or nbconverted scripts correctly, some project setup is required:
>
> - **Notebook mode (VS Code / Jupyter)**
> - Ensure `.vscode/settings.json` contains a `python.envFile` pointing to a `.env` that sets the `PYTHONPATH` for both partitions:
> ```json
> {
> "python.envFile": "${workspaceFolder}/.env",
> "python.analysis.extraPaths": [
> "agentic_system/src",
> "analysis/src"
> ],
> "jupyter.notebookFileRoot": "${workspaceFolder}"
> }
> ```
> - Example `.env` (at the repo root):
> ```
> PYTHONPATH=agentic_system/src:analysis/src:${PYTHONPATH}
> ```
>
> - **Script mode (running nbconvert-generated `.py`)**
> - Perform an **editable install** of the notebook utilities once:
> ```bash
> pip install -e ./analysis
> # or: uv pip install -e ./analysis
> ```
> - This makes the `nbutils` package importable from the scripts:
> ```python
> from nbutils.pathing import project_file
> config_path = project_file("config.yml")
> ```
51 changes: 51 additions & 0 deletions analysis/nbconvert_all.sh
@@ -0,0 +1,51 @@
#!/usr/bin/env bash
set -euo pipefail

# Resolve repo root
if git rev-parse --show-toplevel &>/dev/null; then
    REPO_ROOT="$(git rev-parse --show-toplevel)"
else
    # Fallback: repo root is parent of this script's directory twice
    REPO_ROOT="$(realpath "$(dirname "$0")"/..)"
fi

NB_ROOT="$REPO_ROOT/analysis/notebooks"
OUT_ROOT="$REPO_ROOT/analysis/scripts"

# Optional --force flag to rebuild everything
FORCE=0
if [[ "${1:-}" == "--force" ]]; then
    FORCE=1
fi

echo "Repo root: $REPO_ROOT"
echo "Notebook root: $NB_ROOT"
echo "Output root: $OUT_ROOT"
echo

mkdir -p "$OUT_ROOT"

# Find all notebooks (skip checkpoints)
while IFS= read -r -d '' nb; do
    rel="${nb#$NB_ROOT/}"                   # path relative to NB_ROOT
    dest_dir="$OUT_ROOT/$(dirname "$rel")"  # mirror folder structure
    base="$(basename "$rel" .ipynb)"        # filename without .ipynb
    out_path="$dest_dir/$base.py"

    mkdir -p "$dest_dir"

    if [[ $FORCE -eq 0 && -f "$out_path" && "$nb" -ot "$out_path" ]]; then
        echo "Up to date: $rel"
        continue
    fi

    echo "Converting: $rel -> ${out_path#$REPO_ROOT/}"
    jupyter nbconvert \
        --to script \
        --output-dir "$dest_dir" \
        "$nb"

done < <(find "$NB_ROOT" -type f -name '*.ipynb' -not -path '*/.ipynb_checkpoints/*' -print0)

echo
echo "Done."