Commit 0e43b28

Merge branch 'dev' into support-decollate-batch-numpy-scalars
2 parents: e649f3a + 53382d8

35 files changed: +490 −133 lines

CHANGELOG.md

Lines changed: 34 additions & 1 deletion

@@ -5,6 +5,38 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).
 ## [Unreleased]
 
+## [1.5.1] - 2025-09-22
+
+## What's Changed
+### Added
+* PyTorch 2.7 and 2.8 support (#8429, #8530)
+* Create SECURITY.md (#8546)
+* Add kwargs in array and functional file (#8508)
+* Add .coderabbit.yaml File (#8513)
+* Add input validation to ImageStats class (#8501)
+* Add support for optional conditioning in PatchInferer, SliceInferer, and SlidingWindowInferer (#8400)
+* Add classifier free guidance unconditioned value (#8562)
+* Improved `DiffusionModelEncoder` to support output linear layers of different dimensions (#8578, #8580)
+
+### Fixed
+* Fix for insecure zip file extraction to address [GHSA-x6ww-pf9m-m73m](https://github.com/Project-MONAI/MONAI/security/advisories/GHSA-x6ww-pf9m-m73m) (#8568)
+* Fix for insecure use of `torch.load` and `pickle` to address [GHSA-6vm5-6jv9-rjpj](https://github.com/Project-MONAI/MONAI/security/advisories/GHSA-6vm5-6jv9-rjpj) and [GHSA-p8cm-mm2v-gwjm](https://github.com/Project-MONAI/MONAI/security/advisories/GHSA-p8cm-mm2v-gwjm) (#8566)
+* Torchvision fix for loading pretrained weights using current syntax (#8563)
+* Fix bug in MAISI vae (#8517)
+* Throw exception on invalid images in retinanet detector (#8515)
+* Fix: HistogramNormalized doc (#8543)
+* Fix build failure by pinning pyamg to versions below 5.3.0 (#8548)
+* Fix hardcoded input dim in DiffusionModelEncoder (#8514)
+* Fix for gdown downloading fails (#8576)
+
+### Changed
+* Update README badges to add research paper citations number (#8494)
+* CI: Add custom timeout to ci job in order to save resources (#8504)
+* Improve documentation on the datalist format (#8539)
+* Tests Cleanup and refactor (#8405, #8535)
+* Improve Orientation transform to use the "space" (LPS vs RAS) of a metatensor by default (#8473)
+* Updated supported version of Huggingface Transformers (#8574)
+
 ## [1.5.0] - 2025-06-13
 
 ## What's Changed

@@ -1229,7 +1261,8 @@ the postprocessing steps should be used before calling the metrics methods
 
 [highlights]: https://github.com/Project-MONAI/MONAI/blob/master/docs/source/highlights.md
 
-[Unreleased]: https://github.com/Project-MONAI/MONAI/compare/1.5.0...HEAD
+[Unreleased]: https://github.com/Project-MONAI/MONAI/compare/1.5.1...HEAD
+[1.5.1]: https://github.com/Project-MONAI/MONAI/compare/1.5.0...1.5.1
 [1.5.0]: https://github.com/Project-MONAI/MONAI/compare/1.4.0...1.5.0
 [1.4.0]: https://github.com/Project-MONAI/MONAI/compare/1.3.2...1.4.0
 [1.3.2]: https://github.com/Project-MONAI/MONAI/compare/1.3.1...1.3.2

CITATION.cff

Lines changed: 2 additions & 2 deletions

@@ -6,8 +6,8 @@ title: "MONAI: Medical Open Network for AI"
 abstract: "AI Toolkit for Healthcare Imaging"
 authors:
 - name: "MONAI Consortium"
-date-released: 2025-06-13
-version: "1.5.0"
+date-released: 2025-09-22
+version: "1.5.1"
 identifiers:
 - description: "This DOI represents all versions of MONAI, and will always resolve to the latest one."
   type: doi

SECURITY.md

Lines changed: 18 additions & 0 deletions

@@ -0,0 +1,18 @@
+# Security Policy
+
+## Reporting a Vulnerability
+MONAI takes security seriously and appreciates your efforts to responsibly disclose vulnerabilities. If you discover a security issue, please report it as soon as possible.
+
+To report a security issue:
+* Please use the GitHub Security Advisories tab to "[Open a draft security advisory](https://github.com/Project-MONAI/MONAI/security/advisories/new)".
+* Include a detailed description of the issue, steps to reproduce, potential impact, and any possible mitigations.
+* If applicable, please also attach proof-of-concept code or screenshots.
+* We aim to acknowledge your report within 72 hours and provide a status update as we investigate.
+* Please do not create public issues for security-related reports.
+
+## Disclosure Policy
+* We follow a coordinated disclosure approach.
+* We will not publicly disclose vulnerabilities until a fix has been developed and released.
+* Credit will be given to researchers who responsibly disclose vulnerabilities, if requested.
+## Acknowledgements
+We greatly appreciate contributions from the security community and strive to recognize all researchers who help keep MONAI safe.

docs/requirements.txt

Lines changed: 1 addition & 1 deletion

@@ -21,7 +21,7 @@ sphinxcontrib-serializinghtml
 sphinx-autodoc-typehints==1.11.1
 pandas
 einops
-transformers>=4.36.0, <4.41.0; python_version <= '3.10'
+transformers>=4.53.0
 mlflow>=2.12.2
 clearml>=1.10.0rc0
 tensorboardX

docs/source/whatsnew.rst

Lines changed: 1 addition & 0 deletions

@@ -6,6 +6,7 @@ What's New
 .. toctree::
     :maxdepth: 1
 
+    whatsnew_1_5_1.md
     whatsnew_1_5.md
     whatsnew_1_4.md
     whatsnew_1_3.md

docs/source/whatsnew_1_5.md

Lines changed: 1 addition & 1 deletion

@@ -1,5 +1,5 @@
 
-# What's new in 1.5 🎉🎉
+# What's new in 1.5
 
 - Support numpy 2.x and Pytorch 2.6
 - MAISI inference accelerate

docs/source/whatsnew_1_5_1.md

Lines changed: 12 additions & 0 deletions

@@ -0,0 +1,12 @@
+
+# What's new in 1.5.1 🎉🎉
+
+This is a minor update for MONAI to address security concerns and improve compatibility with the newest PyTorch release.
+
+With the upgraded support for PyTorch 2.8, MONAI now directly supports NVIDIA GeForce RTX 50 series GPUs and other Blackwell-based GPUs!
+
+- Support up to PyTorch 2.8.
+- Security fixes to address advisories [GHSA-x6ww-pf9m-m73m](https://github.com/Project-MONAI/MONAI/security/advisories/GHSA-x6ww-pf9m-m73m), [GHSA-6vm5-6jv9-rjpj](https://github.com/Project-MONAI/MONAI/security/advisories/GHSA-6vm5-6jv9-rjpj), and [GHSA-p8cm-mm2v-gwjm](https://github.com/Project-MONAI/MONAI/security/advisories/GHSA-p8cm-mm2v-gwjm).
+- Updated the supported version of the Huggingface Transformers library to address security advisories raised against it.
+- Updated Torchvision pretrained network loading to use current arguments.
+- Many minor fixes to identified issues; see the release notes for details on merged PRs.
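
On the Torchvision item above: recent torchvision releases replaced the boolean `pretrained=True` flag with explicit weights enums, presumably the "current arguments" that #8563 moves to. The MONAI change itself is not part of this diff; the snippet below is only an illustration of the newer torchvision call style.

```python
# Illustration of the current torchvision pretrained-weights syntax (not MONAI code from this commit).
from torchvision.models import resnet18, ResNet18_Weights

# Older torchvision accepted resnet18(pretrained=True); current releases use a weights enum instead.
model = resnet18(weights=ResNet18_Weights.DEFAULT)
model.eval()
```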

monai/apps/nnunet/nnunet_bundle.py

Lines changed: 17 additions & 9 deletions

@@ -133,7 +133,7 @@ def get_nnunet_trainer(
         cudnn.benchmark = True
 
     if pretrained_model is not None:
-        state_dict = torch.load(pretrained_model)
+        state_dict = torch.load(pretrained_model, weights_only=True)
         if "network_weights" in state_dict:
             nnunet_trainer.network._orig_mod.load_state_dict(state_dict["network_weights"])
     return nnunet_trainer

@@ -182,7 +182,9 @@ def __init__(self, predictor: object, model_folder: Union[str, Path], model_name
         parameters = []
 
         checkpoint = torch.load(
-            join(Path(model_training_output_dir).parent, "nnunet_checkpoint.pth"), map_location=torch.device("cpu")
+            join(Path(model_training_output_dir).parent, "nnunet_checkpoint.pth"),
+            map_location=torch.device("cpu"),
+            weights_only=True,
         )
         trainer_name = checkpoint["trainer_name"]
         configuration_name = checkpoint["init_args"]["configuration"]

@@ -192,7 +194,9 @@ def __init__(self, predictor: object, model_folder: Union[str, Path], model_name
             else None
         )
         if Path(model_training_output_dir).joinpath(model_name).is_file():
-            monai_checkpoint = torch.load(join(model_training_output_dir, model_name), map_location=torch.device("cpu"))
+            monai_checkpoint = torch.load(
+                join(model_training_output_dir, model_name), map_location=torch.device("cpu"), weights_only=True
+            )
             if "network_weights" in monai_checkpoint.keys():
                 parameters.append(monai_checkpoint["network_weights"])
         else:

@@ -383,8 +387,12 @@ def convert_nnunet_to_monai_bundle(nnunet_config: dict, bundle_root_folder: str,
         dataset_name, f"{nnunet_trainer}__{nnunet_plans}__{nnunet_configuration}"
     )
 
-    nnunet_checkpoint_final = torch.load(Path(nnunet_model_folder).joinpath(f"fold_{fold}", "checkpoint_final.pth"))
-    nnunet_checkpoint_best = torch.load(Path(nnunet_model_folder).joinpath(f"fold_{fold}", "checkpoint_best.pth"))
+    nnunet_checkpoint_final = torch.load(
+        Path(nnunet_model_folder).joinpath(f"fold_{fold}", "checkpoint_final.pth"), weights_only=True
+    )
+    nnunet_checkpoint_best = torch.load(
+        Path(nnunet_model_folder).joinpath(f"fold_{fold}", "checkpoint_best.pth"), weights_only=True
+    )
 
     nnunet_checkpoint = {}
     nnunet_checkpoint["inference_allowed_mirroring_axes"] = nnunet_checkpoint_final["inference_allowed_mirroring_axes"]

@@ -470,7 +478,7 @@ def get_network_from_nnunet_plans(
     if model_ckpt is None:
         return network
     else:
-        state_dict = torch.load(model_ckpt)
+        state_dict = torch.load(model_ckpt, weights_only=True)
         network.load_state_dict(state_dict[model_key_in_ckpt])
         return network
 

@@ -534,7 +542,7 @@ def subfiles(
 
     Path(nnunet_model_folder).joinpath(f"fold_{fold}").mkdir(parents=True, exist_ok=True)
 
-    nnunet_checkpoint: dict = torch.load(f"{bundle_root_folder}/models/nnunet_checkpoint.pth")
+    nnunet_checkpoint: dict = torch.load(f"{bundle_root_folder}/models/nnunet_checkpoint.pth", weights_only=True)
     latest_checkpoints: list[str] = subfiles(
         Path(bundle_root_folder).joinpath("models", f"fold_{fold}"), prefix="checkpoint_epoch", sort=True
     )

@@ -545,7 +553,7 @@ def subfiles(
     epochs.sort()
     final_epoch: int = epochs[-1]
     monai_last_checkpoint: dict = torch.load(
-        f"{bundle_root_folder}/models/fold_{fold}/checkpoint_epoch={final_epoch}.pt"
+        f"{bundle_root_folder}/models/fold_{fold}/checkpoint_epoch={final_epoch}.pt", weights_only=True
    )
 
     best_checkpoints: list[str] = subfiles(

@@ -558,7 +566,7 @@ def subfiles(
     key_metrics.sort()
     best_key_metric: str = key_metrics[-1]
     monai_best_checkpoint: dict = torch.load(
-        f"{bundle_root_folder}/models/fold_{fold}/checkpoint_key_metric={best_key_metric}.pt"
+        f"{bundle_root_folder}/models/fold_{fold}/checkpoint_key_metric={best_key_metric}.pt", weights_only=True
     )
 
     nnunet_checkpoint["optimizer_state"] = monai_last_checkpoint["optimizer_state"]

monai/apps/utils.py

Lines changed: 60 additions & 6 deletions

@@ -122,6 +122,38 @@ def update_to(self, b: int = 1, bsize: int = 1, tsize: int | None = None) -> None:
             raise e
 
 
+def safe_extract_member(member, extract_to):
+    """Securely verify compressed package member paths to prevent path traversal attacks"""
+    # Get member path (handle different compression formats)
+    if hasattr(member, "filename"):
+        member_path = member.filename  # zipfile
+    elif hasattr(member, "name"):
+        member_path = member.name  # tarfile
+    else:
+        member_path = str(member)
+
+    if hasattr(member, "issym") and member.issym():
+        raise ValueError(f"Symbolic link detected in archive: {member_path}")
+    if hasattr(member, "islnk") and member.islnk():
+        raise ValueError(f"Hard link detected in archive: {member_path}")
+
+    member_path = os.path.normpath(member_path)
+
+    if os.path.isabs(member_path) or ".." in member_path.split(os.sep):
+        raise ValueError(f"Unsafe path detected in archive: {member_path}")
+
+    full_path = os.path.join(extract_to, member_path)
+    full_path = os.path.normpath(full_path)
+
+    extract_root = os.path.realpath(extract_to)
+    target_real = os.path.realpath(full_path)
+    # Ensure the resolved path stays within the extraction root
+    if os.path.commonpath([extract_root, target_real]) != extract_root:
+        raise ValueError(f"Unsafe path: path traversal {member_path}")
+
+    return full_path
+
+
 def check_hash(filepath: PathLike, val: str | None = None, hash_type: str = "md5") -> bool:
     """
     Verify hash signature of specified file.

@@ -242,6 +274,32 @@ def download_url(
     )
 
 
+def _extract_zip(filepath, output_dir):
+    with zipfile.ZipFile(filepath, "r") as zip_file:
+        for member in zip_file.infolist():
+            safe_path = safe_extract_member(member, output_dir)
+            if member.is_dir():
+                continue
+            os.makedirs(os.path.dirname(safe_path), exist_ok=True)
+            with zip_file.open(member) as source:
+                with open(safe_path, "wb") as target:
+                    shutil.copyfileobj(source, target)
+
+
+def _extract_tar(filepath, output_dir):
+    with tarfile.open(filepath, "r") as tar_file:
+        for member in tar_file.getmembers():
+            safe_path = safe_extract_member(member, output_dir)
+            if not member.isfile():
+                continue
+            os.makedirs(os.path.dirname(safe_path), exist_ok=True)
+            source = tar_file.extractfile(member)
+            if source is not None:
+                with source:
+                    with open(safe_path, "wb") as target:
+                        shutil.copyfileobj(source, target)
+
+
 def extractall(
     filepath: PathLike,
     output_dir: PathLike = ".",

@@ -287,14 +345,10 @@ def extractall(
     logger.info(f"Writing into directory: {output_dir}.")
     _file_type = file_type.lower().strip()
     if filepath.name.endswith("zip") or _file_type == "zip":
-        zip_file = zipfile.ZipFile(filepath)
-        zip_file.extractall(output_dir)
-        zip_file.close()
+        _extract_zip(filepath, output_dir)
         return
     if filepath.name.endswith("tar") or filepath.name.endswith("tar.gz") or "tar" in _file_type:
-        tar_file = tarfile.open(filepath)
-        tar_file.extractall(output_dir)
-        tar_file.close()
+        _extract_tar(filepath, output_dir)
         return
     raise NotImplementedError(
         f'Unsupported file type, available options are: ["zip", "tar.gz", "tar"]. name={filepath} type={file_type}.'
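
A quick sketch of how the new helper behaves (assuming this commit's `monai.apps.utils` is importable): a member whose name tries to climb out of the extraction directory is rejected before anything is written to disk.

```python
import tarfile

from monai.apps.utils import safe_extract_member  # helper added in this commit

# A tar member whose name attempts path traversal out of the output directory.
malicious = tarfile.TarInfo(name="../../etc/passwd")

try:
    safe_extract_member(malicious, "./output")
except ValueError as exc:
    print(exc)  # Unsafe path detected in archive: ../../etc/passwd
```

The same check runs for every member in `_extract_zip` and `_extract_tar`, so `extractall` now refuses archives containing absolute paths, `..` components, or symbolic/hard links.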

monai/data/__init__.py

Lines changed: 0 additions & 1 deletion

@@ -78,7 +78,6 @@
 from .thread_buffer import ThreadBuffer, ThreadDataLoader
 from .torchscript_utils import load_net_with_metadata, save_net_with_metadata
 from .utils import (
-    PICKLE_KEY_SUFFIX,
     affine_to_spacing,
     compute_importance_map,
     compute_shape_offset,
