A vulnerability in the HuggingFace Transformers library, specifically in the `Trainer` class, allows for arbitrary code execution. The `_load_rng_state()` method in `src/transformers/trainer.py` at line 3059 calls `torch.load()` without the `weights_only=True` parameter. This issue affects all versions of the library supporting `torch>=2.2` when used with PyTorch versions below 2.6, as the `safe_globals()` context manager provides no protection in these versions. An attacker can exploit this vulnerability by supplying a malicious checkpoint file, such as `rng_state.pth`, which can execute arbitrary code when loaded. The issue is resolved in version v5.0.0rc3.
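Because `torch.load()` without `weights_only=True` falls back to Python's `pickle` module, deserialization itself can invoke attacker-chosen callables. The mechanism can be sketched with the standard library alone; the names here (`record`, `MaliciousCheckpoint`) are illustrative, not the actual exploit payload:

```python
import pickle

executed = []

def record(msg):
    # Stand-in side effect; a real attacker would invoke os.system,
    # subprocess, etc. instead of this harmless callable.
    executed.append(msg)

class MaliciousCheckpoint:
    def __reduce__(self):
        # pickle serializes this object as "call record('payload ran')",
        # so the call happens during deserialization itself.
        return (record, ("payload ran",))

blob = pickle.dumps(MaliciousCheckpoint())  # what a crafted rng_state.pth would contain
pickle.loads(blob)  # the callable runs here, before any type or content check
# executed is now ["payload ran"]
```

The mitigation is to pass `weights_only=True` to `torch.load()` (the default since PyTorch 2.6), which restricts unpickling to tensors and a small allowlist of types instead of arbitrary callables.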
Metrics
Affected Vendors & Products
References
History
Tue, 07 Apr 2026 07:15:00 +0000
| Type | Values Removed | Values Added |
|---|---|---|
| Description | | A vulnerability in the HuggingFace Transformers library, specifically in the `Trainer` class, allows for arbitrary code execution. The `_load_rng_state()` method in `src/transformers/trainer.py` at line 3059 calls `torch.load()` without the `weights_only=True` parameter. This issue affects all versions of the library supporting `torch>=2.2` when used with PyTorch versions below 2.6, as the `safe_globals()` context manager provides no protection in these versions. An attacker can exploit this vulnerability by supplying a malicious checkpoint file, such as `rng_state.pth`, which can execute arbitrary code when loaded. The issue is resolved in version v5.0.0rc3. |
| Title | | Arbitrary Code Execution via Unsafe torch.load() in Trainer Checkpoint Loading in huggingface/transformers |
| Weaknesses | | CWE-502 |
| References | | |
| Metrics | | cvssV3_0 |
Status: PUBLISHED
Assigner: @huntr_ai
Published:
Updated: 2026-04-07T13:27:41.789Z
Reserved: 2026-02-03T16:49:27.781Z
Link: CVE-2026-1839
Status: Received
Published: 2026-04-07T06:16:41.490
Modified: 2026-04-07T06:16:41.490
Link: CVE-2026-1839