Threat Feed
Medium advisory

Keras Model Loader Vulnerable to Denial-of-Service via Malicious HDF5 Shape Bombs

The Keras model loader is vulnerable to denial of service when loading specially crafted .keras files: HDF5-based weight files with maliciously oversized dataset metadata cause immediate memory exhaustion during model loading.

A denial-of-service vulnerability exists in Keras versions 3.0.0 through 3.12.0 and 3.13.0 through 3.13.1 due to improper handling of HDF5 dataset metadata within .keras model files. An attacker can craft a malicious .keras archive containing a valid model.weights.h5 file in which an HDF5 dataset declares an extremely large shape while storing minimal data. This "shape bomb" targets the KerasFileEditor, which loads user-supplied .keras model files.

When Keras loads the model, it executes result[key] = value[()], causing h5py to allocate RAM proportional to the dataset's declared shape (e.g., 8.88 PiB). The result is immediate memory exhaustion: Python/TensorFlow crashes, Jupyter kernel kills, system instability, and ultimately a full denial of service. An attacker can thereby crash any environment or pipeline that loads untrusted .keras models, including MLOps backends, training services, model-upload endpoints, and automated pipelines. The vulnerability was reported on May 6, 2026.

Attack Chain

  1. An attacker crafts a malicious .keras file.
  2. The malicious .keras file includes a model.weights.h5 file.
  3. The model.weights.h5 file contains HDF5 dataset metadata declaring an extremely large shape (e.g., 50,000,000 x 50,000,000).
  4. The HDF5 dataset uses gzip compression to keep the file size small (100-400 KB).
  5. A victim system attempts to load the malicious .keras model using KerasFileEditor.
  6. Keras attempts to load the entire dataset into memory via value[()], which hands the read off to h5py.
  7. h5py attempts to allocate RAM proportional to the declared shape, leading to extreme memory exhaustion (e.g. 8.88 PiB).
  8. The Python/TensorFlow interpreter crashes, resulting in a Denial of Service.
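The allocation size in step 7 follows directly from the declared metadata. A quick check, assuming 4-byte (float32) weight elements, reproduces the 8.88 PiB figure from the example shape:

```python
import math

def declared_nbytes(shape, itemsize):
    """In-memory bytes h5py must allocate to materialize dataset[()]."""
    return math.prod(shape) * itemsize

# Shape bomb from step 3: 50,000,000 x 50,000,000 float32 elements.
nbytes = declared_nbytes((50_000_000, 50_000_000), 4)
print(nbytes / 2**50)  # declared size in PiB, roughly 8.88
```

Because the dataset is gzip-compressed and stores almost no actual data, the file on disk stays in the 100-400 KB range while the declared in-memory size is in the petabytes.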

Impact

Successful exploitation of this vulnerability leads to a denial-of-service condition. Observed damage includes immediate memory exhaustion (8+ PiB allocation attempts), crashes of the TensorFlow/Python interpreter, and the killing of Jupyter kernels. This can break automated model-upload pipelines and crash MLOps servers that process user models. In one proof-of-concept, a Google Colab compute quota dropped from 83 hours to 4 hours after only a few tests. Platforms allowing user-uploaded Keras models, such as training services, inference endpoints, and AutoML tools, are particularly vulnerable.

Recommendation

  • Implement input validation on .keras files to check for excessively large HDF5 dataset shapes before loading models.
  • Monitor Python/TensorFlow processes for abnormal memory allocation patterns indicative of a memory exhaustion attack.
  • Apply patches and updates for Keras to address CVE-2026-0897 as they become available from Google.
  • Deploy the Sigma rule “Detect Suspicious Keras Model Loading” to identify potential exploitation attempts based on process execution and file access patterns.
  • Block access to the malicious URL https://drive.google.com/file/d/1XAj57epTBWpj93GwHprHvb14WS9wpl5m/view?usp=drivesdk at the network perimeter.

Detection coverage (2 rules)

Detect Suspicious Keras Model Loading

high

Detects suspicious process execution and file access patterns associated with loading Keras models, potentially indicating an attempt to exploit the HDF5 shape bomb vulnerability.

Type: Sigma · Tactics: denial_of_service · Techniques: T1499.004 · Sources: process_creation, linux

Detect Large Memory Allocation by Python

medium

Detects Python processes allocating large amounts of memory, which could indicate a shape-bomb attack.

Type: Sigma · Tactics: denial_of_service · Techniques: T1499.004 · Sources: process_creation, linux

Detection queries are kept inside the platform.

Indicators of compromise (1)

Type: url
Value: https://drive.google.com/file/d/1XAj57epTBWpj93GwHprHvb14WS9wpl5m/view?usp=drivesdk