CVE-2024-11393 – Hugging Face Transformers MaskFormer Model Deserialization of Untrusted Data Remote Code Execution Vulnerability (25 Nov 2024)

Preface: What is the difference between Hugging Face and transformers?

Transformers is a library that provides various state-of-the-art machine learning models, along with a Trainer API for training and fine-tuning them. huggingface_hub is a separate library for programmatically interacting with the Hugging Face Hub, for example to download or upload models, datasets, and individual files.
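A minimal sketch of how the two libraries sit side by side, assuming the publicly hosted facebook/maskformer-swin-base-ade checkpoint and a sample image URL purely as illustrations:

    # huggingface_hub: programmatic access to files in a Hub repository.
    # transformers: high-level model loading and inference.
    from huggingface_hub import hf_hub_download
    from transformers import pipeline

    # Download a single file (here, the model configuration) from the Hub.
    config_path = hf_hub_download(
        repo_id="facebook/maskformer-swin-base-ade",   # illustrative model id
        filename="config.json",
    )
    print("config downloaded to:", config_path)

    # Pull the full model down and wrap it in a ready-to-use inference pipeline.
    segmenter = pipeline("image-segmentation", model="facebook/maskformer-swin-base-ade")
    results = segmenter("http://images.cocodataset.org/val2017/000000039769.jpg")
    print([r["label"] for r in results])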

Background: Masks are often used in segmentation tasks, where they provide a precise way to isolate the object of interest for further processing or analysis.

MaskFormer is based on the DETR architecture, which uses a transformer decoder to predict masks for each object in an image. MaskFormer has been shown to be effective for both semantic segmentation and panoptic segmentation. However, it has not been as successful for instance segmentation.
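As a rough illustration of that design, a MaskFormer checkpoint can be run for semantic segmentation with the classes transformers exposes for this architecture. This is a hedged sketch: the checkpoint id and image URL are placeholders, and PyTorch plus Pillow are assumed to be installed.

    import requests
    import torch
    from PIL import Image
    from transformers import MaskFormerImageProcessor, MaskFormerForInstanceSegmentation

    # Illustrative checkpoint trained for semantic segmentation on ADE20K.
    checkpoint = "facebook/maskformer-swin-base-ade"
    processor = MaskFormerImageProcessor.from_pretrained(checkpoint)
    model = MaskFormerForInstanceSegmentation.from_pretrained(checkpoint)

    # Any RGB image works; this URL is just a sample.
    url = "http://images.cocodataset.org/val2017/000000039769.jpg"
    image = Image.open(requests.get(url, stream=True).raw)

    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # The transformer decoder predicts a set of masks plus class scores;
    # post-processing combines them into a per-pixel semantic map.
    semantic_map = processor.post_process_semantic_segmentation(
        outputs, target_sizes=[image.size[::-1]]
    )[0]
    print(semantic_map.shape)  # (height, width) tensor of class ids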

Vulnerability details: This vulnerability allows remote attackers to execute arbitrary code on affected installations of Hugging Face Transformers. User interaction is required to exploit this vulnerability, in that the target must open a malicious model file or visit a malicious page. The specific flaw exists within the parsing of model files: loading a crafted file can result in deserialization of untrusted data and execution of arbitrary code.
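The practical takeaway is that loading a model file from an untrusted source should be treated as deserializing untrusted data. A minimal defensive sketch, assuming PyTorch 1.13+ (for the weights_only flag) and the safetensors package; the file paths are illustrative:

    import torch
    from safetensors.torch import load_file

    def load_untrusted_checkpoint(path: str):
        """Load model weights without executing arbitrary pickled code."""
        if path.endswith(".safetensors"):
            # safetensors stores raw tensors only; loading never runs Python code.
            return load_file(path)
        # torch.load() unpickles arbitrary objects by default, which is the classic
        # deserialization-of-untrusted-data pattern. weights_only=True restricts it
        # to tensors and primitive containers.
        return torch.load(path, weights_only=True)

When using transformers' from_pretrained, preferring repositories that ship .safetensors weights (or passing use_safetensors=True) likewise avoids pickle-based deserialization entirely.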

Official announcement: Please refer to the link for details – https://nvd.nist.gov/vuln/detail/CVE-2024-11393
