CVE-2024-32878 | Published: 2024-04-29 | Modified: 2024-05-15
Llama.cpp is LLM inference in C/C++. There is a use-of-uninitialized-heap-variable vulnerability in gguf_init_from_file: the code later frees this uninitialized variable. A simple PoC directly causes a crash. If the input file is carefully constructed, it may be possible to control the uninitialized value and trigger an arbitrary-address free, which may be further exploitable. The flaw causes llama.cpp to crash (DoS) and may even lead to arbitrary code execution (RCE). This vulnerability has been patched in commit b2740.
CVSS v3 Score: 7.1
(No CVSS v2 score or detailed v3/v2 vector metrics are published for this entry.)