[Forgot Password]
Login  Register Subscribe

30481

 
 

423868

 
 

255861

 
 

909

 
 

199025

 
 

282

Paid content will be excluded from the download.


Download | Alert*
CVE
view JSON

CVE-2023-29374Date: (C)2023-04-07   (M)2023-11-10


In LangChain through 0.0.131, the LLMMathChain chain allows prompt injection attacks that can execute arbitrary code via the Python exec method.

CVSS Score and Metrics +CVSS Score and Metrics -

CVSS V3 Severity:CVSS V2 Severity:
CVSS Score : 9.8CVSS Score :
Exploit Score: 3.9Exploit Score:
Impact Score: 5.9Impact Score:
 
CVSS V3 Metrics:CVSS V2 Metrics:
Attack Vector: NETWORKAccess Vector:
Attack Complexity: LOWAccess Complexity:
Privileges Required: NONEAuthentication:
User Interaction: NONEConfidentiality:
Scope: UNCHANGEDIntegrity:
Confidentiality: HIGHAvailability:
Integrity: HIGH 
Availability: HIGH 
  
Reference:
https://github.com/hwchase17/langchain/issues/1026
https://github.com/hwchase17/langchain/issues/814
https://github.com/hwchase17/langchain/pull/1119
https://twitter.com/rharang/status/1641899743608463365/photo/1

CWE    1
CWE-74

© SecPod Technologies