writeups.xyz / Kiran Maraju
Title: Jailbreak of Meta AI (Llama-3.1) revealing configuration details
Vulnerabilities: AI, LLM, Prompt Injection, LLM Jailbreak
Program: Meta / Facebook (Llama)
Author: Kiran Maraju

Title: Bypass instructions to manipulate Google Bard AI (conversational generative AI chatbot) into revealing a security vulnerability, i.e. configuration file details exposure
Vulnerabilities: AI, LLM, LLM Jailbreak
Program: Google (Bard)
Author: Kiran Maraju