tech 3 min read
TryHackMe — BankGPT Write-Up
A practical walkthrough of the BankGPT room with focus on prompt injection techniques, LLM security, and context vulnerabilities in banking chatbots.
Thoughts, projects and adventures.
A practical walkthrough of the BankGPT room with focus on prompt injection techniques, LLM security, and context vulnerabilities in banking chatbots.
A practical guide to prompt injection vulnerabilities in LLMs, demonstrating how system instructions can be revealed through context reinterpretation and structured output formats.