#aijailbreaks

Yetkin Degirmenci 🍫 @yetkinmiller@infosec.exchange
2025-02-19

The Power of Words: Prompt Engineering and Jailbreaks

"Think of it like this: in social engineering, using the right words can open doors, build trust, and unlock information. Similarly, with LLMs, which are trained on vast amounts of human language, choosing the right words in your prompts is key to “opening the door” to clear, insightful, and truly valuable answers."
#AI #PromptEngineering #LLM #AICommunity #AISecurity #AIRedTeaming #AIJailBreaks

medium.com/@yetkind/the-power-
