A fun thing I recently learned about Large Language Models (LLMs) is that they understand base64, a simple encoding of text. Here’s a demonstration: the base64 encoding of “What is 2 + 3?” is V2hdC...
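For reference, the encoding itself is easy to reproduce with Python's standard library; a minimal sketch of encoding the example prompt and round-tripping it back:

```python
import base64

# Encode the example prompt from the article.
prompt = "What is 2 + 3?"
encoded = base64.b64encode(prompt.encode("utf-8")).decode("ascii")
print(encoded)  # → V2hhdCBpcyAyICsgMz8=

# Round-trip: decoding recovers the original text.
decoded = base64.b64decode(encoded).decode("utf-8")
assert decoded == prompt
```

The point of the demonstration is that the model is never given the decoded text; it infers the mapping from base64 strings to plain text on its own.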
Attached is a pretty cool article covering it. This is something I never would have thought of before.
Agreed, this is a relatively simple “tool,” in LLM parlance. It’s the kind of thing the Model Context Protocol (MCP) is designed to facilitate.
To verify, the author should try the same prompts on a local LLM with no tools enabled; most likely the model will respond with nonsense.