MCP Learned to Touch the World

Original language: 🇯🇵 Japanese

When the Model Context Protocol (MCP) was released, my first impression was: "this is not just a unified API specification". We watched demo videos from overseas and immediately built a PoC (proof of concept) within the team. We were convinced this would become the "second operating system" for AI.

---

MCP is often compared to USB. That is certainly true in the sense that it provides a "common port through which AI can freely handle external tools". But once you actually work with it, it feels less like USB and more like an "AI-enabled personal computer".

Previously, ChatGPT and Claude operated entirely within the input text. Their main weakness was the lack of access to external context, local or remote.

An AI with no memory rethinks everything from scratch each time. An AI that does not know the outside world can, in the end, only "regenerate".

---

But MCP has gone beyond that. External context can now be incorporated, and **AI can now "touch" its own working environment.**

Here is the configuration I tried:

- Claude Code
- BigQuery (aggregated business data)
- GitHub (source code repository)
- Notion (business logic documentation)

These are connected via MCP so that Claude Code can work in a ReAct (Reason + Act) loop.
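
For concreteness, a setup like this can be declared in a `.mcp.json` file at the project root, which Claude Code reads to discover MCP servers. The sketch below is only illustrative: the BigQuery and Notion package names are placeholders standing in for whatever MCP servers you actually use, and the credentials are of course your own. It is not a copy of my configuration.

```json
{
  "mcpServers": {
    "bigquery": {
      "command": "npx",
      "args": ["-y", "example-bigquery-mcp-server"],
      "env": { "GOOGLE_APPLICATION_CREDENTIALS": "<path-to-service-account>" }
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "<your-token>" }
    },
    "notion": {
      "command": "npx",
      "args": ["-y", "example-notion-mcp-server"],
      "env": { "NOTION_API_KEY": "<your-key>" }
    }
  }
}
```

With the servers registered, the model sees each one's tools (query a schema, read a file, search a page) as actions it can take on its own.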

---

The AI then started working autonomously. It read the schema from BigQuery, referenced the business logic in Notion, and examined the code on GitHub to come up with a modification plan.
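
Conceptually, the ReAct loop driving this is simple: the model alternates between deciding on a next action and executing it as a tool call, with each observation fed back into the context. The following is a minimal toy sketch of that cycle, not Claude Code's actual implementation; `call_model` and the tool registry are hypothetical stand-ins.

```python
def call_model(transcript: str) -> dict:
    """Hypothetical LLM call. Expected to return either
    {"action": tool_name, "input": args} or {"final": answer}."""
    raise NotImplementedError  # plug in your model client here

# Hypothetical tool registry; in the MCP setup above, these would be
# tools exposed by the BigQuery, GitHub, and Notion servers.
TOOLS = {
    "bigquery.get_schema": lambda table: f"(schema of {table})",
    "github.read_file": lambda path: f"(contents of {path})",
    "notion.search": lambda query: f"(docs matching {query})",
}

def react_loop(task: str, max_steps: int = 10) -> str:
    transcript = f"Task: {task}\n"
    for _ in range(max_steps):
        step = call_model(transcript)      # Reason: decide the next move
        if "final" in step:
            return step["final"]           # model declares it is done
        tool = TOOLS[step["action"]]
        observation = tool(step["input"])  # Act: invoke the tool
        transcript += (
            f"Action: {step['action']}({step['input']})\n"
            f"Observation: {observation}\n"  # feed the result back
        )
    return "Stopped: step budget exhausted."
```

The point is the feedback: each observation changes what the model reasons about next, which is exactly what made the run feel autonomous.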

What amazed me was that, from that point on, **the AI kept running completely out of my hands for 15 to 20 minutes**. It kept updating the React frontend through repeated trial and error.

---

This was the first time I felt that the AI had its own "thought loop". It did not just follow specifications written by humans; it combined external information to make its own decisions.

It was no longer "generating" but "acting". It was the moment the AI broke out of its learned model and reached out to the world.

---

Challenges around memory and long-term learning still remain. But MCP + ReAct has shown the first signs of "autonomy".

AI accesses the world's context, thinks, tests, and moves beyond our hands. It was the moment humans first saw "code that thinks for itself".
