As you read this, big forces (ChatGPT, Google, Anthropic, Perplexity, to name a few) are fundamentally altering the way information and content are presented for consumption over the web. MCP and A2A are two fundamental protocols here, with A2A playing a major role in this shift.

Consider how the following use case has evolved over time:
- A restaurant displays its menu and capabilities for a human to order items from that menu, using catchy and vivid images of the food. The interface, as it exists today, is (and was) meant for human interaction.
- Along came search engines, which had to write custom crawlers to handle the various formats in which restaurants made that information available. The onus was on the search engine to understand the formats restaurants used and surface them as answers for humans, so that search engines could drive more human traffic to their search bars.
- As more and more people started relying on search engine output to find menus, rather than typing in the restaurant's address, power slowly shifted away from restaurants to search engines. Restaurants had to design their websites not only for human consumption but also for web crawlers, so the crawlers could parse the content properly and expose it as search results. The end goal was to ensure those results showed up at the top for human search queries.
- Thus websites became Search Engine Optimized (SEO) pages.
- I have not researched this much, but it is reasonable to posit that many of those restaurants sacrificed their web design in favor of SEO pages once search engines started offering the option to order items directly from their own interface, taking a slice of the order pie. This is akin to restaurants supporting only takeout instead of dine-in post-COVID (think cloud kitchens), since it made more sense economically and diners were reluctant to return to dining in.
- With the advent of transformers and their subsequent use in building LLMs, the need for further automation has exploded with the introduction of AI Agents, or Agentic AI.
- The idea is to offload tasks that can be automated.
- Like booking your trip to Las Vegas. Given some bounding parameters, the task can be offloaded to an AI Agent that scours airlines, hotels, and restaurants, reserves seats with your credit card, and generates an itinerary you can follow to a T (replacing your travel agent, or even you, browsing daily for interesting deals).
- As more and more humans start using AI Agents, it is reasonable to posit that a similar shift will occur: developing websites that are Agentic-AI-aware first, using protocols such as A2A, would become the norm, de-prioritizing regular human users (see the sketch below for what such an agent-facing interface might look like).
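
To make the idea of an "Agentic-AI-aware" site concrete, here is a minimal sketch of how a restaurant might advertise its capabilities to other agents rather than to human eyeballs. It assumes an A2A-style "agent card" served from a well-known path; the exact field names (`name`, `skills`, etc.), the `/.well-known/agent.json` path, and the `get_menu`/`place_order` skills are my approximation for illustration, not the authoritative A2A schema.

```python
# Illustrative sketch only (not a full A2A implementation):
# a restaurant serves an "agent card" describing the skills it exposes
# to other agents, instead of (or alongside) an SEO-optimized page.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Field names below approximate the A2A agent-card layout; treat them
# as an assumption, not the canonical spec.
AGENT_CARD = {
    "name": "Example Bistro Ordering Agent",
    "description": "Lets other agents browse the menu and place pickup orders.",
    "url": "https://bistro.example.com/a2a",  # hypothetical agent endpoint
    "version": "0.1.0",
    "capabilities": {"streaming": False},
    "skills": [
        {
            "id": "get_menu",
            "name": "Get menu",
            "description": "Returns today's menu with prices and availability.",
        },
        {
            "id": "place_order",
            "name": "Place order",
            "description": "Places a pickup order for a list of menu item ids.",
        },
    ],
}

class AgentCardHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Agent-to-agent clients typically discover peers via a well-known path.
        if self.path == "/.well-known/agent.json":
            body = json.dumps(AGENT_CARD).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), AgentCardHandler).serve_forever()
```

A travel or food-ordering agent could fetch this card, see that the restaurant exposes `get_menu` and `place_order`, and negotiate the rest of the transaction agent-to-agent, with no human-facing page involved at all.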