2025

Periskope MCP Server

AI Infrastructure · Periskope

I transformed Periskope into an AI-native WhatsApp platform by implementing the Model Context Protocol (MCP), exposing 82 core tools to external LLMs like Claude and Cursor.

MCP
Claude
Model Context Protocol
API
TypeScript
AI Integration
Platform: Web · API · AI Infrastructure
Client: Periskope
My Role
AI Engineer
Full-Stack Engineer
Periskope MCP Server

Control WhatsApp with an MCP Server

While building Periskope, a core challenge was making WhatsApp data actionable for modern AI workflows. Traditional chatbots are often siloed, limited by the specific UI components we build for them. I wanted to invert this relationship: instead of bringing AI into Periskope, I wanted to bring all of Periskope's capabilities into the AI assistants where developers and power users already spend their time.

The Model Context Protocol (MCP) by Anthropic provided the perfect architecture for this. By implementing an MCP server, I was able to turn Periskope into a plug-and-play data source. This allows an LLM to not just 'read' messages, but to actively manage the CRM, update ticket statuses, and moderate groups using natural language.

Architecture: From REST to Tool Definitions

Technically, the integration acts as a specialized bridge between the Periskope REST API and the MCP transport layer (primarily JSON-RPC over stdio). I mapped our internal business logic into 82 distinct tools across five major categories: Chat, Message, Contact, Group, and Ticket management.
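A minimal sketch of that bridge, assuming a simple registry that maps tool names to REST endpoint descriptors. The tool names, paths, and JSON-RPC shapes here are illustrative, not the actual Periskope API surface:

```typescript
// Hypothetical sketch: resolving an incoming MCP tools/call request
// (JSON-RPC over stdio) into the Periskope REST request the bridge
// will issue. Routes and argument names are illustrative.

type RestRoute = { method: "GET" | "POST" | "PATCH"; path: string };

// A small slice of the tool-to-endpoint registry (82 tools in total).
const routes: Record<string, RestRoute> = {
  search_messages:      { method: "GET",   path: "/v1/messages/search" },
  update_ticket_status: { method: "PATCH", path: "/v1/tickets/:id" },
  create_ticket:        { method: "POST",  path: "/v1/tickets" },
};

interface ToolCall {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

// Map a JSON-RPC tool call onto its REST route, carrying the
// LLM-supplied arguments along as the request payload.
function toRestRequest(call: ToolCall): RestRoute & { body: Record<string, unknown> } {
  const route = routes[call.params.name];
  if (!route) throw new Error(`Unknown tool: ${call.params.name}`);
  return { ...route, body: call.params.arguments };
}

const req = toRestRequest({
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "create_ticket", arguments: { chat_id: "abc", title: "Pricing question" } },
});
```

Keeping the mapping declarative like this is what makes 82 tools tractable: each new endpoint is one registry entry rather than a bespoke handler.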

One of the key design decisions was how to handle tool discovery. I built the server to dynamically generate JSON Schema definitions for every endpoint. This ensures that when a user connects their Claude Desktop to Periskope, the LLM immediately understands the constraints, required parameters, and return types for actions like `search_messages` or `update_ticket_status`.
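For a sense of what the client sees, here is a sketch of one generated tool definition in the shape MCP clients receive from a `tools/list` request. The parameter names and descriptions are illustrative:

```typescript
// Sketch of an auto-generated tool definition for `search_messages`.
// The `inputSchema` is plain JSON Schema; the connected LLM reads
// `properties` and `required` to learn the tool's constraints.

interface ToolDefinition {
  name: string;
  description: string;
  inputSchema: {
    type: "object";
    properties: Record<string, { type: string; description?: string }>;
    required: string[];
  };
}

const searchMessages: ToolDefinition = {
  name: "search_messages",
  description: "Full-text search across WhatsApp messages in a chat.",
  inputSchema: {
    type: "object",
    properties: {
      chat_id: { type: "string", description: "Periskope chat identifier" },
      query:   { type: "string", description: "Search term" },
      limit:   { type: "number", description: "Maximum results to return" },
    },
    required: ["chat_id", "query"],
  },
};
```

Because the schema travels with the tool, a client like Claude Desktop knows before its first call that `chat_id` and `query` are mandatory while `limit` is optional.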

The Challenge of High-Density Context

With 82 tools available, context-window management becomes critical: provide too much documentation at once and the model gets confused or hallucinates parameters. I mitigated this with strict Zod schema validation on the server side, so malformed or invented arguments are caught before they ever reach the API.

If an LLM attempts to call a tool with an invalid chat ID format or a missing ticket status, the MCP server returns a detailed error message. This feedback loop allows the agent to self-correct its next turn, significantly increasing the success rate of complex, multi-step workflows like 'Find all customers who asked about pricing today and tag them as leads.'
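The shape of that feedback loop can be sketched as follows. The production server uses Zod; this hand-rolled check is a stand-in that just shows the kind of actionable error an agent receives, with an illustrative chat ID format and status set:

```typescript
// Minimal sketch of the server-side validation feedback loop.
// (The real server uses Zod; this stand-in shows the error shape.)
// Status values and the chat_id pattern are illustrative.

const TICKET_STATUSES = ["open", "pending", "closed"] as const;
const CHAT_ID_PATTERN = /^chat_[a-z0-9]+$/;

// Return every problem at once so the agent can fix its next call
// in a single turn, instead of discovering errors one by one.
function validateUpdateTicket(args: Record<string, unknown>): string[] {
  const errors: string[] = [];
  if (typeof args.chat_id !== "string" || !CHAT_ID_PATTERN.test(args.chat_id)) {
    errors.push(`chat_id must match ${CHAT_ID_PATTERN}; got ${JSON.stringify(args.chat_id)}`);
  }
  if (!TICKET_STATUSES.includes(args.status as any)) {
    errors.push(`status must be one of ${TICKET_STATUSES.join(", ")}; got ${JSON.stringify(args.status)}`);
  }
  return errors;
}

// An invalid call yields specific, repairable errors rather than a bare failure.
const errs = validateUpdateTicket({ chat_id: "12345", status: "done" });
```

Returning all violations in one response is the detail that makes self-correction cheap: the agent repairs the call in one retry instead of several.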

By leveraging MCP's standard interface, we bypassed the need to build custom plugins for every AI tool on the market (Cursor, ChatGPT, Claude), instantly gaining compatibility with the entire ecosystem.

Performance and Scale Considerations

Because MCP servers often run locally on a user's machine (via `npx`), I had to ensure the package was lightweight and the startup time was near-instant. I optimized the build using esbuild to minimize the bundle size and utilized persistent HTTP connections to our backend to reduce latency during tool execution.
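The connection-reuse piece can be sketched with Node's built-in keep-alive agent, so repeated tool executions reuse one TCP/TLS connection instead of paying a fresh handshake per call. The tuning values here are illustrative:

```typescript
import { Agent } from "node:https";

// Sketch: a shared keep-alive agent for all outbound requests to the
// Periskope backend. Socket counts and timeouts are illustrative.
const keepAliveAgent = new Agent({
  keepAlive: true,       // reuse sockets across tool executions
  maxSockets: 4,         // a single-user local MCP server needs few
  keepAliveMsecs: 15_000 // keep idle sockets warm between agent turns
});

// Every request would then opt into the shared agent, e.g.:
// https.request(url, { agent: keepAliveAgent }, callback)
```

For a locally spawned `npx` process, this matters more than usual: the process is short-lived and interactive, so amortizing the TLS handshake across an agent's burst of tool calls is where most of the latency win comes from.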

Security was also a top priority. Since the MCP server runs with the user's local environment variables, I implemented a middleware layer that ensures the `PERISKOPE_API_KEY` is validated against specific scopes before any write operation is performed on a WhatsApp group or contact list.
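A hypothetical sketch of that scope gate, where each write tool declares the scopes it needs and the middleware rejects calls the key's grants don't cover. The scope names and tool-to-scope mapping are invented for illustration:

```typescript
// Hypothetical scope-check middleware: write tools are gated on the
// scopes granted to PERISKOPE_API_KEY. Names are illustrative.

type Scope = "chats:read" | "chats:write" | "tickets:write" | "groups:write";

const requiredScopes: Record<string, Scope[]> = {
  search_messages: ["chats:read"],
  send_message:    ["chats:write"],
  create_ticket:   ["tickets:write"],
  remove_member:   ["groups:write"],
};

// Throw before any request leaves the machine if the key lacks a scope.
function assertScopes(tool: string, granted: Set<Scope>): void {
  const needed = requiredScopes[tool] ?? [];
  const missing = needed.filter((s) => !granted.has(s));
  if (missing.length > 0) {
    throw new Error(`API key missing scopes for ${tool}: ${missing.join(", ")}`);
  }
}

// A read-only key can search but cannot open tickets:
const granted = new Set<Scope>(["chats:read"]);
assertScopes("search_messages", granted);
let blocked = false;
try { assertScopes("create_ticket", granted); } catch { blocked = true; }
```

Failing locally, before the request is made, also gives the LLM a precise error it can surface to the user ("this key cannot create tickets") rather than an opaque 403 from the backend.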

82
Available Tools
<250ms
Avg Tool Latency
1.2MB
Build Size

Enabling Natural Language Workflows

The real magic happens when you combine these tools. A user can now ask: 'Analyze the last 10 messages in the Support group, summarize the main complaint, and create a ticket for it.' The LLM uses `list_messages`, processes the text, and then calls `create_ticket` autonomously.
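That chain can be traced with stub tools standing in for the real API; the fixed summary string below stands in for the reasoning step the LLM performs between calls:

```typescript
// Illustrative trace of the multi-step flow: list_messages, then an
// LLM summarization step (stubbed), then create_ticket. The tool
// implementations are stubs, not the real Periskope API.

type ToolResult = Record<string, unknown>;

const tools: Record<string, (args: any) => ToolResult> = {
  list_messages: ({ chat_id, limit }) => ({
    messages: Array.from({ length: limit }, (_, i) => `msg ${i} in ${chat_id}`),
  }),
  create_ticket: ({ chat_id, title }) => ({ ticket_id: "tkt_1", chat_id, title }),
};

// Step 1: fetch the last 10 messages from the Support group.
const history = tools.list_messages({ chat_id: "chat_support", limit: 10 });

// Step 2: the LLM summarizes the thread (stubbed as a fixed string here).
const summary = "Multiple users report failed payment confirmations";

// Step 3: open a ticket from the summary, feeding one tool's output
// into the next call's arguments.
const ticket = tools.create_ticket({ chat_id: "chat_support", title: summary });
```

The key property is that no step is hard-coded in the server: the model decides the sequence, and the server only guarantees each individual call is valid.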

This transforms Periskope from a passive dashboard into an active participant in a company's operations. It effectively gives every Periskope user a specialized WhatsApp operations assistant that knows their business logic inside out.

This implementation moved Periskope from being 'AI-powered' (using LLMs for summaries) to 'AI-native' (allowing LLMs to drive the entire application state).

Lessons Learned and Future Roadmap

If I were to start over, I would focus more on 'Resource' templates in MCP rather than just 'Tools.' Currently, fetching chat history is a tool call; making it a resource would allow LLMs to 'subscribe' to message streams more efficiently.

Next, I plan to integrate multi-modal support. Being able to pass WhatsApp images and voice notes directly into the MCP context will allow agents to reason about screenshots of bugs or customer voice memos, further closing the loop on automated customer support.