Enhance your LlamaIndex applications with real-time social media context from Membit. This integration allows your AI agents to access up-to-the-minute discussions, trends, and insights from platforms like X (Twitter), Farcaster, and more.

Prerequisites

Before you begin, ensure you have:
  • Python 3.10 or higher installed
  • A Membit account with API access
  • Basic familiarity with LlamaIndex agents
You’ll need valid Membit API credentials. If you don’t have a key yet, request one from Membit before continuing.

Installation

Install the required LlamaIndex MCP tools package:
pip install llama-index-tools-mcp
We recommend using a virtual environment to manage your dependencies and avoid conflicts.

Quick Start

1. Import the required modules

First, import the necessary components from LlamaIndex:
from llama_index.tools.mcp import (
    BasicMCPClient,
    get_tools_from_mcp_url,
    aget_tools_from_mcp_url,
)
from llama_index.core.agent import ReActAgent
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.llms.openai import OpenAI
2. Get Membit tools from MCP server

Connect to Membit’s MCP server to retrieve available tools:
# Create MCP Client
mcp_client = BasicMCPClient(
    command_or_url="npx",
    args=[
        "mcp-remote",
        "https://mcp.membit.ai/mcp",
        "--header",
        "X-Membit-Api-Key:${MEMBIT_API_KEY}",
    ],
    env={
        "MEMBIT_API_KEY": "<your-api-key>",
    },
)
# Async method
tools = await aget_tools_from_mcp_url(
    "",
    client=mcp_client,
)
Replace <your-api-key> with your actual Membit API key. Keep this credential secure and don’t share it with unauthorized users.
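Rather than hardcoding the key, you can read it from the environment before constructing the client. A minimal sketch, assuming the key is exported as `MEMBIT_API_KEY` (`load_membit_api_key` is a hypothetical helper, not part of the Membit or LlamaIndex APIs):

```python
import os

def load_membit_api_key() -> str:
    # Read the Membit API key from the environment so it never
    # appears in source control.
    key = os.environ.get("MEMBIT_API_KEY", "")
    if not key:
        raise RuntimeError("MEMBIT_API_KEY is not set; export it first.")
    return key
```

You could then pass `env={"MEMBIT_API_KEY": load_membit_api_key()}` when building the `BasicMCPClient`, keeping the credential out of your source files.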
3. Create and configure your agent

Set up a LlamaIndex ReAct agent with Membit tools:
# Initialize your LLM
llm = OpenAI(model="gpt-4o-mini")

# Create the agent with Membit tools
agent = ReActAgent(
    tools=tools,
    llm=llm,
    memory=ChatMemoryBuffer.from_defaults(),
    verbose=True,
)
Your agent is now ready to access real-time social media context through Membit!
4. Query your agent

Start asking questions that leverage real-time social data:
response = agent.query(
    "Use cluster_search to find trending Bitcoin discussions today, "
    "then use cluster_info to get details about the most interesting cluster."
)

print(response)

Complete Example

Here’s a full working example that demonstrates the integration:
import asyncio

from llama_index.core.agent import ReActAgent
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.llms.openai import OpenAI
from llama_index.tools.mcp import BasicMCPClient, aget_tools_from_mcp_url

async def main():
    # Create MCP client
    mcp_client = BasicMCPClient(
        command_or_url="npx",
        args=[
            "mcp-remote",
            "https://mcp.membit.ai/mcp",
            "--header",
            "X-Membit-Api-Key:${MEMBIT_API_KEY}",
        ],
        env={
            # Replace "<your-api-key>" with your actual Membit API key.
            "MEMBIT_API_KEY": "<your-api-key>",
        },
    )

    # Get Membit tools
    tools = await aget_tools_from_mcp_url(
        "",
        client=mcp_client,
    )

    # Initialize LLM
    llm = OpenAI(model="gpt-4o-mini")

    # Create agent
    agent = ReActAgent(
        tools=tools,
        llm=llm,
        memory=ChatMemoryBuffer.from_defaults(),
        verbose=True,
    )

    # Query for trending crypto discussions
    response = await agent.aquery(
        "What are the most trending discussions about Bitcoin today? "
        "Use cluster_search to find the hottest topics and give me insights."
    )

    print(response)

if __name__ == "__main__":
    asyncio.run(main())

Troubleshooting