Enhance your LangChain applications with real-time social media context from Membit. This integration allows your AI agents to access up-to-the-minute discussions, trends, and insights from platforms like X (Twitter), Farcaster, and more.

Prerequisites

Before you begin, ensure you have:
  • Python 3.10 or higher installed
  • A Membit account with API access
  • Basic familiarity with LangChain agents
  • Node.js installed (for MCP remote client)
You’ll need valid Membit API credentials and the MCP remote URL. If you don’t have access yet, request an API key from Membit first.
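To confirm the Python requirement above, here is a quick stdlib-only check (nothing in it is Membit-specific):

```python
import sys

# The integration requires Python 3.10 or newer
meets_requirement = sys.version_info >= (3, 10)
print(f"Python {sys.version_info.major}.{sys.version_info.minor}: "
      f"{'OK' if meets_requirement else 'upgrade needed'}")
```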

Installation

Install the required packages for LangChain and MCP integration:
# Install LangChain MCP adapter
pip install langchain-mcp-adapters

# Install MCP remote client (requires Node.js)
npm install -g mcp-remote
We recommend using a virtual environment to manage your Python dependencies and avoid conflicts.
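A minimal sketch of that setup (the `.venv` directory name is just a common convention, not a requirement):

```shell
# Create an isolated environment and activate it before installing packages
python3 -m venv .venv
. .venv/bin/activate
```

With the environment active, the `pip install` command above keeps its packages local to this project.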

Quick Start

1. Import required modules

Import the necessary components for LangChain and MCP integration:
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from langchain.agents import create_openai_tools_agent, AgentExecutor
from langchain.chat_models import init_chat_model
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.schema import HumanMessage
from langchain_mcp_adapters.tools import load_mcp_tools
2. Configure your LLM

Set up your language model with the provider of your choice:
llm = init_chat_model(
    model="gpt-4o",
    model_provider="openai",
)
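Since `init_chat_model` takes the model and provider as plain strings, one way to make them swappable is to read them from the environment. `LLM_MODEL` and `LLM_PROVIDER` here are our own variable names, not anything LangChain defines:

```python
import os

# Fall back to the defaults used in this guide when the variables are unset
model_name = os.environ.get("LLM_MODEL", "gpt-4o")
model_provider = os.environ.get("LLM_PROVIDER", "openai")

# llm = init_chat_model(model=model_name, model_provider=model_provider)
```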
3. Configure MCP server parameters

Set up the connection to Membit’s MCP server:
server_params = StdioServerParameters(
    command="npx",
    args=[
        "mcp-remote",
        "https://mcp.membit.ai/mcp",
        "--header",
        "X-Membit-Api-Key:${MEMBIT_API_KEY}",
    ],
    env={
        "MEMBIT_API_KEY": "<your-api-key>",
    },
)
Make sure you have mcp-remote installed globally via npm for this to work.
Replace <your-api-key> with your actual Membit API key. Keep this credential secure and don’t share it with unauthorized users.
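One way to keep the key out of your source file is to export it in your shell and read it at runtime. This sketch falls back to the placeholder string so the shape of `env` stays the same either way:

```python
import os

# Prefer the MEMBIT_API_KEY environment variable over a hardcoded value
membit_api_key = os.environ.get("MEMBIT_API_KEY", "<your-api-key>")

env = {"MEMBIT_API_KEY": membit_api_key}
```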
4. Create agent prompt

Define a prompt template that encourages the agent to use Membit tools:
prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "Make sure you utilize membit tools to get the most trending data."
        ),
        MessagesPlaceholder(variable_name="messages"),
        MessagesPlaceholder(variable_name="agent_scratchpad"),
    ]
)
5. Create and run your agent

Connect to Membit and create your LangChain agent:
async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Initialize the MCP connection
            await session.initialize()

            # Load Membit tools from MCP session
            mcp_tools = await load_mcp_tools(session)

            # Create the LangChain agent
            agent = create_openai_tools_agent(
                llm=llm,
                tools=mcp_tools,
                prompt=prompt
            )

            # Create executor
            agent_executor = AgentExecutor(
                agent=agent,
                tools=mcp_tools,
                verbose=True
            )

            # Query your agent
            await agent_executor.ainvoke({
                "messages": [HumanMessage(content=
                    "What are the most trending discussions about AI today?"
                )]
            })

if __name__ == "__main__":
    asyncio.run(main())
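Remote MCP calls can hang, so you may want a deadline around the whole run. Here is a minimal sketch using the stdlib's `asyncio.wait_for`; the 60-second default is an arbitrary choice, and `slow()` is just a stand-in for `main()`:

```python
import asyncio

async def run_with_timeout(coro, seconds=60.0):
    # Cancel the awaited coroutine if it exceeds the deadline
    try:
        return await asyncio.wait_for(coro, timeout=seconds)
    except asyncio.TimeoutError:
        return None

# Stand-in for main(); in practice call run_with_timeout(main())
async def slow():
    await asyncio.sleep(0.1)
    return "done"

result = asyncio.run(run_with_timeout(slow(), seconds=2.0))
```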
Your LangChain agent is now powered by real-time social media context from Membit!

Complete Example

Here’s a full working example with enhanced functionality:
import asyncio

from langchain.agents import AgentExecutor, create_openai_tools_agent
from langchain.chat_models import init_chat_model
from langchain.schema import HumanMessage
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_mcp_adapters.tools import load_mcp_tools
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Initialize LLM
llm = init_chat_model(
    model="gpt-4o",
    model_provider="openai",
)

server_params = StdioServerParameters(
    command="npx",
    args=[
        "mcp-remote",
        "https://mcp.membit.ai/mcp",
        "--header",
        "X-Membit-Api-Key:${MEMBIT_API_KEY}",
    ],
    env={
        # Replace `<your-api-key>` with your actual Membit API key.
        "MEMBIT_API_KEY": "<your-api-key>",
    },
)

prompt = ChatPromptTemplate.from_messages([
    (
        "system",
        "You are a social media analyst with access to real-time data. "
        "Make sure you utilize membit tools to get the most trending data.",
    ),
    MessagesPlaceholder(variable_name="messages"),
    MessagesPlaceholder(variable_name="agent_scratchpad"),
])

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            mcp_tools = await load_mcp_tools(session)

            agent = create_openai_tools_agent(
                llm=llm,
                tools=mcp_tools,
                prompt=prompt
            )

            agent_executor = AgentExecutor(
                agent=agent,
                tools=mcp_tools,
                verbose=True
            )

            user_input = (
                "What are the most trending discussions today related to Bitcoin? "
                "Give me the best conversations to follow with URLs."
            )

            await agent_executor.ainvoke({
                "messages": [HumanMessage(content=user_input)]
            })


if __name__ == "__main__":
    asyncio.run(main())

Troubleshooting