A2A Integration

A2A Server — expose an agent

from orxhestra import LlmAgent, InMemorySessionService
from orxhestra.a2a import A2AServer, AgentSkill

# llm: a model client you have already configured; tools: your tool list
agent = LlmAgent(name="MyAgent", llm=llm, tools=[...])

server = A2AServer(
    agent=agent,
    session_service=InMemorySessionService(),
    app_name="my-agent-service",
    skills=[
        AgentSkill(
            id="qa", name="Q&A",
            description="Answers general questions.",
            tags=["general"],
        ),
    ],
)

app = server.as_fastapi_app()
# uvicorn my_module:app --host 0.0.0.0 --port 8000

Endpoints

| Method | Endpoint | Description |
| --- | --- | --- |
| GET | `/.well-known/agent.json` | Agent Card discovery |
| POST | `/` | JSON-RPC 2.0 dispatch |
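Because the Agent Card lives at a fixed well-known path, a client can derive the discovery URL from the server's base URL alone. A minimal sketch; the helper name and the use of `urllib` are illustrative, not part of orxhestra:

```python
from urllib.parse import urljoin

WELL_KNOWN_PATH = "/.well-known/agent.json"

def agent_card_url(base_url: str) -> str:
    """Resolve the Agent Card discovery URL for an A2A server."""
    return urljoin(base_url, WELL_KNOWN_PATH)

print(agent_card_url("http://localhost:8000"))
# A client would then GET this URL (e.g. with urllib.request or httpx)
# to read the card and learn the agent's skills and capabilities.
```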

JSON-RPC methods

| Method | Description |
| --- | --- |
| `message/send` | Send message, receive completed Task |
| `message/stream` | Send message, receive SSE stream |
| `tasks/get` | Retrieve task by ID |
| `tasks/cancel` | Cancel a running task |
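The POST body is a standard JSON-RPC 2.0 request object. A sketch of a `message/send` envelope built with only the standard library; the shape of `params` is defined by the A2A protocol's Message schema, so verify the field names against the spec version your server targets:

```python
import json
import uuid

# JSON-RPC 2.0 envelope for the message/send method.
# The "params" contents follow the A2A Message schema (role, parts,
# messageId) as an assumption here; the envelope fields (jsonrpc,
# id, method, params) are fixed by JSON-RPC 2.0 itself.
request = {
    "jsonrpc": "2.0",
    "id": str(uuid.uuid4()),
    "method": "message/send",
    "params": {
        "message": {
            "role": "user",
            "parts": [{"kind": "text", "text": "What is quantum computing?"}],
            "messageId": str(uuid.uuid4()),
        }
    },
}

body = json.dumps(request)
print(body)
```

POSTing `body` to the server root (`/`) should yield a JSON-RPC response whose `result` is the completed Task.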

A2A Agent — connect to a remote agent

import asyncio

from orxhestra.agents.a2a_agent import A2AAgent

remote = A2AAgent(
    name="RemoteAgent",
    description="A remote research agent.",
    url="http://localhost:9000",
)

async def main():
    # Stream events from the remote agent over A2A.
    async for event in remote.astream("What is quantum computing?"):
        print(event.text)

asyncio.run(main())

In YAML Composer

agents:
  remote_researcher:
    type: a2a
    description: "Remote research agent"
    url: "http://localhost:9000"

  orchestrator:
    type: llm
    tools:
      - agent: remote_researcher

main_agent: orchestrator