Engine Module¶
The engine module contains the core components of AgentScope Runtime.
Submodules¶
Agents¶
- class agentscope_runtime.engine.agents.base_agent.Agent(name='', description='', before_agent_callback=None, after_agent_callback=None, agent_config=None, **kwargs)[source]
Bases:
object
- __init__(name='', description='', before_agent_callback=None, after_agent_callback=None, agent_config=None, **kwargs)[source]
- async run_async(context, **kwargs)[source]
- Return type:
AsyncGenerator[Event, None]
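Since run_async returns an async generator of events, a caller consumes it with `async for`. A minimal stdlib sketch of that consumption pattern (EchoAgent and this Event dataclass are illustrative stand-ins, not the runtime's classes):

```python
import asyncio
from dataclasses import dataclass
from typing import AsyncGenerator

@dataclass
class Event:
    # Stand-in for the runtime's Event schema
    status: str
    text: str = ""

class EchoAgent:
    """Illustrative agent: streams its input back as events."""
    async def run_async(self, context: str) -> AsyncGenerator[Event, None]:
        yield Event(status="in_progress")
        for word in context.split():
            yield Event(status="in_progress", text=word)
        yield Event(status="completed")

async def collect(agent: EchoAgent, context: str) -> list:
    # Drain the agent's event stream into a list.
    return [event async for event in agent.run_async(context)]

events = asyncio.run(collect(EchoAgent(), "hello world"))
```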
Deployers¶
- class agentscope_runtime.engine.deployers.local_deployer.LocalDeployManager(host='localhost', port=8090)[source]
Bases:
DeployManager
- deploy_sync(func, endpoint_path='/process', request_model=<class 'agentscope_runtime.engine.schemas.agent_schemas.AgentRequest'>, response_type='sse', before_start=None, after_finish=None, **kwargs)[source]
Deploy the agent as a FastAPI service (synchronous version).
- Parameters:
func (Callable) – Custom processing function
endpoint_path (str) – API endpoint path for the processing function
request_model (Type | None) – Pydantic model for request validation
response_type (str) – Response type - “json”, “sse”, or “text”
before_start (Callable | None) – Callback function called before server starts
after_finish (Callable | None) – Callback function called after server finishes
**kwargs (Any) – Additional keyword arguments passed to callbacks
- Returns:
Dictionary containing deploy_id and url of the deployed service
- Return type:
- Raises:
RuntimeError – If deployment fails
- async deploy(func, endpoint_path='/process', request_model=<class 'agentscope_runtime.engine.schemas.agent_schemas.AgentRequest'>, response_type='sse', before_start=None, after_finish=None, protocol_adapters=None, **kwargs)[source]
Deploy the agent as a FastAPI service (asynchronous version).
- Parameters:
func (Callable) – Custom processing function
endpoint_path (str) – API endpoint path for the processing function
request_model (Type | None) – Pydantic model for request validation
response_type (str) – Response type - “json”, “sse”, or “text”
before_start (Callable | None) – Callback function called before server starts
after_finish (Callable | None) – Callback function called after server finishes
protocol_adapters (list[ProtocolAdapter] | None)
**kwargs (Any) – Additional keyword arguments passed to callbacks
- Returns:
Dictionary containing deploy_id and url of the deployed service
- Return type:
- Raises:
RuntimeError – If deployment fails
- async stop()[source]
Stop the FastAPI service.
- Raises:
RuntimeError – If stopping fails
- Return type:
None
- property is_running: bool
Check if the service is currently running.
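Both deploy methods return a dictionary containing deploy_id and url. A stdlib-only sketch of that return contract (StubDeployManager is a hypothetical stand-in; the real LocalDeployManager starts a FastAPI server):

```python
import uuid

class StubDeployManager:
    """Hypothetical stand-in mirroring LocalDeployManager's return contract."""
    def __init__(self, host: str = "localhost", port: int = 8090):
        self.host = host
        self.port = port
        self._running = False

    def deploy_sync(self, func, endpoint_path: str = "/process") -> dict:
        # RuntimeError signals a failed deployment, as documented.
        if not callable(func):
            raise RuntimeError("deployment failed: func must be callable")
        self._running = True
        return {
            "deploy_id": uuid.uuid4().hex,
            "url": f"http://{self.host}:{self.port}{endpoint_path}",
        }

    @property
    def is_running(self) -> bool:
        return self._running

manager = StubDeployManager()
info = manager.deploy_sync(lambda request: request)
```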
Services¶
- class agentscope_runtime.engine.services.base.Service[source]
Bases:
ABC
Abstract base class for services.
This class defines the interface that all services must implement.
- abstract async start()[source]
Starts the service, initializing any necessary resources or connections.
- Return type:
None
- abstract async stop()[source]
Stops the service, releasing any acquired resources.
- Return type:
None
- class agentscope_runtime.engine.services.base.ServiceLifecycleManagerMixin[source]
Bases:
object
Mixin class that provides async lifecycle manager functionality for services.
This mixin can be used with any class that implements the Service interface.
- async __aenter__()[source]
Async context manager entry.
- async __aexit__(exc_type, exc_val, exc_tb)[source]
Async context manager exit.
- class agentscope_runtime.engine.services.base.ServiceWithLifecycleManager[source]
Bases:
Service, ServiceLifecycleManagerMixin
Base class for services that want async lifecycle manager functionality.
This class combines the Service interface with the context manager mixin, providing a convenient base class for most service implementations.
Note: This is an abstract base class. Subclasses must implement the abstract methods from the Service class.
- abstract async start()[source]
Starts the service, initializing any necessary resources or connections.
- Return type:
None
- abstract async stop()[source]
Stops the service, releasing any acquired resources.
- Return type:
None
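The Service/mixin split above can be reproduced in a few lines. This sketch shows why combining them makes any service usable with `async with` (only the start/stop/`__aenter__`/`__aexit__` shape matches the documented interface; CounterService is illustrative):

```python
import asyncio
from abc import ABC, abstractmethod

class Service(ABC):
    @abstractmethod
    async def start(self) -> None: ...
    @abstractmethod
    async def stop(self) -> None: ...

class ServiceLifecycleManagerMixin:
    # Start on entry, stop on exit, mirroring the documented mixin.
    async def __aenter__(self):
        await self.start()
        return self

    async def __aexit__(self, exc_type, exc_val, exc_tb):
        await self.stop()

class CounterService(Service, ServiceLifecycleManagerMixin):
    """Illustrative service recording its lifecycle transitions."""
    def __init__(self):
        self.states = []

    async def start(self) -> None:
        self.states.append("started")

    async def stop(self) -> None:
        self.states.append("stopped")

async def main() -> list:
    service = CounterService()
    async with service:
        service.states.append("working")
    return service.states

states = asyncio.run(main())
```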
- class agentscope_runtime.engine.services.manager.ServiceManager[source]
Bases:
ABC
Abstract base class for service managers. Provides common functionality for service registration and lifecycle management.
- __init__()[source]
- register(service_class, *args, name=None, **kwargs)[source]
Register a service.
- Parameters:
- Returns:
For method chaining
- Return type:
self
- register_service(name, service)[source]
Register an already instantiated service.
- Parameters:
name (str) – Service name
service (Service) – Service instance
- Returns:
For method chaining
- Return type:
self
- async __aenter__()[source]
Start all registered services using AsyncExitStack.
- async __aexit__(exc_type, exc_val, exc_tb)[source]
Close all services using AsyncExitStack.
- __getattr__(name)[source]
Enable attribute access for services, e.g., manager.env, manager.session.
- Parameters:
name (str)
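register and register_service return self for method chaining, and `__getattr__` exposes each named service as an attribute (e.g., manager.session). A minimal sketch of that registration pattern (MiniManager is hypothetical, not the runtime class):

```python
class MiniManager:
    """Hypothetical manager mirroring register/register_service/__getattr__."""
    def __init__(self):
        self._services = {}

    def register(self, service_class, *args, name=None, **kwargs):
        # Instantiate and store under the given or derived name.
        instance = service_class(*args, **kwargs)
        self._services[name or service_class.__name__.lower()] = instance
        return self  # for method chaining

    def register_service(self, name, service):
        # Store an already instantiated service.
        self._services[name] = service
        return self  # for method chaining

    def __getattr__(self, name):
        # Attribute access falls back to the service registry.
        try:
            return self._services[name]
        except KeyError:
            raise AttributeError(name) from None

class MemoryStub:
    kind = "memory"

manager = MiniManager().register(MemoryStub, name="memory")
manager.register_service("session", MemoryStub())
```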
- class agentscope_runtime.engine.services.context_manager.ContextComposer[source]
Bases:
object
- async static compose(request_input, session, memory_service=None, session_history_service=None, rag_service=None)[source]
- Parameters:
request_input (List[Message])
session (Session)
memory_service (MemoryService | None)
session_history_service (SessionHistoryService | None)
rag_service (RAGService | None)
- class agentscope_runtime.engine.services.context_manager.ContextManager(context_composer_cls=<class 'agentscope_runtime.engine.services.context_manager.ContextComposer'>, session_history_service=None, memory_service=None, rag_service=None)[source]
Bases:
ServiceManager
The ContextManager class.
- Parameters:
session_history_service (SessionHistoryService)
memory_service (MemoryService)
rag_service (RAGService)
- __init__(context_composer_cls=<class 'agentscope_runtime.engine.services.context_manager.ContextComposer'>, session_history_service=None, memory_service=None, rag_service=None)[source]
- Parameters:
session_history_service (SessionHistoryService | None)
memory_service (MemoryService | None)
rag_service (RAGService | None)
- agentscope_runtime.engine.services.context_manager.create_context_manager(memory_service=None, session_history_service=None, rag_service=None)[source]
- Parameters:
memory_service (MemoryService)
session_history_service (SessionHistoryService)
rag_service (RAGService)
- class agentscope_runtime.engine.services.environment_manager.EnvironmentManager(sandbox_service=None)[source]
Bases:
ServiceManager
The EnvironmentManager class for managing environment-related services.
- Parameters:
sandbox_service (SandboxService)
- __init__(sandbox_service=None)[source]
- Parameters:
sandbox_service (SandboxService | None)
- release_sandbox(session_id, user_id)[source]
- agentscope_runtime.engine.services.environment_manager.create_environment_manager(sandbox_service=None)[source]
- Parameters:
sandbox_service (SandboxService)
- class agentscope_runtime.engine.services.memory_service.MemoryService[source]
Bases:
ServiceWithLifecycleManager
Used to store and retrieve long-term memory from a database or in memory. Memory is organized by user id at the top level, under which there are two memory-management strategies:
- messages grouped by session id (nested under the user id), and
- messages grouped by user id only.
- abstract async add_memory(user_id, messages, session_id=None)[source]
Adds messages to the memory service.
- async stop()[source]
Stops the service, releasing any acquired resources.
- async start()[source]
Starts the service, initializing any necessary resources or connections.
- abstract async search_memory(user_id, messages, filters=None)[source]
Searches messages from the memory service.
- Parameters:
filters – Associated filters for the messages, such as top_k, score, etc.
- Return type:
- abstract async list_memory(user_id, filters=None)[source]
Lists the memory items for a given user with filters, such as page_num, page_size, etc.
- class agentscope_runtime.engine.services.memory_service.InMemoryMemoryService[source]
Bases:
MemoryService
An in-memory implementation of the memory service.
- async start()[source]
Starts the service.
- Return type:
None
- async stop()[source]
Stops the service.
- Return type:
None
- async add_memory(user_id, messages, session_id=None)[source]
Adds messages to the in-memory store.
- async search_memory(user_id, messages, filters=None)[source]
Searches messages from the in-memory store for a specific user based on keywords.
- Parameters:
- Returns:
A list of matching messages from the store.
- Return type:
- async get_query_text(message)[source]
Gets the query text from the messages.
- Parameters:
message (Message) – The message to extract the query text from.
- Returns:
The query text.
- Return type:
- async list_memory(user_id, filters=None)[source]
Lists messages from the in-memory store with pagination support.
- async delete_memory(user_id, session_id=None)[source]
Deletes messages from the in-memory store.
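The in-memory layout the service describes — user id at the top, optionally grouped by session id — can be sketched with plain dictionaries. This toy store only illustrates the keying and keyword search, not the real service's Message types or filters:

```python
import asyncio
from collections import defaultdict

class ToyMemoryStore:
    """Toy analogue of InMemoryMemoryService's keying scheme."""
    def __init__(self):
        # user_id -> session_id (or "default") -> list of message strings
        self._store = defaultdict(lambda: defaultdict(list))

    async def add_memory(self, user_id, messages, session_id=None):
        self._store[user_id][session_id or "default"].extend(messages)

    async def search_memory(self, user_id, keyword):
        # Keyword search across all of a user's sessions.
        return [
            msg
            for session in self._store[user_id].values()
            for msg in session
            if keyword in msg
        ]

    async def delete_memory(self, user_id, session_id=None):
        if session_id is None:
            self._store.pop(user_id, None)
        else:
            self._store[user_id].pop(session_id, None)

async def main():
    store = ToyMemoryStore()
    await store.add_memory("u1", ["likes tea"], session_id="s1")
    await store.add_memory("u1", ["likes coffee"])  # user-level memory
    return await store.search_memory("u1", "likes")

found = asyncio.run(main())
```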
- class agentscope_runtime.engine.services.session_history_service.Session(*, id, user_id, messages=[])[source]
Bases:
BaseModel
Represents a single conversation session.
A session contains the history of a conversation, including all messages, and is uniquely identified by its ID.
- id
The unique identifier for the session.
- Type:
- user_id
The identifier of the user who owns the session.
- Type:
- messages
A list of messages formatted for Agent response
- Type:
List[agentscope_runtime.engine.schemas.agent_schemas.Message | Dict[str, Any]]
- id: str
- user_id: str
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class agentscope_runtime.engine.services.session_history_service.SessionHistoryService[source]
Bases:
ServiceWithLifecycleManager
Abstract base class for session history management services.
This class defines the standard interface for creating, retrieving, updating, and deleting conversation sessions. Concrete implementations (like InMemorySessionHistoryService) will handle the actual storage logic.
- async start()[source]
Starts the service, initializing any necessary resources or connections.
- Return type:
None
- async stop()[source]
Stops the service, releasing any acquired resources.
- Return type:
None
- async health()[source]
Checks the health of the service.
- Returns:
True if the service is healthy, False otherwise.
- Return type:
- abstract async create_session(user_id, session_id=None)[source]
Creates a new session for a given user.
- abstract async get_session(user_id, session_id)[source]
Retrieves a specific session.
- abstract async delete_session(user_id, session_id)[source]
Deletes a specific session.
- abstract async list_sessions(user_id)[source]
Lists all sessions for a given user.
- async append_message(session, message)[source]
Appends a message to the history of a specific session.
- class agentscope_runtime.engine.services.session_history_service.InMemorySessionHistoryService[source]
Bases:
SessionHistoryService
An in-memory implementation of the SessionHistoryService.
This service stores all session data in a dictionary, making it suitable for development, testing, and scenarios where persistence is not required.
- _sessions
A dictionary holding all session objects, keyed by user ID and then by session ID.
- __init__()[source]
Initializes the InMemorySessionHistoryService.
- Return type:
None
- async create_session(user_id, session_id=None)[source]
Creates a new session for a given user and stores it.
- async get_session(user_id, session_id)[source]
Retrieves a specific session from memory.
- async delete_session(user_id, session_id)[source]
Deletes a specific session from memory.
If the session does not exist, the method does nothing.
- async list_sessions(user_id)[source]
Lists all sessions for a given user.
To improve performance and reduce data transfer, the returned session objects do not contain the detailed response history.
- async append_message(session, message)[source]
Appends message to a session’s history in memory.
This method finds the authoritative session object in the in-memory storage and appends the message to its history. It supports both dictionary format messages and Message objects.
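A toy sketch of the two-level dictionary InMemorySessionHistoryService describes — sessions keyed by user id, then by session id — with append_message writing to the authoritative stored object. Uuid-based session ids and the dict-shaped session are illustrative assumptions:

```python
import asyncio
import uuid

class ToySessionStore:
    """Toy analogue of the user_id -> session_id -> session layout."""
    def __init__(self):
        self._sessions = {}

    async def create_session(self, user_id, session_id=None):
        session = {
            "id": session_id or uuid.uuid4().hex,
            "user_id": user_id,
            "messages": [],
        }
        self._sessions.setdefault(user_id, {})[session["id"]] = session
        return session

    async def get_session(self, user_id, session_id):
        return self._sessions.get(user_id, {}).get(session_id)

    async def delete_session(self, user_id, session_id):
        # Deleting a missing session is a no-op, as documented.
        self._sessions.get(user_id, {}).pop(session_id, None)

    async def append_message(self, session, message):
        # Append to the authoritative stored object, not the caller's copy.
        stored = await self.get_session(session["user_id"], session["id"])
        if stored is not None:
            stored["messages"].append(message)

async def main():
    store = ToySessionStore()
    session = await store.create_session("u1", session_id="s1")
    await store.append_message(session, {"role": "user", "content": "hi"})
    return await store.get_session("u1", "s1")

session = asyncio.run(main())
```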
Schemas¶
- class agentscope_runtime.engine.schemas.agent_schemas.MessageType[source]
Bases:
object
- MESSAGE = 'message'
- FUNCTION_CALL = 'function_call'
- FUNCTION_CALL_OUTPUT = 'function_call_output'
- PLUGIN_CALL = 'plugin_call'
- PLUGIN_CALL_OUTPUT = 'plugin_call_output'
- COMPONENT_CALL = 'component_call'
- COMPONENT_CALL_OUTPUT = 'component_call_output'
- MCP_LIST_TOOLS = 'mcp_list_tools'
- MCP_APPROVAL_REQUEST = 'mcp_approval_request'
- MCP_TOOL_CALL = 'mcp_call'
- MCP_APPROVAL_RESPONSE = 'mcp_approval_response'
- HEARTBEAT = 'heartbeat'
- ERROR = 'error'
- classmethod all_values()[source]
Return all constant values in MessageType.
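all_values() plausibly collects the class's public string constants; a generic sketch of that pattern on a trimmed-down stand-in (the implementation detail is an assumption, not taken from the source):

```python
class ToyMessageType:
    """Constants-only class in the style of MessageType (trimmed)."""
    MESSAGE = "message"
    FUNCTION_CALL = "function_call"
    HEARTBEAT = "heartbeat"

    @classmethod
    def all_values(cls):
        # Collect every public string constant defined on the class.
        return [
            value
            for key, value in vars(cls).items()
            if not key.startswith("_") and isinstance(value, str)
        ]

values = ToyMessageType.all_values()
```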
- class agentscope_runtime.engine.schemas.agent_schemas.ContentType[source]
Bases:
object
- TEXT = 'text'
- DATA = 'data'
- IMAGE = 'image'
- AUDIO = 'audio'
- class agentscope_runtime.engine.schemas.agent_schemas.Role[source]
Bases:
object
- ASSISTANT = 'assistant'
- USER = 'user'
- SYSTEM = 'system'
- TOOL = 'tool'
- class agentscope_runtime.engine.schemas.agent_schemas.RunStatus[source]
Bases:
object
Enum-like class of run statuses for agent event messages.
- Created = 'created'
- InProgress = 'in_progress'
- Completed = 'completed'
- Canceled = 'canceled'
- Failed = 'failed'
- Rejected = 'rejected'
- Unknown = 'unknown'
- class agentscope_runtime.engine.schemas.agent_schemas.FunctionParameters(*, type, properties, required)[source]
Bases:
BaseModel
- type: str
The type of the parameters object. Must be object.
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class agentscope_runtime.engine.schemas.agent_schemas.FunctionTool(*, name, description, parameters)[source]
Bases:
BaseModel
Model class for message tool.
- name: str
The name of the function to be called.
- description: str
A description of what the function does, used by the model to choose when and how to call the function.
- parameters: FunctionParameters | Dict[str, Any]
The parameters the function accepts, described as a JSON Schema object.
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class agentscope_runtime.engine.schemas.agent_schemas.Tool(*, type='function', function=None)[source]
Bases:
BaseModel
Model class for assistant message tool call.
- Parameters:
type (str | None)
function (FunctionTool | None)
- function: FunctionTool | None
The function that the model called.
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class agentscope_runtime.engine.schemas.agent_schemas.FunctionCall(*, call_id=None, name=None, arguments=None)[source]
Bases:
BaseModel
Model class for assistant prompt message tool call function.
- arguments: str | None
The arguments to call the function with, as generated by the model in JSON format.
Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function.
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class agentscope_runtime.engine.schemas.agent_schemas.FunctionCallOutput(*, call_id, output)[source]
Bases:
BaseModel
Model class for assistant prompt message tool call function output.
- call_id: str
The ID of the tool call.
- output: str
The result of the function.
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class agentscope_runtime.engine.schemas.agent_schemas.Error(*, code, message)[source]
Bases:
BaseModel
- code: str
The error code of the message.
- message: str
The error message of the message.
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class agentscope_runtime.engine.schemas.agent_schemas.Event(*, sequence_number=None, object, status=None, error=None)[source]
Bases:
BaseModel
- object: str
The identity of the content part.
- error: Error | None
Response error for the output.
- created()[source]
Set the message status to ‘created’.
- Return type:
Self
- in_progress()[source]
Set the message status to ‘in_progress’.
- Return type:
Self
- completed()[source]
Set the message status to ‘completed’.
- Return type:
Self
- failed(error)[source]
Set the message status to ‘failed’.
- Parameters:
error (Error)
- Return type:
Self
- rejected()[source]
Set the message status to ‘rejected’.
- Return type:
Self
- canceled()[source]
Set the message status to ‘canceled’.
- Return type:
Self
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class agentscope_runtime.engine.schemas.agent_schemas.Content(*, sequence_number=None, object='content', status=None, error=None, type, index=None, delta=False, msg_id=None)[source]
Bases:
Event
- Parameters:
- type: str
The type of the content part.
- object: str
The identity of the content part.
- static from_chat_completion_chunk(chunk, index=None)[source]
- Parameters:
chunk (ChatCompletionChunk)
index (int | None)
- Return type:
TextContent | DataContent | ImageContent | None
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class agentscope_runtime.engine.schemas.agent_schemas.ImageContent(*, sequence_number=None, object='content', status=None, error=None, type='image', index=None, delta=False, msg_id=None, image_url=None)[source]
Bases:
Content
- Parameters:
- type: Literal['image']
The type of the content part.
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class agentscope_runtime.engine.schemas.agent_schemas.TextContent(*, sequence_number=None, object='content', status=None, error=None, type='text', index=None, delta=False, msg_id=None, text=None)[source]
Bases:
Content
- Parameters:
- type: Literal['text']
The type of the content part.
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class agentscope_runtime.engine.schemas.agent_schemas.DataContent(*, sequence_number=None, object='content', status=None, error=None, type='data', index=None, delta=False, msg_id=None, data=None)[source]
Bases:
Content
- Parameters:
- type: Literal['data']
The type of the content part.
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- class agentscope_runtime.engine.schemas.agent_schemas.Message(*, sequence_number=None, object='message', status='created', error=None, id=<factory>, type='message', role=None, content=None, code=None, message=None, usage=None)[source]
Bases:
Event
- Parameters:
sequence_number (str | None)
object (str)
status (str)
error (Error | None)
id (str)
type (str)
role (Literal['assistant', 'system', 'user', 'tool'] | None)
content (List[Annotated[TextContent | ImageContent | DataContent, FieldInfo(annotation=NoneType, required=True, discriminator='type')]] | None)
code (str | None)
message (str | None)
usage (Dict | None)
- id: str
message unique id
- object: str
message identity
- type: str
The type of the message.
- status: str
The status of the message: in_progress, completed, or incomplete.
- role: Literal['assistant', 'system', 'user', 'tool'] | None
The role of the message's author; one of user, system, assistant, or tool.
- content: List[Annotated[TextContent | ImageContent | DataContent, FieldInfo(annotation=NoneType, required=True, discriminator='type')]] | None
The contents of the message.
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- static from_openai_message(message)[source]
Create a message object from an openai message.
- Parameters:
message (BaseModel | dict)
- Return type:
Message
- get_text_content()[source]
Extract the first text content from the message.
- Returns:
First text string found in the content, or None if no text content
- Return type:
str | None
- get_image_content()[source]
Extract all image content (URLs or base64 data) from the message.
- get_audio_content()[source]
Extract all audio content (URLs or base64 data) from the message.
- add_delta_content(new_content)[source]
- Parameters:
new_content (TextContent | ImageContent | DataContent)
- add_content(new_content)[source]
- Parameters:
new_content (TextContent | ImageContent | DataContent)
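add_delta_content suggests streaming deltas are merged into the message's existing content, while add_content appends a new part. A dict-based toy sketch of that distinction (the real methods operate on TextContent and friends; this merge rule is an assumption):

```python
class ToyMessage:
    """Toy message distinguishing delta merge from plain append."""
    def __init__(self):
        self.content = []

    def add_content(self, part):
        # Always append a fresh content part.
        self.content.append(dict(part))

    def add_delta_content(self, part):
        # Merge a text delta into the last text part, else append.
        if self.content and self.content[-1]["type"] == part["type"] == "text":
            self.content[-1]["text"] += part["text"]
        else:
            self.add_content(part)

msg = ToyMessage()
msg.add_delta_content({"type": "text", "text": "Hel"})
msg.add_delta_content({"type": "text", "text": "lo"})
msg.add_content({"type": "text", "text": "world"})
```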
- class agentscope_runtime.engine.schemas.agent_schemas.BaseRequest(*, input, stream=True)[source]
Bases:
BaseModel
agent request
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- input: List[Message]
input messages
- stream: bool
If set, partial message deltas will be sent, like in ChatGPT.
- class agentscope_runtime.engine.schemas.agent_schemas.AgentRequest(*, input, stream=True, model=None, top_p=None, temperature=None, frequency_penalty=None, presence_penalty=None, max_tokens=None, stop=None, n=1, seed=None, tools=None, session_id=None, response_id=None)[source]
Bases:
BaseRequest
agent request
- Parameters:
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- top_p: float | None
Nucleus sampling, between (0, 1.0], where the model considers the results of the tokens with top_p probability mass.
So 0.1 means only the tokens comprising the top 10% probability mass are considered.
We generally recommend altering this or temperature but not both.
- temperature: float | None
What sampling temperature to use, between 0 and 2.
Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.
We generally recommend altering this or top_p but not both.
- frequency_penalty: float | None
Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model’s likelihood to repeat the same line verbatim.
- presence_penalty: float | None
Number between -2.0 and 2.0.
Positive values penalize new tokens based on whether they appear in the text so far, increasing the model’s likelihood to talk about new topics.
- max_tokens: int | None
The maximum number of tokens that can be generated in the chat completion.
The total length of input tokens and generated tokens is limited by the model’s context length.
- n: int | None
How many chat completion choices to generate for each input message.
Note that you will be charged based on the number of generated tokens across all of the choices. Keep n as 1 to minimize costs.
- class agentscope_runtime.engine.schemas.agent_schemas.BaseResponse(*, sequence_number=None, object='response', status='created', error=None, id=<factory>, created_at=1757505168, completed_at=None, output=None, usage=None)[source]
Bases:
Event
- Parameters:
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- classmethod validate_id(v)[source]
- object: str
response identity
- status: str
response run status
- created_at: int
request start time
- add_new_message(message)[source]
- Parameters:
message (Message)
- class agentscope_runtime.engine.schemas.agent_schemas.AgentResponse(*, sequence_number=None, object='response', status='created', error=None, id=<factory>, created_at=1757505168, completed_at=None, output=None, usage=None, session_id=None)[source]
Bases:
BaseResponse
agent response
- Parameters:
- model_config: ClassVar[ConfigDict] = {}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- agentscope_runtime.engine.schemas.agent_schemas.convert_to_openai_tool_call(function)[source]
- Parameters:
function (FunctionCall)
- agentscope_runtime.engine.schemas.agent_schemas.convert_to_openai_messages(messages)[source]
Convert a generic message protocol to a model-specific protocol.
- Parameters:
messages – Original list of messages
- agentscope_runtime.engine.schemas.agent_schemas.convert_to_openai_tools(tools)[source]
- class agentscope_runtime.engine.schemas.context.Context(*, user_id, session=Session(id='', user_id='', messages=[]), activate_tools=[], new_message=None, current_messages=[], request, new_message_dict=None, messages_list=[], environment_manager=None, context_manager=None, agent, agent_config=None)[source]
Bases:
BaseModel
Holds all contextual information for a single agent invocation.
This object is created by the Runner and passed through the agent execution flow, providing access to necessary services and data, including a live request queue for real-time interaction.
- Parameters:
- model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'extra': 'forbid'}
Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
- user_id: str
- session: Session
- activate_tools: list
- new_message: Message | None
- current_messages: List[Message]
- request: AgentRequest
- environment_manager: EnvironmentManager | None
- context_manager: ContextManager | None
- agent: Agent
- property messages