Engine Module

The Engine module contains the core components of AgentScope Runtime.

Submodules

Agents

class agentscope_runtime.engine.agents.base_agent.Agent(name='', description='', before_agent_callback=None, after_agent_callback=None, agent_config=None, **kwargs)[source]

Bases: object

Parameters:
  • name (str)

  • description (str)

__init__(name='', description='', before_agent_callback=None, after_agent_callback=None, agent_config=None, **kwargs)[source]
Parameters:
  • name (str)

  • description (str)

async run_async(context, **kwargs)[source]
Return type:

AsyncGenerator[Event, None]

class agentscope_runtime.engine.agents.llm_agent.LLMAgent(model, **kwargs)[source]

Bases: Agent

Parameters:

model (BaseLLM)

__init__(model, **kwargs)[source]
Parameters:

model (BaseLLM)

async run_async(context, **kwargs)[source]
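
Example: a minimal sketch of driving an LLMAgent directly. The concrete BaseLLM implementation (shown as a hypothetical QwenLLM) and the prepared context are assumptions; in normal use the Runner constructs the Context, and extra keyword arguments are assumed to be forwarded to the Agent base class.

    from agentscope_runtime.engine.agents.llm_agent import LLMAgent

    async def stream_agent(model, context):
        # `model` is any BaseLLM-compatible model (e.g. a hypothetical QwenLLM);
        # `context` is a Context object, normally prepared by the Runner.
        agent = LLMAgent(
            model=model,
            name="assistant",                      # assumed to be forwarded to Agent
            description="A general-purpose chat agent",
        )
        # run_async yields Event objects (messages, content deltas, ...) as they arrive.
        async for event in agent.run_async(context):
            print(type(event).__name__, getattr(event, "status", None))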

Deployers

class agentscope_runtime.engine.deployers.base.DeployManager[source]

Bases: ABC

__init__()[source]
abstract async deploy(*args, **kwargs)[source]

Deploy the service and return a dictionary with deploy_id and URL.

Return type:

Dict[str, str]

class agentscope_runtime.engine.deployers.local_deployer.LocalDeployManager(host='localhost', port=8090)[source]

Bases: DeployManager

Parameters:
  • host (str)

  • port (int)

__init__(host='localhost', port=8090)[source]
Parameters:
  • host (str)

  • port (int)
deploy_sync(func, endpoint_path='/process', request_model=<class 'agentscope_runtime.engine.schemas.agent_schemas.AgentRequest'>, response_type='sse', before_start=None, after_finish=None, **kwargs)[source]

Deploy the agent as a FastAPI service (synchronous version).

Parameters:
  • func (Callable) – Custom processing function

  • endpoint_path (str) – API endpoint path for the processing function

  • request_model (Type | None) – Pydantic model for request validation

  • response_type (str) – Response type - “json”, “sse”, or “text”

  • before_start (Callable | None) – Callback function called before server starts

  • after_finish (Callable | None) – Callback function called after server finishes

  • **kwargs (Any) – Additional keyword arguments passed to callbacks

Returns:

Dictionary containing deploy_id and url of the deployed service

Return type:

Dict[str, str]

Raises:

RuntimeError – If deployment fails

async deploy(func, endpoint_path='/process', request_model=<class 'agentscope_runtime.engine.schemas.agent_schemas.AgentRequest'>, response_type='sse', before_start=None, after_finish=None, protocol_adapters=None, **kwargs)[source]

Deploy the agent as a FastAPI service (asynchronous version).

Parameters:
  • func (Callable) – Custom processing function

  • endpoint_path (str) – API endpoint path for the processing function

  • request_model (Type | None) – Pydantic model for request validation

  • response_type (str) – Response type - “json”, “sse”, or “text”

  • before_start (Callable | None) – Callback function called before server starts

  • after_finish (Callable | None) – Callback function called after server finishes

  • protocol_adapters (list[ProtocolAdapter] | None)

  • **kwargs (Any) – Additional keyword arguments passed to callbacks

Returns:

Dictionary containing deploy_id and url of the deployed service

Return type:

Dict[str, str]

Raises:

RuntimeError – If deployment fails

async stop()[source]

Stop the FastAPI service.

Raises:

RuntimeError – If stopping fails

Return type:

None

property is_running: bool

Check if the service is currently running.

property service_url: str | None

Get the current service URL if running.
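
Example: a hedged sketch of serving a custom processing function locally. The exact contract for func (assumed here to take a validated AgentRequest and yield text chunks for SSE) is an assumption; consult the deployment guide for the authoritative handler signature.

    import asyncio

    from agentscope_runtime.engine.deployers.local_deployer import LocalDeployManager

    async def echo(request, **kwargs):
        # Hypothetical handler: stream back the text of the last input message.
        yield request.input[-1].get_text_content() or ""

    async def main():
        manager = LocalDeployManager(host="localhost", port=8090)
        info = await manager.deploy(
            func=echo,
            endpoint_path="/process",
            response_type="sse",
        )
        print(info["deploy_id"], info["url"], manager.is_running)
        await manager.stop()

    asyncio.run(main())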

Services

class agentscope_runtime.engine.services.base.Service[source]

Bases: ABC

Abstract base class for services.

This class defines the interface that all services must implement.

abstract async start()[source]

Starts the service, initializing any necessary resources or connections.

Return type:

None

abstract async stop()[source]

Stops the service, releasing any acquired resources.

Return type:

None

abstract async health()[source]

Checks the health of the service.

Returns:

True if the service is healthy, False otherwise.

Return type:

bool

class agentscope_runtime.engine.services.base.ServiceLifecycleManagerMixin[source]

Bases: object

Mixin class that provides async lifecycle manager functionality for services.

This mixin can be used with any class that implements the Service interface.

async __aenter__()[source]

Async context manager entry.

async __aexit__(exc_type, exc_val, exc_tb)[source]

Async context manager exit.

class agentscope_runtime.engine.services.base.ServiceWithLifecycleManager[source]

Bases: Service, ServiceLifecycleManagerMixin

Base class for services that want async lifecycle manager functionality.

This class combines the Service interface with the context manager mixin, providing a convenient base class for most service implementations.

Note: This is an abstract base class. Subclasses must implement the abstract methods from the Service class.

abstract async start()[source]

Starts the service, initializing any necessary resources or connections.

Return type:

None

abstract async stop()[source]

Stops the service, releasing any acquired resources.

Return type:

None

abstract async health()[source]

Checks the health of the service.

Returns:

True if the service is healthy, False otherwise.

Return type:

bool
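
Example: a minimal custom service built on ServiceWithLifecycleManager, illustrating the start/stop/health contract and async-with support. The key-value service itself is hypothetical; only the base-class interface comes from this reference, and the mixin is assumed to call start() on entry and stop() on exit.

    import asyncio

    from agentscope_runtime.engine.services.base import ServiceWithLifecycleManager

    class KVService(ServiceWithLifecycleManager):
        _store: dict | None = None

        async def start(self) -> None:
            self._store = {}               # acquire resources here

        async def stop(self) -> None:
            self._store = None             # release resources here

        async def health(self) -> bool:
            return self._store is not None

    async def main():
        service = KVService()
        async with service:                # start() on entry, stop() on exit
            print(await service.health())  # True while running

    asyncio.run(main())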

class agentscope_runtime.engine.services.manager.ServiceManager[source]

Bases: ABC

Abstract base class for service managers. Provides common functionality for service registration and lifecycle management.

__init__()[source]
register(service_class, *args, name=None, **kwargs)[source]

Register a service.

Parameters:
  • service_class (Type) – The class of the service to register.

  • *args – Positional arguments for service initialization.

  • name (str | None) – Optional service name. Defaults to class name without ‘Service’ suffix and converted to lowercase.

  • **kwargs – Keyword arguments for service initialization.

Returns:

The manager itself, for method chaining.

Return type:

self

register_service(name, service)[source]

Register an already instantiated service.

Parameters:
  • name (str) – Service name

  • service (Service) – Service instance

Returns:

The manager itself, for method chaining.

Return type:

self

async __aenter__()[source]

Start all registered services using AsyncExitStack.

async __aexit__(exc_type, exc_val, exc_tb)[source]

Close all services using AsyncExitStack.

__getattr__(name)[source]

Enable attribute access for services, e.g., manager.env, manager.session.

Parameters:

name (str)

__getitem__(name)[source]

Enable dictionary-style access for services.

Parameters:

name (str)

get(name, default=None)[source]

Explicitly retrieve a service instance with optional default.

Parameters:

name (str)

has_service(name)[source]

Check if a service exists.

Parameters:

name (str)

Return type:

bool

list_services()[source]

List all registered service names.

Return type:

List[str]

property all_services: Dict[str, Any]

Retrieve all service instances.

async health_check()[source]

Check health of all services.

Return type:

Dict[str, bool]
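
Example: ServiceManager itself is abstract, so the sketch below uses ContextManager (documented later in this module) as a concrete subclass to show registration and the attribute/dictionary access patterns. Constructing ContextManager without backing services is an assumption.

    import asyncio

    from agentscope_runtime.engine.services.context_manager import ContextManager
    from agentscope_runtime.engine.services.memory_service import InMemoryMemoryService

    async def main():
        manager = ContextManager()
        manager.register_service("memory", InMemoryMemoryService())

        async with manager:                        # starts all registered services
            print(manager.list_services())         # e.g. ['memory', ...]
            print(manager.has_service("memory"))   # True
            memory = manager["memory"]             # dictionary-style access
            print(memory is manager.memory)        # attribute-style access
            print(await manager.health_check())    # e.g. {'memory': True, ...}

    asyncio.run(main())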

class agentscope_runtime.engine.services.context_manager.ContextComposer[source]

Bases: object

async static compose(request_input, session, memory_service=None, session_history_service=None, rag_service=None)[source]
Parameters:
  • request_input (List[Message])

  • session (Session)

  • memory_service (MemoryService | None)

  • session_history_service (SessionHistoryService | None)

  • rag_service (RAGService | None)

class agentscope_runtime.engine.services.context_manager.ContextManager(context_composer_cls=<class 'agentscope_runtime.engine.services.context_manager.ContextComposer'>, session_history_service=None, memory_service=None, rag_service=None)[source]

Bases: ServiceManager

The ContextManager class.

Parameters:
  • session_history_service (SessionHistoryService)

  • memory_service (MemoryService)

  • rag_service (RAGService)

__init__(context_composer_cls=<class 'agentscope_runtime.engine.services.context_manager.ContextComposer'>, session_history_service=None, memory_service=None, rag_service=None)[source]
Parameters:
  • session_history_service (SessionHistoryService | None)

  • memory_service (MemoryService | None)

  • rag_service (RAGService | None)

async compose_context(session, request_input)[source]
Parameters:
  • session (Session)

  • request_input (List[Message])

async compose_session(user_id, session_id)[source]
Parameters:
  • user_id (str)

  • session_id (str)

async append(session, event_output)[source]
Parameters:
  • session (Session)

  • event_output (List[Message])

agentscope_runtime.engine.services.context_manager.create_context_manager(memory_service=None, session_history_service=None, rag_service=None)[source]
Parameters:
  • memory_service (MemoryService)

  • session_history_service (SessionHistoryService)

  • rag_service (RAGService)
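
Example: a hedged sketch of the typical context flow with a ContextManager: compose (or create) a session, compose the prompt context for new input, then append the agent's output. compose_session is assumed to return the Session consumed by compose_context and append; service wiring and message contents are illustrative.

    import asyncio

    from agentscope_runtime.engine.schemas.agent_schemas import Message, TextContent
    from agentscope_runtime.engine.services.context_manager import ContextManager
    from agentscope_runtime.engine.services.memory_service import InMemoryMemoryService
    from agentscope_runtime.engine.services.session_history_service import (
        InMemorySessionHistoryService,
    )

    async def main():
        manager = ContextManager(
            memory_service=InMemoryMemoryService(),
            session_history_service=InMemorySessionHistoryService(),
        )
        user_input = [Message(role="user", content=[TextContent(text="Hi there")])]

        async with manager:
            session = await manager.compose_session(user_id="u1", session_id="s1")
            context = await manager.compose_context(session, user_input)
            # ... run the agent with the composed context, collect its output ...
            reply = [Message(role="assistant", content=[TextContent(text="Hello!")])]
            await manager.append(session, reply)

    asyncio.run(main())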

class agentscope_runtime.engine.services.environment_manager.EnvironmentManager(sandbox_service=None)[source]

Bases: ServiceManager

The EnvironmentManager class for managing environment-related services.

Parameters:

sandbox_service (SandboxService)

__init__(sandbox_service=None)[source]
Parameters:

sandbox_service (SandboxService | None)

connect_sandbox(session_id, user_id, env_types=None, tools=None)[source]
Return type:

List

release_sandbox(session_id, user_id)[source]
agentscope_runtime.engine.services.environment_manager.create_environment_manager(sandbox_service=None)[source]
Parameters:

sandbox_service (SandboxService)

class agentscope_runtime.engine.services.memory_service.MemoryService[source]

Bases: ServiceWithLifecycleManager

Stores and retrieves long-term memory from a database or in memory. Memory is organized by user id at the top level, with two management strategies beneath it: messages grouped by session id (nested under the user id), or messages grouped by the user id only.

abstract async add_memory(user_id, messages, session_id=None)[source]

Adds messages to the memory service.

Parameters:
  • user_id (str) – The user id.

  • messages (list) – The messages to add.

  • session_id (str | None) – The session id, which is optional.

Return type:

None

async stop()[source]

Stops the service, releasing any acquired resources.

async start()[source]

Starts the service, initializing any necessary resources or connections.

abstract async search_memory(user_id, messages, filters=FieldInfo(annotation=NoneType, required=False, default=None, description='Associated filters for the messages, such as top_k, score etc.'))[source]

Searches messages from the memory service.

Parameters:
  • user_id (str) – The user id.

  • messages (list) – The user query, optionally including history messages, given as a list of messages. The search is based on the content of the last message.

  • filters (Dict[str, Any] | None) – The filters used to search memory

Return type:

list

abstract async list_memory(user_id, filters=FieldInfo(annotation=NoneType, required=False, default=None, description='Associated filters for the messages, such as top_k, score etc.'))[source]

Lists the memory items for a given user with filters, such as page_num, page_size, etc.

Parameters:
  • user_id (str) – The user id.

  • filters (Dict[str, Any] | None) – The filters for the memory items.

Return type:

list

abstract async delete_memory(user_id, session_id=None)[source]

Deletes the memory items for a given user under a specific session id, or all memory items for that user if no session id is given.

Parameters:
  • user_id (str)

  • session_id (str | None)

Return type:

None

class agentscope_runtime.engine.services.memory_service.InMemoryMemoryService[source]

Bases: MemoryService

An in-memory implementation of the memory service.

async start()[source]

Starts the service.

Return type:

None

async stop()[source]

Stops the service.

Return type:

None

async health()[source]

Checks the health of the service.

Return type:

bool

async add_memory(user_id, messages, session_id=None)[source]

Adds messages to the in-memory store.

Parameters:
  • user_id (str) – The user’s unique identifier.

  • messages (list) – A list of messages to be added.

  • session_id (str | None) – An optional session identifier. If not provided, a default session is used.

Return type:

None

async search_memory(user_id, messages, filters=None)[source]

Searches messages from the in-memory store for a specific user based on keywords.

Parameters:
  • user_id (str) – The user’s unique identifier.

  • messages (list) – A list of messages, where the last message’s content is used as the search query.

  • filters (Dict[str, Any] | None) – Optional filters to apply, such as ‘top_k’ to limit the number of returned messages.

Returns:

A list of matching messages from the store.

Return type:

list

async get_query_text(message)[source]

Gets the query text from a message.

Parameters:

message (Message) – The message to extract the query text from.

Returns:

The query text.

Return type:

str

async list_memory(user_id, filters=None)[source]

Lists messages from the in-memory store with pagination support.

Parameters:
  • user_id (str) – The user’s unique identifier.

  • filters (Dict[str, Any] | None) – Optional filters for pagination, including ‘page_num’ and ‘page_size’.

Returns:

A paginated list of messages.

Return type:

list

async delete_memory(user_id, session_id=None)[source]

Deletes messages from the in-memory store.

Parameters:
  • user_id (str) – The user’s unique identifier.

  • session_id (str | None) – If provided, only deletes the messages for that session. Otherwise, deletes all messages for the user.

Return type:

None
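
Example: a small round trip through the in-memory memory service: add messages, search them by keyword, list with pagination, then delete. The filter keys ('top_k', 'page_num', 'page_size') follow the descriptions above; the message contents are illustrative.

    import asyncio

    from agentscope_runtime.engine.schemas.agent_schemas import Message, TextContent
    from agentscope_runtime.engine.services.memory_service import InMemoryMemoryService

    def text_message(role, text):
        return Message(role=role, content=[TextContent(text=text)])

    async def main():
        memory = InMemoryMemoryService()
        async with memory:
            await memory.add_memory(
                user_id="u1",
                messages=[text_message("user", "My favourite city is Hangzhou")],
                session_id="s1",
            )
            hits = await memory.search_memory(
                user_id="u1",
                messages=[text_message("user", "Which city do I like?")],
                filters={"top_k": 3},
            )
            print(hits)
            print(await memory.list_memory("u1", filters={"page_num": 1, "page_size": 10}))
            await memory.delete_memory("u1", session_id="s1")

    asyncio.run(main())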

class agentscope_runtime.engine.services.session_history_service.Session(*, id, user_id, messages=[])[source]

Bases: BaseModel

Represents a single conversation session.

A session contains the history of a conversation, including all messages, and is uniquely identified by its ID.

Parameters:
  • id (str) – The unique identifier for the session.

  • user_id (str) – The identifier of the user who owns the session.

  • messages (List[Message | Dict[str, Any]]) – A list of messages formatted for agent responses.

id: str
user_id: str
messages: List[Message | Dict[str, Any]]
model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

class agentscope_runtime.engine.services.session_history_service.SessionHistoryService[source]

Bases: ServiceWithLifecycleManager

Abstract base class for session history management services.

This class defines the standard interface for creating, retrieving, updating, and deleting conversation sessions. Concrete implementations (like InMemorySessionHistoryService) will handle the actual storage logic.

async start()[source]

Starts the service, initializing any necessary resources or connections.

Return type:

None

async stop()[source]

Stops the service, releasing any acquired resources.

Return type:

None

async health()[source]

Checks the health of the service.

Returns:

True if the service is healthy, False otherwise.

Return type:

bool

abstract async create_session(user_id, session_id=None)[source]

Creates a new session for a given user.

Parameters:
  • user_id (str) – The identifier for the user.

  • session_id (str | None) – An optional session identifier; may be supplied by the caller.

Returns:

The newly created Session object.

Return type:

Session

abstract async get_session(user_id, session_id)[source]

Retrieves a specific session.

Parameters:
  • user_id (str) – The identifier for the user.

  • session_id (str) – The identifier for the session to retrieve.

Returns:

The Session object if found, otherwise should raise an error or return None in concrete implementations.

Return type:

Session | None

abstract async delete_session(user_id, session_id)[source]

Deletes a specific session.

Parameters:
  • user_id (str) – The identifier for the user.

  • session_id (str) – The identifier for the session to delete.

abstract async list_sessions(user_id)[source]

Lists all sessions for a given user.

Parameters:

user_id (str) – The identifier for the user.

Returns:

A list of Session objects.

Return type:

list[Session]

async append_message(session, message)[source]

Appends a message to the history of a specific session.

Parameters:
  • session (Session) – The session to which the message should be appended.

  • message (Message | List[Message] | Dict[str, Any] | List[Dict[str, Any]]) – The message or list of messages to append. Supports both dictionary format and Message objects.

class agentscope_runtime.engine.services.session_history_service.InMemorySessionHistoryService[source]

Bases: SessionHistoryService

An in-memory implementation of the SessionHistoryService.

This service stores all session data in a dictionary, making it suitable for development, testing, and scenarios where persistence is not required.

_sessions

A dictionary holding all session objects, keyed by user ID and then by session ID.

__init__()[source]

Initializes the InMemorySessionHistoryService.

Return type:

None

async create_session(user_id, session_id=None)[source]

Creates a new session for a given user and stores it.

Parameters:
  • user_id (str) – The identifier for the user creating the session.

  • session_id (str | None) – An optional identifier for the session to create.

Returns:

A deep copy of the newly created Session object.

Return type:

Session

async get_session(user_id, session_id)[source]

Retrieves a specific session from memory.

Parameters:
  • user_id (str) – The identifier for the user.

  • session_id (str) – The identifier for the session to retrieve.

Returns:

A deep copy of the Session object if found, otherwise None.

Return type:

Session | None

async delete_session(user_id, session_id)[source]

Deletes a specific session from memory.

If the session does not exist, the method does nothing.

Parameters:
  • user_id (str) – The identifier for the user.

  • session_id (str) – The identifier for the session to delete.

Return type:

None

async list_sessions(user_id)[source]

Lists all sessions for a given user.

To improve performance and reduce data transfer, the returned session objects do not contain the detailed response history.

Parameters:

user_id (str) – The identifier of the user whose sessions to list.

Returns:

A list of Session objects belonging to the user, without history.

Return type:

list[Session]

async append_message(session, message)[source]

Appends message to a session’s history in memory.

This method finds the authoritative session object in the in-memory storage and appends the message to its history. It supports both dictionary format messages and Message objects.

Parameters:
  • session (Session) – The session object, typically from the context. The user_id and id from this object are used for lookup.

  • message (Message | List[Message] | Dict[str, Any] | List[Dict[str, Any]]) – The message or list of messages to append to the session’s history.

Return type:

None
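
Example: a sketch of the session-history lifecycle: create a session, append a message, re-fetch it, list the user's sessions, then delete. Identifiers and contents are illustrative.

    import asyncio

    from agentscope_runtime.engine.schemas.agent_schemas import Message, TextContent
    from agentscope_runtime.engine.services.session_history_service import (
        InMemorySessionHistoryService,
    )

    async def main():
        history = InMemorySessionHistoryService()
        async with history:
            session = await history.create_session(user_id="u1")
            await history.append_message(
                session,
                Message(role="user", content=[TextContent(text="Hello")]),
            )
            stored = await history.get_session("u1", session.id)
            print(stored.messages if stored else None)
            print([s.id for s in await history.list_sessions("u1")])  # histories omitted
            await history.delete_session("u1", session.id)

    asyncio.run(main())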

Schemas

class agentscope_runtime.engine.schemas.agent_schemas.MessageType[source]

Bases: object

MESSAGE = 'message'
FUNCTION_CALL = 'function_call'
FUNCTION_CALL_OUTPUT = 'function_call_output'
PLUGIN_CALL = 'plugin_call'
PLUGIN_CALL_OUTPUT = 'plugin_call_output'
COMPONENT_CALL = 'component_call'
COMPONENT_CALL_OUTPUT = 'component_call_output'
MCP_LIST_TOOLS = 'mcp_list_tools'
MCP_APPROVAL_REQUEST = 'mcp_approval_request'
MCP_TOOL_CALL = 'mcp_call'
MCP_APPROVAL_RESPONSE = 'mcp_approval_response'
HEARTBEAT = 'heartbeat'
ERROR = 'error'
classmethod all_values()[source]

Return all constant values defined in MessageType.

class agentscope_runtime.engine.schemas.agent_schemas.ContentType[source]

Bases: object

TEXT = 'text'
DATA = 'data'
IMAGE = 'image'
AUDIO = 'audio'
class agentscope_runtime.engine.schemas.agent_schemas.Role[source]

Bases: object

ASSISTANT = 'assistant'
USER = 'user'
SYSTEM = 'system'
TOOL = 'tool'
class agentscope_runtime.engine.schemas.agent_schemas.RunStatus[source]

Bases: object

Enum class for agent event message.

Created = 'created'
InProgress = 'in_progress'
Completed = 'completed'
Canceled = 'canceled'
Failed = 'failed'
Rejected = 'rejected'
Unknown = 'unknown'
class agentscope_runtime.engine.schemas.agent_schemas.FunctionParameters(*, type, properties, required)[source]

Bases: BaseModel

Parameters:
  • type (str)

  • properties (Dict[str, Any])

  • required (List[str] | None)

type: str

The type of the parameters object. Must be object.

properties: Dict[str, Any]

The properties of the parameters object.

required: List[str] | None

The names of the required properties.

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

class agentscope_runtime.engine.schemas.agent_schemas.FunctionTool(*, name, description, parameters)[source]

Bases: BaseModel

Model class for message tool.

Parameters:
  • name (str)

  • description (str)

  • parameters (FunctionParameters | Dict[str, Any])

name: str

The name of the function to be called.

description: str

A description of what the function does, used by the model to choose when and how to call the function.

parameters: FunctionParameters | Dict[str, Any]

The parameters the functions accepts, described as a JSON Schema object.

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

class agentscope_runtime.engine.schemas.agent_schemas.Tool(*, type='function', function=None)[source]

Bases: BaseModel

Model class for assistant message tool call.

Parameters:
  • type (str | None)

  • function (FunctionTool | None)

type: str | None

The type of the tool. Currently, only function is supported.

function: FunctionTool | None

The function that the model called.

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].
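
Example: declaring a function tool that can be attached to an AgentRequest. The weather function itself is hypothetical; the schema layout follows FunctionParameters, FunctionTool, and Tool above.

    from agentscope_runtime.engine.schemas.agent_schemas import (
        FunctionParameters,
        FunctionTool,
        Tool,
    )

    get_weather = Tool(
        type="function",
        function=FunctionTool(
            name="get_weather",
            description="Look up the current weather for a city.",
            parameters=FunctionParameters(
                type="object",
                properties={"city": {"type": "string", "description": "City name"}},
                required=["city"],
            ),
        ),
    )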

class agentscope_runtime.engine.schemas.agent_schemas.FunctionCall(*, call_id=None, name=None, arguments=None)[source]

Bases: BaseModel

Model class for assistant prompt message tool call function.

Parameters:
  • call_id (str | None)

  • name (str | None)

  • arguments (str | None)

call_id: str | None

The ID of the tool call.

name: str | None

The name of the function to call.

arguments: str | None

The arguments to call the function with, as generated by the model in JSON format.

Note that the model does not always generate valid JSON, and may hallucinate parameters not defined by your function schema. Validate the arguments in your code before calling your function.

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

class agentscope_runtime.engine.schemas.agent_schemas.FunctionCallOutput(*, call_id, output)[source]

Bases: BaseModel

Model class for assistant prompt message tool call function.

Parameters:
  • call_id (str)

  • output (str)

call_id: str

The ID of the tool call.

output: str

The result of the function.

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

class agentscope_runtime.engine.schemas.agent_schemas.Error(*, code, message)[source]

Bases: BaseModel

Parameters:
  • code (str)

  • message (str)

code: str

The error code of the message.

message: str

The error message of the message.

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

class agentscope_runtime.engine.schemas.agent_schemas.Event(*, sequence_number=None, object, status=None, error=None)[source]

Bases: BaseModel

Parameters:
  • sequence_number (str | None)

  • object (str)

  • status (str | None)

  • error (Error | None)

sequence_number: str | None

sequence number of event

object: str

The identity of the content part.

status: str | None

The status of the message: in_progress, completed, or incomplete.

error: Error | None

response error for output

created()[source]

Set the message status to ‘created’.

Return type:

Self

in_progress()[source]

Set the message status to ‘in_progress’.

Return type:

Self

completed()[source]

Set the message status to ‘completed’.

Return type:

Self

failed(error)[source]

Set the message status to ‘failed’.

Parameters:

error (Error)

Return type:

Self

rejected()[source]

Set the message status to ‘rejected’.

Return type:

Self

canceled()[source]

Set the message status to ‘canceled’.

Return type:

Self

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

class agentscope_runtime.engine.schemas.agent_schemas.Content(*, sequence_number=None, object='content', status=None, error=None, type, index=None, delta=False, msg_id=None)[source]

Bases: Event

Parameters:
  • sequence_number (str | None)

  • object (str)

  • status (str | None)

  • error (Error | None)

  • type (str)

  • index (int | None)

  • delta (bool | None)

  • msg_id (str | None)

type: str

The type of the content part.

object: str

The identity of the content part.

index: int | None

the content index in message’s content list

delta: bool | None

Whether this content is a delta.

msg_id: str | None

message unique id

static from_chat_completion_chunk(chunk, index=None)[source]
Parameters:
  • chunk (ChatCompletionChunk)

  • index (int | None)

Return type:

TextContent | DataContent | ImageContent | None

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

sequence_number: str | None

sequence number of event

status: str | None

The status of the message: in_progress, completed, or incomplete.

error: Error | None

response error for output

class agentscope_runtime.engine.schemas.agent_schemas.ImageContent(*, sequence_number=None, object='content', status=None, error=None, type='image', index=None, delta=False, msg_id=None, image_url=None)[source]

Bases: Content

Parameters:
  • sequence_number (str | None)

  • object (str)

  • status (str | None)

  • error (Error | None)

  • type (Literal['image'])

  • index (int | None)

  • delta (bool | None)

  • msg_id (str | None)

  • image_url (str | None)

type: Literal['image']

The type of the content part.

image_url: str | None

The image URL details.

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

object: str

The identity of the content part.

index: int | None

the content index in message’s content list

delta: bool | None

Whether this content is a delta.

msg_id: str | None

message unique id

sequence_number: str | None

sequence number of event

status: str | None

The status of the message: in_progress, completed, or incomplete.

error: Error | None

response error for output

class agentscope_runtime.engine.schemas.agent_schemas.TextContent(*, sequence_number=None, object='content', status=None, error=None, type='text', index=None, delta=False, msg_id=None, text=None)[source]

Bases: Content

Parameters:
  • sequence_number (str | None)

  • object (str)

  • status (str | None)

  • error (Error | None)

  • type (Literal['text'])

  • index (int | None)

  • delta (bool | None)

  • msg_id (str | None)

  • text (str | None)

type: Literal['text']

The type of the content part.

text: str | None

The text content.

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

object: str

The identity of the content part.

index: int | None

the content index in message’s content list

delta: bool | None

Whether this content is a delta.

msg_id: str | None

message unique id

sequence_number: str | None

sequence number of event

status: str | None

The status of the message: in_progress, completed, or incomplete.

error: Error | None

response error for output

class agentscope_runtime.engine.schemas.agent_schemas.DataContent(*, sequence_number=None, object='content', status=None, error=None, type='data', index=None, delta=False, msg_id=None, data=None)[source]

Bases: Content

Parameters:
  • sequence_number (str | None)

  • object (str)

  • status (str | None)

  • error (Error | None)

  • type (Literal['data'])

  • index (int | None)

  • delta (bool | None)

  • msg_id (str | None)

  • data (Dict | None)

type: Literal['data']

The type of the content part.

data: Dict | None

The data content.

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

object: str

The identity of the content part.

index: int | None

the content index in message’s content list

delta: bool | None

Whether this content is a delta.

msg_id: str | None

message unique id

sequence_number: str | None

sequence number of event

status: str | None

The status of the message: in_progress, completed, or incomplete.

error: Error | None

response error for output

class agentscope_runtime.engine.schemas.agent_schemas.Message(*, sequence_number=None, object='message', status='created', error=None, id=<factory>, type='message', role=None, content=None, code=None, message=None, usage=None)[source]

Bases: Event

Parameters:
  • sequence_number (str | None)

  • object (str)

  • status (str)

  • error (Error | None)

  • id (str)

  • type (str)

  • role (Literal['assistant', 'system', 'user', 'tool'] | None)

  • content (List[Annotated[TextContent | ImageContent | DataContent, FieldInfo(annotation=NoneType, required=True, discriminator='type')]] | None)

  • code (str | None)

  • message (str | None)

  • usage (Dict | None)

id: str

message unique id

object: str

message identity

type: str

The type of the message.

status: str

The status of the message: in_progress, completed, or incomplete.

role: Literal['assistant', 'system', 'user', 'tool'] | None

The role of the message's author; one of 'user', 'system', 'assistant', or 'tool'.

content: List[Annotated[TextContent | ImageContent | DataContent, FieldInfo(annotation=NoneType, required=True, discriminator='type')]] | None

The contents of the message.

code: str | None

The error code of the message.

message: str | None

The error message of the message.

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

usage: Dict | None

response usage for output

sequence_number: str | None

sequence number of event

error: Error | None

response error for output

static from_openai_message(message)[source]

Create a message object from an openai message.

Parameters:

message (BaseModel | dict)

Return type:

Message

get_text_content()[source]

Extract the first text content from the message.

Returns:

First text string found in the content, or None if no text content

Return type:

str | None

get_image_content()[source]

Extract all image content (URLs or base64 data) from the message.

Returns:

List of image URLs or base64 encoded strings found in the content

Return type:

List[str]

get_audio_content()[source]

Extract all audio content (URLs or base64 data) from the message.

Returns:

List of audio URLs or base64 encoded strings found in the content

Return type:

List[str]

add_delta_content(new_content)[source]
Parameters:

new_content (TextContent | ImageContent | DataContent)

content_completed(content_index)[source]
Parameters:

content_index (int)

add_content(new_content)[source]
Parameters:

new_content (TextContent | ImageContent | DataContent)
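
Example: common Message operations: construct a message, read its text back, and convert an OpenAI-style message. The exact dict shape accepted by from_openai_message is an assumption based on the OpenAI chat format.

    from agentscope_runtime.engine.schemas.agent_schemas import Message, TextContent

    msg = Message(role="assistant", content=[TextContent(text="Hi, how can I help?")])
    print(msg.get_text_content())   # "Hi, how can I help?"
    print(msg.get_image_content())  # [] -- no image parts

    converted = Message.from_openai_message({"role": "user", "content": "What's new?"})
    print(converted.role, converted.get_text_content())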

class agentscope_runtime.engine.schemas.agent_schemas.BaseRequest(*, input, stream=True)[source]

Bases: BaseModel

agent request

Parameters:
  • input (List[Message])

  • stream (bool)

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

input: List[Message]

input messages

stream: bool

If set, partial message deltas will be sent, like in ChatGPT.

class agentscope_runtime.engine.schemas.agent_schemas.AgentRequest(*, input, stream=True, model=None, top_p=None, temperature=None, frequency_penalty=None, presence_penalty=None, max_tokens=None, stop=None, n=1, seed=None, tools=None, session_id=None, response_id=None)[source]

Bases: BaseRequest

agent request

Parameters:
  • input (List[Message])

  • stream (bool)

  • model (str | None)

  • top_p (float | None)

  • temperature (float | None)

  • frequency_penalty (float | None)

  • presence_penalty (float | None)

  • max_tokens (int | None)

  • stop (str | None | List[str])

  • n (Annotated[int | None, Ge(ge=1), Le(le=5)])

  • seed (int | None)

  • tools (List[Tool | Dict] | None)

  • session_id (str | None)

  • response_id (str | None)

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

input: List[Message]

input messages

stream: bool

If set, partial message deltas will be sent, like in ChatGPT.

model: str | None

model id

top_p: float | None

Nucleus sampling, between (0, 1.0], where the model considers the results of the tokens with top_p probability mass.

So 0.1 means only the tokens comprising the top 10% probability mass are considered.

We generally recommend altering this or temperature but not both.

temperature: float | None

What sampling temperature to use, between 0 and 2.

Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.

We generally recommend altering this or top_p but not both.

frequency_penalty: float | None

Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model’s likelihood to repeat the same line verbatim.

presence_penalty: float | None

Number between -2.0 and 2.0.

Positive values penalize new tokens based on whether they appear in the text so far, increasing the model’s likelihood to talk about new topics.

max_tokens: int | None

The maximum number of [tokens](/tokenizer) that can be generated in the chat completion.

The total length of input tokens and generated tokens is limited by the model’s context length.

stop: str | None | List[str]

Up to 4 sequences where the API will stop generating further tokens.

n: int | None

How many chat completion choices to generate for each input message.

Note that you will be charged based on the number of generated tokens across all of the choices. Keep n as 1 to minimize costs.

seed: int | None

If specified, system will make a best effort to sample deterministically, such that repeated requests with the same seed and parameters should return the same result.

tools: List[Tool | Dict] | None

tool call list

session_id: str | None

conversation id for dialog

response_id: str | None

response unique id
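
Example: a typical streaming request carrying one user message plus a few sampling options. Field values (including the model id) are illustrative.

    from agentscope_runtime.engine.schemas.agent_schemas import (
        AgentRequest,
        Message,
        TextContent,
    )

    request = AgentRequest(
        input=[Message(role="user", content=[TextContent(text="Summarize this repo")])],
        stream=True,
        model="qwen-max",      # illustrative model id
        temperature=0.2,
        max_tokens=512,
        session_id="s1",
    )
    print(request.model_dump(exclude_none=True))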

class agentscope_runtime.engine.schemas.agent_schemas.BaseResponse(*, sequence_number=None, object='response', status='created', error=None, id=<factory>, created_at=1757505168, completed_at=None, output=None, usage=None)[source]

Bases: Event

Parameters:
  • sequence_number (str | None)

  • object (str)

  • status (str)

  • error (Error | None)

  • id (str | None)

  • created_at (int)

  • completed_at (int | None)

  • output (List[Message] | None)

  • usage (Dict | None)

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

sequence_number: str | None

sequence number of event

error: Error | None

response error for output

id: str | None

response unique id

classmethod validate_id(v)[source]
object: str

response identity

status: str

response run status

created_at: int

request start time

completed_at: int | None

request completed time

output: List[Message] | None

response data for output

usage: Dict | None

response usage for output

add_new_message(message)[source]
Parameters:

message (Message)

class agentscope_runtime.engine.schemas.agent_schemas.AgentResponse(*, sequence_number=None, object='response', status='created', error=None, id=<factory>, created_at=1757505168, completed_at=None, output=None, usage=None, session_id=None)[source]

Bases: BaseResponse

agent response

Parameters:
  • sequence_number (str | None)

  • object (str)

  • status (str)

  • error (Error | None)

  • id (str | None)

  • created_at (int)

  • completed_at (int | None)

  • output (List[Message] | None)

  • usage (Dict | None)

  • session_id (str | None)

model_config: ClassVar[ConfigDict] = {}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

id: str | None

response unique id

object: str

response identity

status: str

response run status

created_at: int

request start time

completed_at: int | None

request completed time

output: List[Message] | None

response data for output

usage: Dict | None

response usage for output

sequence_number: str | None

sequence number of event

error: Error | None

response error for output

session_id: str | None

conversation id for dialog

agentscope_runtime.engine.schemas.agent_schemas.convert_to_openai_tool_call(function)[source]
Parameters:

function (FunctionCall)

agentscope_runtime.engine.schemas.agent_schemas.convert_to_openai_messages(messages)[source]

Convert a generic message protocol to a model-specific protocol.

Returns:

Message format required by the model

Return type:

list

Parameters:

messages (List[Message]) – Original list of messages.

agentscope_runtime.engine.schemas.agent_schemas.convert_to_openai_tools(tools)[source]
Parameters:

tools (List[Tool | Dict])

Return type:

list | None
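
Example: converting runtime Message objects into the OpenAI chat-completions format, e.g. before calling an OpenAI-compatible endpoint. The printed shape is indicative only.

    from agentscope_runtime.engine.schemas.agent_schemas import (
        Message,
        TextContent,
        convert_to_openai_messages,
    )

    messages = [
        Message(role="system", content=[TextContent(text="You are a helpful assistant.")]),
        Message(role="user", content=[TextContent(text="Ping?")]),
    ]
    print(convert_to_openai_messages(messages))
    # e.g. [{'role': 'system', 'content': ...}, {'role': 'user', 'content': ...}]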

class agentscope_runtime.engine.schemas.context.Context(*, user_id, session=Session(id='', user_id='', messages=[]), activate_tools=[], new_message=None, current_messages=[], request, new_message_dict=None, messages_list=[], environment_manager=None, context_manager=None, agent, agent_config=None)[source]

Bases: BaseModel

Holds all contextual information for a single agent invocation.

This object is created by the Runner and passed through the agent execution flow, providing access to necessary services and data, including a live request queue for real-time interaction.

Parameters:
  • user_id (str)

  • session (Session)

  • activate_tools (list)

  • new_message (Message | None)

  • current_messages (List[Message])

  • request (AgentRequest)

  • new_message_dict (Dict | None)

  • messages_list (List[Dict])

  • environment_manager (EnvironmentManager | None)

  • context_manager (ContextManager | None)

  • agent (Agent)

  • agent_config (dict | None)

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'extra': 'forbid'}

Configuration for the model, should be a dictionary conforming to [ConfigDict][pydantic.config.ConfigDict].

user_id: str
session: Session
activate_tools: list
new_message: Message | None
current_messages: List[Message]
request: AgentRequest
new_message_dict: Dict | None
messages_list: List[Dict]
environment_manager: EnvironmentManager | None
context_manager: ContextManager | None
agent: Agent
agent_config: dict | None
property messages