
Interface: LLM<AdditionalChatOptions, AdditionalMessageOptions>

Unified language model interface

Type parameters

| Name | Type |
| :--- | :--- |
| AdditionalChatOptions | extends object = object |
| AdditionalMessageOptions | extends object = object |
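
The two type parameters let a concrete implementation thread provider-specific option types through the shared interface: one for extra per-call chat options, one for extra per-message options. A minimal sketch of how they might be specialized (local, illustrative types only, not the library's definitions):

```typescript
// Hypothetical provider-specific option shapes (illustrative only).
interface MyChatOptions {
  temperature?: number; // extra knob an implementation might accept per chat call
}

interface MyMessageOptions {
  name?: string; // extra metadata an implementation might attach per message
}

// A concrete model would then declare, assuming LLM is imported from the package:
// class MyLLM implements LLM<MyChatOptions, MyMessageOptions> { ... }
```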

Hierarchy

  • LLMChat<AdditionalChatOptions>

    LLM

Implemented by

Properties

metadata

metadata: LLMMetadata

Defined in

packages/core/src/llm/types.ts:50

Methods

chat

chat(params): Promise<AsyncIterable<ChatResponseChunk<object>>>

Get a chat response from the LLM

Parameters

| Name | Type |
| :--- | :--- |
| params | LLMChatParamsStreaming<AdditionalChatOptions, AdditionalMessageOptions> |

Returns

Promise<AsyncIterable<ChatResponseChunk<object>>>

Overrides

LLMChat.chat

Defined in

packages/core/src/llm/types.ts:54
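
A minimal usage sketch of the streaming overload, assuming a concrete implementation such as OpenAI from the llamaindex package and that the streaming params carry a messages array plus stream: true; the field names below follow common LlamaIndex.TS usage and should be treated as assumptions:

```typescript
import { OpenAI } from "llamaindex"; // assumed concrete LLM implementation

const llm = new OpenAI({ model: "gpt-4o-mini" }); // model name is illustrative

// Passing stream: true selects this overload; the promise resolves to an
// AsyncIterable of ChatResponseChunk values.
const stream = await llm.chat({
  messages: [{ role: "user", content: "Say hello in one word." }],
  stream: true,
});

for await (const chunk of stream) {
  // Incremental text is assumed to arrive on the chunk's delta field.
  process.stdout.write(chunk.delta);
}
```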

chat(params): Promise<ChatResponse<AdditionalMessageOptions>>

Parameters

| Name | Type |
| :--- | :--- |
| params | LLMChatParamsNonStreaming<AdditionalChatOptions, AdditionalMessageOptions> |

Returns

Promise<ChatResponse<AdditionalMessageOptions>>

Overrides

LLMChat.chat

Defined in

packages/core/src/llm/types.ts:60
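
A corresponding sketch of the non-streaming overload, under the same assumptions (OpenAI from the llamaindex package; the reply carried on a message field):

```typescript
import { OpenAI } from "llamaindex"; // assumed concrete LLM implementation

const llm = new OpenAI({ model: "gpt-4o-mini" }); // model name is illustrative

// Without stream: true the non-streaming overload applies and the promise
// resolves to a single ChatResponse.
const response = await llm.chat({
  messages: [{ role: "user", content: "What does this interface unify?" }],
});

// The reply is assumed to be carried on response.message ({ role, content }).
console.log(response.message.content);
```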


complete

complete(params): Promise<AsyncIterable<CompletionResponse>>

Get a prompt completion from the LLM

Parameters

| Name | Type |
| :--- | :--- |
| params | LLMCompletionParamsStreaming |

Returns

Promise<AsyncIterable<CompletionResponse>>

Defined in

packages/core/src/llm/types.ts:70
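
A usage sketch of the streaming completion overload, again assuming an OpenAI implementation from the llamaindex package; the prompt, delta, and text field names are assumptions based on the signatures above:

```typescript
import { OpenAI } from "llamaindex"; // assumed concrete LLM implementation

const llm = new OpenAI({ model: "gpt-4o-mini" }); // model name is illustrative

// stream: true selects this overload; each yielded item is a CompletionResponse.
const stream = await llm.complete({
  prompt: "Complete this sentence: The quick brown fox",
  stream: true,
});

for await (const chunk of stream) {
  // The incremental text is assumed to be on delta, with text holding what
  // has been generated so far.
  process.stdout.write(chunk.delta ?? chunk.text);
}
```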

complete(params): Promise<CompletionResponse>

Parameters

| Name | Type |
| :--- | :--- |
| params | LLMCompletionParamsNonStreaming |

Returns

Promise<CompletionResponse>

Defined in

packages/core/src/llm/types.ts:73
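
Finally, a sketch of the non-streaming completion overload under the same assumptions (the completion exposed on a text field):

```typescript
import { OpenAI } from "llamaindex"; // assumed concrete LLM implementation

const llm = new OpenAI({ model: "gpt-4o-mini" }); // model name is illustrative

// Without stream: true the promise resolves to a single CompletionResponse.
const response = await llm.complete({
  prompt: "Write one sentence about type-safe LLM clients.",
});

// The full completion text is assumed to be exposed on the text field.
console.log(response.text);
```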