Configuration for RemoteLLMCompletionNode using LLM provider settings. Remarks: This approach creates both the node and the underlying LLM component. If provider/modelName are omitted, sensible defaults will be used. Extends: AbstractNodeProps

Interface Definition

interface RemoteLLMCompletionNodeProps {
    id?: string;
    modelName?: string;
    provider?: string;
    reportToClient?: boolean;
    stream?: boolean;
    textGenerationConfig?: Camelize<TextGenerationConfig>;
}

Properties

id (optional)

id?: string
Optional explicit node identifier. Remarks: If omitted, a stable auto-generated ID based on the class name is assigned. Inherited from: AbstractNodeProps.id

modelName (optional)

modelName?: string
Model name specific to the provider (e.g., 'gpt-4', 'claude-3-5-sonnet-20241022')

provider (optional)

provider?: string
LLM provider (e.g., 'openai', 'anthropic', 'inworld')

reportToClient (optional)

reportToClient?: boolean
Whether this node should report its outputs to the client. If set to true, you will see the output of this node in the GraphOutputStream. Inherited from: AbstractNodeProps.reportToClient

stream (optional)

stream?: boolean
Whether to stream responses incrementally instead of returning a single completed response

textGenerationConfig (optional)

textGenerationConfig?: Camelize<TextGenerationConfig>
Text generation parameters forwarded to the underlying LLM component, with keys camel-cased
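
Example

A minimal sketch of a props object for this interface. The interface below is reproduced from this page; `TextGenerationConfigLike` and its fields (`temperature`, `maxNewTokens`) are illustrative stand-ins for `Camelize<TextGenerationConfig>`, whose exact shape is not shown here, and the property values are hypothetical.

```typescript
// Simplified stand-in for Camelize<TextGenerationConfig>; the real
// type comes from the SDK and may have different fields.
interface TextGenerationConfigLike {
  temperature?: number;
  maxNewTokens?: number;
}

// Reproduced from the interface definition above.
interface RemoteLLMCompletionNodeProps {
  id?: string;
  modelName?: string;
  provider?: string;
  reportToClient?: boolean;
  stream?: boolean;
  textGenerationConfig?: TextGenerationConfigLike;
}

// Explicit provider and model; omitting both would fall back to the
// defaults mentioned in the remarks. reportToClient: true makes the
// node's output visible in the GraphOutputStream.
const props: RemoteLLMCompletionNodeProps = {
  id: 'completion-node',
  provider: 'openai',
  modelName: 'gpt-4',
  stream: true,
  reportToClient: true,
  textGenerationConfig: { temperature: 0.7, maxNewTokens: 256 },
};

console.log(props.provider); // 'openai'
```

Every field is optional, so `{}` is also a valid value: the node then uses an auto-generated ID and the default provider and model.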