StandardAnthropicLLMService

Description

A Controller Service that provides integration with Anthropic’s Claude AI models through their Messages API. Supports configurable parameters including model selection, response generation settings (temperature, top_p, top_k), token limits, and retry behavior.
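The service's configurable parameters correspond to fields of an Anthropic Messages API request. As an illustration only (the service itself is implemented in Java; the function name and defaults here are hypothetical), the mapping can be sketched as:

```python
import json

def build_messages_request(prompt,
                           model="claude-3-5-sonnet-latest",
                           max_tokens=1000,
                           temperature=None,
                           top_p=None,
                           top_k=None,
                           user_id=None):
    """Sketch of a Messages API request body built from the
    service's configurable parameters."""
    body = {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    # Optional sampling parameters are omitted when unset,
    # letting the API apply its own defaults.
    if temperature is not None:
        body["temperature"] = temperature
    if top_p is not None:
        body["top_p"] = top_p
    if top_k is not None:
        body["top_k"] = top_k
    if user_id is not None:
        body["metadata"] = {"user_id": user_id}
    return json.dumps(body)
```

Note that temperature, top_p, and top_k are only sent when explicitly configured; the request above reflects the properties listed below, not the service's actual serialization code.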

Tags

ai, anthropic, api, claude, language model, llm, openflow

Properties

In the table below, required properties are marked with an asterisk (*). All other properties are optional. The table also indicates any default values and whether a property supports the NiFi Expression Language.

| Display Name | API Name | Default Value | Allowable Values | Description |
|---|---|---|---|---|
| Anthropic API Key * | Anthropic API Key | | | The API Key for authenticating to Anthropic. |
| Backoff Base Delay (ms) * | Backoff Base Delay (ms) | 1000 | | The base delay in milliseconds for exponential backoff between retries. |
| Max Response Tokens * | Max Response Tokens | 1000 | | The maximum number of tokens to generate in the response. |
| Max Retries * | Max Retries | 3 | | The maximum number of retry attempts for API calls. |
| Model Name * | Model Name | claude-3-5-sonnet-latest | | The name of the Anthropic model. |
| Temperature | Temperature | | | The temperature to use for generating the response. |
| Top K | Top K | | | The top K value to use for generating the response. Only sample from the top K options for each subsequent token. Recommended for advanced use cases only; you usually only need to use temperature. |
| Top P | Top P | | | The top_p value for nucleus sampling, which controls the diversity of the generated responses. |
| User ID | User ID | | | The user id to set in the request metadata. |
| Web Client Service * | Web Client Service | | | The Web Client Service to use for communicating with the LLM provider. |
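The Max Retries and Backoff Base Delay (ms) properties together govern retry behavior: with a base delay of 1000 ms, retries wait roughly 1000 ms, 2000 ms, 4000 ms, and so on. A minimal sketch of that behavior (the function and its signature are illustrative, not the service's actual code):

```python
import time

def call_with_retries(send_request, max_retries=3, base_delay_ms=1000):
    """Invoke send_request, retrying on failure with exponential backoff.

    Sketch of the behavior configured by the Max Retries and
    Backoff Base Delay (ms) properties.
    """
    for attempt in range(max_retries + 1):
        try:
            return send_request()
        except Exception:
            if attempt == max_retries:
                # Retries exhausted; surface the failure to the caller.
                raise
            # Delay doubles with each successive retry attempt.
            time.sleep(base_delay_ms * (2 ** attempt) / 1000.0)
```

Whether the actual service adds jitter or caps the maximum delay is not documented here; treat the doubling schedule as the general shape of the backoff, not a guarantee.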

State management

This component does not store state.

Restricted

This component is not restricted.

System Resource Considerations

This component does not specify system resource considerations.