Cortex Inference

Returns the LLMs available for the current session

GET /api/v2/cortex/models

Response

Code    Description
200     OK
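
A minimal request sketch for this endpoint, assuming bearer-token authentication against a Snowflake account URL; the account URL, token handling, and header names are illustrative assumptions, not taken from this page.

```python
import requests

# Illustrative placeholders -- substitute your account URL and a valid token.
ACCOUNT_URL = "https://<account_identifier>.snowflakecomputing.com"
AUTH_TOKEN = "<session-or-oauth-token>"  # assumed bearer-token auth

def list_models() -> list:
    """GET /api/v2/cortex/models -- list the LLMs available for the current session."""
    resp = requests.get(
        f"{ACCOUNT_URL}/api/v2/cortex/models",
        headers={
            "Authorization": f"Bearer {AUTH_TOKEN}",
            "Accept": "application/json",
        },
        timeout=30,
    )
    resp.raise_for_status()  # 200 OK on success, per the response table above
    return resp.json()

if __name__ == "__main__":
    print(list_models())
```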

Perform LLM text completion inference

POST /api/v2/cortex/inference:complete
Perform LLM text completion inference, similar to snowflake.cortex.Complete.

For more information

Go to the SQL command page for details about arguments, options, privilege requirements, and usage guidelines.


Response

Code    Description
200     OK
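
A companion sketch for the completion endpoint. The request body below (a model name plus a list of messages) mirrors the arguments of snowflake.cortex.Complete, but the exact field names are assumptions; check the SQL command page referenced above for the authoritative argument list.

```python
import requests

# Illustrative placeholders -- substitute your account URL and a valid token.
ACCOUNT_URL = "https://<account_identifier>.snowflakecomputing.com"
AUTH_TOKEN = "<session-or-oauth-token>"  # assumed bearer-token auth

def complete(model: str, prompt: str) -> dict:
    """POST /api/v2/cortex/inference:complete -- run a single text completion."""
    resp = requests.post(
        f"{ACCOUNT_URL}/api/v2/cortex/inference:complete",
        headers={
            "Authorization": f"Bearer {AUTH_TOKEN}",
            "Content-Type": "application/json",
            "Accept": "application/json",
        },
        # Body shape assumed from the SQL-level Complete signature; verify field
        # names against the request schema for this endpoint.
        json={"model": model, "messages": [{"content": prompt}]},
        timeout=60,
    )
    resp.raise_for_status()  # 200 OK on success
    return resp.json()

if __name__ == "__main__":
    print(complete("mistral-large", "Summarize what Snowflake Cortex does."))
```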