Cortex Inference

Returns the LLMs available for the current session

GET /api/v2/cortex/models

Returns the LLMs available for the current session.

Response

Code | Description
200 | OK response object

Error response codes

Response body schema

Perform LLM text completion inference

POST /api/v2/cortex/inference:complete

Perform LLM text completion inference, similar to snowflake.cortex.Complete.

Response

Code | Description
200 | OK response object

Error response codes

Response body schema
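The following is a minimal sketch of calling both endpoints with Python's requests library. Only the endpoint paths and descriptions above come from this reference; the account URL, bearer-token authentication, and the request-body fields used for the completion call (model, messages) are illustrative assumptions and should be checked against the full API documentation.

```python
# Sketch only: endpoint paths are from this reference; auth scheme and body
# fields are assumptions, not a definitive client implementation.
import requests

ACCOUNT_URL = "https://<account_identifier>.snowflakecomputing.com"  # placeholder
TOKEN = "<bearer-token>"  # placeholder: e.g. an OAuth or keypair JWT token

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Content-Type": "application/json",
    "Accept": "application/json",
}

# GET /api/v2/cortex/models: list the LLMs available for the current session.
models_resp = requests.get(f"{ACCOUNT_URL}/api/v2/cortex/models", headers=headers)
models_resp.raise_for_status()
print(models_resp.json())

# POST /api/v2/cortex/inference:complete: LLM text completion inference,
# similar to snowflake.cortex.Complete. The payload fields below are assumed.
payload = {
    "model": "<model-name>",  # assumed: one of the models returned by /models
    "messages": [{"content": "Summarize the benefits of columnar storage."}],
}
complete_resp = requests.post(
    f"{ACCOUNT_URL}/api/v2/cortex/inference:complete",
    headers=headers,
    json=payload,
)
complete_resp.raise_for_status()
print(complete_resp.text)
```

On a successful call, both requests return a 200 OK response object; other outcomes are covered by the error response codes and the response body schema referenced above.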