PutDatabricksSQL 2025.3.28.13-SNAPSHOT
BUNDLE
com.snowflake.openflow.runtime | runtime-databricks-processors-nar
DESCRIPTION
Submits a SQL execution using the Databricks REST API, then writes the JSON response to the FlowFile content. For high-performance SELECT or INSERT queries, use ExecuteSQL instead.
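Under the hood the processor targets the Databricks SQL Statement Execution API (`POST /api/2.0/sql/statements`). A minimal sketch of the request body that API accepts is shown below; the field names follow the public Databricks API documentation, but exactly how this processor assembles the payload internally is an assumption.

```python
def build_statement_request(statement, warehouse_id, catalog=None, schema=None):
    """Build a request payload for POST /api/2.0/sql/statements.

    Illustrative only: field names come from the public Databricks API docs,
    not from this processor's source.
    """
    payload = {
        "statement": statement,        # SQL text to execute
        "warehouse_id": warehouse_id,  # SQL Warehouse that runs the statement
    }
    if catalog is not None:
        payload["catalog"] = catalog   # default catalog, e.g. "main"
    if schema is not None:
        payload["schema"] = schema     # default schema, e.g. "default"
    return payload
```

For example, `build_statement_request("SELECT 1", "abc123", catalog="main")` yields a body with `statement`, `warehouse_id`, and `catalog` keys, omitting `schema` when it is not set.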
INPUT REQUIREMENT
Supports Sensitive Dynamic Properties
false
PROPERTIES
Property | Description
---|---
Databricks Client | Databricks Client Service.
Default Catalog | Default table catalog. Some SQL statements, such as `COPY INTO`, do not support using a default catalog.
Default Schema | Default table schema. Some SQL statements, such as `COPY INTO`, do not support using a default schema.
Record Writer | Specifies the Controller Service to use for writing results to a FlowFile. The Record Writer may use Inherit Schema to emulate the inferred-schema behavior: an explicit schema need not be defined in the writer, as it will be supplied by the same logic used to infer the schema from the column types.
SQL Warehouse ID | Warehouse ID used to execute SQL.
SQL Warehouse Name | SQL Warehouse name used to execute SQL; all SQL Warehouses are searched for a matching name.
Statement | SQL statement to execute.
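When SQL Warehouse Name is configured instead of SQL Warehouse ID, the name must be resolved to an ID by searching the available warehouses. A minimal sketch of such a lookup, assuming a list of warehouse records shaped like the `warehouses` array in the Databricks `GET /api/2.0/sql/warehouses` response (the helper name is hypothetical, not the processor's actual implementation):

```python
def find_warehouse_id(warehouses, name):
    """Return the id of the first warehouse whose name matches, else None.

    `warehouses` is assumed to be a list of dicts with "id" and "name" keys,
    as in the GET /api/2.0/sql/warehouses response. Illustrative only.
    """
    for warehouse in warehouses:
        if warehouse.get("name") == name:
            return warehouse.get("id")
    return None
```

Because the lookup is by display name, duplicate warehouse names would make the match ambiguous; using SQL Warehouse ID avoids that.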
RELATIONSHIPS
Name | Description
---|---
http.response | HTTP response to the SQL API request
original | The original FlowFile is routed to this relationship when processing is successful.
records | Serialized SQL records
failure | Databricks failure relationship
WRITES ATTRIBUTES
Name | Description
---|---
statement.state | The final state of the executed SQL statement
error.code | The error code for the SQL statement if an error occurred.
error.message | The error message for the SQL statement if an error occurred.
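A downstream flow can branch on the `statement.state` attribute. A hedged sketch of that decision, assuming the attribute carries a terminal state value from the Databricks API (the documented terminal states include SUCCEEDED, FAILED, CANCELED, and CLOSED); this is an illustration of consuming the attribute, not the processor's internal routing:

```python
def is_statement_success(attributes):
    """Return True when a FlowFile's statement.state attribute reports success.

    `attributes` is assumed to be a dict of FlowFile attributes as written by
    the processor; state values follow the Databricks Statement Execution API.
    """
    return attributes.get("statement.state") == "SUCCEEDED"
```

On failure, the `error.code` and `error.message` attributes carry the details needed for logging or retry decisions.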