SnowConvert AI - BigQuery Issues

Note

Conversion Scope

SnowConvert AI for Google BigQuery currently supports assessment and translation for TABLES and VIEWS. Although SnowConvert AI can recognize other types of statements, they are not fully supported.

This page provides a comprehensive reference for how SnowConvert AI translates Google BigQuery grammar elements to Snowflake equivalents. In this translation reference, you will find code examples, functional equivalence results, key differences, recommendations, known issues, and descriptions of each transformation.

SSC-EWI-BQ0001

Snowflake does not support the OPTIONS clause.

Warning

This EWI is deprecated; please refer to SSC-EWI-0016 for the latest version of this EWI.

Severity

Medium

Description

This EWI is added to DDL statements when the OPTIONS clause contains options that are not supported by Snowflake.

Code Example

Input Code:

BigQuery
 CREATE VIEW my_view
OPTIONS (
  expiration_timestamp=TIMESTAMP "2026-01-01 00:00:00 UTC",
  privacy_policy='{"aggregation_threshold_policy": {"threshold": 50, "privacy_unit_columns": "ID"}}'
) AS
SELECT column1, column2
FROM my_table;

Output Code:

Snowflake
 CREATE VIEW my_view
!!!RESOLVE EWI!!! /*** SSC-EWI-BQ0001 - SNOWFLAKE DOES NOT SUPPORT THE OPTIONS: EXPIRATION_TIMESTAMP, PRIVACY_POLICY ***/!!!
OPTIONS(
  expiration_timestamp=TIMESTAMP "2026-01-01 00:00:00 UTC",
  privacy_policy='{"aggregation_threshold_policy": {"threshold": 50, "privacy_unit_columns": "ID"}}'
)
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": {  "major": 0,  "minor": 0,  "patch": "0" }, "attributes": {  "component": "bigquery",  "convertedOn": "03/26/2025",  "domain": "test" }}'
AS
SELECT column1, column2
FROM
  my_table;
Recommendations
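
The most common manual fix is to remove the unsupported OPTIONS clause. The sketch below, applied to the example above, also shows one way to approximate expiration_timestamp with a scheduled task; the task name, warehouse, and schedule are illustrative, and tasks are created suspended (resume them with ALTER TASK … RESUME).

```sql
-- Manual fix sketch: drop the unsupported OPTIONS clause from the view.
CREATE VIEW my_view
AS
SELECT column1, column2
FROM my_table;

-- If the expiration_timestamp behavior is still needed, one option is a
-- scheduled task that drops the view at the intended expiration date
-- (task name, warehouse, and CRON schedule below are illustrative).
CREATE TASK drop_my_view_task
  WAREHOUSE = my_warehouse
  SCHEDULE = 'USING CRON 0 0 1 1 * UTC'
AS
  DROP VIEW IF EXISTS my_view;
```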

SSC-EWI-BQ0002

Micro-partitioning is automatically performed on all Snowflake tables.

Note

This issue is deprecated and is no longer generated by SnowConvert AI.

Severity

Medium

Description

This warning is added to CREATE TABLE statements when the PARTITION BY clause is present. PARTITION BY is an optional clause that controls table partitioning (https://cloud.google.com/bigquery/docs/partitioned-tables) but is not supported in Snowflake.

All data in Snowflake tables is automatically divided into micro-partitions, which are contiguous units of storage. Each micro-partition contains between 50 MB and 500 MB of uncompressed data. This size and structure allows for extremely granular pruning of very large tables, which can be comprised of millions, or even hundreds of millions, of micro-partitions.

Snowflake stores metadata about all rows stored in a micro-partition, including:

  • The range of values for each of the columns in the micro-partition.

  • The number of distinct values.

  • Additional properties used for both optimization and efficient query processing.

Tables are also transparently partitioned using the ordering of the data as it is inserted/loaded. For more information, refer to Benefits of Micro-partitioning.

Code Example

Input Code:
BigQuery
 CREATE TABLE table1(
    transaction_id INT, 
    transaction_date DATE
)
PARTITION BY transaction_date;
Generated Code:
Snowflake
 CREATE TABLE table1 (
    transaction_id INT,
  transaction_date DATE
)
!!!RESOLVE EWI!!! /*** SSC-EWI-BQ0002 - MICRO-PARTITIONING IS AUTOMATICALLY PERFORMED ON ALL SNOWFLAKE TABLES. ***/!!!
PARTITION BY transaction_date
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": {  "major": 0,  "minor": 0,  "patch": "0" }, "attributes": {  "component": "bigquery",  "convertedOn": "04/09/2025",  "domain": "test" }}';

Recommendations
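
No action is usually required, since micro-partitioning is automatic. However, if queries frequently prune on the former partitioning column of a very large table, a clustering key can serve a similar purpose. A hedged sketch applied to the example above (evaluate clustering cost and benefit before adopting it):

```sql
-- Optional: for very large tables, a clustering key on the former
-- partitioning column can improve pruning.
CREATE TABLE table1 (
    transaction_id INT,
    transaction_date DATE
)
CLUSTER BY (transaction_date);
```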

SSC-EWI-BQ0003

Pending SnowConvert AI translation for differential privacy.

Severity

Medium

Description

BigQuery allows applying differential privacy (https://cloud.google.com/bigquery/docs/differential-privacy#what_is_differential_privacy) over some statistical functions to introduce noise in the data, making it difficult to extract information about individuals when analyzing query results.

Snowflake now supports differential privacy natively. However, SnowConvert AI has not yet implemented the translation for this feature. Any use of differential privacy in BigQuery will be commented out and this issue will be generated to flag the need for manual conversion.

Code Example

Input Code:
BigQuery
 SELECT
  WITH DIFFERENTIAL_PRIVACY
    OPTIONS(epsilon=10, delta=.01, max_groups_contributed=2, privacy_unit_column=id)
    item,
    COUNT(quantity, contribution_bounds_per_group => (0,100)) total_quantity
FROM professors
GROUP BY item;
Generated Code:
Snowflake
 SELECT
!!!RESOLVE EWI!!! /*** SSC-EWI-BQ0003 - PENDING SNOWCONVERT AI TRANSLATION FOR DIFFERENTIAL PRIVACY. ***/!!!
  WITH DIFFERENTIAL_PRIVACY
    OPTIONS(epsilon=10, delta=.01, max_groups_contributed=2, privacy_unit_column=id)
    item,
    COUNT(quantity,
                    !!!RESOLVE EWI!!! /*** SSC-EWI-BQ0003 - PENDING SNOWCONVERT AI TRANSLATION FOR DIFFERENTIAL PRIVACY. ***/!!! contribution_bounds_per_group => (0,100)) total_quantity
FROM
  professors
GROUP BY item;

Recommendations

  1. Use native Snowflake support: Snowflake now supports differential privacy natively. Rewrite the BigQuery differential privacy syntax using Snowflake’s privacy policies and privacy budgets.

  2. Key differences: Snowflake’s differential privacy implementation uses privacy policies assigned to tables/views, privacy budgets to manage analyst queries, and privacy domains for fact and dimension columns. The syntax differs from BigQuery’s inline WITH DIFFERENTIAL_PRIVACY clause.

  3. Further reading: Snowflake Differential Privacy Overview

SSC-EWI-BQ0004

Snowflake does not support named windows.

Severity

Medium

Description

BigQuery allows the definition and use of named windows in window functions. Named windows are defined in the WINDOW clause of the query in which they are used and can be referenced inside the OVER clause of those functions.

Snowflake does not support declaring named windows. Consider taking the window definition and applying it directly to every OVER clause that references the window.

Code Example

Input Code:
BigQuery
 SELECT 
    COUNT(col1) OVER(myWindow)
FROM 
    test.exampleTable
WINDOW 
    myWindow AS (ORDER BY col2);
Generated Code:
Snowflake
 SELECT
    COUNT(col1)
    !!!RESOLVE EWI!!! /*** SSC-EWI-BQ0004 - SNOWFLAKE DOES NOT SUPPORT NAMED WINDOWS. ***/!!! OVER(myWindow)
FROM
    test.exampleTable
!!!RESOLVE EWI!!! /*** SSC-EWI-BQ0004 - SNOWFLAKE DOES NOT SUPPORT NAMED WINDOWS. ***/!!!
WINDOW
    myWindow AS (ORDER BY col2);

Recommendations

  • Review your named window definitions; it might be possible to take each definition and apply it directly to the OVER clause of the functions that use it. However, keep in mind that the functional differences between BigQuery and Snowflake window frames still apply. Take the following case as an example:

BigQuery:

 SELECT 
    COUNT(col1) OVER(myWindow)
FROM 
    test.exampleTable
WINDOW 
    myWindow AS (ORDER BY col2);

Snowflake:

 SELECT 
    COUNT(col1) OVER(ORDER BY col2)
FROM 
    test.exampleTable;

These two queries produce the same rows, but the Snowflake results will not be ordered. This is because the ORDER BY clause of a window frame does not affect the ordering of the entire query as it does in BigQuery.

SSC-EWI-BQ0005

Javascript code has not been validated by SnowConvert AI.

Severity

High

Description

SnowConvert AI does not transform JavaScript code. Because the JavaScript code extracted from BigQuery functions is copied unchanged, it might need manual adjustments to work on Snowflake.

Code Example

Input Code:
BigQuery
 CREATE FUNCTION test.languageJs (x integer, y integer)
RETURNS integer
LANGUAGE js
AS "return x * y;";
Generated Code:
Snowflake
 CREATE FUNCTION test.languageJs (x integer, y integer)
RETURNS DOUBLE
LANGUAGE JAVASCRIPT
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": {  "major": 0,  "minor": 0,  "patch": "0" }, "attributes": {  "component": "bigquery",  "convertedOn": "07/16/2025",  "domain": "no-domain-provided" }}'
!!!RESOLVE EWI!!! /*** SSC-EWI-BQ0005 - JAVASCRIPT CODE HAS NOT BEEN VALIDATED BY SNOWCONVERT AI. ***/!!!
AS
$$
return x * y;
$$;

Recommendations
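
One adjustment that is almost always needed: in Snowflake JavaScript UDFs, argument names are case-sensitive and, unless the function is created with quoted identifiers, are exposed to the JavaScript body in uppercase. The migrated body above references x and y and would fail at runtime; a sketch of the fix:

```sql
CREATE FUNCTION test.languageJs (x integer, y integer)
RETURNS DOUBLE
LANGUAGE JAVASCRIPT
AS
$$
  // Snowflake exposes unquoted argument names to JavaScript in uppercase.
  return X * Y;
$$;
```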

SSC-EWI-BQ0006

Oriented parameter in the ST_GEOGFROMTEXT function is not supported in Snowflake.

Severity

Low

Description

This warning is added when the oriented parameter is specified in the ST_GEOGFROMTEXT (https://cloud.google.com/bigquery/docs/reference/standard-sql/geography_functions#st_geogfromtext) function, because it is not supported in Snowflake. If this parameter is set to TRUE, any polygon in the input is assumed to be oriented as follows: if someone walks along the polygon boundary in the order of the input vertices, the interior of the polygon is to the left. This allows WKT to represent polygons larger than a hemisphere. If oriented is FALSE or omitted, this function returns the polygon with the smallest area.

Code Example

Input Code:
BigQuery
 SELECT ST_GEOGFROMTEXT('POLYGON((0 0, 1 0, 1 1, 0 1, 0 0))', TRUE);
Generated Code:
Snowflake
 SELECT
!!!RESOLVE EWI!!! /*** SSC-EWI-BQ0006 - ORIENTED PARAMETER IN THE ST_GEOGFROMTEXT FUNCTION IS NOT SUPPORTED IN SNOWFLAKE. ***/!!!
 ST_GEOGFROMTEXT('POLYGON((0 0, 1 0, 1 1, 0 1, 0 0))');

Recommendations

  1. Review polygon orientation: If the oriented parameter was set to TRUE, verify that the polygon does not span more than a hemisphere. Snowflake’s ST_GEOGFROMTEXT always returns the polygon with the smallest area.

  2. Manual validation: For polygons larger than a hemisphere, consider splitting them into smaller polygons or using alternative geospatial representations.

  3. Remove the parameter: After manual review, remove the oriented parameter from the function call, as Snowflake’s ST_GEOGFROMTEXT accepts only the WKT string argument.

SSC-EWI-BQ0007

Escape Sequence is not valid in Snowflake.

Severity

Low

Description

The bell character (\a) and vertical tab character (\v) are valid escape sequences in BigQuery, but not in Snowflake.

This warning is added when a bell character or vertical character escape sequence is found when translating BigQuery code. For more information, see BigQuery Escape Sequences (https://cloud.google.com/bigquery/docs/reference/standard-sql/lexical#escape_sequences).

Code Example

Input Code:
BigQuery
 SELECT "\a";
SELECT "\v";
Generated Code:
Snowflake
 SELECT
    !!!RESOLVE EWI!!! /*** SSC-EWI-BQ0007 - ESCAPE SEQUENCE \a IS NOT VALID IN SNOWFLAKE. ***/!!!
    '\a';
SELECT
    !!!RESOLVE EWI!!! /*** SSC-EWI-BQ0007 - ESCAPE SEQUENCE \v IS NOT VALID IN SNOWFLAKE. ***/!!!
    '\v';

Recommendations

  1. Replace with Unicode escapes: Replace \a (bell character, U+0007) with \x07 and \v (vertical tab, U+000B) with \x0B, which are supported by Snowflake.

  2. Review usage: If the escape sequence was used for formatting purposes, consider whether it is still needed in the Snowflake context.
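
Applying recommendation 1 to the generated code above gives the following sketch; Snowflake supports two-digit hex escapes of the form \xhh in string literals:

```sql
-- \x07 is the bell character (U+0007); \x0B is the vertical tab (U+000B).
SELECT '\x07';
SELECT '\x0B';
```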

SSC-EWI-BQ0008

Eight hex digit Unicode escape sequence is not supported in Snowflake.

Severity

Low

Description

BigQuery supports Unicode escape sequences of eight hex digits. Snowflake does not support this kind of Unicode sequence.

This warning is added when an eight-hex-digit Unicode sequence is found while translating BigQuery code. More about BigQuery Escape Sequences (https://cloud.google.com/bigquery/docs/reference/standard-sql/lexical#escape_sequences).

Code Example

Input Code:
BigQuery
 SELECT "\U00100000";
Generated Code:
Snowflake
 SELECT
    !!!RESOLVE EWI!!! /*** SSC-EWI-BQ0008 - EIGHT HEX DIGIT UNICODE ESCAPE SEQUENCE IS NOT SUPPORTED IN SNOWFLAKE ***/!!!
    '\U00100000';

Recommendations

  1. Use surrogate pairs: Convert the 8-hex-digit Unicode sequence into two 4-hex-digit surrogate pair sequences. For example, \U00100000 can be represented using surrogate pairs \uDBC0\uDC00.

  2. Use CHR function: Alternatively, use Snowflake’s CHR function with the Unicode code point to generate the character at runtime.
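
Both recommendations applied to the example above. For reference, the surrogate pair for a code point cp above U+FFFF is high = 0xD800 + ((cp − 0x10000) >> 10) and low = 0xDC00 + ((cp − 0x10000) & 0x3FF), which for U+100000 gives 0xDBC0 and 0xDC00:

```sql
-- Option 1: surrogate pair for U+100000.
SELECT '\uDBC0\uDC00';

-- Option 2: generate the character at runtime from the decimal code point
-- (0x100000 = 1048576).
SELECT CHR(1048576);
```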

SSC-EWI-BQ0009

SnowConvert AI was unable to generate the correct return table clause. Missing symbol information.

Severity

High

Description

Snowflake requires a valid RETURNS TABLE clause for CREATE TABLE FUNCTION statements, so SnowConvert AI has to build a new one from the ground up. To do this, SnowConvert AI analyzes the CREATE TABLE FUNCTION query to infer the types of the columns of the resulting table; however, there are scenarios where SnowConvert AI is currently unable to build the return clause properly.

These scenarios will be considered in the future, but in the meantime this error will be added.

Code Example

Input Code:
BigQuery
 CREATE TABLE FUNCTION tableValueFunction2()
AS
SELECT *
REPLACE("John" AS employee_name)
FROM employees;
Generated Code:
Snowflake
 CREATE FUNCTION tableValueFunction2 ()
!!!RESOLVE EWI!!! /*** SSC-EWI-BQ0009 - SNOWCONVERT AI WAS UNABLE TO GENERATE THE CORRECT RETURN TABLE CLAUSE. MISSING SYMBOL INFORMATION. ***/!!!
RETURNS TABLE (
)
AS
  $$
      SELECT
        * REPLACE("John" AS employee_name) !!!RESOLVE EWI!!! /*** SSC-EWI-0073 - PENDING FUNCTIONAL EQUIVALENCE REVIEW FOR 'ExceptReplaceOperator' NODE ***/!!!
      FROM
        employees
  $$;

Recommendations

  1. Manually define the RETURNS TABLE clause: Inspect the original BigQuery TABLE FUNCTION body to determine the column names and types of the result set, then populate the empty RETURNS TABLE() clause with the correct column definitions.

  2. Provide source references: If the issue is caused by missing references, ensure all referenced tables and views are included in the input provided to SnowConvert AI.
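
A sketch of recommendation 1 applied to the example above. The column list assumes employees has columns employee_id INT and employee_name STRING; both names and types are hypothetical and must be replaced with the actual schema:

```sql
-- Sketch only: the columns below assume employees(employee_id, employee_name);
-- replace them with the actual columns and types of the result set.
-- The SELECT * REPLACE(...) from the source is expanded manually, with
-- employee_name replaced by the literal 'John'.
CREATE FUNCTION tableValueFunction2 ()
RETURNS TABLE (
  employee_id INT,
  employee_name STRING
)
AS
$$
  SELECT
    employee_id,
    'John' AS employee_name
  FROM
    employees
$$;
```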

SSC-EWI-BQ0011

Session variable assignment of complex datatype is not supported in Snowflake

Severity

Medium

Description

In BigQuery, declaring a variable at script level allows it to be used throughout the entire script. To replicate this behavior in Snowflake, SQL session variables are used.

However, declaring variables of complex data types such as ARRAY, GEOGRAPHY, STRUCT, or JSON will fail in Snowflake when the value is assigned to the SQL variable. When SnowConvert AI detects such a case, this EWI is added to the SQL variable declaration.

Variables of these types can be declared without problems inside block statements and other procedural statements; this EWI applies only to variables declared at script level.

Code Example

Input Code:
BigQuery
 CREATE TABLE test.JsonTable
(
    col1 JSON
);

DECLARE myVar1 JSON DEFAULT JSON'{"name": "John", "age": 30}';

INSERT INTO test.JsonTable VALUES (myVar1);

BEGIN
    DECLARE myVar2 JSON DEFAULT JSON'{"name": "Mike", "age": 27}';
    INSERT INTO test.JsonTable VALUES (myVar2);
END;

SELECT col1 FROM test.JsonTable;
Generated Code:
Snowflake
 CREATE TABLE test.JsonTable
(
    col1 VARIANT
)
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": {  "major": 0,  "minor": 0,  "patch": "0" }, "attributes": {  "component": "bigquery",  "convertedOn": "07/02/2025",  "domain": "no-domain-provided" }}';

!!!RESOLVE EWI!!! /*** SSC-EWI-BQ0011 - SESSION VARIABLE ASSIGNMENT OF COMPLEX DATATYPE IS NOT SUPPORTED IN SNOWFLAKE ***/!!!
SET myVar1 = (
    SELECT
        PARSE_JSON('{"name": "John", "age": 30}')
);

INSERT INTO test.JsonTable
VALUES ($myVar1);

BEGIN
    LET myVar2 VARIANT DEFAULT PARSE_JSON('{"name": "Mike", "age": 27}');
    INSERT INTO test.JsonTable
    VALUES (:myVar2);
END;

SELECT col1 FROM
    test.JsonTable;

Recommendations

  • If the uses of the variable are limited to a single scope, or its value is never modified, consider declaring the variable locally in the scopes that use it; that will solve the issue.

  • If you need more support, you can email us at snowconvert-support@snowflake.com
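
Applied to the example above, moving the script-level myVar1 into the block that uses it avoids the session-variable assignment entirely. A sketch (adjust if the variable is used across multiple scopes):

```sql
-- Sketch: declare the variable locally in the scope that uses it,
-- mirroring the LET pattern the converter already generates for myVar2.
BEGIN
    LET myVar1 VARIANT DEFAULT PARSE_JSON('{"name": "John", "age": 30}');
    INSERT INTO test.JsonTable VALUES (:myVar1);
END;
```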

SSC-EWI-BQ0012

SnowConvert AI was unable to generate a correct OBJECT_CONSTRUCT parameter. Missing symbol information.

Severity

High

Description

SnowConvert AI was unable to generate a correct OBJECT_CONSTRUCT parameter due to missing symbol information. This typically occurs when the table definition is not included in the input provided to SnowConvert AI, or when the table uses complex types (such as STRUCT) whose field names are needed to build the OBJECT_CONSTRUCT call.

Code Example

Input Code:
BigQuery
 INSERT INTO test.tuple_sample
VALUES
  ((12, 34)),
  ((56, 78)),
  ((9, 99)),
  ((12, 35));
Generated Code:
Snowflake
 INSERT INTO test.tuple_sample
!!!RESOLVE EWI!!! /*** SSC-EWI-BQ0012 - SNOWCONVERT AI WAS UNABLE TO GENERATE A CORRECT OBJECT_CONSTRUCT PARAMETER. MISSING SYMBOL INFORMATION. ***/!!!
VALUES
  ((12, 34)),
  ((56, 78)),
  ((9, 99)),
  ((12, 35));

Recommendations

  1. Provide table definitions: Ensure all referenced table definitions (CREATE TABLE statements) are included in the input provided to SnowConvert AI so that symbol information can be collected.

  2. Manual replacement: Inspect the original BigQuery INSERT statement and manually construct the OBJECT_CONSTRUCT call with the correct field names and values matching the target table’s schema.
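
A sketch of recommendation 2 applied to the example above, assuming tuple_sample has a single STRUCT column with two fields; the field names 'a' and 'b' are placeholders that must be replaced with the real field names from the target table's schema:

```sql
-- Sketch: 'a' and 'b' are hypothetical field names. INSERT ... SELECT is
-- used because Snowflake does not allow OBJECT_CONSTRUCT calls inside a
-- VALUES clause.
INSERT INTO test.tuple_sample
SELECT OBJECT_CONSTRUCT('a', 12, 'b', 34)
UNION ALL
SELECT OBJECT_CONSTRUCT('a', 56, 'b', 78)
UNION ALL
SELECT OBJECT_CONSTRUCT('a', 9, 'b', 99)
UNION ALL
SELECT OBJECT_CONSTRUCT('a', 12, 'b', 35);
```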

SSC-EWI-BQ0013

External table data format not supported in Snowflake

Warning

This EWI is deprecated; please refer to SSC-EWI-0029 for the latest version of this EWI.

Severity

Medium

Description

Snowflake supports the following BigQuery formats:

BigQuery                        Snowflake
AVRO                            AVRO
CSV, GOOGLE_SHEETS              CSV
NEWLINE_DELIMITED_JSON, JSON    JSON
ORC                             ORC
PARQUET                         PARQUET

When an external table uses a FORMAT not listed in the table above, this EWI is generated to inform the user that the format is not supported.

Code Example

Input Code:
BigQuery
 CREATE OR REPLACE EXTERNAL TABLE test.backup_restore_table
OPTIONS (
  format = 'DATASTORE_BACKUP',
  uris = ['gs://backup_bucket/backup_folder/*']
);
Generated Code:
Snowflake
 !!!RESOLVE EWI!!! /*** SSC-EWI-BQ0013 - EXTERNAL TABLE DATA FORMAT NOT SUPPORTED IN SNOWFLAKE ***/!!!
CREATE OR REPLACE EXTERNAL TABLE test.backup_restore_table
OPTIONS (
  format = 'DATASTORE_BACKUP',
  uris = ['gs://backup_bucket/backup_folder/*']
);

Recommendations

SSC-EWI-BQ0014

Hive partitioned external tables are not supported in Snowflake

Severity

Medium

Description

Snowflake does not support Hive partitioned external tables. When the WITH PARTITION COLUMNS clause is found in an external table, it is marked as not supported using this EWI.

Code Example

Input Code:
BigQuery
 CREATE EXTERNAL TABLE test.CustomHivePartitionedTable
WITH PARTITION COLUMNS (
  field_1 STRING,
  field_2 INT64)
OPTIONS (
  uris = ['gs://sc_external_table_bucket/folder_with_parquet/*'],
  format = 'PARQUET',
  hive_partition_uri_prefix = 'gs://sc_external_table_bucket/folder_with_parquet',
  require_hive_partition_filter = false);
Generated Code:
Snowflake
 CREATE OR REPLACE TEMPORARY FILE FORMAT SC_TEST_CUSTOMHIVEPARTITIONEDTABLE_FORMAT
TYPE = PARQUET;

CREATE EXTERNAL TABLE test.CustomHivePartitionedTable USING TEMPLATE (
SELECT
  ARRAY_AGG(OBJECT_CONSTRUCT('COLUMN_NAME', COLUMN_NAME, 'TYPE', TYPE, 'NULLABLE', NULLABLE, 'EXPRESSION', EXPRESSION))
FROM
  --** SSC-FDM-0035 - THE INFER_SCHEMA FUNCTION REQUIRES A FILE PATH WITHOUT WILDCARDS TO GENERATE THE TABLE TEMPLATE, REPLACE THE FILE_PATH PLACEHOLDER WITH IT **
  TABLE(INFER_SCHEMA(LOCATION => '@EXTERNAL_STAGE/FILE_PATH', FILE_FORMAT => 'SC_TEST_CUSTOMHIVEPARTITIONEDTABLE_FORMAT'))
)
!!!RESOLVE EWI!!! /*** SSC-EWI-0032 - EXTERNAL TABLE REQUIRES AN EXTERNAL STAGE TO ACCESS gs://sc_external_table_bucket, DEFINE AND REPLACE THE EXTERNAL_STAGE PLACEHOLDER ***/!!!
LOCATION = @EXTERNAL_STAGE
AUTO_REFRESH = false
!!!RESOLVE EWI!!! /*** SSC-EWI-BQ0014 - HIVE PARTITIONED EXTERNAL TABLES ARE NOT SUPPORTED IN SNOWFLAKE ***/!!!
WITH PARTITION COLUMNS (
  field_1 STRING,
  field_2 INT64)
PATTERN = 'folder_with_parquet/.*'
FILE_FORMAT = (TYPE = PARQUET)
!!!RESOLVE EWI!!! /*** SSC-EWI-0016 - SNOWFLAKE DOES NOT SUPPORT THE OPTIONS: HIVE_PARTITION_URI_PREFIX, REQUIRE_HIVE_PARTITION_FILTER. ***/!!!
OPTIONS(
  hive_partition_uri_prefix = 'gs://sc_external_table_bucket/folder_with_parquet',
  require_hive_partition_filter = false
)
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": {  "major": 0,  "minor": 0,  "patch": "0" }, "attributes": {  "component": "bigquery",  "convertedOn": "07/02/2025",  "domain": "no-domain-provided" }}';

Recommendations

  1. Remove the WITH PARTITION COLUMNS clause: Snowflake external tables use automatic partitioning based on the file path. Remove the WITH PARTITION COLUMNS clause from the generated code.

  2. Use Snowflake partitioning: Define partition columns using expressions in the external table’s column definitions. Snowflake can automatically infer partition columns from the directory structure.

  3. Hive metastore integration: If you use a Hive metastore, consider integrating it with Snowflake to synchronize external table metadata automatically.
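
A sketch of recommendation 2 applied to the example above: partition columns are derived from the file path using METADATA$FILENAME and declared in PARTITION BY. The SPLIT_PART positions are illustrative and depend on the actual Hive-style directory layout:

```sql
-- Sketch: partition columns derived from the file path. Adjust the
-- SPLIT_PART positions to match the actual directory layout, e.g.
-- folder_with_parquet/field_1=abc/field_2=42/file.parquet.
CREATE EXTERNAL TABLE test.CustomHivePartitionedTable (
  field_1 STRING AS (SPLIT_PART(SPLIT_PART(METADATA$FILENAME, '/', 2), '=', 2)),
  field_2 INT AS (CAST(SPLIT_PART(SPLIT_PART(METADATA$FILENAME, '/', 3), '=', 2) AS INT))
)
PARTITION BY (field_1, field_2)
LOCATION = @EXTERNAL_STAGE
FILE_FORMAT = (TYPE = PARQUET);
```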

SSC-EWI-BQ0015

External table requires an external stage to access an external location, define and replace the EXTERNAL_STAGE placeholder

Warning

This EWI is deprecated; please refer to SSC-EWI-0032 for the latest version of this EWI.

Description

When transforming the CREATE EXTERNAL TABLE statement, SnowConvert AI generates an EXTERNAL_STAGE placeholder that must be replaced with the external stage created to connect Snowflake to the external location.

Please refer to the following guides to set up the necessary Storage Integration and External Stage in your Snowflake account:

Code Example

Input Code:
BigQuery
 CREATE OR REPLACE EXTERNAL TABLE test.Employees_test
(
  Employee_id INTEGER,
  Name STRING,
  Mail STRING,
  Position STRING,
  Salary INTEGER
)
OPTIONS(
  FORMAT='CSV',
  SKIP_LEADING_ROWS=1,
  URIS=['gs://sc_external_table_bucket/folder_with_csv/Employees.csv']
);
Generated Code:
Snowflake
CREATE OR REPLACE EXTERNAL TABLE test.Employees_test
(
  Employee_id INTEGER AS CAST(GET_IGNORE_CASE($1, 'c1') AS INTEGER),
  Name STRING AS CAST(GET_IGNORE_CASE($1, 'c2') AS STRING),
  Mail STRING AS CAST(GET_IGNORE_CASE($1, 'c3') AS STRING),
  Position STRING AS CAST(GET_IGNORE_CASE($1, 'c4') AS STRING),
  Salary INTEGER AS CAST(GET_IGNORE_CASE($1, 'c5') AS INTEGER)
)
!!!RESOLVE EWI!!! /*** SSC-EWI-BQ0015 - EXTERNAL TABLE REQUIRES AN EXTERNAL STAGE TO ACCESS gs://sc_external_table_bucket, DEFINE AND REPLACE THE EXTERNAL_STAGE PLACEHOLDER ***/!!!
LOCATION = @EXTERNAL_STAGE
AUTO_REFRESH = false
PATTERN = 'folder_with_csv/Employees.csv'
FILE_FORMAT = (TYPE = CSV SKIP_HEADER =1);

Recommendations

  • Set up your external connection in the Snowflake account and replace the EXTERNAL_STAGE placeholder to complete the transformation.

  • If you need more support, you can email us at snowconvert-support@snowflake.com
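
A hedged sketch of the stage setup for the GCS bucket in the example above; the integration and stage names are illustrative, and granting Snowflake access to the bucket on the GCP side is described in the Snowflake documentation:

```sql
-- Sketch: create a storage integration and a stage for the GCS bucket,
-- then use the stage name in place of the EXTERNAL_STAGE placeholder.
CREATE STORAGE INTEGRATION gcs_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'GCS'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://sc_external_table_bucket/');

CREATE STAGE my_external_stage
  URL = 'gcs://sc_external_table_bucket/'
  STORAGE_INTEGRATION = gcs_int;
```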

SSC-EWI-BQ0016

Select * with multiple UNNEST operators will produce column ambiguity

Warning

This EWI is deprecated; please refer to SSC-FDM-0012 for the latest version of this issue.

Severity

Medium

Description

As part of the SnowConvert AI transformation for the UNNEST operator, the FLATTEN function is used. This function generates several columns that are not required to emulate the UNNEST operator functionality, such as the THIS and PATH columns.

When a SELECT * with the UNNEST operator is found, SnowConvert AI removes the unnecessary columns using the EXCLUDE keyword. However, when multiple UNNEST operators are used in the same statement, the columns cannot be removed due to ambiguity problems, and this EWI is generated to mark these cases.

It is recommended to expand the SELECT expression list to specify only the expected columns and solve this issue.

Code Example

Input Code:
BigQuery
SELECT * FROM UNNEST ([10,20,30]);

SELECT * FROM UNNEST ([10,20,30]) AS numbers, UNNEST(['Hi', 'Hello', 'Bye']) AS words;
Generated Code:
Snowflake
SELECT
* EXCLUDE(SEQ, KEY, PATH, THIS, INDEX)
FROM
TABLE(FLATTEN(INPUT => [10,20,30])) AS F0_ (
SEQ,
KEY,
PATH,
INDEX,
F0_,
THIS
);

SELECT
!!!RESOLVE EWI!!! /*** SSC-EWI-BQ0016 - SELECT * WITH MULTIPLE UNNEST OPERATORS WILL RESULT IN COLUMN AMBIGUITY IN SNOWFLAKE ***/!!!
 * FROM
TABLE(FLATTEN(INPUT => [10,20,30])) AS numbers (
SEQ,
KEY,
PATH,
INDEX,
numbers,
THIS
),
TABLE(FLATTEN(INPUT => ['Hi', 'Hello', 'Bye'])) AS words (
SEQ,
KEY,
PATH,
INDEX,
words,
THIS
);
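
A sketch of the recommended fix for the second query above: expand the SELECT list so only the flattened value columns are returned, which removes the ambiguity (column aliases follow the generated code):

```sql
-- Sketch: select only the value columns to avoid the ambiguity.
SELECT
    numbers.numbers,
    words.words
FROM
    TABLE(FLATTEN(INPUT => [10,20,30])) AS numbers (SEQ, KEY, PATH, INDEX, numbers, THIS),
    TABLE(FLATTEN(INPUT => ['Hi', 'Hello', 'Bye'])) AS words (SEQ, KEY, PATH, INDEX, words, THIS);
```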

SSC-EWI-BQ0017

Pending SnowConvert AI translation for UNNEST of an array of structs

Severity

Medium

Description

When unnesting an array of structs, BigQuery generates a column for each struct field and splits the struct values into their corresponding columns. SnowConvert AI does not yet support this transformation. Whenever SnowConvert AI detects that the UNNEST operator is applied over an array of structs, this EWI is generated to flag the need for manual conversion.

Code Example

Input Code:
BigQuery
CREATE TABLE test.myTestTable
(
  column1 ARRAY<STRUCT<x INT64, y STRING, z STRUCT<a INT64, b INT64>>>
);

SELECT structValues FROM test.myTestTable AS someTable, UNNEST(someTable.column1) AS structValues;
Generated Code:
Snowflake
CREATE TABLE test.myTestTable
(
  column1 ARRAY DEFAULT []
);

SELECT structValues FROM
  test.myTestTable AS someTable,
  !!!RESOLVE EWI!!! /*** SSC-EWI-BQ0017 - PENDING SNOWCONVERT AI TRANSLATION FOR UNNEST OF AN ARRAY OF STRUCTS ***/!!! UNNEST(someTable.column1) AS structValues;

Recommendations

  1. Use FLATTEN with LATERAL: Manually flatten the array column using Snowflake’s FLATTEN function, then extract individual struct fields using dot notation or GET on the VALUE column.

  2. Example workaround:

    SELECT f.VALUE:x::INT AS x, f.VALUE:y::STRING AS y
    FROM test.myTestTable AS t, LATERAL FLATTEN(INPUT => t.column1) AS f;
    