Issues and Troubleshooting¶
SSC-EWI-TD0001¶
Recursive forward alias error.
Note
Some parts in the output code are omitted for clarity reasons.
Severity¶
Low
Description¶
This EWI is shown whenever SnowConvert AI detects recursion within aliased expressions, which makes it unable to execute the Forward Alias transformation required for aliases to work correctly in the Snowflake environment.
A recursive alias happens when an aliased expression contains another alias, and the second aliased expression contains the first alias. This may not be as trivial as the example shows, since the recursion can happen further down the line in a transitive way.
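As a minimal illustrative sketch (table and column names are hypothetical, not from the original examples), a recursive forward alias could look like this:

```sql
-- Hypothetical Teradata query: alias A references B, and B references A,
-- so neither alias can be expanded without infinite recursion.
SELECT
    col1 + b AS a,
    col2 + a AS b
FROM my_table;
```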
Example Code¶
Note
Recursive aliases are not supported in Snowflake; however, some simple instances are. Check the examples below.
The following example code works in Snowflake after migration:
Teradata:¶
Snowflake Scripting:¶
However, the following example code does not work:
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- Review your code and make sure recursive forward aliases are not present. The EWI shows the name of the first instance of an alias that has recursive references, but that does not mean that is the only one that has them in your code.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0002¶
Interval type not supported.
Warning
This EWI is deprecated since SnowConvert AI 28.1.100 release
Severity¶
High
Description¶
This EWI is added, together with a stub function, when a column selector in a SQL statement has the INTERVAL type. This type is not supported in Snowflake, so pending work remains after SnowConvert AI finishes.
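A hedged sketch of the scenario (table and column names are hypothetical): a column of type INTERVAL in Teradata has no direct Snowflake equivalent.

```sql
-- Hypothetical Teradata table with an INTERVAL column;
-- Snowflake has no INTERVAL data type, so manual rework is needed.
CREATE TABLE event_log
(
    event_id INTEGER,
    duration INTERVAL HOUR(2) TO MINUTE
);
```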
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0003¶
Collation not supported in trim functions, add original collation to function result to preserve it.
Severity¶
Low
Description¶
In Snowflake, trim functions (LTRIM, RTRIM, or TRIM) do not support collation unless the characters to trim are empty or white space characters.
If SnowConvert AI detects an LTRIM, RTRIM, or TRIM (LEADING, TRAILING, or BOTH) call in the scenario mentioned above, the COLLATE function is automatically generated to create a copy of the input column without collation. This EWI points out that the column collation was removed before the trim function, so the result of the function will not have collation, which may change the results of further comparisons that use it.
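The recommended fix can be sketched as follows (column name, trim character, and collation value are hypothetical): re-apply the original collation to the trim result with COLLATE.

```sql
-- Hypothetical Snowflake query: restore the original column collation
-- after trimming so later comparisons keep the same collation semantics.
SELECT COLLATE(TRIM(col1, 'x'), 'en-ci')
FROM my_table;
```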
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- To avoid functional differences during comparisons, please add the original collation of the column to the TRIM function result string. This can be achieved using the COLLATE function and specifying the original column collation as the second argument; this argument has to be a literal string with the collation value.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0004¶
Not supported SQL Exception on continue handler.
Severity¶
Low
Description¶
Snowflake procedures have no equivalent transformation for the Teradata CONTINUE HANDLER. For some supported exception codes, SnowConvert AI applies a treatment to emulate this behavior. This EWI is added to CONTINUE HANDLER statements whose exception code is not supported.
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- Check the possible statements that can throw the exception code and encapsulate them in a similar code block as seen in Continue Handler Translation Reference.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0005¶
The statement was converted but its functionality is not implemented yet.
Severity¶
Critical
Description¶
The statement was recognized and converted, but the converted code will not have the expected functionality because the implementation is not done yet.
The warning is added so the user is aware that a script using this statement will not be functionally equivalent.
Example source¶
BTEQ Input code:¶
Python Output code:¶
Best Practices¶
- For more information please refer to translation spec of BTEQ to Python.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0006¶
Invalid default value.
Severity¶
Low
Description¶
The DEFAULT TIME / DEFAULT DATE / DEFAULT CURRENT_DATE / DEFAULT CURRENT_TIME / DEFAULT CURRENT_TIMESTAMP column specifications are not supported for the FLOAT data type.
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- No additional user actions are required.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0007¶
GROUP BY clause unsupported in Teradata Mode for string comparison
Severity¶
Low
Description¶
This error message indicates a possible issue when migrating Teradata SQL queries to Snowflake, particularly related to differences in how the GROUP BY clause handles string comparison sensitivity in Teradata mode.
In Teradata mode, string comparisons in GROUP BY clauses are case-insensitive by default (NOT CASESPECIFIC), whereas Snowflake is case-sensitive unless columns are explicitly defined with a case-insensitive COLLATE clause. This difference can cause queries that rely on case-insensitive grouping in Teradata to produce different results in Snowflake.
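One hedged way to emulate Teradata's case-insensitive grouping (table and column names are hypothetical) is to group on a case-insensitive collation:

```sql
-- Hypothetical Snowflake query: force case-insensitive grouping
-- to match Teradata's NOT CASESPECIFIC default.
SELECT COLLATE(first_name, 'en-ci') AS name, COUNT(*)
FROM employees
GROUP BY COLLATE(first_name, 'en-ci');
```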
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Expected Behavior Differences¶
| Platform | Grouping Behavior | Example Result Rows |
|---|---|---|
| Teradata Mode | Groups ‘John’, ‘JOHN’, and ‘john’ together | John (or JOHN/john), 3 |
| Snowflake | Treats ‘John’, ‘JOHN’, and ‘john’ as separate | John, 1; JOHN, 1; john, 1 |
Best Practices¶
- Review GROUP BY clauses involving string columns when migrating from Teradata mode to ensure expected grouping behavior.
Note: When using expressions like RTRIM(UPPER(first_name)) or RTRIM(first_name) in the GROUP BY clause to achieve case-insensitive or trimmed grouping, you must apply the same expression consistently in all parts of the query where the column is referenced. For example:
This ensures that filtering, selection, and grouping all use the same logic, avoiding mismatches or unexpected results.
- Define columns with COLLATE during table creation if consistent case-insensitive behavior is required:
- Enable the --UseCollateForCaseSpecification CLI flag or Conversion Setting to use COLLATE for case specification during conversion. This option ensures that case specification (such as CASESPECIFIC or NOT CASESPECIFIC) is handled using COLLATE functions instead of UPPER functions. For details, refer to the CLI documentation or conversion settings.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0008¶
Function for comparing strings is not supported
Severity¶
Low
Description¶
Currently, there is no equivalence for some string-comparing functions in Snowflake.
This EWI is added whenever the comparison type is jaro, n_gram, LD, LDWS, OSA, DL, hamming, LCS, jaccard, cosine, or soundexcode.
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0009¶
TEMPORAL column not supported.
Severity¶
Low
Description¶
Teradata provides temporal table support at the column level using derived period columns. These columns are not supported in Snowflake.
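A hedged sketch of the unsupported feature (table and column names are hypothetical): a Teradata derived PERIOD column built from two DateTime columns.

```sql
-- Hypothetical Teradata table using a derived period column;
-- Snowflake has no PERIOD support, so this clause is not converted.
CREATE TABLE policy
(
    policy_id INTEGER,
    validity_start DATE NOT NULL,
    validity_end   DATE NOT NULL,
    PERIOD FOR validity (validity_start, validity_end)
);
```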
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0010¶
UPPERCASE not supported by Snowflake.
Severity¶
Low
Description¶
The UPPERCASE column attribute is not supported in Snowflake.
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- Since the UPPERCASE clause indicates that characters typed as ‘aaa’ are stored as ‘AAA’, a possible workaround is to add the UPPER function to all the insert references. However, external data loading by ETL processes would also have to be modified.
- If you need more support, you can email us at snowconvert-support@snowflake.com
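The workaround above can be sketched like this (table and column names are hypothetical):

```sql
-- Hypothetical: emulate the Teradata UPPERCASE column attribute
-- by upper-casing values at insert time in Snowflake.
INSERT INTO customers (customer_code)
VALUES (UPPER('aaa'));  -- stored as 'AAA'
```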
SSC-EWI-TD0012¶
Binary does not support default.
Severity¶
Low
Description¶
This EWI is shown when SnowConvert AI finds a data type BINARY along with a DEFAULT value specification. Since default values are not allowed in BINARY columns, it is removed.
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0017¶
Global temporary table trace functionality not supported.
Severity¶
Low
Description¶
This EWI is shown when SnowConvert AI finds a Create Table with the GLOBAL TEMPORARY TRACE option. Review the following Teradata documentation about the TRACE functionality (https://docs.teradata.com/r/Enterprise_IntelliFlex_VMware/SQL-Data-Definition-Language-Syntax-and-Examples/Table-Statements/CREATE-GLOBAL-TEMPORARY-TRACE-TABLE). Since it is not supported in Snowflake, it is removed.
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- Note: It might be possible to replicate some tracing functionality in Snowflake by using an EVENT TABLE. Review the Snowflake documentation about Logging and Tracing.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0020¶
Regexp_Substr Function only supports POSIX regular expressions.
Note
This EWI is deprecated, please refer to SSC-EWI-0009 documentation
Severity¶
Low
Description¶
Currently, there is no support in Snowflake for extended regular expression beyond the POSIX Basic Regular Expression syntax.
This EWI is added every time a function call to REGEX_SUBSTR, REGEX_REPLACE, or REGEX_INSTR is transformed to Snowflake to warn the user about possible unsupported regular expressions. Some of the features not supported are lookahead, lookbehind, and non-capturing groups.
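A hedged example of a pattern that would need manual review (column and pattern are hypothetical): a lookahead, which Snowflake's POSIX-based engine does not support.

```sql
-- Hypothetical: '(?=...)' is a lookahead, unsupported by Snowflake's
-- POSIX regular expression engine; the pattern must be rewritten.
SELECT REGEXP_SUBSTR(col1, 'foo(?=bar)')
FROM my_table;
```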
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- Check the regular expression used in each case to determine whether it needs manual intervention. More information about expanded regex support and alternatives in Snowflake can be found here (https://community.snowflake.com/s/question/0D50Z00007ENLKsSAP/expanded-support-for-regular-expressions-regex).
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0023¶
ACTIVITY_COUNT inside SELECT/SET INTO VARIABLE requires manual fix
Severity¶
Low
Description¶
The ACTIVITY_COUNT status variable returns the number of rows affected by an SQL DML statement in an embedded SQL or stored procedure application. For more information, see the Teradata ACTIVITY_COUNT documentation (https://docs.teradata.com/r/Enterprise_IntelliFlex_VMware/SQL-Stored-Procedures-and-Embedded-SQL/Result-Code-Variables/ACTIVITY_COUNT).
As explained in its translation specification, there is a workaround to emulate ACTIVITY_COUNT’s behavior through:
When using ACTIVITY_COUNT in a SELECT/SET INTO VARIABLE statement, it cannot simply be replaced by the workaround mentioned above.
Example Code¶
Teradata¶
Snowflake¶
Manual Fix¶
Part of the workaround presented above can be used to still get the number of rows inserted/updated/deleted like this:
Instead of using the complete query, it needs to be adapted manually to Snowflake’s SELECT INTO VARIABLE syntax.
Furthermore, if RESULT_SCAN(LAST_QUERY_ID()) is giving incorrect results, check SSC-FDM-TD0033(../functional-difference/teradataFDM.md#ssc-fdm-td0033) for how to handle possible limitations of using LAST_QUERY_ID.
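A hedged sketch of the adapted workaround (the variable name is hypothetical), reading the affected-row count from the result of the previous DML statement:

```sql
-- Hypothetical Snowflake Scripting fragment: capture the row count of
-- the previous INSERT into a variable via RESULT_SCAN.
SELECT "number of rows inserted"
  INTO :row_count
  FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));
```

Note that the result column name depends on the preceding DML statement (e.g., updates and deletes report different column names).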
Best Practices¶
- Manually adapt the proposed workaround.
- Check SSC-FDM-TD0033 (../functional-difference/teradataFDM.md#ssc-fdm-td0033) for how to handle possible limitations of using LAST_QUERY_ID.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0024¶
Abort statement is not supported due to an aggregate function.
Severity¶
Low
Description¶
This EWI appears when an AGGREGATE function is part of an ABORT statement inside of a stored procedure. The statement is commented out.
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0025¶
Output format not supported.
Severity¶
Low
Description¶
This EWI appears when a CAST function specifies an output format not supported by Snowflake scripting.
Note
When the format contains only recognized datetime elements and the operand type is a known datetime type, SSC-FDM-TD0046 is emitted instead. SSC-EWI-TD0025 is reserved for formats that contain unsupported elements or where the operand type cannot be resolved.
Code Example¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- Check if the output code has functional equivalence with the original code.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0027¶
Snowflake does not support Teradata built-in time dimensions column options
Severity¶
Low
Description¶
The EWI is generated because Snowflake does not support the Teradata built-in time dimensions attributes like VALIDTIME or TRANSACTIONTIME.
Example Code¶
Teradata input:¶
Snowflake output:¶
Best Practices¶
- Manually create TIMESTAMP columns with default values such as CURRENT_TIMESTAMP.
- Leverage the use of table streams, they can record data manipulation changes made to tables as well as metadata about each change. (Guide)
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0029¶
Queue table functionality is not supported.
Severity¶
Low
Description¶
This warning appears when a TABLE with the QUEUE (https://www.docs.teradata.com/r/rgAb27O_xRmMVc_aQq2VGw/tHvboDYXkHchWgJ2CD6Uig) attribute is migrated. The QUEUE keyword is removed because it is not supported in Snowflake.
Example Code¶
Input:¶
Output:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0031¶
The result may differ due to char type having a fixed length in Teradata
Severity¶
Low
Description¶
Since Teradata CHAR data type has a fixed length, some functions will try to match against the complete column value (including trailing padding spaces) instead of just the inserted value. In Snowflake, the CHAR type is variable-length, so comparisons match against the inserted values without padding.
Note
For LIKE expressions, SnowConvert AI automatically wraps CHAR columns with RTRIM() to strip trailing spaces.
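The automatic rewrite for LIKE can be sketched as follows (table, column, and pattern are hypothetical):

```sql
-- Hypothetical Snowflake output: trailing padding spaces are stripped
-- before matching, emulating Teradata fixed-length CHAR semantics.
SELECT *
FROM my_table
WHERE RTRIM(char_col) LIKE 'abc%';
```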
Example Code¶
Input:¶
Output:¶
Best Practices¶
- For LIKE expressions, no manual action is required; SnowConvert AI automatically applies RTRIM() to CHAR columns to preserve the original Teradata matching behavior.
- For REGEXP_SIMILAR and other functions where this EWI appears, review the converted code to ensure CHAR padding does not affect the result. Consider wrapping the CHAR column with RTRIM() manually if needed.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0034¶
Multistatement SQL is not supported.
Note
Some parts in the output code are omitted for clarity reasons.
Severity¶
Low
Description¶
Multistatement SQL execution is not supported. The request was handled as a transaction.
Note
The following EWI is only generated when the PL Target Language flag is set to Javascript, like this: '--PLTargetLanguage Javascript'
Example Code¶
Input:¶
Output:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0039¶
Input format not supported.
Severity¶
Medium
Description¶
The specified input format is not supported in Snowflake.
Example Code¶
Input:¶
Output:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0040¶
The FORMAT clause on a column definition cannot be automatically converted to Snowflake.
Severity¶
Low
Description¶
SnowConvert AI found a FORMAT clause on a column definition that it cannot translate to Snowflake. The FORMAT clause is preserved in the output and marked with this EWI so you can review it manually.
This issue is raised in two situations:
- Datetime columns with unsupported format elements: The format string contains elements that have no Snowflake equivalent (e.g., 'EEEE' for day-of-week names). Because the format cannot be translated, no conversion functions are added to DML statements that reference this column.
- Columns where the type could not be determined: If SnowConvert AI cannot resolve the column type, it falls back to this EWI as a safety measure.
When the FORMAT can be fully translated, SnowConvert AI uses SSC-FDM-TD0040 instead and adds conversion functions automatically. For character-type display-only formats like X(n), see SSC-FDM-TD0041.
Example Code¶
Input:¶
Output:¶
Notice that the string literal '03-30-2026' in the SELECT statement is left unchanged because the format could not be translated.
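The manual fix suggested in the best practices can be sketched as follows (the format is hypothetical; the literal is taken from the example above):

```sql
-- Hypothetical manual fix: add an explicit conversion with a
-- Snowflake-supported format instead of relying on the FORMAT clause.
SELECT TO_DATE('03-30-2026', 'MM-DD-YYYY');
```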
How FORMAT issues are classified¶
| Column Type | Format Pattern | Issue | DML Effect |
|---|---|---|---|
DATE, TIMESTAMP, TIME | Snowflake standard (e.g., 'YYYY-MM-DD', 'HH:MI:SS') | None (silently removed) | No conversion needed |
DATE, TIMESTAMP, TIME | Translatable non-standard (e.g., 'MM-DD-YYYY') | SSC-FDM-TD0040 | Conversion functions added automatically |
DATE, TIMESTAMP, TIME | Not translatable (e.g., 'EEEE') | SSC-EWI-TD0040 | No conversion added; manual fix needed |
VARCHAR, CHAR, CLOB, STRING | Display-only X(n) | SSC-FDM-TD0041 | No conversion needed |
| Any other | Any | SSC-EWI-TD0040 | No conversion added; manual fix needed |
Best Practices¶
- Review the format string and check whether it can be rewritten using Snowflake-supported format elements. If so, add the appropriate TO_DATE, TO_TIMESTAMP, or TO_TIME call yourself.
- If the format was used only for display purposes and does not affect how data is stored or queried, it can be safely removed.
- After conversion, verify that the converted code behaves correctly for any columns where this EWI appears.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0041¶
Trunc function was added to ensure integer.
Severity¶
Low
Description¶
When migrating Teradata to Snowflake, you may encounter differences in how numeric conversions are handled. In Teradata, casting a value to INTEGER will implicitly truncate any decimal part, even if the original value is a floating-point number or a string representation of a number. However, in Snowflake, casting a non-integer numeric or a string directly to INTEGER can result in errors or unexpected results if the value is not already an integer.
To ensure compatibility, the TRUNC() function is applied before casting to INTEGER. This strips any decimal portion, allowing safe conversion to an integer. However, if the source value is not numeric or is a non-numeric string, errors may still occur and manual intervention may be required. For example, if SnowConvert AI cannot determine the column type due to missing references, you may need to manually adjust the conversion.
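The conversion described above can be sketched as follows (the value is hypothetical):

```sql
-- Hypothetical: TRUNC strips the decimal part so the cast to INTEGER
-- succeeds, mirroring Teradata's implicit truncation behavior.
SELECT CAST(TRUNC(12.78) AS INTEGER);  -- 12, as in Teradata
```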
Example Code¶
Input:¶
Output:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0046¶
Built-in reference is not supported in Snowflake.
Severity¶
Medium
Description¶
This error appears when there is a reference to a DBC (https://docs.teradata.com/r/Teradata-Archive/Recovery-Utility-Reference/March-2019/Archive/Recovery-Operations/Database-DBC) table and the selected column has no equivalence in Snowflake.
Example Code¶
Input:¶
Output:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0049¶
TPT-Statement not processed.
Severity¶
High
Description¶
A DML statement in TPT could not be processed and converted by the tool. This can happen for reasons like using concatenation with script variables or using escaped quotes inside the DML statement.
Example code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- For this issue, you can write the insert statement manually, and/or, since the DML statement is not supported yet, ask the SnowConvert AI team to add support for that specific case.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0051¶
Teradata BYTES function results differ from Snowflake LENGTH function for byte columns
Severity¶
Low
Description¶
Since the Teradata byte data type has a fixed length, the BYTES function will always count the trailing zeros (https://docs.teradata.com/r/1DcoER_KpnGTfgPinRAFUw/f7V55vW7OB1nU2WltjLxig) inserted to fit smaller byte values into the column, returning the size of the column instead of the size of the value originally inserted. However, the Snowflake binary type has a variable size, meaning that the LENGTH function will always return the size of the inserted values. Take the following code as an example:
Teradata:
Equivalent code in Snowflake:
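Since the original snippets are omitted here, a hedged sketch of the difference (table, column, and sizes are hypothetical):

```sql
-- Hypothetical: in Teradata, BYTES on a fixed-length BYTE(10) column
-- returns 10 (padding included); in Snowflake, BINARY is variable
-- length, so LENGTH returns the size of the value actually inserted.
SELECT LENGTH(byte_col) FROM my_table;
```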
Example code:¶
Input code:¶
Generated Code:¶
Best Practices¶
- Analyze how the BYTES function results are used; the Snowflake LENGTH function behavior may be the one desired from the start, in which case no changes are required.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0052¶
Snowflake implicit conversion to numeric differs from Teradata and may fail for non-literal strings
Severity¶
Low
Description¶
Both Teradata and Snowflake allow passing string values to functions that expect numeric parameters; these strings are then parsed and converted to their numeric equivalent.
However, there are differences in what the two languages consider a valid numeric string. Teradata is more permissive and successfully parses cases like empty or whitespace-only strings, embedded dashes, mantissas or exponents with no digits, currency signs, digit separators, or a sign specified after the digits. For example, the following strings are valid:
- '1-2-3-4-5' -> 12345
- '$50' -> 50
- '5000-' -> -5000
- '1,569,284.55' -> 1569284.55
Snowflake applies automatic optimistic string conversion, expecting the strings to match either the TM9 or TME formats, so the conversion fails for most of the cases mentioned. To solve these differences, SnowConvert AI processes string literals passed to functions that perform an implicit conversion to numeric and generates equivalent strings that match TM9 or TME so Snowflake can parse them. This only applies to literal string values; non-literal values are not guaranteed to be parsed by Snowflake.
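The literal rewriting can be sketched like this (using a value from the list above):

```sql
-- Hypothetical: the Teradata-style literal '$50' is rewritten into a
-- plain numeric string that Snowflake can parse implicitly.
SELECT TO_NUMBER('50');  -- instead of the original '$50'
```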
Example code¶
Input code:¶
Generated Code:¶
Best Practices¶
- No additional user actions are required.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0053¶
Snowflake does not support the period datatype, all periods are handled as varchar instead
Note
Some parts in the output code are omitted for clarity reasons.
Note
This EWI is deprecated, please refer to SSC-FDM-TD0036 documentation
Precision of generated varchar representations¶
PERIOD_UDF generates the varchar representation of a period using the default formats for timestamps and time specified in Snowflake. This means timestamps will have three precision digits and time values will have zero, so the results may have a higher or lower precision than expected. There are two options to modify how many precision digits are included in the resulting string:
- Use the three-parameter version of PERIOD_UDF: This overload of the function takes the PRECISIONDIGITS parameter, an integer between 0 and 9 that controls how many digits of the fractional time part will be included in the result. Note that even if Snowflake supports up to nine digits of precision, the maximum in Teradata is six. Example:
| Call | Result |
|---|---|
PUBLIC.PERIOD_UDF(time '13:30:45.870556', time '15:35:20.344891', 0) | '13:30:45*15:35:20' |
PUBLIC.PERIOD_UDF(time '13:30:45.870556', time '15:35:20.344891', 2) | '13:30:45.87*15:35:20.34' |
PUBLIC.PERIOD_UDF(time '13:30:45.870556', time '15:35:20.344891', 5) | '13:30:45.87055*15:35:20.34489' |
- Alter the session parameters TIMESTAMP_NTZ_OUTPUT_FORMAT and TIME_OUTPUT_FORMAT: The commands ALTER SESSION SET TIMESTAMP_NTZ_OUTPUT_FORMAT = <format> and ALTER SESSION SET TIME_OUTPUT_FORMAT = <format> can be used to modify the formats Snowflake uses by default for the current session. Modifying them to include the desired number of precision digits changes the result of future executions of PERIOD_UDF for the current session.
Example code¶
Input code:¶
Generated Code:¶
Best Practices¶
- Since the behavior of PERIOD and its related functions is emulated using varchar, we recommend reviewing the results obtained to ensure their correctness.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0055¶
Snowflake supported formats for TO_CHAR differ from Teradata and may fail or have different behavior
Note
This EWI is deprecated, please refer to SSC-FDM-TD0029 documentation
Format elements that depend on session parameters¶
Some Teradata format elements are mapped to Snowflake functions that depend on the value of session parameters. To avoid functional differences in the results you should set these session parameters to the same values they have in Teradata. Identified format elements that are mapped to this kind of functions are:
- D: Mapped to the DAYOFWEEK function. The results of this function depend on the WEEK_START session parameter; by default, Teradata considers Sunday the first day of the week, while in Snowflake it is Monday.
- WW: Mapped to the WEEK function. This function depends on the WEEK_OF_YEAR_POLICY session parameter, which by default is set to use the ISO standard (the first week of the year is the first to contain at least four days of January), but in Teradata January 1st marks the start of the first week.
To modify session parameters, use ALTER SESSION SET parameter_name = value. For more information, see the Snowflake session parameters reference.
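For example, the Teradata defaults described above could be approximated with the following settings (verify the exact values against your Teradata configuration before applying them):

```sql
-- Hypothetical settings: WEEK_START = 7 makes Sunday the first day of
-- the week; WEEK_OF_YEAR_POLICY = 1 makes the week containing
-- January 1st the first week of the year.
ALTER SESSION SET WEEK_START = 7;
ALTER SESSION SET WEEK_OF_YEAR_POLICY = 1;
```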
Single parameter version of TO_CHAR¶
The single parameter version of TO_CHAR(Datetime) makes use of the default formats specified in the session parameters TIMESTAMP_LTZ_OUTPUT_FORMAT, TIMESTAMP_NTZ_OUTPUT_FORMAT, TIMESTAMP_TZ_OUTPUT_FORMAT and TIME_OUTPUT_FORMAT. To avoid differences in behavior please set them to the same values used in Teradata.
For TO_CHAR(Numeric) Snowflake generates the varchar representation using either the TM9 or TME formats to get a compact representation of the number, Teradata also generates compact representations of the numbers so no action is required.
Example Code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- When using FF, either use DateTime types with the same precision that you use in Teradata or add a precision to the format element to avoid the different behavior.
- When using timezone-related format elements, use a first parameter of type TIMESTAMP_TZ to avoid different behavior. Also remember that the TIME type cannot have time zone information in Snowflake.
- Set the necessary session parameters to the default values from Teradata to avoid different behavior.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0057¶
Binary data in NEW JSON is not supported
Severity¶
Low
Description¶
The NEW JSON function accepts JSON data represented as a string or in binary format. When the data is in its binary representation, the function is not transformed, since this binary format is not valid in Snowflake: Snowflake cannot interpret the metadata about the JSON object. For more information, please see the Teradata NEW JSON documentation (https://docs.teradata.com/r/C8cVEJ54PO4~YXWXeXGvsA/QpXrJfufgZ4uyeXFz7Rtcg).
Example Code¶
Input Code¶
Generated Code¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0059¶
Snowflake user default time zone may require configuration to match Teradata value
Severity¶
Low
Description¶
As in Teradata, setting a default time zone for a user makes sessions start with that time zone until a new value is defined for the session.
This warning is generated as a reminder that the same time zone defined for the user in Teradata should be set for the Snowflake user. To do this, please use the following query in Snowflake: ALTER SESSION SET TIMEZONE = 'equivalent_timezone'. Remember that Snowflake only accepts IANA Time Zone Database (https://www.iana.org/time-zones) standard time zones.
Example Code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- Remember to set the default time zone of the user to a time zone equivalent to the one set for the Teradata user.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0060¶
JSON_TABLE not transformed, column names could not be retrieved from semantic information
Note
Some parts in the output code are omitted for clarity reasons.
Severity¶
Low
Description¶
The JSON_TABLE function can be transformed by SnowConvert AI, however, this transformation requires knowing the name of the columns that are being selected in the JSON_TABLE ON subquery.
This message is generated to warn the user that the column names were not explicitly specified in the subquery (for example, a SELECT * was used) and the semantic information of the referenced tables was not found, meaning the column names could not be extracted.
If you want to know how to load JSON data into a table, check this page
Example code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- Please check that the code provided to SnowConvert AI is complete; if you did not provide the table definition, please re-execute the conversion with the table definition present.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0061¶
TD_UNPIVOT transformation requires column information that could not be found, columns missing in result
Severity¶
Low
Description¶
SnowConvert AI supports and transforms the TD_UNPIVOT (https://docs.teradata.com/r/Enterprise_IntelliFlex_VMware/SQL-Operators-and-User-Defined-Functions/Table-Operators/TD_UNPIVOT) function, which can be used to represent columns from a table as rows.
However, this transformation requires information about the columns of the table or tables involved, specifically their names. When this information is not present, the transformation may be left in an incomplete state where columns are missing from the result; this EWI is generated in those cases.
Example code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- There are two ways of supplying the information about columns to the conversion tool: put the table specification in the same file as the TD_UNPIVOT call or specify a column list in the SELECT query of the ON expression instead of SELECT * or the table name.
- This issue can be safely ignored if ALL the columns from the input table/tables are unpivoted, otherwise, the result will have missing columns.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0063¶
JSON path was not recognized
Note
Some parts in the output code are omitted for clarity reasons.
Severity¶
Medium
Description¶
This message is shown when SnowConvert AI cannot deserialize a JSON path because the string does not have the expected JSON format.
Example code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- Check whether the JSON path has an unexpected character or does not have the right format.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0066¶
The following identifier has one or more Unicode escape characters that are invalid in snowflake
Note
Some parts in the output code are omitted for clarity reasons.
Severity¶
Low
Description¶
This message is shown when SnowConvert AI transforms a Teradata Unicode Delimited Identifier (https://docs.teradata.com/r/Teradata-Database-SQL-Fundamentals/June-2017/Basic-SQL-Syntax/Working-with-Unicode-Delimited-Identifiers) with invalid characters in Snowflake.
Example code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- Use identifiers with valid Unicode characters in Snowflake.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0068¶
Snowflake does not support profiles, referencing role instead
Severity¶
Medium
Description¶
Teradata profiles allow defining multiple common parameters related to storage space and password constraint management.
However, Snowflake works with cloud architecture and automatically manages and optimizes storage, meaning no storage customization is done on the user side. Also, Snowflake currently has a password policy defined that applies to all user passwords and is not modifiable.
This error is generated when a reference to a Teradata profile is found. The reference is changed to the user’s role, which is the nearest approximation to a profile in Snowflake, although query results may differ unless the profile and role names of a user are the same.
Example code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- Avoid referencing user profiles; they are not supported, and query results will differ unless the user has the same name for both its profile and role.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0069¶
ST_DISTANCE results are slightly different from ST_SPHERICALDISTANCE
Note
This EWI is deprecated, please refer to SSC-FDM-TD0031 documentation
Severity¶
Low
Description¶
The Teradata function ST_SPHERICALDISTANCE calculates the distance between two spherical coordinates on the planet using the Haversine formula. In contrast, the Snowflake ST_DISTANCE function does not use the Haversine formula to calculate the minimum distance between two geographical points.
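For reference, the Haversine computation that ST_SPHERICALDISTANCE is documented to use can be sketched as follows. The Earth radius constant here is an assumption (implementations may use a slightly different value), which is one source of the small discrepancies against Snowflake's ST_DISTANCE:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in meters (assumed constant)

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

# One degree of longitude at the equator is roughly 111.2 km on this sphere.
print(round(haversine_m(0, 0, 0, 1)))
```

Because ST_DISTANCE uses a different geodesic model, small relative differences from this result are expected; exact equality should not be assumed.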
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0070¶
A return statement was added at the end of the label section to ensure the same execution flow
Note
This EWI is deprecated, please refer to SSC-FDM-TD0030 documentation
Severity¶
Medium
Description¶
When a Goto statement is replaced with a Label section and does not contain a return statement, one is added at the end of the section to ensure the same execution flow.
In BTEQ, after a Goto command is executed, the statements between the Goto command and the Label command with the same name are ignored. So, to avoid executing those statements, the label section should contain a return statement.
In addition, it is worth mentioning that the Goto command skips all other statements until the Label with the same name, which is where execution resumes. Therefore, execution will never resume at a label section defined before the Goto command.
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0076¶
The use of foreign tables is not supported in Snowflake.
Severity¶
Medium
Description¶
Foreign tables (https://docs.teradata.com/r/Teradata-VantageTM-SQL-Data-Definition-Language-Syntax-and-Examples/September-2020/Table-Statements/CREATE-FOREIGN-TABLE) enable access to data in external object storage, such as semi-structured and unstructured data in Amazon S3, Azure Blob storage, and Google Cloud Storage. This syntax is not supported in Snowflake. However, there are other alternatives in Snowflake that can be used instead, such as external tables, iceberg tables, and standard tables.
Example code¶
Input code:¶
Generated Code:¶
Best Practices¶
- Instead of foreign tables in Teradata, you can use Snowflake external tables. External tables reference data files located in a cloud storage (Amazon S3, Google Cloud Storage, or Microsoft Azure) data lake. This enables querying data stored in files in a data lake as if it were inside a database. External tables can access data stored in any format supported by COPY INTO <table> statements.
- Another alternative is Snowflake’s Iceberg tables, which use open formats and customer-supplied cloud storage. This data is stored in Parquet files.
- Finally, there are the standard Snowflake tables, which can be an option to cover the functionality of foreign tables in Teradata.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0077¶
RESET WHEN clause is not supported in this scenario due to its condition
Note
Some parts in the output code are omitted for clarity reasons.
Severity¶
Medium
Description¶
SnowConvert AI currently only supports RESET WHEN clauses with binary conditions (<=, >= or =). For any other type of condition, such as IS NOT NULL, the RESET WHEN clause will be removed and an error message added, since it is not supported in Snowflake.
This error message also appears when the RESET WHEN condition references an expression whose definition was not found by the migration tool. Currently, the tool only supports alias references to a column defined in the same query.
Example Code¶
Condition is not binary¶
Input Code:¶
Generated Code¶
Condition expression was not found¶
Input Code:¶
Generated Code¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0079¶
The required period type column was not found
Note
Some parts in the output code are omitted for clarity reasons.
Severity¶
Low
Description¶
This warning is shown because the Period column necessary to replicate the functionality of the NORMALIZE clause was not found.
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- To fix this warning manually, find the first period column, remove all references to it except where it is defined, and then replace PeriodColumn with that column.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0082¶
Translate function using the current encoding is not supported
Severity¶
Medium
Description¶
The usage of the Translate function with the current encoding arguments is not supported in Snowflake. The function is commented out during translation.
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0083¶
Not able to transform two or more complex Select clauses at a time
Note
Some parts in the output code are omitted for clarity reasons.
Severity¶
Medium
Description¶
SnowConvert AI is not able to transform two or more complex SELECT clauses at a time, because each must be mapped to a CTE or composite FROM clause, which would cause the mapped code not to compile or to enter a logical cycle.
What do we consider a complex SELECT clause?¶
Those that must be mapped to a CTE or composite FROM clause, such as NORMALIZE, EXPAND ON, or RESET WHEN.
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0087¶
GOTO statement was removed due to if statement inversion.
Note
This EWI is deprecated, please refer to SSC-FDM-TD0026 documentation
Note
Some parts in the output code are omitted for clarity reasons.
Severity¶
Medium
Description¶
It is common to use the GOTO command with IF and LABEL commands to replicate the functionality of a SQL if statement. When used in this way, it is possible to transform them directly into an if, if-else, or even an if-elseif-else statement. However, in these cases, the GOTO commands become unnecessary and should be removed to prevent them from being replaced by a LABEL section.
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0091¶
Expression converted as cast with possible errors due to missing dependencies.
Note
Some parts in the output code are omitted for clarity reasons
Severity¶
Medium
Description¶
In Teradata scripts, you can use the following syntax to CAST expressions:
Unfortunately, this syntax generates ambiguity when trying to convert a CAST to DATE or TIME since these keywords also behave as the CURRENT_DATE and CURRENT_TIME functions respectively.
Thus, without context about the expression to be CAST, there is no sure way to differentiate when we are dealing with an actual case of CAST or a function that accepts DATE or TIME as parameters.
In other words, it is required to know whether <expression> is a column or a user-defined function (UDF). To achieve this, when converting the code, one must include the CREATE TABLE or CREATE FUNCTION on which <expression> depends.
For example, consider the following SELECT statement. With no context about AMBIGUOUS_EXPR, we have no way to determine whether we are dealing with a function call or a CAST to DATE. However, we do know that COL1 (DATE) is indeed a CAST, since COL1 is a column of the table TAB.
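The disambiguation described above can be sketched as follows. The names classify, known_columns, and known_functions are hypothetical, used only to illustrate the decision, not SnowConvert AI's implementation:

```python
def classify(expr: str, known_columns: set, known_functions: set) -> str:
    """Decide how to read `EXPR (DATE)` given the available context."""
    if expr in known_columns:
        return "cast"           # COL1 (DATE)  ->  CAST(COL1 AS DATE)
    if expr in known_functions:
        return "function call"  # F (DATE)     ->  F(CURRENT_DATE)
    return "ambiguous"          # no context: SSC-EWI-TD0091 is emitted

columns = {"COL1"}      # from a CREATE TABLE in the same conversion
functions = {"MY_UDF"}  # from a CREATE FUNCTION in the same conversion

print(classify("COL1", columns, functions))            # cast
print(classify("AMBIGUOUS_EXPR", columns, functions))  # ambiguous
```

This is why adding the relevant CREATE TABLE or CREATE FUNCTION statements to the conversion input resolves the ambiguity.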
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0092¶
Translation for Teradata Built-In Table/View is not currently supported
Severity¶
Low
Description¶
This EWI is added when SnowConvert AI finds a Teradata system table that is currently not translated.
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- Search in Snowflake’s internal tables, such as INFORMATION_SCHEMA or SNOWFLAKE.ACCOUNT_USAGE, for equivalents.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0093¶
Format not supported and must be updated in all its varchar casting uses.
Severity¶
High
Description¶
This EWI is added when the CAST function is used to cast a numeric expression to another numeric type with a specified format. While the format does not impact the numeric value itself, if the result is subsequently cast to a string, the intended format will not be correctly applied. Therefore, it is necessary to update all instances where the result is cast to VARCHAR, ensuring the format defined in the EWI is used.
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0094¶
The IMPORT command was not converted.
Severity¶
High
Description¶
This issue indicates that an .IMPORT (https://docs.teradata.com/r/Enterprise_IntelliFlex_Lake_VMware/Teradata-MultiLoad-Reference-20.00/Teradata-MultiLoad-Commands/IMPORT) command was not converted because it uses unsupported features. The original MLoad layout, DML, and import statements are commented out and each line is annotated with this EWI.
Features pending translation:
- BINARY format
- FASTLOAD format
- .TABLE type layout
- INMOD (https://docs.teradata.com/r/Enterprise_IntelliFlex_Lake_VMware/Teradata-MultiLoad-Reference-20.00/Teradata-MultiLoad-Commands/IMPORT/INMOD-Specification) option
- AXSMOD option
- Non INSERT-VALUES DML statements
Missing required definitions:
- .LAYOUT (https://docs.teradata.com/r/Enterprise_IntelliFlex_Lake_VMware/Teradata-MultiLoad-Reference-20.00/Teradata-MultiLoad-Commands/LAYOUT) definition was not found in the script
- DML LABEL (https://docs.teradata.com/r/Enterprise_IntelliFlex_Lake_VMware/Teradata-MultiLoad-Reference-20.00/Teradata-MultiLoad-Commands/DML-LABEL) was not found in the script
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- Convert the source file to a supported format (VARTEXT, TEXT, or UNFORMAT) before running SnowConvert AI.
- Manually rewrite the load using Snowflake stages and COPY INTO.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0095¶
DML statement in IMPORT command is pending translation.
Severity¶
Medium
Description¶
This issue happens when a .IMPORT command uses a DML label that includes statements other than a basic INSERT ... VALUES (for example, UPDATE, DELETE, or more complex INSERT logic). In these cases, the converter will only transform the simple INSERT ... VALUES part into a COPY INTO statement for Snowflake. Any other DML statements are left in the output with a warning annotation and are not automatically converted. This means that important logic, such as updates or deletes, will not be migrated, which can affect your results. Please review and update your script to handle these cases, for example by using a MERGE statement for upserts.
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- Implement the equivalent upsert logic in Snowflake using MERGE.
- Load data into a staging table first, then merge into the target table.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0096¶
COPY INTO requires an explicit target file name.
Severity¶
Medium
Description¶
When the .IMPORT INFILE path consists solely of a bash variable (for example, ${FILE_PATH}) and no explicit file name can be inferred, this EWI is raised for the COPY INTO source. The converter cannot determine the file name to use in the stage path.
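The inference failure can be sketched as follows: if the INFILE path is nothing but a bash variable, there is no literal file name to place in the stage path. infer_file_name is a hypothetical helper for illustration, not the converter's actual logic:

```python
import posixpath
import re

# A path that is only a bash variable, e.g. $FILE_PATH or ${FILE_PATH}.
VAR_ONLY = re.compile(r"^\$\{?\w+\}?$")

def infer_file_name(infile_path: str):
    """Return the literal file name for the stage path, or None if unknowable."""
    if VAR_ONLY.match(infile_path):
        return None  # e.g. ${FILE_PATH}: this EWI is raised
    return posixpath.basename(infile_path)

print(infer_file_name("${FILE_PATH}"))               # None
print(infer_file_name("${DATA_DIR}/employees.csv"))  # employees.csv
```

This is why splitting the directory (variable) from the file name (literal), as the best practices below suggest, lets the conversion proceed.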
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- Adjust the original MLoad (https://docs.teradata.com/r/Enterprise_IntelliFlex_Lake_VMware/Teradata-MultiLoad-Reference-20.00/Using-Teradata-MultiLoad) script so that the file name is explicit (separate directory and file name).
- Use a literal file name with a variable directory, for example, .IMPORT INFILE ${DATA_DIR}/employees.csv ...
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0097¶
Local variables not supported in PUT or COPY INTO.
Severity¶
Medium
Description¶
This issue indicates the use of local MLoad variables, such as &FILE_NAME, defined with .SET (https://docs.teradata.com/r/Enterprise_IntelliFlex_Lake_VMware/Teradata-MultiLoad-Reference-20.00/Teradata-MultiLoad-Commands/SET) in INFILE paths. These cannot be resolved in the generated PUT or COPY INTO statements because Snowflake’s PUT command only supports literal paths or Snowflake CLI template variables (<%VAR%>), not Snowflake Scripting variables (:var).
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- Replace local variables with bash variables (resolved by Snowflake CLI before execution).
- Alternatively, hard-code the file name directly.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-EWI-TD0098¶
PREPARE with USING clause containing non-variable expressions cannot be automatically migrated.
Severity¶
Medium
Description¶
This issue is raised when a PREPARE statement with an OPEN ... USING clause contains non-variable expressions such as function calls, arithmetic operations, or other complex expressions in the USING clause. SnowConvert AI can only automatically migrate USING clauses that contain simple variable references.
In Teradata, the OPEN cursor USING expr1, expr2 statement allows any expression to be bound to the query’s parameter markers (?). However, SnowConvert AI’s transformation to EXECUTE IMMEDIATE query USING (...) requires simple variable names to ensure correct binding behavior.
When complex expressions are detected in the USING clause, the PREPARE statement is left untransformed and marked with this EWI for manual review.
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- Extract expressions into variables: Before the PREPARE statement, assign complex expressions to intermediate variables:
- Manually transform to EXECUTE IMMEDIATE: Convert the PREPARE-cursor pattern to use EXECUTE IMMEDIATE with simple variable references in the USING clause.
- Test the conversion: Ensure that the binding behavior matches the original Teradata logic.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0001¶
Column converted from Blob data type.
Description¶
This message is shown when SnowConvert AI finds a data type BLOB. Since BLOB is not supported in Snowflake, the type is changed to Binary.
Code Example¶
Input Code:¶
Generated Code:¶
Best Practices¶
- No additional user actions are required.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0002¶
Column converted from Clob data type.
Description¶
This message is shown when SnowConvert AI finds a data type CLOB. Since CLOB is not supported in Snowflake, the type is changed to VARCHAR.
Code Example¶
Input Code:¶
Generated Code:¶
Best Practices¶
- No additional user actions are required.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0003¶
Bash variable found, Snowflake CLI is required to run this script
Description¶
When the source code of a BTEQ script file migrated to Snowflake Scripting contains Bash variable placeholders ($variable or ${variable}), SnowConvert AI transforms them into Snowflake CLI template variables (<%variable%>).
This warning is generated to point out that the execution of the migrated script depends on Snowflake CLI to work. Snowflake CLI performs client-side substitution of all <% %> tokens before sending the SQL to Snowflake, regardless of position (including inside EXECUTE IMMEDIATE $$ blocks). Please consider the following when running the script:
- All variables must be supplied via the -D flag: snow sql -f script.sql -D "VAR=value".
- Multiple variables require separate -D flags: snow sql -f script.sql -D "VAR1=value1" -D "VAR2=value2".
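To illustrate the client-side behavior, the sketch below mimics, in simplified form, the substitution Snowflake CLI performs on <%var%> tokens before the SQL is sent to Snowflake. The real CLI uses its own templating engine; this is only an illustration of the idea:

```python
import re

def render(sql: str, variables: dict) -> str:
    """Replace every <%name%> token with the value supplied for `name`."""
    return re.sub(
        r"<%\s*(\w+)\s*%>",
        lambda m: str(variables[m.group(1)]),
        sql,
    )

sql = "SELECT <%colname%> FROM <%tablename%> WHERE id = <%id%>;"
print(render(sql, {"colname": "col1", "tablename": "my_table", "id": 123}))
# SELECT col1 FROM my_table WHERE id = 123;
```

Because the substitution happens before the text reaches Snowflake, it works anywhere in the script, including inside EXECUTE IMMEDIATE $$ blocks.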
Example Code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- Run the migrated script with Snowflake CLI: snow sql -f script.sql -D "variable=value" -D "colname=col1" -D "tablename=my_table" -D "id=123"
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0004¶
Period types are handled as two data fields
Description¶
Teradata has a period data type used to represent a time interval, with instances of this type having a beginning and ending bound of the same type (time, date or timestamp) along with a set of functions that allow initializing and manipulating period data such as PERIOD, BEGIN, END, and OVERLAPS.
Since the period type is not supported by Snowflake, SnowConvert AI transforms this type and its related functions using the following rules:
- Any period type declaration in table columns is migrated as two columns of the same type.
- The period value constructor function (https://docs.teradata.com/r/Enterprise_IntelliFlex_VMware/SQL-Date-and-Time-Functions-and-Expressions/Period-Functions-and-Operators) is migrated into two different constructors of the period subtype, one with the begin value and the other with the end value.
- Supported functions that expect period type parameters are migrated to UDFs as well; these UDFs expect two parameters, one for the begin value and one for the end value.
Example code¶
Input code:¶
Generated Code:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0005¶
Non-standard time zone offsets are not supported in Snowflake, rounded to nearest valid time zone
Description¶
While Teradata provides the flexibility to define any time zone offset between -12:59 and +14:00 using the SET TIME ZONE query, Snowflake exclusively supports time zones listed in the IANA Time Zone Database (https://www.iana.org/time-zones). 
If the specified offset in the SET TIME ZONE query does not align with an IANA standard time zone, Snowflake will automatically round it to the nearest standard time zone with the closest offset. In such a case, a warning message will be generated.
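The rounding behavior can be sketched as a nearest-neighbor search over the set of offsets actually used by IANA zones. The candidate list below is a small illustrative subset (in minutes), not the full IANA database, and nearest_valid_offset is a hypothetical helper:

```python
# Illustrative subset of offsets used by real IANA zones, in minutes:
# -05:00, UTC, +05:30 (India), +05:45 (Nepal), +06:00, +08:00, +09:00.
VALID_OFFSETS_MIN = [-300, 0, 330, 345, 360, 480, 540]

def nearest_valid_offset(offset_min: int) -> int:
    """Pick the valid offset closest to a requested (possibly non-standard) one."""
    return min(VALID_OFFSETS_MIN, key=lambda v: abs(v - offset_min))

# A non-standard +05:50 request would land on +05:45, the closest listed zone.
print(nearest_valid_offset(350))  # 345
```

A Teradata SET TIME ZONE with an offset absent from the IANA list is resolved analogously, and the warning records that rounding took place.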
Example Code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- No additional user actions are required.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0006¶
View With Check Option Not Supported.
Description¶
This message is shown when SnowConvert AI finds a view with the WITH CHECK OPTION clause, which is not supported in Snowflake, so it is commented out in the code.
This clause works with updatable views that can be used to execute INSERT and UPDATE commands over the view and internally update the table associated with the view.
The clause is used to restrict the rows that will be affected by the command using the WHERE clause in the view.
For more details see the documentation (https://docs.teradata.com/r/SQL-Data-Definition-Language-Syntax-and-Examples/July-2021/View-Statements/CREATE-VIEW-and-REPLACE-VIEW/CREATE-VIEW-and-REPLACE-VIEW-Syntax-Elements/WITH-CHECK-OPTION) about the clause functionality.
Example code¶
Input code:¶
Generated Code:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0007¶
Variant column does not support collation.
Description¶
This message is shown when a VARIANT data type in the transformed code has a COLLATE clause. Since COLLATE is not supported with the data type VARIANT, it is removed and a message is added.
Example code¶
Input code:¶
Generated Code:¶
The data type JSON is converted to VARIANT, while NOT CASESPECIFIC is converted to a COLLATE clause.
Best Practices¶
- No additional user actions are required.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0008¶
When the NVP_UDF fourth parameter is non-literal and contains a backslash, that backslash needs to be escaped.
Description¶
Non-literal delimiters with spaces need their backslash escaped in Snowflake.
Example code¶
Input code¶
Generated Code¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0009¶
Converted from integer to varchar for current session default.
Description¶
This message is shown when SnowConvert AI finds a DEFAULT SESSION and the data type is NOT a VARCHAR. If that is the case, the data type is changed to VARCHAR and a message is added.
Code Example¶
Input Code:¶
Generated Code:¶
Let’s look at the example. Note that ColumnExample has a data type INTEGER with DEFAULT SESSION. Since the data type is not VARCHAR, in the output it is transformed to VARCHAR.
The data type of ColumnExample2 hasn’t changed since it is already VARCHAR.
Best Practices¶
- No additional user actions are required.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0010¶
Table columns are mapped between DBC.COLUMNSV (Teradata) and INFORMATION_SCHEMA.COLUMNS (Snowflake), but some columns might not have an exact match in Snowflake.
Description¶
Uses of the table DBC.COLUMNSV in Teradata are converted to INFORMATION_SCHEMA.COLUMNS, but some columns might not have an exact match in Snowflake. That means there are some columns in Teradata for which there is no equivalent in Snowflake, and there are others that do have a matching column but the content is not exactly the same.
Notice, for example, that there is no equivalent column for “ColumnFormat” in Snowflake, and that “DATA_TYPE” seems to be the match for the Teradata column “ColumnType”, but their contents differ greatly.
Code Example¶
Input Code:¶
Generated Code:¶
Best Practices¶
- Review what columns were used in Teradata and check if the available content in Snowflake matches your needs.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0011¶
Unicode BMP escape is not supported.
Description¶
Snowflake doesn’t support Unicode BMP escapes, so this message is shown when SnowConvert AI transforms a Teradata Unicode Delimited Character Literal (https://docs.teradata.com/r/Enterprise_IntelliFlex_VMware/SQL-Data-Types-and-Literals/Data-Literals/Unicode-Delimited-Character-Literals) containing a Unicode BMP escape to Snowflake.
Example code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- Check if a Unicode equivalent exists.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0012¶
Invalid default value.
Note
This FDM is deprecated, please refer to SSC-EWI-TD0006 documentation
Description¶
The DEFAULT TIME / DEFAULT DATE / DEFAULT CURRENT_DATE / DEFAULT CURRENT_TIME / DEFAULT CURRENT_TIMESTAMP column specifications are not supported for the FLOAT data type.
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- No additional user actions are required.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0013¶
The Snowflake error code doesn’t match the original Teradata error code.
Description¶
This message is shown because the error code stored in the BTEQ ERRORCODE built-in variable may not be the same in Snowflake Scripting.
Example code¶
Input code:¶
Generated Code:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0014¶
File execution inconsistency
Description¶
This EWI appears when the migrated code is a BTEQ statement executing an environment file with SQL statements, e.g. $(<$INPUT_SQL_FILE). The difference between the BTEQ execution and the generated Python code is that BTEQ continues with the remaining statements in the file when one of them fails, while the Python execution stops whenever an error occurs.
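The behavioral difference can be sketched as follows: a runner that catches per-statement errors mimics BTEQ's continue-on-error behavior, while stopping at the first failure matches the generated Python. Here run_statements and execute are hypothetical stand-ins, not the generated code:

```python
def run_statements(statements, execute, stop_on_error=False):
    """Run each statement; collect failures instead of aborting (BTEQ-style)."""
    errors = []
    for stmt in statements:
        try:
            execute(stmt)
        except Exception as exc:
            errors.append((stmt, exc))
            if stop_on_error:  # Python-generated behavior: halt at first failure
                break
    return errors

def fake_execute(stmt):
    """Stand-in for submitting a statement to Snowflake."""
    if "BAD" in stmt:
        raise RuntimeError("syntax error")

# BTEQ-style: the failure of the second statement does not stop the third.
errs = run_statements(["SELECT 1;", "BAD SQL;", "SELECT 2;"], fake_execute)
print(len(errs))  # 1
```

Wrapping each statement this way is one manual option for restoring BTEQ's continue-on-error semantics in the migrated script.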
Example Code¶
Teradata BTEQ:¶
Python:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0015¶
Regexp_Substr Function only supports POSIX regular expressions.
Note
This FDM is deprecated, please refer to SSC-EWI-0009 documentation
Description¶
Currently, there is no support in Snowflake for extended regular expressions beyond the POSIX Basic Regular Expression syntax.
This EWI is added every time a function call to REGEX_SUBSTR, REGEX_REPLACE, or REGEX_INSTR is transformed to Snowflake to warn the user about possible unsupported regular expressions. Some of the features not supported are lookahead, lookbehind, and non-capturing groups.
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- Check the regular expression used in each case to determine whether it needs manual intervention. More information about expanded regex support and alternatives in Snowflake can be found here (https://community.snowflake.com/s/question/0D50Z00007ENLKsSAP/expanded-support-for-regular-expressions-regex).
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0016¶
Value ‘l’ for parameter ‘match_arg’ is not supported in Snowflake
Description¶
In Teradata, functions like REGEX_SUBSTR, REGEX_REPLACE, or REGEX_INSTR have a parameter called “match_arg”, a character argument with the following valid values:
- 'i': case-insensitive matching.
- 'c': case-sensitive matching.
- 'n': the period character (match any character) can match the newline character.
- 'm': the source string is treated as multiple lines instead of as a single line.
- 'l': if source_string exceeds the current maximum allowed source_string size (currently 16 MB), a NULL is returned instead of an error.
- 'x': ignore whitespace (only affects the pattern string).
The argument can contain more than one character.
In Snowflake, the equivalent argument for these functions is regexp_parameters: a string of one or more characters that specifies the regular expression parameters used for searching for matches. The supported values are:
- c: case-sensitive matching.
- i: case-insensitive matching.
- m: multi-line mode.
- e: extract sub-matches.
- s: the '.' wildcard also matches the newline character.
As can be seen, the values 'i', 'c', and 'm' are the same in both languages, and the 'n' value in Teradata maps to 's'. However, the values 'l' and 'x' have no equivalent counterpart.
For the 'x' value, the functionality is replicated by generating a call to the REGEXP_REPLACE function. However, the 'l' parameter cannot be replicated, so this warning is generated in those cases.
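The mapping described above can be sketched as follows. convert_match_arg is a hypothetical helper for illustration, not SnowConvert AI's implementation:

```python
# Teradata match_arg characters with a direct Snowflake regexp_parameters
# equivalent; 'n' becomes 's'. 'l' has no counterpart, and 'x' is handled
# separately by rewriting the pattern, so both fall out as "unsupported" here.
TERADATA_TO_SNOWFLAKE = {"i": "i", "c": "c", "m": "m", "n": "s"}

def convert_match_arg(match_arg: str):
    """Split a match_arg string into (converted flags, unmapped characters)."""
    converted, unsupported = "", []
    for ch in match_arg:
        if ch in TERADATA_TO_SNOWFLAKE:
            converted += TERADATA_TO_SNOWFLAKE[ch]
        else:
            unsupported.append(ch)
    return converted, unsupported

print(convert_match_arg("inl"))  # ('is', ['l'])
```

In the 'l' case above, the leftover character is exactly what triggers this warning during migration.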
Input Code:¶
Generated Code:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0017¶
The use of foreign tables is not supported in Snowflake.
Note
This FDM is deprecated, please refer to SSC-EWI-TD0076 documentation
Description¶
Foreign tables (https://docs.teradata.com/r/Teradata-VantageTM-SQL-Data-Definition-Language-Syntax-and-Examples/September-2020/Table-Statements/CREATE-FOREIGN-TABLE) enable access to data in external object storage, such as semi-structured and unstructured data in Amazon S3, Azure Blob storage, and Google Cloud Storage. This syntax is not supported in Snowflake. However, there are other alternatives in Snowflake that can be used instead, such as external tables, iceberg tables, and standard tables.
Example code¶
Input code:¶
Generated Code:¶
Best Practices¶
- Instead of foreign tables in Teradata, you can use Snowflake external tables. External tables reference data files located in a cloud storage (Amazon S3, Google Cloud Storage, or Microsoft Azure) data lake. This enables querying data stored in files in a data lake as if it were inside a database. External tables can access data stored in any format supported by COPY INTO <table> statements. * Another alternative is Snowflake’s Iceberg tables. So, you can think of Iceberg tables as tables that use open formats and customer-supplied cloud storage. This data is stored in Parquet files. * Finally, there are the standard Snowflake tables which can be an option to cover the functionality of foreign tables in Teradata * If you need more support, you can email us at snowconvert-support@snowflake.com ## SSC-FDM-TD0018 JSON path was not recognized ### Description This message is shown when SnowConvert AI cannot deserialize a JSON path, because the string does not have the expected format or is not supported in Snowflake. #### Example code ##### Input Code:
Note
This FDM is deprecated, please refer to SSC-EWI-TD0063 documentationSELECT * FROM JSON_TABLE ( ON ( SELECT id, trainSchedule as ts FROM demo.PUBLIC.Train T ) USING rowexpr('$weekShedule.Monday[*]') colexpr( '[{"jsonpath" "$.time", "type"" : "CHAR ( 12 )"}]' ) ) AS JT(Id, Ordinal, Time, City);##### Generated Code:SELECT * FROM --** SSC-FDM-TD0018 - UNRECOGNIZED JSON PATH $weekShedule.Monday[*] ** JSON_TABLE ( ON !!!RESOLVE EWI!!! /*** SSC-EWI-0108 - THE FOLLOWING SUBQUERY MATCHES AT LEAST ONE OF THE PATTERNS CONSIDERED INVALID AND MAY PRODUCE COMPILATION ERRORS ***/!!! ( SELECT id, trainSchedule as ts FROM demo.PUBLIC.Train T ) USING rowexpr('$weekShedule.Monday[*]') colexpr( '[{"jsonpath" "$.time", "type"" : "CHAR ( 12 )"}]' ) ) AS JT(Id, Ordinal, Time, City);#### Best Practices * Check if the JSON path have an unexpected character, or do not have the right format. * If you need more support, you can email us at snowconvert-support@snowflake.com ## SSC-FDM-TD0019 Transaction and profile level query tags not supported in Snowflake, referencing session query tag instead ### Description Teradata allows users to define query bands at transaction, session, and profile levels, as well as consulting them with functions like GetQueryBandValue. Snowflake equivalent for query bands is the query_tag parameter, which can be set for session, user or account. Also, Snowflake does not have profiles. Due to these differences, this FDM is added to warn the user that transaction or profile-level query tags can not be defined nor consulted in Snowflake and that session-level query tags will be used as a replacement, which may cause functional differences in some cases. 
#### Example Code ##### Input Code:SELECT GETQUERYBANDVALUE(3, 'account');##### Generated CodeSELECT --** SSC-FDM-TD0019 - TRANSACTION AND PROFILE LEVEL QUERY TAGS NOT SUPPORTED IN SNOWFLAKE, REFERENCING SESSION QUERY TAG INSTEAD ** GETQUERYBANDVALUE_UDF('account');#### Best Practices * Modify your code logic to use query bands at the session level. * If you need more support, you can email us at snowconvert-support@snowflake.com ## SSC-FDM-TD0020 JSON value was not recognized due to invalid format### Description This message is shown when SnowConvert AI needs to deserialize JSON data for a transformation context, but the JSON value didn’t have the expected format or is not valid JSON. #### Example code ##### Input Code:Note
Some parts in the output code are omitted for clarity reasons.

```sql
SELECT * FROM JSON_TABLE (
 ON (SELECT id, trainSchedule as ts FROM demo.PUBLIC.Train T)
 USING rowexpr('$.weekShedule.Monday[*]')
 colexpr('[ {"ordinal" true}, {"jsonpath" "$.time", "type"" : "CHAR ( 12 )"}, {"jsonpath" "$.city", "type" : "VARCHAR ( 12 )"}]')
) AS JT(Id, Ordinal, Time, City);

SELECT * FROM JSON_TABLE (
 ON (SELECT id, trainSchedule as ts FROM demo.PUBLIC.Train T)
 USING rowexpr('$.weekShedule.Monday[*]')
 colexpr('{"jsonpath" "$.time", "type"" : "CHAR ( 12 )"}')
) AS JT(Id, Ordinal, Time, City);
```

Generated Code:¶

```sql
SELECT * FROM (
 SELECT id
 --** SSC-FDM-TD0020 - UNRECOGNIZED JSON LITERAL [ {"ordinal" true}, {"jsonpath" "$.time", "type"" : "CHAR ( 12 )"}, {"jsonpath" "$.city", "type" : "VARCHAR ( 12 )"}] **
 FROM demo.PUBLIC.Train T,
 TABLE(FLATTEN(INPUT => trainSchedule:weekShedule.Monday)) rowexpr
) JT;

SELECT * FROM (
 SELECT id
 --** SSC-FDM-TD0020 - UNRECOGNIZED JSON LITERAL {"jsonpath" "$.time", "type"" : "CHAR ( 12 )"} **
 FROM demo.PUBLIC.Train T,
 TABLE(FLATTEN(INPUT => trainSchedule:weekShedule.Monday)) rowexpr
) JT;
```

Best Practices¶
- Be sure the JSON has the expected format according to the Teradata grammar.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0021¶
Built-in reference to {0} is not supported in Snowflake.
Description¶
This error appears when a query referencing the DBC.DATABASES (https://www.docs.teradata.com/r/hNI_rA5LqqKLxP~Y8vJPQg/GqTx8VuBIkfaC4fso9f5cw) table is executed and the selected column has no equivalence in Snowflake.
Example Code¶
Input:¶
Note
This EWI is deprecated, please refer to SSC-EWI-TD0046 documentation.

```sql
CREATE VIEW SAMPLE_VIEW AS SELECT PROTECTIONTYPE FROM DBC.DATABASES;
```

Output:¶

```sql
CREATE OR REPLACE VIEW SAMPLE_VIEW
COMMENT = '{ "origin": "sf_sc", "name": "snowconvert", "version": { "major": 0, "minor": 0, "patch": "0" }, "attributes": { "component": "teradata", "convertedOn": "08/14/2024" }}'
AS
SELECT
!!!RESOLVE EWI!!! /*** SSC-EWI-TD0046 - BUILT-IN REFERENCE TO PROTECTIONTYPE IS NOT SUPPORTED IN SNOWFLAKE ***/!!!
PROTECTIONTYPE
FROM
INFORMATION_SCHEMA.DATABASES;
```

Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0022¶
Shell variables found, running this code in a shell script is required.
Description¶
In Teradata scripts, shell variables are used to store temporary values that can be accessed and manipulated throughout the script. Shell variables are defined using the dollar sign ($) followed by a name (which can be enclosed in curly braces), and their values can be set using the assignment operator (=).

```shell
#!/bin/bash
## define a shell variable
tablename="mytable"

## use the variable in a Teradata SQL query
bteq <<EOF
.LOGON myhost/myuser,mypassword
SELECT * FROM ${tablename};
.LOGOFF
EOF
```

You can think of shell variables as having the same or a similar function as string interpolation. Thus, it is important to keep this functionality when the code is transformed.
When converting scripts to Python, shell variables keep their functionality by running the converted code in a shell script (.sh file). For this reason, these shell variables must keep the same format as the input code.
Example Code¶
Input Code:¶

```sql
SELECT $column FROM ${tablename}
```

Generated Code:¶

```python
#*** Generated code is based on the SnowConvert AI Python Helpers version 2.0.6 ***
import os
import sys
import snowconvert.helpers
from snowconvert.helpers import Export
from snowconvert.helpers import exec
from snowconvert.helpers import BeginLoading

con = None
#** SSC-FDM-TD0022 - SHELL VARIABLES FOUND, RUNNING THIS CODE IN A SHELL SCRIPT IS REQUIRED **
def main():
  snowconvert.helpers.configure_log()
  con = snowconvert.helpers.log_on()
  exec("""
    SELECT $column FROM ${tablename}
    """)
  snowconvert.helpers.quit_application()

if __name__ == "__main__":
  main()
```

Best Practices¶
- Running the converted code in a shell script is required.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0023¶
String Similarity might have a different behavior.
Description¶
This FDM is shown when SnowConvert AI transforms the Similarity Function from Teradata to Snowflake. It indicates the results might have a different behavior.
Example Code¶
Given the following data as an example:

| Id | a | b |
|---|---|---|
| 1 |  |  |
| 2 | Gute nacht | Ich weis nicht |
| 3 | Ich weiß nicht | Ich wei? nicht |
| 4 | Ich weiß nicht | Ich wei? nicht |
| 5 | Ich weiß nicht | Ich weiss nicht |
| 6 | Snowflake | Oracle |
| 7 | święta | swieta |
| 8 | NULL |  |
| 9 | NULL | NULL |

Input Code:¶
Query¶

```sql
-- Additional Params: -q SnowScript
SELECT * FROM StringSimilarity (
 ON (
  SELECT id,
   CAST(a AS VARCHAR(200)) AS a,
   CAST(b AS VARCHAR(200)) AS b
  FROM table_1
 ) PARTITION BY ANY
 USING
 ComparisonColumnPairs ('jaro_winkler(a,b) AS sim_fn')
 Accumulate ('id')
) AS dt ORDER BY 1;
```

Result¶

| Id | sim_fn |
|---|---|
| 1 | 0 |
| 2 | 0.565079365 |
| 3 | 1 |
| 4 | 0.959047619 |
| 5 | 0 |
| 6 | 0.611111111 |
| 7 | 0.7777777777777777 |
| 8 | 0 |
| 9 | 0 |
Generated Code¶
Query¶
Result¶
| ID | SIM_FN |
|---|---|
| 1 | 0.000000 |
| 2 | 0.560000 |
| 3 | 0.970000 |
| 4 | 0.950000 |
| 5 | 0.000000 |
| 6 | 0.610000 |
| 7 | 0.770000 |
| 8 | 0.000000 |
| 9 | 0.000000 |
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0024¶
Set table functionality not supported.
Description¶
This FDM is shown when SnowConvert AI finds a CREATE TABLE with the SET option. Since SET tables are not supported in Snowflake, the option is removed.
Example Code¶
Teradata:¶
Snowflake Scripting:¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0025¶
Teradata Database Temporal Table is not supported in Snowflake
Description¶
The Teradata Database Temporal Support (https://docs.teradata.com/r/0TSAVrLIwk23SLHbA4nUvQ/root) involves the creation of temporal tables and temporal DDL and DML objects. Temporal (time-aware) tables and data are not supported in Snowflake since there is no exact equivalent.
All these statements are recognized (parsed) by SnowConvert AI, but to execute the queries in Snowflake, these elements are removed in the translation process.
It is worth noting that when an Abort statement is encountered, it is transformed into a Delete command to keep equivalent functionality. Abort allows you to undo operations performed during a transaction and restore the database to the state it had at the beginning.
Example code¶
The following example shows a Temporal-form Select being translated to a usual Select.
Input code:¶
Generated Code:¶
Case where the Abort command is used in the context of a transaction.
Input code:¶
Generated Code:¶
Best Practices¶
- No additional user actions are required.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0026¶
GOTO statement was removed due to if statement inversion.
Note
Some parts in the output code are omitted for clarity reasons.
Description ¶
It is common to use the GOTO command with IF and LABEL commands to replicate the functionality of a SQL if statement. When used in this way, it is possible to transform them directly into an if, if-else, or even an if-elseif-else statement. In these cases, however, the GOTO commands become unnecessary and are removed to prevent them from being replaced by a LABEL section.
Example Code ¶
Input Code:
Output Code
Best Practices ¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0027¶
TD_UNPIVOT transformation requires column information that could not be found, columns missing in result
Note
This FDM is deprecated, please refer to SSC-EWI-TD0061 documentation.
Description¶
SnowConvert AI supports and transforms the TD_UNPIVOT (https://docs.teradata.com/r/Enterprise_IntelliFlex_VMware/SQL-Operators-and-User-Defined-Functions/Table-Operators/TD_UNPIVOT) function, which can be used to represent columns from a table as rows.
However, this transformation requires information about the columns of the table or tables, specifically their names. When this information is not present, the transformation may be left in an incomplete state where columns are missing from the result; this message is generated in those cases.
Example code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- There are two ways of supplying the information about columns to the conversion tool: put the table specification in the same file as the TD_UNPIVOT call or specify a column list in the SELECT query of the ON expression instead of SELECT * or the table name.
- This issue can be safely ignored if ALL the columns from the input table/tables are unpivoted, otherwise, the result will have missing columns.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0028¶
JSON_TABLE not transformed, column names could not be retrieved from semantic information
Note
This FDM is deprecated, please refer to SSC-EWI-TD0060 documentation.
Description¶
The JSON_TABLE function can be transformed by SnowConvert AI, however, this transformation requires knowing the name of the columns that are being selected in the JSON_TABLE ON subquery.
This message is generated to warn the user that the column names were not explicitly put in the subquery (for example, a SELECT * was used) and the semantic information of the tables being referenced was not found, meaning the column names could not be extracted.
Example code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- Please check that the code provided to SnowConvert AI is complete. If you did not provide the table definition, re-execute the conversion with the table definition present.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0029¶
Snowflake supported formats for TO_CHAR differ from Teradata and may fail or have different behavior
Format elements that depend on session parameters¶
Some Teradata format elements are mapped to Snowflake functions that depend on the value of session parameters. To avoid functional differences in the results you should set these session parameters to the same values they have in Teradata. Identified format elements that are mapped to this kind of functions are:
- D: Mapped to the DAYOFWEEK function. The results of this function depend on the WEEK_START session parameter; by default, Teradata considers Sunday the first day of the week, while in Snowflake it is Monday.
- WW: Mapped to the WEEK function. This function depends on the WEEK_OF_YEAR_POLICY session parameter, which by default is set to use the ISO standard (the first week of the year is the first to contain at least four days of January), but in Teradata is set to consider January first as the start of the first week.
To modify session parameters, use ALTER SESSION SET parameter_name = value. For more information, see the Snowflake session parameters reference.
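For example, the following statements are a sketch of how to align the session with the Teradata defaults described above; the values shown are assumptions based on those defaults, so verify them against your actual Teradata configuration:

```sql
-- Make Sunday the first day of the week, as in Teradata (1 = Monday ... 7 = Sunday).
ALTER SESSION SET WEEK_START = 7;
-- Make the week containing January 1 count as the first week of the year.
ALTER SESSION SET WEEK_OF_YEAR_POLICY = 1;
```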
Single parameter version of TO_CHAR¶
The single parameter version of TO_CHAR(Datetime) makes use of the default formats specified in the session parameters TIMESTAMP_LTZ_OUTPUT_FORMAT, TIMESTAMP_NTZ_OUTPUT_FORMAT, TIMESTAMP_TZ_OUTPUT_FORMAT and TIME_OUTPUT_FORMAT. To avoid differences in behavior please set them to the same values used in Teradata.
For TO_CHAR(Numeric), Snowflake generates the varchar representation using either the TM9 or TME format to get a compact representation of the number. Teradata also generates compact representations of numbers, so no action is required.
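As a sketch, the default output formats can be aligned before running converted code; the format strings below are hypothetical examples and should be replaced with the ones configured in your Teradata system:

```sql
-- Hypothetical example formats; replace with the values used in Teradata.
ALTER SESSION SET TIMESTAMP_NTZ_OUTPUT_FORMAT = 'YYYY-MM-DD HH24:MI:SS.FF6';
ALTER SESSION SET TIMESTAMP_TZ_OUTPUT_FORMAT  = 'YYYY-MM-DD HH24:MI:SS.FF6 TZH:TZM';
ALTER SESSION SET TIME_OUTPUT_FORMAT          = 'HH24:MI:SS';
```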
Example Code¶
Input Code:¶
Generated Code:¶
Best Practices¶
- When using FF either try to use DateTime types with the same precision that you use in Teradata or add a precision to the format element to avoid the different behavior.
- When using timezone-related format elements, use a first parameter of type TIMESTAMP_TZ to avoid different behavior. Also remember that the TIME type cannot have time zone information in Snowflake.
- Set the necessary session parameters with the default values from Teradata to avoid different behavior.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0030¶
A return statement was added at the end of the label section to ensure the same execution flow
Description¶
When a Goto statement is replaced with a Label section and does not contain a return statement, one is added at the end of the section to ensure the same execution flow.
In BTEQ, after a Goto command is executed, the statements between the Goto command and the Label command with the same name are ignored. So, to prevent those statements from being executed, the label section should contain a return statement.
In addition, it is worth mentioning that the Goto command skips all the other statements except for the Label with the same name, which is where the execution resumes. Therefore, the execution will never resume in a label section defined before the Goto command.
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0031¶
ST_DISTANCE results are slightly different from ST_SPHERICALDISTANCE
Description¶
The Teradata function ST_SPHERICALDISTANCE calculates the distance between two spherical coordinates on the planet using the Haversine formula. The Snowflake ST_DISTANCE function, on the other hand, does not use the Haversine formula to calculate the minimum distance between two geographical points, so the results differ slightly.
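If you need results closer to Teradata's Haversine-based calculation, Snowflake's built-in HAVERSINE function can be used instead of ST_DISTANCE. The sketch below assumes GEOGRAPHY point columns named location1 and location2 in a hypothetical table:

```sql
-- HAVERSINE takes (lat1, lon1, lat2, lon2) and returns kilometers;
-- ST_Y/ST_X extract latitude/longitude from a GEOGRAPHY point.
-- Multiply by 1000 to compare against ST_DISTANCE, which returns meters.
SELECT HAVERSINE(ST_Y(location1), ST_X(location1),
                 ST_Y(location2), ST_X(location2)) * 1000 AS distance_in_m
FROM demo_points;
```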
Example Code¶
Input Code:¶
Teradata Output¶
| location1 | location2 | Distance_In_Km |
|---|---|---|
| POINT (-73.989308 40.741895) | POINT (40.741895 34.053691) | 9351139.978062356 |
Generated Code¶
Snowflake Output¶
| LOCATION1 | LOCATION2 | DISTANCE_IN_KM |
|---|---|---|
| { "coordinates": [ -73.989308, 40.741895 ], "type": "Point" } | { "coordinates": [ 40.741895, 34.053691 ], "type": "Point" } | 9351154.65572674 |
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0032¶
CASESPECIFIC clause was removed from LIKE expression
Note
Some parts in the output code are omitted for clarity reasons.
Description¶
This error appears when the LIKE expression is accompanied by the [NOT] CASESPECIFIC clause.
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- Case-Specific Behavior in TERADATA depends on TMODE system configuration.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0033¶
ACTIVITY_COUNT transformation might require manual adjustments
Description¶
The ACTIVITY_COUNT status variable returns the number of rows affected by an SQL DML statement in an embedded SQL or stored procedure application. For more information, see the Teradata ACTIVITY_COUNT documentation (https://docs.teradata.com/r/Enterprise_IntelliFlex_VMware/SQL-Stored-Procedures-and-Embedded-SQL/Result-Code-Variables/ACTIVITY_COUNT).
As explained in its translation specification, there is a workaround to emulate ACTIVITY_COUNT’s behavior through:
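Based on the behavior described in the limitations below, the workaround follows this general pattern; this is a sketch, the table and variable names are hypothetical, and the use of RESULT_SCAN is an assumption about the generated code:

```sql
-- After a DML statement, its affected-row count is read from the
-- statement's own result set ($1) via LAST_QUERY_ID().
INSERT INTO employees (id, name) VALUES (101, 'John');
SELECT $1 INTO :row_count1 FROM TABLE(RESULT_SCAN(LAST_QUERY_ID()));
```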
However, this presents some limitations listed below.
Limitations ¶
First case¶
If ACTIVITY_COUNT is called twice or more times before executing another DML statement, the transformation might not return the expected values.
Teradata¶
Snowflake¶
In both procedures, ACTIVITY_COUNT is called twice before another DML statement is called. In Teradata, ACTIVITY_COUNT will return the number of rows in the INSERT statement above them, even when called twice. However, since the Snowflake transformation uses LAST_QUERY_ID(), the result depends on the result set held by LAST_QUERY_ID().
InsertEmployeeSalaryAndLog_1() requires no manual adjustments. Check the Query History (bottom-up):
- The INSERT statement is executed. LAST_QUERY_ID() will point to this statement.
- The SELECT (first ACTIVITY_COUNT) is executed, and $1 will be 1. LAST_QUERY_ID() will point to this statement.
- The SELECT (second ACTIVITY_COUNT) is executed; since the last statement result was 1, $1 will be 1 for this SELECT as well.
- Finally, row_count1 holds the value 1, which is inserted in activity_log.
On the other side, InsertEmployeeSalaryAndLog_2() does require manual adjustments. Check the Query History (bottom-up):
- The INSERT statement is executed. LAST_QUERY_ID() will point to this statement.
- The SELECT (first ACTIVITY_COUNT) is executed, and $1 will be 1. However, notice how QUERY_TEXT has the + 10; this will affect the result that will be scanned. LAST_QUERY_ID() will point to this statement.
- The SELECT (second ACTIVITY_COUNT) is executed. The result for the last query is 11; thus $1 will hold 11 instead of the expected 1.
- Finally, row_count1 holds the value 11, which is inserted in activity_log.
These are the values inserted in activity_log:
| LOG_ID | OPERATION | ROW_COUNT | LOG_TIMESTAMP |
|---|---|---|---|
| 1 | INSERT PROCEDURE | 1 | 2024-07-15 09:22:21.725 |
| 101 | INSERT PROCEDURE | 11 | 2024-07-15 09:22:26.248 |
Adjustments for the first case¶
As per Snowflake’s documentation for LAST_QUERY_ID, you can specify the query to return, based on the position of the query. LAST_QUERY_ID(-1) returns the latest query, (-2) the second last query, and so on.
The fix for the problem in InsertEmployeeSalaryAndLog_2() will be to simply specify LAST_QUERY_ID(-2) in the second use of ACTIVITY_COUNT (second SELECT) so that it gets the results from the INSERT statement instead:
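A sketch of the adjusted read, assuming a RESULT_SCAN-based emulation and hypothetical identifiers:

```sql
-- The first ACTIVITY_COUNT read consumed LAST_QUERY_ID(-1),
-- so the second read skips it and scans the INSERT's result instead.
SELECT $1 INTO :row_count1 FROM TABLE(RESULT_SCAN(LAST_QUERY_ID(-2)));
```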
Second case¶
If ACTIVITY_COUNT is called after a non DML statement was executed, the transformation will not return the expected values.
Teradata¶
Snowflake¶
Similar to the previous, LAST_QUERY_ID does not point to the correct query and thus returns an incorrect value, which is assigned to row_count1. Check the Query History (bottom-up):
- The INSERT statement is executed. LAST_QUERY_ID() will point to this statement.
- The SELECT INTO is executed, and $1 will be 101. LAST_QUERY_ID() will point to this statement.
- The SELECT (ACTIVITY_COUNT) is executed. The result for the last query is 101; thus $1 will hold 101 instead of the expected 1.
- Finally, row_count1 holds the value 101, which is inserted in activity_log.
These are the values inserted in activity_log:
| LOG_ID | OPERATION | ROW_COUNT | LOG_TIMESTAMP |
|---|---|---|---|
| 1 | EMPLOYEE INSERTED - ID: 101 | 101 | 2024-07-15 11:00:38.000 |
Adjustments for the second case¶
- One possible fix is to specify the correct query to return by LAST_QUERY_ID. For example, here LAST_QUERY_ID(-2) will be the correct query to point to.
- Another possible fix is to use ACTIVITY_COUNT (SELECT) immediately after executing the INSERT statement.
Best Practices¶
- Make sure to point to the correct query when using LAST_QUERY_ID.
- Make sure ACTIVITY_COUNT is used immediately after the DML statement to evaluate.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0034¶
Period contains transformed to user defined function.
Description¶
The Teradata CONTAINS expression checks whether the element on the right is contained in the element on the left, which must be of PERIOD type. CONTAINS only applies to DATE, TIME, TIMESTAMP, or PERIOD values. Since PERIOD is not supported in Snowflake, a user-defined function emulates the logic of the native CONTAINS behavior.
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- The VARCHAR used instead of PERIOD assumes the <PERIOD_BEGIN>*<PERIOD_END> format in all the values. If the values are split by a token different from *, you can change the value returned from the PUBLIC.GET_PERIOD_SEPARATOR UDF provided by SnowConvert AI. Notice that the structure should have a token that marks the beginning and end of a PERIOD, so the two dates, times, or timestamps should always be separated by the same token.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0035¶
Statistics function not needed in Snowflake.
Note
This FDM is deprecated, please refer to SSC-EWI-0037 documentation
Description¶
DROP, COLLECT, or HELP statistics are not needed in Snowflake. Snowflake already collects statistics used for automatic query optimization, which is why these statistics statements are used in Teradata.
Example Code¶
Input Code:¶
Generated Code¶
Best Practices¶
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0036¶
Snowflake does not support the period datatype, all periods are handled as varchar instead
Note
Some parts in the output code are omitted for clarity reasons.
Precision of generated varchar representations¶
PERIOD_UDF generates the varchar representation of a period using the default formats for timestamps and time specified in Snowflake. This means timestamps will have three precision digits and time values will have zero, so you may find that the results have higher or lower precision than expected. There are two options to modify how many precision digits are included in the resulting string:
- Use the three-parameter version of PERIOD_UDF: this overload of the function takes the PRECISIONDIGITS parameter, an integer between 0 and 9 that controls how many digits of the fractional time part are included in the result. Note that even though Snowflake supports up to nine digits of precision, the maximum in Teradata is six. Example:
| Call | Result |
|---|---|
| PUBLIC.PERIOD_UDF(time '13:30:45.870556', time '15:35:20.344891', 0) | '13:30:45*15:35:20' |
| PUBLIC.PERIOD_UDF(time '13:30:45.870556', time '15:35:20.344891', 2) | '13:30:45.87*15:35:20.34' |
| PUBLIC.PERIOD_UDF(time '13:30:45.870556', time '15:35:20.344891', 5) | '13:30:45.87055*15:35:20.34489' |
- Alter the session parameters TIMESTAMP_NTZ_OUTPUT_FORMAT and TIME_OUTPUT_FORMAT: the commands ALTER SESSION SET TIMESTAMP_NTZ_OUTPUT_FORMAT = <format> and ALTER SESSION SET TIME_OUTPUT_FORMAT = <format> modify the formats Snowflake uses by default for the current session. Modifying them to include the desired number of precision digits changes the result of future executions of PERIOD_UDF for the current session.
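For example, the following is a sketch that includes six fractional digits, matching Teradata's maximum precision:

```sql
-- Future PERIOD_UDF results in this session will include six precision digits.
ALTER SESSION SET TIMESTAMP_NTZ_OUTPUT_FORMAT = 'YYYY-MM-DD HH24:MI:SS.FF6';
ALTER SESSION SET TIME_OUTPUT_FORMAT = 'HH24:MI:SS.FF6';
```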
Example code¶
Input code:¶
Generated Code:¶
Best Practices¶
- Since the behavior of PERIOD and its related functions is emulated using varchar, we recommend reviewing the results obtained to ensure their correctness.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0037¶
LOGTABLE removed.
Description¶
The .LOGTABLE (https://docs.teradata.com/r/Enterprise_IntelliFlex_Lake_VMware/Teradata-MultiLoad-Reference-20.00/Teradata-MultiLoad-Commands/LOGTABLE) command in MLoad (https://docs.teradata.com/r/Enterprise_IntelliFlex_Lake_VMware/Teradata-MultiLoad-Reference-20.00/Using-Teradata-MultiLoad) is used for checkpoint and restart metadata, but Snowflake handles these features automatically. Instead of .LOGTABLE, you can monitor and audit your data loads in Snowflake using the COPY_HISTORY function and related account usage views.
Code Example¶
Input Code:¶
Generated Code:¶
Best Practices¶
- Use COPY_HISTORY and related Snowflake account usage views to monitor load history.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0038¶
PUT command requires execution through Snowflake CLI.
Description¶
The PUT command lets you upload files to a Snowflake stage, but it only works when you run your script with the Snowflake CLI (snow sql -f script.sql). It does not work inside stored procedures or the web UI. If your script includes a PUT command, make sure to run it using the Snowflake CLI.
Code Example¶
Generated Code:¶
Best Practices¶
- Run scripts containing PUT commands using the Snowflake CLI: snow sql -f script.sql.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0039¶
Collation handled at query level for this table, any new query over this table should apply collation appropriately.
Description¶
When the “Disable use of COLLATE for Case Specification” general conversion setting is enabled, SnowConvert AI emulates the case-insensitive behavior of the NOT CASESPECIFIC clause by modifying comparisons in queries with the UPPER function. This is performed at the query level instead of using collation at the column level. This warning is generated on any table whose case sensitivity is emulated at the query level, to remind the user that any new query over these tables must handle the case-sensitivity behavior of comparisons appropriately.
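As a sketch of the pattern (table and column names are hypothetical), a comparison over a NOT CASESPECIFIC column is rewritten along these lines:

```sql
-- Teradata: WHERE last_name = 'smith' on a NOT CASESPECIFIC column.
-- Snowflake emulation at the query level:
SELECT *
FROM employees
WHERE UPPER(last_name) = UPPER('smith');
```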
Example code¶
Input code:¶
Generated Code:¶
Best Practices¶
- If you provided all your queries over the table to SnowConvert AI as part of your conversion, then no additional actions are required; this FDM is informational only.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0046¶
CAST-level date FORMAT clause is not supported in Snowflake. If the column is used as a string, an explicit TO_VARCHAR with the format may be needed.
Description¶
In Teradata, the FORMAT clause on a CAST expression controls how a date or timestamp value is displayed when implicitly converted to a string. For example:
Snowflake does not support inline FORMAT clauses on CAST. When SnowConvert AI can verify that the format contains only recognized datetime elements (e.g., YYYY, MM, DD), it removes the FORMAT clause and emits this FDM instead of an EWI, because the date conversion itself is functionally correct — the only difference is the display format.
If the result is later used in a string context (concatenation, assignment to VARCHAR, etc.), you may need to wrap the expression with TO_VARCHAR and the corresponding Snowflake format string.
Note
When the format contains unsupported or unrecognized elements, or when the operand type cannot be resolved as a datetime type, SSC-EWI-TD0025 is emitted instead.
Example Code¶
Input Code:¶
Generated Code:¶
When the result is wrapped in a CAST ... AS VARCHAR, SnowConvert AI applies the format inside a TO_VARCHAR call:
Input Code:¶
Generated Code:¶
Best Practices¶
- If the converted column is only used as a date (comparisons, filters, date arithmetic), the conversion is functionally equivalent and no action is needed.
- If the column is used in a string context (e.g., concatenation, display, assignment to
VARCHAR), wrap the expression withTO_VARCHARand the appropriate Snowflake format string. For example:TO_VARCHAR(TO_DATE(capture_date), 'YYYY/MM/DD'). - Review the Snowflake TO_VARCHAR documentation for supported format models.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0040¶
Column-level FORMAT clause is not supported in Snowflake. Conversion functions are used in DML statements as a workaround.
Description¶
In Teradata, the FORMAT clause on a column definition tells the system how to display and parse datetime values. For example, a column defined as DATE FORMAT 'MM-DD-YYYY' expects date strings like '03-30-2026'.
Snowflake does not have an equivalent FORMAT clause on column definitions. To preserve the original behavior, SnowConvert AI:
- Comments out the FORMAT clause in the CREATE TABLE output.
- Adds explicit conversion functions (TO_DATE, TO_TIMESTAMP, or TO_TIME) around string literals in DML statements that reference the formatted column, using the Snowflake-equivalent format string.
This ensures that DML statements continue to parse string literals the same way Teradata did.
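As a sketch (the table and column names are hypothetical), a DATE column declared with FORMAT 'MM-DD-YYYY' causes string literals in DML statements to be wrapped like this:

```sql
-- Teradata: SELECT * FROM orders WHERE order_date = '03-30-2026';
-- Snowflake: the literal is parsed with the format taken from the DDL.
SELECT *
FROM orders
WHERE order_date = TO_DATE('03-30-2026', 'MM-DD-YYYY');
```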
Note
When the FORMAT matches Snowflake’s default output format for the column type ('YYYY-MM-DD' for DATE, 'HH:MI:SS' for TIME, 'YYYY-MM-DDBHH:MI:SS' for TIMESTAMP), the FORMAT clause is silently removed from the DDL without this FDM, and no conversion functions are added to DML statements. These formats are natively handled by Snowflake. This FDM only appears for non-standard formats that require explicit conversion.
Important
For this transformation to work, the CREATE TABLE statement that defines the FORMAT clause must be included in the conversion input. SnowConvert AI reads the FORMAT value and column type from the DDL and uses that information when converting DML statements. If the DDL is not included, the tool has no way to know which format applies and the conversion functions will not be added.
Conversion function mapping¶
| Column Type | Conversion Function |
|---|---|
| DATE | TO_DATE |
| TIMESTAMP, TIMESTAMP WITH TIME ZONE | TO_TIMESTAMP |
| TIME, TIME WITH TIME ZONE | TO_TIME |
Example Code¶
Input code:¶
Generated Code:¶
Example with BETWEEN:¶
Example with INSERT VALUES:¶
Example with MERGE:¶
Best Practices¶
- Always include the CREATE TABLE statements that define FORMAT clauses in the conversion input. Without them, SnowConvert AI cannot determine the correct format for DML conversion.
- After conversion, verify that the converted code behaves correctly when these formats are present. In particular, check that the format string in the generated TO_DATE/TO_TIMESTAMP/TO_TIME calls matches the original Teradata FORMAT.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0041¶
Column-level display-only FORMAT clause is not supported in Snowflake. No action needed.
Description¶
Teradata supports a display-only FORMAT 'X(n)' clause on character-type columns (VARCHAR, CHAR, CLOB, STRING). This format controls only the display width of the column and has no effect on data storage or query behavior. Snowflake does not support this clause, so SnowConvert AI comments it out.
Because the X(n) format is purely cosmetic, no conversion functions are added to DML statements and no manual intervention is required. This FDM is informational only.
Example Code¶
Input code:¶
Generated Code:¶
Note
The UPPER(RTRIM(...)) wrapping on the WHERE clause is due to the collation handling for NOT CASESPECIFIC columns (SSC-FDM-TD0039), not the FORMAT clause.
Best Practices¶
- No action is required for this FDM. The X(n) display format has no functional impact in Snowflake.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0042¶
SIGNAL condition information items other than MESSAGE_TEXT are not supported in Snowflake. RAISE is used as a workaround.
Description¶
In Teradata, the SIGNAL statement can include a SET clause with multiple condition information items such as MESSAGE_TEXT, CLASS_ORIGIN, SUBCLASS_ORIGIN, RETURNED_SQLSTATE, and others. These items provide additional context when raising an error condition.
Snowflake’s RAISE statement only supports a single message through the EXCEPTION declaration. SnowConvert AI preserves the MESSAGE_TEXT value and uses it to declare a Snowflake exception, but any other condition information items (e.g., CLASS_ORIGIN, SUBCLASS_ORIGIN) are dropped because Snowflake has no equivalent mechanism.
This FDM is attached to the generated RAISE statement whenever unsupported condition information items are present in the original SIGNAL statement.
Example Code¶
Input code:¶
Generated Code:¶
Note
When all condition information items in the SET clause are supported (i.e., only MESSAGE_TEXT is present), the SIGNAL is converted to RAISE without this FDM.
Best Practices¶
- Review each occurrence of this FDM to determine if the dropped condition information items (CLASS_ORIGIN, SUBCLASS_ORIGIN, etc.) are critical for your error-handling logic. If so, consider adding custom logging to capture that information.
- The MESSAGE_TEXT value is always preserved in the Snowflake EXCEPTION declaration, so the primary error message remains intact.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0044¶
PREPARE with USING variables bound at EXECUTE IMMEDIATE time instead of OPEN CURSOR time.
Description¶
In Teradata, the PREPARE ... FROM query statement stages a SQL query for execution, and the OPEN cursor USING var1, var2 statement binds variable values at OPEN time, allowing the cursor to use the current values of those variables when the cursor is opened.
In Snowflake, SnowConvert AI transforms PREPARE S1 FROM query into EXECUTE IMMEDIATE query USING (var1, var2), which binds the variable values at EXECUTE IMMEDIATE time (when the PREPARE is converted). The cursor is then fixed at the LET CURSOR FOR RESULTSET declaration. This means that:
- Variable values are captured earlier in the execution flow (at PREPARE/EXECUTE IMMEDIATE time, not OPEN time)
- Reassigning the resultset variable or re-executing PREPARE in a loop will not update the cursor
This functional difference marker indicates that the binding timing has changed. Review your code to ensure that variables contain the correct values at PREPARE time (EXECUTE IMMEDIATE time in Snowflake).
Example Code¶
Input Code:¶
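The following is an illustrative Teradata fragment, not the exact example from the original page; the statement name, cursor name, and query are assumptions. The key point is that the variable is bound at OPEN time:

```sql
-- Hypothetical Teradata fragment: the USING variable is bound when the
-- cursor is OPENed, so the cursor sees column_value = 1.
DECLARE cur_s1 CURSOR FOR S1;
SET column_value = 0;
PREPARE S1 FROM 'SELECT col1 FROM my_table WHERE col1 = ?';
SET column_value = 1;
OPEN cur_s1 USING column_value;  -- binds column_value here, value 1
```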
Generated Code:¶
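A sketch of the converted Snowflake Scripting shape (variable and resultset names are assumptions). Binding now happens at EXECUTE IMMEDIATE, so the cursor sees the earlier value:

```sql
LET column_value NUMBER := 0;
LET RESULTSET_S1 RESULTSET :=
    (EXECUTE IMMEDIATE 'SELECT col1 FROM my_table WHERE col1 = ?'
        USING (column_value));        -- bound here, while column_value = 0
column_value := 1;                    -- too late to affect the cursor
LET CURSOR_S1_INSTANCE_V0 CURSOR FOR RESULTSET_S1;
OPEN CURSOR_S1_INSTANCE_V0;          -- returns rows matching 0, not 1
```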
Note: In the generated code, column_value is bound when EXECUTE IMMEDIATE runs (where it still equals 0), not when OPEN CURSOR_S1_INSTANCE_V0 executes (after it’s set to 1). In Teradata, the binding happens at OPEN time, so the cursor would use the value 1.
Best Practices¶
- Review variable assignment order: Ensure variables used in USING clauses have the correct values before the PREPARE statement is executed (which becomes EXECUTE IMMEDIATE in Snowflake).
- Move assignments earlier: If variables are assigned after PREPARE but before OPEN in Teradata, move those assignments to before the PREPARE statement.
- Test cursor behavior: Verify that cursors return the expected result sets, especially in loops or when variable values change between PREPARE and OPEN.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0043¶
Dynamic MESSAGE_TEXT in SIGNAL is not supported by Snowflake exceptions. CUSTOM_SQLERRM is used as a workaround.
Description¶
In Teradata, SIGNAL ... SET MESSAGE_TEXT can accept a variable or expression as the error message, allowing the message to be built dynamically at runtime. For example:
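A minimal illustrative sketch (the variable names and SQLSTATE value are assumptions):

```sql
-- Hypothetical Teradata SIGNAL whose message is built at runtime.
SET err_msg = 'Row not found for id ' || TRIM(v_id);
SIGNAL SQLSTATE '75001' SET MESSAGE_TEXT = err_msg;
```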
In Snowflake Scripting, the EXCEPTION declaration requires a compile-time literal for the message. There is no way to dynamically set the exception message at raise time using the RAISE statement.
As a workaround, SnowConvert AI:
- Declares the exception with a static fallback message (e.g., 'Condition 75001 signaled').
- Assigns the dynamic value to a CUSTOM_SQLERRM variable before the RAISE.
The CUSTOM_SQLERRM variable holds the intended dynamic message, but when the exception propagates, Snowflake reports the static message from the EXCEPTION declaration — not the dynamic one. Exception handlers that need the dynamic message must read CUSTOM_SQLERRM explicitly.
Example Code¶
Input code:¶
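The following is an illustrative Teradata sketch, not the exact example from the original page; the procedure, condition, and message text are assumptions:

```sql
-- Hypothetical Teradata procedure building MESSAGE_TEXT at runtime.
REPLACE PROCEDURE demo_signal_proc (IN v_id INTEGER)
BEGIN
    DECLARE err_msg VARCHAR(256);
    DECLARE not_found CONDITION FOR SQLSTATE '75001';
    SET err_msg = 'Row not found for id ' || TRIM(v_id);
    SIGNAL not_found SET MESSAGE_TEXT = err_msg;
END;
```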
Generated Code:¶
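A sketch of the converted Snowflake Scripting shape (names, error code, and fallback message are assumptions; the actual SnowConvert AI output may differ in detail):

```sql
CREATE OR REPLACE PROCEDURE demo_signal_proc (v_id INTEGER)
RETURNS VARCHAR
LANGUAGE SQL
AS
$$
DECLARE
    CUSTOM_SQLERRM VARCHAR;
    not_found EXCEPTION (-20001, 'Condition 75001 signaled');
BEGIN
    CUSTOM_SQLERRM := 'Row not found for id ' || v_id;
    RAISE not_found;  -- Snowflake reports the static declaration message
EXCEPTION
    WHEN not_found THEN
        RETURN CUSTOM_SQLERRM;  -- handlers read the dynamic message here
END;
$$;
```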
Best Practices¶
- In exception handlers, read CUSTOM_SQLERRM to retrieve the dynamic error message instead of relying on the exception's static message.
- If the dynamic message is only used for logging purposes, consider moving the logging statement before the RAISE so it captures the dynamic value directly.
- If you need more support, you can email us at snowconvert-support@snowflake.com
SSC-FDM-TD0047¶
This macro references a Teradata built-in view that has no Snowflake equivalent.
Description¶
Teradata provides a set of system views under the DBC database (e.g., DBC.Software_Event_LogV, DBC.EventLog) that expose infrastructure-level metrics such as disk usage, software events, and resource monitoring. These views are specific to the Teradata platform and have no functional equivalent in Snowflake.
When SnowConvert AI encounters a CREATE MACRO (or REPLACE MACRO) whose body references one of these unsupported DBC views, the entire macro is commented out and this FDM marker is emitted. The marker identifies the specific unsupported view that triggered the action.
Note that macros referencing supported DBC views (such as DBC.Columns or DBC.Tables, which map to INFORMATION_SCHEMA equivalents) are converted normally and do not trigger this marker.
Example Code¶
Input Code:¶
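The following is an illustrative Teradata sketch, not the exact example from the original page; the macro name and selected columns are assumptions:

```sql
-- Hypothetical macro referencing an unsupported Teradata system view.
REPLACE MACRO list_events AS (
    SELECT TheDate, Event FROM DBC.Software_Event_LogV;
);
```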
Generated Code:¶
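A sketch of the output shape (the exact wording of the FDM comment is illustrative, not the literal SnowConvert AI text): the macro body is commented out and the marker names the unsupported view.

```sql
--** SSC-FDM-TD0047 - MACRO REFERENCES TERADATA BUILT-IN VIEW
--** DBC.Software_Event_LogV, WHICH HAS NO SNOWFLAKE EQUIVALENT **
--REPLACE MACRO list_events AS (
--    SELECT TheDate, Event FROM DBC.Software_Event_LogV;
--);
```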
When a macro contains a mix of supported and unsupported DBC references, the entire macro is still commented out because partial conversion would produce a broken procedure.
Input Code (mixed references):¶
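An illustrative sketch of a mixed macro (names are assumptions): one query uses a supported view (DBC.Tables), the other an unsupported one.

```sql
-- Hypothetical macro mixing a supported DBC reference with an
-- unsupported one.
REPLACE MACRO mixed_report AS (
    SELECT TableName FROM DBC.Tables WHERE DatabaseName = 'demo_db';
    SELECT TheDate, Event FROM DBC.Software_Event_LogV;
);
```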
Generated Code:¶
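A sketch of the corresponding output shape (illustrative): the whole macro, including the convertible query, is commented out.

```sql
-- Even the query against the supported view is commented out, because
-- partial conversion would produce a broken procedure.
--REPLACE MACRO mixed_report AS (
--    SELECT TableName FROM DBC.Tables WHERE DatabaseName = 'demo_db';
--    SELECT TheDate, Event FROM DBC.Software_Event_LogV;
--);
```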
Best Practices¶
- Review the commented-out macro to determine whether the underlying monitoring or diagnostic need can be addressed through Snowflake-native features such as INFORMATION_SCHEMA, ACCOUNT_USAGE, or QUERY_HISTORY().
- If only part of the macro logic depends on the unsupported view, consider splitting it into separate procedures: one for the convertible queries and one for the Teradata-specific monitoring logic.
- If you need more support, you can email us at snowconvert-support@snowflake.com