SnowConvert: Best practices¶
1. Extraction¶
We highly recommend you use our scripts to extract your workload:
Teradata: Click here (https://github.com/Snowflake-Labs/SC.DDLExportScripts/blob/main/Teradata/README.md).
Oracle: Click here (https://github.com/Snowflake-Labs/SC.DDLExportScripts/blob/main/Oracle/README.md).
SQLServer: Click here (https://github.com/Snowflake-Labs/SC.DDLExportScripts/blob/main/SQLServer/README.pdf).
Redshift: Click here.
2. Preprocess¶
We highly recommend you run the Preprocess Script before starting an assessment or a conversion; it improves results by performing the following tasks:
Creates a single file for each top-level object.
Organizes the files into a defined folder hierarchy (the default is: Database Name -> Schema Name -> Object Type).
Generates an inventory report that provides information on all the objects in the workload.
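As an illustration, the default hierarchy described above would produce a layout like the following; the database, schema, and object names here are hypothetical placeholders, not output from the actual script:

```shell
# Sketch of the default output layout (Database Name -> Schema Name -> Object Type);
# all names below are hypothetical placeholders.
mkdir -p SampleDB/dbo/table SampleDB/dbo/view
touch SampleDB/dbo/table/Orders.sql SampleDB/dbo/view/OrderSummary.sql
# One file per top-level object:
find SampleDB -name '*.sql'
```

Each `.sql` file contains exactly one top-level object, which makes the workload easier to assess and convert incrementally.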
2.1 Download¶
Please click here (https://sctoolsartifacts.blob.core.windows.net/tools/extractorscope/standardize_sql_files?sp=rl&st=2023-08-10T22:52:55Z&se=2025-08-11T06:52:55Z&spr=https&sv=2022-11-02&sr=c&sig=zJuuojUcLJ6XHpNEmiN%2F36urHcetW1vJtYy%2F4gBF53A%3D) to download the binary of the script for macOS (make sure to follow the setup in section 2.3).
Please click here (https://sctoolsartifacts.blob.core.windows.net/tools/extractorscope/standardize_sql_files.exe?sp=r&st=2024-07-08T19:51:50Z&se=2025-07-09T03:51:50Z&spr=https&sv=2022-11-02&sr=b&sig=nQ3UUIyfXoLwwP%2BSZpH4qkUEtwAAtRoWGfOijZCbKDU%3D) to download the binary of the script for Windows.
2.2 Description¶
The following information is needed to run the script:
| Script Argument | Example Value | Required | Usage |
| --- | --- | --- | --- |
| Input folder | /home/user/extracted_ddls | Yes | { -i \| ifolder= } |
| Output folder | /home/user/processed_extracted_ddls | Yes | { -o \| ofolder= } |
| Database name | sampleDataBase | Yes | { -d \| dname= } |
| Database engine | Microsoft SQL Server | Yes | { -e \| dengine= } |
| Output folder structure | Database name, top level object type and schema | No | [ { -s \| structure= } ] |
| Pivot tables generation | Yes | No | [ -p ] |
Note
The supported values for the database engine argument (-e) are: oracle, mssql, and teradata.
Note
The supported values for the output folder structure argument (-s) are: database_name, schema_name and top_level_object_name_type.
When specifying this argument, the values must be separated by commas (e.g., -s database_name,top_level_object_name_type,schema_name).
This argument is optional; when it is not specified, the default structure is the following: Database name, top-level object type and schema name.
Note
The pivot tables generation parameter (-p) is optional.
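Putting the arguments together, a full invocation might look like the following. The paths and database name are placeholders, and the -s and -p flags are optional; this assumes the binary was downloaded and made executable as described in section 2.3:

```shell
# Hypothetical invocation with all arguments (placeholder paths and names).
./standardize_sql_files \
  -i "/home/user/extracted_ddls" \
  -o "/home/user/processed_extracted_ddls" \
  -d sampleDataBase \
  -e mssql \
  -s database_name,top_level_object_name_type,schema_name \
  -p
```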
2.3 Setup the binary for Mac¶
Set the binary as an executable:
chmod +x standardize_sql_files
Run the script by executing the following command:
./standardize_sql_files
If this is the first time running the binary, a macOS security warning will pop up.
Click OK.
Open Settings -> Privacy & Security -> Click Allow Anyway
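Alternatively, if you prefer the command line, clearing the quarantine attribute that macOS places on downloaded files usually has the same effect; this is a general macOS technique, not something specific to this script:

```shell
# Remove the macOS quarantine attribute from the downloaded binary
# so Gatekeeper allows it to run (macOS only).
xattr -d com.apple.quarantine standardize_sql_files
```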
Running the script¶
Run the script using the following format:
Mac format
./standardize_sql_files -i "input path" -o "output path" -d Workload1 -e teradata
Windows format
standardize_sql_files.exe -i "input path" -o "output path" -d Workload1 -e teradata
If the script is successfully executed the following output will be displayed:
Splitting process completed successfully!
Report successfully created!
Script successfully executed!