Snowflake Migration Skill

The Snowflake Migration Skill is an AI-powered skill for Cortex Code that guides you through an end-to-end database migration to Snowflake. It provides a conversational, interactive workflow — from connecting to your source database through code conversion, deployment, data migration, and validation.

Migrations take time. A full migration — especially for large workloads with hundreds of objects — is not completed in a single session. The skill automatically tracks your progress at every step. Each time you start a session, it reads your project state and picks up exactly where you left off. There is no need to start over.


Why use the Migration Skill

Cortex Code on its own is a capable coding agent, but a database migration means coordinating dozens of tools, multiple stages, and hundreds of objects over days or weeks. The Migration Skill adds the structure, automation, and domain expertise that a general-purpose agent doesn't have.

What the skill gives you

| Benefit | What it means for you |
| --- | --- |
| Guided end-to-end workflow | You don't need to know the right sequence. The skill moves through connection, extraction, conversion, assessment, deployment, data migration, and validation automatically. |
| Session persistence | Close your terminal and come back tomorrow. The skill picks up exactly where you left off — no repeated setup, no lost progress. |
| SnowConvert integration | Source SQL is translated deterministically by SnowConvert before AI touches it. You start from a high-quality baseline, not a best-effort LLM rewrite. |
| Dependency-aware deployment | The skill analyzes object dependencies and builds deployment waves so objects are deployed in the right order. You don't manually sort hundreds of tables and views. |
| Two-sided testing | Functions and procedures are tested against source-side baselines automatically. Failures trigger a fix loop — the agent diagnoses, patches, and re-tests until the output matches. |
| Reusable fix rules | Every correction you make can be extracted into a rule and propagated across the entire project. The skill gets smarter as you go. |
| Zero setup | All dependencies (Python packages, SnowConvert AI, ODBC drivers) are installed automatically the first time the skill runs. |

Skill vs. plain Cortex Code

|  | Migration Skill | Plain Cortex Code |
| --- | --- | --- |
| Structured multi-stage workflow | Yes — six stages from connect to migrate | No — you drive every step manually |
| Automatic state tracking | Yes — resumes across sessions | No — you restart context each session |
| SnowConvert deterministic conversion | Yes — integrated | No — you run it yourself and import results |
| Source database connectivity | Yes — connects, extracts, and migrates data | No — no built-in database connectors |
| Deployment wave planning | Yes — dependency analysis and interactive wave editor | No — you plan deployment order manually |
| Automated testing loop | Yes — baseline capture, two-sided validation, auto-fix | No — you write and run tests yourself |
| Reusable fix rules | Yes — extract, search, apply, propagate | No — fixes are one-off |


How people use the skill

There are three common ways to start a migration. The skill adapts to each.

Start a new migration (greenfield)

You have a source database and want to migrate it to Snowflake from scratch. The skill walks you through every stage: connect to the source, extract objects, convert code, assess the workload, deploy, migrate data, and validate.

Use the migration-guide skill to migrate a database

Resume an in-progress migration

You already started a migration in a previous session — maybe days or weeks ago. The skill reads your project state and picks up at the exact point where you stopped. No re-extraction, no re-conversion.

Continue

Import an existing migration

You already have .sql files from a previous SnowConvert run or another source. The skill imports them into a project and picks up at the conversion or deployment stage.

Import SQL files from ./my-exported-scripts/

What you can do

The skill guides you through the full migration lifecycle and lets you jump directly to any stage at any time. You can pause and resume across sessions, and multiple users can collaborate on the same project — the skill tracks which code units each person is working on so effort isn't duplicated.

| Capability | Description |
| --- | --- |
| Connect to source | Set up a connection to your source database with credentials stored securely for reuse |
| Extract source code | Pull DDL and stored procedures directly from a live database, or import local .sql files |
| Convert code | Translate source SQL to Snowflake-compatible SQL via SnowConvert, with a full EWI report |
| AI-assisted conversion | AI explains remaining conversion issues, suggests fixes, and applies them interactively |
| Assess workloads | Generate an interactive HTML report covering deployment waves, object exclusions, dynamic SQL patterns, and SSIS/Informatica ETL analysis |
| Deploy objects | Deploy converted tables, views, functions, and procedures to Snowflake wave by wave |
| Migrate data | Copy rows from source tables to Snowflake with automatic row-count validation |
| Test functions and procedures | Capture source-side baselines and run two-sided validation to confirm output equivalence |
| Fix and rule engine | Create reusable fix rules from corrections you make and propagate them across the project automatically |
| Convert ETL pipelines | Translate SSIS packages and Informatica workflows to dbt, using deterministic conversion with optional AI-assisted remediation |
| Validate data | Compare row counts and data between source and Snowflake tables after migration to confirm completeness |
| Repoint reports | Repoint Power BI reports to use Snowflake as the data source |
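The row-count side of data validation can be sketched as follows. This is an illustrative sketch only — the function and table names are hypothetical, not the skill's actual implementation:

```python
# Sketch of a post-migration row-count check (hypothetical helper;
# the skill's real validation also compares data, not just counts).

def validate_row_counts(source_counts: dict[str, int],
                        target_counts: dict[str, int]) -> list[str]:
    """Return descriptions of tables whose Snowflake count differs from the source."""
    mismatches = []
    for table, expected in source_counts.items():
        actual = target_counts.get(table, 0)  # missing table counts as 0 rows
        if actual != expected:
            mismatches.append(f"{table}: source={expected}, snowflake={actual}")
    return mismatches

source = {"dbo.FactInternetSales": 60398, "dbo.DimCustomer": 18484}
target = {"dbo.FactInternetSales": 60398, "dbo.DimCustomer": 18480}
print(validate_row_counts(source, target))
# → ['dbo.DimCustomer: source=18484, snowflake=18480']
```

Any mismatch flags the table for re-migration or investigation before the wave is considered complete.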


Supported source systems

Not all capabilities are available for all source systems. The following table shows what is supported today and what is coming soon.

| Capability | SQL Server | Redshift | Teradata | Oracle | Other source systems |
| --- | --- | --- | --- | --- | --- |
| Code extraction | Yes | Yes | Planned | Planned | — |
| Deterministic code conversion | Yes | Yes | Yes | Yes | Yes |
| AI conversion and verification | Yes | Yes | — | — | — |
| Code deploy | Yes | Yes | Planned | Planned | — |
| SSIS to dbt (deterministic) | Yes | Yes | Yes | Yes | Yes |
| Informatica to dbt (deterministic) | Yes | Yes | Yes | Yes | Yes |
| AI conversion of SSIS to dbt | Yes | Yes | Yes | Yes | Yes |
| AI conversion of Informatica to dbt | Yes | Yes | Yes | Yes | Yes |
| Cloud data migration | Yes | Yes | Planned | Planned | — |
| Cloud data validation | Yes | Yes | Yes | Planned | — |
| Testing framework | Yes | Yes | Planned | Planned | — |
| Testing with synthetic data | Planned | Planned | — | — | — |
| AI assessment (via Cortex Code) | Yes | Yes | Yes | Planned | — |
| Power BI report repointing | Yes | Yes | Yes | Yes | Synapse and PostgreSQL only |

Other dialects with deterministic conversion support: Azure Synapse, Sybase IQ, Google BigQuery, Greenplum, Netezza, PostgreSQL, Spark SQL, Databricks SQL, Vertica, Hive, IBM DB2

Available extraction scripts: Teradata, SQL Server, Synapse, Oracle, Redshift, Netezza, Vertica, DB2, Hive, BigQuery, Databricks, Sybase IQ


Prerequisites

Before starting a migration, ensure you have:

  • Cortex Code installed (the migration skill is bundled).

  • A Snowflake account with a connection configured in ~/.snowflake/connections.toml.

  • A source database (SQL Server or Redshift) accessible from your machine.

All other dependencies (uv, scai, Python packages) are installed automatically when the skill runs for the first time.
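A minimal entry in ~/.snowflake/connections.toml might look like the following. Every value here is a placeholder — substitute your own account identifier, user, warehouse, and role, and use whichever authenticator your account requires:

```toml
# ~/.snowflake/connections.toml — placeholder values, not real credentials
[migration]                           # connection name you give during setup
account = "myorg-myaccount"
user = "MIGRATION_USER"
authenticator = "externalbrowser"     # or password-based authentication
warehouse = "MIGRATION_WH"
role = "MIGRATION_ROLE"
```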


Quick Start

1. Launch Cortex Code:

cortex

2. Start a migration:

Use the migration-guide skill to migrate a database

The migration-guide skill activates automatically, confirms plugin installation with you, registers the plugin, and walks you through the full migration workflow. If a project already exists, the agent picks up where you left off.


Migration workflow

The skill guides you through six stages. Each session starts by detecting your current progress and resuming automatically.

| Stage | Name | What happens |
| --- | --- | --- |
| 1 | Connect | Set up a connection to your source database |
| 2 | Init | Create a local migration project |
| 3 | Register | Extract DDL and code from the source, or import local .sql files |
| 4 | Convert | Translate source SQL to Snowflake-compatible SQL via SnowConvert |
| 5 | Assess | Generate an interactive report covering waves, exclusions, dynamic SQL, and ETL |
| 6 | Migrate | Deploy objects, migrate data, validate output, and fix errors — wave by wave |

Stage 6 is where the bulk of the work happens. AI-driven testing, conversion remediation, and iterative fix loops run here — often across multiple sessions and days. This is also the stage where collaboration pays off most: multiple users can work on the same project simultaneously, each picking up different code units while the skill coordinates to prevent duplicated effort.

Every session shows a live progress checklist so you always know where you stand:

✅  1. Connect             — Connected to SQL Server
✅  2. Init                — Project initialized
✅  3. Register            — 342 objects registered
◐   4. Initial Conv        — 280/342 converted
⬚   5. Assess              — Not run
⬚   6. Migrate Objects     — 0/120 tables deployed

Skills Reference

The skill is organized as a skill tree. The root skill detects your project state and delegates to the right sub-skill. You can also invoke any skill directly by describing what you want.

Setup Skills (Stages 1–5)

connection

Walks you through connecting to your source database. The agent collects credentials, tests the connection, and saves it for reuse across sessions. Supports:

  • SQL Server — configures ODBC driver, host, port, and authentication.

  • Amazon Redshift — configures host, port, database, and IAM or password authentication.

register-code-units

Gets source code into the migration project. Two paths are available:

| Path | When to use |
| --- | --- |
| Extract from database | You have a live source connection and want the agent to pull DDL and object code directly |
| Import local files | You already have .sql files on disk and want to import them into the project |

convert

Runs SnowConvert to translate your source SQL (T-SQL or Redshift SQL) into Snowflake-compatible SQL. After conversion, the agent presents:

  • Total objects converted successfully.

  • EWI (Errors, Warnings, and Issues) summary broken down by severity (errors, warnings, informational).

  • A list of objects that require manual review.

assessment

Generates an interactive multi-tab HTML report. The assessment includes four analyses that can be run individually or together:

| Analysis | What it does |
| --- | --- |
| Deployment Waves | Analyzes object dependencies to produce an ordered deployment sequence. Objects within a wave have no inter-dependencies; waves are ordered so dependencies are always deployed first. |
| Object Exclusion | Identifies objects that do not need migration: temporary tables, staging objects, deprecated objects, and test artifacts. Reduces scope before deployment. |
| Dynamic SQL Analysis | Classifies and scores dynamic SQL patterns in your converted code. Identifies patterns that Snowflake handles natively, patterns requiring manual rewrite, and patterns with elevated migration complexity. |
| ETL/SSIS Assessment | Analyzes SSIS packages individually: classifies each package (Ingestion, Transformation, Export, Orchestration, Hybrid), maps control and data flow, and estimates migration effort. |

The report is generated as a single self-contained HTML file. You can iterate on the wave plan interactively — for example, reprioritizing objects, adjusting wave sizes, or relocating specific objects — before locking it for deployment.
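The wave-building idea can be sketched as a layered topological sort: every object whose dependencies are already deployed joins the next wave. This is an illustrative sketch under assumed data shapes — the skill's actual wave algorithm and object names may differ:

```python
# Sketch of dependency-aware wave planning (illustrative only).
# Objects in the same wave have no dependencies on each other;
# every dependency lands in an earlier wave.

def build_waves(deps: dict[str, set[str]]) -> list[list[str]]:
    """deps maps each object to the set of objects it depends on."""
    waves = []
    remaining = {obj: set(d) for obj, d in deps.items()}
    deployed: set[str] = set()
    while remaining:
        # Ready = everything whose dependencies are all deployed already.
        ready = sorted(o for o, d in remaining.items() if d <= deployed)
        if not ready:
            raise ValueError("circular dependency detected")
        waves.append(ready)
        deployed.update(ready)
        for o in ready:
            del remaining[o]
    return waves

deps = {
    "dim_customer": set(),
    "dim_date": set(),
    "fact_sales": {"dim_customer", "dim_date"},
    "v_sales_summary": {"fact_sales"},
}
print(build_waves(deps))
# → [['dim_customer', 'dim_date'], ['fact_sales'], ['v_sales_summary']]
```

The interactive wave editor then lets you reshape this ordering (wave sizes, priorities, exclusions) without violating the dependency constraint.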

Migration Skills (Stage 6)

migrate-objects

The main deploy loop. Processes all objects in the current wave in dependency order:

| Object type | What happens |
| --- | --- |
| Tables | Deployed to Snowflake, then data is migrated from the source. |
| Views | Deployed to Snowflake. Blocked views retry after their dependent functions/procedures pass. |
| Functions & Procedures | Deployed, tested against source output, and fixed if tests fail. The loop repeats until tests pass or the user decides to skip. |

After each wave completes, the agent automatically advances to the next wave.

baseline-capture

Captures the expected output of source stored procedures and functions for use as test baselines. Two approaches are supported:

| Approach | When to use |
| --- | --- |
| Query Logs | You have CSV logs of real EXEC or CALL statements from your source system. The agent parses these to extract parameters and expected outputs. |
| AI-Assisted | No logs are available. A swarm of specialized agents generates test cases covering business logic, data-driven scenarios, and edge cases by analyzing the source SQL. |

Baselines are stored locally and uploaded to Snowflake so they can be used for two-sided validation (source output vs. Snowflake output) during the migrate-objects loop.
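The two-sided check reduces to comparing a captured source baseline against the converted object's Snowflake output. The sketch below is a simplified illustration — real baselines live in Snowflake and cover result sets, parameters, and types, not plain Python tuples:

```python
# Sketch of two-sided output validation (simplified; hypothetical shape).
# A baseline is the source system's output for a given set of parameters.

def compare_outputs(baseline_rows: list[tuple],
                    snowflake_rows: list[tuple]) -> bool:
    """Order-insensitive comparison of source baseline vs. Snowflake output."""
    return sorted(baseline_rows) == sorted(snowflake_rows)

baseline = [(1, "open"), (2, "closed")]   # captured from the source system
converted = [(2, "closed"), (1, "open")]  # produced by the migrated procedure
print(compare_outputs(baseline, converted))
# → True: same rows, order doesn't matter
```

When the comparison fails, the migrate-objects fix loop diagnoses the difference, patches the converted code, and re-runs the test.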

rule-engine

Manages reusable migration rules stored in Snowflake. Rules encode known source-to-Snowflake fix patterns and are shared across all objects in the project. Each rule can operate in two modes:

| Mode | How it works |
| --- | --- |
| Regex | A regex find-and-replace applied mechanically to SQL files |
| AI-guided | The rule provides context and strategy; the AI interprets and applies it |
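A regex-mode rule can be pictured as a pattern/replacement pair applied mechanically to SQL text. The rule format below is illustrative — real rules are stored in Snowflake and managed by the rule engine, not hand-written dicts — though GETDATE() to CURRENT_TIMESTAMP() is a genuine T-SQL-to-Snowflake rewrite:

```python
import re

# Sketch of a regex-mode fix rule (illustrative format, not the real schema).
rule = {
    "name": "getdate-to-current_timestamp",
    "pattern": r"\bGETDATE\s*\(\s*\)",
    "replacement": "CURRENT_TIMESTAMP()",
}

def apply_rule(sql: str, rule: dict) -> str:
    """Apply one regex rule mechanically to a SQL string."""
    return re.sub(rule["pattern"], rule["replacement"], sql, flags=re.IGNORECASE)

sql = "SELECT OrderID, GETDATE() AS LoadTime FROM Orders;"
print(apply_rule(sql, rule))
# → SELECT OrderID, CURRENT_TIMESTAMP() AS LoadTime FROM Orders;
```

AI-guided rules carry prose context instead of a pattern, so the agent interprets them case by case rather than substituting text mechanically.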

The rule engine has four sub-capabilities:

| Sub-skill | What it does |
| --- | --- |
| search | Scans a SQL file against all rules using regex pattern matching and Cortex semantic search. Returns matched rules ranked by relevance. |
| apply | Applies matched rules to local SQL files. Regex rules are applied automatically; AI-mode rules are shown for review before applying. Supports single-file and batch application. |
| extract | Creates a new reusable rule from a fix you just made. Works from an interactive before/after comparison or retroactively from git history. |
| propagate | Given a rule, finds every code unit in the project it applies to (via reverse regex + semantic search), then hands off to batch apply. |

Rules accumulate over the lifetime of the project. Every time the agent fixes an object and extracts a rule, that rule becomes available to all subsequent objects — reducing manual effort as the migration progresses.


What You Can Ask

You do not need to follow the prescribed path. You can ask for any capability at any time.

Status and navigation

| Prompt | What happens |
| --- | --- |
| "What is the current state?" | Shows the progress checklist |
| "What should I work on next?" | Returns the next dependency-ready object |
| "Continue" | Picks up the prescribed migration path |

Setup

"Connect to my SQL Server database"
"Extract objects from the source"
"Import SQL files from ./my-scripts/"
"Convert my source code"

Assessment

"Run a full assessment"
"Generate deployment waves"
"I want a maximum of 30 objects per wave"
"Prioritize all Payroll objects in Wave 1"
"Identify temporary and staging objects"
"Analyze dynamic SQL patterns"
"Assess my SSIS packages"

Migration

"Deploy tables"
"Migrate data"
"Deploy and test the next function"
"Capture baselines for dbo.GetCustomerOrders"

Rule engine

"Search rules for this file"
"Apply all matched rules"
"Extract a rule from my last fix"
"Propagate this rule across the project"
"Show me all rules"

Example Workload

You can use AdventureWorksDW (https://learn.microsoft.com/en-us/sql/samples/adventureworks-install-configure) as an example source database to try the skill end-to-end. Substitute any SQL Server or Redshift database you have access to — the skill adapts to whatever source you connect.


Troubleshooting

| Problem | Resolution |
| --- | --- |
| migration-guide skill doesn't appear | Ensure you're on the latest version of Cortex Code. Run cortex --version to confirm. |
| Plugin installation fails during skill setup | Check that python3 is available on your PATH. The skill installs dependencies via Homebrew (macOS/Linux) or winget (Windows). |
| scai not found after skill install | Run the install hook manually: migration-plugin/hooks/install-dependencies. Or install directly: brew install --cask snowflakedb/snowconvert-ai/snowconvert-ai. |
| Snowflake connection errors | Verify your connection in ~/.snowflake/connections.toml and confirm the connection name matches what you provided during setup. |
| Agent seems lost | Say "What is the current state?" — the agent re-reads project status and resets context. |


Support

For help with the migration skill, contact: snowconvert-support@snowflake.com