Run and schedule Notebooks in Workspaces

Scheduling notebooks in Workspaces

You can automate your data science and ML workflows by scheduling notebooks in Workspaces to run at recurring intervals. Scheduled runs use Snowflake Tasks and execute notebooks with all dependencies in a consistent, top-down sequence.

Note

Scheduling is not currently supported for notebooks in shared workspaces.

Required permissions

To schedule a notebook, the role used to create the schedule must have the following permissions and grants (a sample set of GRANT statements follows this list):

Account level

  • EXECUTE TASK on the ACCOUNT: Allows the role to run tasks associated with scheduled notebooks.

Schema level

  • CREATE TASK: Required to create the underlying Snowflake task.

  • USAGE on the target schema: For storing task metadata and notebook project objects.

Compute and warehouse permissions

  • USAGE on the compute pool used for notebook execution.

  • USAGE and MONITOR on the query warehouse used for SQL cells inside the notebook.

External access integration permissions

  • USAGE on any EAI used by the notebook.
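As an illustration only, the corresponding GRANT statements might look like the following sketch. The role and object names (notebook_scheduler_role, my_db, my_schema, my_compute_pool, my_query_wh, my_eai) are placeholders; adjust them to your environment.

-- Account level: allow the role to run tasks
GRANT EXECUTE TASK ON ACCOUNT TO ROLE notebook_scheduler_role;

-- Schema level: create the task and store task metadata and notebook project objects
GRANT USAGE ON DATABASE my_db TO ROLE notebook_scheduler_role;          -- USAGE on the parent database is typically also required
GRANT USAGE ON SCHEMA my_db.my_schema TO ROLE notebook_scheduler_role;
GRANT CREATE TASK ON SCHEMA my_db.my_schema TO ROLE notebook_scheduler_role;

-- Compute: pool for notebook execution and warehouse for SQL cells
GRANT USAGE ON COMPUTE POOL my_compute_pool TO ROLE notebook_scheduler_role;
GRANT USAGE, MONITOR ON WAREHOUSE my_query_wh TO ROLE notebook_scheduler_role;

-- External access integrations, only if the notebook uses any
GRANT USAGE ON INTEGRATION my_eai TO ROLE notebook_scheduler_role;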

Schedule a notebook

To schedule a notebook in Snowsight, follow these steps:

  1. In the navigation menu, select Projects » Notebooks.

  2. Create a new notebook or open an existing notebook to be scheduled.

  3. Select Schedule in the top-right of the notebook editor.

    • If this is the notebook’s first task, the Schedule button is a calendar icon.

    • If a schedule already exists, the icon is a calendar with a clock.

  4. Select Create Schedule.

  5. In the Schedule a Notebook Task dialog, provide the following information.

    Basic settings

    • Task name: The unique name for the scheduled task. The default name is {notebook-name}_task_# but can be updated if necessary.

    • Owner role: The Snowflake role under which the task executes. Select a role with the required permissions to execute all operations performed by the scheduled notebook. This role must have permissions to:

      • Read/write the database objects the notebook uses.

      • Access warehouses, compute pools, and integrations.

      • Create/update the task and project objects.

    • Location: The database and schema where the task object and associated notebook project object will be created. Choose a schema where your role has CREATE TASK, USAGE, and relevant privileges.

    • Frequency: How often the notebook should run. Choose from: Hourly, Daily, Weekly, Monthly, or Custom (Cron scheduling). All execution times use your local time zone.

    Advanced settings

    • Project name: A unique name for the notebook’s project container that Snowflake creates for task execution. If not edited, Snowflake provides a default name.

    • Parameter (optional): Key-value parameters that are passed to the notebook at runtime and appear as command-line arguments (in sys.argv). Parameters are useful for passing dates, environment flags, thresholds, or model versions.

    Runtime variant

    • CPU: Uses a CPU Container Runtime environment and runs on a CPU compute pool (for example, the automatically provisioned SYSTEM_COMPUTE_POOL_CPU).

    • GPU: Uses a GPU Container Runtime environment that includes GPU-accelerated libraries and runs on a GPU compute pool (such as SYSTEM_COMPUTE_POOL_GPU).

    • Python version: The Python version used during task execution.

    • Runtime version: The base Container Runtime image. Choosing the correct runtime version ensures your notebook runs consistently between development and scheduled execution.

    • Compute pool: The compute pool that will execute the notebook task. Ensure that the compute pool has capacity (free nodes) at the time of the scheduled execution. To prevent scheduled runs from failing, we recommend using a dedicated compute pool so that other SPCS services don’t take up its full capacity.

    • Query warehouse: The Snowflake warehouse used for all SQL queries inside the notebook.

    • External access integrations (optional): Defines which Snowflake External Access Integrations (EAIs) the notebook can use. EAIs are required if your notebook accesses external APIs, third-party services, or cloud storage outside of Snowflake’s internal stages. If no EAIs are listed, your selected role does not own or have privileges on any integrations.

  6. Review the schedule preview, and select Create.
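The frequency you choose corresponds to a schedule on the underlying Snowflake task; with the Custom option you supply a standard cron expression. In the task’s SCHEDULE clause (shown in the SQL example later on this page), these take forms such as the following illustrative examples, where the time zone is a placeholder:

SCHEDULE = 'USING CRON 0 8 * * * America/Los_Angeles'     -- every day at 08:00
SCHEDULE = 'USING CRON 30 6 * * MON America/Los_Angeles'   -- every Monday at 06:30
SCHEDULE = 'USING CRON 0 2 1 * * America/Los_Angeles'      -- 02:00 on the first day of each month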

If creating the task fails because the role lacks the required permissions, the dialog remains open and displays the relevant error messages.
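To catch permission problems up front, you can verify that the owner role can see the objects the notebook needs; SHOW commands list only objects on which the current role has privileges. A minimal check from a SQL worksheet, assuming a hypothetical role name, might be:

USE ROLE notebook_scheduler_role;   -- hypothetical owner role

SHOW COMPUTE POOLS;   -- should list the compute pool selected for execution
SHOW WAREHOUSES;      -- should list the query warehouse
SHOW INTEGRATIONS;    -- should list any external access integrations the notebook uses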

View scheduled notebook runs

You can view scheduled tasks in two places:

From the notebook

  1. In the navigation menu, select Projects » Notebooks.

  2. Open a scheduled notebook.

  3. Select Schedule in the top-right of the notebook editor. A popover displays the following information:

  • All scheduled runs for this notebook. To view or interact with scheduled runs, you must use a role with access to the database and schema where the schedule and project object were created.

  • The next scheduled run time.

  • Status of past runs. Hover over a status indicator to see details such as Query ID, last run time, duration, and status.

From the Actions menu

  • Open Run History: Opens the notebook’s project object showing all past runs, including status, duration, and results. Selecting a run’s result opens the executed notebook with its output.
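The same run information is also available from SQL through the INFORMATION_SCHEMA.TASK_HISTORY table function. A minimal sketch, assuming a task named MY_NOTEBOOK_TASK_1 in database MY_DB (use the task name shown in the schedule popover):

-- Most recent runs of the scheduled notebook task, newest first
SELECT name, state, scheduled_time, completed_time, query_id, error_message
FROM TABLE(MY_DB.INFORMATION_SCHEMA.TASK_HISTORY(TASK_NAME => 'MY_NOTEBOOK_TASK_1'))
ORDER BY scheduled_time DESC
LIMIT 20;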

Deploy updates to scheduled notebook tasks

After editing a notebook, you must deploy your changes before scheduled runs use the updated version. Deployment ensures reproducibility and prevents scheduled tasks from running code that differs from what was last deployed. When a notebook has changes that require deployment, the Schedule (calendar) icon displays a clock indicator.

After you modify code or cells, the icon indicates that there are undeployed changes. To deploy them:

  • Select Deploy changes.

    Snowflake then updates the associated notebook project object, and all scheduled tasks for that notebook use the newly deployed version for the next run.

Find a notebook project object in the Object Explorer

Each scheduled notebook automatically creates a notebook project object that stores its deployed code, execution history, and artifacts.

To locate a notebook project object in Snowsight, follow these steps:

  1. In the navigation menu, select Catalog » Database Explorer.

  2. Navigate to Database > Schema > Notebook Project Objects to view all project objects in that schema.

Alternatively, you can:

  1. Open the relevant notebook.

  2. Select Schedule in the top-right corner.

  3. From the drop-down menu, select Run history to open the associated notebook project object.

View the notebook’s run history

If any step fails during execution, the notebook run stops to prevent partial or inconsistent downstream results.

To view run history, follow these steps:

  1. In the navigation menu, select Projects » Notebooks.

  2. Open the notebook whose run history you want to review.

  3. In the top-right corner of the notebook editor, select the Schedule (calendar) icon.

  4. Select View run history from the drop-down menu.

Run History shows start and end times, run status, and error details such as logs and metrics for the notebook’s project object.

Schedule a notebook using Tasks

  1. In the navigation menu, select Projects » Workspaces.

  2. Run the following command in a SQL file or worksheet:

CREATE OR REPLACE TASK <database_name>.<schema_name>.<task_name>
    WAREHOUSE = <warehouse_name>
    SCHEDULE = 'USING CRON 10 13 * * * America/Los_Angeles'
    -- CRON format: <minute> <hour> <day_of_month> <month> <day_of_week> <timezone>
AS
    -- Execute a notebook stored within a Snowflake project.
    EXECUTE NOTEBOOK
        PROJECT = '<database_name>.<schema_name>.<project_name>'
        -- Notebook file to run
        MAIN_FILE = '<notebook_file_name>.ipynb'
        -- Compute pool used to run the notebook
        COMPUTE_POOL = '<compute_pool_name>'
        -- Runtime environment (Python version, CPU/GPU, etc.)
        RUNTIME = '<runtime_version>'
        -- Warehouse used for SQL statements inside the notebook
        QUERY_WAREHOUSE = <query_warehouse_name>;

After creating this task, run the following command to activate the schedule:

ALTER TASK <database_name>.<schema_name>.<task_name> RESUME;
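Newly created tasks start out suspended, which is why the RESUME statement is needed. A few related statements that can help when testing or managing the schedule, using the same placeholder names as the task created above:

-- Trigger a one-off run immediately, outside the schedule
EXECUTE TASK <database_name>.<schema_name>.<task_name>;

-- Pause the schedule without dropping the task
ALTER TASK <database_name>.<schema_name>.<task_name> SUSPEND;

-- List tasks in the schema, including their state and schedule
SHOW TASKS IN SCHEMA <database_name>.<schema_name>;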

Note

To learn more about credit usage, idle timeout behavior, and notebook service management, see Setting up compute and Idle timeout.
