
Creating Tables in Snowflake: Types, Examples & Best Practices

Learn how to create different types of tables in Snowflake, including permanent, temporary, external, and dynamic tables. Includes step-by-step examples and best practices.
Feb 20, 2025 · 9 min read

Snowflake is a powerful cloud-based data warehousing platform that helps organizations store, manage, and analyze their data efficiently. Unlike traditional data warehouses that require complex hardware setups, Snowflake runs entirely in the cloud, making it easier for businesses to scale their data operations as needed.

Creating tables in Snowflake is a fundamental skill that forms the foundation of data management. Tables are where you store your structured data, making it possible to run queries, generate reports, and derive valuable insights from your information. Whether you’re working with customer data, sales figures, or analytical results, understanding how to create and manage tables in Snowflake is essential.

In this tutorial, you’ll learn everything you need to know about creating tables in Snowflake. We’ll cover the different types of tables, show you step-by-step examples, and share best practices for table design. By the end, you’ll be confident in choosing the right table type for your needs and creating tables that follow industry standards for performance and maintainability.

Understanding Table Types in Snowflake

Before we dive into code examples, let's explore the different table types in Snowflake. Each serves a unique purpose and has distinct characteristics in terms of data persistence, accessibility, and storage location.

Permanent tables

Permanent tables are the standard table type in Snowflake. They persist until explicitly dropped and are stored within Snowflake’s internal storage. These tables are ideal for storing your core business data that needs to be retained long-term and accessed across multiple sessions. The data in permanent tables remains intact between sessions and can be accessed by any authorized user.

Temporary tables

Temporary tables are session-specific tables that are automatically dropped when your current session ends. They serve as workspace tables for intermediate processing steps, helping organize complex data transformations without cluttering your permanent storage. The data in temporary tables is only visible to the session that created them and is cleaned up automatically when the session closes.

External tables

External tables provide a way to query data that resides outside of Snowflake’s storage. They act as a metadata layer that maps to data in external storage locations like AWS S3 or Azure Blob Storage. Rather than ingesting the data, external tables create a reference to it, allowing you to query external data directly. This approach is particularly valuable when working with data that needs to remain in its source location or requires frequent updates from external systems.

Dynamic tables

Dynamic tables are a powerful feature in Snowflake that automatically maintain up-to-date results of a query. They act as materialized views that refresh automatically based on changes in the underlying source tables. When you create a dynamic table, you define it using a SELECT statement, and Snowflake automatically manages the refresh process to ensure the data stays current.

Dynamic tables are particularly useful for:

  • Building real-time dashboards that need fresh data without manual refreshes
  • Maintaining aggregated or transformed versions of source data that update automatically
  • Creating dependent data models where downstream tables need to reflect upstream changes
  • Optimizing query performance by pre-computing complex transformations

Key features of dynamic tables include:

  • Configurable refresh intervals (from 1 minute to 24 hours)
  • Automatic dependency tracking of source tables
  • Target lag time settings to control data freshness
  • Incremental processing to minimize compute costs
  • Built-in monitoring and optimization capabilities

Each table type has its distinct purpose in data architecture. Permanent tables form the foundation of your data warehouse, temporary tables facilitate data processing workflows, external tables enable integration with external data sources, and dynamic tables provide automated, real-time data transformations.

Comparing Snowflake tables 

In the table below, we’ve outlined the different types of Snowflake tables in a concise, quick-reference guide: 

| Table Type | Persistence      | Visibility   | Best For                          |
|------------|------------------|--------------|-----------------------------------|
| Permanent  | Until dropped    | Any session  | Core business data                |
| Temporary  | Session-only     | Same session | Intermediate transformations      |
| External   | External storage | Any session  | Querying external data (e.g., S3) |
| Dynamic    | Auto-refreshing  | Any session  | Real-time analytics               |

Create Tables: Setting Up Snowflake

Before running the examples in this tutorial, you’ll need access to a Snowflake environment. Here are several ways to connect to and run Snowflake commands:

1. Snowflake web interface (Snowsight)

The simplest way to get started is through Snowflake’s web interface:

  1. Log in to your Snowflake account 
  2. Navigate to Worksheets
  3. Create a new worksheet to run SQL commands
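
Once a worksheet is open, a quick sanity check confirms your session context (each of these built-in functions can also be run on its own):

-- Confirm which role, warehouse, database, and schema the session is using
SELECT CURRENT_ROLE(), CURRENT_WAREHOUSE(), CURRENT_DATABASE(), CURRENT_SCHEMA();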

2. SnowSQL (command-line tool)

For command-line access:

  1. Install SnowSQL 
  2. Connect using:
snowsql -a <account> -u <username> -d <database> -w <warehouse>

You can find your Snowflake account credentials in the following places:

  1. Account URL: Look in your welcome email or account settings
  2. Username: Provided during account setup
  3. Database: Created or assigned by your admin
  4. Warehouse: Created or assigned by your admin

For security, never share or commit these credentials. Store them securely in environment variables or a configuration file.

3. Python connection

If you’re using Python, you can use the snowflake-connector-python library:

import os

import snowflake.connector

# Read credentials from environment variables instead of hardcoding them
# (the variable names below are illustrative -- use whatever your team has set)
conn = snowflake.connector.connect(
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    warehouse="your_warehouse",
    database="your_database",
    schema="your_schema",
)

cur = conn.cursor()

# Quick connectivity check: print the Snowflake version
cur.execute("SELECT CURRENT_VERSION()")
print(cur.fetchone())

cur.close()
conn.close()

Important Security Note:

  • Never store credentials directly in your code
  • Use environment variables or secure secret management
  • For production, consider using key pair authentication
  • Follow your organization’s security policies

Now that our setup is ready, let’s create tables!
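
If you're starting in a fresh account, you may first want a scratch database and schema to hold the examples (the names below are illustrative):

-- Create and switch to a dedicated area for this tutorial
CREATE DATABASE IF NOT EXISTS demo_db;
CREATE SCHEMA IF NOT EXISTS demo_db.demo_schema;
USE SCHEMA demo_db.demo_schema;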

Creating Tables in Snowflake: Step-by-Step Examples

Example 1: Creating a simple Snowflake table

Let’s create a table to store customer information. This example demonstrates basic table creation with different data types and constraints.

-- Create a permanent table for customer data
CREATE OR REPLACE TABLE customers (
   customer_id INTEGER PRIMARY KEY,
   first_name VARCHAR(50) NOT NULL,
   last_name VARCHAR(50) NOT NULL,
   email VARCHAR(100) UNIQUE,
   date_of_birth DATE,
   created_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
);

Key points about this example:

  • CREATE OR REPLACE TABLE - Creates a new table or replaces it if it already exists
  • PRIMARY KEY - Declares the unique identifier for each record (Snowflake records this constraint but does not enforce it)
  • NOT NULL - Prevents null values in required fields (the only constraint Snowflake enforces)
  • UNIQUE - Declares that email addresses should not be duplicated (also not enforced, so deduplicate during loading)
  • TIMESTAMP_NTZ - Timestamp without time zone
  • DEFAULT - Automatically sets the creation timestamp on insert
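
To see the defaults in action, you can insert a sample row and read it back (the values are illustrative):

-- created_at is filled in automatically by the DEFAULT expression
INSERT INTO customers (customer_id, first_name, last_name, email, date_of_birth)
VALUES (1, 'Ada', 'Lovelace', 'ada@example.com', '1815-12-10');

SELECT customer_id, email, created_at FROM customers;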

Example 2: Creating a temporary table

Temporary tables are perfect for intermediate data processing. Here’s an example that creates a temporary table for analyzing customer orders:

-- Create a temporary table for order analysis
CREATE OR REPLACE TEMPORARY TABLE temp_order_analysis (
   order_date DATE,
   total_orders INTEGER,
   total_revenue DECIMAL(18,2)
)
AS
SELECT
   DATE_TRUNC('day', order_date) as order_date,
   COUNT(*) as total_orders,
   SUM(order_amount) as total_revenue
FROM orders
GROUP BY 1;

Key points about temporary tables:

  • The TEMPORARY keyword indicates that this table will only exist within the current session. This means other users cannot see or access this table.
  • Using CREATE TABLE AS (CTAS) allows you to create and populate a table in a single step by combining the table definition with a SELECT statement.
  • The temporary table is automatically dropped when your session ends, helping to keep the database clean without manual cleanup.
  • Temporary tables are useful for storing intermediate results during complex data transformations or analysis workflows.
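
For the rest of the session, the temporary table behaves like any other table. For example, to pull the ten highest-revenue days from the intermediate results:

-- Query the intermediate results without touching permanent storage
SELECT order_date, total_orders, total_revenue
FROM temp_order_analysis
ORDER BY total_revenue DESC
LIMIT 10;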

Example 3: Creating an external table in Snowflake 

This example shows how to create an external table that reads data from files stored in a cloud storage bucket:

-- Create file format for CSV files
CREATE OR REPLACE FILE FORMAT my_csv_format
   TYPE = CSV
   FIELD_DELIMITER = ','
   SKIP_HEADER = 1
   NULL_IF = ('NULL', 'null');

-- Create external stage
CREATE OR REPLACE STAGE my_external_stage
   URL = 's3://my-bucket/data/'
   CREDENTIALS = (AWS_KEY_ID = 'xxx' AWS_SECRET_KEY = 'xxx');

-- Create external table
CREATE OR REPLACE EXTERNAL TABLE ext_customer_data (
   customer_id VARCHAR AS (VALUE:c1::VARCHAR),
   purchase_date DATE AS (VALUE:c2::DATE),
   product_id VARCHAR AS (VALUE:c3::VARCHAR),
   amount DECIMAL(10,2) AS (VALUE:c4::DECIMAL(10,2))
)
WITH LOCATION = @my_external_stage/customers/
FILE_FORMAT = my_csv_format
PATTERN = '.*[.]csv';

The first statement defines a file format specification that tells Snowflake how to parse external CSV files, including the delimiter, header handling, and NULL value representations.

The second statement creates a stage object that establishes a connection to external cloud storage (in this case, S3) using the provided credentials. This stage acts as a reference point for accessing the external data files. In production, prefer a storage integration over inline credentials so that keys never appear in your SQL.

The third statement creates the external table itself, using column definitions with AS clauses to explicitly cast the raw string data from the CSV into proper Snowflake data types. The WITH LOCATION clause points to the specific path within the stage where the files reside.

The PATTERN parameter filters which files should be included using regex matching, ensuring only CSV files are processed.

Prerequisites to run this external table creation query:

1. Access to a Snowflake account with permissions to:

  • Create file formats
  • Create stages
  • Create external tables

2. Valid AWS credentials (AWS_KEY_ID and AWS_SECRET_KEY) with permissions to:

  • Read from the S3 bucket
  • List objects in the bucket

3. Source data requirements:

  • CSV files stored in an S3 bucket
  • Files follow a consistent format with four columns
  • Column data matches the specified types:
      • customer_id as string
      • purchase_date as date
      • product_id as string
      • amount as decimal

4. Network connectivity:

  • Network access to AWS S3
  • Proper security group and firewall settings

5. Compute resources:

  • Active warehouse with adequate computing capacity
  • Sufficient user credits/quota
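
Once the table exists, you can query it in place and refresh its metadata when new files land in the bucket (a sketch, assuming the objects created above):

-- Pick up files added to the stage since creation
ALTER EXTERNAL TABLE ext_customer_data REFRESH;

-- Aggregate the external data without loading it into Snowflake
SELECT product_id, SUM(amount) AS total_amount
FROM ext_customer_data
GROUP BY product_id;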

Example 4: Creating a dynamic table

This example creates a dynamic table that maintains an up-to-date summary of sales data:

-- Create a dynamic table for real-time sales analytics
CREATE OR REPLACE DYNAMIC TABLE daily_sales_summary
TARGET_LAG = '20 minutes'
WAREHOUSE = COMPUTE_WH
AS
SELECT
   DATE_TRUNC('day', sale_timestamp) as sale_date,
   product_category,
   SUM(sale_amount) as total_sales,
   COUNT(DISTINCT customer_id) as unique_customers
FROM sales_transactions
GROUP BY 1, 2;

The TARGET_LAG parameter of '20 minutes' defines the maximum allowed delay between updates to the source table and when those changes are reflected in the dynamic table. This ensures data freshness within that time window.

The WAREHOUSE parameter specifies which compute resources (in this case COMPUTE_WH) will be used to periodically refresh the dynamic table. This warehouse must have adequate capacity to handle the refresh operations.

The dynamic table automatically updates its contents when the underlying sales_transactions table changes without requiring manual intervention. This provides near real-time analytics capabilities.

By maintaining pre-aggregated daily sales summaries grouped by product category, the dynamic table enables faster querying compared to calculating these metrics on demand from the raw transaction data. This improves query performance for reporting and analytics use cases.
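
You can query the summary like any other table and check its refresh status (a sketch; the exact SHOW output columns may vary by Snowflake version):

-- Read the always-fresh aggregates
SELECT * FROM daily_sales_summary ORDER BY sale_date DESC;

-- Inspect refresh state and target lag for the dynamic table
SHOW DYNAMIC TABLES LIKE 'daily_sales_summary';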

Each of these examples demonstrates different table creation scenarios you might encounter in real-world applications. Remember to adjust column names, data types, and other parameters according to your specific needs.

Best Practices for Creating Tables in Snowflake

Let's explore some best practices to follow when creating tables in Snowflake to ensure maintainability, performance, and data quality.

1. Naming conventions

Always use clear, consistent names for your tables and columns:

-- Good naming examples
CREATE TABLE customer_orders (
   order_id INTEGER,
   customer_email VARCHAR,
   order_date DATE
);

-- Bad naming examples
CREATE TABLE Orders123 (
   ID INTEGER,
   email VARCHAR,
   dt DATE
);

Tips for naming:

  • Use lowercase letters and underscores (snake_case)
  • Avoid spaces and special characters
  • Make names descriptive but not too long
  • Use singular form for table names (customer not customers)
  • Add prefixes for different environments (dev_, test_, prod_)

2. Column data types

Choose the right data type to save storage and improve performance:

-- Good data type choices
CREATE TABLE products (
   product_id INTEGER,
   price DECIMAL(10,2),    -- For money values
   name VARCHAR(100),      -- Set a reasonable limit
   description TEXT,       -- For very long text
   is_active BOOLEAN      -- Instead of VARCHAR or INTEGER
);

Common data types to use:

  • NUMBER/INTEGER: For whole numbers
  • DECIMAL/NUMERIC: For precise decimal numbers (especially money)
  • VARCHAR: For text with a known maximum length
  • TEXT: For unlimited length text
  • TIMESTAMP_NTZ: For dates and times (no timezone)
  • BOOLEAN: For true/false values

3. Using clustering keys

Add clustering keys to improve query performance:

CREATE TABLE sales (
   sale_date DATE,
   store_id INTEGER,
   product_id INTEGER,
   amount DECIMAL(10,2)
)
CLUSTER BY (sale_date, store_id);

When to use clustering:

  • For large tables (more than 1 million rows)
  • On columns you frequently filter or join on
  • Usually date columns plus 1–2 other columns
  • Don’t cluster by columns with too many unique values
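
Once a clustering key is in place, Snowflake's SYSTEM$CLUSTERING_INFORMATION function reports how well the table is clustered (shown here for the sales table and key from the example above):

-- Returns clustering depth and overlap statistics as JSON
SELECT SYSTEM$CLUSTERING_INFORMATION('sales', '(sale_date, store_id)');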

4. Adding table constraints

Use constraints to keep your data clean:

CREATE TABLE users (
   user_id INTEGER PRIMARY KEY,   -- Declares unique IDs (metadata only, not enforced)
   email VARCHAR(255) UNIQUE,     -- Declares no duplicate emails (not enforced)
   username VARCHAR(50) NOT NULL  -- Must have a value (enforced)
);

Important constraints:

  • PRIMARY KEY: For unique identifiers
  • FOREIGN KEY: For linking tables together
  • NOT NULL: For required fields
  • UNIQUE: For fields that can't have duplicates

Note that Snowflake records PRIMARY KEY, FOREIGN KEY, and UNIQUE constraints as metadata but does not enforce them, and CHECK constraints are not supported at all. Only NOT NULL is enforced, so apply uniqueness and validation rules in your loading or transformation logic.

5. Common mistakes to avoid

Don’t do this:

-- Bad practice: No primary key
CREATE TABLE orders (
   date VARCHAR,           -- Wrong data type for dates
   amount VARCHAR(10),     -- Wrong data type for numbers
   status text            -- Inconsistent case with other columns
);

Do this instead:

-- Good practice
CREATE TABLE orders (
   order_id INTEGER PRIMARY KEY,
   order_date DATE,
   amount DECIMAL(10,2),
   status VARCHAR(20)
);

The improved version ensures data integrity with a primary key, uses correct data types for dates and numbers to enable efficient filtering and calculations, and maintains consistency in column definitions for better readability and compatibility.

6. Performance tips

Optimizing table performance in Snowflake can significantly improve query speed and reduce storage costs. Follow these best practices to ensure your tables are efficient and scalable.

  • Create smaller tables instead of one huge table
  • Add column descriptions using COMMENT (see the example after this list)
  • Set up automatic data cleanup rules
  • Test your table design with real data
  • Monitor table usage and adjust as needed
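
For instance, the COMMENT and cleanup suggestions above might look like this (a sketch; the table, column, and retention value are illustrative):

-- Document a column so future users understand its values
COMMENT ON COLUMN orders.status IS 'Order lifecycle state, e.g., pending, shipped, delivered';

-- Shorten Time Travel retention on a scratch table to cut storage costs
ALTER TABLE orders SET DATA_RETENTION_TIME_IN_DAYS = 1;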

7. Security best practices

Proper security measures are essential to protect sensitive data and control access. Implement these best practices to safeguard your Snowflake environment from unauthorized access and data breaches.

  • Always set proper access permissions
  • Use row-level security when needed
  • Encrypt sensitive columns
  • Keep track of who can access what
  • Regularly review access patterns
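
As a starting point, access in Snowflake is granted through roles rather than to individual users (a minimal sketch; the role and table names are hypothetical):

-- Read-only access for analysts
GRANT SELECT ON TABLE customer_orders TO ROLE analyst_role;

-- Write access limited to the loading role
GRANT INSERT, UPDATE ON TABLE customer_orders TO ROLE etl_role;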

Remember: A well-designed table makes your work easier later on. Take time to plan your table structure before creating it!

Conclusion

In this tutorial, we’ve covered the essentials of creating tables in Snowflake, including:

  • Different types of tables (permanent, temporary, external, and dynamic)
  • Step-by-step examples for creating each type
  • Best practices for table design and management
  • Common pitfalls to avoid

This knowledge forms the foundation for building efficient and scalable data solutions in Snowflake.

Next Steps

To deepen your Snowflake expertise, explore DataCamp's interactive Snowflake courses, comprehensive tutorials, and career resources.

Don’t forget, the best way to learn is by doing. Start creating tables, experiment with different types, and build real projects. As you encounter challenges, refer back to these resources and keep exploring new features of Snowflake.

Snowflake Create Table FAQs

What are the different types of tables in Snowflake?

Snowflake supports four main table types: permanent tables for long-term data storage, temporary tables for session-specific use, external tables for querying external data sources, and dynamic tables for automatic query result maintenance.

How do I create a basic table in Snowflake?

Use the CREATE TABLE command with your desired schema. For example: "CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, first_name VARCHAR(50), last_name VARCHAR(50))". Include appropriate data types and constraints.

What is a dynamic table in Snowflake and when should I use it?

Dynamic tables automatically maintain up-to-date results of a query, similar to materialized views. They're ideal for real-time dashboards, automated data transformations, and maintaining aggregated data that needs frequent updates.

What are the best practices for naming tables in Snowflake?

Use lowercase letters with underscores (snake_case), avoid spaces and special characters, make names descriptive but concise, use singular form for table names, and consider adding environment prefixes (dev_, test_, prod_).

How can I improve table performance in Snowflake?

Improve performance by using appropriate clustering keys, choosing correct data types, setting proper constraints, creating smaller focused tables instead of large ones, and regularly monitoring table usage patterns.


Author: Bex Tuychiev
I am a data science content creator with over 2 years of experience and one of the largest followings on Medium. I like to write detailed articles on AI and ML with a bit of a sarcastic style because you've got to do something to make them a bit less dull. I have produced over 130 articles and a DataCamp course to boot, with another one in the making. My content has been seen by over 5 million pairs of eyes, 20k of whom became followers on both Medium and LinkedIn.
