
Daily Data Snapshot Automation with T-SQL in Microsoft Fabric (Part 4)

August 12, 2025

This is the fourth installment in our series on taking data snapshots. If you haven’t read our previous snapshot blogs, we recommend reviewing them first to better understand the overall context [insert link]. In this post, we’ll walk through how to use T-SQL to implement an automated snapshot process in Microsoft Fabric.

Prerequisites:

Before starting, ensure you have the following:

Source Table: A table containing only the latest data (e.g., a currency rate table that updates daily). If you don't have one to follow along with, a mock table is sketched just after this list.

Warehouse: A warehouse in Microsoft Fabric where you can run T-SQL queries.
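If you don't already have a suitable source table, you can mock one up purely to follow along. The sketch below is an assumption for illustration only: the column names and types mirror the historical table created in Step 3, and the sample rates are made up. If you use it, point the later queries at wherever this mock table lives instead of [Snapshot].[dbo].[currencyrate].

-- Hypothetical mock of the source table, for readers without real data
CREATE TABLE dbo.currencyrate (
    [From] VARCHAR(10),
    [To] VARCHAR(10),
    Rate FLOAT,
    [Date] DATETIME2(0)
);

-- Illustrative sample rows, not real exchange rates
INSERT INTO dbo.currencyrate ([From], [To], Rate, [Date])
VALUES
    ('USD', 'EUR', 0.92, '2025-08-12 09:00:00'),
    ('USD', 'GBP', 0.78, '2025-08-12 09:00:00');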

Step-by-Step Guide to Automating Snapshots with T-SQL

Step 1: Create a New SQL Query

In Microsoft Fabric, the T-SQL used in this walkthrough (creating tables and stored procedures) must run in a warehouse. First, create a warehouse named “Snapshot1”. Then navigate to the warehouse and create a new SQL query.



Figure 01: Create New SQL query

Step 2: Explore the Data

In this example, we have a currency rate table where data is updated daily. To get an overview of the data, run the following query:

SELECT * FROM [Snapshot].[dbo].[currencyrate]

Results:


Figure 02: Results of the currencyrate table.

This table has a total of 4 columns and 33 rows.

Step 3: Create a Snapshot Table

The goal is to take daily snapshots of the currencyrate table and append the results to a new table. To achieve this, create a table named historical_currencyrate using the following query:

CREATE TABLE historical_currencyrate (
    [From] VARCHAR(10),
    [To] VARCHAR(10),
    Rate FLOAT,
    [Date] DATETIME2(0)
);

To verify that the table has been created successfully, run this query:

SELECT * FROM dbo.historical_currencyrate;

At this stage, the table will be empty since no data has been inserted yet.

Step 4: Create a Stored Procedure

To automate the data insertion process, create a stored procedure that inserts data from the currencyrate table into the newly created historical_currencyrate table.

CREATE PROCEDURE AppendCurrencyRate
AS
BEGIN
    -- Insert data from source table to the historical table
    INSERT INTO historical_currencyrate ([From], [To], Rate, [Date])
    SELECT 
        [From],
        [To],
        Rate,
        [Date]
    FROM [Snapshot].[dbo].[currencyrate];

    -- Provide feedback after successful execution
    PRINT 'Data appended to historical_currencyrate table successfully.';
END;

This procedure will be called automatically in the pipeline to capture daily snapshots.

Explanation:

AppendCurrencyRate: The name of the stored procedure.

INSERT INTO: Copies every row from the currencyrate table in the lakehouse (referenced by its three-part name, [Snapshot].[dbo].[currencyrate]) into the historical_currencyrate table in the warehouse.
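One optional refinement, not part of the procedure above: the historical table currently relies on the source's own [Date] column to tell one day's rows from the next. If you would rather record exactly when each snapshot was captured, you could include an extra capture-time column. A minimal sketch, using a hypothetical SnapshotDate column added when the table is created in Step 3:

-- Hypothetical variant of the Step 3 table with a capture-time column
CREATE TABLE historical_currencyrate_v2 (
    [From] VARCHAR(10),
    [To] VARCHAR(10),
    Rate FLOAT,
    [Date] DATETIME2(0),
    SnapshotDate DATETIME2(0)   -- when this snapshot was taken
);

-- The INSERT inside the procedure would then stamp the capture time as well
INSERT INTO historical_currencyrate_v2 ([From], [To], Rate, [Date], SnapshotDate)
SELECT
    [From],
    [To],
    Rate,
    [Date],
    CAST(SYSDATETIME() AS DATETIME2(0))
FROM [Snapshot].[dbo].[currencyrate];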

Step 5: Execute the Stored Procedure

Now, execute the stored procedure using the following query:

EXEC AppendCurrencyRate;

Once the procedure runs, the rows from the currencyrate table are appended to the historical_currencyrate table. To verify that the data has been inserted successfully, run the select query again:

SELECT * FROM dbo.historical_currencyrate;

Results:


Figure 03: Results of the dbo.historical_currencyrate table.
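Beyond scanning the grid, a quick aggregate can confirm what landed. The check below assumes the source stamps each daily refresh with a single [Date] value; after the first run it should return one group of 33 rows, and a new group should appear for each day the snapshot runs:

-- Count the rows captured per snapshot date
SELECT [Date], COUNT(*) AS RowsCaptured
FROM dbo.historical_currencyrate
GROUP BY [Date]
ORDER BY [Date];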

Step 6: Automate the Snapshot with a Pipeline

To ensure the stored procedure automatically runs every day, we’ll create a pipeline. Navigate to your workspace, create a new item, and select Pipeline.


Figure 04: Create a Pipeline

Add a Stored procedure activity and name it “Run Stored Procedure”. In the activity’s Settings, select the Snapshot1 warehouse as the connection and choose the newly created stored procedure, [dbo].[AppendCurrencyRate]. Finally, save the pipeline as “Run Stored Procedure” and return to the workspace.

Step 7: Set the Schedule

To automate the pipeline, hover over the pipeline, click the three dots, and select Schedule. Configure the schedule as shown below:


Figure 05: Schedule Pipeline

Set the pipeline to run daily at 9:10 AM. This timing ensures that the source table, which updates at 9:00 AM, has enough time to complete its updates before the snapshot is taken.

Conclusion

By following the steps outlined above, you can set up an automated snapshot process using T-SQL in Microsoft Fabric. The pipeline is configured to run daily at 9:10 AM, executing the stored procedure and capturing a snapshot of the currencyrate table. This approach ensures that your historical data is consistently and accurately stored without requiring manual intervention.

While this method provides a straightforward way to manage snapshots, it’s important to monitor the process periodically to ensure it runs as expected. Additionally, depending on your specific use case, you may want to explore further optimizations or enhancements, such as error handling or logging.
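As a starting point for the error handling mentioned above, the procedure from Step 4 could be wrapped in TRY/CATCH so that a failed insert surfaces a clear error and the pipeline run is marked as failed rather than silently skipping a day. This is only a sketch; it assumes CREATE OR ALTER is available in your warehouse (otherwise drop and recreate the procedure):

-- Hedged sketch: the Step 4 procedure with basic error handling
CREATE OR ALTER PROCEDURE AppendCurrencyRate
AS
BEGIN
    BEGIN TRY
        -- Insert data from source table to the historical table
        INSERT INTO historical_currencyrate ([From], [To], Rate, [Date])
        SELECT [From], [To], Rate, [Date]
        FROM [Snapshot].[dbo].[currencyrate];

        PRINT 'Data appended to historical_currencyrate table successfully.';
    END TRY
    BEGIN CATCH
        -- Report the failure and re-raise so the Stored procedure activity fails
        PRINT 'Snapshot insert failed: ' + ERROR_MESSAGE();
        THROW;
    END CATCH
END;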

Nazmul Hasan

Analytics Engineer • Financial Analytics

Nazmul is a seasoned accountant and financial analyst with 7+ years of experience in financial analysis, reporting, and intercompany reconciliation. Holding a bachelor’s degree in finance, he has worked across hospitality, healthcare, and software industries, delivering data-driven financial insights that drive smarter business decisions. With a keen eye for detail, Nazmul ensures accurate financial reporting and helps businesses maintain financial stability through clear, actionable insights.

Rejaul Islam Royel

Analytics Engineer • Data Engineering

Rejaul Islam is a skilled Data Analyst specializing in Power BI, SQL, DAX, and Azure, with expertise in data modeling and business intelligence (BI). As a Trainer & Instructor, he empowers businesses to optimize data-driven strategies through practical insights and hands-on training. Recognized as a leading voice in BI, Rejaul combines technical expertise with a passion for teaching to help organizations maximize their analytics potential.
