
Build Event-Driven Data Pipelines with Snowflake & Airflow

Key Takeaways:
  • Learn how to design and build event-driven data pipelines using Snowflake and Airflow.
  • Understand best practices for reliable and maintainable data pipeline architecture.
  • Explore a real-world use case and see how to take a pipeline from prototype to production.
Thursday, December 4, 11 AM ET

Register for the webinar


Description

Data pipelines are the foundation of modern analytics and machine learning. But making them reliable, scalable, and efficient requires the right combination of tools and design principles. In this hands-on session, you’ll learn how to build and orchestrate a complete data pipeline using Snowflake for data warehousing and Airflow for scheduling and workflow automation.

In this code-along webinar, Jake McGrath, a Senior Software Engineer at Hike2, will guide you through constructing an event-driven data pipeline from ingestion to transformation. You’ll explore how to automate tasks, manage dependencies, and ensure your pipelines are robust enough for production workloads.
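The ingest-to-transform flow described above boils down to explicit task dependencies. As a minimal plain-Python sketch (the function names here are illustrative, not Airflow's API; in Airflow each step would be a task, the trigger would typically be a dataset or sensor event, and the load step would write to a Snowflake table):

```python
# Illustrative sketch of an event-driven ingest -> transform -> load chain.
# All names are hypothetical; Airflow would express this same ordering
# declaratively as tasks in a DAG rather than nested function calls.

def ingest(event: dict) -> list[dict]:
    # Pretend a new file landing in cloud storage triggered this run.
    return [{"id": 1, "raw": event["payload"].upper()}]

def transform(rows: list[dict]) -> list[dict]:
    # Clean up the raw values produced by ingestion.
    return [{**row, "clean": row["raw"].strip()} for row in rows]

def load(rows: list[dict]) -> int:
    # In the webinar, this step would write rows to a Snowflake table;
    # here we just report how many rows were loaded.
    return len(rows)

def run_pipeline(event: dict) -> int:
    # Each step depends on the previous step's output, which is exactly
    # the dependency ordering an Airflow DAG enforces between tasks.
    return load(transform(ingest(event)))
```

In Airflow, that same ordering is declared rather than hard-coded, so the scheduler can retry, backfill, and monitor each step independently.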

Presenter Bio

Jake McGrath, Senior Software Engineer at Hike2

Jake is a Senior Software Engineer at Hike2 who specializes in Data and AIOps. Previously, Jake worked at Astronomer, the Apache Airflow company, helping companies of all shapes and sizes implement Airflow in their environments. Jake is also a six-time DataCamp instructor! His most recent course is "Data Types and Functions in Snowflake". He loves all things Python and Apache Airflow.
