Data pipelines are the foundation of modern analytics and machine learning. But making them reliable, scalable, and efficient requires the right combination of tools and design principles. In this hands-on session, you’ll learn how to build and orchestrate a complete data pipeline using Snowflake for data warehousing and Airflow for scheduling and workflow automation.
In this code-along webinar, Jake McGrath, a Senior Software Engineer at Hike2, will guide you through constructing an event-driven data pipeline from ingestion to transformation. You’ll explore how to automate tasks, manage dependencies, and ensure your pipelines are robust enough for production workloads.
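To give a flavor of the kind of pipeline the session builds, here is a minimal sketch of an Airflow DAG that loads and transforms event data in Snowflake. The connection ID, stage, and table names (snowflake_default, events_stage, raw_events) are illustrative assumptions, not the webinar's actual code, and the `schedule` argument assumes Airflow 2.4 or later.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Ingest: copy newly staged files into a raw table
    ingest = SnowflakeOperator(
        task_id="ingest_raw_events",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO raw_events FROM @events_stage;",
    )

    # Transform: build a cleaned table from the raw data
    transform = SnowflakeOperator(
        task_id="transform_events",
        snowflake_conn_id="snowflake_default",
        sql="""
            CREATE OR REPLACE TABLE events_clean AS
            SELECT event_id, user_id, event_ts
            FROM raw_events
            WHERE event_id IS NOT NULL;
        """,
    )

    # Dependency management: transform runs only after ingest succeeds
    ingest >> transform
```

Chaining tasks with `>>` is how Airflow expresses the dependency management the webinar covers: each step runs only after its upstream step completes successfully.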
Presenter Bio
Jake McGrath, Senior Software Engineer at Hike2
Jake is a Senior Software Engineer at Hike2 who specializes in Data and AIOps. Previously, Jake worked at Astronomer, the Apache Airflow company, helping companies of all shapes and sizes implement Airflow in their environments. Jake is also a six-time DataCamp instructor! His most recent course is "Data Types and Functions in Snowflake". He loves all things Python and Apache Airflow.