Data pipelines are the foundation of modern analytics and machine learning. But making them reliable, scalable, and efficient requires the right combination of tools and design principles. In this hands-on session, you’ll learn how to build and orchestrate a complete data pipeline using Snowflake for data warehousing and Airflow for scheduling and workflow automation.
In this code-along webinar, Jake McGrath, Senior Software Engineer at Hike2, guides you through constructing an event-driven data pipeline from ingestion to transformation. You’ll explore how to automate tasks, manage dependencies, and ensure your pipelines are robust enough for production workloads.
Key Takeaways:
- Learn how to design and build event-driven data pipelines using Snowflake and Airflow.
- Understand best practices for reliable and maintainable data pipeline architecture.
- Explore a real-world use case and see how to take a pipeline from prototype to production.
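In Airflow, "managing dependencies" means declaring tasks as a directed acyclic graph (DAG), where the `>>` operator wires one task to run before another. The sketch below imitates that pattern in plain Python rather than using Airflow itself, so the `Task` class and the `ingest`/`transform`/`load` stage names are illustrative stand-ins, not Airflow's actual API:

```python
from collections import defaultdict, deque

class Task:
    """Minimal stand-in for an Airflow operator: a named unit of work."""
    def __init__(self, task_id, fn):
        self.task_id = task_id
        self.fn = fn
        self.downstream = []

    def __rshift__(self, other):
        # Mirror Airflow's `t1 >> t2` syntax: `other` runs after `self`.
        self.downstream.append(other)
        return other

def run_dag(tasks):
    """Execute tasks in topological (dependency-respecting) order."""
    indegree = defaultdict(int)
    for t in tasks:
        for d in t.downstream:
            indegree[d.task_id] += 1
    ready = deque(t for t in tasks if indegree[t.task_id] == 0)
    order = []
    while ready:
        t = ready.popleft()
        t.fn()
        order.append(t.task_id)
        for d in t.downstream:
            indegree[d.task_id] -= 1
            if indegree[d.task_id] == 0:
                ready.append(d)
    return order

# Hypothetical pipeline stages: ingest -> transform -> load
ingest = Task("ingest", lambda: None)
transform = Task("transform", lambda: None)
load = Task("load", lambda: None)
ingest >> transform >> load

print(run_dag([ingest, transform, load]))  # ['ingest', 'transform', 'load']
```

In a real Airflow DAG the same chaining syntax applies to operators (e.g. a `SnowflakeOperator` running a transformation SQL statement after an ingestion task), and the Airflow scheduler, rather than a manual loop, decides when each task is ready to run.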
