Course
Introduction to Databricks Lakehouse
Skill level: Beginner
Updated 04/2026 · Databricks · Data Engineering · 3 h · 15 videos · 43 exercises · 3,550 XP · Certificate of completion
Course Description
Understand the Lakehouse Architecture
Start by discovering what sets the lakehouse apart from traditional architectures. You'll explore the medallion architecture — bronze, silver, and gold layers — that transforms raw, messy data into clean, business-ready insights. Then get oriented inside the Databricks workspace to understand how catalogs, schemas, and volumes organize everything.

Master Compute and Notebooks
Learn to choose the right cluster for the job, configure autoscaling and auto-termination to control costs, and build notebooks that mix Python, SQL, and Markdown. You'll also connect your work to Git through Databricks Repos for version control and team collaboration.

Govern and Share Data Securely
Explore Unity Catalog to manage access controls and track data lineage across your organization. Then use Delta Sharing to distribute data to partners — on Databricks or any other platform — and query external sources with Lakehouse Federation, all without copying a single byte.

Deploy to Production with Asset Bundles
Wrap up by packaging your notebooks, pipelines, and jobs into Databricks Asset Bundles for repeatable, automated deployments. A capstone scenario brings everything together so you leave ready to apply these skills on the job.

Prerequisites
Introduction to Databricks

1
The Lakehouse Paradigm
Discover what makes the lakehouse different from traditional architectures, how the medallion pattern organizes data, and where things live inside the Databricks platform.
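The bronze/silver/gold flow of the medallion pattern can be sketched in plain Python. This is illustrative only: a real Databricks pipeline would use Spark DataFrames and Delta tables, and the record fields here are invented.

```python
# Illustrative sketch of the medallion pattern (bronze -> silver -> gold).
# Plain Python dicts stand in for Delta tables; all field names are invented.

# Bronze: raw, messy records ingested as-is.
bronze = [
    {"order_id": "1", "amount": "19.99", "country": "us"},
    {"order_id": "2", "amount": "bad-value", "country": "FR"},
    {"order_id": "3", "amount": "5.00", "country": "US"},
]

def to_silver(rows):
    """Silver: validate and standardize, dropping unparsable rows."""
    silver = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine bad records instead
        silver.append({"order_id": row["order_id"],
                       "amount": amount,
                       "country": row["country"].upper()})
    return silver

def to_gold(rows):
    """Gold: business-ready aggregate, here revenue per country."""
    revenue = {}
    for row in rows:
        revenue[row["country"]] = round(
            revenue.get(row["country"], 0.0) + row["amount"], 2)
    return revenue

gold = to_gold(to_silver(bronze))
print(gold)  # {'US': 24.99}
```

Each layer refines the previous one: bronze keeps everything, silver enforces types and conventions, and gold serves the business question directly.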
2
Compute and Notebooks
Spin up the right cluster for the job, configure it for cost and performance, master the notebook environment, and connect your work to Git - all inside the Databricks workspace.
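The autoscaling and auto-termination settings mentioned above can be expressed as a cluster spec. The field names below follow the Databricks Clusters REST API; the concrete values (cluster name, node type, Spark version) are hypothetical and would vary by cloud and workspace.

```python
import json

# A cluster spec with autoscaling and auto-termination, roughly as it
# might be submitted to the Databricks Clusters API. Values are examples.
cluster_spec = {
    "cluster_name": "etl-dev",           # hypothetical name
    "spark_version": "15.4.x-scala2.12", # example runtime version
    "node_type_id": "i3.xlarge",         # cloud-specific node type
    "autoscale": {                       # Databricks adds/removes workers
        "min_workers": 2,                # within these bounds as load varies
        "max_workers": 8,
    },
    "autotermination_minutes": 30,       # shut down after 30 idle minutes
}

print(json.dumps(cluster_spec, indent=2))
```

Autoscaling bounds cost during quiet periods while keeping headroom for spikes, and auto-termination prevents paying for a forgotten idle cluster.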
3
Governance and Sharing
Lock down your data with Unity Catalog, share it securely with Delta Sharing, and federate queries to external sources - all without copying a single byte.
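The grant-and-share workflow can be sketched as Databricks SQL statements. The syntax follows Unity Catalog and Delta Sharing SQL; the catalog, schema, table, share, and recipient names are all hypothetical. In a notebook, each string would be executed with `spark.sql(...)`.

```python
# Unity Catalog and Delta Sharing statements as they might appear in a
# Databricks notebook. All object names (main.sales.orders, partner_share,
# acme_corp) are hypothetical.
statements = [
    # Unity Catalog: grant read access on a table (catalog.schema.table).
    "GRANT SELECT ON TABLE main.sales.orders TO `analysts`",
    # Delta Sharing: create a share, add a table, and grant a recipient.
    "CREATE SHARE IF NOT EXISTS partner_share",
    "ALTER SHARE partner_share ADD TABLE main.sales.orders",
    "CREATE RECIPIENT IF NOT EXISTS acme_corp",
    "GRANT SELECT ON SHARE partner_share TO RECIPIENT acme_corp",
]

for sql in statements:
    print(sql + ";")
```

Note that the share references the table in place: the recipient queries live data through the share, so nothing is copied.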
4
Deployment and Next Steps
Package your work with Databricks Asset Bundles, deploy to production, and bring everything together in a capstone scenario.
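An Asset Bundle is driven by a `databricks.yml` file at the project root. A minimal sketch might look like the following; the bundle name, job, notebook path, and targets are all hypothetical, and a real bundle would typically also define cluster or serverless compute settings for the job.

```yaml
# Minimal databricks.yml sketch (all names and paths are hypothetical).
bundle:
  name: lakehouse_demo

resources:
  jobs:
    nightly_etl:
      name: nightly-etl
      tasks:
        - task_key: run_notebook
          notebook_task:
            notebook_path: ./notebooks/etl.py

targets:
  dev:
    mode: development
    default: true
  prod:
    mode: production
```

With this in place, `databricks bundle validate` checks the configuration and `databricks bundle deploy -t dev` pushes the same definitions to the chosen target, which is what makes deployments repeatable across environments.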
Introduction to Databricks Lakehouse
Course completed
Earn a certificate of completion
Add this certification to your LinkedIn profile, CV, or portfolio. Share it on social media and in your performance review.
Included with Premium or Teams