The State of Data & AI Literacy in 2026

March 2026

Summary

A data- and AI-leadership briefing for decision-makers building workforce skills—across HR, finance, marketing, IT, analytics, and transformation—inside large organizations.

This webinar recaps key findings from the “State of Data & AI Literacy in 2026” report, based on a YouGov survey of 500+ enterprise leaders in the U.S. and U.K. at companies with 500+ employees (fielded and completed in February 2026). The headline result is a clear link between workforce skills and outcomes: organizations with mature, structured, organization-wide data or AI literacy programs were nearly 2x as likely to report significant positive AI ROI. Leaders also agree that foundational skills—data literacy, AI literacy, and data-driven decision-making—now sit alongside staples like writing and project management. Yet they report persistent skills gaps and uneven training quality, creating what the speaker calls a “skills paradox”: training exists, resources exist, and urgency exists, but outcomes often lag.

The discussion turns practical: what blocks progress (time scarcity, budget, resistance, inadequate resources), why “one-off” training fails in a fast-changing field, and how the most effective programs are designed like shared infrastructure—scalable, flexible, embedded in daily work, and reinforced over time. Several slides and charts are worth seeing directly, especially those connecting upskilling maturity to AI ROI and the widening gap between skill importance and actual training coverage.

Key Takeaways:

  • Organizations with mature, workforce-wide data/AI upskilling are far more likely to report meaningful AI ROI; without training, “almost nobody reported significant ROIs.”
  • Leaders rank basic data literacy (88%) and AI literacy (72%) as important to day-to-day work—on par with long-standing workplace fundamentals.
  • Foundational, interpretive skills (decision-making, storytelling, correct interpretation) are viewed as broadly more critical than deep technical skills for the average employee.
  • Despite high reported training availability (roughly 80–90%), only about one in three leaders report having a mature, organization-wide program—where ROI diverges.
  • The biggest obstacles are not singular; time scarcity, budget, resistance, and insufficient resources each contribute to a system-wide “friction” problem.

Deep Dives

1) The ROI Divide: Why Workforce Capability Beats Tooling

The most important finding is also the least flashy: AI returns appear to depend less on which tools are purchased and more on whether people across the company can use them responsibly and effectively. The session’s centerpiece is a relationship between upskilling maturity and outcomes. As Lynne Heidmann puts it, “leaders who report having a mature organization wide data literacy or AI literacy skilling program were nearly twice as likely to report significant positive ROI from their AI investments.” Just as important is the counterfactual: in organizations that offered no data/AI training, “almost nobody reported significant ROIs.”

This is not an argument for training in the abstract. The nuance—and the operational challenge—is that maturity matters. The program has to be structured, organization-wide, and span both foundational literacy and role-relevant technical skills. In other words, the “workforce-wide” dimension is not a nice-to-have; it is tied to whether AI actually lands in workflows beyond a small technical enclave. Heidmann frames the implication bluntly: “ROI from AI is actually not primarily potentially a tooling story. It’s really driven by workforce capability.”

That framing is a useful corrective to how many organizations currently talk about AI. Procurement decisions, model benchmarking, and platform architecture are visible and often urgent, but they can disguise a quieter bottleneck: the organization’s ability to absorb change. AI tools can draft, summarize, code, and analyze; yet if staff cannot evaluate outputs, define appropriate use cases, or integrate results into decisions, value stays trapped in pilots and demos.

The session also hints at why workforce capability translates into ROI. A mature program standardizes fluency—common language, shared expectations, repeatable methods—so that AI adoption is not limited to isolated teams. It reduces rework caused by misinterpretation, avoids misuse that creates risk, and helps translate experimentation into consistent productivity gains. The full webinar is especially helpful here because the supporting charts show how sharply outcomes diverge once an organization crosses from “some training exists” into “a mature program is in place.”

2) The Data & AI Skills Paradox: Importance, Gaps, and Underpowered Training

One reason this conversation feels unsettled inside many companies is that leaders can hold several “true” beliefs at once—and still be stuck. Heidmann labels this the “data and AI skills paradox.” Most leaders say data and AI skills are important; most also say their organizations have skills gaps; most also claim to offer training and that employees have learning resources. Yet capability shortfalls persist, and AI value remains uneven.

The webinar’s charts place data literacy among core workplace skills. Leaders rank basic data literacy as important or very important to day-to-day work at 88%, and AI literacy at 72%—alongside staples like writing and project management. The message is not that every employee must become a machine-learning engineer. In fact, the session shows that “foundational and interpretive and sort of application oriented skills like data driven decision making, were ranked overall as more important day to day than technical skills.” Technical skills still matter, but for narrower roles; broad fluency matters for everyone else.

Leaders also attach money to these claims. A large majority report they are willing to pay salary premiums for strong data and AI skills (with many premiums falling in the 10–30% band), and 74% say they would pay more for strong data skills (69% for AI literacy). That willingness matters because it suggests the skills are not seen as a passing trend; they are becoming priced into labor markets and performance expectations.

So why does the paradox persist? Part of the answer is definitional slippage. “We offer training” can mean many things: a few optional videos, ad hoc workshops, or a narrow set of courses for technical staff. But the earlier AI ROI result suggests that training only changes outcomes when it is mature, structured, and organization-wide. Another part is that resources alone do not create behavior change. The session makes a pointed distinction: “You don’t become data fluent just by…being around dashboards…[and] you don’t become AI fluent by…having a chatbot available to you.” Fluency requires practice, feedback, and role-specific application—none of which is guaranteed by a library of content.

To see the paradox clearly, it helps to watch the full webinar: the visual gap between what leaders say is important and what they actually train is where the story becomes concrete.

3) Why Upskilling Stalls: Time Scarcity Is a Symptom, Not the Disease

When leaders are asked what blocks data and AI capability building, “time scarcity and competing priorities” rises to the top—yet not by a landslide. Budget constraints, inadequate training resources, and employee resistance all cluster close behind. The distribution matters: it suggests there is no single choke point. Instead, there is cumulative friction across an organization’s operating model.

It is tempting to dismiss time scarcity as an excuse—especially after leaders have already said these skills are critical. Heidmann’s reading is more structural. The problem is not merely that people refuse to make time; it is that many organizations “don’t have a model for learning that actually makes this sustainable.” In fast-evolving domains, a one-off workshop is quickly outdated, and ad hoc learning competes poorly with immediate business deadlines. Capability building, in this view, must be treated less like a project and more like an organizational utility—planned, funded, refreshed, and expected.

The session also highlights a deeper human constraint: adaptability. One cited response captures the gap not as missing knowledge, but as the difficulty of “accepting changes in the way we do things.” AI is not a static tool that can be mastered once; it changes, its use cases expand, and expectations shift. That turns literacy into an operating mindset. Companies that treat training as episodic often fail precisely because the environment demands continuous updating.

Another nuance is role diversity. The webinar argues that data and AI skills are embedded in everyday work across functions, but applied differently by each. HR, finance, and marketing will not share identical workflows or risks; “one size fits all learning doesn’t really work.” This complicates design: the program must scale across the organization while still mapping to the realities of each role.

Finally, the discussion hints at a hidden cost of “do-it-yourself” training development. Roughly one-fifth of leaders report creating their own training materials—highly custom, yes, but difficult to keep current in a space that changes quarterly. Watching the full webinar helps here because it contextualizes these trade-offs with additional survey cuts and supporting charts that are difficult to appreciate in text alone.

4) Designing Literacy as Infrastructure: Scalable, Flexible, Embedded, Reinforced

The session’s recommendations converge on a single idea: treat data and AI capability building as infrastructure. That means designing programs that can scale across roles, stay current as tools change, fit into daily workflows, and reinforce competence over time—rather than relying on isolated courses and hopeful self-study.

Scalable: Organizations seeing ROI are not only upskilling data teams; they are building baseline fluency across the workforce. The practical starting point offered here is to define a “minimum viable” data and AI literacy baseline for every role (or role family). Make expectations explicit: what should employees be able to do reliably? Then tie those expectations to role requirements so literacy is not perceived as optional.

Flexible: AI literacy today is not what it was “two years ago,” the speaker notes, and this volatility makes fixed curricula brittle. Flexibility requires updating learning paths as tools evolve, refreshing use cases quarterly, and building feedback loops from business application back into learning design. The implication is operational: someone must own content currency, and the organization must expect periodic refresh.

Embedded in work: The survey’s “time scarcity” theme is treated as evidence that learning feels separate from work. Embedding means bite-sized, hands-on, just-in-time learning that is tied to real problems employees face. The discussion also introduces “gate points” that create time and motivation: making learning mandatory at key career stages or linking it to access to specific tools—described as a “stick carrot method.”

Reinforced over time: Continuous change makes “train once” strategies unreliable. The webinar suggests redefining fluency away from “courses completed” and toward the decisions employees can make: Can they validate AI outputs? Identify appropriate use cases? Interpret data correctly? Articulate impact? This reframing doubles as an evaluation strategy—more aligned with outcomes leaders actually care about.

For readers building programs, the full webinar is worth watching for its implementation texture: the charts, the sequencing of recommendations, and the concrete examples that show how capability becomes a repeatable system rather than another initiative competing for attention.


Related

infographic

Infographic: The State of Data & AI Literacy in 2026

Explore key findings from the 2026 State of Data & AI Literacy report.

white paper

The 2026 State of Data & AI Literacy Report

We surveyed 500+ enterprise leaders across the US and UK to understand how workforce capability is shaping performance, productivity, and AI ROI. This report unpacks the results.

webinar

The State of Data & AI Literacy in 2024

Join this webinar to learn which data & AI skills are becoming increasingly pervasive in organizations across industries, and how leaders are adapting their teams and workforce to the era of data & AI literacy.

webinar

The State of Data & AI Literacy in 2025

Explore the latest trends shaping how organizations are building data and AI skills across their workforce.

webinar

The Executive Guide to Data & AI Literacy

Experts share how to design and implement a literacy strategy that works. Hear practical lessons on training at scale, increasing data and AI maturity, and identifying the highest-impact use cases.

webinar

The State of AI Literacy in Education

Oli Dewhurst and Mafer Bencomo walk through key findings from the survey and discuss what they mean for the future of education.