
Is Data Science Still the Sexiest Job of the 21st Century?

Thought Leader and Author in the Data & Analytics field Thomas Davenport joins Adel on the show to share his perspective on where data science and AI are today, and where they are headed.
Updated Jun 2023

Guest
Thomas Davenport

Thomas Davenport is the President’s Distinguished Professor of Information Technology and Management at Babson College, the co-founder of the International Institute for Analytics, a Fellow of the MIT Initiative for the Digital Economy, and a Senior Advisor to Deloitte Analytics. He has written or edited twenty books and over 250 print or digital articles for Harvard Business Review (HBR), Sloan Management Review, the Financial Times, and many other publications. One of HBR’s most frequently published authors, Thomas has been at the forefront of the Process Innovation, Knowledge Management, and Analytics and Big Data movements. He pioneered the concept of “competing on analytics” with his 2006 Harvard Business Review article and his 2007 book by the same name. Since then, he has continued to provide cutting-edge insights on how companies can use analytics and big data to their advantage, and then on artificial intelligence.


Host
Adel Nehme

Adel is a Data Science educator, speaker, and Evangelist at DataCamp where he has released various courses and live training on data analysis, machine learning, and data engineering. He is passionate about spreading data skills and data literacy throughout organizations and the intersection of technology and society. He has an MSc in Data Science and Business Analytics. In his free time, you can find him hanging out with his cat Louis.

Key Quotes

I think the time to move on AI projects is now. Morgan Stanley has been working with OpenAI for 18 months, and Deloitte has been working with OpenAI for 18 months on code generation. You don't want to just sit back and wait for things to develop. It may be a hard thing to be a fast follower on. So start experimenting aggressively. Think big about what it's going to do for your organization and how it's going to transform it.

To become an AI-fueled organization, you need a lot of breadth and depth. The breadth is lots and lots of use cases. With Deloitte, I participated in some surveys where maybe 25% of large organizations have seven or more use cases implemented and are getting some positive outcomes out of them. So that's still a small number. There are companies with literally thousands of use cases that have been implemented. So you need a lot, you need a variety of different types of AI technology in your organization: conventional machine learning, deep learning, even rules-based automation technologies, conversational AI, which is increasingly based on deep learning but wasn't always. So a lot of different technologies, a lot of support from senior executives, a lot of investment in the area, a lot of talent, some sort of ethical framework to make sure your AI work doesn't go off the rails. You need a governance structure, at least a center of excellence, to manage these activities and select tools and manage the careers of people and so on. So those are the kinds of things, and it takes a lot of money to make that work, but the companies that we found that are doing it seem to be quite successful.

Key Takeaways

1

Embrace generative AI now. The time to experiment with and integrate generative AI into your business processes is now; don't wait for the technology to mature further before you start.

2

Expect AI to transform online interactions over the next 12 months. Anticipate a significant shift in how we interact with online platforms, with AI mediating activities such as search, social media, and e-commerce through conversational front-ends.

3

Data leaders should balance technology and cultural change. While technology is important, investing in cultural change and education is equally crucial for the successful adoption of data and analytics.


Transcript

Adel Nehme: Hello everyone. Welcome to DataFramed. I'm Adel, Data Evangelist and Educator at DataCamp. And if you're new here, DataFramed is a weekly podcast in which we explore how individuals and organizations can succeed with data and AI. About 10 years ago, Thomas Davenport and DJ Patil published the article 'Data Scientist: The Sexiest Job of the 21st Century'.

In this article, they described the burgeoning role of the data scientist and what it would mean for organizations and professionals over the decade to come. In a lot of ways, they popularized the job title as we know it today, so much so that the truism "data scientist is the sexiest job of the 21st century" was even mentioned in the previous show description of DataFramed just a few years ago.

However, as data science has become more and more institutionalized, as it has morphed from a nice-to-have to a must-have for every boardroom today, and as technologies like AI and ChatGPT continue to blow our minds with their ability to perform data science tasks, this begs the question: is data science still the sexiest job of the 21st century? So we invited Tom Davenport to DataFramed to share his perspective on data science and AI today. Tom Davenport needs no introduction. Apart from co-authoring the article I mentioned earlier, he's also the President's Distinguished Professor of Information Technology and Management at Babson College.

He is also the co-founder of the International Institute for Analytics, a fellow of the MIT Initiative for the Digital Economy, and a senior advisor to Deloitte Analytics. He has written or edited 20 books and over 250 print or digital articles for Harvard Business Review, Sloan Management Review, the Financial Times, and many other publications. He earned his PhD from Harvard University and has taught at the Harvard Business School, the University of Chicago, the Tuck School of Business, Boston University, and the University of Texas at Austin.

His most recent book, 'All-in on AI: How Smart Companies Win Big with Artificial Intelligence', couldn't have come at a better time given what we've seen in the industry today. Throughout the episode, we discussed how data science has changed since he first published his article, how it has become more and more institutionalized, his views on AI and where he thinks it's going, and a lot more.

Also, this week is our Radar AI conference. If you've been following the AI space and have been interested in its potential impact on data, you'll definitely want to tune in. Tom will be there alongside a host of awesome data and AI leaders discussing what AI and ChatGPT mean for data science, data literacy, and a lot more. The link is in the description. As always, if you enjoyed this episode, do let us know in the comments, in ratings, and on social. And now, on to today's episode. Thomas Davenport, it's great to have you on the show.

Thomas Davenport: Thanks for having me.

Adel Nehme: You know, 10 years ago you co-authored the article with DJ Patil in Harvard Business Review titled 'Data Scientist: The Sexiest Job of the 21st Century'. There's a lot to unpack in how data science has evolved since the article. But maybe to set the stage for today's conversation: is data science still the sexiest job of the 21st century?

If so, why? And if not, why not?

Thomas Davenport: Well, first I would say I think it was a memorable title. But there were many non-sexy aspects of data science at the time. And I remember thinking as I was interviewing the people for the article, there's a whole lot of data plumbing going on here. That's not sexy at all.

But it was obviously sexy in terms of the desirability of hiring these people, and the labor market and so on. I think that's still the case. It's changed a bit in that there are many other sources of data science expertise than the traditional sort of PhD-in-physics types, who were really the only thing around at the time. And the job has fragmented a fair amount. There are these different roles. And then we can do more with citizen data science than we could at the time, certainly. We realize now these are not unicorns. They're pretty good at creating models, but now we realize that's not all there is to data science, to successful data science anyway. So I don't think there's a lot of clarity in organizations about who is a data scientist and who isn't, and we haven't really succeeded very much in creating certification for them and skill testing and so on, despite a few attempts.

But in general, it's still a desirable job, and if you have the skills, it's great to be a data scientist.

Adel Nehme: And you recently co-wrote another article with DJ Patil on whether data science is still sexy. And in a lot of ways, you mentioned in that article that data science has become a lot more institutionalized. We've seen over the past few years a significant increase in investment in data science by organizations across various industries.

How would you characterize the current landscape of investment in data science by organizations and leaders, and how do you think that increase has impacted the state of data science adoption today, in comparison to when you first wrote the article?

Thomas Davenport: It's interesting. I think data science is much more institutionalized. In some companies it has become the preeminent job, in the sense that the company realizes it's a data-driven business, and they are trying in all sorts of ways to understand their data and develop products and services based on it and so on.

But sadly, I think that's still a minority of companies. In most companies there doesn't seem to be a data-driven culture. A lot of the decision making is still not data driven. A lot of the senior executives still don't understand what's so important about this. And so, in the majority of companies, I think even though there may be some data scientists, they're still on the margins of the business, and their work is not given the respect that it deserves, I would say. But you know, that was true with traditional analytics as well. A relatively small percentage of companies decided that they wanted to compete on the basis of their analytical capabilities. And a relatively small percentage of companies are competing on data science and AI today, though it's getting larger.

And that's great progress, but still a minority, I think.

Adel Nehme: Yeah. One thing we saw early on is that data science adoption really varied by industry as well. We saw financial services organizations try to become data driven and adopt data science much quicker. Data-rich industries tend to be much more skewed toward faster adoption than other industries.

Can you give us maybe an overview of where data science is being most widely adopted today? Which industries, outside of technology, are leading the pack, and which industries are maybe lagging behind?

Thomas Davenport: Sure. Yeah, it's still the case that it takes a lot of data to do data science well, so if a company doesn't have a lot of data, they're going to be severely handicapped. And I still think the biggest industries are financial services, where there's a lot of data, both banking and insurance, and certainly investments and hedge funds and so on.

There's a lot of data in telecom. So telecom, I would say, doesn't have the same level of cultural centricity as financial services, but it's pretty powerful. And just a few weeks ago I wrote a piece about AT&T and how they have a lot of data scientists, but, quite interestingly, they have developed a very active sort of citizen data science program. They've done a lot to democratize the activity. And I think that's what many organizations need to do. If they're serious about data science, they can't just rely on a relatively small percentage of highly trained professionals.

Other industries: I recently did a session with Johnson & Johnson, and it's mostly a pharmaceutical company these days. And this was for all the people who were interested in data science; it was the first one they'd had. I've done this a lot. Even though it was the first one, they got 5,000 people to sign up.

So that tells you there's a lot of data science going on in the pharmaceutical industry, and I see it elsewhere in that industry. Manufacturing has been relatively slow to adopt data science. It's coming on with predictive maintenance and digital twins and so on, but it's still behind most industries. Some really high-tech manufacturing has a fair amount of it. I've written some things about Seagate, for example, and their use of image-analysis systems with AI to detect problems in electron microscope images. So that's quite sophisticated, but it's relatively rare, I think, in that industry.

And you're starting to see it a bit in professional services, not just in consulting to other organizations, but in audit and tax and even law. It's still early, and I think as AI enters law, you'll see a lot more of it. Mid-size companies tend to be less aggressive than either startups or really large companies, so that's another distinction to me.

Adel Nehme: Yeah, that's definitely the case, and we've seen that as well in data culture adoption from our perspective. Keeping with the theme of data culture and creating a culture of data-driven decision making, as you mentioned:

That's an obstacle that a lot of organizations have struggled with over the past decade or so as they increase adoption. Why do you think that people component and cultural component is still a major obstacle for organizations, and what have you seen to be a good pattern or success stories from organizations who've been able to overcome that hump?

Thomas Davenport: Yeah, well, I think there are two primary issues that are related. Culture tends to trickle down from the top of the organization. And if you don't have senior executives who are really committed to data science and analytics and AI as really important resources in the business, it's gonna be much harder for the rest of the organization to adopt a data-driven culture.

And a related factor is that in most organizations, we've had this feeling, there's that old phrase, you can lead a horse to water, but you can't make it drink. And I don't know if that's true in Europe; it was popular at one point in the United States. We think that because we develop these information systems and analytics and AI and so on, people will actually use them.

But that's not always true, and we invest a tiny fraction of the amount we spend on technology in cultural change and education and initiatives to create a more data-driven culture. So it's like 1% to 99%. And that's despite the fact that every year I do a survey of large companies, typically about a hundred or so, mostly in financial services but other industries as well, with NewVantage Partners. And every year the CEO with whom I analyze this, Randy Bean, asks a question: what's the primary cause of your challenges with data and analytics and AI? Is it technology, or is it human, cultural, organizational, process factors? And generally it's between 80 and 90% human, organizational, process, and culture factors. But nobody spends 80 to 90% of their budgets on those things. So there's a real imbalance between what we spend on technology and the attention that technology gets, and the attention that the cultural side gets.

Adel Nehme: In a lot of ways there's a pitfall of shiny toys here that organizations fall into. It's very easy to develop a high-quality, cool, robust predictive model that may or may not actually get deployed into production, but a lot of the time, enabling folks with low-code tools, data-driven decision making, and data literacy can get you a long way.

We've seen in a lot of ways organizations shift their priorities and create data literacy programs. Have you seen that succeed in organizations? Is that alone enough to create a data culture, or is there an additional mile that needs to be crossed there?

Thomas Davenport: Well, it's a good thing to do, but it isn't nearly enough. It should be tailored to particular parts of the business. There should be a human component, face to face probably, where you can discuss these issues, not just watch a few minutes of online video.

So, it's good, but not enough. And, you know, a friend of mine, he's recently changing jobs, but he was the head of analytics, AI, and data science at Eli Lilly, and he said these cultural change programs are really multifaceted. You have to do data literacy, and you should do it differently for different levels and different parts of the organization and so on.

But you should also have one-on-one things with senior executives. You should have communities developed around the organization. You should have behavior change programs in meetings. I always thought the best thing you can do is have somebody in a meeting say, excuse me, but do you have data to support that hypothesis?

Or, oh, by the way, you're showing a correlation, but that doesn't mean there's causality going on. That's the best thing that could happen, and maybe the hardest thing to change.

Adel Nehme: It's one thing to develop skill sets, right? But it's another thing to apply them in the flow of work, and more importantly, to develop that healthy data skepticism within the organization, where there is an open and honest conversation about the data and how you can action it in a particular business setting.

Now, let's shift gears and discuss how data science has matured across different dimensions. You mentioned earlier in our discussion how the data science role, responsibilities, and skill set have evolved as well. Maybe walk us through some of the key shifts that you've seen in the data science role in the past few years.

How has that differed since the nascent days of data science as a profession?

Thomas Davenport: Well, one big difference is that data scientists were the only role that was supposed to advance data science. And we found out that didn't work very well. These people are not unicorns. They can't do it all. They're particularly good at developing models that fit a set of data, and maybe at writing some Python code to make it all work.

But some of the other factors, not so good at. Maybe not so good at interfacing with the business and building their trust. Maybe not so good at building a machine learning infrastructure at scale. Maybe not so good at changing the organization in all the necessary ways to make effective use of their models: changing the process, changing the skills, changing the culture, et cetera. And maybe not so good at ongoing management of the system once it's been put into deployment, assuming you're lucky enough to get it into deployment. Not very many systems were; that was part of the problem.

So we've had this evolution of disaggregation of jobs. So now you have data scientists, and you have machine learning engineers, and you have data product managers, which I think are, in a way, the single most important job, because they're the ones who integrate all this stuff. You have MLOps engineers.

In some organizations you have translators, right? I think many of those translation functions can be done by data product managers. And data engineers more broadly, to take some of that data wrangling off the data scientists. You've really seen a proliferation of these jobs, and I think it's a great thing, but they have to be coordinated in the context of a particular project, and that's where the data product management role often comes into play.

Adel Nehme: So in a lot of ways, the story of the past few years have been, I've seen a debate raging on in the data science space between, is the optimal path for a data science profession. Specialization or generalization. Do you think that, as data science has matured, the path for succeeding in data science is specialization, becoming a specialist in a particular area of the data science pipeline, for example.

Thomas Davenport: Well, I think it depends to some degree on what your skills are. There certainly are some data scientists who like dealing with managers, who like overseeing a project, who understand how to create a model but realize that's only a small fraction of the job. But they're pretty rare, to be honest.

And most of the time, people got into data science because they like modeling, you know, they like coding. Those were the initial skills that were considered most important and most valuable, and it's why you needed PhDs in these quantitative disciplines in many cases. And I often say, librarians sometimes like books more than people. System developers like computers more than people. Data scientists sometimes like models more than people. So I think you have to go with the skills and inclinations that you have. In most cases, I think you're gonna be better off with that kind of specialization. And the data scientists who really like that broader range of activities can move into data product management, oversight of the data science function, and so on, where they have more managerial activities and less of the kind of day-to-day modeling stuff.

Adel Nehme: And what's interesting, speaking of skills here, is not just how the skill sets, roles, and responsibilities of data science have evolved, but also how the educational landscape within data science has evolved to unlock that specialization.

Can you shed some light maybe on the changes in the educational landscape within the data science industry and what that looks like today?

Thomas Davenport: Yeah, well, you guys are in that business, so you probably know more about it than I do. But from my perspective, being a professor, we at Babson College have a Master of Science in Business Analytics. There are literally hundreds of these programs in the United States alone.

Five years ago I wrote about this, and there were over 200 just in business schools. And there are also programs in data science and analytics and AI in engineering schools, computer science schools, et cetera. So they're really all over the place. And in many cases, I think there are two problems.

One is that it's hard for somebody who's applying to one of these programs to know, okay, well, what am I getting, mostly? Am I getting something that lets me do hardcore modeling? If so, I'd better have some pretty good statistics skills before I go in, because these are mostly one-year programs. Or am I getting something that's more about oversight of the entire process?

Babson, my school, is mostly known for entrepreneurship, and we try to take an entrepreneurial spin on business analytics, which is a little unusual but valuable if you want to go into that combination of emphases. So students don't know what they're getting. And in many cases, I don't think that one year is enough to create a great data scientist.

Somebody was telling me the other day they think what it really does is create excellent citizen data scientists, and that's probably true. And then you end up focusing on some part of the business domain, supply chain or marketing or whatever, in terms of getting a lot of expertise in that and knowing how you can use data science to advance that particular aspect of the business.

Adel Nehme: And you mentioned that a lot of these programs are great at creating citizen data scientists. We see that on the employer side as well: a lot of the time, employers have a lot of ease in getting junior data scientists and data analysts in the door when it comes to hiring and recruiting,

but struggle in getting that extremely proficient, advanced data science talent within the organization, the kind with a lot of experience in machine learning, modeling, deploying these models, et cetera. What do you think needs to change in the education industry today, or education in general, to be able to create a healthier pipeline from that junior talent to advanced talent?

Thomas Davenport: Well, it's interesting. There are a ton of master's programs but not very many PhD programs in data science in the world. So that would be one thing. I think data science is almost by definition a multidisciplinary activity, and universities don't tend to do well at that. They're good at creating statisticians or physicists or whatever, but combining a bunch of different skills, which may come from different parts of the university? Not so good, generally.

I think somebody who has a master's degree could be encouraged to go back and get more training, maybe to get a PhD or more specialized training in certain aspects of data science. I think if you are highly motivated to acquire new skills, there are tons of places where you can get them: your organization, the tons of online courses by Coursera and Udemy and so on. If you're highly motivated, you can get the education that you want, even on YouTube. But most people don't really know enough about what they want, and so I think it means that organizations need to develop categorizations and certification programs within their companies, to say, okay, here are the skills that you need to really be a level-one data scientist, and here are some places you can get them. If you wanna be a data product manager, here's what you need. If you want to be a data engineer, here's what you need.

A few companies are doing that, but there are not nearly enough out there.

Adel Nehme: Yeah, and this connects to another question that I wanted to ask. A lot of organizations have made strides in building data teams, but if you want to compare data science to a natural counterpart, which is software engineering, software engineering tends to be a lot more mature when it comes to leveling, right?

You get a junior software engineer at level one, then there's a level two, level three; you become a staff engineer after that, a distinguished engineer, et cetera, and so on and so forth, or you become a people manager. Data science doesn't necessarily have that. It's not codified across the industry. You see organizations trying to do that for their specific use cases.

What is the solution to the laddering problem in data science? Because I do see that in the long term, this can create problems where data scientists feel stagnant, depending on the type of organization they work in.

Thomas Davenport: Yeah, well, it's interesting. On the systems side, you tended to have some dominant vendors; there are a lot of Microsoft Certified Systems Engineers, for example. You don't really have that on the data science side. So it would be helpful if some vendors would step up. And there are some organizations out there.

There's one that I've done a little bit of work with called IADSS; I forget what the letters stand for. That's created a certification approach. There's an analytics certification approach that is fairly popular, called the CAP program, Certified Analytics Professional. It doesn't really deal with all aspects of data science, though. So we need more people developing certification approaches, and I think if some vendor can inject a lot of money and resources into it, that would probably help, even though it'd be great if you could open-source it, but that hasn't happened.

Adel Nehme: Okay, that's great. So we talked about data science and how it has matured a lot. But you know, your latest book, 'All-in on AI', we need to talk about that as well, especially given the many changes and movement that we're seeing in the AI space. So as data science is becoming more institutionalized, maybe AI is the new frontier for organizations.

You mentioned in the book that less than 1% of large organizations view themselves as AI driven. Maybe to first set definitions straight, what does it mean to be an AI-fueled organization?

Thomas Davenport: There's a lot of breadth and depth. The breadth is lots and lots of use cases. With Deloitte, I participated in some surveys where maybe 25% of large organizations have seven or more use cases implemented and are getting some positive outcomes out of them.

So that's still a small number. There are companies with literally thousands of use cases that have been implemented. So you need a lot, you need a variety of different types of AI technology in your organization: conventional machine learning, deep learning, even rules-based automation technologies, conversational AI, which is increasingly based on deep learning but wasn't always. So, a lot of different technologies, a lot of support from senior executives, a lot of investment in the area, a lot of talent, some sort of ethical framework to make sure your AI work doesn't go off the rails.

You need a governance structure, at least a center of excellence, to manage these activities and select tools and manage the careers of people and so on. So those are the kinds of things, and it takes a lot of money to make that work, but the companies that we found that are doing it seem to be quite successful.

Adel Nehme: Definitely takes a lot of money, takes a lot of resources. So maybe let's unpack a lot of that and start at the beginning. You discuss in the book, on how to become AI driven, that there are three strategic archetypes that AI organizations can usually adopt, right? Can you explain maybe these archetypes? If an organization is trying to get started and wants a roadmap, a framework for developing their AI strategy, what should that look like?

Thomas Davenport: And you could do more than one, but I think it's good to have the strategic discussions around the organization about, well, what are we gonna do with AI? So there's doing something new: a new idea, new strategy, new business model, important new products and services that you're going to offer.

And that to me is the most interesting one, although it can be hard to do. It requires a lot of conviction on the part of senior executives, and it requires a lot of people to understand both the business strategy and AI so they can make that connection. The second one is operational transformation: not just improving things on the margin, but making dramatic change in how you do key aspects of the operations. So one of our examples of that is Shell, where they're using things like AI image recognition to inspect all of their refineries, with drones and image recognition to identify potential problems in piping and valves and so on, so they can send a human out to look at what appears to be a problematic area.

It's just a dramatic change in how you do maintenance. And they're also doing some really interesting things with subsurface stuff, and some things, probably not enough yet for the planet, in the space of non-carbon-based energy forms: where to put charging stations, that kind of machine learning analysis.

And then the third category is really trying to change customer behavior, where you are trying to do something with AI to make life better for your customers. And that tends to be insurance companies, who used to pay you if you died or if you got sick or if you crashed your car.

But they realized, gee, wouldn't it be better if we helped our customers not die, not crash their cars, not get sick? So it's early days for that, but they're using AI and behavioral-science nudges to create healthier behaviors, better driving behaviors, et cetera. So those are the three things.

Operational transformation is the most common; changing customer behavior is probably the least common.

Adel Nehme: So maybe let's talk about operational transformation, because the Shell case you mentioned sounds like a very ambitious project, right? What would be your advice for an organization starting off and trying to prioritize use cases? Conventional wisdom always says you need to start small and iterate from there,

versus going for a big project that has a lot of complexity and requires a lot of talent, a lot of data, a lot of overhead. What would be your advice when organizations and leaders are trying to make that trade-off?

Thomas Davenport: I think you ask: what kind of opportunity is there for this? You can assess how much money you're spending in that particular area now, and how much you could save by doing it a different way. For Shell, it takes years to inspect a whole refinery with the traditional approach, and it takes days to inspect it with the AI-enabled approach.

So obviously that's a big payoff, and maintenance and inspection are an important part of keeping things going and successful. Then there's the question of how viable it is. Since some of these things take a while, you have to think: how rapidly is the technology maturing?

We talked about a company in the book that's not well known, a kind of medium-sized company approaching a billion in revenues, called CCC Intelligent Solutions. What they do is enable insurance companies to let their customers take a photo of their car collision damage and get an immediate estimate of what it's going to cost to fix it.

And they started on that well before deep-learning-based image recognition had the capability it needed, before smartphone camera resolution was at the level it needed to be, and so on. So it requires some ability to look ahead and see how these technologies are going to develop.

Adel Nehme: Okay, that's really great, and we'll definitely talk about what you think of generative AI here. But before we look forward to the future: in the book, you also talk about the human side of becoming an AI-driven organization. This connects back to our earlier conversation on data literacy and data culture.

Developing the human side is the most important aspect of succeeding in AI, not the technology side, which is something we wholeheartedly believe in at DataCamp as well. Can you explain why this is so important in the context of AI,

what an AI-driven culture looks like, and how leaders can cultivate such a culture?

Thomas Davenport: Well, traditionally an AI-driven culture required a lot of professional data scientists. But increasingly, in part because of generative AI and automated machine learning and so on, I think we're going to see an environment where it has to be a democratic pursuit, in the sense that we need tons of people doing this.

Even at Shell, I think they've now certified 5,000 of their engineers as knowledgeable in AI; they had some online programs that they recommended, and so on. Airbus did a similar thing in getting a lot of their engineers up to speed.

So I think if you're starting from scratch now and you don't have that strong human culture for AI, you're much better off thinking about how to enable a large number of people in your organization, many of whom are already employees, to acquire these skills. And the skills themselves are changing.

There aren't too many programs out there on what you need to know to be a good citizen data scientist, or how to use automated machine learning effectively. Even more, I think there are going to be machine learning programs where the front end is generative AI. Some companies have already announced them;

I'm not sure they actually exist yet. So how do you tell a generative AI system what you want out of a machine learning algorithm? How do you refine your prompts? How do you interpret the output, et cetera? We're going to need some new programs of that type as these technologies mature.
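A natural-language front end to machine learning of the kind Tom describes isn't a standard product yet, but the core idea is translating a plain-English request into a structured model specification that an AutoML tool could consume. The sketch below is purely illustrative: the function name, the keyword rules (which stand in for what an LLM would do), and the field names are all invented for this example.

```python
import json

# Hypothetical sketch of a natural-language front end to machine learning.
# In a real product an LLM would map the request to a spec; here a few
# keyword rules stand in for the model so the idea is runnable.
def request_to_spec(request: str) -> dict:
    req = request.lower()
    spec = {"task": "classification", "metric": "accuracy"}
    if "predict" in req and ("price" in req or "revenue" in req):
        spec["task"] = "regression"   # numeric target, so switch task
        spec["metric"] = "rmse"
    if "churn" in req:
        spec["target"] = "churned"    # assumed column name
    return spec

print(json.dumps(request_to_spec("Predict which customers will churn")))
# -> {"task": "classification", "metric": "accuracy", "target": "churned"}
```

The point is the shape of the workflow, not the rules themselves: the user states a goal in plain language, and the system produces something a downstream AutoML pipeline can act on, which the user can then inspect and refine, much like refining a prompt.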

Adel Nehme: Okay, I couldn't agree more. And you mentioned generative AI here, so let's dive deep into it. Of course, generative AI has garnered quite a lot of attention recently. Maybe discuss what...

Thomas Davenport: Generative AI? I haven't heard of it.

Adel Nehme: Maybe discuss what the landscape of generative AI's impact looks like for large organizations and enterprises today, and how they should think about leveraging these technologies.

Thomas Davenport: Well, it's potentially quite earth-shattering. It's quite early days, but there's a very wide range of potential use cases. Right now I think we're seeing the relatively less important ones: writing blog posts, or writing copy for ads or marketing materials. It's really easy to do that, and what's created is not going to be world class, but it's certainly going to be a good input for an expert copywriter, and it will very much improve their productivity.

I think the whole world of coding is going to change dramatically. I was talking about this yesterday with some of my friends at Deloitte, when we were together doing a presentation for a big private equity firm, and Deloitte is saying they expect improvements in coding productivity of around 50 percent.

Which means that we need far fewer programmers than we thought we needed. And I think more and more, programmers are going to be like everybody else: they're going to become editors of generative AI code rather than people who write it in the first place. Any writer is going to become an editor more than a first-draft writer.

So that's something we have to deal with. I think there are going to be a lot of applications in R&D, both in product design, in terms of image-related technologies, and in pharma: pharmaceutical companies are starting to figure out that just as generative AI can model text sequences,

it can also model protein sequences or molecule sequences and see which ones are most likely to be effective. So big changes are coming. There's a huge amount of change, I think, in professional services; law is going to change dramatically. You name it, I think there are going to be a lot of impacts.

And now, what should you be doing? Well, I think a lot of experimentation, and you probably want some production applications. I wrote a piece last week about Morgan Stanley. That big wealth management company is using it: they fine-tune trained a model on a hundred thousand of their documents, and they're using it to provide their financial advisors with high-quality answers to their questions. What are the firm's analysts saying about particular investments from a recommendation standpoint? How do I best do certain tasks, like establishing an irrevocable trust for my grandchildren? What regulatory requirements have to be met in certain domains?

A hundred thousand documents is not that many, and they have to be well curated to make sure they're accurate in everything. But I think it's going to revolutionize what we used to call knowledge management, an early enthusiasm of mine, by making it much, much easier to capture high-quality organizational knowledge.
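Morgan Stanley's actual stack isn't described in detail here beyond "fine-tuned on curated documents," but the general curate-then-query pattern behind this kind of knowledge management can be sketched minimally. Everything below, including the toy documents and the keyword-overlap scoring (real systems use fine-tuned or embedding-based models), is invented for illustration.

```python
# Minimal sketch: answer questions from a small curated "knowledge base"
# by retrieving the best-matching document. Keyword overlap stands in for
# the learned relevance scoring a production system would use.
DOCS = {
    "trusts": "An irrevocable trust generally cannot be modified after creation.",
    "ratings": "Our analysts currently rate the industrials sector as overweight.",
}

def tokenize(text):
    # Lowercase and split into a set of words, dropping question marks.
    return set(text.lower().replace("?", "").split())

def best_doc(question: str) -> str:
    q = tokenize(question)
    # Score each document by how many words it shares with the question.
    scores = {key: len(q & tokenize(text)) for key, text in DOCS.items()}
    return max(scores, key=scores.get)

print(best_doc("How do I establish an irrevocable trust?"))  # -> trusts
```

The design point Tom makes survives even in this toy: the hard work is curating accurate documents, because whatever sits on top, fine-tuning, embeddings, or simple retrieval, can only surface what the curated corpus actually contains.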

Adel Nehme: This is a very fascinating space. You've described the breadth and depth of use cases that generative AI can unlock, and in a lot of ways we're really still early. If you want a comparison, maybe the last technology with this potential for change was the invention of the iPhone, where a massive software ecosystem unlocked new types of application software.

We're still in the really early days of the generative AI space. But maybe walk us through how you think the economy will be impacted in the next few years. I'll look at it from two perspectives. A lot of people tend to feel worry and anxiety when these types of tools come in,

since they may displace or disrupt jobs, and leaders are having to create a vision for generative AI within the organization that incorporates the human side as well. What would be your advice for professionals today who feel this worry and anxiety, and what would be your advice for leaders who are trying to communicate a vision that incorporates both humans and generative AI as part of the organizational strategy and future?

Thomas Davenport: Well, I've been a pretty persistent advocate of augmentation over automation, and I've written two books on the subject: one with 29 case studies of people whose day-to-day work is augmented, called Working with AI, and another, a little more theoretical but with some examples, back in 2016.

And I didn't really suspect that large-scale automation could be coming anytime soon. I'm less sure of that now with generative AI; in content creation fields, it is just so amazingly productive and effective already. As you say, it's early days, and I do have some concerns. But I think,

for now, you're nuts if you take the output of generative AI and release it directly into the marketplace without lots of editing, trying out multiple prompts, and so on. So there's still a human component at the beginning and at the end of the generative AI process. I don't know that that will always be the case as it improves in capability, but right now it's true.

So the most important thing you can do is start to use these tools as an individual and be more productive with them, so that people can say: oh yeah, we're getting some value out of this; we didn't have to hire an additional copywriter because our existing copywriters were so productive in what they did.

It's a recessionary climate in some economies now, and we're already seeing a fair number of layoffs, so you want to make yourself as valuable an employee as possible. By all means, tell your boss what you're doing. I'm doing some work with another person on the whole idea of citizen development, and some people have used these

citizen development tools, even before generative AI, to hold down multiple remote jobs at once, since they could be so much more productive. That I do not recommend; I think in the long run it would not be a good strategy. But managers, and senior managers, should be saying: this is a very powerful tool.

Let us know what you're doing, but we think it's going to be really critical to our future, and we'd like you to experiment with it. And I think they should be forming communities of practice within the organization to ask: what have we learned about this? What works, what doesn't? Which tools work best for us?

Are there legal issues? Clearly, you want to get your general counsel and your legal staff involved for things that go out into the world in production. There are maybe some security issues, some ethical issues, et cetera. I talked a fair amount about Unilever's approach to AI ethics in my All-in on AI book, and yesterday I got a message from the woman there who focuses primarily on ethics, saying that they're quite focused on generative AI and what it means for their ethics approach.

So I think there are going to be a lot of potential ethical issues around misinformation and deepfakes, and the usual issues of transparency, bias, and so on.

Adel Nehme: Yeah, I couldn't agree more. This is definitely an interesting and exciting space, but in a lot of ways a worrying one too. Maybe, Tom, to cap off our discussion: I know it's always annoying to put someone on the spot for predictions, but given how fast the generative AI space is moving, what would be your prediction for this space 12 months from now?

Thomas Davenport: Well, I think there will be generative AI front ends to almost every software product that we use. We've already seen some announcements of these for AAU, HubSpot, Salesforce CRM, and so on. More and more companies are going to say that command-line interfaces, or even point-and-click interfaces, are going to be replaced by conversational interfaces.

We're finally going to see, I think, chatbots that are really quite useful. It's not going to happen overnight, because they're going to have to be fine-tune trained, and it's going to have to be on high-quality customer service content, which many companies haven't bothered to curate in the past.

I think almost every online activity we perform, whether search, social media, or internet e-commerce, will be mediated by a conversational front end. I think we'll still have prompts, but I suspect within 12 months we'll have some alternative to prompts, where a generative AI system leads us into creating a prompt, with a sort of intelligent front end as a prompt creator. I think there will be many different customized, fine-tuned models; millions, probably, in 12 months. Just think about the legal profession: there are already a couple of large language models trained for the legal field.

Then you think about, okay, there are differences between US law, European law, UK law. So there are going to be versions for different countries or regions. There are going to be versions for different types of law within those regions, like real estate law. And even individual law firms will say, okay:

Allen & Overy, a UK firm, we're going to have the Allen & Overy UK real estate law model; nobody else has one exactly like ours. So there are going to be lots and lots of these things. No matter what you do, say you want to take a vacation in Peru, there'll be a model custom-designed for where you go, where you stay,

and how you spend your time in Peru.

Adel Nehme: That is very interesting and fascinating. It's going to be a very interesting next 12 months and beyond. Finally, Thomas, do you have any final call to action or final words before we wrap up today's conversation?

Thomas Davenport: Well, I think the time to move on these things is now. The Morgan Stanley people have been working with OpenAI for 18 months now, and Deloitte has been working with OpenAI for 18 months on code generation. So there are some companies that are already ahead of yours in this regard.

So you don't want to just sit back and wait for things to develop; it may be a hard thing to be a fast follower on. So start experimenting aggressively, and get your content in shape. In the generative AI space, and in the AI space in general, I think you can start small, because AI tends to be small use cases, small levels of functionality, individual tasks so far.

But think big about what it's going to do for your organization and how it's going to transform it. So think big, but start small if you need to.

Adel Nehme: Okay. Thank you so much, Thomas, for coming on the podcast.

Thomas Davenport: My pleasure. Thanks for having me.
