Megan Bowers is Senior Content Manager, Digital Customer Success at Alteryx, where she develops resources for the Maveryx Community. She writes technical blogs and hosts the Alter Everything podcast, spotlighting best practices from data professionals across the industry.
Before joining Alteryx, Megan worked as a data analyst at Stanley Black & Decker, where she led ETL and dashboarding projects and trained teams on Alteryx and Power BI. Her transition into data began after earning a degree in Industrial Engineering and completing a data science bootcamp. Today, she focuses on creating accessible, high-impact content that helps data practitioners grow. Her favorite topics include switching career paths after college, building a professional brand on LinkedIn, writing technical blogs people actually want to read, and best practices in Alteryx, data visualization, and data storytelling.

Richie helps individuals and organizations get better at using data and AI. He's been a data scientist since before it was called data science, and has written two books and created many DataCamp courses on the subject. He is a host of the DataFramed podcast, and runs DataCamp's webinar program.
Key Quotes
When it comes to domain knowledge, you need to know what your business stakeholders care about: what metrics are most important to them, and why. Understanding and asking good questions like, what does it mean when this number goes way down? Or asking those probing questions to get at, why do you want this analysis?
We've seen people in Alteryx pull in LLMs to enrich or to clean up their data. Then, once that is incorporated into a fully automated workflow that you can schedule and run, I think it gets more into the AI agent category.
Key Takeaways
Focus on developing skills in advising and monitoring AI systems, as the role of data professionals is shifting towards guiding AI rather than solely coding, especially with AI's growing capability to generate code.
Prioritize the creation of a semantic layer within your organization to ensure consistent definitions of business metrics across teams, reducing confusion and improving the reliability of AI-driven insights.
Develop a habit of continuous learning by integrating small, regular learning sessions into your routine, and leverage AI tools to efficiently digest and summarize complex research papers and industry developments.
Transcript
Megan Bowers: Hi, everybody. Excited to be here for a new type of episode, for me at least. I'm Megan Bowers, and I'm a senior content manager at Alteryx and host of the Alter Everything podcast, which is a podcast about data science and analytics culture. So excited to chat about data hot topics today.
Richie Cotton: Yeah. Great to be chatting with you, Megan.
So I'm Richie. I'm a senior data evangelist at DataCamp, and I'm also a podcast host. I host the DataFramed podcast about data and AI. Great to have a collaboration with you.
Megan Bowers: Yeah, definitely.
Richie Cotton: I wanna talk a bit about changes in skills. At DataCamp we do a lot of education for people wanting to learn data skills or AI skills.
And one of the big questions we get is: AI can increasingly write good code, so do I still need to learn how to write code? Since you work for a company that makes a low-code tool, do you wanna talk about how that has changed things?
Megan Bowers: It is really fun to work at Alteryx and talk to the variety and diversity of people that use Alteryx, and the diversity of roles in the business. We see a lot of business analyst type roles jumping in with Alteryx, whether that's someone in supply chain who is getting drowned in the data and the spreadsheets and the supplier name challenges, or folks in finance who need more governability.
But yeah, one thing that I see all the time is that users come from more of a business background, the domain expertise, and they are able to pick up the data analytics with the low-code, no-code tool that we have. It's cool to watch, and it does raise the question of: do analysts need these coding skills?
And it's hard to come up with a one-size-fits-all answer, but I do feel like it's moving away from "you have to know R, you have to know Python" towards "you have to be able to pick up some of these tools," and the tools might change; you might get new software to learn. That's one of the things I've observed. And when it comes to more of the data science role, I have been reading a little more on that, and I think there are some shifts away from being as heavy on the coding, since AI can help with the coding, and towards the advising, the monitoring, the AI advisor kind of role.
I've seen writing on this, but I'm like, ooh, I don't know if I've seen those job descriptions quite yet. But I'm curious what your take is.
Richie Cotton: Yeah. I do think there's a big difference depending on what your job is. Like you mentioned, even if you never wanted to be a data person, you've gone into business or some other role, data is just invading everything.
So you do see a lot of people saying, I had no intention of being a data person, but now I'm somehow a data person. And there you really do want a lot of tooling assistance, 'cause maybe you don't care about the nitty gritty of data so much; you just want to get your job done. Whereas if you are actively a data professional, you got into this on purpose, you love working with data, then yeah, you probably want a bit more control.
And so coding is still incredibly useful in some cases. And maybe it's different between data analysts and data scientists. I think for a lot of data analytics roles, particularly business analytics, a combo like Alteryx plus a BI tool is amazing. You don't need to write code; you will solve all your problems.
But for data scientists, I think Jupyter notebooks are probably still the standard interface. And yeah, a bit of Python, a bit of SQL, that's gonna get you a long way.
Megan Bowers: Yeah. And something I think about too, seeing some of the code generation, or people using Claude or OpenAI to help with generating code:
what if it doesn't work, or you run into problems, or you get some errors, or you have issues six months down the line? You do still need to be able to troubleshoot and to diagnose. I think there's that element to it as well.
Richie Cotton: I'm right there with you. And it's one of the things we've been thinking a lot about at DataCamp as we create courses. Originally, the courses were very focused on: you need to learn the syntax. And now it's less about that, because the AI can help you with the syntax.
You need to understand the concept of what you're actually trying to do. And you also need to be able to read the code that the AI has generated and understand it, just to make sure: is this actually right or not? And to be able to debug it, 'cause if something goes wrong, you need to have a clue about, okay, what's happened here?
What do I need to do to fix it? Being able to read stuff is more important than being able to write stuff, I think, at the moment.
Megan Bowers: Yeah. Yeah. That makes a lot of sense.
Richie Cotton: Yeah. Beyond this I think you had a story about AI agents.
Megan Bowers: Yeah, I can jump in. I know we wanted to talk about AI agents and the hype versus reality.
Are we there yet? Which is a hard question to answer. I appreciate Gartner and other companies trying to survey people on this, but I did find some interesting applications in one Wall Street Journal article where they were talking about how companies are using AI agents today.
This was more toward the beginning of the year, and there was an interesting one at eBay, where they're using agents to write code and create marketing campaigns. It talked about how they have this agent framework that determines which AI models are used for which tasks. So to me, that felt like a use case at the next level.
It's not just, we have some AI models we're using; it's, we are orchestrating and deploying, sending out the right agent for the right task. I don't know exactly how it works, but that higher level of determining what is needed for which task, I thought that was interesting.
And they're using it for code as well, to generate code, improve code, things like that.
Richie Cotton: Yeah. It just seemed like there were a lot of use cases, and maybe the definition of AI agent is very broad, because it's a marketing term at the moment; everyone wants to say, yeah, we're using AI agents.
And so there's some very simple stuff where it's basically just a business process encoded in software, and maybe there's an LLM in it somewhere. And then you have all these companies promising AI workers. So Cognition Labs has Devin, its AI software engineer.
Julius AI's got an AI data scientist. Ema's got a universal AI employee, which sounds very ambitious, and I'm not quite sure these work completely yet. Like you can't quite replace humans reliably. I don't know whether you have a sense of where the cutoff point is for what's an agent that actually works?
Megan Bowers: It is tough. What you said about it sometimes being marketing: there's almost this AI agent washing, like greenwashing, where you take something that's close to an AI agent and you spin it as, we have an AI agent. So if you don't know the details, it can be tough to know
what exactly is truly agentic AI being deployed at these companies. Some of these are still maybe more in experimental phases. There was this report by Gartner predicting that 40% of agentic AI projects will be canceled by 2027, and I was like, whoa, that's a big number. I think it speaks to the early-stage-ness of it all.
Richie Cotton: Although surprisingly low, actually. Only 40%? That means 60% is actually gonna work. I feel like that's a wildly optimistic, glass-half-full take.
Megan Bowers: Okay, okay. I don't know, I was just like, oh, that's billions of dollars; 40% is a lot of dollars that are just gonna disappear because of unclear business value or risk. But I appreciate the glass half full take. Then again, if 60% of these agents work, that could translate to people losing their jobs, if the agents are like a digital workforce. It's true, 60% is pretty high.
Richie Cotton: Definitely. I think it's the very simple agents where maybe most of the action is at the moment.
Regardless, one of my colleagues on the sales team created this agent. After every sales call, the salespeople are supposed to write about the state of the deal, what happened in the call, and there's this framework called MEDDPICC, which I don't really understand.
It's a sales thing that basically says, these are the attributes, like where we're up to with the deal. And all the account executives, the business development reps, they really hate doing this, 'cause writing up what happened in the call is tedious, and it's time away from going after new deals. So there's an agent now.
It takes the transcript, it summarizes it according to the format, it uploads the details wherever they're archived, in Salesforce somewhere. It's doing a task that humans just really hate doing, and they do it badly because they hate it. And so automating that kind of task is fairly straightforward.
It works, and it doesn't even have to be perfect. It just has to be better than a human that doesn't care. That's just a really nice example. I don't know whether you've seen similar examples of simple things that you can automate.
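For illustration, here is a minimal sketch of the kind of call-summary agent described above, assuming the OpenAI Python SDK. The MEDDPICC field list and the upload_to_crm helper are hypothetical stand-ins, not DataCamp's actual implementation.

```python
# Minimal sketch of a call-summary agent: transcript in, structured
# MEDDPICC-style summary out, then archived wherever the CRM lives.
# Assumes the OpenAI Python SDK; upload_to_crm is a hypothetical stand-in.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

MEDDPICC_FIELDS = [
    "Metrics", "Economic buyer", "Decision criteria", "Decision process",
    "Paper process", "Identified pain", "Champion", "Competition",
]

def summarize_call(transcript: str) -> str:
    """Ask the model to fill in each MEDDPICC attribute from the transcript."""
    prompt = (
        "Summarize this sales call transcript. For each attribute below, "
        "write one or two sentences, or 'unknown' if it never came up:\n"
        + "\n".join(f"- {field}" for field in MEDDPICC_FIELDS)
        + f"\n\nTranscript:\n{transcript}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

def upload_to_crm(deal_id: str, summary: str) -> None:
    """Hypothetical placeholder: in practice this would call the CRM's API."""
    print(f"Archiving summary for deal {deal_id}:\n{summary}")

if __name__ == "__main__":
    with open("call_transcript.txt") as f:
        upload_to_crm("DEAL-123", summarize_call(f.read()))
```

The appeal of the pattern is exactly what Richie notes: the bar is not perfection, just being better than a human who resents the task.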
Megan Bowers: Yeah. I think it's the automation piece that moves it into agent territory.
Like, you're not the one copying and pasting the transcript, sending it to ChatGPT, getting it back, sending it out; there's that automation of the task. I feel like that is the first thing I've seen where it's just the next logical step. We've seen people in Alteryx pull in LLMs to enrich or to clean up their data, things like that. But then, once that is incorporated into a fully automated workflow that you can schedule and run, I think it gets more into that AI agent category. And I've also seen agents starting to be used in cases where you wanna get to answers, but you're trying to pull from many data sources, not just a web scrape of the internet.
You want your business sources, you want your Salesforce data in there, your customer data, all these different data sources at your company, and to have an interface where you can interact with all of that and even automate some reporting. That's where I've seen some companies moving.
Richie Cotton: Okay, yeah. I like the idea of automated data cleaning, 'cause that's the part of analytics no one really likes; no one's like, yeah, I love cleaning data. That's a great idea, just automating the business job that you don't like. And it's good for your mental health anyway, even if you don't care about the productivity benefits.
Megan Bowers: Right?
Yeah. Creating lookup tables is not good for your mental health. So being able to automate the data cleaning and swapping and all of that is much better. Same with memorizing coding syntax; people are already Googling syntax, you just can't remember it all the time.
So anyway, yeah, I like that.
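To make the LLM-for-data-cleaning idea concrete, here is a minimal sketch, again assuming the OpenAI Python SDK; the supplier names and the canonical list are invented examples, not an Alteryx feature.

```python
# Minimal sketch of LLM-assisted data cleaning: mapping messy supplier
# names onto a canonical list instead of hand-building a lookup table.
# Assumes the OpenAI Python SDK; all names here are invented examples.
import json
from openai import OpenAI

client = OpenAI()

CANONICAL_SUPPLIERS = ["Acme Corp", "Globex", "Initech"]

def standardize(messy_names: list[str]) -> dict:
    """Return a {messy_name: canonical_name} mapping suggested by the model."""
    prompt = (
        f"Map each of these supplier names {messy_names} onto the closest "
        f"canonical name from {CANONICAL_SUPPLIERS}. Reply with a JSON object "
        "keyed by the original name; use null when nothing matches."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},
    )
    return json.loads(response.choices[0].message.content)

print(standardize(["ACME corp.", "Acme Corporation", "Global Exports Ltd"]))
# Once a step like this runs on a schedule inside an automated workflow,
# keep spot checks in place: LLM matching is probabilistic, not a lookup.
```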
Richie Cotton: Absolutely. And I guess maybe one of the bigger developments over the last few months is that these deep research tools are getting quite popular, and they seem to be getting to the point where they actually do what they promise to do, which is just think a bit more carefully.
And so you can have slightly more complex reasoning. So if you build these into agents, I feel like that's the next step: having agents that can think a little and make smart decisions. I don't know, have you played around with the deep research tools at all? Have you seen whether they work or not?
Megan Bowers: I've seen some colleagues play around with them and show some things where, say they have a new prospect or a new customer, and they really wanna know all about that business so they can help it with our software. And the amount of information, a 12-to-20-page research report on,
this is the business, these are the challenges. Some really interesting information, and I was impressed by the volume of it all, as well as the amount of time that ChatGPT took to think about all of this; this was through ChatGPT. I love using ChatGPT
for research, but I haven't played around as much myself with the deep research mode. What about you? Have you seen some interesting results from that?
Richie Cotton: Yeah, so it does seem to be one of those things where the theoretically really good use case is doing customer research or competitive research, where you're like,
I don't really want to go and read through 20 blog posts from this competitor; I just wanna see what they're doing. But then, yeah, the deep research tools, they go away, they spend like 10 minutes, they come back with a 20-page report, and you're like, do I wanna read 20 pages on this?
Then you have to use AI again to summarize what the deep research tool did.
Megan Bowers: That occurred to me when this 20-page document was shown. I was like, woo, gonna need an executive summary.
Richie Cotton: Yeah, so maybe it's a work in progress. I feel like at some point it's probably gonna get very useful, but for now it's use with caution, and you really do have to spend quite a lot of effort, I think, just on the prompt engineering to make sure you're getting exactly what you want there.
Megan Bowers: And something that made me think, when you said the next step is maybe incorporating this mode into AI agents: it is more resource intensive. If there are already issues with really defining the true business use case and the true ROI of an AI agent, and then you put in this costly, resource-heavy computing with the deep research mode, it's like, oh yeah, definitely proceed with caution, in my mind.
You definitely would wanna work your way up to incorporating a model like that, to make sure that you truly need that power and that you're not just doing advanced research for the sake of advanced research. That's just something that comes to mind.
Richie Cotton: Absolutely. You can research things forever, but unless you actually act on it, or you've got the right information in there, it's just a sort of waste of everyone's time, I suppose.
Megan Bowers: We're talking about a lot of AI applications, and there have been headlines around companies saying they're an AI-first company. So I'm curious what your take is on that, on what that really means.
Richie Cotton: Oh yeah, I have very mixed opinions on this. I think on the positive side: in every business, there are gonna be lots of use cases of AI that people have not yet thought of, and it's important for CEOs to motivate people to be like, yes, let's go and see what we can do here. The cynical side of me says this is just a polite way of saying we wanna do a hiring freeze, or we wanna make layoffs,
and a way to make that more publicly acceptable. So there are definitely gonna be some sketchier cases where this is just a PR front around layoffs or hiring freezes. But in general, I think it is good for almost every business to start thinking, in fact not just about AI but in general, about how you can go about re-engineering your processes, because tooling has got so much better for automating things with software, doing new things with LLMs, doing new things with computer vision.
It just means there are a lot of use cases for making your processes better. And as you start thinking about changing your processes, you might go, okay, we don't actually even need a technological fix; it's something simple like, oh, this workflow is stupid, let's do things a different way, and we're gonna see some improvements there.
Megan Bowers: Yeah, definitely. And I think that I can also be a little on the cynical side with the top-down mandates, the "we are AI first." And it's interesting to see: there was one example with the company Klarna, where they went AI first for their support service chatbots. It went from humans to AI, and then they ended up rolling that back and starting to hire back some humans, because the customer feedback just wasn't there. They were getting lower satisfaction scores.
I think that, at the end of the day, the AI that they implemented wasn't truly what the customers wanted. And I think that's a good thing to keep in mind for all these companies saying AI first. I think it's important, whatever business unit you're in, wherever you are, to start considering:
how could I use AI to make this better? But then, if you're putting it in your products, your offerings, your services: okay, is this truly what the customer wants? Does this really solve their problem? Does this really improve their experience? I think that is gonna be a more important question moving forward.
Richie Cotton: Yeah, definitely. I do find that Klarna case fascinating, because with everyone now saying we're going AI first, Klarna started this process back in 2023; they're very much ahead of the curve. And I think they did have some successes. They had this chatbot,
I think a GPT-powered chatbot, that was hooked up to their recommendation engine, giving product recommendations, and that kind of worked. The area where it didn't work so well was trying to completely automate customer support. So I think it's good to think carefully about where you still want humans in the process, and whether there are any absolute limits, like, we will not get rid of these humans, as you're starting to think about incorporating AI.
Megan Bowers: Yeah, definitely. Where's the human in the loop? Where is the review point? No matter what process you're looking at improving with AI.
Richie Cotton: Yeah, definitely. Actually, as I say that out loud, you're probably not gonna know that in advance, right? You probably just have to do some experiments and say, okay, maybe we took it too far, and we roll it back, like Klarna. You can always undo these kinds of decisions, right?
You can always hire more people. Not the most efficient way of doing things, but yeah.
Megan Bowers: Not without some reputation damage, in this case, but yeah, you're right. You have to have some sort of testing, at least to get a sense of where you might need review points. But in my mind, ideally that testing is done as a beta test or internal test or whatever, before you launch it out to all your customers.
Richie Cotton: Yeah, absolutely. You're probably gonna want to trial the AI-first process at least a little bit and see whether your customers hate it or not before you roll it out to everyone. Okay, the next thing to talk about is changes in data roles. For a long time there was this idea that data scientist was the sexiest job of the 21st century. I really hated that phrasing, actually, 'cause it's not really a good adjective for most data science work. Spending all day swearing at a Jupyter notebook is not really that sexy.
But anyway, data engineering has arguably claimed that crown; data engineering is super hot at the moment. And then you've also got analytics engineering, a more recent role, as well. So yeah, let's talk about those roles. Do you wanna talk us through how you feel about them?
Megan Bowers: I feel like the focus on data engineering and analytics engineering is all about getting good, clean data into pipelines so that it can be used for AI. So it makes sense to me that as we start to automate maybe some of the modeling, or get AI-assisted modeling and AI data vis and all this stuff, it's going to be even more important that we have reliable data,
'cause if you feed bad-quality data through these AI systems, you're left with something even worse. That's been a theme on our podcast: the importance of good data, of the right data for your business, all of those aspects. And obviously there are data scientists with huge, really broad roles who are kind of already doing this, and maybe they will just shift their focus a little bit more, or we'll see
job descriptions that shift more into the data preparation and data pipeline element. But yeah, that's what I've been seeing: the importance of having the right data set up for these AI models so they can actually get the value out of it. And I think that will continue to shift
what is wanted in the market for data-type roles. What have you been seeing?
Richie Cotton: Yeah, it is really interesting, the idea that sometimes you really want high quality control in your analysis. And I forget that with a lot of analytics, maybe half the job is very much ad hoc analytics. The stuff I do is like trying to work out which podcast episode is
more popular: very little quality control needed, it's just a quick look at the data. But my colleagues in the finance department, where it's, okay, we need to know exactly how much ARR the company has, how much cash flow we have: they do paranoid data science. Everything has gotta be absolutely spot on, the quality controls are there, and you need to have those pipelines in place.
And that's where the data engineering and the analytics engineering, making sure every i is dotted and every t is crossed, become incredibly important, I think.
Megan Bowers: Yeah, definitely. It's important to have that trust, and the quality control you were talking about.
Definitely.
Richie Cotton: Absolutely. So actually, with analytics engineering, I think it's incredibly interesting, because a lot of the job is that you've got different teams having different definitions of things, and their job is to synchronize that. A common one: maybe one team says, okay, anyone who has registered on our website, they're a customer.
And another team might say, okay, they actually have to buy something and pay us some money in the last financial year to count as a customer. Different teams having different definitions of stuff causes a whole lot of confusion, and one of the analytics engineer's jobs is to create this semantic layer: proper definitions of any business metrics, used by all teams.
And I love that kind of synchronization, just reducing confusion throughout your business, which is incredibly important, particularly as you scale your business and you've got more teams in different locations.
Megan Bowers: Yeah, I think that's awesome. I think that is an age-old problem: people in one business unit mean one thing by this metric, but someone else in another business unit will see it on a dashboard and freak out.
So I really love having that business layer. There have been data dictionaries, that layer to describe, okay, these are the rows and columns you're looking at; but having that for the business, I think, is really important. And I guess that transitions into what I wanted to talk about next, because it gets into domain knowledge: what happens with the domain knowledge, the business knowledge, as the data space moves into more AI power, more automation?
Richie Cotton: Oh yeah, absolutely. Because just thinking about it: if you've got seven different definitions of what a customer is, and you're asking your AI chatbot, how many customers do we have, it doesn't know which one to use. You're gonna get different answers at different times; it's probabilistic, or a little bit random.
So you really want that single definition. You need that semantic layer, that single definition of any given business metric. Otherwise the AI is not gonna be able to give you a good, consistent answer.
Megan Bowers: Yeah. Definitely.
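To make the semantic layer idea concrete, here is a minimal sketch in plain Python. Real implementations usually live in tools like dbt's semantic layer, and the metric definitions below are invented examples.

```python
# Minimal sketch of a semantic layer: one shared, named definition per
# business metric, so every team (and every AI chatbot) resolves
# "customers" to the same query. Definitions are invented examples.
SEMANTIC_LAYER = {
    "registered_users": "SELECT COUNT(*) FROM users",
    "customers": (
        "SELECT COUNT(DISTINCT user_id) FROM orders "
        "WHERE paid_at >= DATE '2024-07-01'"  # paid in the current financial year
    ),
}

def resolve_metric(name: str) -> str:
    """Look up the single agreed definition; fail loudly on unknown metrics."""
    if name not in SEMANTIC_LAYER:
        raise KeyError(
            f"No agreed definition for {name!r}; add it to the semantic layer "
            "rather than letting each team improvise one."
        )
    return SEMANTIC_LAYER[name]

# A chatbot asked "how many customers do we have?" runs this one query every
# time, instead of picking among seven team-specific definitions at random.
print(resolve_metric("customers"))
```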
Richie Cotton: Yeah. And you were talking about the importance of domain knowledge.
Do you have a sense of what data practitioners need to know about their domain or their business?
Megan Bowers: Yeah. It's a big question,
Richie Cotton: right?
Megan Bowers: It's a big question. That's a good question. I'm gonna go with the classic: it depends. I think it can really depend on whether the data practitioners are in a center of excellence function, where
they are the data analytics, data engineering, whatever team, servicing a bunch of different teams, versus being a finance analyst, or the analytics person in the marketing department. I think the threshold for that second type of role is definitely higher;
you're expected to have more of the domain knowledge. Overall, though, the biggest thing for me when it comes to domain knowledge is you need to know what your business stakeholders care about: what metrics are most important to them, and why. Understanding and asking good questions like, what does it mean when this number goes way down?
Or asking those probing questions to get at, why do you want this analysis? Because sometimes stakeholders will come in thinking they want one thing, and as you dive deeper you realize, oh, I know about the data at this company; there's actually more data we could use to really get you some more insights.
But they don't know, 'cause they're focused on their own roles. So asking those questions, and having a feedback loop with the business side, no matter how embedded your role is, I think is just a crucial skill, especially if some of the technology skill pieces might be automated.
Richie Cotton: I do love the idea that it's incredibly important to be able to translate between business questions and data questions, going backwards and forwards between the two. I think that's a pretty timeless skill, so definitely one to work on for a lot of people. It was definitely interesting
that you talked about the difference between being in a central data team and being embedded as a data practitioner within a commercial team or some other team. I think that's definitely a trend I've seen: data analysts and data scientists are increasingly becoming embedded.
So unless you're at a large company, it's fairly rare to have analysts in central data teams. More often it's, my job title isn't data analyst, it's sales analyst, 'cause I'm in a sales team; or I'm a marketing analyst, because I'm in the marketing team; or I'm doing data plus product, and I'm in a product team. I dunno whether that's something you've seen as well.
And do you think that's changing the nature of the data analyst role?
Megan Bowers: I do see that at smaller organizations. In my mind, the centralized function, where they're taking in tickets and managing things like that, can maybe be a more mature kind of data
structure, but I'm not sure. I do see a lot of embedded analysts. And, going back to the very beginning of the episode, I see people whose teams need an analyst, and they step up and become an analyst even though that's not their background. An interesting conversation
a few months back on the podcast was with an author who talks about this role of citizen data scientist. It's a newer industry term that I think is really interesting: those people who are almost 50% business, 50% data scientist, and they're able to do that because of the low-code, no-code
tools that have arisen. And going back to what you said about there being data everywhere: no matter what role you're in, data's gonna find its way into your work. So there's some evolution of these roles, and I'm curious to see how that'll continue to play out.
It's hard to know.
Richie Cotton: Yeah, definitely. I have to say I'm totally there with you that, even if your background isn't data, a lot of the tooling is there now, so it's very easy to do at least some analysis, even if being a data analyst isn't your whole job. Just picking up some simple Alteryx workflows, drawing some
dashboards in Power BI or Tableau or Looker or whatever the platform is; these aren't terribly difficult skills to learn. So it's something that you can do in addition to having your sales, marketing, product, whatever knowledge. You can do it as part of your job rather than it being the full-time everything.
Megan Bowers: Yeah. But then I do think it gets a little trickier as you move into the more traditionally technical data science and data engineering. What is that? Are there citizen versions of that? That can get a little harder to balance with the business side, probably.
Richie Cotton: Yeah, definitely.
I think once you get into, well, I'm trying to think of what's a more difficult thing. Maybe time series forecasting: it's a very useful thing for business analytics, but it's also a pretty technical skill; you're gonna have to spend a lot more time learning it. The thing with statistical stuff, like hypothesis testing, is that it's one of those things where, when people take courses on DataCamp, we have to push it further back in the curriculum, 'cause it's the first thing that's really conceptually tricky. Most people get the hang of drawing plots pretty quickly.
And manipulating data is fairly easy to learn for most people; you've just gotta spend the time. But then statistics, you really gotta stop and think carefully, 'cause a lot of it's counterintuitive. There are definitely some conceptually harder things that maybe you do have to put in a lot of time to do well, although maybe AI will make it easier for everyone there.
Megan Bowers: Maybe it will. Yeah, it's a crazy time. Even just preparing for this episode, looking at all the articles out there on AI this, AI-first that, these companies are doing this, this is not getting value, this is getting value; sometimes I feel like I can't keep up a little bit.
It's challenging to keep up with the pace of all this sometimes.
Richie Cotton: Absolutely. Yeah, I think it's you and 8 billion other people trying to figure out how to keep up. Continuing to learn stuff, especially when you've got a day job and you've got deadlines, is difficult, and part of it's just about
building that habit of spending time to learn. So whether you block off half an hour on a Friday afternoon to just sit and read stuff, or you try and make it 10 minutes on your commute every day, building that habit of learning is incredibly important.
I don't know whether you have any tips for how to build learning into your day-to-day life.
Megan Bowers: I think it is important to incorporate it where you can, in small bits. And for me it's been important, when I do have a little bit of extra time, to take a bit of time to experiment. If I hear about a new tool, if I hear about somebody else's use case,
I do the learn-by-doing, at least with generative AI. For me, that's been helpful: to play around with it and see it in action. But yeah, for people who wanna learn more, the DataCamp courses I think could be super useful to get into that.
Richie Cotton: Absolutely. And one of the things I've found especially helpful: there are all these papers on arXiv, like new technology developments, and frankly, I can't be bothered to read them all. So it's, okay, download the PDF, throw it into one of these AI chats: can you just give me the executive summary, maybe a bit more detailed than the abstract, and then just ask a few simple questions to try and understand what's going on there.
I've found that incredibly helpful.
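As a minimal sketch of that habit, assuming the requests, pypdf, and openai packages; the arXiv URL is just an arbitrary example, and the truncation is a crude guard against overlong prompts.

```python
# Minimal sketch of "download the PDF, throw it into an AI chat":
# fetch a paper, extract its text, ask for an executive summary.
# Assumes requests, pypdf, and the OpenAI SDK; the URL is an arbitrary example.
import io

import requests
from openai import OpenAI
from pypdf import PdfReader

client = OpenAI()

pdf_bytes = requests.get("https://arxiv.org/pdf/2106.09685").content
reader = PdfReader(io.BytesIO(pdf_bytes))
text = "\n".join(page.extract_text() or "" for page in reader.pages)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{
        "role": "user",
        "content": "Give me an executive summary of this paper, a bit more "
                   "detailed than the abstract:\n\n" + text[:50_000],  # crude length cap
    }],
)
print(response.choices[0].message.content)
```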
Megan Bowers: Yeah, that makes me think maybe the logical next step for this podcast is creating an AI agent that sends you automated summaries of data hot topics, if we go through the next steps of what AI agents could do. But anyway, that's more of a silly example.
Richie Cotton: Beyond this, I have to say, with my LinkedIn feed, I follow a lot of AI influencers, and so it's very difficult to go on LinkedIn without seeing someone talking about whatever the hot new thing is. It is definitely fairly easy to passively consume this stuff when it's being thrust at you.
Megan Bowers: Yeah. Definitely.
Richie Cotton: Do you have any like final advice for people who want to build their data or AI careers?
Megan Bowers: What's interesting is that typically on our podcast, it's advice for people getting into data. And for that, my biggest thing is: find the data where you are in your role now and start experimenting, start trying to use these new tools, whether that looks like automating your Excel analysis, downloading a
visualization tool, or experimenting more with an LLM. I think that's a great place to start. In terms of building an already thriving career in data as we see more AI, I do think a piece of it is trying to keep up with some of these announcements, and understanding, when certain new technologies come out, whether that could be transformative for your work. And instead of shying away from, oh gosh, that tool could maybe replace my job in five years, leaning into: how could I use some of the stuff I have now to show and demonstrate that I am automating and I am improving with AI, and be the person that people start coming to at your company when they have AI questions or use cases or ideas. If you can get to that more-of-an-AI-advisor role, whether unofficially or officially, I think that's huge. How about you? What is your advice?
Richie Cotton: Yeah, I love that idea of getting to the point where other people ask you for advice, just by understanding use cases and having at least the first clue about what to do next, developing an intuition for what's sensible and what's not. For me, a lot of the stuff we talked about on the show today, the idea of having a mix of technical skills and business skills, and beyond that the soft skills, the communication skills, a sense of ownership, all this kind of stuff: that's a package that's really gonna help your career, I think. And beyond that, I guess the one thing we've not really talked about is the idea that everyone's going AI first, and it's all about process engineering.
So understanding process analytics, that's the one area of analytics I think is dramatically underrated. No one seems to care about it, but it's the future.
Megan Bowers: Yeah. That's awesome. Thanks so much for this conversation. This was super fun.
Richie Cotton: Absolutely. I really enjoyed chatting with you, Megan.