Introducing Sharon Castillo
Richie Cotton: Hi, this is Richie. Welcome to DataFramed. There's a fairly universal agreement that effectively using data is a huge competitive advantage for any organization, but while almost everyone at least says they're making an effort to get better at working with data, not every organization is doing that. A large part of the success of any data transformation or digital transformation program is getting the training part right. And data education is of course very close to our hearts at DataCamp. Um, naturally I've developed a lot of opinions on the subject, but I also find it very interesting to hear how other companies think about data training.
So in this episode, I'm chatting with Sharon Castillo, who at the time of recording is the Vice President of Global Education at DataRobot. Sharon's been involved in e-learning for two decades and is a true expert in the subject.
Hi Sharon. Thank you for taking the time to chat. I'm excited to hear your thoughts on who needs data training and how you roll out a training program for an organization.
Sharon Castillo: Well, thank you for having me.
Who needs Data Training?
Richie Cotton: I guess to begin with, one really common question we tend to get at DataCamp is, well, which of my colleagues actually need data training? So perhaps you can shed a little bit of light on that.
Sharon Castillo: Well, the longer I've been in business, the more I realize how everybody needs data training, because people have a tendency to use anecdotal evidence, and when you start to look into the data, it's astounding all the things that you can learn. And so in my organization, even when we've had interns, the insights that they drive in my own business are remarkable. They say, “Hey, look at some of this data and look at what's happening here,” and point out things that I never knew, just by getting an understanding of the data in our business.
So the question isn't who needs the training, but what kind of training do they need? You know, some people need more. Some people need different kinds of data training. So that's more where I would start.
Richie Cotton: Okay, so that's a kind of really interesting question about who needs what sort of training. So maybe you could give a few examples of that just for different roles or organizations.
Sharon Castillo: So some of it is looking at the benchmarks of what you need and the key performance indicators. Everybody in my team needs to know what's the business we're in and what are we driving at, and then what business do we report up to and what are they driving at? Because I'd like to think we're all about training, but we're really not. We're about customer retention, and they're not really just about customer retention, they're about the larger goals of our business. So you have to really understand, all the way up the chain, what your business is about. But beyond that, the more senior people on the team need to dig into sort of narrow pieces of it and really pull it apart and say, Gee, if we did things differently, maybe this is a smarter approach. Or, gee, if you take something that seems unrelated to something else, maybe there's a pattern there, and maybe the offer we have today isn't the smartest way of doing something. So those are all different things to look at.
Mapping Data Training to KPIs
Richie Cotton: Okay, so I really like that idea of matching the training needs back to your KPIs or business goals. Do you have any specific examples of how getting some data training or some data skills might lead back to one of these KPIs or goals?
Sharon Castillo: So we had somebody on our team who started using a particular tool where they looked at what users did after training, inside our products. And so we followed the path of what users did two weeks later, four weeks later, a month later, and two months later to see if we could find patterns. So that person needed training on that tool, but they also needed training on what good looks like using our product. And they needed training on what customers who were doing well looked like, and what the differentiators were between people who were doing well and not doing well. Then that person could start to see patterns of people who went to training versus people who didn't, and people who went to certain kinds of training. And so we started to pivot some of the training offerings to focus on areas that might have more impact.
Richie Cotton: Okay. I wanna come back to impact in a moment. Cause that's a really important topic, is to try to sort of evaluate what's the actual benefit from training. But if you're saying, Well, everyone needs some level of data training, I'm wondering, how do you decide who goes first? So maybe if you don't have the budget to go and give everyone a load of data training, how do you prioritize who needs what?
Sharon Castillo: So not everybody needs tools or training. Some of our training is informal. Every quarter I have a meeting with my team and I say, Here are our KPIs, and here's how we did as a team. And somebody who develops training still learns about delivery, and somebody who delivers training learns about development metrics, and they go to meetings about customers and the data about our customers.
So there's informal training. And then there are the people who come to me and say, I think I can dig in here or dig in there, but I don't know how to use whatever kinds of tools they're gonna use: BI tools, or in our case, we're lucky we have DataRobot, so, you know, learn our product, feed all the data into the product, and dig into it. Fortunately, those people are also trainers, so they learn it really well.
Laddering Training Objectives
Richie Cotton: I suppose, yeah. People who have done some training maybe have an advantage in terms of knowing how to learn things. Going back to this idea that sort of different people need different things that they have to learn. I'm interested in how you figure out what order people have to learn things in and how you go from just learning individual things to having a whole learning path for people to get to their training goals.
Sharon Castillo: Yep. So you have to look at the journey, and it's more complicated if you're in a more emerging technology area. I've been in several: when SaaS became a thing, when e-commerce became a thing, when cloud management became a thing. Each time you have to pick apart: what do people need to know? In different, I hate to use the word roles, I'll get back to that in a second. But what do they need to know to do their job? What's the basic foundational stuff that everybody needs to know, like how does this technology work? Then layer on the skills that everybody needs to know, and then what are the specialized skills? And then the learning path has to include
preassessment, so that people can figure out where they are, so you don't waste their time on the foundational skills and you meet them where they're at. Cause that's one of the problems with people not wanting to send people to training: Oh, I know that. And then they don't want to go to the whole thing, because they think they're wasting time.
In fact, they end up missing a good chunk of things that they don't know, because maybe they know a third of it or half of it, but then they jump past the half they don't know. And then the other problem is the roles, because there's a lot of overlap, or the titles are wrong. Titles don't mean anything. If you work at a tiny company and you're called a Data Analyst or a Business Analyst or a Data Engineer or a Data Scientist, that means a lot of things in a lot of places.
It could encompass a lot of things. It could be a very narrow thing. It could be someone who's fairly junior. It could be somebody very senior, you know, it could be somebody who's just finished a bootcamp, or somebody who finished a bootcamp or master's degree or something 10 years ago and has lots of industry experience.
It's a lot of things. So the title doesn't really mean anything. One of the things we found by using data is that we had roles, and some people would take the training for every role, and about a third of the way through the second role's course, they'd realize how much overlap there was and bail out of the second course, because they're like, Oh my God, I already saw that. So instead, we've really pivoted toward mapping skills rather than roles.
Ignoring Job Titles in Mapping Skills
Richie Cotton: This is really fascinating, and I have to say, the idea that job titles are sort of nonsense for trying to work out training needs. For a while, I had my own consultancy and I was like a one-person band, so I called myself CEO. But I don't have CEO skills in general, so it does really resonate, the idea that job titles are nonsense. So I'm interested in this idea of skill mapping that you talk about. How do you figure out what content is related to particular skills?
Sharon Castillo: So we work cross functionally a lot. I'm also not a fan of just focusing on features, so some software companies get obsessed about features. Features don't make customers successful. Customers don't care about a feature. They care about solving their business problem, and so we look at success for customers.
And we work with customer success. We work with a lot of data scientists. Our company has hundreds of data scientists, and we talk to them and we say, You know, what are the key things that make people successful? Where are the gaps? Where do they stumble along the way? And then we map those things out, and we don't always get it right. Sometimes you have to iterate, go back, and fill those things in.
Richie Cotton: Okay. Can you maybe give some examples of specific data skills? I'm hesitant because I said the words data scientist just now, cause I know there's a role there and you said you're not keen on roles. But can you talk about maybe some of the skills that you think are applicable for a data scientist role?
Sharon Castillo: The biggest one I would say is asking the right questions. So how do you frame a problem? How do you pick the right problem? How do you know the business, the data of that business? What's important data? What's not important data, like what's just sort of tangential and it's interesting, but it's not gonna get you to your outcome.
And then you really have to dig deep and become a subject matter expert of your data, of your processes, and look at your outcomes. If you work for a public company, read the stuff they put out every quarter that says what's important to the business, and align everything you're working on up to that.
And then start small. Start to figure out what problems can I attack and how do I prioritize those? That's more important than any cool tool or algorithm. And then you have to start to learn how to see, is your data any good? Is it biased?
Richie Cotton: Okay, so you mentioned maybe the most important thing, or perhaps a good place to start is understanding how to ask a sensible question about your data. How do you go about learning that? Cause that seems like a lot of skills in itself.
Sharon Castillo: That's a lot of skills. I'm not a data scientist. I talk to my data scientists about, you know, where to dig and what's reasonable. I once worked for a financial management company and they were having an issue, and I was learning about mutual fund expenses. I knew nothing about mutual fund expenses, but, like, for a certain size mutual fund, is this fee reasonable?
And so if you know your business: is this particular thing reasonable? Does it make sense? Is this how you would calculate this? That's where I would start to dig in. Is this really how somebody goes about it? If you took away the computer and you had to figure something out, what are the steps that, at the basic level, have to happen? And then start digging into each of those gates as to what happens, you know?
People Skill Audit
Richie Cotton: And just continuing this sort of theme of trying to figure out what skills people need on an organizational level. So you mentioned it's sometimes quite hard to work out where people need to start training, like what they know already and what's the first course, or what's the first bit of training they need to take. So how do you go about sort of auditing what skills people have to begin with?
Sharon Castillo: Yep. So that's a challenge when you work for a software company, because our job is to train people on our software. And sometimes people come without the skills of the things that came before it or the things that sit next to it. So if they wanna integrate with somebody else's software and they don't know the other software, or they come to us and they wanna use our API and they don't know Python. You don't need to know Python to use our product, by the way. But say they wanted to do that.
You know, are we gonna teach you Python? Are we gonna teach you how our API works? Most of the people who come to us don't want us to teach them that, but there are some people who do. There are some people who come to us who don't know what AI is. It's a black box, it's magic, and they have no idea what the difference between AI and machine learning is. So you walk that fine line all the time of where do we start? Do you refer them to other things? But you also have to help people assess themselves as to where their base level is, and do we need something supplemental to help them, which we can do, or can they start right at the entry level of the product?
Richie Cotton: That's interesting. This is also a problem we face at DataCamp: trying to make sure that all of our learners are aware of which courses are gonna be suitable; which are gonna be too hard or too easy. So you mentioned the idea of being able to self-assess, and DataCamp has skill assessments, but I'm curious as to what DataRobot's take is on letting users self-assess in order to find out what content is appropriate.
Sharon Castillo: Yeah, so we're just starting this. Descriptions in our courses are short, and we're making them shorter all the time so that there's not a lot of time investment. If you get into a self-paced course and it's 30 minutes long and you get to the first bit and you say, Gee, I need the basic stuff before it, there's already a little place that you can click out to.
I call 'em on-ramps and off-ramps, so there's always now an on-ramp and an off-ramp for every new joiner, both instructor-led and self-paced.
Richie Cotton: Can you just tell us a bit more about that? What do you mean by an on-ramp or an off-ramp?
Sharon Castillo: So an off-ramp is: if you make courses shorter, then there's always a place for somebody to go. Instead of saying you need three days of training, if each class is like an hour and a half, two hours long, the off-ramp is: next steps could be one of the following, depending on what your interests are. And we always give people a map of what that is gonna look like.
The on-ramp is where they came from. So when we design, we assume they came from one of the following places, and one of those could be somewhere else. I could assume somebody went to DataCamp. I could assume that somebody came from a sales demo and bypassed the little intro course we have, because they said, Oh, I saw the sales demo, I don't need the intro course. So then what supplemental things do I need to give them, a sort of sidebar, so that they can continue on? We are starting to build in, though, a process for people to take a couple of questions, one for each objective in the course. They're adults, so we're not gonna say, You can't take this course if you haven't passed this.
But here, take this thing; this is what this course is about. If you do well on this assessment and you feel comfortable with this material, feel free to move on to the next thing. If you pass this assessment and still don't feel comfortable, go ahead and take it. And if you don't pass this assessment and you don't feel comfortable, this is what's gonna be in here, so you might wanna continue on.
Richie Cotton: So basically you take an assessment, and that's gonna help you decide, okay, do you want to do this course or not. It sounds straightforward, at least in theory, doesn't it? I know for a fact there's a lot of technical details involved in this, but that's a good theory.
I'm curious, if you think about whole teams needing training, what happens if some people have more advanced skills than others? Everyone's coming from a different position, so they need different levels of things. How do you reconcile differences in skill?
Sharon Castillo: It happens all the time. Some teams like to go through as a cohort; some teams go through as individuals. So if you go through as a cohort, there's a couple of ways you can do it. You could have tracks in the cohort where there are meetup points, and each person has their own individual things they work on.
And then there's either a mentor or a meetup, or in our case we have a data scientist who has sort of an enablement session. People can work on what they're working on. But then they come together and they say, This is how it applies to your use case. And then everybody sort of is together and thinking together about how to work together as a team.
Cuz they have to work together as a team to make this thing come to life. But different groups do different things. Some of them do hackathons at the end together but go through the training separately. In some, the more advanced people still go through class with the less advanced, which works out great in some teams. In other teams it's very intimidating to the people who are less experienced. Depends on the culture of the company.
Learning Formats - Pros & Cons
Richie Cotton: I can imagine the social dynamics are complicated. So if you've got some people who are very advanced and some people who aren't, if the person who's advanced has got a sort of mentor personality, it's gonna work well, but if they're just competitive, then it's gonna be a disaster. Maybe that's just the answer, right?
It's just you have to think about the human aspect and who's actually involved in this. That's absolutely fascinating. I just wanna segue a little bit into talking about different formats for training. So DataCamp has always been very much focused on the sort of learn-at-your-own-pace online training, but DataRobot originally started out doing more live training than online training, and I'd love to hear your thoughts on what you see as the pros and cons of the different formats.
Sharon Castillo: Yep. So the big con of live training is Covid. I started at DataRobot three weeks before lockdown, when I told my team, You've got a week to pivot. But besides that, there's a lot of overhead and lag in doing live training. Live training happens a lot at companies when they're starting up, when you have customers who are in an emerging market who have low maturity in the whole technology.
And there's a lot of nuance to their questions. Sticking 'em in front of self-paced, when you don't really know what their questions are and you don't really know what their journey is, is not gonna get them to where they need to go, cuz you don't know where they start. So in those circumstances, typically small companies don't even really have trainers.
Typically they use support. They use customer success people; they sometimes use engineers. It's not even a formal trainer. They might have one, but they don't necessarily believe in customer education as a professional entity. And as they get larger, then they start to think about, Gee, we really should have professional training and professionals who understand education and all of that.
Other companies with more straightforward products start in the other direction and start with videos. You can use video more like a sort of more extensive documentation, but more engaging and more designed around education. So it's just a different starting point. And then virtual training has really changed during Covid.
I'd already done it in a lot of places for a long time, but the thing that changed in Covid is that students used to be in the office, globally, and we would teach and they were in their office doing it. Now we see a lot of people who are, like, on the east coast of the United States but taking a class in APAC, and they do it because they want the flexibility.
Nobody's bothering them on Slack. Nobody's bothering them on email because it's late at night. They can have dinner with their family, they can do their thing. It's quiet. They can focus on the training. There's nothing else going on. And some people really find that to be a more effective time if they're gonna do live training.
Richie Cotton: That's really interesting, just the idea that people would want to do training at midnight or whatever. I'm not sure how well my brain would work at that time, but I guess it does work for some people. So, yeah, I'd love to hear more of your thoughts on scheduling training, cuz this is a really common problem: people say, Well, I've got a full-time job, I've got deadlines to hit. When do I find time to learn?
Sharon Castillo: There's two things about that. One, people are zoomed out, so courses have to be shorter; they have to be more engaging. You can't lecture at people for two hours on Zoom. That's not training. People sometimes hand me recordings of meetings that they call training and say, Can we turn this into self-paced?
And I'm like, That's not training, that's a meeting. So you have to be really thoughtful about what training is. A recording of something isn't training. And training isn't an extracurricular activity either; you can't layer it on in addition to someone's job. So I wouldn't recommend somebody doing an eight-hour day and then staying up till midnight like you were in college, cramming for the next day.
People can't sustain that, and you can't get anything out of it. So I would say the two big mistakes that happen with training, and this happened with live in-person too, is that the logistics of training make it easy to overload people to the point where the training isn't really effective for them.
Richie Cotton: In what way would you overload people?
Sharon Castillo: So if you have a trainer who flies somewhere, it's less expensive to train people for five days than to have 'em come back three times. But what we know is effective is spaced repetition, and layering on, and practicing in between. Then people have questions, and people know what they don't know.
Because they went and tried, and they're like, You know, you said this last time, but it makes no sense. Or, I tried it in my own job and that doesn't work for me. But they don't know that if you're just giving them the fire hose of information over a five-day period, cuz you happen to be there, and they forget everything.
So that's one piece. The second piece is that just sending somebody to training without a need to practice is not useful. So years ago I had a student who used to come to my same class every six months. I'm like, Why are you here? And he's like, Well, I didn't have time to practice in between. I forgot everything and now I need to use it. And about the third time he took the class, at the end of the class he said, Yeah, see you in six months. Cause he knew the reality was he wasn't gonna get to practice in between either.
Richie Cotton: That's kind of a sad story. But yeah, I can certainly understand that. So I wanna go back to spaced repetition. I know this is one of the most important theoretical ideas from a learning point of view. Can you just tell everyone: what is spaced repetition?
Sharon Castillo: So for something to sink in for people, you can't just blast a ton of information at them. You need to take a piece of information and skills and other things that you're gonna do in class, go away, let them work on it, come back, recap a little bit of it, but then dig deeper, like peeling the onion, and then layer onto that, take sort of the hooks that they know and come back to it, and then iterate on that process.
Um, these days you could use hybrid learning as well, where we do in-class, like virtual, and then we have labs that people don't need to do in class that are self-paced. And maybe you pick your own adventure with those labs. Like maybe one person cares about manufacturing, another person cares about finance.
You go off and do a lab on the same topic. You both care about regression modeling, but one of you cares about it relative to manufacturing and the other relative to finance, and you both go off and do something. Then you come back and you learn a new skill, but you're using that basis you just had, and you just keep pushing through that. So the idea that training is just onboarding and then I'm done doesn't hold; it's a balancing act.
Richie Cotton: So I like this idea of having lots of bits of training that build on top of each other. You also mentioned the idea of hybrid learning, so a mixture of online training and live training. How does that work best, in your view? Online first and then live training, or the other way around? What works for people?
Sharon Castillo: We don't choose for people. Some of it's financial too; it's more expensive to do it instructor-led, so it depends what people's budget is. Some people solely do self-paced with us, but we don't dictate what you do first. I think it's cultural. Some companies do only self-paced. Some people, like my kids, only look at video.
I don't know that they would read notes. I'm wondering what happens when they're in the corporate world. Are they gonna wanna watch three-minute TikTok videos or YouTube videos?
Richie Cotton: Having to read reports, how terrible. But that would actually be really interesting if, in a generation's time, all corporate reporting goes away and it's replaced by, I dunno, 30-second videos or something. That'd be a very interesting world. All right, let's talk a bit about scheduling time to learn.
You were talking about how some people will do their learning at midnight, and maybe it's a terrible idea for people to have a full-time job and then have to do learning afterwards. This is a point that I would really love all the managers listening to this to be aware of: don't just overload people's brains by making them learn in their own time.
So when you've got these competing priorities of, okay, people have to hit deadlines, but also they need time to build these skills, what's the message you'd give to people who are managing or involved in scheduling things, to say, Well, this is how you justify doing some learning?
Sharon Castillo: It should be in their OKRs. Just like everything else, the only things that work are, you know, money talks. So if you don't compensate people for it, you don't build it into their metrics, you don't build it into the company culture. Nobody says, Oh, nobody should be trained. Everybody nods and says, Oh, training's a good thing.
We should train our people. But like you said, competing interests: I have to get A, B, and C done, or I don't get my bonus. This person has to get A, B, and C done. Well, it should go all the way up the chain that there's an OKR or a KPI or whatever it is you have to hit to get your bonus. One of them should be that your workforce is trained, and they should be trained on the things that are valuable. Don't put things in just to check a box; put things in there that add value to your business.
Richie Cotton: Wonderful. And maybe you can sort of elaborate on that. Like if you're trying to prioritize and go, okay, these things are valuable to our business. What's the sort of strategy for deciding, okay, these are the important trainings we need, and like is there any sort of quantitative way to justify this?
Sharon Castillo: So take it back to what you asked me in the beginning: the data. If people are trained on the data, and that data helps you improve your business, helps you find insights about your business, helps you benchmark and improve, or find insights you never knew about before, or do things more productively so that you could take things off the list and be more effective, isn't that worth the time and the money you just invested in it?
Common Patterns of Success and Failure
Richie Cotton: Absolutely. I've got this horrible feeling you could end up in a catch-22 situation where the manager doesn't have the data training to be able to analyze the data to figure out what people need to be trained on. But okay, so maybe those managers need the data training first. That's pretty interesting.
So do you see any common sort of success patterns or any common like disasters that organizations make when they're trying to figure out these learning programs and trying to figure out when to schedule?
Sharon Castillo: I don't see disasters. I just see more when it's ineffective. They really should be looking at high-quality training. Is it the right training? Don't just pick a program cuz it's a program that has the topics they want. Does it really address the needs that they have? Does it address it in the way that their company works? And have a trial, the same way you would with software. Don't send 500 people to training. Put a couple of people through it and say, Did you get anything out of it? If you did, then continue on, and if you didn't, then find something else. I would say that would be both the success way of doing things, and the disaster if you don't.
Richie Cotton: I like that idea of trialing things with a few people, then gradually building up how many people are gonna take the training, so you don't have a big mistake in the wrong direction to begin with. That seems really sensible. So going back to the idea of having several groups of people taking training at the same time: this idea of cohort learning, where several people do the same training at the same time and then they can socialize, seems to be becoming very popular. Have you seen any examples of cohort learning of a data skill?
Sharon Castillo: Yes, but we do it a little differently. So we could do the same people in the same class at the same time, but it doesn't always work for everybody from a scheduling perspective, because if you have global teams, that's painful. The schedule is painful; there's always somebody who's unavailable who missed it.
So the better way we've been working, and this is new, so I'm saying it's better but I don't have the data yet, is having a certain period of time in which everybody has to take certain courses. So we have a schedule: anybody in any time zone can pick the sessions they want to take, but you have to take this course in this two-week period of time, say, and everybody has to take the next course in the next period of time.
And then have reporting back to the coordinator of the cohort to say, How are you doing? Are you progressing through, and so forth. Now, some of this depends on the size of the cohort you have, and again, you need accountability and all that other good stuff, but it helps make it more scalable. Because if you just say, I'm gonna do a cohort, it works great if you have a lot of resources, and it works in certain-size organizations, but it's very difficult to scale.
Richie Cotton: That's true. And you mentioned the idea of time zones being a problem, and this is something that doesn't really exist with on-demand learning, cuz people just take it whenever they're awake. I can imagine with cohorts, it seems like everyone in the cohort should be in a similar time zone, then. Is that correct?
Sharon Castillo: You can do a self-paced cohort. We've done that too, where people take the self-paced training and then you have mentors in different time zones who run a Q&A session and kind of tee up and preview things. You could have one for the Americas, one for EMEA, or a couple of time zones in EMEA, and an APAC one.
And at least then you're accommodating different time zones, but people are still doing the self-paced training, and there's still a schedule that people have to adhere to. And you could still, through the learning management system, get reporting back: oh, these people completed. And if you take a test at the end of each course: these people are doing well, these people seem to be falling behind or having problems.
And then you use things like community or chat, or whatever method you wanna use, to say, Hey, you seem to be confused on these topics, or hold office hours, or whatever it is you wanna use to make sure that people stay on track.
Richie Cotton: Okay. And one of the big selling points behind this idea of cohort learning is that because people get some sort of social interaction, they're gonna be more engaged. So what sort of techniques have you seen organizations use to encourage social interaction between people doing training?
Sharon Castillo: The cohort idea works well, and again, use whatever kind of communities you already have, whatever form of social media or whatever your company's culture is, to engage with people, whatever kind of meetups you have. So if you have community meetups around it, if it's a message board, if you have postings on an internal wiki with information for people.
More advanced companies have centers of excellence that run these things and have best practice sessions, and they're the ones who kick it off in the first place. And I had mentioned hackathons and other kinds of more creative, fun things. You would be amazed how much prizes motivate people, even inexpensive fun t-shirts.
Importance of Management’s Buy-in
Richie Cotton: I definitely like that. Yes, always happy to receive prizes. So yes, I agree, that's a really simple way of getting people encouraged and engaged in this sort of thing. That's brilliant. One thing that we seem to have talked about intermittently throughout this is the idea of management's role in data training.
So I'd like to get into this in a bit more detail. In terms of running a successful training program, how important is getting management buy-in to it?
Sharon Castillo: I think early in my career I read some study that said what the manager says before training is probably the most important thing. Because if the manager says to you, Go to training, I cleared off your calendar, ignore your emails and your Slack or whatever it is, focus on class, and when you come back, worry about getting caught up.
But right now, this is the most important thing you can do. That message is really different than, Oh my God, we can't do without you next week. And yeah, I'm gonna let you go, but we really can't do without you, and I'm gonna blast you 15 times while you're in class and I'm gonna pull you out of class. What does that say to people?
So what they say beforehand is important. I think another thing that would be really helpful for a manager beforehand is to set expectations of what they hope someone gets out of the training. I'm sending you to class for X, Y, Z reason. I really hope you find out about A, B, and C, cause I'm really curious about that and I'm hoping that that'll apply to this project in this way.
So keep an eye out for that while you're in class. I bet people pay more attention. And then I ask them when they get out of class, and I bet the next time they go to class, they know somebody's gonna ask them about it when they come out of class. And if you have a project that uses it afterwards, you get people engaged in those projects.
And oh, by the way, I also went to that training before too. Maybe you guys could work together on it. Cause I know that they worked on it and they're probably about six months ahead of you on it. And again, that perpetuates a culture of learning.
Richie Cotton: I really love that trick of just asking people, What did you learn in that training? Then they're like, Oh, well, I wasn't concentrating, I was checking my email. Yeah, they'll only do that once. So certainly having those sorts of tricks to make sure that people concentrate seems really useful. I also like the thing you talked about, how you should just clear your mind and get rid of any distractions beforehand.
It sounds a lot like the sort of thing you get at the start of a yoga class, where you don't think about anything else. You're just here in the zone for an hour and just concentrate. So that seems like a really good trick for learning. I've now got ideas for productizing this. I'm gonna speak to a product manager afterwards.
Okay. Maybe we can talk about the flip side: what happens if there's no real management interest in training? Now, I know DataCamp has some customers who let employees do their own thing and just get training on an ad hoc basis. Do you have any sense of whether that sort of technique works, just letting employees go at their own pace and figure out what they want by themselves?
Sharon Castillo: It depends on the employee. If the employee is super motivated and probably has their own reasons for wanting to go, their own career development, something they're working on that they're really stuck on and they're like, Wow, I could get further with this, then yeah, they'll be successful. But beyond that, if you go back and look at your students, you could probably pick out which ones went because there was real push behind them to go and support for them going, and you could probably pick out the ones who begrudgingly were told, Okay, well, we have a budget and you're allowed to go, but I don't really want you to.
Richie Cotton: I guess we're going back to human dynamics here, so it very much depends on the individual. Some people need a bit of hand-holding and guidance, some people are unmanageable and just have to do their own thing, and I guess most people are somewhere in between. So the last thing I wanna talk about is how you go about measuring the success of a training program. How do you know whether the training has been useful or not?
Sharon Castillo: So I talk about big C's and little C's, meaning the customer. The little C is a user, and the big C is an organization. For the little C, the user, you really do look at what the outcome is for that user. Did they use the product, and the aspects of the product that they covered in training, within two weeks of class? If they first used it six months later, it probably had nothing to do with going to class.
If they did it within two weeks, probably what you taught them stuck, and they're using it and it's effective. If you have enough data, you can see what they did. Did they do it correctly? You can really trace their path through the product. If they get certified, they take an exam and do all of that kind of stuff. Did they go deeper?
Did the user get engaged? Did they come back and take more training? Did they follow the path and go to community? Did they do other things? Now, some of that is a little bit circular. Were they motivated to start with? Were they coming to lots of things? And so it's a little bit hard to tease that out.
But in general, if you take a trend across all of your users, the trained group tends to do better in these areas by a lot. And then you look at the big C's and you say, Gee, overall, how are those customers doing? And generally they hit all the targets: onboarding is better, engagement is better, churn is lower, all those other things.
So if you're talking about a training program for customer education at a software company, those are the types of things we would look at.
Measuring Learner Engagement
Richie Cotton: Interesting. And maybe you can talk a bit more about engagement, because I know at DataCamp we have lots of different metrics for how engaged users are. I'm curious as to how you think about this.
Sharon Castillo: Yeah, there's so many different ways, and my colleagues and I sit around and my colleagues in customer education across software companies talk about all the different ways you could measure engagement. And it's difficult because we don't have access to the data. It's hard to see inside your product. We don't always know what the journey should look like, especially if you have nascent products that are newer or changing all the time.
And it's hard to know because everything's changing all the time. So we change, our training offers a lot to adapt to everything that's going on. And so somebody will bring me data and I can say to them, Yeah, that data doesn't work for me before this date. And they'll say, Why that date? And I'll say, Cuz that's the date we changed all of our curriculum around this and that's the date that we did this other thing.
And that's the date we did something else. So we made some major changes, and it's really hard then to tease apart what happened around that, because everything sort of shifted. It's a very, very big challenge.
Richie Cotton: Right. And maybe you can talk a bit about what you find are the most requested things that people want to learn about.
Sharon Castillo: So, a couple of things people ask about. When they're earlier on, they ask a lot about things that have nothing to do with our product. They really need change management, project management, and basic skills around the industry itself. What is AI? What is data? What is all of that stuff? Then they want the basics of the product.
Tons and tons of things about the basics of the product and how can they be successful with the product. And then as they move up the maturity curve, they really want best practices and deeper and deeper knowledge about how to solve more and more complex problems that are very, very specific.
Richie Cotton: And so just to bring this back to something you said at the start, you were talking about how everyone needs some sort of training, and some sort of data training, perhaps. So what would you say to people who are hesitant about learning some data skills?
Sharon Castillo: I would relate it back to something that they do every day and think about it in terms of data that they know. So think about something you're really passionate about, whether it's sports or music or something that you really, really get into, and think about the information that goes with that and how you could use it. Use that to frame it, instead of worrying about algorithms and math and statistics. Don't get too far ahead of yourself. Just think in terms of really small, basic things that you could do with that data, and take small steps.
Richie Cotton: Last question. Do you have any final advice for any organizations wanting to start a data training program?
Sharon Castillo: Figure out what your goals are. It's the same thing. It's small steps. So what are the most important goals? Break it down. What's achievable? What are quick wins?
Richie Cotton: All right, super. Thank you very much, Sharon. Lots of really exciting things we talked about today, and I think lots of great advice in there. Thank you very much for your time.
Sharon Castillo: Thank you.