
Seeing the Data Layer Through Spatial Computing with Cathy Hackl and Irena Cronin

Richie, Cathy and Irena explore spatial computing, its prominence alongside the release of Apple's Vision Pro, its expected effects on the gaming and entertainment industries, future developments in the space and much more.
Jun 13, 2024

Photo of Cathy Hackl
Guest
Cathy Hackl
LinkedIn

Cathy Hackl is a web3 and metaverse strategist, tech futurist, speaker and author. She's worked with metaverse-related companies such as HTC VIVE, Magic Leap, and AWS, and currently consults with some of the world's leading brands, including P&G, Clinique, Ralph Lauren, Orlando Economic Partnership and more. Hackl is one of the world's first Chief Metaverse Officers and the co-founder of Journey, where she works with luxury, fashion, and beauty brands to create successful metaverse and web3 strategies and helps them build worlds in platforms like Roblox, Fortnite, Decentraland, The Sandbox, and beyond. She is widely regarded as one of the leading thinkers on the Metaverse.


Photo of Irena Cronin
Guest
Irena Cronin

Irena Cronin is SVP of Product for DADOS Technology, which is making an Apple Vision Pro data analytics and visualization app. She is also the CEO of Infinite Retina, which helps companies develop and implement AI, AR, and other new technologies for their businesses. Before this, she worked as an equity research analyst and gained extensive experience in evaluating both public and private companies.


Photo of Richie Cotton
Host
Richie Cotton

Richie helps individuals and organizations get better at using data and AI. He's been a data scientist since before it was called data science, and has written two books and created many DataCamp courses on the subject. He is a host of the DataFramed podcast, and runs DataCamp's webinar program.

Key Quotes

I think we're going to continue to see not only gaming, but big Hollywood IP embrace spatial computing very early on, but changing, I think, the way people engage, that human computer interaction, in the sense that we've been really passive recipients, passive users of technology in some ways, adapting to technology. I think technology is going to start adapting to the physical world. So that's where these new content formats, whether it's Demeo as a game in spatial computing or what Marvel is going to bring us, are going to start to make us a little bit more of an active part of what is happening in these stories.

I also think from a data perspective, what spatial computing in some ways allows us to do through computer vision and through new hardware, especially, is see that data layer that we can't really see with our eyes, right? Our phones, our computers, like everything's giving off data in ones and zeros, we don't see it, right? Because we're humans, we don't consume it that way. But what you're gonna start to see with these devices is that you're gonna be able to see that data layer, right? So the data layer that the robots and the autonomous vehicles and the drones are navigating using, we're gonna be able to see that, right? In annotations or holograms, all sorts of things. So I do think it opens up the access to that new data layer that we have not been able to see as humans for a long time. I even use the term, it's where we're going to meet the machine, because we're all going to be able to see the same thing. They're going to see it differently than we do, but we are going to be able to see that data layer that we haven't been able to navigate through before.

Key Takeaways

1

Spatial computing offers advanced data visualization techniques, making 3D data more useful and interactive. Leveraging these tools can provide deeper insights into complex data sets.

2

Combining spatial computing with generative AI and machine learning can create powerful new applications, such as immersive gaming experiences and advanced data analysis tools.

3

As spatial computing collects extensive biometric data, including retina scans and eye gaze, organizations must be proactive in addressing privacy concerns and adhering to emerging regulations.

Transcript

Richie Cotton: Hi, Irena and Cathy. Welcome to the show. I'm glad to have you both here.

Irena Cronin: Thank you.

Cathy Hackl: Thank you. 


Richie Cotton: Just to kick off, what, at a high level, is spatial computing? So, like, is it the same thing as augmented reality or is it something different? 

Irena Cronin: Yeah, it is a little bit different in terms of theory. So, basically, spatial computing is comprised of computer vision and stuff like augmented reality and virtual reality, and it basically denotes anything and everything that could move through space that's, um, digital or, you know, has that kind of, uh, capability.

Um, yeah. So that it also includes robots. 

Richie Cotton: Okay, so that's a pretty broad definition then. So you mentioned like that, computer vision and augmented reality and robots. So plenty to talk about today. That's good news. 

Cathy Hackl: I would say, Richie, the way we've been kind of talking about it is an evolving 3D-centric form of computing that at its core uses computer vision, AI, extended reality and other technologies to blend virtual elements or virtual experiences into someone's experience of the physical world. Um, but it also means kind of a change in human to human communications and human computer interaction. That's kind of exactly what I was referencing. It includes robots. It includes all these, you know, virtual beings and robots navigating the world alongside us.

So, um, so yeah, it's way more than just augmented reality. 

Richie Cotton: Okay, uh, that's very, uh, that sounds very cool. Uh, lots of, uh, impact there. Uh, alright, so, uh, why are we here now? Like, what has changed recently that has made spatial computing a viable thing? 

Irena Cronin: Well, I would say it's always been viable. I mean, I've been talking about spatial computing since 2018 before the term was really broadly used at all.

Um, the reason why the term has, and also I wrote a book in 2020, uh, which used the words spatial computing in it. But, um, what's really bringing this all along right now is that Apple came out with the Apple vision pro and Apple is using the terms spatial computing. Um, versus Metaverse versus XR, MR, you know, AR, VR, that kind of thing.

So it's kind of the tipping point now in really bringing this technology and understanding what it is forward. 

Richie Cotton: Okay. It does seem like whenever Apple gets into the game, it means like something's about to go mainstream. So yeah, hopefully this is a different point for spatial computing. 

Cathy Hackl: Yeah, I mean, I think that everything Apple does is strategic from a marketing perspective.

They don't step into anything lightly. So the fact that they chose spatial computing as a term, and that they describe the Apple Vision Pro as a spatial computer, I think is very powerful. Um, and just like Irena, I mean, I've been in the space for quite a while; I actually worked at Magic Leap, which in my perspective is one of the original spatial computing companies.

We used the term spatial computing before most people were, um, but I definitely agree, it's a really interesting moment that we arrive at with a term. And I think for many people listening to the podcast, it might have been the first time they heard spatial computing, when Apple launched the Vision Pro and, you know, ushered in the era of spatial computing, even though it's already been in the works, and told the world this is a spatial computer.

Richie Cotton: Uh, so it does definitely sound like we're at the cusp of something, uh, interesting. Um, okay, so I suppose, um, gaming has maybe been hyped as like one of the big use cases. So we'll perhaps cover that first. And when I think of like spatial computing games, I'm thinking fancier Pokemon Go.

Uh, but this is probably just scratching the surface. So can you tell me, like, what would spatial computing bring to gaming? 

Irena Cronin: Well, a lot. Okay, so basically, uh, the gaming that has existed up until now has, a lot of the time, used spatial computing, because it uses computer vision, and you need computer vision to do spatial computing.

So computer vision allows, um, technology to be able to see in front of it and be able to gauge what's in front of it, the environment. And if you're moving, where you're moving, and that kind of thing. Location. Um, so Niantic has always used, uh, spatial computing. Um, there are other games, let's say from Meta, that use less spatial computing.

So it's a matter of degree. And that's why it's really interesting that this term is now becoming, um, hopefully, I think it's going to be broadly used. But in actuality, the technology has always been there. So it's nothing really new in terms of gaming. Um, the new thing that might blow everything apart is generative AI, which allows people to create their own images and videos.

And if you pair that along with spatial computing and also make a game with that, that really makes it fantastic. 

Richie Cotton: Yeah. So I think, um, spatial compu... sorry, uh, computer vision was kind of okay maybe in the mid 2010s, and now it's getting to the point where you can reliably do object detection on an awful lot of things.

It just seems like that technology is maturing. 

Cathy Hackl: Yeah, Richie, I think it's about creating that spatial awareness that Irena is mentioning. Um, and, you know, like AI now understanding the physics of the space and the physical world, so that these digital elements have, um, you know, spatial awareness.

Right, I think that's kind of where it starts to become really interesting. And from a gaming perspective, if you look at everything that's happening in that space, for example, very recently Resolution Games launched Demeo, which is really going to be a big title coming to the Apple Vision Pro.

So I'm hoping to see how more people start to game in the device. And granted, it's a device that not that many people have, but, you know, as a proud owner of one, I'm looking forward to doing some more gaming with it. And Demeo is one of those titles that is kind of the first to arrive and is truly in the kind of spatial computing gaming space.

Richie Cotton: Okay. So it sounds like maybe the gaming industry has a bit of work to do just to figure out, well, what sort of games do you need to make in this medium? 

Cathy Hackl: Yeah, and also, um, Marvel, for example, Disney and Marvel on May 30th, um, are going to be launching, or will have launched depending on when this airs, um, kind of new immersive experiences and new storytelling formats for their Marvel universe.

So I think we're going to continue to see not only gaming, but big Hollywood IP embrace spatial computing very early on, but changing, I think, the way people engage, that human computer interaction, in the sense that, you know, we've been really passive recipients or passive users of technology in some ways, adapting to technology.

Um, I think technology is going to start adapting to the physical world. So that's where these new content formats, whether it's Demeo as a game in spatial computing or what Marvel's going to bring us, uh, are going to start to make us a little bit more of an active part of what is happening in these stories.

Richie Cotton: That's really interesting, the Marvel angle. So does this mean we're going to expect like new, um, forms of like TV shows and other, uh, entertainment media? 

Cathy Hackl: I think so. I think it's giving way to spatial video, um, to 3D-centric depth content, like beyond just 3D TV, right? Like, I want people to understand we're not talking about the old type of 3D TV where you wear those glasses.

Um, it's more, you know, once this virtual content is spatially aware, you can start to engage with it in different ways, because it's going to understand the physical world around you. Um, so I definitely think we're going to see new content types, spatial video. Um, definitely from everything I'm hearing, people are getting more and more interested in spatial video, which is video that has depth.

So right now, with my iPhone 15 Pro Max, I can shoot spatial video, um, video that has depth, video that I can see in 3D, that feels a lot more volumetric. It feels more like what I live in on a daily basis. Um, so I can shoot it with my phone, but then I can watch it back in my Vision Pro.

Um, so you're going to start to see, you know, sports, entertainment, all sorts of these new industries embrace these new formats to start to experiment with what might come next. Um, but I mean, I don't know if you want to add something regarding kind of the depth element of video and what's coming.

Irena Cronin: Well, I mean, Apple has really played up the entertainment angle. And, I mean, it makes a lot of sense. They've got, uh, you know, Apple TV Plus, you know, to be able to put on their own shows. So I'm looking forward to seeing what they produce, and I know it's going to be quite a bit. I mean, right now they do have some sports on there.

They have soccer, you know, and basketball and stuff like that. So they really wanted to play up the sports angle in 3D. Um, and they tried to get the NFL to join, but they weren't able to do the contract, unfortunately. So, yeah, I just want to add that, um, there are new forms that people might think would be needed for spatial computing.

But I think that the Apple Vision Pro is just pushing forward, in better quality, what has come before. 

Richie Cotton: Okay. Yeah. I certainly, as someone who spends way too much of my life just staring at screens, it'd be interesting to have like things go back from like 2D to 3D again. Uh, yeah, that's really interesting.

I certainly imagine you have something like, uh, a horror genre where you've got things moving at you. That's going to be like even more terrifying. Um, all right, so, uh, let's move on from entertainment to talking about proper work stuff. So, um, it feels like any of the industries where you've got physical objects or things like manufacturing or logistics, spatial computing ought to be viable there.

Can you just talk me through some of these use cases where, in industries where you have got physical objects? 

Irena Cronin: Well, you did mention manufacturing in the write up. So, manufacturing actually has used augmented reality since around 2008. Um, so it's kind of like, it's just better now than it was before.

Um, the resolution is better, the capabilities of, um, you know, texting and speaking commands and stuff like that, and visuals with computer vision. So, um, along with manufacturing, there's logistics, which goes hand in hand with that. So it's a no brainer that it would be really helpful not only for creating stuff, making things, but also for fixing machines and teaching people how to do things.

So with the Apple Vision Pro, it's like augmented reality. It overlays, um, the information and visuals onto what's existing in front of you, in your environment. So you can imagine how helpful that would be. I mean, there's a ton of different use cases.

Um, I had mentioned that spatial computing includes robots, so robots need to use spatial computing to be able to move through space. So anything and everything in terms of computers that exist now and will exist in the future will be using spatial computing, so it's a really huge deal.

And you could also use robots in manufacturing, and you could use them at home for consumers. Um, there's a whole range of other industries. Um, spatial computing for training runs the whole gamut: in retail, teaching people how to do certain things; in education, teaching students how to, you know, learn better.

And in FinTech, um, you know, teaching people and traders, et cetera, how things are moving. Um, one interesting use case is that in the Apple Vision Pro, you can have as many windows up as you want. So, I mean, if you want to put up 25 windows, it allows you to do that, but you probably wouldn't.

But it's certainly more than what, um, a trader would have in front of them, which would be like four to eight screens, basically, and you can move them around and, you know, do what you want with them. So that's a really impressive use case for, uh, for spatial computing. 
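
[Editor's note: as a rough, hypothetical sketch of the multi-window setup Irena describes, a visionOS app built with SwiftUI can declare several window scenes and open new ones on demand. The scene identifiers, views, and data below are invented for illustration, not taken from any app mentioned on the show.]

```swift
import SwiftUI

// Hypothetical sketch of a multi-window "trading desk" on visionOS.
// Scene identifiers and views are made up for illustration only.
@main
struct TradingDeskApp: App {
    var body: some Scene {
        // The main watchlist window.
        WindowGroup(id: "watchlist") {
            WatchlistView()
        }
        // A reusable chart window, parameterized by ticker symbol.
        WindowGroup(id: "chart", for: String.self) { $symbol in
            ChartView(symbol: symbol ?? "AAPL")
        }
    }
}

struct WatchlistView: View {
    @Environment(\.openWindow) private var openWindow
    let symbols = ["AAPL", "MSFT", "NVDA", "TSLA"]

    var body: some View {
        List(symbols, id: \.self) { symbol in
            Button(symbol) {
                // Each tap opens another floating window in the user's space.
                openWindow(id: "chart", value: symbol)
            }
        }
    }
}

struct ChartView: View {
    let symbol: String
    var body: some View {
        Text("Chart for \(symbol)")
            .font(.largeTitle)
            .padding()
    }
}
```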

Richie Cotton: Yeah, so on that last point, I always have way too many browser tabs open that I can't see and can't find again.

So yeah, having lots of different screens open at once, uh, if there's a better way of, like, managing them, then that, that sounds like it might be pretty useful. Uh, so lots of use cases there. I don't know, Cathy, have you got any more, um, use cases that you're particularly excited about? 

Cathy Hackl: Yeah, I think there's one that I usually mention, from, um, our friend Shelly Peterson, who used to be at Lockheed Martin; now she's at Microsoft.

Um, and it was with the Microsoft HoloLens, which was a spatial computer, you know, an early spatial computer, uh, where Shelly's team at Lockheed Martin used the device to build the Orion capsule for Artemis. And I think the data they got back was really impressive. I think it was like a 90 percent reduction in labor hours, in labor costs.

Which, you know, 90 percent is a pretty amazing number. So that's one of those use cases I don't think a lot of people talk about, but it's very, very important when we're talking about how to use, you know, spatial computing on the manufacturing side. And obviously it's a very specific use case, because it's one capsule, um, for something like Artemis, but

still very, very relevant. Um, and I think it's probably one of the reasons Shelly ended up going to Microsoft afterwards from Lockheed. Um, but yeah, I think it's a great use case on how it can be used for labor and, you know, to really augment the workforce in some ways. And like Irena mentioned, the worker can now see all this data overlaid on the world, this data layer that we can't really tap into unless we have a device.

Richie Cotton: I'm sure I've seen this before in movies. This, this feels like, um, the, the Terminator view, like this is going back to like the 1980s movies. So I guess the concept has been around for a while. 

Cathy Hackl: Yeah, I prefer Jarvis, to be honest. This is a discussion I have very often: I'd rather people not think Terminator.

I really think of Marvel's Jarvis, you know, when Iron Man's wearing his glasses and Jarvis kind of overlays all this data on top. Um, I think that's a better example, but yeah, I mean, 

Richie Cotton: Okay. Yeah. Maybe Jarvis is the happy example compared to Terminator. 

Cathy Hackl: I'm not as dystopic. 

Richie Cotton: All right. Uh, so, um, I'm also curious, so since you mentioned robots, um, now I'm, um, imagining like those Boston dynamics, like dancing robots and things.

Is this the kind of thing that you have in mind, uh, for spatial computing or are there other sort of robot use cases here? 

Irena Cronin: Oh, there are a lot of different types of robots, but yeah, it includes those robots for sure. I mean, you can have humanoid robots that really kind of resemble us, you know, they have two legs and two arms, and the head might be there, but it might not be as detailed as what we have, you know, in human nature.

Um, but those have many, many use cases, from being workers, you know, because they can walk around, um, to being in the home, washing your dishes and doing your laundry and stuff like that, um, or delivering food, you know, walking and delivering food to you at your door.

Um, but there are all kinds of other types of robots. There are robots that are cobots, that work, um, next to workers, uh, currently, that kind of aid the workers. So they're not taking the place of the workers, they're helping them. And they're usually pieces, they're like arms or whatever. They're not whole humanoid beings.

Um, also there are these little robots that do deliver food, that maybe you've seen, um, in specific locations, that roll down the street, where, uh, if you enter a code, you open the top and you get your food. So those aren't humanoid robots, but they do have the capability of, you know, moving through space, like spatial computing.

Um, and then there are robots that have been experimented with; there was experimentation by Amazon and, um, certain other companies, which has stopped because it's a little bit too expensive currently to try to experiment with them. But there are these robots that, uh, leave a car and are able to go up steps and open doors at the same time.

So, um, I'm sure those will come back. In addition, drones are considered robots. So, um, there's lots of experimentation being done now, uh, Walmart is doing it and a couple of other companies, in delivery of, uh, any kind of, you know, objects that aren't too large, to be able to bring them down to the person and have the person pick them up.

So yeah, there's, there's a ton of use cases for, for robotics. 

Richie Cotton: Okay. Yeah. So it certainly sounds like any organization where they've got like a big logistics component to their business, they're going to need to start thinking about this if they haven't already. 

Irena Cronin: Yeah. There's also another, um, one more use case that I just thought of.

Um, you know how Boston Dynamics has that dog looking robot? 

Richie Cotton: Oh, the spot thing. Yeah. 

Irena Cronin: Yeah. Okay. So there are other companies that have robots like that too. And, um, depending on how, um, large they are, you know, there could be smaller ones, they could go into dangerous areas and dismantle bombs, or go into areas and find survivors of earthquakes.

So, um, that's another use case as well. 

Richie Cotton: Okay. Uh, yeah. So places where you don't want to send a human. Actually, on that note, I did see this company was selling that sort of, uh, dog sort of robot, but it had a flamethrower attached to it. And it's supposed to be for, like, dealing with wildfires, but it also sounds like a dystopian nightmare as well.

Irena Cronin: Um, 

Richie Cotton: okay. So while we're talking about use cases, because a lot of our listeners work in data, what are the sort of use cases for spatial computing in the field of uh, working with data? 

Irena Cronin: Well, I can tell you that I'm actually working with a company right now, uh, to be very transparent. It's called DADOS Technology.

And that's exactly what we're doing for the Apple Vision Pro. We're creating data visualizations and, you know, ways to be able to portray data using spatial computing. So, um, obviously I'm a big believer in this, and I think, um, the display of data, not only in columns and rows but also in charting and other kinds of things, if you apply AI to it, is very exciting. 

Richie Cotton: That's very cool. So, actually, I have, I have a maybe nerdy question about this, but for a long time, uh, the idea of 3D plots has been frowned upon, because it's very difficult to, like, determine anything useful from 3D plots, because you have weird perspective issues.

Does spatial computing solve this? Are 3D plots going to become, um, useful or cool again? 

Irena Cronin: Oh, yeah. The reason why it was kind of looked down upon before was because the resolution wasn't good and you weren't able to zoom in, um, very well on the 3D plot. You can do all of that right now, and you can manipulate it and turn it whichever way you want.

So it's, it's really fabulous. 
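
[Editor's note: to make the 3D plotting idea concrete, here is a rough, hypothetical sketch (not code from Irena's team) of a minimal 3D scatter plot in a visionOS volumetric window, using SwiftUI and RealityKit with made-up data.]

```swift
import SwiftUI
import RealityKit

// Hypothetical sketch: render a handful of data points as small spheres in a
// volumetric window, so the viewer can lean in, walk around, and inspect them.
struct ScatterPoint {
    let x: Float, y: Float, z: Float   // values pre-scaled to metres
}

struct ScatterPlotView: View {
    // Made-up sample data.
    let points: [ScatterPoint] = [
        .init(x:  0.05, y:  0.10, z:  0.02),
        .init(x: -0.08, y:  0.04, z:  0.12),
        .init(x:  0.11, y: -0.06, z: -0.03),
    ]

    var body: some View {
        RealityView { content in
            for point in points {
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.01),
                    materials: [SimpleMaterial(color: .cyan, isMetallic: false)]
                )
                sphere.position = SIMD3<Float>(point.x, point.y, point.z)
                content.add(sphere)
            }
        }
    }
}

#Preview(windowStyle: .volumetric) {
    ScatterPlotView()
}
```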

Richie Cotton: Okay, so once you can spin it around, you can get the right perspective to answer your question. Yeah. Okay. Yeah. All right. I'm going to have to update all my data visualization advice now, I think. Um, it's very cool that that's changing. 

Cathy Hackl: And I think another important part, and I agree with Irena on this: when you're in the Vision Pro and you're in spatial computing, you can actually see the 3D plot in 3D, right?

It's a totally different experience than a flat surface trying to show you something in 3D, which just seems unnatural, even though we use it all the time. Um, so that's one thing. I also think, from a data perspective, um, what spatial computing in some ways allows us to do, through computer vision and through new hardware especially, is see that data layer that we can't really see with our eyes, right?

Our phones, our computers, like everything's giving off data in ones and zeros; we don't see it, right? Because we're humans, we don't consume it that way. But what you're going to start to see with these devices is that you're going to be able to see that data layer, right? So the data layer that the robots and the autonomous vehicles and the drones are navigating using, we're going to be able to see that, right?

In annotations or holograms, all sorts of things. So, um, I do think it opens up access to that new data layer, uh, that we have not been able to see as humans for a long time. And, um, I even use the term that it's where we're going to meet the machine, because we're all going to be able to see the same thing. You know, they're going to see it differently than we do, but we kind of are going to be able to see that data layer that we haven't been able to navigate through before.

Richie Cotton: Okay. So just better representations of the data are going to make things a little bit more understandable. That seems incredibly useful. 

Cathy Hackl: Yeah. Plus when you think about right now, you're going to look at some, let's say that data layer that I mentioned, you're probably going to look through it through a phone or, you know, through a computer.

Like this, this is very small, right? But when you're talking about actual devices that. have a bigger field of view and everything, it starts to open up that aperture quite a bit, right? Um, I always, I always do it, I always joke and I say like, I hate when I, when I go to like a Beyonce concert or a Taylor Swift concert, whatever concert you want to go to, and like everyone's living through their phones, right?

They're, they're taking, they're watching this concert, but through the phone, which is a very small kind of device in reality, but once we have this newer hardware, like you're going to be kind of experiencing things a little bit more present. Um, and I had that experience recently. Uh, when I was on safari, I left my phone many times, uh, at the hotel, at the villa, and I brought my Ray Ban meta glasses, like the multimodal smart glasses that they put out.

And I was actually, you know, being present and taking images of the, of the animals as I was seeing them, but I was being present. And that to me was really transformative. In the sense that I wasn't living through this little rectangle. Um, I, I was wearing the device and everything, taking pictures, but I was present, I was able to kind of engage with what I was seeing in the physical world in a new way.

And, and I think from a data perspective, that's something we're going to start to see. 

Richie Cotton: Definitely. I have to say, all those people at concerts holding their phone up the whole time are very, very annoying. So if that goes away, then that can be a brilliant development for society.

And so, uh, beyond that sort of, um, data visualization aspect, are there other ways you think that, um, spatial computing is going to change, like, um, the sort of technical work: working with data, working with code, all that sort of stuff? 

Irena Cronin: Well, a very interesting thing, and what some people actually don't like, is that when you're in the Vision Pro, it collects a lot of data on you, on the person that has the headset on.

However, Apple is really into privacy, so, um, everything will be on the local network versus going to the cloud, at least they say. There are some things that would have to go to the cloud, um, let's say if you're using, um, an LLM and there's a very difficult question or whatever and it has to do compute, it goes to the cloud, but then it would be anonymized.

So that's kind of interesting. You know, the data that it actually gets that is really amazing is, uh, it can see where you're looking, exactly. It's very, very precise. And based on where you're gazing, where you're looking, that's how you actually choose, you know, the app that you want to go into, um, or whatever else you want to choose to do.

And then obviously other kinds of biometric, um, understanding of your face, et cetera, that kind of thing. Um, so yeah, it picks up on your own data, which makes a big difference for corporations, I think, if they're allowed to get that data. 
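
[Editor's note: a hypothetical sketch of how this plays out for app developers. As Apple has described the platform, visionOS apps do not receive raw eye-gaze data; the system renders the gaze highlight itself and only delivers a tap once the user looks at a control and pinches. The view below is invented for illustration.]

```swift
import SwiftUI

// Hypothetical sketch: the app declares a hover effect but never learns where
// the user is looking. It only sees the final tap after a look-and-pinch.
struct ReportLauncherView: View {
    @State private var openCount = 0

    var body: some View {
        Button("Open report (\(openCount))") {
            // Fired only when the user gazes at the button and pinches.
            openCount += 1
        }
        .hoverEffect(.highlight)   // highlight is drawn by the system, not the app
        .padding()
    }
}
```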

Richie Cotton: Yeah, I can certainly see how that sort of thing, like eye tracking and being able to understand a lot about your personal state, can be very powerful.

Um, so I don't know, I guess if you see something horrendous, it's going to pick up on it; it's going to be able to give you some feedback from a corporate point of view. Yeah, I can also imagine that's going to cause some privacy issues. 

Cathy Hackl: In Irena's other book, uh, one of her books, called The Infinite Retina, it's a fantastic read by the way, but these devices scan your retina.

And by looking into someone's eyes, you actually know what they're going to do before they actually do it, right? So there's that level of intent, uh, which from a biometric standpoint can be quite scary. But I can imagine, you know, this is an example I use often: let's say you're on the airplane and you're going to use the free wi-fi and you've got to watch the ad.

No one watches the ad, right? You just, okay, you look away, and then 20 seconds is over and you've got your free wi-fi. But with these devices, they know where you're looking. They know if you're paying attention. So you can't get away from not looking at something, because they will know whether you're looking at it or not, right?

Um, so there is that level of, let's say, intrusiveness of sorts, right, from a biometric standpoint. Um, I don't know if there's anything else you want to add from the retina scanning side.

Irena Cronin: No, not from that perspective, but I did want to add, from another perspective: if you attach an LLM, or use an LLM, um, or any kind of other AI besides computer vision, when you're looking at something... everybody knows, like, if you ask what kind of plant this is, it gives you back an answer, right?

Well, this is going to go way beyond that. It's going to be able to see the panorama of where you're standing in the environment, and it's going to be able to take in, uh, what you're looking at. And if you ask an LLM a question about your environment, it'll be able to answer you. So in that way, data is at your fingertips.

Richie Cotton: Now I'm wondering, like, who gets access to this data then, if you're in a work context? So is this going to be a case of, like, your boss knows your biometrics all the time, if it's some kind of corporate headset? It feels like there's some more stuff there. Okay. So that's certainly possible.

Like, they know when I'm not concentrating.

Irena Cronin: They know when you're not looking at your computer. There are a lot of companies that do that now through a computer: if you're not looking at your computer, you're not typing away, you're not working. And people get fired over that. So I'm sure that that'll be, you know, something that people wouldn't want, but that corporations will use.

Absolutely. I mean, think about logistics. Uh, you know, at Amazon you've got to be working your tail off. They know when you take breaks. I mean, it's down to the smallest detail of what that person does every day. 

Richie Cotton: Okay. So this is sort of dystopian, like real optimization of worker productivity possibilities.

But are there any, I don't want to assume it's a completely dystopian episode. So are there any like good ways in which spatial computing is going to help with productivity? 

Irena Cronin: Well, like I said, if you look in front of you, in your environment, it could tell you what's there. So if I use logistics again, now on the positive side: it can tell you, to a degree, it could read tiny, tiny little numbers.

And it could tell you if the package that you're looking for is, you know, 150 feet away or something like that, which is like super-vision, right? So in that way it's a great positive. And anything where, um, your environment is important for your job, like on an oil rig, or, um, any kind of thing where you're outside and looking at something, and even inside when you're looking for something, uh, it's a very positive thing.

Also, as I said, um, for manufacturing and also for teaching capabilities: it can see what's in front of you and it overlays how to, you know, fix something or how to actually work something. So it's a teaching computer that didn't exist before, and now it has wonderful resolution and wonderful color and it's really easy to use, and that's what Apple brings.

Richie Cotton: Okay. Uh, so I like the idea of, uh, sort of teaching computers to support you through things and just help you find out stuff that you didn't know before. Actually, so I remember when Google Glass first came out, one of the sort of use cases they were pitching for this was that it could recognize other people and remind you of like their names and things like that.

I don't know whether it worked at the time. Is this something that is now viable? Like, being able to see someone and have the technology tell you who this person is? Because I meet a lot of people and I'm like, I'm not sure whether I know this person or not. So this seems incredibly useful, for me at least.

Irena Cronin: Um, yeah, I'd say that it's so much better than it used to be, uh, as long as people opt in to it, right? Um, you'd be able to, if there's an app for it; currently there's no app that exists for the Apple Vision Pro, as far as I know, that does that. But I can imagine that if you allow it, and others allow you to do that, it'd be super easy.

Richie Cotton: And so, we've talked a bit about, um, increasing worker productivity. I guess the other, maybe, um, use case for this would be something to do with improving customer experience. Are there any ways in which spatial computing can be used to, uh, make things better for your customers rather than just directly, like, for your workers?

Irena Cronin: Uh, for your customers, it's an interesting question. I guess if you're on the floor of a retail shop and the customer is looking for an object and they can't find it, it can help you find it. So again, it's kind of like a location device that, um, works for the customer.

Uh, unless the customer actually puts the headset on, I wouldn't say that it's exactly for the customer. It's for the worker helping, uh, the customer and doing their job. 

Richie Cotton: Okay. Uh, so that's interesting that, um, this is mostly about, like, locating things and is kind of quite close to the logistics aspect. So you mentioned, like, a customer putting a headset on.

Can you imagine any situations where that might be a good thing? Are you going to go into some store and put a headset on and have it show you something? 

Irena Cronin: Sure, there could be several different ones. I, I thought about entertainment, so it's not even a store. But like, if you have some kind of movie or some kind of whatever that has come out, you can have an AR experience where it overlays something on top of something else.

So that's not a customer experience exactly, but, um, yeah, let's say, and there have been a lot of tries at this and it hasn't really worked very well, let's say you want to see how a blouse fits you without putting it on. So you'd be able to do that with an app, you know, and it wouldn't be expensive. Right now it's really expensive to do something like that.

They have these big, obtrusive, um, screens where you do that, and it doesn't work very well. So I could see that happening very easily.

Cathy Hackl: And another good example there, Richie, that I'm starting to hear a lot from the fashion industry, is the stylists that work with celebrities; normally they have to send them photos or videos of some of the outfits that they're choosing for them.

If they're able to do that in spatial computing, where it has depth, and the, you know, the star or starlet or whoever can actually see it with a little bit more volume, they might get a better idea of what it looks like. So definitely starting to see the fashion industry really get interested in figuring out how to use, you know, spatial computing.

Richie Cotton: Okay. Um, that's a pretty interesting use case, the trying on of clothing. Uh, all right. So, uh, I'd love to get into some of the technical details of how this works. You mentioned some of the techniques, like computer vision and, um, some of the sort of spatial aspects of things. So, um, what are the tools and techniques that are involved under the hood?

Irena Cronin: Oh, I mean, it ranges across, um, all the machine learning and AI, I mean, from deep learning to reinforcement learning, you know, the whole gamut. It's not just one thing. I mean, it depends on the app exactly what it uses, but it could use a combination of different machine learning and AI techniques. Um, and computer vision uses a lot of this type of thing as well, in the guts of what it's doing for computer vision itself.

Um, but yeah, that's a quick answer to your question. 
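
[Editor's note: as a rough illustration of the kind of computer vision building block being described, here is a hypothetical sketch using Apple's Vision framework to classify a single image. It is generic Swift, not specific to the Vision Pro and not from any app mentioned on the show.]

```swift
import Vision
import CoreGraphics

// Hypothetical sketch: ask the Vision framework what it sees in one image.
// A spatial app would typically go further (scene reconstruction, plane
// detection, object anchoring), but classification shows the basic pattern.
func classify(_ image: CGImage) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    let observations = request.results ?? []
    return observations
        .filter { $0.confidence > 0.5 }                          // keep confident labels
        .map { "\($0.identifier) (\(Int($0.confidence * 100))%)" }
}
```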

Richie Cotton: Okay. So, uh, lots of machine learning. Um, uh, Cathy, I don't know if you had anything else to add to that, like, uh, on what's going on under the hood? 

Cathy Hackl: I would say, at least with the Vision Pro, it's at least 12 cameras. There might be more; I can't remember exactly how many cameras.

Right. But at least 12 cameras, which are pointing out, well, they're pointing in and they're pointing out, but they're physically scanning the world in real time on a constant basis. Right. So that's, I think, very, very powerful technology, um, that allows for that computer vision to happen. So, um, I always comment and say that the Vision Pro as a device, for example, definitely has $3,500 worth of technology in it.

It doesn't have $3,500 worth of value to the regular consumer yet, right? Because it's not a mass market product just yet, but eventually it will be. Um, but yeah, the technology that's in the device, um, is extremely powerful: those cameras, um, and then also all the computer vision and AI that's happening in the device, is pretty impressive.

Irena Cronin: What's interesting is Apple doesn't like to use the word AI, and they will be using it very shortly for something else, for LLMs. But when it comes to hardware, they've shied away from ever using the term AI, and they always use machine learning, which is very, very interesting. They want to make sure that people aren't afraid

of using their hardware, because anytime you say AI, up until recently, it was kind of sometimes scary for some people. 

Richie Cotton: Uh, yeah, that's interesting that, um, machine learning is considered, um, a better term here. It's like, quite often, um, I spend a lot of time talking about machine learning and we have to put AI in the marketing stuff, because then more people listen to it, because AI is cool right now.

So it's interesting that Apple has gone in the other direction there. Um, all right. So, um, let's talk about, uh, implementing this in business. Uh, because it's not really a consumer thing as much at the moment; it's more of a sort of business, um, proposition. So for organizations who want to get started with spatial computing, what is step one?

Cathy Hackl: I think anyone listening to this needs to understand that we're not telling you to go out and buy a Vision Pro today. Right. But one of the ways to start thinking about what this means is: when computing starts to understand the physical world, how does that change your products and services?

So starting to think about that, you know, reading our book is a first easy step, right? To start to think about how you start to implement this. Um, I think we're going to start to see, like Irena mentioned, um, Apple bringing more announcements and more things that might force people to really rethink what spatial computing is and start to take it more seriously.

So those are some of the things I would say: kind of try to understand the shift that's happening here. Um, but by no means are we saying go buy a hundred Apple Vision Pros. I don't think that that's what we're alluding to. Irena? Yeah. 

Irena Cronin: Recently Tim Cook said that 50 percent of the S&P 500 companies have bought Apple Vision Pros and they're currently evaluating them.

So that's really interesting. If he hadn't said that, people really wouldn't have a clue, right? I mean, that's a lot of companies. That's a lot of power. So what are they doing with that? Well, they're, you know, onboarding some people to figure out what the power of the Apple Vision Pro is, what it could bring, what the use cases are for their particular companies, um, how easy it is to use, which, it's super easy to use,

and what the limits of its use are. So, the whole range of, you know, how does this piece of hardware work? And how does software get integrated into it? And they might even be thinking about, well, what kind of software would we build for this thing, for a use case? 

Richie Cotton: Okay, so it sounds like most companies are in, uh, very much a sort of prototyping stage at the moment.

Um, I'm curious as to who needs to be involved in this, like, so which teams or roles should be thinking about, well, do we need to adopt spatial computing or not? 

Irena Cronin: Uh, well, it would start with IT and innovation. So there are a lot of people in major corporations that are heads of innovation. So it would basically come from there.

It can even come from higher up, you know, the C-suite, and then they tell the innovation person, hey, we really want to do this; you now have to have a plan and figure out how to get this used. And then the head of innovation goes to IT, and then it goes down from there. So basically the power spot for this would be the innovation person.

Cathy Hackl: And I've definitely had really great conversations with, like, CIOs, CTOs, even CMOs, um, that have asked their teams to look into this. Um, so there definitely seems to be interest from the C-suite, but definitely the head of innovation and the tech teams. 

Richie Cotton: Okay, so it sounds like, um, from an executive point of view, it's going to be the chief technology officer who, who ends up as like head of spatial computing as well.

Or are there other executives that need to start worrying about this? 

Irena Cronin: Yes, those, but even CEOs might say, you know, it depends on how innovative the CEO is. 

Richie Cotton: All right. So yeah, I guess it depends on the level of innovation within your organization and how sort of forward thinking you need to be. Um, all right.

So, um, I guess most of the applications of this are only going to be as powerful as the data, um, that sort of put into them. So what sort of data should organizations be collecting in order to be able to make use of spatial computing in the future? 

Irena Cronin: It could be, now I'm thinking of combining it with an LLM again.

So, if you put in all kinds of data, lots and lots and lots of data, which you could carry into a spatial computing headset, um, and then you apply an LLM to it to crunch the data, um, I'd say there's no limit to what can be done with data these days. I mean, without the LLM, I would have said, sure, you know, there is a limit and, um, we don't know what that is, but there's a limit to it.

And it's going to be much slower. But right now, um, considering that you have LLM capability with spatial computing, um, I mean, it's just amazing, I think. 

Richie Cotton: Okay, lots of possibilities. Um, I guess the tricky part is going to be trying to figure out, like, which project to start off with. And some of the stuff, like, you suggested, like, okay, let's have a fleet of, um, delivery robots.

It sounds like quite an expensive first project. Is there anything simple that you think, um, a lot of organizations might be able to get started with as, like, a first spatial computer project? 

Irena Cronin: Um, yeah, I'd say go back to logistics, and for logistics, I wouldn't say the people that are on the floor exactly.

I'd say the person in charge. So it'd be management, because you don't want to outfit the people on the floor with $3,500 headsets, or even $2,000 headsets. Um, right now the people on the floor could do very well with $300 headsets and do their job. So we're talking more about management capability in using the headset.

They would be able to locate where workers are, they would be able to locate where, um, pallets are, and what's going on, uh, with their scheduling and all that kind of stuff. And that's not a hard thing to build. They already have stuff like that, but it's on a flat surface, you know, uh, versus a 3D capability.

Once you put 3D on there, and also, um, the ability to do things with your hands, to control things with your fingers and your eyes and everything, it becomes so much easier, so much faster to do your job. 

Richie Cotton: When you've got expensive technology, you probably want to think about management use cases before, like, everybody use cases.

That, that makes sense. 

Irena Cronin: But there's also another one that it's being used in right now, um, that you might say is not that simple, but it makes sense. (There's my cat here. Go! Get down! Geez! Wants to be in the frame.) Um, okay, so for surgeries: um, there are a lot of surgeons right now that are using the Apple Vision Pro, both for prepping, they have people that are prepping for the surgeries, taking a look at visuals and then helping the surgeon go through those visuals, and also for overlays during surgery.

So there were surgeons that were using the Microsoft HoloLens to do this before, but the resolution on the HoloLens was just not good at all. And, you know, in terms of mistakes that can happen, that's a very expensive mistake that you don't want to make. So, um, I think that this will be used very, very quickly in a whole bunch of different surgeries.

And they already have software to do it. It's just, they have to implement it into the Apple Vision Pro. 

Richie Cotton: Okay. Yeah. I can certainly see how the cost of getting a surgery wrong is just incredibly high, and so it's certainly worth the cost of, uh, an Apple headset. So, uh, yeah, that seems like a great use case.

Uh, Cathy, do you have any examples of, um, like good sort of getting started use cases? 

Cathy Hackl: Well, I definitely want to talk a little bit about the medical side, like Irena was. Um, at an event I spoke at recently, we did a live demo of the Apple Vision Pro on stage. And, um, I chose an app called AnimaRes, which shows you a 3D model of a beating heart, both in a healthy state and in a non-healthy state.

Um, and when we were doing that, the fact that I could manipulate the heart, make it bigger, make it smaller, I could go into the heart, like, there were so many things I could do with it that I could hear the crowd going, ooh, ah. So it definitely, from the perspective of the medical, um, professionals, is extremely powerful.

I also think, from the perspective of the patient, it can be extremely powerful, because I'm going to be able to meet with my doctor and better understand: what is this procedure I'm going to have? What is it going to do to my body? Like, how do I better educate myself before a procedure, right?

Because all of us tend to go and go down the web and then, you know, we might get all these crazy ideas. But if we actually have this 3D model and I'm in a virtual space with my doctor, kind of understanding what this procedure could do or what is going to be happening, I think that's a very powerful tool, both from an education perspective for the people that are, uh, the medical professionals, but also for patients.

Um, so I do want to say that I find that kind of, uh, an interesting use case for sure. 

Richie Cotton: Yeah, definitely. I mean, certainly there have been a lot of occasions where you go to the doctor and they give you a load of jargon about what's going on with your body, and then you have to go and look it up, and it's a time consuming process.

Just being able to actually see what happens inside yourself, I'm sure that can be a powerful, uh, tool. Well, first of all, an educational tool, but also, like, the lifestyle change stuff. Like, if you want to see what your liver actually looks like, maybe you go easy on the drinking after that.

Cathy Hackl: With technology nowadays, you can create virtual twins, right? You can start creating virtual twins of someone with, like, the MRI scanning, everything that's happening; you can start creating virtual twins of people. So, um, so yeah, I think that there's extreme power in that, and in how you start to look inside, right,

and, and educate yourself about what's going to happen in a procedure or something, so. 

Richie Cotton: So, I'd like to talk a little bit about, um, business risks and regulations, I think. So, are there any regulations around spatial computing and particularly the ones that affect businesses? 

Irena Cronin: That's a really good question.

I actually, um, know quite a few lawyers at Perkins Coie, and I've brought up that question, especially having to do with privacy. And, um, right now there are no real specific laws, like if you're talking about eye gaze.

There's no law that talks about eye gaze, or regulation that talks about what the limits are there. It's basically the corporations that have to come up with their own procedures and their own regulations, and then have the buy-in from their own customers, the people who are using their products, to understand how far it goes. But I believe in the future, yeah, once this stuff is around long enough, and people who aren't even using it, uh, have a need to be able to regulate it,

um, there's going to be very precise kinds of regulation about it. People especially that haven't used it could be quite afraid of what it offers. If you're using it, you know exactly what it entails. But if you're not, it's hard to tell how far it goes. 

Cathy Hackl: I also want to add that the state of Colorado, for example, just passed a law, um, protecting people's brainwaves.

Um, so that's, like, a precedent in some ways, right? It's just a state law, et cetera, but I think that it might come into play when you have these neural interfaces, right, that are tapping into intention. Um, so I think that that's interesting. And from a regulatory standpoint, I live in Washington, DC, and I'm having really interesting conversations around virtual air rights, like, who owns the air around me?

Uh, because with spatial computing, what you have is that the physical world becomes a canvas, right, for the device. It becomes, you know, a canvas, but at the same time it becomes real estate. So therein lies kind of that issue of, well, who owns the space around me, and who can actually advertise there?

What are they going to be able to put in front of me? Because, yeah, with some of these devices, what you're going to start to see is that everything within eyesight and earshot of you becomes a canvas, but also real estate. So that in itself ushers in a lot of questions around privacy, like Irena's mentioning.

Richie Cotton: That sort of reminds me of the early days of the internet, where there were just banner and pop up adverts everywhere, and it feels like if we repeat that with spatial computing and there are adverts floating in front of my gaze all the time, that's going to be horrendous. I'm hoping no one's seriously considering that, but Cathy, is anything happening in that area then?

Cathy Hackl: Well, I don't think anything's really happening with virtual air rights just yet. It's really early, and I don't think lawmakers are quite wrapping their heads around that; there's so much more they're wrapping their heads around right now. Um, but we'll have to see. I mean, we definitely have a whole section in the book where we talk about some of the, you know, considerations, etc.

So I definitely recommend, as a, you know, a thought experiment, to definitely tap into that part of the book and think about, well, what does this mean, right? For the future, if you're a lawmaker, for your future constituents, all those sorts of things. 

Irena Cronin: I do imagine that in the future, in the very near future though, there will be advertisements that pop up in front of you, but you would have to opt into that.

You know, it's not going to just show up and then you're going to have to opt out. At least that's my hope that that would happen and I'm pretty sure that that's, that's the way it's going to go. 

Richie Cotton: Okay, that's slightly reassuring. Um, so, just on this sort of privacy note: basically most personal data are protected; any kind of medical data about yourself is protected.

I'm not sure about things like location: does that count as personal data? And things like, I guess, you talked about how it tracks the state of your retinas. That feels like it ought to be medical information that's protected, but I have no idea. Are there any sort of limits to, like, what is protected and what isn't?

Irena Cronin: The retina, um, there are certain regulations about retina scans that are in place right now, but they're not laws yet. So it's still in motion. Um, location, uh, location doesn't necessarily have to be anything too bad as long as it's anonymized, so you don't know who's at that location, but you know there's somebody there.

So that could be perfectly fine. And in fact, if you ask Apple this, they'll say that all of the information that is available is anonymized, so that they don't know who it was, unless you have an app that you've allowed to be able to go into your data, and you'd have to opt into that, and Apple allows apps to be able to do that.

But yeah, um, you can imagine, Apple does retina scans. So that's got to be kept really carefully, uh, carefully under lock and key, so that no one gets access to that, for sure. 

Richie Cotton: Absolutely, I guess I don't want all of my retina scans being made public.

That sounds weird and a bit intrusive. Okay, so let's talk a bit about how spatial computing is going to change people's jobs. To begin with, what skills do you need in order to take advantage of spatial computing?

Irena Cronin: It depends on what perspective you're talking about. If you're talking about someone who uses it,

that could run the gamut from no education whatsoever, because the Apple Vision Pro is extremely easy to use. In fact, they've obviously done lots of testing on it, and they found that almost anyone can use an Apple Vision Pro. So from a user perspective, it could go from really easy to quite difficult, depending on whether you have a special app on there and you have to learn what that app is for the corporation.

Now, if you're a builder of software, I'd say it's along the same lines as someone who knows CGI or has been building for iOS, because the operating system for the Apple Vision Pro is very close to the iOS operating system, which is for your phone.

So actually, the people building for the company I'm working at, DADOS Technology, have built stuff for iOS devices, and they've had to learn SwiftUI and a couple of other things that Apple uses, but they're very easily building the app right now, so that's not even hard. The things that could be a little difficult are that, since the Apple Vision Pro operating system is quite new, there are things that aren't that obvious, and you need to ask someone like a developer relations person: "I really can't figure this out.

There's no documentation. Can you help me?" And in fact, we do have a developer relations person and are able to ask these questions and get them answered. So I'd say, if you're a developer, it's really not that difficult to just decide to now build for the Apple Vision Pro.
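To make that point concrete, here is a minimal sketch of the kind of SwiftUI code an iOS developer would already recognise. The app and view names are hypothetical, and this is only an illustration of the claim that familiar SwiftUI patterns carry over, not code discussed in the episode.

```swift
import SwiftUI

// Minimal sketch, assuming standard SwiftUI conventions.
// HelloSpatialApp and ContentView are made-up names for illustration.
@main
struct HelloSpatialApp: App {
    var body: some Scene {
        // A plain window scene: the same entry point an iOS app would use.
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        VStack(spacing: 16) {
            Text("Hello, spatial computing")
                .font(.largeTitle)
            Text("Familiar SwiftUI views, laid out in a window.")
        }
        .padding()
    }
}
```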

Richie Cotton: And is it important to have skills in spatial data analysis or spatial statistics?

Do you think that's going to help people adopt spatial computing?

Irena Cronin: I'd say those are completely new areas. Spatial data is something that is uncharted; we're building this whole thing around spatial data now. Yes, there are companies like Virtualitics that have done spatial data, but they do it in a different way.

And there are some other companies, but it's not out there in the populace that, hey, there's this thing called spatial data, right? How do I learn what to do with it? So I think the Apple Vision Pro is going to democratize this, to the extent that it'll be really easy to figure out how to use spatial data.

It'll be really easy to integrate the data into something and then have something come out of it that is spatial. But right now it's a totally new field.

Richie Cotton: All right. And just to wrap up, we talked a lot about the Apple platform. It feels like the Vision Pro is sort of the first viable platform.

What are we going to see in the future? How are things going to change? Are there going to be competitors? What's going to come next? Cathy?

Cathy Hackl: I think, and a lot of people agree, that we're going to see a Cambrian explosion of new hardware over the next six to 18 months, especially with everyone trying to put AI into everything.

And some of these devices are going to be spatial computers, right? So we're going to continue to see Meta advancing some of the things they're working on, Apple as well, Samsung. So I think we're going to continue to see a lot of new hardware types for sure, and definitely new software.

Richie Cotton: Okay, so competition coming soon. That's a good note; competition is always good for the consumer. And Irena, what's in your crystal ball? What are you forecasting is going to happen soon?

Irena Cronin: So it's been leaked by several people that Apple is building two different new Apple Vision Pros.

One that is more simplified and less expensive, to the tune of something like $1,500, and another one that is the second version of the Apple Vision Pro currently out there. In terms of timing, originally it was supposed to come out next fall, and now they're saying it's sometime in 2026 or something like that.

We'll see. I can't really add any more to that, even though I do actually know a bit more. But I'd say that it's really great; we do need a cheaper version. That $3,500 price tag is way too much for a lot of people. It's not easy for most people to buy, nor would they want to buy it at that price.

Other things: Apple is working on Apple Glasses, so sometime in 2027 or 2028 they're going to come out with glasses. They've been working on glasses forever. I wouldn't say it will eat into the Apple Vision Pro, because they'll be for very different things. With the glasses, you can go running.

You could play sports. You could do all kinds of things. It'll have a different use case. I mean, even if the Apple Vision Pro becomes lighter, right now it's a bit too heavy. You certainly don't want to go outside with it, although people have done that for pranking or whatever: go outside, wear it, and walk through traffic with it.

But with the glasses, you'll be able to do a lot of different things. As Cathy said, there's a whole bunch of other companies coming along that have been working on both headsets and glasses, and we're going to see both of those variations within the next three years.

Richie Cotton: It sounds like we have a distinction, then, between the consumer space and the business space for spatial computing.

Irena Cronin: I'd say that the Apple Vision Pro, with the cheaper model at around $1,500, can very easily become a consumer thing.

Richie Cotton: All right. And do you have any final advice for individuals or organizations who want to get started with spatial computing?

Irena Cronin: I'd say that, aside from the price tag, it's super easy to just jump in and do it.

Just buy the headset and use it, see how it actually works, how you don't need to teach anyone how to use it, and how many use cases there are for it, and this is just the beginning. So don't be afraid to jump in and try it.

Richie Cotton: Okay, I like it. Just go and jump in. And Cathy, what's your final advice?

Cathy Hackl: Yeah, I agree with Irena. Jump in, test it out, and definitely read her book. I think that'll help you get a more grounded understanding of what spatial computing is and the role AI plays in it.

Richie Cotton: All right, super. That's a bit of homework for the audience, then: just jump in and try this thing out for themselves.

All right, wonderful. Thank you, Irena. Thank you, Cathy. It's been great having you both on the show.

Irena Cronin: Thank you. Thank you.
