Conversations on Wellbeing at Work

Exploring The Impact of AI on Employee Wellness: A Conversation with Meg Price, Co-Founder of Noacoach

November 08, 2023 John Brewer

Imagine utilizing artificial intelligence to improve wellness in your workplace - sounds like a dream, right? Well, it's no longer just a dream. 

Join us as we chat with Meg Price, the trailblazing co-founder and product director of Noacoach, a tech startup that's reshaping the wellbeing journeys of employees. Noacoach has harnessed the power of AI to provide accessible and low-cost coaching round the clock. Listen in and get a first-hand account of how this innovative platform combines neuroscience-backed activities with AI coaching to revolutionize the wellness space.

But that's not all: we dive into the fascinating area of cognitive behavioral coaching and the role AI can play in it. We question whether AI is a mere tool or whether it can truly be a technology partner fostering human connection. We take on the weighty responsibility that tech developers bear to create tech that doesn't just dazzle us, but genuinely serves us. But we also confront the potential pitfalls: could we become too reliant on AI? How does it impact our psychological safety at work?

Our conversation then turns towards the ever-important balance between technology and nature. We ponder how tech not only enhances connections but also prompts us to disconnect and appreciate the world outside the screens. The icing on the cake is a sneak peek into the upcoming conference summit and the considerable value it promises. So tune in as we traverse the intricate world of AI and wellbeing at work. This episode promises to stimulate your mind, challenge your perspectives, and leave you with food for thought.

You can contact Meg and find out more about Noacoach on their website at https://noacoach.com

Find out more about Wellbeing at Work's Global Summits, our Global Hub Community of C-Suite executives and our Bespoke division at wellbeingatwork.world



John Brewer:

Hello and welcome to the latest episode of Conversations on Wellbeing at Work, the podcast by Wellbeing at Work. I work for the organisation and run some of our events in North America, and we're running a podcast series with some of our speakers and exhibitors from the Australian summits coming up in a couple of weeks. You can check out all our summits on our website, wellbeingatwork.world. We run summits in eight regions, and there's also a hub and a number of podcasts and webcasts that you can check out, so please do. Our guest today is Meg Price. She's the co-founder and product director of Noa Coach, and she's also currently doing a PhD on AI and human wellbeing, which is going to be a big part of our conversation today. So welcome, Meg.

Meg Price:

Thank you. Thank you, John. Thank you very much for having me.

John Brewer:

Okay. So before we start, as always, I put my guests on the spot and simply ask them: how are you doing?

Meg Price:

I am doing very well, thank you. I'm based in Melbourne, Australia. We're leading into summer, which is my favourite season, so I'm very happy at the moment. Thank you.

John Brewer:

That's good. I know you've had some unpleasant weather in Melbourne in recent months, though. Was there not some flooding? I thought I heard that, or maybe imagined it.

Meg Price:

Not in Melbourne, but we are having a bit of everything, and we are gearing up for a very hot summer, they tell us. So we are all in preparation mode, if you like, but at the moment spring is being okay.

John Brewer:

Lovely. It's a good time of the year wherever you are. It's winter now in Canada, anyway. So before we get going with the conversation around those issues, around AI and wellbeing and the future of work and a few other really meaty things we're going to be talking about, can you maybe just set the scene a little bit about the work you do at Noa Coach and what that is?

Meg Price:

Yes, I'd love to. Noa has evolved. We're a tech startup, we started up only three years ago, so we have evolved and we're now a fairly comprehensive wellbeing platform. It's designed to declutter the wellbeing journey, if you like, for employees: a platform where we have AI coaching, but also a lot of neuroscience-backed activities and courses that can help people, and also a platform that organisations can put their own wellbeing initiatives in, so that there's just one spot to declutter and demystify all these wellbeing activities, in one nice platform that you can get on your phone or on your computer.

John Brewer:

Okay, sounds great. I'll be checking it out shortly after this episode. So let's talk a little bit about the AI, because I know there's extensive use of AI within your platform, and there's a lot of talk nowadays about how AI is changing work. There's this sort of revolution where no one really quite knows what it is, but we're pretty sure it's going to be something, and something huge. So, in the wellbeing environment, how is AI influencing that, in your experience?

Meg Price:

Yes, and this is really interesting, because things have evolved a lot in the few years that we've been building this. When we started Noa, it was during COVID, to be honest; we were in COVID lockdowns, and Melbourne was one of the most locked-down cities during COVID. One of my co-founders and I were going for a walk, and we both do a lot of coaching work, and we were saying that a lot of people were incredibly stressed, getting burnt out, having trouble working from home, having trouble with lockdowns, and that organisations couldn't afford to give everybody coaching, because it's expensive and it takes time. And because I'd done some work in technology before, we said: could we create something like an AI coach that doesn't replace human coaching, but is available 24/7 and just might help somebody talk through some issues and work out, who do I need to actually speak to? So that's where we first started.

Meg Price:

It was through this lockdown conversation of saying, hey, we could do this. And when I started researching it, I found that there's still a stigma, sadly, about mental health and stress and wellbeing, and there's actually research (this went back a few years, and there's more now) that people would rather talk to an AI about their mental health than a human, which I don't know if that says something about the quality of the AI or the quality of humans, but it's there. Some of the statistics: one in two people would rather speak to an AI about their mental health than their doctor, and four out of five would rather speak to an AI than their manager. So this is where we decided: could we create something? And we've spent the three years actually building and putting coaching techniques and all sorts of information in, to help that AI have a chat with you.

John Brewer:

Yeah. Now, I think in some ways you've described the way AI gets understood in terms of work more broadly: there's a lot of talk about how it's not going to replace your job, it's going to augment it, simplify things, make them quicker, make them more accessible. In the same way, with an AI, I don't know if I can use the term AI therapist, or AI...

John Brewer:

AI coach, or companion coach, I suppose. Well, it's providing a service which is, you know, a lot lower cost, if not more accessible in terms of time and place, right?

Meg Price:

Absolutely, because it's there 24/7. If I'm stressed and I want to just chat through my thinking and get those thoughts outside my head, and I might not want to share that with somebody because they're just my thoughts at this stage, I can share that with an AI coach at 3am if that's what I choose to do. And there's a lot of value in that, because otherwise our thoughts often stay in our head and go round and round and we don't escape them.

John Brewer:

Well, to me, one of the big changes in the last few years with wellbeing at work is the way it's become rooted, in a lot of organisations, in their diversity and equity practices. There's a much greater realisation that not every therapist should be a white guy, right? I put it in a very crass kind of way, in that sense. And bias is a big problem with artificial intelligence. So how do you address that? How do you make sure that you're providing an appropriate sort of perspective to the individual who's got an issue they want to share with the AI?

Meg Price:

Yes, and that is continual work on my end, and part of my research, because a lot of AI is biased because of the nature of who it's developed by. Now, we were lucky enough that one of our co-founders has an AI business, so we've been building the AI of Noa Coach ourselves, and in doing so we have been inputting all sorts of coaching conversations and wellbeing conversations, globally and from diverse backgrounds, to try to mitigate some of the bias that is perhaps in some of the general GPTs or general AIs. Is it perfect? No, because even I might be introducing bias when I'm looking for some information and putting it in. So to me it's a work in progress, and especially with things like AI, I'm all for more people actually getting on board and helping build this, rather than leaving it to certain parties.
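For technically minded listeners, here is one concrete (and entirely hypothetical) way a team might chip away at this kind of bias: audit the mix of coaching conversations before using them to train or fine-tune a model, so no single region or background dominates. This sketch is not Noa Coach's pipeline; the file name, fields and the 30% cap are illustrative assumptions.

```python
# Hypothetical sketch: checking that a coaching-conversation training set
# isn't dominated by one region before it is used for fine-tuning.
import json
from collections import Counter

def load_examples(path: str) -> list[dict]:
    """Each line: {"region": ..., "conversation": [...]} - one coaching dialogue."""
    with open(path, encoding="utf-8") as f:
        return [json.loads(line) for line in f]

def region_balance(examples: list[dict], max_share: float = 0.3) -> dict[str, float]:
    """Report each region's share of the dataset and flag over-represented ones."""
    counts = Counter(ex["region"] for ex in examples)
    total = sum(counts.values())
    shares = {region: n / total for region, n in counts.items()}
    for region, share in shares.items():
        if share > max_share:
            print(f"warning: {region} makes up {share:.0%} of examples (cap {max_share:.0%})")
    return shares

if __name__ == "__main__":
    # "coaching_conversations.jsonl" is an assumed file name for illustration.
    print(region_balance(load_examples("coaching_conversations.jsonl")))
```

A check like this doesn't remove bias on its own, but it makes the composition of the data visible, which is the first step towards the kind of deliberate curation Meg describes.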

John Brewer:

I know when you use AI a little bit in writing work and the like, that's very text-based. But if I was communicating with a coach, I'd presumably do so verbally. So with an AI in the wellbeing space, is the AI an actual kind of person in that sense, or is it a text-based thing? Does that even matter?

Meg Price:

It matters, for a couple of things in my mind. I want people to know that it's an AI coach. I don't want them to confuse it with a human, because I want my AI coach to drive people towards human connection, and I think we shouldn't be getting people confused: am I speaking to a human here, or am I speaking to a chatbot? They should know up front what it is. So ours is text-based. I can speak into my phone rather than type, just like I can when I send a text message to somebody, so I don't have to type.

Meg Price:

But we want people to know this is an AI coach. It's not perfect, it's not a human. It'll help your thinking, and it might help you. We've got clients who said one of the things they like about talking to the AI coach is they can just get help with: how shall I approach this person? What's the best way? What's my opening line to go and talk to this person? And the AI will help them do that. But I'm a big believer that we should not be trying to blend the two too much.

John Brewer:

So the way you describe it there, it's much more, to me, and I think of an analogy here, the technology performing the function of a co-pilot as I navigate the world emotionally and mentally, versus me going to this thing for the equivalent of lying on a psychiatrist's couch, with a bit of dream interpretation thrown in, sort of thing?

Meg Price:

Absolutely. It should be a co-pilot and an assistant, just like you're using AI to help with some writing; an AI coach that I can just check my thinking with, have a chat, maybe get some ideas about how I'm going to get through today the best way I can. Yeah, absolutely an assistant, but remembering that AI now is more of a partner than a tool. And I say that because there's a difference there: it's a partner in that I'm going to be able to talk to it, it will talk back in a similar way to a human, and I'm also helping the AI learn, so there's a two-way relationship, rather than just using a hammer or a tool.

John Brewer:

But one of the big challenges that's faced, it seems to me, globally (there was a report recently by the Surgeon General in the US describing loneliness as an epidemic, not unlike COVID, in fact) is that lack of human connection, which a lot of people associate with the use of technology. The more time we spend with technology, the less time we spend with real humans. Is there a danger here that it becomes quite seductive, so I'll get advice on a particular issue from, again, whatever name your AI thing has, whether it's Siri or whatever in the Apple space? Is there a danger that people become too attached to that?

Meg Price:

Yes, I think there is a danger, depending on the technology and how it's built. Again, this is part of my research and part of my interest, even having teenagers that have come through the gaming world and thinking: should they be sitting there gaming? Shouldn't they be outside on a bike, playing with friends? But then realising, actually, they're gaming with friends. What's the difference here?

Meg Price:

But I think part of why Noa Coach is not just an AI coach, part of the reason the coach is a small feature of Noa, is that we realised fairly early that we didn't want to create dependency on the AI coach; we wanted to create an AI coach that helped people have human connections. So the Noa platform actually has specialised forums where we have humans, and it has a coaching and course hub where you can go and look for human coaches as well. Now, I know there's technology out there that is built to make money and drive engagement, that wants to take your mind and your engagement and keep you on the platform. That might be helping a person feel less lonely in the moment, but basically I don't think that's necessarily good technology for humans. So I sit on the side of saying it might make more money to have you engaged in my product 24/7, but it doesn't go with what we're looking for as a wellbeing platform. I guess that's my answer to that.

John Brewer:

There's a yes-but in there, yes. I'm vaguely familiar with the work of Sherry Turkle, for instance, who I think is out of MIT. She wrote a book many years ago called Life on the Screen, I think it was, about people taking on personalities in multi-user dungeons on the Internet and seeing that as a therapeutic process, because it gives people the opportunity to explore persona and do all sorts of other things, especially people who are fairly introverted. So it opened things up quite a lot socially, which is really positive. Now, her later work is much more "oh my God, we're all alone and we're all going to die". But there is that: it's not just the tool, it's obviously how you use it.

Meg Price:

Yes, and I think the responsibility is on tech developers like myself and my co-founders to create technology for good. As I say, you have to think about it: you build an AI coach, so how is this going to be used, and how is this going to affect people? And therefore, what else are you putting in the platform that ensures it is good technology? I'd like more people to come to that table, and more tech developers looking at all the different ways people will use your technology and making sure it is good technology for humans.

John Brewer:

Yeah. I know one of the sort of tools you use, in some ways, is cognitive behavioural coaching as a modality. Firstly, what does that mean? And secondly, how does the AI deliver that, or is that delivered by real people? What's the situation, not just in the context of your platform, but also in terms of what's going on with employers delivering wellbeing to their people?

Meg Price:

Yeah. So there are a lot of different techniques for coaching, just like there are for therapy, and cognitive behavioural work is often talked about, certainly in the therapy world, as talk therapy; coaching in the workplace is often that, where we talk through issues. A lot of my coaching accreditations and the work I've done in coaching has been around neuroscience and brain-based coaching. And what I'd say with that, and the cognitive behavioural bit, is that often people feel very emotional about an issue, especially when we're working on something, or they can't see the forest for the trees, or they're stuck, not being able to see the strategy moving forward.

Meg Price:

So with that idea of using brain-based coaching, it's to try to switch the prefrontal cortex back on and be able to talk through some of the things that might be clouding judgment or causing those almost extreme emotions, to clarify it and find: okay, what are the one or two steps forward I can make right now that are in my control? In doing that with the AI coach, we've spent the last few years really building up the AI coach to be quite specific in the way it asks questions, and the types of questions, powerful questions that are going to encourage your thinking in that way. We built a huge amount of coaching conversations into Noa Coach so that Noa knows what sort of questions to ask. If you're really pushing for advice, it can also give some advice, because sometimes people say, I don't want questions, I just need a quick idea now about what I can do, and coaches will give that too, and Noa will do that.

John Brewer:

So one of the things you hear a lot around AI is prompt engineering: if you're given ChatGPT and you just ask it questions randomly that you think might be interesting and that you might need answers to, you may not really make much progress. What you need to do is be more disciplined in the questions that you ask, more refined, and I've seen ridiculously long documents about prompt engineering, and you can make a lot of money doing this, which I'm sure is great, and I wish I could do it. But there you go. Where people are interacting with a person, that's familiar territory for us. Do we need to train ourselves to deal with AI in the wellbeing space, or is that something that the AI already handles for us?

Meg Price:

In terms of Noa, it already handles it, because ChatGPT came out at the end of last year and then you started hearing about prompts, whereas we've really been doing the prompting, through machine learning, for the last four years. So when you go and chat to Noa, it already knows, and Noa can chat and respond. Again, is it perfect? No. Is a human coach perfect and not biased and not judgmental? No. But you don't need to go in and prompt it and say, oh, Noa, I'm feeling like this, and this is the sort of conversation I want. Noa gets that straight off. It starts the conversation: just, you know, what's on your mind, and let's go.
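For readers wondering what "the prompting is built in" can look like in practice, here is a minimal, hypothetical sketch: every message a user sends is wrapped in a fixed coaching-style system prompt before it reaches whatever model sits behind the app, so the user never has to craft prompts themselves. This is not Noa's actual implementation; the prompt wording and the call_model placeholder are assumptions.

```python
# Hypothetical sketch: baking coaching-style prompting into the app so the
# user just types what's on their mind, with no prompt engineering needed.

COACH_SYSTEM_PROMPT = (
    "You are a wellbeing coach. Use open, non-judgemental questions to help "
    "the user clarify their thinking. Ask one question at a time. Offer brief, "
    "practical suggestions only when the user explicitly asks for advice."
)

def build_messages(user_text: str, history: list[dict] | None = None) -> list[dict]:
    """Wrap the user's raw message in the fixed coaching context."""
    messages = [{"role": "system", "content": COACH_SYSTEM_PROMPT}]
    messages.extend(history or [])  # earlier turns in this conversation, if any
    messages.append({"role": "user", "content": user_text})
    return messages

def call_model(messages: list[dict]) -> str:
    """Placeholder for whatever chat-completion backend an app might use."""
    # A real app would send `messages` to a hosted or self-trained model here.
    return "What feels like the most pressing thing on your mind right now?"

if __name__ == "__main__":
    reply = call_model(build_messages("I'm overwhelmed and don't know where to start today."))
    print(reply)
```

The point of the pattern is simply that the discipline of prompt engineering lives in the application, not with the person who is already stressed.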

John Brewer:

Going back to one of the things you mentioned at the start, which I think is quite interesting: I do think the world has become more cluttered, to the extent that it can become quite a burden, just in terms of the volume of information. If you look at something like Twitter (craziness is probably an inappropriate term to use), the sheer volume of information, the inability to distinguish between what's good and what's bad, lots of stuff generated by imaginary people, that, to me, seems to be both a mental and emotional challenge for me and others. But it's also a problem within the wellbeing space, which you've fingered a little bit, where there are just so many things going on, and different offers and different stuff. So you try to bring a certain sort of sanity to that.

Meg Price:

Again, that's probably an inappropriate term.

John Brewer:

No, absolutely.

Meg Price:

And look, I've got decades of work in human resources and looking after people's wellbeing, and I think we've almost created a monster in corporate wellbeing. If I'm stressed, I'm already in threat mode, I'm already struggling to just make decisions and get things done. Yet when I'm stressed and in that mode, you're going to say to me: hey, it's time you looked after you, it's time you looked after your wellbeing, there are 700 resources here, go find something to make you better. And I'm going to go: you've just made me more stressed, because now I think I shouldn't be stressed, and I've got to pick one of these things.

Meg Price:

As we developed the platform, we wanted to declutter that, so you're not walking into this room and going, oh wow, what am I meant to pick here?, finding it too stressful and walking back out of the room. You can start by just chatting about where I'm at, and then finding which resource, in this moment, is the thing that's going to help me. Because I think every person's different, every brain's different, every moment of stress is different.

Meg Price:

There is no one-solution-fits-all, and I'd like to think that we can cut through some of that noise and help hold somebody's hand towards what's going to work for them now. And the way we built the platform is that if companies have some fantastic resources, they can put those all into our platform. But let's not have one of those computer screens that's just covered with apps, with every app open, or your phone with too many apps open. Let's shut them all down and help people find what they need in the moment, and that moment is 24/7. People don't turn their stress off at five o'clock when they go home, so let them have a platform on their phone where they can say, you know, right now, what is an activity I can do that maybe just gets me outside, or maybe just helps me for five minutes before I walk in to my family, so that I can be the parent I want to be.

John Brewer:

But perhaps the best advice is to put the phone down, of course.

Meg Price:

And there is that, and that may be the advice: that's exactly what you should do.

John Brewer:

I mean, we live in a world where I'm spending my evening here chatting. I'm thoroughly enjoying the conversation, but I'm sitting in front of a screen chatting to you on the other side of the world. I might be eating a little later than usual, so I'll order in; perish the thought I'd actually go somewhere and eat something. I'm not going to go to a movie theatre, because Netflix has anything I could possibly want to watch, and lots of stuff that I have absolutely no interest in watching. I think it's hard for the technology not to become part of the problem.

Meg Price:

Yes, and we need to make sure that, again, the technology is not just guiding people to human connection, but also guiding people into research-backed activities that help them get off technology. The technology can help you with that as well.

John Brewer:

Yeah. And a related issue to that sort of clutter and overwhelm and distraction, that distracting world that we live in (I hate to think what my record is for the number of tabs I have open in my browser during the day, for instance), is burnout.

John Brewer:

People do get very burnt out, and again, that's one of those problems where the solution is not to pile more onto them and then remind them of all the choices. So is that something AI can help with? It can help on the management-of-work side, presumably; it can help us manage work. Once we're done with this conversation, I'll have some AI thing that will write a neat little summary that I'll then edit, versus having to listen to the whole conversation again and make tweaks. But can it help us with addressing the problem within our heads?

Meg Price:

Oh, absolutely. You're at the evening of your day; my day is just starting. I plan to say to Noa: these are the hundred things I need to do today, Noa, I don't even know how I'm going to do it before dinner time, can you just help me make my day seamless? Give me some ideas of how I'm going to do this without getting myself stressed and procrastinating. And Noa will help me do that and start my day off.

John Brewer:

That's pretty good, isn't it? That's pretty cool. And that's probably a question you wouldn't ask a human; there isn't really a human you can ask that question a lot of the time, is there?

Meg Price:

No. They might ask you more questions, they might tell you what they would do. I've been talking to Noa for a while; Noa will also give me some science-backed ways to do it. But there are a lot of things that we actually don't want to ask a human, and we don't, because we don't want to feel judged, or we're just not ready to voice that opinion yet. They're the things we can speak to an AI about, and that's okay if it helps me work out who I then need to speak to, or how to go forward.

John Brewer:

So if I take that one step further: there's a lot of focus nowadays on the importance of psychological safety, which I think we can agree is a good thing. What you're saying is that actually the AI is safer than the people? Or is the AI...?

Meg Price:

The AI can feel safer at times. So one of the courses on Noa is actually a psychological safety course that we partner on with Neuro Capability, and we use the course to help people understand psychological safety and how we can create it in organisations, alongside being able to talk to the AI. We've had people who have come back to us and said: I was able to speak to the AI when I was furious at someone, but I knew that if I spoke to them it wasn't going to go well, and I needed to calm down, see things clearly and work out how to speak to that person. So the AI really just helped them de-stress in that moment, to have a more productive conversation with the person.

Meg Price:

And I've had a few different examples. I had somebody who said they just didn't know how to approach somebody about something, and they had procrastinated for days and days on it (it was a positive thing they were looking for), and Noa just gave them the opening line, which worked. So sometimes it's just getting help with things you perhaps wouldn't ask a person. It's a bit like Khan Academy, who now have an AI tutor on their site. Listening to a podcast from their founder, he said one of the reasons they have that is because students don't want to ask what seems like a dumb question. But they'll ask the AI a question, and they might have a whole concept around maths that they just couldn't figure out, and the AI tutor will help them, because they didn't want to seem silly asking the teacher. AI coaching is the same.

John Brewer:

But shouldn't we create an environment where people don't feel silly?

Meg Price:

We should, and part of my research is looking at why it is that people feel judged by others, and why it is that not everybody is as empathetic as we'd like them to be.

John Brewer:

Because that's what people are doing most of the time: they're judging you, because it's human nature. But yeah.

Meg Price:

But we know that if we've got to sit in a conversation and we want to hold that judgment, we've got to work at that skill. We don't all have that skill.

John Brewer:

So I think the AI coach has a place there. And we have a different standard for those people with whom we're intimate; we don't expect our spouse to be judgmental in the way that our boss might be within that context, or a teacher, or a judge, even if you happen to have committed some horrible offence.

Meg Price:

And that's the interesting thing. People often say to me, but AI is not empathetic, humans are different because humans are empathetic. Yet I've worked with a lot of AIs that seem to be more empathetic than many humans I know.

John Brewer:

Yeah, I think that's probably true. But the other thing, though (and I think we're going off on a bit of a tangent here, which is okay, if you'll bear with me a bit), is that AI is creating a very welcoming environment for us. I feel a connection, I feel I almost belong within this environment. But there are also people who are terrified of AI: this is going to take my job, who knows what it's going to do, it could do all sorts of horrible things. So there's that dark side to this, isn't there, in the way that there's a dark side to belonging as well?

Meg Price:

Yeah, absolutely, and I think there have always been people worried about the calculator taking their job, then the internet, then Google. We've always had those fears and we always will, and change is scary. It is a really difficult area, and AI is growing and changing at such an exponential rate that we haven't seen technology like this before, so the change is almost even scarier for people. My only advice there, if anybody wanted to take it, is just dabble in it, just start playing with it. Don't put your head in the sand, because it's here. It will take some jobs, but it will create a lot more jobs as well. So just start playing with it. You don't need to accept it, you don't need to love it, but I would love far more people to be involved with it, making it less biased and more globally usable and acceptable, if you like.

John Brewer:

Aside from your own work in this area (and obviously you've got the application with AI heavily integrated into it), do you think that five years from now we'll all be using AI as some form of support for our mental wellbeing? You've gone quiet.

Meg Price:

Yes. When I've finished my PhD, I'll let you know, because I do want to research that idea of, if we're talking to AI for emotional support, what does that do for our wellbeing and our human relationships? But looking at the number of AIs that are out there, and young people reaching out to things like Character.ai (you can also get an app called Text With Jesus), the number of different apps out there for emotional support now, then yes: given what they'll be in five years, and the way people already talk to Siri and Alexa, I think we will be. We'll only be using it more.

John Brewer:

I probably shouldn't ask this question, but I'm going to anyway. One of the things that interests me is around grief and loss. I got very upset when my father died, and one thing that upset me was that there was a voicemail from him, the last voicemail he'd left before he died, and it was on my phone, and my service provider deleted it and then offered me a $5 refund as an apology. But anyway, there's this idea of capturing people's identities and having them live on. I have moments when I think it would be nice to chat to him today.

Meg Price:

But you can't, right.

John Brewer:

Are there people recreating people in AI? Silly question, of course there are.

Meg Price:

I haven't done a lot of research on it, but I know that, yes, you can capture voice, and it's very easy now to capture voice and then create something, so that you and I could be speaking to our fathers and hearing their voices. And if we've trained these things, from what I understand, my dad would give me the advice I'd expect to hear from my dad, in his words. Yes, of course. Again, the research I'd like to do is: is it good for us or not, to be able to access this?

John Brewer:

It reminds me of that Harry Potter scene where he's standing in front of the mirror and it's his parents in the mirror, and he would happily stay there all day. But no, you have to go out into the world without those people.

Meg Price:

Yeah, this conversation has taken a bit of an odd turn, hasn't it?

John Brewer:

Yes. Unfortunately we're out of time, which is terrible, but I do have time for one more question before you go. And it's interesting, because you were talking about the clutter and the choices. We do ask our guests to share the one thing they do that they find best supports their mental health and wellbeing. So what's the one activity you engage in that you really wouldn't give up?

Meg Price:

I have a tech startup, I love tech, but the thing that I do for my wellbeing is this: I have a small alpaca farm. I turn off the technology and I go into the paddock.

John Brewer:

Is that small alpacas or a farm that's filled with regular sized alpacas?

Meg Price:

It's a small farm with regular sized alpacas and it's getting out into nature. I'm a big fan of technology, but I'm also a big fan of turning it off.

John Brewer:

Yes, and that, I think, is an essential message. It's a balance. Connection with nature is really important, but on the other hand, technology also helps us in all sorts of other ways. It's helped us have this conversation today, which is great. Thank you, I've thoroughly enjoyed this, and I wish I was going to be in Melbourne to meet you in person and pick your brain a little bit more, and maybe even share a gin at some point. That would be wonderful; I thought you were going to go the gin route with the thing you do for your wellbeing.

John Brewer:

It's been a delight and I really appreciate it. All the best with the summit coming up; I'm sure you'll have a lot of people interested in the work you're doing, and you'll offer a lot of really valuable insights. So thank you.

Meg Price:

Thank you, John. I really appreciated your questions and I really loved the chat.

John Brewer:

All right, take care. That's great, thank you.

AI and Wellbeing at Work
Exploring AI Coaching and Human Connection
AI's Impact on Psychological Safety
Balancing Technology and Nature