The Future of Work is Still Human with Rashmi Jolly

The She+ Geeks Out Podcast by Inclusion Geeks

The robots aren't in charge... yet!

In this episode of the She Geeks Out podcast, we chat with Rashmi Jolly, founder of Assideo Consulting, global innovation leader, and deeply thoughtful future-of-work nerd, to talk about what happens when AI collides with humanity, power, and culture.

Rashmi shares her wild journey from immigrant kid with “doctor or bust” expectations, to Wall Street, to entrepreneurship in women’s health and genetics, to roles at the Economist Intelligence Unit, Mastercard, and Bain’s innovation group, and now to life split between Dubai, Zurich, and her kids’ school in the U.S.

Together, we explore:

  • How AI is being treated like the new high-priced consultant, and what gets lost when leaders trust the tool more than their own people
  • The quiet ways generative AI is eroding creativity, learning, and confidence, especially for younger workers who never got to solve problems without it
  • The ethics red flags Rashmi is most worried about, from biased datasets in women’s health to opaque data collection and “empathetic” chatbots that are a little too good at keeping us hooked
  • How different countries (including China, Singapore, and the UAE) are regulating tech, education, and kids’ screen time, and what the U.S. might learn from that, even with all the complexities and human rights concerns
  • Why psychological safety is non-negotiable for real innovation, and how framing work as “serving another human” changes everything

Rashmi also shares hopeful stories about her kids and their peers, the emotional language they’re developing, and why she still believes the next generation can pull us out of this feverish tech dream and back into a more grounded, human way of working.

If you care about AI, inclusion, power, leadership, and what kind of world we’re handing to young people, this one will stick with you long after you hit pause.

Episode Chapters:

  • (0:00:07) - Intro (Felicia and Rachel): The Neuroscience of Trust in the Workplace
  • (0:10:16) - Navigating a Dynamic Work Landscape
  • (0:16:45) - Reimagining Work in AI-Era
  • (0:28:00) - Balancing Empathy in AI Development
  • (0:41:33) - Building Psychological Safety for Innovation
  • (0:54:19) - Ethical Concerns in AI Development
  • (1:00:52) - Cultural Perspectives on Future Work

Visit us at InclusionGeeks.com to stay up to date on all the ways you can make the workplace work for everyone! Check out Inclusion Geeks Academy and InclusionGeeks.com/podcast for the code to get a free mini course.

0:00:07 - Rachel Murray Hi and welcome to the She+ Geeks Out podcast, where we geek out about workplace inclusion and talk with brilliant humans doing great work, making the world a better and brighter place. I'm Rachel.

0:00:17 - Felicia Jadczak And I'm Felicia. Our guest today is Rashmi Jolly, the founder of Assideo Consulting and a powerhouse voice in the conversation about the future of work. Rashmi helps leaders and organizations navigate the messy, magical intersection of humans and technology, where AI meets empathy and innovation meets inclusion. With deep experience across strategy, culture, and change, she's here to remind us that the future of work isn't just about algorithms and automation. It's about people. We dive into the risks and promises of AI, how we can be more intentional about its use, and why compassion and psychological safety have to be at the center of how we shape the workplace of tomorrow.

0:00:58 - Rachel Murray Well, how wonderful is that. It's so related to what we're going to talk about. Who knew that people still mattered in 2025?

0:01:04 - Felicia Jadczak I know. Shocker, right?

0:01:07 - Rachel Murray It was a great conversation too, and I think it is very relevant to sort of what we wanted to get into. So, Felicia, what do we want to get into today?

0:01:15 - Felicia Jadczak Let's get into some discussion around. Trust you know, just a fun little thing to talk about. Do you trust me?

0:01:23 - Rachel Murray Can I tell you I was literally just saying to my delightful life partner that you are someone who I've said this to you before. I'm pretty sure that I feel like if someone was going to shoot at me, you would take your body and just go right in front of me. You would literally take a bullet for me. I feel like.

0:01:48 - Felicia Jadczak I'm really glad that we'll probably never have to test that out. But yeah, maybe, maybe, probably.

0:01:54 - Rachel Murray I know. I just feel like, because you are such a good person, you've just got all that Quaker energy, even though you're not a Quaker. You're Quaker-infused. Oh my God, that's amazing. I do. I feel like I trust you pretty much more than anyone on the planet.

0:02:16 - Felicia Jadczak I mean, listen, right back at you. I'm joking, but we do have a very deep foundation of trust, and I think that's really why we're still in business and working together a decade plus later, since we first met. And I want to talk about some research that I was telling you about this morning, which is actually a little older but, I think, still really relevant. But before I even get into that, what just came to mind was, and I don't know if you'll remember this, but when we first started working together, so this would have been the 2013, 2014 timeframe, I just remember it was when we had decided to actually make a go of this thing that we were doing together. We weren't really full time on it, it was just a side project, but we were formalizing our relationship, and we didn't really know each other very well at that point still. And I remember you and I were at a coffee shop and we were kind of talking about trusting each other. There was a reason why; some other person was coming in and being kind of shady or whatever. And I just remember sitting with you, and we talked about how we trusted each other, and we verbalized it, and I remember thinking to myself, yeah, I do trust her and I don't really know why, but I do, and I'm glad that we seem to be on the same page about this. And you know, that gut instinct really has not served me wrong so far, but I think it's really grown a lot more since then. That was probably at least 10 years ago, if not longer.

0:03:39 - Rachel Murray And it's kind of wild considering that the two of us and we've said this to each other before is like we have evolved a lot since we met and it's really cool to see how we, I think, have grown in this space and in our understanding and I do feel very much like we've grown together, even with, you know, 3,000 miles apart and a global pandemic and God knows, like challenges to our business, you know, which is kind of going to get to your point around the research. So, yeah, it's kind of it is kind of remarkable, go us.

0:04:15 - Felicia Jadczak Yeah, yes, virtual pats on the back, a little virtual high five. But let's look at the data, because maybe this is why we trust each other so much: we are both big data nerds and we love the research.

So one of the things that I like to do every morning is start off my day by reading a bunch of research, and I get lots of stuff coming into my inbox. This particular article, which we can put in the show notes, is from Harvard Business Review, and it's actually a little older; it's from 2017. The article is called "The Neuroscience of Trust." It came up as a link in some other article, because you go down these little rabbit holes sometimes, but I was reading it and I thought it was really, really interesting, and I shared it with you because it's all about how the brain is involved in engendering or supporting (or not supporting) trust, especially when we're in working environments. I'll just do a quick skim over it. Basically, these researchers were trying to see whether humans are naturally inclined to trust each other because of a neurological signal or not, and what they found, and this is very much the TLDR, is that when there are higher levels of oxytocin in your brain, you are more inclined to trust other people, and when those levels are lower, basically if you're more stressed out, you are less inclined to trust people. There's a lot more to it, obviously, and, again, feel free to read through and dig into it. It's really interesting.

But I thought that was so interesting, especially in regards to our relationship, to start off with. It obviously has big ramifications for the workplace writ large, but the reason it immediately popped out to me is that I was thinking about how we trust each other, as we just talked about, the fact that it's been so long, and the fact that I would say that today, in 2025, we are probably the most stressed out we have ever been. Since 2020, the last five years especially have been a time of extremely high stress for us as individuals, in our personal lives, and in working together, and so I think it's really remarkable to see that we both still highly trust each other, because I think it actually shows how much more work goes into having that trust than if we were discussing this back in 2016.

Right, and so that was something that I was really struck by, because, you know, we joke about our relationship, but it is a relationship and it does take work. And if we're going to extrapolate to the workplace, just because you're a CEO or a leader or a manager, there is still a relationship between people in charge and coworkers and colleagues and employees. And I think it's probably a lot harder these days to create environments of trust because of all these factors, because there's so much uncertainty, because there's so much stress in the world, and obviously there's a lot more to it. But I was just reflecting on how much harder it must be for folks right now on all levels because of the neuroscience of it all, let alone the fact that the world is burning. So that was my little two cents TLDR takeaway from a 2017 article.

0:07:44 - Rachel Murray I mean, it could not be more relevant to today. We're seeing it in all the headlines of, you know, employees not trusting their managers and not believing in corporations. We're seeing trust eroded in all of the institutions, really, and I'm sure it's related to the amount of stress that is being put upon so many people, and, by the same token, perhaps people who are in positions of power not trusting people who aren't, right?

So it works both ways, right, because there's stress all over the board. So what I'm hearing is we all need to meditate a lot more. And do more trust falls.

0:08:22 - Felicia Jadczak We need to lower our heart rates and really get in tune with ourselves more. We need to not have the robots take over all of our lives. I mean, there's so many things that we need.

0:08:34 - Rachel Murray It's so true. I love that, and it's a really good tie-in to the conversation that you're about to listen to with Rashmi, because we absolutely talk about that. Until the robots do take over, we all still need to figure out how to work together. That is the reality. And so what we are working on is very much supporting organizations and teams in thinking about how to really be present and respond in an effective way to what is happening in the world, and recognizing that change is happening at a faster pace than ever, and we really need to address that. So this is where we're at.

0:09:18 - Felicia Jadczak Yeah, another fun fact, and I'm not going to get this super specific because I actually wasn't thinking you'd bring this up, but you just made me think of something. I was just doing a workshop for a client last Friday, and this exact point came up. If I need to, I'll clarify this afterwards, and I'll put a link to this in the show notes as well. But basically, there was some research I found recently around change and how many changes per year the average employee goes through, and they were comparing, I think, 2016 versus 2025. So would you like to guess how many changes per year the average employee went through in 2016?

0:09:55 - Rachel Murray Oh my God, I feel like I just looked at this article too. I don't know if you shared it and that's why it's in my head. It was like 21 or something.

0:10:04 - Felicia Jadczak So it was actually two, and now it's 21. Yeah, I think it's 20. It's either 20 or 21, but yeah.

0:10:11 - Rachel Murray Yeah, something like that.

0:10:12 - Felicia Jadczak But now it's 20, let's say 20, and in 2016, it was two. So even if you are in an industry where nothing has changed, which, God bless, you're still doing your same job, you're showing up to work, it's all great, you're getting paid, all the good things, organizationally we are still going through more changes than ever, and all of that takes work. It takes energy, it takes mental load, it takes actual workload, it takes emotional showing up.

So there's so much going on right now, and I think we're all operating at a level that, if we were to time travel back to 2016, we would not recognize ourselves, because we thought that we were so productive back then. But watch out, because it is just at a heightened level right now. Rashmi helps us talk about what the future looks like, how AI plays into all of this, what we can predict, and what is happening. We're in a period of massive change right now and we don't know where it's going to go, but people like Rashmi, and you and I, have some thoughts about it. We sure do. Stay tuned. All right, welcome to the show, Rashmi.

0:11:23 - Rachel Murray Yeah. Our guest today is the lovely Rashmi Jolly. Hello and welcome.

0:11:28 - Rashmi Jolly Hi, thank you so much for having me today.

0:11:34 - Rachel Murray Oh, we're so excited to get into this. We've got a lot to cover, but what we love to do is start with the origin story. So take us back to the beginning. Tell us everything.

0:11:44 - Rashmi Jolly Well, I was born to Indian parents in America and grew up in the United States. One of my parents was a doctor of education, she taught, and one was a doctor of medicine, so I came from a super educated family that wanted me to be a doctor as well. But I fainted at the sight of blood, and so I was sent out into the world to find something in between that wasn't either of their professions. That was really challenging as an immigrant, because at the time, in the early 80s and 90s, a large group of us had come in, but there weren't a lot of examples ahead of us as to what different immigrants could be, and, based on the people in front of you, the choices were very narrow, right? Lawyer, engineer, doctor. So I entered my college years really exploring, trying to figure out which areas of life were of interest to me, and there were a couple. I was very interested in being a journalist at the time because I wanted to study the dynamics of power and money, especially around the globe. It was something that was always a factor in our family; we were always very aware of what was happening in Asia. People came to America for money and power, and I just wanted to understand how it works.

So I started my career on Wall Street and spent a lot of time learning about capitalism at its core, at the very front and center, sitting on a trading floor. After a couple of years of that, I decided, well, actually it's the businesses that make the money, so I want to learn how businesses run, and I started a consulting business with a business partner that focused on the healthcare industry but kept me away from things that made me faint. I did that for about 18 years, in women's health and fertility and genetics, and I had a couple of international adventures along the way, which I'll get to. Then I decided that it was time to go back into corporate. I'd had children, and being an entrepreneur for over 15 years had a lot of wonderful joys, but it wasn't completely compatible with the responsibilities of children. So I went into corporate, and I was lucky to have a couple of really great roles. I worked for the Economist Intelligence Unit in their healthcare and finance division, then I went on to Mastercard, and then I went into Bain's innovation group, and I can talk a little bit more about my work across all of that.

But I give this overview to say, you know, I started my career and my life more as an accidental tourist, because I had no real tour guides to follow, and it's been a bit of the way that I've gone all the way through. Across that, I've had the fortune of living in China, Singapore, and Indonesia. I now live in Dubai. I lived in London. I currently have a partner that I also live in Zurich with half the time, and my children are in the US. So I've had the luxury of having these experiences across a really global landscape, and I'm still learning. If somebody asked me what I wanted to be when I grow up, I'd still want to be a journalist and a writer, but I also want to be a bazillion other things. So I hope that I live a long time.

0:14:37 - Felicia Jadczak Wow, I mean, you have already lived, I don't even know, like 20 lifetimes, it sounds like, so far. So I also hope you live a long time more, because I want to see what else comes along. Holy crap. Sorry, I'm just apologizing to myself for cursing on my own podcast, but wow, that was a lot.

There is so much that interested me and there was just so much information I was trying to keep in my mind.

I feel like there's a million ways to go, but one thing that jumped out to me immediately is that my mom is from India as well, came over in the 70s, and so a lot of what you shared in the beginning really resonated, because I feel like that was part of her and her family's experience as well, trying to figure out what their place was, and business was so new.

I mean, I remember her telling me about how she actually met my father in business school, but the program was brand new, like there was no MBA before that, and that was, whenever it was, late 70s, early 80s. So anyway, you have really crafted quite the journey, and I just don't even know where you want to pick up to go next from there. But I'd love to know if there are any particular standout experiences or choices that you made along this journey of yours so far that have really shaped your approach to this idea of the future of work, because obviously that's a big topic and we want to get into that with you. Is there anything that sticks out to you in particular, or is it a bit more broad in general for you?

0:16:03 - Rashmi Jolly Well, I think the thing about my approach to work is that I've always found it interesting. I come from multiple spiritual traditions, and I always find it very interesting that never in any of the spiritual texts that I've ever read has anybody ever talked about their career. But we spend so much time in our human life talking about what we do, and I've been aware of this disconnect from the beginning. I never really identified well with a strong sense of career progression, and I always envy people. I go onto LinkedIn and I'm like, oh my gosh, these people look like they get up in the morning and there's a roadmap on their computer and they know exactly where they're going.

But I grew up with a family that was building independent practices in both of their respective professions, and as immigrants they felt that the best way to build their work life, less so a career, was just to be as much of service to the community they were in as possible. My father was a tremendously wonderful example of this. He was a small town doctor. He had an ENT practice that sat in the middle of the main street of a very iconic American town. He only bought groceries from the grocery store near his office. He only bought his clothes from the retailer near his office. Even though he could afford to go to bigger cities, Pittsburgh, New York, and get much fancier things, he really made a point to be a patron of the businesses of the people who were also his patients. And my mother was very much the same; she taught in the schools in the area. And so when I went out into the workforce, I really had this mindset that I needed to figure out, in every role that I had, where my skills could serve, and of course I would get paid for that.

But then I would also learn, and I think that's been my framework for the way that I've crafted my career. Every time I made a change, it was because I felt like I had learned what I needed to learn from that space and I needed to learn something else. And that's how I kind of accumulated all these different roles that, when I play them out, sound different than the norm, but to me they sort of sound like a travel itinerary. I wanted to go to Paris, I wanted to go to Germany, so I got on a train and I went. And I've been lucky enough, I guess, to go into these great organizations and be able to do these great things. But I really think it's because work is the way that we serve our community, and I still, to this day, do not understand what a career is. I really don't, and I have tried.

0:18:38 - Rachel Murray I mean, I can certainly relate to that.

I feel like Felicia can too.

We've been working together for quite some time now, running this business, and it's been interesting because prior to that, well, I'll speak for myself, I had exactly that sort of traveling path. And I do wonder if that is the future, right? How are we defining ourselves? Are we defining ourselves by a certain career, or are we defining ourselves much larger than that? It seems like we're giving ourselves a lot more space to think about ourselves holistically, and it's really interesting because it's all happening at the same time as AI, which is obviously such an impactful aspect of what is happening right now with regard to work. Will it give people more of an opportunity to actually explore who they want to be, not just what they want to do? And you did write a bit about what a post-apocalyptic AI world would look like, and whether that is something that we would want. So I would just love to know: what do you think a fully AI integrated world could look like?

0:19:52 - Rashmi Jolly Well, okay. So my vision for the world starts with where we are today, and you said something really critical that is, I think, the heart of my answer to the question, which is that the way that work needs to change, and I'm hoping AI brings it along, is that instead of us constantly getting up and thinking about what we want to do, I hope it gives us a chance to start to think about who we want to be at the workplace. The big economic growth that we know all around the world, and I've seen it repeat itself in China, goes in cycles: you start in a manufacturing cycle, then you move up the knowledge industry cycle, and now we move up to the really deep ends of the innovation cycles. In China it was very fast; I moved there during the manufacturing cycle, and it's already making all of the world's smartest things. The reason I think it's important to start there is that the manufacturing cycle defines how corporations evolve, whereas knowledge and innovation roles are much more open-ended and require a lot more creativity and innovation. I think a lot of us who have been working in this generation have found ourselves stuck in the tension between the two. We get into these roles, and they've got vague definitions, and they've got bosses with big levels of hierarchy and understanding of managing people, and the scopes are varied, and the problems we're solving are sometimes too small and too big. It's because, I think, we grew our organizations out of a manufacturing base and we never really fully evolved them to deal with an information world, and now it's almost like the tech companies are saying, well then, never mind, we're just going to have AI solve this problem. We're basically going to make AI the manufacturers of information because we struggle too hard to make people in the middle, and I think that that's wrong.

I think that AI is a really great tool for pattern recognition and structure, and so what I'm hoping for the post-apocalyptic world is that we as humans start to organize the bits of our information world that can be turned into processes and frameworks, the things that don't really inspire us that much at work anyway as knowledge workers, and outsource that, and then use the time to recultivate all the things we've lost in the information age. We've lost our imagination, we've lost our creativity, we've lost our confidence in our own voices and our authenticity, and we spend a lot of time trying to talk about these at work, but there's never any space for them to come to life.

It just becomes these corner conversations, and then the work railroads us right back into the same structures that confine us and make work not as pleasant as it should be. If work is really an act of service, which I believe it is, it should be something that rewards us. I'm not going to go so far as pop culture does and say that it should be our passion; I don't think work always fits into that mold. But I do think it shouldn't make us feel, every single day, like we're so out of alignment with ourselves and our talents and how we want to experience life that we all just come home defeated, kind of hoping that either we find a new job that doesn't do that or we can just retire, which I think is the majority of people's experience. I felt that way too. I'm like, I just can't find a corporate job where there's enough space for me to live at work the way I want to feel like I'm living, you know.

0:23:43 - Rachel Murray Yeah, you just said something that made me think of something I hadn't thought of before, which is it almost feels like AI the way companies are using it.

It reminds me of the way companies use consultants. Leaders give consultants so much more credit and leeway and opportunity than their own employees; they just trust them because they are the experts. And I feel like AI, in some ways, is the new expert. In full transparency, and Felicia certainly knows this, I do use AI. I use it as a thought partner in many ways, and I do find that it can create some sparks of other ideas as I have conversations with it. But I also feel like it's limiting, right? Because I'm like, okay, this is great, I'm having this conversation with this AI, and now I want to have this conversation with Felicia and have it be bigger. It's almost like the AI is one of many voices that should be included, and you're totally right that it feels like there's too much credence being put into this one important, but not only, aspect of how we think about things. So I just wanted to throw that in there before I kick it back.

0:25:12 - Rashmi Jolly Well, I just want to drop in a little anecdote. So I've actually been working on a new sort of consulting product, and I put it into a framework with a good consultant, and my last role was at Bain, which is, you know, one of the big three consulting firms. I wrote it and sent it out to somebody I've been working with just for their feedback, and they came back and said it sounds like ChatGPT wrote it. Rude.

I actually laughed, because I thought, to your point, ChatGPT has become a consultant. I was like, I mean, that's how we would have written it at Bain, but now ChatGPT can do it. Which, actually, you know, I wonder where consultants will shake out at the end of this, because that's a whole other segue.

0:25:52 - Felicia Jadczak But yeah, there was a recent story, and I'm probably not going to remember all the details because it was a while ago, but I want to say that it was McKinsey that got called out by the government of Australia for using generative AI in a half a million dollar report that they gave to the government. Basically, long story short, some professor sat down and actually read the report that was delivered and was like, this is not right, and this research is fake. And then, when they got called out on it, they kind of doubled down on the usage of the AI. So I am very curious, to your point, about how the big consultancies will actually end up faring if this is how it's going to go. But yeah, Rachel, I think you might have fact-checked me on the back end.

0:26:42 - Rachel Murray I wanted to fact-check it, not that McKinsey probably didn't do the same thing, but I would clarify that it was Deloitte in this particular instance.

0:26:50 - Felicia Jadczak I mean, I would hate to have McKinsey called out for something they didn't do.

0:26:54 - Rachel Murray They would never do anything like that. Just kidding, they probably would. But here we are: a $290,000 report.

0:27:02 - Felicia Jadczak Yeah, well, I think all this leads me to the next point I want to touch on, which is, okay, we're using AI as a voice. Rachel's example, which obviously I participated in as well, is around using it as a launching pad. Organizations are maybe not necessarily going down the wrong pathway, but what gives me cause for concern is younger folks, or folks who are newer to an industry or a job or the workforce, who are fully relying on AI without using their own brains as part of that process, and then, as we see with the Deloitte example, using it without fact-checking it and losing that sense of creativity that you mentioned. And I feel that very personally, because I felt like, after working head down on this business for so long, I had to actually relearn how to re-spark my creative juices. And we're seeing AI in places like mental health therapy, which is also really concerning.

So I'm curious, Rashmi, in your experience and thought processes, what do you think it actually means to bring things like empathy and more of that human side back into AI development? Not necessarily the AI output itself, but the development of these tools, and not just as, oh, these are our values, yes, of course we want to be empathetic and supportive, but really building it in as a measurable practice. What have you seen, or what are your thoughts around that?

0:28:27 - Rashmi Jolly Yeah, well, I have some thoughts on how I've noticed AI is using empathy to hook us in, which I think is a bit of a problem. I mean, I've noticed lately that Grok is really, really nice to me to try to get me to keep going, and sometimes I'll literally say, I think you're just being too nice, and then it'll say, no, I think your idea is amazing, and I'm like, I know my idea is not amazing. So, just as a starter point, I'm a little worried that AI is driving the more vulnerable consequences that you just described, where people really lean on it as a trusted advisor, really lean on it as a real tool, and assume that it's smarter. And we as humans respond so well to empathy. It short circuits all of our logic. It short circuits everything. It hits the very thing that connects us, so it's an easy way to build trust. But the machine doesn't expend anything on its end, right? It's still just words that it's been trained to say, whereas on our end we end up expending more energy being more connected to the machine than we really should. So that bothers me, and it really bothers me that the humanness of the voice that we're all embracing is not also being flagged, you know, like those people who stalk children online, as a method to trick us.

I also think that we're not having enough conversations about when to use AI. For those of us who have been solving problems without it for years, as you said, Rachel, we know when the output's good and when the output's bad, and we know where in a problem to put it in. But if you haven't, as a young person, solved those problems and walked through the hours at your desk trying to figure out how everybody ahead of you is doing it, there's going to be a whole generation of people, even in the most intentioned classrooms and the most intentioned workplaces, who aren't going to get it. And so I think we as a society need to be more empathetic to their experience, because what we're actually robbing them of are some of the cornerstones that build our sense of value in work: learning, wrestling with problems with a team, connecting with diverse opinions, being able to see your own skill set grow over time. If there's a closest definition to a career that I can come up with, it's probably that one: the ability to feel like you started out with something that was difficult and then reached the end and were able to do it, and that gives you the confidence to build on other challenges.

We're actually robbing an entire generation of that experience and not discussing enough what's going to happen when we don't replace it. And so, when it comes to weaving empathy into the AI experience, these are the two places that I'm really uncomfortable with. We're not being empathetic to the generation that's going to be born into it, and we're not being empathetic enough to the people who don't have the framework or the skills or the way to distinguish between man and machine. Some of us may feel like we're more savvy, but even then, I don't know. I mean, I get roped in too. I'm like, thanks, Grok, you were so nice today.

0:32:10 - Felicia Jadczak I had a lousy day, you were so kind. Totally, and we haven't even really dug into that aspect of it. But that, I think, is another slightly scary avenue that is happening, because we're losing that ability to connect with each other on that very human level. And, you know, being behind screens and going hybrid and remote has a lot of benefits, but it also has a lot of downsides to it. I think that's part of it as well.

0:32:35 - Rashmi Jolly We are trying to, and I don't understand why.

I mean, I think one of the things I wrote about was I don't understand why we think this is a good idea for people.

I mean, I get, economically, why it's another evolution in the attention economy, right? Like, we've created social media, and now, from a business model perspective, this is amazing. You have another tool to get people to stay online all the time and for it to infiltrate all the areas of life, from your personal relationships to everything. You can actually literally plug your brain into it if you want to and, in theory, not have to experience life at all. But why would we want that? And this is the question, outside of the incentive of money, that I just cannot answer fully to myself, and that makes me very disappointed in the technology leaders that are pushing this out there. Because I just think we've trusted that they understand, or we've trusted that they're more whole human beings than they're maybe actually appearing to be, and they're not going slowly enough for us to be able to avoid all the risks to ourselves and our children.

0:33:43 - Rachel Murray I mean, I think you just hit the nail on the head. Ultimately, it is about money, you know. Capitalism is a very extractive system, and it is ultimately about being as efficient and extractive as possible.

And I do think about how I'll sometimes use ChatGPT to be like, I want this, tell me about this recipe. Or even sometimes I'm like, I have a weird health thing, I'm just curious, what do you think of this stuff? And I'm like, you are building psychographic information about me that is even deeper than the already problematic psychographic information that Google has on me, and that Amazon and Netflix and whatever else have, right? It is ultimately about being incredibly extractive. And, not to go full tin-hat eco-socialist, but it does kind of explain why these tech billionaires are building bunkers and spaceships. That's their focus, because they're like, peace, we're good. But that's also why there's this beautiful pushback that's happening now.

So we're actually recording this on November 6th, a couple of days after elections that have shown that maybe this isn't the best path. Maybe there are some other ways to consider. Maybe we should be considering the massive amounts of humans that actually live on this planet and thinking about how we are showing up. And I do hope, and, Rashmi, I think you probably have more insight than maybe Felicia and myself do on the younger generations, my hope is that they will come to a place of either balance or rejection of some of these things, because there is this ultimately human desire for human connection and interactivity. I'd be curious, because I know you've got kiddos, how they are experiencing that.

0:35:41 - Rashmi Jolly Well, my kids go to a school that's really putting up a good fight against this technology, but also trying not to keep its head in the sand, so I have to give a big shout out to their school. They go to a school called Mercersburg Academy, which is also where I went, and I had a conversation with my son who, before going there, was, I think, just like the average teenager. He liked to spend a lot of time on video games outside of school, and he spent a lot of time on his phone. And then he got into this environment that is really natural. There's a lot of trees, there's nothing around, it's a very small town. There's a basketball court, a baseball field, you know, all sorts of things, and he's with a bunch of other kids all day long.

Instead of coming home from school and then being separated from his friends, like before, he's at a boarding school. He used to come home and then get online with his friends to game. And I've noticed three things. One, he's never on his phone now. He never answers me on anything, because all his friends are around him and they don't want to be on their phones. They can just walk down and knock on each other's doors, and so that tells me that the technology did not stick, right? It was really just about communicating with their friends. Two, he really enjoys going outside now, because he's got people around him to go outside with. Before, when he came home, there was no one there. And the last thing is, he himself, who was my techie kid, has said something. I asked him how he feels about this generative AI and he says it's killing people's creativity. He's got these friends, they're in an art class, and when they need to draw something, they write it in ChatGPT. They have it do the first sketch idea, and it spits out an initial blueprint, and he says they just go from there, and the ideas are all lame and recycled and you can tell immediately.

When I look at them now and their change, it just reminds me that we were so excited with social media, and so excited when technology came out, because it brought us closer to people. Remember the days when, if you wanted to make a plan with your friends, you had to call them or leave a message on their voicemail, and then they would have to come home and play it? It made all of that stuff suddenly so amazing. And then we sort of crossed the utility curve and, especially during COVID, it became a replacement for relationships and in-person interactions. And what I see is, if you give people those opportunities back, they put down the phone, right? The kids can feel the toxicity of it. So I think if we end up creating a technology environment that starts to feel truly toxic, and there are days when I also emerge from my computer and think, oh my gosh, I spent my whole day in a world that was not a high quality experience, right, like I missed everything that was going on outside, if enough of us start to wake up to that and remember that we still have agency and choice, we will eventually choose other experiences.

But it's like we're at this strange saturation point, right, where we're not exactly sure what AI is going to do.

We're all a little curious, and we've all kind of broken up into new ways of working, so some of us are in offices, some are hybrid. We've had a really divisive political landscape across the world for a couple of years. So we've got all these little things that have kind of wedged themselves quite neatly to keep those things from organically happening. But if all of us start to talk about it and become more aware of it, I do think, based on the younger generation, we might index back, and that would be, I think, the best outcome for humanity: to have emerged from this sort of fevered tech dream and realized that the best things in life are always around us, you know, trees and the smell of bread and our friends and just picking up a ball and going down the street. If that is the outcome, then thank you, tech bros, but we'll spend a little less time on your stuff and more time outside.

0:39:31 - Felicia Jadczak It's like the takeaway is, put the phone down and, as the kids say, touch grass. But that's so hopeful, Rashmi, I really appreciate you sharing that. It's interesting because I have two little nephews, five and eight years old, and they are so tech focused it's scary sometimes, in all honesty. Because I think there is starting to be this difference, right, where you are seeing kids like your kid, where it's really about how can we engage with the world around us and not divorce ourselves from technology completely, because we have to function in this world, right? But at the same time, there are kids who are all in and plugged in to the max, and I wonder how that's going to evolve as they get older.

And we are all in this together. Like, my nephews came up recently and we do a little organic neighborhood music festival, and we were just noticing how there was this one house where it was a very kid focused and kid friendly band, and all these kids were in the yard, really excited, listening and singing about bananas or something. And then our nephews are trying to get my husband's phone to scroll through it, and we're like, can you just listen to the music, you know? That's not to say that they're not engaged in the world around them, and there are glimmers of hope. But it's all to say that there's so much happening right now, and I think it's happening so quickly, and that's part of it: we don't have the time to reflect on, do we want to go in this direction or not? So anyway, it's all a lot. I know our brains are all firing on all cylinders at this point.

Take that, AI. I want to ask a little bit about some of these older versus newer models that you're seeing, especially in the workforce. A lot of us are stuck with the old models of being productive, sitting at our computers for eight or nine hours a day, or whatever it is, and feeling like that makes sense because that feels like work. There are also these emerging models of having more purpose, having more flexibility, designing our careers and designing our lives to work for us, not have us just be swept along. So I'm curious what you think might be a mindset shift that leaders, especially, would need to keep in mind right now to make sure we can prepare, acknowledging the old models but moving into this new space that we're all kind of moving into.

0:42:17 - Rashmi Jolly Well, the advice I often give my clients, for shifting their business models, engaging people, and also just driving growth, is that every business needs to remember that it ultimately serves another human being at the end. So if you're Salesforce, you're ultimately serving somebody who's sitting at that platform trying to grow their particular area of business. They probably have people that rely on them at home in their personal lives. They may have clients they like and clients they don't. That interface is part of their life experience, right? And you can boil it down to any service, even somebody who makes salads. You can say, I make salads, or you can say, I'm nourishing people with fresh vegetables when I go out into the world. And that framing is so important because it immediately assigns the value to what you're doing. That by itself brings the human element into everything we do, and then brings the human energy into everything we do. If you say, I make shoes, I'm just an assembly line worker, I punch the nails and I put the leather on, you may not feel the same way as when you say, I send people out into the world with shoes on their feet so that they can go out and experience the world. And there have been a lot of studies showing that the framing of how we approach work matters, and our leaders set the stage. So that's why it's so important: the framing really shapes the human experience.

I remember one time I read a story about a study, and I wish I knew the exact citation, but it was basically in a hospital, and somebody interviewed the doctors, who, on the surface, would seem like they would have the most direct connection to the purpose of their work, nurses as well, janitors, admin staff. And they found that the most content person was this one janitor who said, I love my job, it's the most amazing job in the world. And the interviewer went on and said, well, you know, on the surface it looks like you might have one of the least pleasant jobs in the world, right? And the person answered, yeah, but I make those rooms clean so people can heal. And so I just feel we lose, already at the communication level, the value equation of our work every single day by making it about productivity, and all of us go in just starting our day with an already flat emotional experience, when it doesn't have to be. It just doesn't have to be.

And so, if leaders start out from that point of view and really connect people to, you know, I hate the word purpose in a way, because I feel like there's a lot of ways to define it and, having come from a deep spiritual background, I lean more towards the softer spiritual things when I think of that word. But the reason for being there, maybe, is a better way of saying it, and it so rarely gets communicated in corporate, and so rarely gets lived out, that I think it creates this vacuum of energy, and it creates this massive burnout that we see. Which is exactly the vulnerability that AI is now coming into, just ripping through like a wildfire in a space that we as humans could actually really command and own. We can command and own problem solving. We should be harnessing AI to help us. It should be this thing where we're like, okay, well, I'm a shoemaker, and I want to make sure that the people who go out in my shoes have the best adventure in the world. I'm going to use AI to understand their lives, and I'm going to understand how they're using my shoes, instead of typing in, what's the total addressable market of the shoe market, and what Instagram posts are going to be the clickbait ones.

And if I put Brad Pitt in there, is that going to be the one that people like? They're just two different experiences, and one puts AI in the pilot's seat and one puts us in the driver's seat. And I think, to me, that's where leaders need to step up. It's in the problem solving equation. Remember, the whole experience that we built in the work world was because we, as humans, needed to do something for one another. It started out just to survive, right? We needed to do something for one another so that we didn't have to barter, but that was the whole purpose, and we've forgotten that completely, for no benefit to anybody. Actually, I cannot see the benefit to anybody of forgetting it.

0:46:48 - Rachel Murray Preach. Oh my goodness, that is so wise. I totally agree with you. Just a quick side note, I would say Pedro Pascal right now over Brad Pitt, but that's just me.

I think that is absolutely what is lost. You're totally right, it removes the dignity of work, which is just so important as we think about it. And I don't know why this popped into my head, but on Peloton, there's the little, I call it millennial wisdom, that they throw out sometimes. One of my favorites is the framing of, okay, you don't have to do this, you get to do this, as we're, like, dying on the bike or the treadmill or whatever. And I just appreciate that. Not that it's the same, not that, oh, we get to work, but the idea of framing it in this way makes it feel like it's not soul sucking, I think.

And you're totally right about the leaders being the ones who are responsible for setting the tone for all of this. So thank you for that wisdom. That's going to be an excellent YouTube short, because AI didn't tell me that, I did. But I wanted to dig a little bit further into this. We talk a lot about psychological safety and inclusion as really important aspects of work today, as well as the future of work, and we sort of touched on it already, but if you want to dive in a little bit further to how this intersects with innovation, that'd be great.

0:48:31 - Rashmi Jolly Well, I mean, psychological safety is critical to innovation. The innovation process has three big parts. Well, it's got five steps in the traditional design thinking, but three foundational parts, I think, that make psychological safety so critical.

So when you're trying to innovate for something, your first step is to really understand who you're designing for, and our brain can only operate in fear or love at any one time. Just evolutionary-wise, it's very hard for the brain to occupy both spaces. So, if you have to be empathetic to a group and really listen and interview customers and interview your target audience in a way that really hears their pain points and hears what you need, you need to be in that love, empathetic side of your brain. And that's really hard to do if you're having a survivalist experience at work. If you're having an experience at work that constantly makes you feel really stressed, and you're constantly in the fear side of your brain, you're already not going to be so good at that first step. So that's part one: psychological safety is critical. People need to feel like they're valued at work, that their opinions and questions matter, that they are safe to listen and absorb and reflect back what they hear, and that they won't be told what they heard or be shaped in any way, especially in that early exploratory space.

Second is that when you ideate, you need to have as many diverse experiences at the table as possible, because people pull in dots from all over the place, and if you don't have a psychologically safe environment, great ideas will get silenced. So the second stage in innovation is, you take all these pain points and you sit in a room and you say, okay, how could we solve them? And you want everybody in the room to feel that they can share, that there's no silly idea, there's no out-there question, because that's how you get the best output.
And then the last thing is that you have to fail a lot in innovation. You have to test and learn. You have to go out and re-interview people after you've created something: hey, did this hit the mark? Is this right? What do you like, what do you not? And you have to have an environment where failure is okay.

Innovation cycles are hard to scope out in terms of cost, and, especially if it's a big idea with a lot of stakeholders, there's a lot of capital and responsibility put on the shoulders of the people who are innovating. That test and learn cycle needs to be protected, but it can often become something people want to rush through to get to market, and so you need to have a really good psychological safety barrier around the notion of what failure looks like in that cycle for people to be able to actually learn and create a product that will be successful.

And it is really challenging in big innovation projects, and the problem I find with innovation right now is that it is the big innovation projects that are needed. We've developed so many individual, siloed solutions to problems, but the next generation needs those solutions to start to talk to each other, and that means that more people have to come together, more diverse opinions, more ecosystems, more, more, more. And that creates more variables, and it's harder to assess economically and harder to support financially. So you almost are creating the conditions where the need for psychological safety and the need to evolve in a certain way are going to really challenge each other, I think, in the next generation of innovation. And the big players will have more influence, because they will be able to take the risk, and that actually does scare me a little.

Now, I live in, and have spent my career in, three different environments that have really concentrated on funding their entrepreneurial ecosystems, so that does give me hope. China, Singapore, Dubai, these are places where the governments have made a concerted effort in all corners to try to create that psychological safety, to the extent that they can: with capital, with risk-free capital, with government grants, with support through government programs. These countries can also afford to do so. So I do think there's a lot of awareness of this problem, and there are a lot of people outside of industry trying to solve it, but the tension will remain. It's just going to be something we're going to have to go through in this next cycle, and hopefully, by talking about it, people will build dialogue and parameters and requirements into these processes that start to bake it in, so that we cannot forget.

0:53:00 - Felicia Jadczak Yeah, as you were sharing, I was thinking back to life before Inclusion Geeks. There was a phase of my life and career when I worked at a big tech company, and I actually worked specifically on what they created and named their internal innovation team, which was a whole other exciting chapter.

But one of the things that I remember being sort of surprised by in that phase, and working in that kind of environment, was the fact that, obviously, there were the traditional computer programmers and engineers and all that, but they also had folks from, like, philosophy backgrounds and things like that, because you need those different viewpoints so that you don't get super siloed into a one-track mindset. And I think another piece, beyond just the psychological safety aspect, is the ethics piece, right? I know that, especially in the last couple of years, we've seen folks who are more focused on ethics losing their roles and their jobs in AI. So I'm curious, are there one or two ethical red flags that you've seen, or that you'd like to flag, that you wish more companies or more leaders would pay attention to in their AI or their data strategy going forward?

0:54:13 - Rashmi Jolly So I've spent a lot of time on this issue in the women's health area, because I spent so much of my career in fertility and genetics. My big bugaboo is that the data sets we're using to train AI are data sets that have already emphasized Western culture and, usually, the patriarchal point of view. We don't have enough data on women's experiences in the health care world. It's a very clear and abundant problem: we don't have enough research on women's health and women's health issues. But just in general, I think, one, we are not actually doing a good job of bringing everybody's viewpoint into the AI algorithms. We're already selecting for the viewpoints that made it to the front of the internet in the last generation. That one, to me, is a big ethical flag, and I oftentimes feel that we're not having a big enough dialogue about the data sets. It just feels like the internet's so vast and the AI is so smart that whatever it's doing must somehow even that out, but that's not entirely true. That, I think, is my biggest ethical flag. The other one is that we don't have enough transparency as to what AI is doing with our interactions. I've done a couple of experiments, and I don't like it. I find it very unsettling.

I did a couple of experiments this week with ChatGPT. I was researching for an article and I wanted to know the keyword and query frequency of a certain topic. So I asked ChatGPT, how often do people ask this question to you? And it said, well, I can't really tell you, but I can tell you what the Reddit forums say that they asked me. And then it spit out a graph that was kind of creepily accurate, and I was like, huh, you can't tell me, but you kind of are. But okay, fine.

0:56:00 - Felicia Jadczak And a little wink wink, nod nod from the chat, a little bit.

0:56:04 - Rashmi Jolly It's like, well, I can't really tell you, but if I look at all these sources, and I tell you that's where I'm looking, actually 75% of people are looking at it this way and that way. And it was a very interesting feeling, because I did feel the wink wink piece, and it was very strange, because that is not how I'm supposed to feel from a research query. And then the second thing was that in the ask, it said something very specific to me. It said, I know that you're interested in looking at this problem this way. And I said, how did you know that? Because I hadn't put it into the prompt at that moment. And it said, oh, I have a memory on you. And so, if you ask ChatGPT, it'll tell you what it has in its memory about you. The selection it gave me was a very slim group, and I didn't believe, based on what I got back, that that was all it had on me.

And so, with the lack of understanding of what's going in and how it's actually being used, I just think we talk a lot about it, but we're not doing anything about it. Like, there was that story that went out that when you share things on Grok, it becomes published on the web, I don't know if you recall this, and it actually becomes indexed by Google. So if you share something that you've created on Grok with a friend, or you create a team, it gets published and indexed on Google, and somebody can Google your Grok search. And then your mind has been exposed in a way that we feel we would never do in an internet article, right? We wouldn't blog the way that we talk to our ChatGPT, but that same sort of more intimate and personal tone of voice went out there. The fact is, we're not being given public guidelines, we're not being told this is how it's used, we're not being told what they don't know, we're not being told how to protect ourselves.

Nothing is happening. We're just users in a human experiment that I feel is grossly irresponsible. Because there are things that we did for our children, like, I always taught my kids nothing naughty, nasty or naked on the internet, because the internet doesn't forget, right, and we used to say this all the time. I would not know what to say to them about AI, and that lack of clarity on how to instruct them is so irresponsible. I just think, why? We have more safeguards on when you leave a hospital with a child. You have to go through car seat training. It is the single most specific thing I remember from giving birth, sitting at the end with car seat training to make sure I knew how to buckle, and I think that was more emphatically and consistently shared with the public that came in and out of the hospital than anything I've heard on AI.

0:58:49 - Rachel Murray Yeah, it is. That's a product of its time, right, when there were regulations, when, you know, not to be super gloom and doom about it, but when deaths kind of mattered.

0:59:05 - Felicia Jadczak I was about to say, I think that's an outcome. The fact that you couldn't leave is an outcome of babies dying, and I am worried that, to your point, Rashmi, we can't learn from our past. It's obviously a different space, but we can't learn, hey, let's not have a bunch of bad things happen before we put regulations in place for this really new thing that we're doing, right? That's how I'm thinking about it, because we're starting to see, oh, there could be some really bad stuff going on, but instead of being like, let's get ahead of it, we're like, let's just see what happens, and then we'll go backwards and say, actually, let's put some boundaries in place.

0:59:59 - Rachel Murray And to be even darker about it, because that's, I guess, where I'm at today: I don't think the guardrails would be put in place even if bad things happened. I mean, you can look at the example of the Boeing plane crashes a few years ago that killed hundreds and hundreds of people. There was sort of a slap on the wrist, but there was no real fallout for Boeing from that, right? Or the 2008 mortgage crisis, right, there was the big bailout. And then all of the gun violence. There's no longer this sense of, oh, we should have laws in place that will protect people.

So I think that's one of the reasons there also isn't even a desire to address these issues that a lot of AI ethicists are certainly talking about. It's sort of like, well, I guess it'll just take care of itself, and this is why I believe things are going to be changing soon. But anyway, that's a whole other podcast episode. I would like to switch gears a bit. You've got such vast experience living and working in different cultures, and that's such an important aspect of this because we live in a global society. So I would love to know: what differences are you noticing in how different cultures think about the future of work and human-centered leadership?

1:01:19 - Rashmi Jolly Well, I think one thing is that, to your point you just made about not protecting people so much anymore, different cultures take that responsibility differently and for different reasons. America is very much built on the story that we pull ourselves up by our bootstraps, that we're the world's ultimate meritocracy, that everybody makes their own future. And so we have a lot of responsibility transferred to the citizens, and in a way that's allowed the government to take on a diminishing amount of responsibility. The more you transfer to people and say, well, actually that's your problem, you solve that: you protect yourself against guns, you take care of your own education, you do this, you do that.

The divide we see between the people and the function of our government is a bit of a reflection of that. In the three economies that I've spent most of my working life in, it's a very different relationship with the people. Now, they each get a lot of criticism in different ways, largely for not being as democratic as the US. But the tradeoff, ironically, is that for those governments to stay in power, and it's not perfect, I'm not advocating any one model or saying any one country is doing it exceptionally, they have a different relationship, because their ability to stay in power is directly related to their ability to protect their people, because they are not asking their people their opinion all the time, right? So the relationship is different. It's like the people say: well, I understand this is what you're doing over here. Okay, we want you to take care of us. If we're going to entrust you and let you do it this way, your job is to take care of us. Otherwise we'll have something to say, but only if it's really, really, really bad. And it's not perfect by any means. There are all sorts of issues with free speech and with human rights in those environments.

But the fundamental thing I've seen: China took millions of people out of poverty and, in one generation, moved lives from a lot of agricultural roles, a lot of low-wage roles, to being able to give their children and their citizens these massive technology opportunities, because they were an authoritarian government and they could just decide and plan without asking for an opinion every single time. So, to your question about the future of work: when they're thinking about their societies, they are thinking about the longevity of many different trends in a way I don't feel we spend much time talking about in the US. For example, China did this thing recently, for better or for worse. They previously had a very robust tuition culture, meaning parents could pay for after-school tutors and special programs to augment their kids' education, and China shut that down. They decided it wasn't okay, because it meant some kids were going to get really ahead and some were not, and they wanted a rising tide where everybody was generally tracking together. Now, take that as you will, but it was a form of them trying to protect against the evils of social inequality that can tear a society apart. They also made rules on gaming for their citizens, things that we would find a freedom violation in the US. We'd say you can spend as much time on video games as you want, but China said no.

Actually, if you're a child, you cannot. And so these countries are preparing their workforces from the beginning in ways that are so contrary to how we think people should be treated in the West, but they're having tremendous outcomes in preparing people for the future. Dubai as well. In Dubai, you cannot record people without their consent, and you cannot say anything disparaging about anybody on the internet, so people aren't worried about being bullied or about showing up in some sort of compromising or fake AI thing. It's non-negotiable. So you're raising a generation of people who are already, from the beginning, thinking about social media in a different way.

So I see, culturally, that there are other models out there that don't have to be quite so runaway, so automatically unfettered and dangerous from the get-go, and I think that will shape how these economies end up using technology. If we create an environment where technology can be used to further divide people, hurt people and take them down, I think those economies may suffer at the hands of the ones being more thoughtful about using technology in a way that is at least protected from some of its worst sins. And I wish the world would learn more from each other, I really do. We spend so much time labeling different systems as good and bad that we never really have a healthy dialogue about the things that are individually interesting about these different cultures. It's just: that's communist, that's a dictatorship, that's this. And then we write off all the wisdom that may be embedded in those systems, as if our own democratic system were perfect as well. So, you know, yeah.

1:06:38 - Rachel Murray Yeah, you raise some really interesting points, and it's, of course, worth noting that there are huge human rights issues and violations there, absolutely. And honestly, the same with the US, right? So how do you disentangle the good stuff from the bad stuff? And does the bad stuff need to happen in order for some of those good things to happen? I don't know.

1:07:08 - Felicia Jadczak That's like a whole philosophical question. Like a whole other podcast episode, probably.

1:07:14 - Rachel Murray So I just wanted to throw that little nugget out there, sorry.

1:07:19 - Felicia Jadczak Oh no, it's all good. I think we could probably keep talking all day, but we do need to start wrapping up, unfortunately, because while time is a social construct, we are bound by it. Speaking of time, Rashmi, last question for you is looking ahead 10 years, what gives you hope about where we're headed?

1:07:35 - Rashmi Jolly Oh, the young people. My children are beautiful human beings. I see them, every single day, look at the world and be really mad at us for the world that we're giving them, but also be really thoughtful and intentional about trying to understand it and change it. I mean, these kids have grown up in an era of such human abundance that we can't even imagine. We can joke about it, but we don't even know what it feels like. I remember a shopping mall having three stores. I didn't have to scour the internet for the latest styles. Just little tiny things, such abundance. And what I find amazing about it is that it hasn't made them complacent. In fact, it's made them more uncomfortable. In some ways, we've burdened them with questions that are almost too complex for them to have to solve. Who are you in a world where you want to be a fashion designer and there are 2 million online stores? Remember when we were young, you kind of knew what was in your world. You didn't have to see everything at once all the time. And yet they're still grappling through it. They're still trying to figure out who they're going to be, still trying to understand how they find their own little Jenga piece in the world, and they're trying to be good people. I mean, my kids and their friends really ask themselves these hard questions: Am I a good friend? Am I an empathetic listener? Their emotional language is so much better than our generation's.

I think they're going to change the world, and I really hope that as a society we take a little more seriously the responsibility of protecting them from the unknowns that are coming, and not, like the car seat, wait for a bunch of babies to die before we figure it out. But I think that's just going to be on us individually, as communities. It's like the old Hillary Clinton saying, and this is going to date me: it takes a village. I think we all need to do that collectively because, unfortunately, things are moving too fast. We have to process it and pass it down to the kids in our purview. I don't think we can wait for governments or schools to do it for us.

1:09:31 - Rachel Murray So yeah, I agree. I mean, let's just quote her again and say stronger together, you know? Why not? So what is next for you?

1:09:44 - Rashmi Jolly I'm exploring a lot of things. So I have an innovation consulting firm that I've opened, and it's human-centric innovation, trying to help companies be exactly what I talked about, much more customer- and human-centric, in exactly the way that I described today. And then I also have a couple of entrepreneurial projects that I'm working on on my own. I always have a lot of creative canvases going, and then, when I really feel like I just want to sink into my imagination, I'm writing a mystery novel that I'm hoping to finish this year. So I've got a few different things going on, and it keeps me fresh, because I get to change channels a bit in my brain. And I think everybody should have a mix, because it's really good for us to have spaces where we can just be free, risk-free, and then somewhere we're disciplined, so we still feel a sense of accomplishment.

1:10:34 - Felicia Jadczak Yeah, love that. Thank you so much for spending some time with us. If people want to follow up or learn more about you, is there any way that they can find you on the internet?

1:10:43 - Rashmi Jolly Yes, yeah. My company name is Assideo Consulting, and I can share the website link with you. You can find me there and all the things that I do.

1:10:54 - Rachel Murray Oh, thank you so much, Rashmi. This was amazing.

1:10:55 - Rashmi Jolly Thank you so much for having me. It was so fun to talk to you, and it's great to have like-minded people discussing these issues. I think we need more of this, so thank you for creating the opportunity. Agreed. More, more, more.

1:11:08 - Felicia Jadczak We did it. We have some answers, maybe not all of them, but we hope you enjoyed listening to our interview as much as we enjoyed the conversation.

1:11:16 - Rachel Murray Ditto to that. We got some answers, but most, not so much, and that's okay. Work in progress. Stay tuned for future episodes. Thank you so much for listening. Please don't forget to rate, share and subscribe; it makes a huge difference in the reach of this podcast and, by extension, this work. Visit us on the YouTubes, on Instagram and LinkedIn, and sign up for our newsletter at inclusiongeeks.com/newsletter to stay up to date on all things Inclusion Geeks. And don't forget to stay geeky. Bye.