What happens when a movement built on innovation starts to eat itself? In this episode, we’re joined by Catherine Bracy—CEO of TechEquity Collaborative and author of World Eaters—for a bold, unflinching conversation on the state of tech, venture capital, AI, and what it really means to build an equitable future.
We dig into the pressures of chasing VC money, the cost of tech's villain era, and how automation is reshaping labor—often without worker protections in mind. Catherine brings the receipts and the vision. If you care about inclusion, the future of work, or just want a little more honesty in your tech podcasts, this one's for you.
00:00 – Felicia and Rachel Intro: Workplace inclusion meets AI and ethics. Reflecting on 10 years of Inclusion Geeks—and why this convo couldn't come at a better time.
12:25 – Progress or Plateau? A Decade of Tech Equity: Catherine reflects on founding TechEquity Collaborative and whether the industry's progress has lived up to the promise.
18:38 – Disruption: Buzzword or Barrier? We question who really benefits from "disruption" and how venture-backed growth often sidelines human-centered practices.
29:52 – VC or Bust? Rethinking Startup Success: Why the pressure to raise VC has become toxic—and what alternative paths to success can look like.
36:18 – Blitzscaling and the Cost of Speed: The culture of growth-at-all-costs, the real meaning behind "blitzscaling," and how it erodes sustainability and dignity.
41:14 – Silicon Valley's Villain Era: We unpack tech's political shift, culture wars, and the industry's growing identity crisis.
50:28 – Automation and the New Labor Divide: How AI is reshaping the workforce—and what it means for those with (and without) agency over technology.
58:33 – What We Build Next: Imagining a tech future grounded in equity, creativity, and human dignity. Catherine shares what gives her hope.
0:00:07 - Felicia Jadczak Hi and welcome to the She Geeks Out podcast, where we geek out about workplace inclusion and talk with brilliant humans doing great work, making the world a better and brighter place. I'm Felicia.
0:00:17 - Rachel Murray And I'm Rachel, and we are so excited to welcome the brilliant and bold Catherine Bracy to the podcast. Catherine is a longtime tech equity advocate, the co-founder and CEO of TechEquity Collaborative, and now she is the fancy author of her powerful new book, World Eaters. Highly recommend. Five stars. Absolutely should read it. In our conversation we dig into the deep roots of inequity in the tech and VC world, from a really interesting concept called blitzscaling to the awful labor exploitation, and we explore what it'll take to build systems that actually work for people. Catherine brings clarity and urgency to the table as we imagine new ways of rethinking the workplace. This one is for anyone ready to challenge the status quo and build something better, which is why we are here today in your ear holes. That's always the hope.
0:01:08 - Felicia Jadczak But before we get to Catherine's interview, let's talk about the topic that seems to be on everybody's mind these days, which is AI. Haven't heard of it? The robots are taking over. Haven't heard of it? What's that? Well, let me tell you, Rachel. I got a lot to catch up on.
0:01:25 - Rachel Murray JK, JK. Also, I will say that we are actually recording this the day after our anniversary.
0:01:30 - Felicia Jadczak Yes, oh yeah, we should. Actually, we should give ourselves a shout-out. It was our 10th business birthday yesterday. We had a little online celebration. It's been 10 years. I cannot believe it.
0:01:43 - Rachel Murray I know I'm like do we talk about AI or do we talk about?
0:01:46 - Felicia Jadczak Yeah, actually. Well, I mean, we could talk about both, but maybe we should take a moment just to like appreciate 10 years of whatever we've been doing.
0:01:54 - Rachel Murray My gosh, I know. Well, it could be a 10-year conversation. And it is interesting if you tie it into AI. It's like, my gosh, 10 years ago it was vaporware, you know. It was something that was like, oh sure, the robots may come. It was science fiction, right, and now we are living it. We are, yeah.
0:02:12 - Felicia Jadczak Yeah, but 10 years ago, 2015. So you and I had already been working together for two-ish years at that point, doing events for women in the Boston area. But we made it official, we put a business ring on it, we tied the knot, yeah. So I actually, I don't know if you know this, but I still have the letter from the Commonwealth of Massachusetts on this bulletin board that's right in front of me. It's covered with a lot of different pictures, but I have it up there because, you know, it was a big deal at the time and it still is. And I was really kind of going down a little memory lane yesterday, looking at all these pictures and thinking about the 10 years, and what is so wild to me is that our five-year anniversary was March 2020. Oh my gosh, it seems like it was yesterday. So I don't understand how that math works, but that's apparently how it worked out.
0:03:09 - Rachel Murray Wait, I don't like this trajectory.
0:03:10 - Felicia Jadczak So what we're saying is that in five years we're going to be, like, android, hybrid, robot people.
0:03:17 - Rachel Murray The aliens will have 100% come down by then.
0:03:21 - Felicia Jadczak The Terminator will be here. It'll just be full apocalypse and we'll still be working on workplace inclusion. It'll just be people and robots.
0:03:30 - Rachel Murray They can't stop us. Try as they might.
0:03:35 - Felicia Jadczak I mean, you know, I feel like you and I have always said that we'll keep doing this work until we can't anymore, and I know that originally it was because we thought we would. Now that doesn't look so clear. I don't know, we'll see what happens, but you know we're still here, which I think is a big accomplishment in and of itself.
0:03:52 - Rachel Murray Yeah, yeah, it is a very interesting time. There's such a confluence of things that are happening, whether it's the workplace or the world. And we've always said this: the reason why we focus on the workplace is because that's where so many of us spend our waking hours. So it's, you know, ultimately about helping people during the times when they are most conscious and have an experience that maybe isn't terrible. So that's always kind of been our goal: how can we make things better? And to go into AI, it's been interesting because we both have been using it, and you know there are certainly a lot of ethical concerns, which we can spend hours talking about, and there's a lot of good stuff. And then just the question of, you know, where is it all going to land and how does it play into everything else that we're doing?
0:04:42 - Felicia Jadczak Yeah, it is interesting, because I think that a lot of the criticism comes from the speed at which this has been unleashed upon us, and that criticism is tied to sort of the ethical considerations around it. We haven't really thought about what it means to really use this kind of technology and have it take over. And, like you said, we could spend hours on this, and maybe we should just have a whole separate episode on AI. But briefly, everything from replacing jobs to climate issues around how much energy and water it takes to actually answer the dumb ChatGPT questions that we're asking it.
I saw someone on social media this morning saying that an influencer went to a popular restaurant and used ChatGPT to ask what they should order off the menu.
I don't think that's the best use of ChatGPT, just my personal two cents. But, you know, people are using it in all different ways, and we use it, as you say, and we should talk more about how we use it. But there are also concerns around piracy and copyright violations, especially for artists, and there's recently been stuff around authors. I forget which company, but some company put out basically a list where, if you're a published author, you can go and search and see if your books were used to train Meta's AI specifically, and that's been a huge issue in the book publishing world. And Studio Ghibli, just today it came out that ChatGPT basically, like, illegally stole all their movies and had been using them to train its AI for image creation. But on the flip side, you know, there are a lot of really good uses for it, and I think we use it hopefully more for good than for evil. But how are you using it?
0:06:24 - Rachel Murray A lot of ways. I use it as a thought partner. When I have lots of ideas, as Felicia knows, very much all the time, I like to explore them, and especially in this time that we are in now, thinking about how the work is evolving, it's really nice to have kind of a thought partner to explore different aspects of the work. So that's probably primarily how I'm using it, along with sort of fleshing out some ideas. And I will say, I just wanted to touch on something that you said that I think is important, that the ethical concerns aren't being surfaced, and I do think that they are.
I think that they are being ignored by people by and large, very similarly to like climate change, you know, I think, and what's currently happening in our federal government. I think that it feels easier to just sort of kick the can down the road. And this has just been part of, you know, the human experience, probably since the dawn of humans. It's one of those things where there's like definitely a lot of conversation happening. Whether it's going to matter is, I think, up for debate.
0:07:32 - Felicia Jadczak Yeah, and you know, to that point, obviously we're in alignment and agreement, but I think it's being ignored by the people who are in charge of pushing this technology forward, who, surprisingly or unsurprisingly, tend to be rich white men.
And I mean, this conversation has existed in tech for a long time, I know, and I'm sure it's true for you too, but when I was in tech, before we started this company together 10 plus years ago, there was a lot of discussion even at that point around the importance of having people in tech being part of the discussions around building products and services who were not focused just on can we do it, but should we do it.
And so there was a big discussion around bringing in people who were architects and who were artists and who were philosophers. There was a big discussion, at least at the company I was at, around bringing in people who were thinking differently to be part of these teams. And so I think we've really lost a lot of that retrospection because it's just more about oh, this is really cool, let's push it as fast and far as we can go. And actually it ties in really well with what we talk about with Catherine, because you know we talk about how we got to this stage of needing to continually build and build and grow and grow and that whole sort of space in the VC era especially.
0:08:55 - Rachel Murray It's interesting because it's absolutely being pushed by the people who are in charge and it's being used by everyone, including us right. So it's like I think about it the same way I think about you know Amazon, we know that Amazon is really problematic in a lot of ways, and it is a hard thing to break, and so the question is when does this become no longer sustainable? Is this sustainable?
0:09:21 - Felicia Jadczak I mean, I could definitely argue that it's not. Yeah, and, you know, the other thing just to kind of throw out there, and again I really am like, oh gosh, we should really just spend another hour at some point on this, but the other consideration is that not everyone actually is using this tool, and I know people in my life who are younger than I am who do not understand ChatGPT or AI, who aren't using it and who are confused by it. I will say I think you're probably a much more robust user than I am, and I think I use it fairly regularly. I use it for data synthesis, for, again, that thought partnership.
When I'm creating content, I use it to help me flesh out an outline, to make sure that I'm thinking about how to structure things. So I have found it really helpful from that standpoint. But I know a lot of people in my life, of all ages, who are like, I don't get it, I don't want to get it, I don't like it, or I don't know about it. And so, if we continue down this trajectory and it really goes into every aspect of our life, what's going to happen to people who get left behind from a technological standpoint? That's another big consideration too.
0:10:32 - Rachel Murray And because we are all feeding it and making it smarter. If enough people who are from marginalized identities or, you know, from minority identities in any way aren't using it, then the dominant voice becomes the majority.
0:10:48 - Felicia Jadczak I know, and it's so complicated because it's not just about using it, it's also, again, what material is being used to train it. And then how is that getting accessed? Is it stolen? A lot of it's stolen. And, you know, in the artist community, which I'm very involved in as well, there's a whole discussion around this. The whole point of AI and technology should be to do the stupid stuff that nobody wants to do, so that we can actually create art and music and writing and all the good culture stuff. But what we're seeing is that the robots are creating all the fun stuff and we humans are stuck with the grunt work, and that's not a world that I want to live in either. So, yeah, it's an interesting time to be alive.
0:11:29 - Rachel Murray Yeah, it's a real struggle. I do hope that people will use it, because it's happening regardless. Unfortunately, this is one of those things where boycotting is not a good idea.
0:11:48 - Felicia Jadczak No, and that's a reality. It's like, if you boycott, you do yourself a disservice, I think, and that's unfortunate, but that is the reality we live in right now. Wild. On that delightful note.
Well, I mean, I will say we get into it further with Catherine, and I do hope folks stick around and listen because it's a really interesting conversation. We both learned a lot. The book is amazing. You should definitely get it. But we talked to Catherine about all this stuff, and again, talking about VCs and entrepreneurship and how that all has come about. So what do you think, Rachel? Ready to get Catherine into this conversation?
0:12:22 - Rachel Murray I'm ready to welcome the lovely Catherine. Welcome, Catherine!
0:12:25 - Felicia Jadczak Great, Catherine, so let's just jump right on into it. So nearly a decade ago, you co-founded TechEquity Collaborative with a vision of making the tech industry more equitable, which is obviously a very amazing goal that we support fully. And over the years, we've seen increased awareness, but we've also seen some setbacks. There have been tech layoffs that have disproportionately impacted marginalized groups. There's the ongoing housing crisis and labor precarity. So, just looking back at all of this, do you feel like there has been progress towards equity? Has it met any of your initial hopes? What does that look like for you, looking back?
0:12:59 - Catherine Bracy Well, I guess, if the period of time that we're sort of snapshotting here is, let's say, late 2015, early 2016, which is when I started TechEquity, it isn't that there weren't things that we did over the course of the last 10 years that were sort of positive and maybe, in the longer scope of history, will look like progress. But, you know, if I'm reflecting on the last 10 years and assessing it based on where we are in March of 2025, I think there's really no way to look at where we are and say that we've made up a lot of ground. I hate to be so pessimistic about it, but that's just the way it is.
0:13:53 - Rachel Murray Yeah, and I think beyond that, you know, I just wonder. I know March 2025 is kind of wild. I mean, seeing the broligarchy up there during the inauguration definitely sends a message. But I'm curious, even though there has been all of that, and I'm not begging you for any sort of glimmers of hope, but if you do have any sort of nuggets of, like, yeah, we're on fire right now, but maybe there's something, do you see anything?
0:14:22 - Catherine Bracy Even if it's like greater awareness? Uh, like, nope. (Which is an answer.) I mean, if the question is, like, specifically in the tech industry, I think the answer is pretty clearly, comprehensively bad across the board, at least if we're talking about sort of big tech. I don't know that that's necessarily a reason to be hopeless, though. For me, I'm not. That doesn't make me hopeless, it doesn't make me ready to kind of give up. It is more, it's just information.
I feel like I know a lot more about what's driving the industry, and that's partially why I wrote this book. I just feel less naive and more, okay, I have a deeper understanding of the problem now, and this has taught me a lot about what is actually, functionally, quote-unquote wrong with the tech industry. And if we are, you know, to build something that is more equitable, these are the things that we will need to address. At least that much is clear. And so, yeah, I mean, I guess if you're looking at where we are today, we're certainly not where we would want to be, but again, I don't think we're in a hopeless place.
0:15:36 - Felicia Jadczak Well, that's certainly, I guess, hope in and of itself a little bit, because you very well could have been like.
0:15:45 - Catherine Bracy All hope is in the broader context in the world and the way that people are reacting to, you know, the Trump administration and the rising oligarchy, which is led by tech billionaires. And in some ways, I guess that feels like, you know, lancing a boil. Maybe it wasn't clear what the underlying issue was, and now we've brought it to the surface, and it's really bad right now, but now that we can see it, maybe we can get it out of our system, kind of thing. So that's hope.
0:16:18 - Felicia Jadczak We'll take it wherever we can get it.
And you know it's interesting because so Rachel and I, both our backgrounds are in tech too, and so you know we both read your book, which we'll get into in just a minute.
But I was talking with Rachel earlier and I was like, yes, so much of this, what you wrote was ringing true, and you know, we had similar experiences in a lot of ways.
And you know, one thing that's always struck us about tech as an industry is the ability, whether that's, you know, supposed or actual, to rapidly disrupt industries, but when it comes to equity and inclusion, that urgency is not there. And we felt that firsthand, both in terms of working in tech and then also trying to work with folks in tech to push the needle forward in these areas. And so I know you have a lot of thoughts, because you wrote them in a book, but for now I'm just curious what you think at a sort of high level. Is it because the systems themselves are too entrenched and they're too difficult to shift and change, kind of like we can't move the Titanic from the horse that it's on, or is it more about a lack of will, because disruption is only appealing when it's beneficial to those who already have a lot of power or are already in power? Just curious what your thoughts are.
0:17:24 - Catherine Bracy Yeah, I mean, I guess the disruption narrative is compelling. It also comes from this place of Silicon Valley and VCs really needing to create this hype machine. A lot of what venture capital is, is marketing, right? You're talking about companies that don't have a track record yet, maybe don't have a product, but those companies are trying to raise a lot of money by telling a story about what the future may look like if they were able to build out their companies, if their companies were successful. And so it's sort of an intrinsic trait, and it attracts people who are really good at that. Obviously, the natural selection of the industry is going to select for people who are really good at telling stories. Some of them are actually fraudsters. So it makes sense that this sort of narrative about Silicon Valley is deeply entrenched in us and is very compelling and attractive.
And that disruption narrative, I think, is very compelling and attractive, and it's not necessarily clear to me that disruption is a blanket good, or that a lot of the ways in which Silicon Valley was making a lot of money was actually by quote-unquote disrupting entrenched industries. I guess I will sort of challenge the premise of disruption as a concept fundamentally, but that's sort of a longer answer. In terms of whether disrupting quote-unquote entrenched bias in the labor markets is something that they just didn't care about, I mean, I think ultimately, yes. Not that they necessarily were actively hostile to DEI, at least 10, 15 years ago, but it just wasn't a focus. The focus of growing a venture-backed company is to just look at the product and the market. It's like blinders on, narrowly interested in taking as much market share as you can as quickly as possible, right, and so anything that's going to slow that down is a bad thing.
And ultimately, I think this conversation about, well, who's actually benefiting from this growth? And I mean, not only did it undermine the sort of, you know, fantasy mythology of Silicon Valley, but it also slowed the companies down. You know they had to hire all these teams, they had to do, you know, add on this whole function to the work and they don't even want to do basic HR, let alone attach an equity consciousness to that work. So, you know, I just think it was like, fundamentally, if there was something structural about it, it's that it was fundamentally incompatible with the way that venture-backed companies have to grow in order to satisfy investor demands. There is no time for them to like take on all of these other, what I think they would view as sort of ancillary projects that aren't about, you know, capturing as much market share as they can as quickly as possible.
0:20:19 - Rachel Murray Yeah, Catherine, I have to say, I mean, this book, like Felicia said before, it really resonated with us really deeply. I mean, we sort of came up in that era in the Boston area where, you know, VC was everything. There was a little talk of angel investing here and there, and we were certainly called a lifestyle brand, you know, the whole pejorative and everything, and definitely being treated that way. So just want you to know, we absolutely can relate to everything that you are saying. We saw that it was really about speed. It was really about just making the dollars. It wasn't really thinking about things in any sort of sustainable way. So as we were reading the book, we're like, yep, yep, yep. And so I'm curious, as we were reading the book, you know, we're getting a lot out of it and a lot of validation, so I'm just kind of curious, were there any surprises that you found along the way when you were researching?
0:21:11 - Catherine Bracy Yeah, I mean, when I started writing the book, it really was a critique, and I was thinking of it, you know, I'm remembering now, this was like four years ago, there was a cottage industry of books about all these individual tech companies that had become sort of notorious failures or scandals. And there were Netflix series, and there was just one right after the other. And I remember thinking, there's a common thread here, and nobody's sort of explicating that common thread, which is, you know, the economic structure that sort of enabled and then created the incentives within which these companies were operating. And there's a story to tell there about how that system empowers founders like this or companies like this and creates all of this risk that isn't necessarily there in the first place, or pumps money into companies that are sort of pushing off the risk that they're creating to the rest of us. And that's what the book started out to be. And then I talked to more and more entrepreneurs who had tried to raise venture or had raised venture, folks like you, and I heard the same story, right? It's like, this really resonated with us, this really resonates with us. We feel like we were trying to raise money and hearing that we couldn't because of this and this.
And after enough of those conversations and hearing the stories of entrepreneurs, many of which are laid out in the book, but not all of them, I can't pinpoint an exact conversation or a moment in the process of writing the book, but it did switch. And I think if you are reading it very closely, you can pick out this narrative arc. The book sort of starts with a much more critical lens and then ends on a much more hopeful, I think, empathetic note, I would say. And that's because I had all these conversations that informed this opinion that I grew to develop, which was that the challenges that VC is creating for society are not in the companies that they are funding, it's in the companies that they aren't funding, or it's in the companies that they shouldn't be funding, that are not built to actually reach venture scale and that are great businesses.
You know, I think I get put in the bucket of like social impact. They're not social impact companies, they're just normal businesses with a smaller organic market opportunity and they are either forced to try to be a grand slam when they're actually just a double or a triple or they're not funded at all, and that dynamic is actually what's really harmful to the economy. Not so much the money and I mean this is harmful but like not so much the money that goes into Theranos or WeWork or FTX or whatever. It's the money that doesn't go into really good businesses that could be addressing things like the housing crisis or the climate crisis or just like creating wealth and opportunity for a more diverse set of founders. So yeah, I mean it was sort of a shift in my thinking over the course of my time writing the book, but I couldn't like pinpoint for you one story where it came together in my mind.
0:24:26 - Felicia Jadczak Well, it sounds like you had a lot of different stories and conversations, and, you know, a lot of work went into this book, so that makes a ton of sense that there wasn't necessarily that aha moment or that one person. But yeah, as Rachel said, it really rang so true for us, because again, we were coming up in that time frame. We dipped our toe very briefly into the world of VC and were told categorically, we don't want you, and this is not the kind of business that we will ever fund. And we're like, when was this? This was, well, it's actually...
0:24:57 - Rachel Murray Probably 2016, I want to say. 2015, 2016. And we were, yeah, we were squarely a women-in-tech group. We were running a lot of events. We were really popular in Boston. We had three events a month, where we had all the corporate sponsors, years of it, and I would say the only people that I know that got funding for what we do is probably PowerToFly. Yeah, because they had an app.
0:25:22 - Felicia Jadczak Yeah, do you have a tool or a platform? I don't even know if you remember this, Rachel, but I got into some, it was like a pitch competition or a pitch invitation, pitch thing, which the New England Venture Capital Association had been hosting, and it was such a waste of time for everybody involved because no one there wanted to talk to me. And so, you know, again, we were underlining this book. I've got it right here. It's got so many notes in it, where I was reading Rachel's notes, because she read it first, and she was like, yes, oh, gross, and I'm like, plus one. I don't know who will ever read this after us, but whoever does has a lot of notes to read.
0:25:59 - Catherine Bracy If we have time, I would love to know which parts you underlined and put stars next to. But I'm curious, not to flip this around on you, but these stories are so interesting to me. Why did you think at that time that venture capital was the right thing for the business you were building?
0:26:11 - Felicia Jadczak Yes, great question, and actually you answered it yourself in the book. It was because that was the story and the narrative that we were told. So when we first, you know, came together and we left our respective other jobs and we were really working on putting our company out there, we were told, and especially at that time, Boston was so VC heavy, that that was the way to do it, and if you wanted to be an entrepreneur, you got VC funding.
0:26:35 - Rachel Murray Or, I will say, even angel investing, yeah. And the reason why was, you know, we had intentions for it. It wasn't like, oh, we just want to have money, we were doing it for expansion reasons. And it's really interesting. Chief is actually another example of a company that got a ton of funding because they wanted to expand and they had a particular market, but their market was really engaged with C-suite level women and our market was really mid-level women, and so maybe there wasn't as much money, or we didn't tell the story well enough, or it's just, you know, we didn't know the right people. And it's kind of interesting to think about even the angel investing, which turned out to be great, because you know what's great? Owning the business yourself entirely, 100%, and then growing it however you want and being able to contract whenever you need to, like in 2025 or 2020.
0:27:27 - Catherine Bracy Yeah, that's right. I mean, I wonder, you know, obviously that was the zero interest rate period, and that, I think, colors a lot of the critique that I'm making. And I think, you know, I'm jumping ahead on the list of questions here, but there is some question about whether just the simple fact of having interest rates higher for longer creates the conditions for something else to happen. I'm not so sure, just because I think we're seeing a lot of the same unhealthy cycles around the AI industry. But, leaving that aside, I wonder if you think that, not even just knowing what you know now, but if the conditions in 2016 were the same as they are now, do you think that you would have tried to raise VC? Was it even in your head that you could get money from anywhere else, like a bank loan, or just, you know, anything? Did you guys have that conversation?
0:28:18 - Felicia Jadczak Like, you know, it was like a sneeze in the wind, so it really didn't go very far or last very long. But again, that was the predominant narrative, especially in the Boston area. And so, I was just reminiscing with Rachel earlier that I used to do a ton of public speaking at that point, and a little bit less nowadays, of course. But at that point I just remember, every time I was on a panel, especially if it was around entrepreneurship and building a business, I felt like I had a duty to say, hey, you don't have to, this is just one way to build your business. And at that point, I think I called us a unique startup story, because our startup story was very different than 99.9% of the other startups that we encountered at that point in time.
And again, that's why it felt so validating reading your book, because some of the notes we wrote were, why didn't this person get a bank loan? I forget the name of the company, but there was one story you shared where the person wanted, I think it was $500,000, and tried to go after VC funding. And we both wrote, why didn't they try to get a bank loan for that amount of money? And I think in the Boston area at that time frame, that was just not the advice that we were given. We had a lot of mentors and people, and they said, do it this way, go down this pathway, and it didn't make sense. And we were told too, you know, people would say, well, what's your goal with the company? And we would say, well, we want to grow it to a nice place and then run it forever. Surprise, surprise, that's not the answer anyone wanted to hear.
0:29:52 - Catherine Bracy Yeah, that's going to get the door shut right in your face. Well, I mean, I think there's two things going on here. First of all, it's very hard for an unproven startup to get a bank loan, you know. So I think that's one reason, and obviously that was the gap that venture capital was invented to fill, right, for startups who are doing something risky, usually new technology, that don't have a track record, that are trying to commercialize something where it's not clear whether it's going to work or not. A bank is not going to take that bet, or they're going to take it at a very high interest rate that puts a lot of baggage on this small little company, and so there is a need for this type of financing. And that is something that I try to make clear in the book.
My hope is not that we, like, ban venture capital or whatever. It's that venture capital, the traditional definition of venture capital seeking these power law returns, goes back to doing what it was meant to do when it was invented, which is to find those breakthrough technologies, and not just chase every startup that they could force into a winning lottery ticket.
But then the other thing I think is going on is like and maybe you guys were conscious of this at the time, maybe you weren't, but looking back I think it's pretty clear now that like it wasn't just the signals from your advisors who were telling you this is the way to do it, it was also a sense of like the culture you're in and like what does it mean to be a successful entrepreneur? And you know you feeling like you have to justify yourself if you're not raising, if you can't raise VC, that's a signal that you're not a successful entrepreneur. It has nothing to do with the actual business that you're building. And that became, I think, super ingrained and super unhealthy in the larger sort of culture around entrepreneurship and that is also, I think, an outgrowth of this sort of hype machine, the marketing around Silicon Valley.
0:31:44 - Rachel Murray We sent you a list of questions, but one of them wasn't specifically to describe what the power law model is, and there might be people who don't know what it is because they haven't read your lovely book yet. So I was wondering if you could talk a little bit about that. And, to your point around how the industry is, how could you see it potentially shifting? What do you think it could take to actually shift the industry so it could be more sustainable? Yeah, yeah.
0:32:09 - Catherine Bracy So I guess, going back to that history, you know, I said that the VC model came out of solving this problem of needing to bring capital into riskier startups in, like, the middle of the 20th century. It was the post-war era, where there were a lot of new technologies that had been invented, but it wasn't clear how to commercialize them. And this group of kind of civic and business leaders, recognizing this problem, which was also tied to a need to show that capitalism was a better system to create, you know, strong societies than communism was, looked at some models that had been used in other industries and kind of came around to one that was used in the whaling sector at the end of the 1800s. Also a very risky industry, where they had agents that came together and collected investment from the money holders, the wealth holders, and then these agents became sort of experts about which captains they should give the money to, and invested in certain ventures that these ship captains were going to go off and do, and that way the risk was spread across multiple different ventures instead of just putting your money on one ticket. And that model, obviously, is what the venture model came to be: putting all of these companies into a portfolio, spreading the risk around. And the idea is that these are high-risk, high-reward opportunities, so a small number of them will pay off very big, and a larger number of them will sort of fail or not return anything. That small number of huge returners will more than make up for the larger number of failures, and that, when you plot it on a graph, looks like a hockey stick up to the right. That is called a power law, right. So that concept, that you're going to have a couple of huge grand slams that hit it really big and more than make up for the ones that don't, is the fundamental principle that VC is built on. And that was a hypothesis.
When it started in the middle of the 20th century, they didn't know that was actually going to work. It turned out it did work, and when it did work, and a few of the early firms sort of replicated it, it blew up, right. And it has become, I would say, not just an observation of what your portfolio will look like after you invest in companies that are producing breakthrough technologies; it has become the point in and of itself, right. So investors are chasing the financial return and not the breakthrough technology, and so everything becomes about how do you shove every company in to fit that power law curve, rather than looking at companies that could be really breakthrough technologies and letting the chips fall where they may. And I think that has created most of the harms that we see in the industry today, and the book sort of lays out how that all plays out for everyday people.
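To make the power-law shape Catherine describes a little more concrete, here is a rough, illustrative sketch in Python. The failure rates and return multiples below are made-up assumptions chosen only to show the pattern she is talking about, where a handful of outsized winners account for nearly all of a fund's return; they are not real fund data or anything from the book.

```python
import random

# Rough, illustrative sketch of a power-law venture portfolio.
# All rates and multiples below are made-up assumptions, not real fund data.
random.seed(42)

def simulate_portfolio(n_companies=100, check_size=1.0):
    """Return a list of payouts for one hypothetical fund."""
    payouts = []
    for _ in range(n_companies):
        roll = random.random()
        if roll < 0.70:            # most companies return nothing
            multiple = 0.0
        elif roll < 0.95:          # some roughly return the money
            multiple = random.uniform(0.5, 2.0)
        else:                      # a few "grand slams"
            multiple = random.uniform(20.0, 100.0)
        payouts.append(multiple * check_size)
    return payouts

payouts = simulate_portfolio()
total = sum(payouts)
top_five = sum(sorted(payouts, reverse=True)[:5])
print(f"Total fund return: {total:.1f}x one check")
print(f"Share of return from top 5 companies: {top_five / total:.0%}")
```

With these assumed numbers, the handful of outliers ends up providing most of the fund's total return, which is the hockey-stick curve described above, and it shows why investors end up pushing every company to try to be one of those outliers.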
0:35:16 - Felicia Jadczak Oof. Well, to jump into another oof kind of topic, you also talk in the book about this concept of blitzscaling, where companies are rapidly growing at all costs, and there's a lot of harm that this approach can entail as well. And, you know, we're really, I think, now seeing this reverse-blitzscaling sort of approach in government, where all these structures and institutions are just being gutted and ripped apart without any real plan, it seems. How might we be able to come together, sort of cross-industry, so across business, policy, different communities? How can we address this and build something more sustainable, whether that's specifically in the VC space or more broadly speaking?
0:35:58 - Catherine Bracy Yeah, well, there's a lot to say here. So I mean, first I'll start with blitzscaling. Going back to what I just said, if VCs are trying to fit all their companies into this power law model, what exactly are they asking them to do in order to create that return? And blitzscaling, I mean, I think blitzscaling is just one answer. It's a very evocative methodology, but it's pretty much the basics that all of the VC playbooks have, which is, you need to try to grab as much market share as quickly as you can and do whatever you can to get there. Anything goes, and the ends justify the means, kind of thing. As long as you get to a place where you have huge market share, then anything you did to get there is fine. And obviously it is named after a Nazi military strategy, which I think Reid Hoffman probably regrets at this point. But it tells you pretty much all you need to know about how these companies are forced to behave. It has been very successful at making some people a lot of money.
One thing, and I mean I'm going to kind of ramble on because I feel like there are (Ramble away, please) several different topics to get through before I get back to the actual question you asked, but interrupt me and reel me back in if I don't get there. One of the things that I came to appreciate after the election, my coping mechanism was to kind of go back to history and do some reading about, well, how did we get here? Where are we? How do we orient ourselves in this place? And I did a lot of reading about, you know, the 90s and the deregulatory era and NAFTA, and what became clear to me was that the rise of Silicon Valley, and especially the practice of venture capital, was sort of the perfect model, or practical application, of late-stage American capitalism, the neoliberal economic era, where the idea was that government's role was to facilitate the private sector to make as much money as possible, and if they did that, all the benefits would flow to society from there, and the policy project was to support that. And I think we saw that in the Clinton and Obama era. I mean, I worked for President Obama in the early 2010s, so I don't feel like I'm blameless here either, but I just think that was the mindset. It was the economic system we lived in, and really it was about leaving these companies to operate unleashed, and whatever they did was going to roll down to the rest of us.
I think a lot of how we got to where Silicon Valley has tilted so far to the right and has really started this very explicit villain arc is that they are realizing, I think, along with the rest of us, and this started in 2016 when Trump was elected, that that era is ending. That it didn't work, that trickle-down, supply-side economics actually caused people a lot of harm, caused the economy a lot of harm. That we had this recession, and the growth that followed the recession only served to concentrate wealth at the top and did not create all these goods for the rest of us. People reacted. It took eight years, I guess, until Trump came around. They were given an opportunity, but there was a lot of dry tinder there, right, and so they reacted.
At the time, in 2016, I remember I worked with Sam Altman on pulling together a group of Democratic donors from the tech sector. We were going to do, like, tech resistance, that's what we were calling it. It was all these really fancy, high-up tech people talking about, what happened? What do we do next? Where does Silicon Valley put its money to fight Trump? And I had so many conversations in 2016 with people from Silicon Valley who had been totally checked out of politics up until then who were like, whoa, actually this is something I need to be plugged into. So there was a sense that populism was not going to work for this industry.
And then Biden came along and actually was like, you're part of the problem, and you have to play by some rules now.
And the freak-out was so disproportionate, right? And now we are where we are. So, sorry, this is all a long way of saying, I think there's a way to explain how we got here, and why tech and VC is so aligned with Trump, by understanding the way that economic systems evolve. And I think there's no greater indicator that this supply-side system that we lived in before is dying than that they are having this desperate reaction. They feel so threatened by what's happening that they feel like the only way to protect what they have is to throw in with Trump and, you know, just go full grift and get as much out as they can before the house comes crashing down. So that sort of sets up where we are. I'm going to stop talking and see if you guys have any questions before I go back to answer the original question that you asked.
0:41:21 - Rachel Murray No, we could probably talk forever, but listen, this is your opera, this is what we want. We could do all the talking, okay, but that's kind of boring for people.
0:41:29 - Felicia Jadczak No, I mean, I will say we have a little back channel, and a bit of a back-channel question that we were asking ourselves as we were talking, which, if you don't mind, Rachel, I think is a good point to bring up, because you mentioned just now, talking about Biden and this freak-out. I'm, like, 100% with you. I was chatting with Rachel and I was like, you know, Silicon Valley is in its villain era right now, and it totally feels like that, and that's the reality, right?
And so, you know, I'm thinking of the war on critical race theory and the Me Too movement, and, again, the time period that we're talking about encompasses a lot of this. And especially for VCs, I'm thinking of the Me Too movement, which, at least in Boston, there was such an uproar about it, and everyone was like, oh my God, we have to figure this out, and all these panels were being convened, and then really not much changed. And so I'm just wondering out loud if you think that some of those, I guess, movements, for lack of a better way of categorizing them, were sort of chipping away at this house of cards, and then the last piece was with the government and Biden? I'm just curious how you think it all kind of fits together there.
0:42:55 - Catherine Bracy Yeah, I mean, like, now we see the problem exposed and we can't delude ourselves that there's something else going on here.
That is sort of the point, right? Of, like, pushing them to do the bare minimum. Anything that threatens their ability to be the biggest company or make the most money is existential and is worth them going nuclear on, I think. So, you know, I'd put that in the same bucket. If you think about the things that the Biden administration wanted to do or was asking them to do, it wasn't any worse than what Trump has proposed, right? And, I guess I don't know how the policy is playing out right now, but he hasn't brought people into his administration to run places like the FTC or the antitrust section of the Justice Department who are against all of the stuff that Biden was doing in the first place. A lot of that stuff seems like it is going to continue and go forward.
I will say, though, Catherine, we are literally, it's so crazy to say this, only two months into his administration. Fair enough, I know, you need to have a huge asterisk on everything. Even though it feels like two years. Yeah. One thing we haven't talked about here, we're talking about all the economics of it, but I do think there's a cultural piece of it too. One of the things that Marc Andreessen said after the election, when he was talking about why he's decided he wants to be full MAGA now, is that it wasn't just that he felt, he called it the deal, that we had reneged on the deal, that Clinton and Obama had told him he can do whatever he wants and make all this money and that'd be great, and now Biden is saying that's not true. An important element of that deal was also that we would call them heroes and think of them as great men of history and benevolent titans.
I guess now they don't have that, right? We're not calling them heroes, and I think a lot can be explained by the fact that any sort of critique of the business practice or the money or whatever was seen as an attack on them personally. They took it very personally, and so I think a lot of the diversity stuff they took very personally. They're so emotional, Catherine. Oh, I know, they shouldn't be trusted to run anything. Seriously, yeah. So I mean, I think a lot of this is just identitarian, and it's not even just about threatening the business. Their identity is so tied to how well the business does that if you threaten that, you're going at the core of who they are, and they would rather take down American democracy than go to therapy.
0:45:48 - Felicia Jadczak So that's... I'm just like, this is where we are today. It's so wild. Like, I'd literally rather be an evil demon villain for all of eternity than do the right thing. Right, exactly.
0:46:04 - Catherine Bracy I'd rather have my name written in the history books as second only to Hitler, you know, than like take one minute to reflect on my own behavior.
0:46:15 - Felicia Jadczak Yeah, I feel like I don't even know where we are at this point.
0:46:19 - Rachel Murray That is so helpful. Thank you, yes.
0:46:35 - Catherine Bracy I want to know, so how can we address this? See, a lot of people... it's, like, not cool to be earnest right now, you know. It feels a little bit like cynicism is en vogue, and I would really ask people, implore people, to reject that thinking. I was in Russia in 2010, which was like a solid 10 years into Putin's reign, I guess, and I remember feeling really depressed about just the sense that everybody had kind of given up. You could feel it, the cynicism was palpable. Everybody had kind of given up; it was all, what can I get for myself? There was no real sense of there being a civic project at all. You know, I think cynicism is so corrosive, and what I fear is that's what's taking root right now with us.
There are a few benefits, I think, or things working in our favor. One of them is actually that the Trump administration is moving so fast and being so obvious about what they're doing. They have no impulse control, and so they'll do the worst; they can't help themselves. And so I think it makes it easier to get people's attention focused, and I think we're starting to see that shift. People are waking up.
But if your tendency, if you feel the urge to say, well, it's not going to make any difference, or they're going to get away with it anyway, so we might as well just not... Or, you know, I'm sure that nobody who listens to your podcast would obey in advance, but if you were thinking about self-censoring, don't do it. There are all these things that I think we can do in our own lives to just not let the rot take root in our own consciousness. That's the thing I would say. Don't censor yourself. Don't tell yourself no before they do. That's part of what they're trying to do: overwhelm us and turn us inward.
So I'm going to my Tesla protests, you know. Be visible. Go organize a town hall, or go to the town hall in your community. Say the earnest thing on the internet. Don't be part of the cynical problem. I think that's probably the easiest thing for individual citizens to do. That's the only way we're going to get out of this. I don't trust leaders; we shouldn't trust any leaders to do this. It's going to take a mass mobilization effort at this point, I believe. And so, yeah, all I can say is, wherever you are in your community, don't let cynicism grind you down.
0:49:31 - Rachel Murray I'm going to have a cry. This is exactly what Felicia and I talk about all the time. Community is so important, and leaning on others for support when you're feeling low, sort of being buoyed by other people who are ready. So thank you for sharing that. I think that's absolutely the right attitude.
0:49:49 - Catherine Bracy And I think, if you're looking for it, there's so many places where you can see this hope taking root, and I keep like a mental list of all of the reasons that we should be hopeful and I feel like if there ever was a time for the audacity of hope, it is right now.
0:50:00 - Rachel Murray Well said, well said. Well, time is flying by, and, honestly, Catherine, I feel like we could probably talk with you for four hours on all of these topics, but I really did want to get to something that people aren't talking a lot about these days, which is AI. You may have heard of it.
Obviously, there's been this massive shift. We're seeing it in real time. I mean, wow, what a time to be alive. It's kind of wild right now. And so I'm just curious, with how quickly it's advancing, automation, you know, is already starting to replace jobs, and gig work has sort of become more of the default. You certainly talked about it in the book. The people who aren't the 1%, the workers, the small business owners, the managers, I'm curious what you think they should be thinking about, what we should be thinking about, right now to navigate these changes, because it does feel like it's so out of our control. Are there ways that we can not just survive but maybe build something a little bit better in this wild world that we are living in?
0:51:01 - Catherine Bracy Yeah, I mean, this is what I do for my day job, so I think about this a lot. You know, I think there are, like, three buckets of people I come across, and so, wait, who is this for? Is this for people who know the technology or are working with the technology, or is this for just, like, everyday people?
0:51:19 - Rachel Murray It could be both. I think it's for people... I'm thinking about it specifically from the labor standpoint. I know that your focus is labor and housing, so I'm thinking about it from the labor market standpoint. But that could be anybody who's in the workforce, or even small business owners who maybe are just thinking about how this will impact their work.
0:51:44 - Catherine Bracy There are a couple of ways to slice this. The first way I'm going to say it is: workers who have agency in the way that they go about doing their work, and workers who do not.
And then the other way to think about it is the way AI is affecting work. One is AI coming into the workplace, so tools that we use in our jobs or that are used on us. And then there are the workers who actually power the technology. There are different challenges in each of those, and those are the two buckets of work that we do, I mean, the AI workers in the AI supply chain, and then AI in the workplace. I think for your listeners, this question of, do I have agency over the technology, or does the technology have agency over me, is the big divide, right? So for somebody like me, and I'm assuming you guys, I see all of the possibilities that AI could bring to my work. If I never have to write another grant report, if I never have to build another PowerPoint deck... I mean, if I had had a really good AI research agent while I was writing this, this book would be 50% better than it is, right? Because of the amount of time spent just searching the internet for stuff that I needed or crunching the data. I only had so much time, I was writing this on the side. It could have been a way better book. Endless possibilities for people like that, I guess. For those who don't have agency over how the technology is used on them, they're usually lower-wage workers, they're in a more precarious financial situation, they don't have as many protections in the workplace, this could be, I don't want to say catastrophic, but a very impactful development. And what I have learned, from doing all the work I've done over the last 10 years, but also just from being across the negotiating table on AI policy with corporations, is that everybody thinks, oh well, the technology isn't going to come in and replace workers, or do X, Y or Z to workers, it's just not good enough yet, it's never going to get there.
The technology does not have to be as good as a human worker for companies to replace human workers with it. I think a lot about the new app that CVS makes you download in order to open the cases in the store. You know, everything is behind plexiglass, and it used to be that you pressed the button and somebody had to come over and open the case, and that was already annoying enough. But now they've decided that, instead of just removing the plexiglass or hiring more workers, they're going to have an app that you have to give all of your data to in order to open the case. They don't care about us, and they don't care if the experience is worse. They know that you're willing to put up with a little bit less, and if they can increase their margins, they will replace workers.
So the technology does not have to be that good for it to start really disrupting the labor force, and I think that is already happening, and I am extremely worried at how little attention is being paid to that, even from people on the left who work on tech policy or in labor, who want to shit all over the AI companies: it isn't that good, it's just hype, all of that. Maybe, but I don't think any of us know. None of us can actually say, including the companies, including the advocates, where this is going.
But again, we don't need to know where it's going to know that they will use this technology to exploit workers, and we are not on top of it. We are not on top of it. So that is something that is very concerning to me now, and that divide between workers who have control and agency over the technology and workers who don't is going to get much more stark. Some of those workers, going back to where we started, are workers in the tech sector who thought they had agency over their own jobs, and it turns out, I think, they're learning a lesson now that they don't. They are actually just workers like the rest of us, and that may be the most positive development of the last 10 years. Like I said, we are now seeing what is actually going on, workers in tech are now seeing what is actually going on, and for them to internalize that could be a very powerful shift in the industry overall.
0:55:59 - Rachel Murray You know, I would love to add to that, if that's okay. Thank you so much for sharing that, because I am nodding aggressively. So, a couple of things. One is that my husband works for a very large tech company in the Bay Area, and he recently did a video, because he's someone who gets on video and does a thing, and I swear, I was like, oh, he is 100% going to be an avatar for the rest of humanity. He's now videoed himself into being a robot, a virtual robot. So that's wild.
And someone asked, well, surely they can't just take your thing? Absolutely they can. I'm sure that whatever he signed, he's a work for hire. So this is his new reality. And I really think there's been this idea that we're the kings and queens, and it's like, no, we're not actually, we're way closer to this other side, and this is our reality. I think there is this awakening, and I'm excited for that as well, because this is where I'm like, oh, maybe there will be real hope and real change for workers now that we realize that we're closer to gig workers than to the broligarchs. That's always been the case, but now the charade is gone. So thank you.
0:57:59 - Catherine Bracy That is very true. Closer to gig workers than the broligarchs, for sure, yeah, yeah.
0:58:05 - Felicia Jadczak Yeah, and that reminds me as well of a section in your book where you talk about full-time employment versus contract employment, and the rise of contract employment and that trend in tech specifically.
And again, I mean, my husband also experienced that, where he was supposedly getting hired full-time by a large tech company in the Boston area, and then at the very last minute they were like, oh, never mind, you're on contract, maybe we'll convert you. And then they never did, and that's a whole other story for another time. But we have limited time together, so what I would love to start to close this out with is more of a visioning question. Given everything that we've just shared, all the research, your work, and all the stuff that we didn't even get into in terms of your background and your experiences, what do you think the American workplace, or the workforce, or work in general, is going to look like in 10 years, especially if we could think of it through the lens of hope as opposed to being super cynical about it, which I know is probably really hard to do right now? What do you think it could look like?
0:59:01 - Catherine Bracy I am so bad at these. I guess I would go back to maybe not what it would look like, but what it would feel like, or some conditions that might be in place. I will say I am very excited about AI. I'd love to see the conditions in place for the labor force overall to have the psychological safety to use their imaginations in that way: to think about all of the ways that AI could enhance their lives or their work, or make them more productive or more whole people, or give them more dignity and space to do what only humans can do. We don't live in that world yet. I don't think we're going to get there in 10 years. Maybe we will have started to pick ourselves back up from where we are now.
But there's a big project here to create the conditions for everybody so that this new technology can actually be the sort of game-changing value add that the companies claim it will be.
So that's the work I'm doing at TechEquity. That's our big project now, starting in California: how can we create common-sense standards that don't hinder the ability to see where this technology is going, but in fact create the conditions on the ground so that people trust it, want to use it, and can imagine new uses for it? That's where vibrancy in the economy is going to come from. If everybody's scared of it, if everybody thinks it's going to take their job or become a sentient robot that kills them, or is worried about it creating a new bioweapon, then yeah, it's not gonna go very far. That's what happened with nuclear, right? If we don't have these sorts of guardrails around the systems, then we're never gonna be able to use our imaginations and build this future, and so I think any sort of imaginative exercise just requires that we create that space where people can think forward and not be in this scarcity mindset that they're in now.
1:01:07 - Rachel Murray Love that. Totally agree. Okay, so any final thoughts before we go.
1:01:12 - Catherine Bracy I think we covered most of it. I can't remember half of the stuff that I've said now. Apologies, Jesus, if this doesn't all hang together and make sense. But if you want a book that is both, I think, a fair critique of the tech sector as it has been created and also a hopeful vision for how we might build something that comes next, I hope this one is it. Actually, one of the questions we didn't get to in your list was how we should be building, like, what can we build in 2025? And 2025 might be a rough one to think about.
But if we think about what we should build for after Trump, because fascism always fails, right? I don't know how long it's going to take, I don't know how much damage is going to be done in the process, but we will be in a position to build something new, and it is going to require new technological innovation and the capital to fund it. How do we think about creating that structure so it does not repeat the mistakes that we've made over the last couple of decades? Maybe there are moments in the next couple of years, bright spots, where we can start to take advantage of the fact that interest rates are so high and show that a different way is possible, using this moment that has been created for us. So thanks for having me.
1:02:26 - Felicia Jadczak Thank you, Catherine. If people want to learn more or connect with you, where can they find you?
1:02:30 - Catherine Bracy Techequity.us is the best place to find out about TechEquity's work. And then my name... or is it .org or .com? One of the two, you'll get there: CatherineBracy.com, to learn more about book events and what's next on the book tour, and we'll be sure to link the book as well in the show notes.
1:02:54 - Rachel Murray Thank you so much, Catherine. Thank you both. We did it! We absolutely had an amazing time with Catherine. Hope that you enjoyed it as much as we enjoyed having the conversation.
1:03:06 - Felicia Jadczak Thank you so much for listening, and please don't forget to rate, share, and subscribe. It makes a huge difference in the reach of our podcast and, by extension, our work. You can visit us on YouTube, Instagram, and LinkedIn, and sign up for our newsletter at inclusiongeeks.com/newsletter to stay up to date on all things Inclusion Geeks. Stay geeky. Bye!