Our Guest Dr. Rumman Chowdhury Discusses
AI Is Killing Free Will: Ex-Twitter Ethical AI Lead Explains How to Protect Yourself
Are we giving up our freedom for convenience without realizing it?
In this episode, we sit down with Dr. Rumman Chowdhury, a globally recognized AI ethics leader featured in Time and Forbes, to unpack the real risks of artificial intelligence, Big Tech power, and data privacy.
As the former Head of AI Ethics at Twitter and Accenture, Dr. Chowdhury shares an insider perspective on how Big Tech is consolidating power, why narratives around AGI and “AI intelligence” are often misleading, and how everyday tools, from apps to social media, are quietly shaping a surveillance-driven ecosystem.
This conversation dives into AI ethics, surveillance, and the future of work, and explores why trust in AI is declining even as adoption accelerates. We also break down the real-world implications of AI, and most importantly, how you can protect your data, reclaim your agency, and use AI more intentionally.
00;00;01;28 - 00;00;18;29
Dr. Rumman Chowdhury
I played this innocent game in, like, 2016. That data lives forever and ever. And over a decade later, it's being used with a technology that didn't exist at the time for an incomprehensible evil.
00;00;19;01 - 00;00;45;27
Geoff Nielson
Hey everyone! I'm super excited to be talking to Dr. Rumman Chowdhury. She's a former leader of AI ethics at Accenture and Twitter (remember Twitter?), and recognized by publications like Time and Forbes as an absolute leading voice in how we use AI. Look, we all have our concerns with Big Tech, but she has actually been in charge of trying to make AI companies more accountable, and been fired for it, which I think is a badge of honor.
00;00;46;00 - 00;01;03;12
Geoff Nielson
I really want to know what her biggest concerns around AI and Big Tech are right now, what stories about AI we need to reject, and if there's a responsible way to use this technology at all. It should be an amazing conversation. Let's jump in.
00;01;03;14 - 00;01;16;14
Geoff Nielson
Thanks so much for joining today. Really, really excited to have you. And maybe just to kick things off, I wanted to ask a broad question just around, you know, what concerns you most around the state of AI right now?
00;01;16;16 - 00;01;32;00
Dr. Rumman Chowdhury
Consolidation of power, lack of agency, which technically are two things that are really one thing, right? So fewer and fewer people hold more and more power, and we have less and less say about what's getting built and how it's being built and what it's being used for.
00;01;32;03 - 00;01;39;24
Geoff Nielson
So when you say lack of agency, you mean as, kind of, consumers or users of AI, our, you know, stake in this, or ability to direct it?
00;01;39;26 - 00;01;58;13
Dr. Rumman Chowdhury
Exactly. And, you know, to be very explicit with it, it is overwhelmingly clear that people do not want AI in many of their consumer goods and products. They do not trust it. They understand what the technology is being used for and the use cases. They understand how their data is being used in ways that they've not approved of.
00;01;58;16 - 00;02;17;29
Dr. Rumman Chowdhury
So it's not really a disagreement with the fundamental technology. It's a disagreement with the power structures, right? So I think recently there's a poll everyone's talking about where, I think it was like 70 percent, like, a really high percentage of people, you know, ranked their view of AI very, very low, aligned with people's sentiments on ICE.
00;02;18;07 - 00;02;38;17
Dr. Rumman Chowdhury
So that's been the running joke in tech, that, like, wow, we actually hate AI more than we dislike ICE, or, as a population in America, we're right around on par. And, you know, there is no love lost between the average American and ICE. So, and this is just one in a series of many surveys that have been going on for years and years.
00;02;38;17 - 00;03;01;15
Dr. Rumman Chowdhury
And just to point to another one: there has been a Pew study that's been ongoing, and every year for the past few years, Americans' trust in AI systems has declined, and more and more people say that it will bring more harm than good. Which is the very explicit thing they're responding to: more and more people believe every year that this technology will do more harm than good.
00;03;01;15 - 00;03;09;03
Dr. Rumman Chowdhury
So yeah, when I say agency, it is very clear that people don't want it. And yet all we are seeing are new AI launches.
00;03;09;05 - 00;03;23;25
Geoff Nielson
Well, and that feels like, you know, especially that Pew survey, it feels to me like more of an indictment of the power structure and of Big Tech than of the technology itself, right? It's saying we don't trust the institutions that are behind this.
00;03;23;28 - 00;03;46;13
Dr. Rumman Chowdhury
Yeah, that is exactly correct. And again, people are very clear as to why. And that is exactly it. Sometimes CEOs, and Silicon Valley certainly, interpret it as, oh, the average person doesn't understand what AI is capable of. I think they do understand what AI is capable of, and they're willing to say, yeah, this is, like, a cool toy, and maybe it could do some impressive things.
00;03;46;14 - 00;03;57;11
Dr. Rumman Chowdhury
I am not willing to give up my personal liberty and the rights of the people around me so that I can have a calendar agent, you know?
00;03;57;14 - 00;04;22;20
Geoff Nielson
So, with that in mind, and the Big Tech perspective, I mean, one of the things that's interesting here is there's just so much noise, so many voices, so many conflicting narratives about, you know, what AI can do, what AI can't do, what the future looks like, how it's going to impact, you know, people's lives and their livelihoods, and very different incentives from the actors behind some of these voices.
00;04;22;20 - 00;04;43;26
Geoff Nielson
Whether they're trying to get you to adopt the tool or trying to sell their own services. And you and I both exist, you know, within this ecosystem to some degree. But I'm curious if there are any particular narratives that you're hearing pushed by the creators of AI that you think are dangerous, and that you specifically want to call out that we need to reject?
00;04;43;28 - 00;05;06;01
Dr. Rumman Chowdhury
Yeah. The big one really is, well, there's two. One is just the general anthropomorphism of the technology. And frankly, I see that coming more from the quote unquote good guys, Anthropic, than I hear it coming from OpenAI. And, you know, I coined a phrase years ago, in the days of narrow AI. I called it moral outsourcing. And we're seeing moral outsourcing at play.
00;05;06;01 - 00;05;33;24
Dr. Rumman Chowdhury
Right. The intent of using language that humanizes AI: AI models are built, as a design decision, to speak to you and say things like, I feel, I think, I'm sorry, I understand, when it does not do any of those things, right? That is a specific design decision to anthropomorphize the technology. Number one, it alienates us, right, from connecting with this tool as something that somebody has built.
00;05;33;26 - 00;06;09;21
Dr. Rumman Chowdhury
And it makes us fearful of it, because we think that it is this big, scary, superintelligent thing. But then, also importantly for these companies, when, you know, an AI system goes wrong, they can conveniently say, and all the headlines say, "AI model erases company database," versus saying "this product failed," which is how it would be stated in just about any other use case. If you had, you know, a server and it caught fire, you wouldn't say "server deletes data by spontaneously combusting." It sounds so dumb.
00;06;09;25 - 00;06;36;27
Dr. Rumman Chowdhury
But that's how we talk about AI agents that take action, that have done things. Again, we blame this technology that is simply executing a command, and maybe the command is poorly specified, or we haven't been able to, like, ring-fence bad decisions. And again, Anthropic is particularly guilty of saying things like "intent manipulation." You know, they have set up an entire team to look at the welfare of the AI itself, which is mind-boggling to me.
00;06;36;27 - 00;06;40;04
Dr. Rumman Chowdhury
And all of this is theater. It's theater.
00;06;40;06 - 00;06;58;28
Geoff Nielson
Yeah. Well, and it's interesting that Anthropic is doing it, you know, of all people. And as you said, I mean, at least from where we're sitting right now, I think the good guy moniker probably does apply more to them than to everybody else. But why is that? And the question isn't why the good guy moniker applies.
00;06;59;00 - 00;07;21;21
Geoff Nielson
Why are they doing all these performative actions around AI? And I guess, you know, I had a slightly different perspective, because when I see all this anthropomorphization of AI, to me the motive is extremely clear: it's to influence people's behavior, make it more engaging, and try to get people using the technology for longer.
00;07;21;21 - 00;07;25;26
Geoff Nielson
So why are the good guys falling into this trap? The good guys.
00;07;25;26 - 00;07;40;25
Dr. Rumman Chowdhury
Yeah. I mean, quote unquote good guys, right? Which, I think, serves them very well. Which goes to kind of the second thing that I think is the most dangerous thing being pushed, and it's actually a book project that I'm working on and something I've just gotten really interested in, which is this idea of intelligence.
00;07;40;25 - 00;08;01;02
Dr. Rumman Chowdhury
Right. So they want us to believe that this thing exhibits signs of sentience and will, according to their words; also that it is smarter and better than us and all of the things, and will take over everything that we are doing. And really, what is hidden under that narrative is, you know, the slippery-slope definition of AGI.
00;08;01;06 - 00;08;18;29
Dr. Rumman Chowdhury
If you ask the average person on the street what they think artificial general intelligence is, they'll probably point to a movie like Terminator or Her and be like, oh my God, artificial general intelligence is this AI system that's able to interact just like a person. But then, for those of us closer to it, and I think you've seen this too, there's been a slippery slope of what that's been defined as.
00;08;19;06 - 00;08;53;09
Dr. Rumman Chowdhury
And now it is only defined in economic terms; like, OpenAI was calling it the automation of all tasks of economic value, right? And why is it defined that way? Because they want to make money. So it hides the profiteering perspective, and it makes them seem like they're pursuing this noble mission for humanity and humanity's growth, rather than saying, oh no, we're just trying to automate the work people are doing so we can further consolidate wealth and power amongst ourselves.
00;08;53;12 - 00;09;20;17
Geoff Nielson
If you work in IT, Infotech Research Group is a name you need to know. No matter what your needs are, Infotech has you covered. AI strategy? Covered. Disaster recovery? Covered. Vendor negotiation? Covered. Infotech supports you with best-practice research and a team of analysts standing by, ready to help you tackle your toughest challenges. Check it out at the link below, and don't forget to like and subscribe!
00;09;20;19 - 00;09;38;28
Geoff Nielson
Maybe I'm just closer to it, but to me it's extremely obvious that the motive is exclusively economic, right? It's just: how can you own the platform underneath what everybody is doing? You know, why would you hire employees when you could hire this employee replacement that we own? But see, it's the...
00;09;38;28 - 00;10;11;24
Dr. Rumman Chowdhury
Same, right? And there's no end to them. It's like there's no messiness, right? There's no human messiness involved. Yeah. Silicon Valley is such a weird place where, like, tech people hate humanity. They hate dealing with people. And some of it's, you know, kind of funny. I used to joke that, especially in the 2000s and the 2010s, a lot of the startups that were built were kind of built to avoid dealing with messy human things: like, I don't want to cook food, I'm going to get delivery.
00;10;11;24 - 00;10;30;14
Dr. Rumman Chowdhury
I don't want to drive myself somewhere. I don't want to do my laundry. So, like, that kind of startup, automating the mundane human things. It's even in the language, right? Currently biohacking is sort of the big thing, it's a big thing kind of everywhere, but transhumanism, it all originates out of Silicon Valley. And, like, what is it?
00;10;30;15 - 00;10;52;06
Dr. Rumman Chowdhury
What is it specifically saying? Right. It's specifically saying that your human functions, like sleep as an example, aging, or just being tired, these are inefficiencies. Like, it's bad to be a human, and can you get rid of that? So a lot of this, and I'm not saying it's 100% of it, this push to create an AI workforce, is like, God, I don't want to deal with, like, pregnant women.
00;10;52;06 - 00;11;17;28
Dr. Rumman Chowdhury
And someone's got to pick up their kids, and, like, somebody has a cold. Like, it's annoying, right? People's feelings are annoying. You know, rather than seeing a lot of human messiness as a way of bringing value, which, by the way, it is, right? Institutional knowledge is a thing. There's a reason why you can't just fire someone who's been at a company for 20 years, replace them with a kid fresh out of a PhD program, and assume they're the same and it's going to be the same.
00;11;17;28 - 00;11;21;13
Dr. Rumman Chowdhury
And we're seeing the same for agents as well.
00;11;21;15 - 00;11;57;24
Geoff Nielson
So I want to push on all of that a little bit. And by the way, I agree with everything you just said. The piece I want to push on is there's a lot of emphasis there on the supplier side, or the provider side, of all these tools versus the consumer side. And one of the things that's, you know, troubling, troubling is the word I'll choose, is that there does seem to be a lot of widespread adoption, both of AI but also, you know, when you talk about all these friction-reducing apps that take away, you know, the humanity and things, they've seen widespread adoption, right?
00;11;57;24 - 00;12;12;24
Geoff Nielson
Like, that demand is there. And so I'm curious what onus, if any, you put on, you know, the consumer of these. And, you know, I don't know, what does that ecosystem look like for you in a better world?
00;12;12;24 - 00;12;27;27
Dr. Rumman Chowdhury
Yeah. I love that you're asking that question, because I think a failure state is saying, like, the company has to do all of the things. And because they push back and say, well, the market doesn't say that, right? Like Zuckerberg's thing, they call it the privacy paradox. I think it was coined by him, or by Meta at least.
00;12;28;00 - 00;12;47;04
Dr. Rumman Chowdhury
Where they're like, okay, well, all of you advocates say people want privacy, but I overwhelmingly see that when I try to get people to use privacy settings, they don't want them. They just want to be like, yes, permissions, moving on. Right. And so there's two things. One is, I actually think younger generations are building in friction because they kind of want it.
00;12;47;04 - 00;13;14;07
Dr. Rumman Chowdhury
I think we are starting to kind of come full circle, and I can just anecdotally tell you about a lot of, like, TikTok influencers, etc. I've seen their methods shifting over time from volume and easy access to being more curated. So think of this as the difference between how a lot of people are moving off of major social media platforms and moving into lots of, like, Signal groups or Telegram groups or WhatsApp groups.
00;13;14;10 - 00;13;29;12
Dr. Rumman Chowdhury
And that is more friction, right? I can sit there and passively just scroll and look at stuff that's, you know, feeding me junk food, or I can be in, like, 30 different Signal and WhatsApp groups, which is way messier but more rich. And I think we are kind of seeing people understand the value of that.
00;13;29;16 - 00;13;42;21
Dr. Rumman Chowdhury
But to your point, traditionally, people just want things to make their lives easier in many cases. And I do think there's an onus on the consumer. I think many of us have been saying for years and years and years, you know, protect your privacy, you never know what your data is going to be used for.
00;13;42;25 - 00;14;01;00
Dr. Rumman Chowdhury
And the pushback was always, I have nothing to hide. I think, you know, we're in the finding-out phase of all of that, where it's like, oh, you thought you had nothing to hide when you were playing Pokemon Go. But guess what? What you did when you played Pokemon Go is now being used to generate the surveillance state, and it is a clear line, not an abstraction.
00;14;01;00 - 00;14;20;18
Dr. Rumman Chowdhury
So I think, you know, people who have been hearing this story for years, for whom, again, saying you should protect your data was an abstraction, are now seeing how their data is being used, because the lines are being drawn. I'm curious to see, you know, what sort of consumer protections pop up. Which, by the way, I'll also add one more thing.
00;14;20;24 - 00;14;45;12
Dr. Rumman Chowdhury
That's also why they're all trying to consolidate their power, right? They are trying to own the horizontal. You know, like Sam Altman funding Worldcoin: that was not born out of some, you know, good-for-humanity impulse. It's because he realized that the technology that he has helped usher in will completely erode trust.
00;14;45;14 - 00;15;03;28
Dr. Rumman Chowdhury
Because the ability to create realistic deepfakes will get to the point where you will need biometric identification. So the idea of Worldcoin, of making people scan their biometrics, is him collecting a database, so that when he needs to own the privacy part of things, he gets to own that too. It's a horizontal that he's trying to build.
00;15;04;00 - 00;15;08;05
Geoff Nielson
Right? He gets to profit off solving the problem that he himself...
00;15;08;07 - 00;15;31;07
Dr. Rumman Chowdhury
That he created, right? Like, coming and going. But also the investment in data centers, the investment in minerals. One of the theories, and I don't know how valid this theory is, about, you know, trying to get Greenland, was that there were specific rare minerals to be mined there that a bunch of the tech CEOs had invested in, via a startup that was specifically going to be based there.
00;15;31;07 - 00;15;40;25
Dr. Rumman Chowdhury
So again, I don't want to go into, like, red-string conspiracy boards, but it's really hard not to in this space, because many of our red strings are proven to be true.
00;15;40;27 - 00;15;49;16
Geoff Nielson
Yeah. Well, and, you know, I've said in a lot of these conversations, we're not a political podcast, we're a technology podcast. But the line seems to get blurrier and blurrier these days. You...
00;15;49;16 - 00;16;05;10
Dr. Rumman Chowdhury
Can't, right? Like, you truly cannot. You know, one of our most infamous CEOs, Elon Musk, held a pretty prominent position in the current administration. We know they don't divorce technology and politics. Why should we?
00;16;05;13 - 00;16;42;24
Geoff Nielson
Yeah, I think that's well said. So I want to come back to the consumer side, and people maybe not appreciating that, as you said, these are not abstractions. A lot of the decisions that we as consumers are making are creating the society that we have and that we're going to have. And so, if I can frame it up this way for you: if you were going to have a message to consumers saying, basically, you know, wake up, this is what you need to know, what would be kind of your, you know, pithy message around what you want people to be focusing on right now?
00;16;42;26 - 00;16;50;08
Dr. Rumman Chowdhury
And, gosh, that's a good question.
00;16;50;10 - 00;17;15;02
Dr. Rumman Chowdhury
You know, maybe it's just something very basic about app hygiene and data privacy. Like, just go into your phone, go into all of your settings, and just maximize all of your protections. You know, if you can afford it, get a VPN service. You know, like, in our house we have something called a Pi-hole, which is a Raspberry Pi tool that blocks all the ads coming in.
00;17;15;03 - 00;17;35;16
Dr. Rumman Chowdhury
Like, if you have the capability, just do it. I guess my pithy way of saying it is: explore the world of products, tools, and services that are now exploding around protecting your privacy, security, and data. Because there actually are a surprising number of things that have come up, especially in the last few years.
00;17;35;23 - 00;17;59;23
Dr. Rumman Chowdhury
Some of it requires a bit of tech skill. A lot of it does not. You just have to search. It's not going to be number one in the App Store, because these are not massive, big-funded companies. It's literally, like, two friends who felt passionate about data protection, literally, you know, or it comes out of the Raspberry Pi community, which is largely open source and, you know, built on subreddits.
00;17;59;23 - 00;18;09;23
Dr. Rumman Chowdhury
So I guess I'd say that: explore the world of things that have popped up to help you protect your privacy. It is on you, but you don't have to do it alone.
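[Editor's note: the Pi-hole mentioned above is a real open-source project that works as a DNS sinkhole: devices on your network send their domain lookups to it, and it answers queries for known ad and tracker domains with a null address so those requests go nowhere. A minimal sketch of the idea; the blocklist entries and upstream records below are made up for illustration, and the real Pi-hole runs a full DNS server with community-maintained blocklists.]

```python
# Toy illustration of DNS sinkholing, the technique behind Pi-hole.
# The blocklist entries and upstream records are made-up examples.

BLOCKLIST = {"ads.example.com", "tracker.example.net"}

def resolve(domain: str, upstream: dict) -> str:
    """Return a null address for blocked domains; otherwise consult 'upstream'."""
    if domain in BLOCKLIST:
        return "0.0.0.0"  # sinkhole: the ad/tracker request connects to nothing
    return upstream.get(domain, "NXDOMAIN")  # normal lookup (or no such domain)

upstream_records = {"example.org": "93.184.216.34"}
print(resolve("ads.example.com", upstream_records))  # blocked
print(resolve("example.org", upstream_records))      # resolves normally
```

Because the blocking happens at the DNS layer, every device on the network benefits at once, with no per-app configuration.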
00;18;09;25 - 00;18;30;07
Geoff Nielson
So so just along those lines, I want to go back to kind of painting the picture of the importance of this. And, you know, you made an offhand comment about, you know, building the surveillance state. But I want to, you know, just kind of, as I said, paint the picture of why is this so important? Why is it so important to protect your privacy and your data?
00;18;30;11 - 00;18;39;16
Geoff Nielson
What are some of the, you know, malicious things that these organizations can be doing with it? And how do you see it kind of playing out for people in a negative way?
00;18;39;19 - 00;19;05;09
Dr. Rumman Chowdhury
Yeah. And again, I think people have already seen so many examples play out. I was reminded of one of the very first talks I ever gave, back in, like, 2017, about sort of AI and ethics. And it was at SurveyMonkey, and this was in Atlanta. And I don't know how we got here in the Q&A, but somebody mentioned something like 23andMe, and I'm like, oh God, never use those, you know. And this is, again, 2017.
00;19;05;09 - 00;19;26;07
Dr. Rumman Chowdhury
Fast forward some years later: they're bankrupt, and they were bought by private equity. And who knows how private equity is now going to use people's genetic data, right? And even during their existence they did, you know, questionable things, like make a Spotify playlist based on your DNA. Truly, they did. I'm like, what? So they were trying to market it.
00;19;26;07 - 00;19;49;27
Dr. Rumman Chowdhury
Right. So we have seen this. And the latest is, you know, people pointing out, there are these TikTok trends. Sorry to be so chronically online, but it's interesting. I will take a step back and say, one of the most interesting things to me is that, you know, a lot of this narrative is being pushed by regular people to regular people.
00;19;49;27 - 00;20;12;10
Dr. Rumman Chowdhury
What I love is that it is not coming from me, the quote unquote expert, lecturing people from, you know, my space of, like, I do this all day, every day. What I love is seeing just regular people saying, you know, this seems suspicious. So I'm so glad to see that, because they're stating it in a way where it's like, hey, I'm a normal guy, and I'm telling you, a normal person, to do this, right?
00;20;12;13 - 00;20;39;08
Dr. Rumman Chowdhury
There are so many memes that are like, you know, "you ten years ago" or something like that, and people have pointed out that that can be used to train databases, whether it is or it isn't. But, you know, the one I alluded to earlier, very specifically: many of us, myself included, in a time so long ago that in our heads it seems like a different millennium, played Pokemon Go. And it was actually a very beautiful thing.
00;20;39;08 - 00;21;05;07
Dr. Rumman Chowdhury
I loved Pokemon Go. What I loved was seeing how many families were out, and kids were, you know, engaging with people in this very friendly way. And, you know, even I would have said, oh, it's just a game, I used it, well, hey, cool. Now we find out that Niantic has sold all of that data. That data is being used to literally, you know, map out surveillance in the United States.
00;21;05;07 - 00;21;27;22
Dr. Rumman Chowdhury
So you played an innocent game. I played this innocent game in, like, 2016. That data lives forever and ever. And over a decade later, it's being used, with a technology that didn't exist at the time, for an incomprehensible evil. And those are not exaggerated words.
00;21;27;24 - 00;21;43;22
Geoff Nielson
It's such a sad development, because I have the same kind of, you know, fond memories as you do. Like, I was out there, yeah, on the go, catching Pokemon. And in some ways it was like a golden age. It just felt so carefree, you know?
00;21;43;22 - 00;21;45;14
Dr. Rumman Chowdhury
And to them, yes.
00;21;45;16 - 00;22;03;01
Geoff Nielson
Yeah, yeah. Weaponized, like, I was just going to say, the fact that it's been, like, weaponized is really, really depressing. But I want to come back to these TikTok trends you're talking about, and specifically I want to ask you about TikTok, because TikTok is not a neutral player in this, right?
00;22;03;01 - 00;22;27;00
Geoff Nielson
They have an algorithm. They push content. And that's not even to get into the geopolitics of TikTok. But I'm curious about your posture around a tool like TikTok, whether it's TikTok or, you know, some of its competitors. Do you recommend people use it? Should they not use it? Can they use it ethically, or in an informed way?
00;22;27;00 - 00;22;29;12
Geoff Nielson
What should our relationship be with some of these tools?
00;22;29;12 - 00;22;50;04
Dr. Rumman Chowdhury
Yeah, yeah. And I think what you're pointing at is, like, the exact manifestation of just lacking agency. So I struggle in general with my relationship with social media, like, in general. So, you know, back in the beautiful days of Twitter, I was just hyper online. The entire field of responsible AI was built by people being snarky on Twitter.
00;22;50;04 - 00;23;07;07
Dr. Rumman Chowdhury
That's how we all met each other. And not just snarky: that's how we read each other's papers. It's how we interacted, right? Like, we were all there. There were not a lot of us that did this work. You know, when I first got started in 2017, and there's still not even a lot of us, everyone's everywhere. It was beautiful.
00;23;07;10 - 00;23;37;29
Dr. Rumman Chowdhury
And I moved from being hyper online, or hyper on Twitter, to just not having a social media presence at all. I don't even really post on LinkedIn. But the reality is, so much of the public discourse does happen on social media platforms. You know, at the same time, I do see how, again, I've been pretty much offline, other than, like, very short stints on LinkedIn or playing around with TikTok, since, I mean, my team
00;23;37;29 - 00;23;55;12
Dr. Rumman Chowdhury
and I all got fired. For me, it was a principled stance: I'm not going to be on X. Why would I be there and, you know, have my data and my attention support this platform? You know, so I am of two minds. It's hard for me to answer your question, because, like, you're hearing my struggle in real time, right?
00;23;55;14 - 00;24;14;10
Dr. Rumman Chowdhury
Where I'm like, okay, there is this need, as a professional, especially a public professional, to be in these environments, to be on these platforms. On the other hand, we also know, just to bring in another narrative, that a lot of the news media is now captured by billionaires, and we actually consider a lot of media to be untrustworthy.
00;24;14;17 - 00;24;43;04
Dr. Rumman Chowdhury
And social media has traditionally proven to be a place where you can try to find more objective sources, right? So it's, like, all of these moving parts together. One is lack of trust in centralized media institutions; not that social media isn't that, but it's another version of that, right? So there's that. There is what we know to be this necessity of being online, to just be aware of what's happening in the world, or maybe even engage with your community of practice.
00;24;43;11 - 00;25;07;07
Dr. Rumman Chowdhury
And then there is this, like, evil that we know exists, right? And how do you reconcile that? I wish I had an answer, because I don't. Like, I wish I could just say it is unethical to use these platforms. Well, I can say that, but then I cannot in good faith say, therefore, don't use any of them ever. Because I do think it's fair to say that, you know, if you're a professional, for example, my community now exists on LinkedIn.
00;25;07;07 - 00;25;24;28
Dr. Rumman Chowdhury
And I don't really go on LinkedIn very much. Is that detrimental to me? Maybe. I don't know, because I can't measure the opportunity cost, right? I have no idea what the opportunity cost is. If I was chronically posting, you know, would I be interacting with more people? Would I have more knowledge? Would more consulting or speaking opportunities come my way?
00;25;24;28 - 00;25;40;01
Dr. Rumman Chowdhury
Maybe. Probably, right? I don't know. So yeah, it's kind of a convoluted answer to your question, because, like I said, you're hearing in real time what goes on in my head constantly whenever someone sends me a LinkedIn post or whatever.
00;25;40;04 - 00;26;00;16
Geoff Nielson
Well, and what I did hear, and what it made me reflect on, is just how much of a bummer it is that a lot of these platforms used to be a place for independent discourse, not owned by, you know, Megacorp, and they've been absorbed. And it feels like, you know, I'd like to think a few years from now we'll have some next wave somewhere else where we can have these discussions.
00;26;00;22 - 00;26;05;11
Geoff Nielson
That's independent again, because it just feels like we're in between and it's been usurped from us. Right?
00;26;05;13 - 00;26;40;08
Dr. Rumman Chowdhury
I hope so. And actually, I think there is a way that, you know, one can use AI tools in that way. So, like, I've been playing with Perplexity's Comet lately. And I would say the joy that it brings me is very similar to when I was in high school and the internet was, like, kind of this thing we were all learning. And, like, if you remember who you were at the time the internet kind of became a thing, and if you were interested in it, you're like, wow, I can learn anything.
00;26;40;15 - 00;27;00;11
Dr Romman Chowdhury
I can meet anyone. Like, as like, suspicious as all of that is. Right. But there is a genuine, like, purity to it, like you said, and also like a joy to knowing that, like information access at your fingertips. And, you know, I have felt like that playing with some of the agen tech I apps and I've done I like spent a couple of days.
00;27;00;11 - 00;27;32;27
Dr. Rumman Chowdhury
Just setting up dumb things, right? Like, send me a daily email with all of the ebooks on sale for Kindle in the genres that I like. And that has worked pretty well; so much so that I probably spend more on ebooks now, because that thing is actually particularly good. But there is this way of using these tools where it's not about them telling you what to think or feel or what you need, but about what we are then building, which is, again, kind of the mindset of the early internet.
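The "daily Kindle deals email" automation Dr. Chowdhury describes can be sketched as a plain script, no agent framework required. Everything below (the deals list standing in for whatever feed an agent would fetch, the field names, the genres) is invented for illustration and is not Perplexity's actual tooling:

```python
# Hypothetical sketch of a "daily ebook deals digest" automation.
# The deals list stands in for whatever source an agentic tool would
# fetch; all field names and genres here are illustrative assumptions.

PREFERRED_GENRES = {"science fiction", "history"}

def build_digest(deals, genres=PREFERRED_GENRES):
    """Filter deals to preferred genres and format a plain-text email body."""
    picks = [d for d in deals if d["genre"].lower() in genres]
    if not picks:
        return "No ebook deals in your genres today."
    lines = [f'- {d["title"]} ({d["genre"]}): ${d["price"]:.2f}' for d in picks]
    return "Today's ebook deals:\n" + "\n".join(lines)

# A real version would fetch the deals, then send this body on a daily
# schedule (e.g. via smtplib plus cron); that plumbing is the part an
# agentic tool automates for you.
```

The point of the sketch is the one she makes: the user decides the task (filter by my genres, email me daily), and the tool just executes it.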
00;27;32;27 - 00;27;55;07
Dr. Rumman Chowdhury
I think the difference here is the internet was free for us. Now we have to pay for tokens, and that's the part where it's like, oh, but you have actually robbed us of a public good. That's how it's different. But the mindset and the feeling I have is similar. Again, though, it's another very difficult thing for me to reconcile, not knowing.
00;27;55;07 - 00;28;20;26
Geoff Nielson
And you know, I appreciate that it's not clean, and it's not as simple as just don't use these tools. Given your perspective and your experience, I'm just kind of reflecting on what you were saying. And to me, one of the other differences in my mind, and I'm curious on your thoughts, is that the internet was, how do I want to frame this?
00;28;20;26 - 00;28;44;08
Geoff Nielson
Like the internet was more neutral in the sense that you could go on it and you really had full agency or close to it of what you're looking for. And these platforms and these algorithms in some way, take that agency away from you because they are pushing you towards specific experiences they want you to consume in a certain pattern, they want you to consume certain stuff.
00;28;44;15 - 00;28;52;00
Geoff Nielson
And so it feels like you have to be a lot more intentional about how you use these tools. If you're going to maintain that agency.
00;28;52;03 - 00;29;12;18
Dr. Rumman Chowdhury
Yeah. And to some extent it's exhausting, because, to your point, you constantly have to pay attention to, am I being manipulated in some way, right? Is this information real? And I think an analog way of thinking about it, or a semi-analog way, is how we don't just look at the New York Times and Washington Post and say, oh, that's the news.
00;29;12;18 - 00;29;42;27
Dr. Rumman Chowdhury
We're like, oh, that's the thing that Bezos owns, so of course they feel like this about this, and let me go online and find out if three other sources are talking about it. We now have a lot more work we have to do; we cannot approach it as innocently. And to your point, one of the things that came up in discussion amongst Twitter leadership, and I can't really name names, was the sentiment that it was unfair that social media companies were under all of this scrutiny to do content moderation, because we never content-moderated the internet.
00;29;42;27 - 00;30;04;03
Dr. Rumman Chowdhury
Just to give you an example: Nazis are allowed to have websites, but Nazis are supposed to be banned on social media. Not to say that anybody wants Nazis, but the rules seem to have applied differently. Again, I think it's because the internet was born as this tool of free access to information that nobody owned and nobody paid for. Or at least, people do pay for it.
00;30;04;03 - 00;30;12;08
Dr. Rumman Chowdhury
But not in the way that we are directly putting dollars into specifically accessing information.
00;30;12;11 - 00;30;35;08
Geoff Nielson
I wanted to go back to this experience you had at Twitter. For listeners who don't know, you were a leader on the Machine Learning Ethics, Transparency and Accountability team at Twitter. And I'll ask you this in a deliberately broad way, but can you tell me a little bit about your reflections on that time, what you were trying to achieve, and what it taught you?
00;30;35;10 - 00;31;04;09
Dr. Rumman Chowdhury
Yeah. So it's worth also thinking through, structurally, where I was in the company. There have been many teams that do this kind of work, sort of infamously or famously, at places like Google; there are still some at Microsoft, etc. My team at Twitter, at least for its time, was very unique. I was an engineering director and I sat on a team called Cortex, which, if you know the structure of Twitter, is basically where all of the machine learning and AI services were offered.
00;31;04;09 - 00;31;24;17
Dr. Rumman Chowdhury
So there were teams that owned products; a product could be, like, who to follow, right? And the core tools that they used to build who to follow came from the team that I was on. Why is that important? It's important, and in some ways it's why I took the job: because I was in the room with the people building it, so I didn't have to ask permission.
00;31;24;23 - 00;31;43;11
Dr. Rumman Chowdhury
My colleagues and my peers were not, you know, policy. And there's nothing wrong with policy, etc., but you're not in product. And to me, if you're in product, if you're in that room, then you have access and privileges that people on other kinds of teams have to fight for, right? If you're a responsible AI team that is a pure research team, you don't own product and you don't influence product.
00;31;43;13 - 00;32;09;06
Dr. Rumman Chowdhury
Or if you want to, you have to fight for it to happen. Whereas I am just in the room; when there is an engineering meeting, I'm just in that room, because these are my peers. That's really important to shaping how the tool is being built from design. So I loved it. The other thing I'll say about Twitter is that somehow it kept that weird, very millennial 2000s startup vibe. It's so corny, but in the best way possible.
00;32;09;10 - 00;32;34;18
Dr. Rumman Chowdhury
It was a very corny company, but I loved it. And one thing I will say about a lot of Twitter employees, most Twitter employees: we knew we had a really difficult task, and we knew we weren't going to get it right. One of the things I loved about Twitter is they owned their mistakes, right? I think that's kind of what Twitter was famous for: when Twitter would go down, they're like, sorry, we done messed up.
00;32;34;20 - 00;33;04;01
Dr. Rumman Chowdhury
And I loved that. Around when I was interviewing was when there was, you know, the citizen data science around potential algorithmic bias in the image cropping algorithm. Many of you may remember this, but basically people realized that the Twitter image cropping algorithm seemed to be cropping out darker-skinned people. And it had already, by then, been demonstrably shown that these models underperform for people who are higher on the Fitzpatrick scale.
00;33;04;01 - 00;33;22;23
Dr. Rumman Chowdhury
So, darker skin tones. Instead of doing what a lot of other companies do, which is hide behind PR narratives, you had Dantley and Parag (Parag was the CTO; Dantley was then head of product, I believe) hopping in and being like, hey guys, what are you seeing? Can you tell us about it? I want to learn. And they were very open.
00;33;22;23 - 00;33;48;09
Dr. Rumman Chowdhury
They were very responsive to criticism, and they promised to do something about it. And then they did, which is so rare in general and so rare these days, more and more so now that there's this elite ruling, godlike class of AI CEOs. I can't imagine these people having the kinds of interactions that Parag and Dantley had, like, four years ago.
00;33;48;09 - 00;34;06;27
Dr. Rumman Chowdhury
I just don't see them doing that in an honest and open way. I could see them snarking; I could see them belittling people on social media. I can't imagine them coming in and earnestly asking about a product, trying to learn how to fix it.
00;34;06;29 - 00;34;30;18
Geoff Nielson
Yeah. Yeah. Sorry, the reason I'm just pausing and reflecting on that is I feel like you can draw an arrow from that statement right back to the start of this conversation, about what's going on in big tech and also about the market (sorry, the market in terms of consumers, not the market in terms of investors; that's a whole other story that we can or cannot talk about).
00;34;30;20 - 00;34;52;03
Geoff Nielson
But this distrust in the products and in the tech comes back to the leadership style. It feels like leaders are deliberately putting up those walls in a way that maybe they didn't ten years ago. And I don't know, is that going to hurt them in the longer term? Is it going to pay off? It's kind of fascinating.
00;34;52;06 - 00;35;17;08
Dr. Rumman Chowdhury
Yeah. I think that sort of dodging and obfuscation does not, in the long term, benefit any CEO. Maybe in the short term it helps them, because they get to avoid problems. You know, one of the best books, I would argue probably the best book, in this space to understand how, like you said, it comes full circle, is Shoshana Zuboff's The Age of Surveillance Capitalism. It's a very big book.
00;35;17;08 - 00;35;45;06
Dr. Rumman Chowdhury
It's literally, like, three inches thick, but you could just read the first chapter. What she does beautifully is outline the strategy and the economic model of Silicon Valley, right? And part of that strategy is playing this waiting game: waiting until we as a public get exhausted with a topic. The example she gives, something I personally had forgotten, is that when Google Maps first came out, people were up in arms about it.
00;35;45;06 - 00;36;05;17
Dr. Rumman Chowdhury
They were really upset. People were protesting, they were stopping the cars, they were building higher fences. And Google did not say or do anything about it. They just kept their mouths shut, and what they waited for was for the momentum to die down. And I don't think any of us question the Google recording cars that we see driving around for Maps anymore.
00;36;05;17 - 00;36;22;20
Dr. Rumman Chowdhury
That really made me pause and reflect on how good that strategy can be. But I think, again, AI is just so different. It's so much less abstract. Especially with generative AI, it's in our hands, it's in our faces. We're seeing it play out in real time. We're seeing it play out in genocide.
00;36;22;20 - 00;36;42;17
Dr. Rumman Chowdhury
We're seeing it play out on the field of war. We're seeing it play out, you know, with protests, right? And again, the technological and the political are the same thing now. I don't think that strategy works anymore. I think people are too smart now. I think there was a lot of naivete that we are now past, and I'm glad.
00;36;42;17 - 00;37;05;09
Dr. Rumman Chowdhury
I'm glad people are not naive about it anymore. But what they have now removed is our ability to make decisions, right? So, one of the many studies that point out how people don't want AI products: I think one consumer study showed that products whose labels said they had AI were purchased 74% less. And then they dropped the label. So now they just drop the label; they don't drop the AI, right?
00;37;05;09 - 00;37;09;18
Dr. Rumman Chowdhury
So like they take the wrong thing away.
00;37;09;20 - 00;37;31;17
Geoff Nielson
So there's a few different avenues I want to take you down, but maybe let's start with us as consumers. If we want to be more ethical about it, if we want to be building a better future here, what do we do? What's our kind of imperative? And how should we be thinking about how we interact with these products?
00;37;31;17 - 00;37;40;14
Geoff Nielson
And, I mean, you said it yourself: it's not clean. It's not necessarily just, you know, throw out your phone.
00;37;40;17 - 00;38;00;10
Dr. Rumman Chowdhury
It's not. And I think those are very trite things to say, and they're often borne of an immense amount of privilege. I realize that I have an immense amount of privilege in being able to just not be online. I was literally thinking about this this morning; I was actually talking to my friend about it. My friend is a doctor, and she said to me, you know, Rumman, I wish I had the energy to go build a brand.
00;38;00;12 - 00;38;23;21
Dr. Rumman Chowdhury
Why does a medical professional need to build a brand? They do now, right? Medical professionals feel that they do, because she is one of a dying breed of, you know, small business owners. She has her own practice, and she now has to think not just about taking care of patients but about building an online brand, because that is what drives patients to your door. Which, by the way, I think is a ridiculous state of affairs.
00;38;23;25 - 00;38;50;25
Dr. Rumman Chowdhury
So I was reflecting on how I am privileged that I don't have to be chronically online, thinking about building a brand in order to feel like I can get ahead. So, what can people do? Which is always a great question. I think number one, and this is one of the things about not being chronically online, even though I'm somewhat online: I use social media as a way of understanding social movements and how people are thinking about things. I see it as an observational tool.
00;38;50;27 - 00;39;08;23
Dr. Rumman Chowdhury
I think one of the things I realized from not being on Twitter all the time is how much we get caught up in, like, local maxima and minima. What I mean specifically is that there are attention cycles that are very short and seem incredibly consequential at the time, that you then realize really didn't matter.
00;39;08;28 - 00;39;37;04
Dr. Rumman Chowdhury
I know there are a lot of stories that I miss, and a lot of main characters online that I miss, but actually it doesn't impact my life. So one is just not being fooled by the moment, the local minima, and seeing the big picture. I will also add that, by the way, a lot of these very aggressive do-it-now-or-else narratives about AI are actually meant to make you not plan long term.
00;39;37;04 - 00;40;14;13
Dr. Rumman Chowdhury
Not think deeply. They're meant to make you run around scared and not be strategic. So my advice in general, but also with this technology: think strategically about what will serve you, and don't make fear-based decisions. What do I mean by that? My intent in using Comet, for example, which is an agentic tool built by Perplexity, was very intentional, and I have been experimenting with it to think about what I want to use it for and not use it for. Versus the hype around OpenClaw a few weeks ago, which was to me insane; people were like, I'm going to give it
00;40;14;13 - 00;40;51;13
Dr. Rumman Chowdhury
access to my bank accounts and it's going to bet on Polymarket for me. I'm like, why don't you build something dumb, like, as I said, a daily email with Kindle recommendations, before you go giving your bank account information to it? Right? You need to figure out what your relationship with this technology will be. So yeah, that's my big advice: think strategically, think about how it serves you, versus it being based on FOMO or fear, or I'm going to lose my job, or whatever other story is being pushed to make us too scared to ask questions.
00;40;51;15 - 00;41;06;09
Geoff Nielson
I really like that. And it ties into something that you said earlier, which, you know, I was a little bit surprised by, but it made me happy, which is you said that when you use some of these tools, there's joy in it for you. Right? Like, you're actually able to use these in a deliberate way and find joy.
00;41;06;09 - 00;41;19;19
Geoff Nielson
So is that basically your advice? The secret to finding joy in this stuff is being strategic, being intentional, and starting with, how do I want to use this? Versus, you know, how am I expected to use this? Maybe.
00;41;19;21 - 00;41;36;05
Dr. Rumman Chowdhury
Yes, I think you framed it absolutely perfectly. And this just goes back to agency, right? So much of what I've done over the past few years, and I would even argue maybe this has been the arc of my career (it was the topic of my TEDx talk, for sure), is: how do we give people agency?
00;41;36;05 - 00;42;04;19
Dr. Rumman Chowdhury
Because with agency, we make choices. When we make choices, we're actually happier with the outcomes. I think part of the dissatisfaction people feel maybe isn't even about how the tool is performing, but the fact that nobody bothered to ask us, nobody bothered to say, do you want this? You know, I don't have kids, but I imagine parents trying to mediate technology, and their children are irritated by the technology, not necessarily because they don't like technology, but because they were given no option.
00;42;04;19 - 00;42;29;17
Dr. Rumman Chowdhury
And now they have all this responsibility for something that they didn't choose to do. That's a good way to think about it, right? It's not that people don't want responsibility; it's that we want to make a choice to have a responsibility. Think of, let's say, a hobby that you have. People who are, let's say, marathon runners will wake up at absolutely wild times and go run for 13 miles.
00;42;29;19 - 00;42;49;14
Dr. Rumman Chowdhury
Why? But if I were to say, hey, you have to wake up for work at 4:30 a.m. tomorrow, they would be irritated. Why? Because they'd say, I did not make that choice. So, to your point, part of me finding joy in using more hands-on agentic tools is that I get to decide what they're being used for.
00;42;49;14 - 00;43;12;11
Dr. Rumman Chowdhury
I get to decide what problem it's solving. And by the way, I'm starting to see that arc with more of the agentic tools that are being built. I think before, they were trying to tell us what we wanted to use them for. A great example, by the way, is calendaring apps, or any sort of AI-assistant-type apps, which really don't do very well because they are so prescriptive.
00;43;12;11 - 00;43;33;26
Dr. Rumman Chowdhury
They make these broad assumptions about what you want. I personally have never found one that works for me, because I travel a lot. So if I had a Calendly, then what happens is I end up with calls at two in the morning, because unless I am constantly going in and updating my times, which is a lot of work for me, it's just not going to work.
00;43;33;26 - 00;43;54;20
Dr. Rumman Chowdhury
They have not thought through things like that, or at least not given someone like myself the tools to do that easily. So again, it's this generic, prescribed use case being shoved on us, and that's what people are rejecting. What does it look like to let go of some of the power and maybe let us do stuff, let us drive the car for a bit?
00;43;54;23 - 00;44;15;09
Geoff Nielson
It's really interesting, and I'm absorbing so much here and thinking about it. Yeah, I'm trying to zoom out and think about what this cycle looks like, the fact that we've got the backlash, we've got this big push by big tech. And the phrase that came to mind, and I haven't thought about this before, but it's, you know, maybe good branding:
00;44;15;13 - 00;44;29;12
Geoff Nielson
It feels like there's a war on agency, like a war on human agency: no, you don't think about it, you let us think about it for you. Which is in direct opposition to us feeling a sense of, you know, joy or pride or accomplishment.
00;44;29;14 - 00;44;30;28
Dr. Rumman Chowdhury
Absolutely.
00;44;31;00 - 00;44;47;26
Geoff Nielson
So if we agree that that's true, well, how optimistic are you that this is getting better versus worse? Where is this going? Do you think we're going to get an inflection point, and it's going to be similar to what you said about groups on Twitter, and people are going to say, no, this is bullshit,
00;44;47;26 - 00;45;06;20
Geoff Nielson
I'm taking back my own agency, and kind of force organizations to come with them? Or is it going to be the opposite, and we as people go more toward that WALL-E future of just, yeah, turn off my brain? Or some secret third option?
00;45;06;22 - 00;45;30;28
Dr. Rumman Chowdhury
Yeah. I mean, I think my answer to that would change on an hourly basis, depending on what fresh nonsense I may have seen online or what's going on. Here's what I want to believe, and actually do believe. And again, it's back to these constant battles I have in my own head.
00;45;30;28 - 00;45;56;19
Dr. Rumman Chowdhury
I constantly wonder if people like me existing in this space is actually a bad thing, right? Because people like me are not tech abolitionists, even if they use the language of tech abolitionism. Right? Because to truly be a tech abolitionist means you just literally disengage from all of it. Often I do wonder what I would do if I completely, actually did throw my phone into the ocean.
00;45;56;19 - 00;46;31;12
Dr. Rumman Chowdhury
That is a lifestyle change I've seriously considered, right? I wonder if the existence of people like myself is futile, because we're giving window dressing when there's a fundamental problem. So the reason I still do this every day is that I believe in the human condition, and that human beings want and need things like agency and ownership, and at some point will fight for them and will make it happen. Whether they're fighting for it with their dollars, whether they're fighting for it in Congress by trying to pass bills; you see more and more young people running
00;46;31;12 - 00;46;56;07
Dr. Rumman Chowdhury
for office, and running specifically on tech platforms. It is very fascinating to me (I live in Texas) that we can have candidates in Texas who run on issues like data privacy, and the population, their constituency, understands what it means. I think it's a great thing. So I do believe. And, you know, the thing is, the wheels of democracy move more slowly than the wheels of autocracy.
00;46;56;07 - 00;47;12;07
Dr. Rumman Chowdhury
Right. So one example I use constantly, actually, is Elon Musk. And, unfortunately for him, that man has actually failed at many, many of the things he has tried to do. Much of the harm he has tried to inflict, he has actually failed at. And the problem is, when he does it, he does it quickly.
00;47;12;07 - 00;47;31;04
Dr. Rumman Chowdhury
He fires half of Twitter, illegally. He, you know, comes into the US government, hires a bunch of babies, and they wreak havoc, and it happens overnight, very fast. Right? But the wheels of democracy do turn, because guess what? He lost the lawsuit against Twitter employees. He also lost a lawsuit against Tesla employees, who had done something very, very similar.
00;47;31;06 - 00;47;56;10
Dr. Rumman Chowdhury
You know, a judge just ruled that many of the actions DOGE took were incorrect and not actually permissible, because they were never congressionally approved. The problem is that took two years to happen, right? The Twitter lawsuit took three years to happen. The wheels do actually move, slowly. So maybe the answer is not in this social-media-driven, short-attention-span world; back to the meta picture and the big-picture strategy.
00;47;56;10 - 00;48;17;16
Dr. Rumman Chowdhury
If you're paying attention to the big picture, you'll see that a lot of these things are not particularly successful over time; in the immediate, they seem very successful. So yeah, I think that people... and it's not even that I want to believe. I need to believe, right? I literally need to believe, because that is what keeps me going at this job.
00;48;17;18 - 00;48;21;07
Dr. Rumman Chowdhury
I need to believe that people are going to do the right thing.
00;48;21;10 - 00;48;48;26
Geoff Nielson
Well, you know, your case for optimism is pretty compelling. And I like your point that it requires zooming out sometimes, and it requires getting beyond the media cycle. And whether the media are complicit in this is a whole other conversation. I should acknowledge, again, that even us, by having this conversation, we are in some way part of that landscape.
00;48;48;29 - 00;49;02;01
Geoff Nielson
But it's a compelling case for optimism: that there are just the stories you don't hear, and what that means in terms of safeguards for what we want in our society.
00;49;02;03 - 00;49;34;13
Dr. Rumman Chowdhury
Yeah. And again, what I have seen in the almost ten years I've been at this job is people becoming smarter and demanding more and demanding better. Just anecdotally (this was before I even worked in responsible AI; the field didn't even exist), I remember a while ago Google had done this thing in Mountain View where they took these mosquitoes and had synthesized something to help prevent, I want to say, West Nile virus.
00;49;34;15 - 00;49;55;25
Dr. Rumman Chowdhury
And they just sort of released these mosquitoes after injecting them with this thing. And at the time, this was, like, peak tech optimism, and everyone's like, wow, Google, amazing. And I'm like, did this get FDA approval? You know, just thinking it through. But again, the predominant narrative was just so optimistic: oh my God, Google,
00;49;55;25 - 00;50;16;29
Dr. Rumman Chowdhury
they're going to cure West Nile virus by stopping the mosquitoes. I don't think that narrative would fly today. Right? I think today people would be like, excuse me, why is Google doing biological experiments on people? You could not ask those questions ten years ago; well, more than ten years ago, sorry, I'm dating myself.
00;50;17;01 - 00;50;38;04
Dr. Rumman Chowdhury
That you can ask now, I love. I am always happy to see citizen movements. There is this article by Rebecca Solnit that I absolutely love; it's actually, I suspect, one of her least-known articles. And I love reading Rebecca Solnit because she straddles that kind of critique and optimism that I think we need.
00;50;38;07 - 00;51;03;00
Dr. Rumman Chowdhury
The article is called "When the Hero Is the Problem," and its purpose is to talk about how we really want this story of the individual hero. All of Silicon Valley is built on this: the child genius who dropped out of Harvard and single-handedly builds whatever, right? And what she points out is that, actually, most progressive movements, most positive-for-humanity movements, were built by collectives.
00;51;03;02 - 00;51;22;23
Dr. Rumman Chowdhury
And what she reflects on is how hard it is for her to pitch a story or write a book, because we as a society are so enamored of the hero. But, she says, there is no one person that solves a problem; collectives solve problems. And I like to think about that when I think about what it looks like to push back against centralization of power.
00;51;22;23 - 00;51;41;17
Dr. Rumman Chowdhury
It is a collective movement. So we're not going to have a hero; we're not going to have a single person. What we'll have is a lot of people just getting really fed up and stopping, and maybe they'll stop in their own little ways. But that will mean something when it's all summed up and put together.
00;51;41;20 - 00;52;02;05
Geoff Nielson
So I'm going to ask you a question that might be unfair, so feel free to answer it as you see fit. But in this landscape of AI and consolidation of power and big tech and, you know, collective movements and responses to this stuff, what is the role of the AI ethicist, or of responsible AI?
00;52;02;07 - 00;52;19;21
Geoff Nielson
Where do you see yourself and your mission fitting into this broader picture? How much of it is with consumers or organizations? And how do you, you know, roll that boulder uphill, or something slightly more positive, to make sure that we're contributing for good here?
00;52;19;23 - 00;52;31;15
Dr. Rumman Chowdhury
Yeah. I mean, I don't think that's an unfair question; I think it's a great question. And I think there are sort of two questions in your question. One is, what is the AI ethicist's role? And the second is, what do I see as my role? Those are two different things.
00;52;31;20 - 00;52;49;11
Dr. Rumman Chowdhury
I think about the second one constantly. What is the AI ethicist? Well, people have fallen into different categories, right? A lot of people fall in the space where it's about informing people. And I do think there is still a role for constantly informing.
00;52;49;14 - 00;53;19;03
Dr. Rumman Chowdhury
I think informing can be a double-edged sword. Critique without any path forward is actually alienating and disempowering, and I think there are people in the responsible AI community who I wish would learn that. I'm trying to be very careful with my words here, but yes, we should be raising awareness; people just cannot walk away feeling hopeless.
00;53;19;03 - 00;53;40;21
Dr. Rumman Chowdhury
Right. And there are people who do that amazingly well. Karen Hao just comes to mind immediately. She, as a journalist, comes into this as somebody who is good at telling a story, explaining things. That's why Empire of AI is so powerful. What's great is you don't leave that book feeling disempowered; she focuses on positive movement.
00;53;40;21 - 00;54;04;18
Dr. Rumman Chowdhury
So that's a good example of awareness-raising. Second, there are the builders; I put myself in the builders category. And builders, by the way, are not just tech people. There are a lot of lawyers who are builders. One thing I love is seeing this legal community popping up, and if I could go back in time, one degree I would maybe think of getting would be a law degree, because I think tech law is one of the most fascinating places to be.
00;54;04;21 - 00;54;26;27
Dr. Rumman Chowdhury
There's so much ground to cover with rights and protections, and being informed and capable in this space is one of the most powerful tools you can have. So I love seeing privacy professionals and legal tech people popping up to say, these are your rights, or to advocate for rights. The third group of people within the builders are, you know, auditors and people who are making tools.
00;54;26;27 - 00;54;48;20
Dr. Rumman Chowdhury
Right? And that is also a very, very powerful space to be in. So now we're on to what I, Rumman, see myself doing. One of the purposes of Humane Intelligence, the nonprofit, was to build a community of practice of independent evaluators. And one of the problems that I want to tackle, that I've been tackling for the past few years, is, again, how do we get back to agency?
00;54;48;20 - 00;55;10;13
Dr. Rumman Chowdhury
How do we get people who are lived-experience experts, or experts who are not influenced by tech companies, i.e. literally paid on the tech companies' dollar, to do this work? And not just be interested in this work, but be legally protected and certified, so that it's viable as a profession.
00;55;10;13 - 00;55;28;00
Dr. Rumman Chowdhury
Professionals don't fall out of the sky. They happen because there are certain things that enable them. So what I'm working on now with the public benefit corporation of Humane Intelligence is the infrastructure to do that work. Let's say you're interested in being an evaluator; maybe you even have some consulting work.
00;55;28;05 - 00;55;57;26
Dr. Rumman Chowdhury
How do you do this work efficiently? How do you do it well? I see those as problems to tackle. The other part I'll add is that there's so much symbiosis between all of these people. The tech law people know that they're not technologists, so they'll go to people like me and say, hey, if we're trying to write a law that says your model should be audited, what can and can't we ask companies to give and do? So it sounds like we're all collaborating.
00;55;57;29 - 00;56;25;23
Dr. Rumman Chowdhury
So it's such a deep space, and I haven't even touched on cybersecurity, which also intersects with this space immensely, and which I would argue is probably one of the most lucrative and future-proof fields to go into. The other point I'll add, by the way, with all of the hysteria and concern at the moment about the future of work: I see all of these jobs, ranging from cybersecurity all the way to informing people, as AI-proof jobs.
00;56;25;23 - 00;56;41;00
Dr. Rumman Chowdhury
AI cannot come in and do this work, because it fundamentally needs human judgment. It needs collaboration, it needs synthesis, it needs historical understanding. It needs so much that AI literally cannot do.
00;56;41;03 - 00;56;57;07
Geoff Nielson
There are so many threads I want to pull on there, so thank you, Rumman, for that comprehensive answer. But we kind of backed into the future of work there. You know, cybersecurity is kind of interesting in some ways. I work with a lot of IT professionals, and some aspects of cybersecurity.
00;56;57;07 - 00;57;36;12
Geoff Nielson
They're actually among the first to be automated, because you're doing automated threat detection, which is very different from the judgment piece of designing cybersecurity work. But maybe more broadly, from your perspective, can you give me your view on the future of work, and what it's going to look like over the next handful of years as some of these AI-powered tools, I was going to say infiltrate more workplaces, maybe that's too judgmental a word, but as we start to rewire organizations with AI?
00;57;36;14 - 00;57;55;10
Dr. Rumman Chowdhury
Yeah. So I actually just did a show called Open to Debate on this topic specifically. It's something I've been thinking a lot about, and it's sparked me to think more about even the idea of human intelligence, and what it means to be an intelligent person. I find it fascinating from a philosophical perspective and a practical perspective.
00;57;55;10 - 00;58;15;18
Dr. Rumman Chowdhury
To go into the practical one: so much of this is the hype cycle at play, and the hype cycle is meant to be disempowering, right? Popping in and saying all jobs will be automated in the next 18 months is such a ridiculous and unfounded thing to say. But then why say it?
00;58;15;18 - 00;58;36;09
Dr. Rumman Chowdhury
Because you cannot adjust your life to something that is going to completely decimate it. It's as if somebody pointed out that a meteor is about to hit planet Earth in the next day and we're all going to die: there's nothing for you to do, right? You cannot plan and execute on a time frame that short.
00;58;36;11 - 00;58;54;08
Dr. Rumman Chowdhury
But that's not the reality. And again, this is why getting caught up in the local maxima and minima is dangerous: if you're caught up in that story, then you're missing the real story, which is that we are experiencing a lost generation of young people who cannot get jobs.
00;58;54;12 - 00;59;15;09
Dr. Rumman Chowdhury
Entry-level jobs are increasingly harder to get, especially in fields like programming. But that, annoyingly enough, is a problem you can actually tackle, right? If you have narrowed your scope to people graduating with certain kinds of degrees who are trying to enter the job market, that is something we can maybe build something around.
00;59;15;09 - 00;59;33;09
Dr. Rumman Chowdhury
But if we all go bananas because we think none of us will have a job, people are going to turn to self-preservation. Why would a senior engineer try to build opportunities for young people entering the job market if you, as a senior engineer, are being told that you're going to be out of a job?
00;59;33;14 - 00;59;53;07
Dr. Rumman Chowdhury
So this is a perfect example of how it's disempowering. One, I do not think jobs will all disappear. But two, I think there are certain fields that are being automated away right now, slowly but surely. And three, because this time frame is not happening tomorrow, we can actually plan for it.
00;59;53;10 - 01;00;11;25
Dr. Rumman Chowdhury
Take a look at the long-term picture: we actually can plan for the next three to five years. How are the kids who are sophomores in college today going to be successful in the job market? That is a different question from asking how a kid who's graduating this year can be successful in the job market.
01;00;11;25 - 01;00;31;26
Dr. Rumman Chowdhury
And again, these are problems we can actually tackle, right? So one thing I love is that the CEO of Reddit has said, actually, we're leaning into hiring young people, because we think we can figure out roles for them. Amazing. But the fact that he even pointed it out, and is thinking of the problem to be solved as young people needing jobs, right?
01;00;31;26 - 01;00;55;21
Dr. Rumman Chowdhury
That's a framing that not everybody understands right now. The other thing I'll add is that there's also increasing empirical evidence, which I'm glad to see, on how this work is being integrated and when it's successful. There are so many papers coming out; Erik Brynjolfsson has had quite a few, and others have come out of Harvard and MIT. Generally, the trend we're seeing is that you still need expertise.
01;00;55;24 - 01;01;32;10
Dr. Rumman Chowdhury
And if there's another theme from this whole conversation, other than agency, that I'd like to bring up, it's discernment. Having the ability to discern good and bad output, and appropriate and inappropriate uses of technology, actually requires expertise. So it's almost counterintuitive, right? The existence of this technology actually means you need more experts, but it's automating away the junior-level roles that would allow someone to become an expert. Which, by the way, a lot of senior engineering people are saying: you need to hire junior people, even if an AI can do their job, because
01;01;32;10 - 01;01;52;03
Dr. Rumman Chowdhury
I'm not going to be here forever, and this AI tool is not as great as you think it is, and you need my level of discernment to understand what is good and bad, what is right and wrong. I would apply that to social media; I would apply that to our consumption of these tools. Other than agency, my other big word of the day is discernment.
01;01;52;05 - 01;02;16;13
Geoff Nielson
I really like that word too. And I like the framing of why it's useful, and why being an expert in some ways has more value than what's obvious or being communicated in some of these headlines. I want to come back to one of the themes here: that by framing these problems better, we can actually tackle them.
01;02;16;13 - 01;02;37;28
Geoff Nielson
We can come up with a plan; we can do something about it. I want to push on the word "we." Who do you see as the key actors in this? Is it business leaders? Is it politicians? Is it just us as everyday consumers? Who holds the power here?
01;02;37;28 - 01;02;46;01
Geoff Nielson
And do you have any specific advice for the people who you think have the most outsized roles in correcting some of this?
01;02;46;04 - 01;03;13;16
Dr. Rumman Chowdhury
Yeah, that's a great question. We were using "we" very broadly, but you're absolutely correct to ask who "we" is in different situations. It depends on what we're talking about. There is a "we" that is the average consumer when it comes to picking and choosing whether you want to download an app, or buy the new Alexa, or enable Siri on your iPhone.
01;03;13;16 - 01;03;28;12
Dr. Rumman Chowdhury
And I'm talking about all the things that I don't do: I've never bought an Alexa and I don't plan on buying one. Siri is turned off on my iPhone. As a consumer, I actually did it on my parents' devices too. These are capabilities you have as a person.
01;03;28;15 - 01;03;59;02
Dr. Rumman Chowdhury
Then there is the "we" that is the technologist, the person in the room building the product or the tool. That is a very different "we," and as that individual you have a lot more agency and ownership over what's being built, how it's being used, and, again, how the problem is being defined. And the third is the political role. A lot of this question framing, frankly, does lie in the hands of policymakers, not just Congress but also state and local policymakers.
01;03;59;04 - 01;04;21;11
Dr Romman Chowdhury
You know, whoever like whoever is like, for example, the, the state, the, the state lead for the Department of Education, deciding on how I should be could be a school board. Right? There are these political decision makers that are, you know, a framing question. So I see framing questions. This could even be what I said earlier. Like we want to write a bill on auditing algorithms.
01;04;21;13 - 01;04;43;04
Dr. Rumman Chowdhury
What should that bill be? Just as an example, the current New York law on requirements for hiring algorithms is so weak that a kid with an Excel tool could pass it. Back when that law first came out, everybody was excited about the idea of it, but it got so watered down that it just became legal theater.
01;04;43;04 - 01;05;01;28
Dr. Rumman Chowdhury
And worse, it created more responsibility for the individual, who has to push back and say, actually, this algorithm did discriminate against me, even though it technically, quote unquote, passed the rules of this law. When I was asked afterwards to help companies audit these algorithms, I would literally turn it down and say, you don't need me.
01;05;02;01 - 01;05;28;12
Dr. Rumman Chowdhury
You literally need an Excel sheet and a bunch of data from your database, and you can pass this, right? So framing the question is one of the most important things for policymakers to do. They need to frame the right question the right way, and then be able to frame the answer, because bad policy, as in the example I gave, is in many cases worse than having no policy at all, because a bad policy dumps the responsibility on us as consumers.
01;05;28;12 - 01;05;53;20
Dr. Rumman Chowdhury
Whereas before, let's say you're in New York State and you thought you were being discriminated against by an algorithm, you had a particular kind of onus: bringing a court case, hiring a lawyer. Well, now your lawyer has another job, which is to argue that the law itself did not adequately protect you, and that is much, much harder to fight than simply arguing that a company used a discriminatory algorithm.
01;05;53;23 - 01;06;15;23
Geoff Nielson
Right. And as I think about that, thematically it seems to tie back to that piece about discernment, and being able to frame these things correctly and understand them. So I'm curious: you talked about the importance of framing, but are there any specific guiding principles you can share about how best to do that?
01;06;15;26 - 01;06;35;11
Geoff Nielson
When we're talking about confronting this technology and its usage? And the lens we haven't gotten to yet, by the way, that's on my mind is business leaders: how they should be adopting this and how they should be thinking about it. So what kind of guiding principles do you have for them, or for any of the other key actors in this space?
01;06;35;14 - 01;06;56;26
Dr. Rumman Chowdhury
Yeah. I mean, for business leaders it's actually always been the same boring advice: does this technology actually solve a problem for you, a demonstrable business need, versus just being kind of cool? So it's not, yes, I would love to get rid of all of my employees because benefits are expensive.
01;06;56;26 - 01;07;19;19
Dr. Rumman Chowdhury
I'll fire them all and use AI agents instead. But is an agent better than a person? By the way, every CEO that was out there, like the CEO of Klarna or Salesforce, crowing about how many people they were replacing: they have all rolled it back, every single one. Not a single one has met the expectations they were yelling about when agents first came out.
01;07;19;22 - 01;07;38;05
Dr. Rumman Chowdhury
My engineering lead just sent me an article today that MCP servers are quietly being abandoned. This is a little bit in the weeds, but MCP servers are supposed to be the next big thing in AI tech infrastructure, and everybody was told, you've got to learn MCP, you've got to understand it, otherwise you're going to be left behind.
01;07;38;07 - 01;07;57;06
Dr. Rumman Chowdhury
Funny how that goes. And it actually comes from Perplexity's Comet: it's just built on REST APIs. It's very simply built; you did not need MCP. So, just to point out a couple of things at, let's say, the business leader level: "agents are going to take care of my workforce." Well, it isn't empirically proven that they can do that.
01;07;57;06 - 01;08;16;11
Dr. Rumman Chowdhury
So before jumping into it: is it solving an actual problem? Is it capable of solving the problem? Does it solve that problem better than something analog? The MCP example didn't solve a problem better than something analog, and it probably introduced a whole raft of new problems, because as I talked to people about it, they would ask, how do you handle versioning with an MCP server?
01;08;16;11 - 01;08;37;18
Dr. Rumman Chowdhury
Right? An MCP server is meant to orchestrate a whole bunch of tools. What happens when one tool in that chain changes something specific, and everything becomes a cascading failure? It's less observable than a very traditional REST API talking to your product and building a thing.
01;08;37;18 - 01;08;46;16
Dr. Rumman Chowdhury
So those are kind of my principles. It is a very boring answer, but it is actually the thing people don't do.
01;08;46;18 - 01;08;55;05
Geoff Nielson
No, I appreciate that. And the fact that people aren't doing it makes it that much more important, right?
01;08;55;07 - 01;09;15;07
Dr. Rumman Chowdhury
Yeah. I think people now increasingly understand how much of this stuff is smoke and mirrors. But there are so many words thrown around that are actually Silicon Valley business jargon, words that to the average person sound like something real and tangible but are just made-up math.
01;09;15;07 - 01;09;35;09
Dr. Rumman Chowdhury
Valuation is a great example. Oh my god, that company is worth $1 billion. It's not, but the average person thinks of this as money in, money out. So, wow, if my local bakery were worth $1 billion, that means it's selling $1 billion of cookies. That is not what it means in Silicon Valley.
01;09;35;09 - 01;09;55;09
Dr. Rumman Chowdhury
It's like the price of art: this painting is worth $1 million because someone's willing to pay $1 million, not because it's intrinsically worth that much. I think people are starting to realize that when someone says a company has a certain valuation, or that a technology is capable of something, it doesn't actually mean it can do the thing, or that it's worth that much.
01;09;55;09 - 01;10;21;04
Dr. Rumman Chowdhury
It's a speculative discussion. There's a lot of speculation being sold, which, morbidly, relates to one of my latest morbid interests: following the rise of Polymarket and this legalized social gambling that's happening. It's kind of a sick symptom of this very speculative, hype-driven world. It's like an ouroboros.
01;10;21;04 - 01;10;34;17
Dr. Rumman Chowdhury
The hype is eating itself, you know, and I see Polymarket as that. Or Kalshi; I'm not going to discriminate, they're both awful. They're both this manifestation of the snake eating itself.
01;10;34;19 - 01;10;39;14
Geoff Nielson
Yeah. It's like meta-hype in some way. It's the hype of the hype: can we bet on the hype?
01;10;39;18 - 01;11;07;13
Dr. Rumman Chowdhury
Correct. Yeah, it's like when we all looked at NFTs and laughed, except this is an entire gazillion-dollar industry being integrated into every aspect of our lives. With NFTs, they were stuck in a corner and the true believers got to go do what they wanted to do. Which, by the way, you know, the metaverse recently shut down. A billion dollars, a billion with a B.
01;11;07;19 - 01;11;15;16
Dr. Rumman Chowdhury
I don't think all of AI governance has spent $1 billion in its entire existence.
01;11;15;21 - 01;11;21;00
Geoff Nielson
Yeah. Well, and who saw that coming? Everyone, I think. Only everybody.
01;11;21;01 - 01;11;22;29
Dr. Rumman Chowdhury
Yeah. Yeah. Correct.
01;11;23;01 - 01;11;33;00
Geoff Nielson
Yeah. So I want to be conscious of the time here as well, and I know, Rumman, we've covered an awful lot of ground. Any final thoughts you want to leave listeners with before we wrap up?
01;11;33;02 - 01;11;52;01
Dr. Rumman Chowdhury
Yeah, I just want to go back to the two themes that have clearly come up in this conversation: one is agency and two is discernment. I think they apply to actually every aspect of our lives, because every aspect of our lives is now digitally mediated. It is now our responsibility as consumers to exercise those things.
01;11;52;01 - 01;12;13;10
Dr. Rumman Chowdhury
Right? One: figure out what you do and don't have agency over, execute that agency, and learn how you can execute your agency. And two: learn discernment. What is good, what is bad, what is positive, what is negative. And again, what I have seen very positively over the last ten years in this field is that people have become more discerning about certain things.
01;12;13;13 - 01;12;17;19
Dr. Rumman Chowdhury
And then the agency part is the ability to execute on that discernment.
01;12;17;21 - 01;12;29;08
Geoff Nielson
Yeah, and that's exciting. It feels like there are some green shoots of discernment, and people fighting back against the powers that be here.
01;12;29;10 - 01;12;50;23
Dr. Rumman Chowdhury
Yeah, I think so. Like I said, as a cultural observer of social media, some of my favorite accounts are just regular people. There's one that's actually now been copied by multiple different accounts: it's this guy, and he looks a particular way, and he's quoting one crazy thing a tech CEO said every day in 2026. There are actually three or four similar accounts.
01;12;50;26 - 01;13;09;15
Dr. Rumman Chowdhury
And every day it's some fresh nonsense. But I love that. It's just an influencer meme talking to regular people. Again, it's not me, it's not an expert, it's literally an influencer meme. And I love that that exists, because that's what changes the collective psyche.
01;13;09;17 - 01;13;28;16
Geoff Nielson
Well, yeah, I just had a brainwave while you were talking, and I feel like it ties back to something you said a lot earlier. It feels like snark is actually a very powerful weapon against the manipulators of power and the eroders of our agency.
01;13;28;19 - 01;13;46;00
Dr. Rumman Chowdhury
I mean, you can probably tell: I was born in the 1980s, so I'm a millennial-Gen X cusp person. We live and die by snark, 100%. There's no better way to disarm somebody with a very fragile ego than to be snarky.
01;13;46;02 - 01;13;58;19
Geoff Nielson
I love that. In defense of snark; that's awesome. Rumman, I want to say a big thank you for coming on. This has been such an interesting conversation, and you've given me a ton to think about, so I really appreciate all your insights.
01;13;58;22 - 01;14;03;19
Dr. Rumman Chowdhury
Thank you so much. It's been an absolute pleasure to chat with you.
01;14;03;21 - 01;14;29;02
Geoff Nielson
If you work in IT, Info-Tech Research Group is a name you need to know. No matter what your needs are, Info-Tech has you covered. IT strategy? Covered. Disaster recovery? Covered. Vendor negotiation? Covered. Info-Tech supports you with best-practice research and a team of analysts standing by, ready to help you tackle your toughest challenges. Check it out at the link below, and don't forget to like and subscribe!