Episode 91: Gen Z vs. Big Tech


In this episode of Big World, SIS alumnus and tech accountability advocate Zamaan Qureshi joins us to discuss the impact of AI and social media on teen mental health, the push for stronger regulation, and how U.S. tech policy compares to other countries.

Qureshi shares what drew him into tech advocacy (2:47) and how he came to co-found Design It For Us—a youth-led grassroots coalition pushing for safer online spaces (4:35). He breaks down the specific harms young people are experiencing from social media and explains how platform design features like infinite scroll, like counts, and beauty filters feed into this and are deliberately built to maximize time spent on these products (8:11).

What does taking on Big Tech look like in practice (11:12)? What can the United States learn from international approaches like the UK's Age Appropriate Design Code or Australia's social media ban for users under 16 (15:08)? What risks does the rapid mainstreaming of AI chatbots like ChatGPT pose to young people (22:53)? And what does the recent landmark jury verdict against Meta and YouTube mean for the future of the tech accountability movement (27:20)? Qureshi answers these questions, and more.

0:07 Madi Minges: From the School of International Service at American University in Washington, this is Big World, where we talk about something in the world that truly matters.

0:17 Zamaan Qureshi: They have tried to pass a sweeping AI state moratorium, which would prevent the states from regulating. And we really need them to reject that because if we don't get this right, there is a lot of risk to not only young people's safety and protection, but everyone's safety and protection.

0:37 Madi Minges: That was SIS alumnus Zamaan Qureshi. He joins us today to discuss the current regulation landscape surrounding both artificial intelligence and social media in the US and abroad, and how he believes we can do it better. Data released earlier this year indicates that Meta, whose platforms include Instagram, Facebook, Threads, and WhatsApp, now averages three and a half billion users per day. That's more than 40% of the global population. Since social media first came on the scene in the early 2000s, we've seen massive shifts, not only in the ways people use social media, but in how the platforms themselves are designed. And while social media was initially marketed as a way to keep us better connected, recent data and ongoing lawsuits paint a different picture. I'm Madi Minges, and today I'm joined by Zamaan Qureshi. Zamaan is an alumnus of the School of International Service and an advocate, policy expert, and strategist focused on tech accountability.

1:38 Madi Minges: He is the co-founder and director of campaigns and policy for Design It For Us, a youth-led grassroots coalition that is advocating for better online safety. In this episode, we're talking about how the US stacks up against other countries when it comes to tech policy and the push for stronger regulation and guardrails on social media. A quick warning note for listeners before we get into this episode. In this conversation, we do discuss topics surrounding mental health as well as suicide. Here is our conversation. Zamaan Qureshi, thank you so much for joining Big World, and welcome to the show.

2:20 Zamaan Qureshi: Great. Thanks so much. Thanks for having me.

2:23 Madi Minges: Zamaan, I'd like to start by talking a little bit about your background. I know that even as a student at AU, you were already getting involved in advocating for better regulation and guardrails on social media. So I would love if you could tell us about what motivated you to get into this and pursue this kind of advocacy.

2:47 Zamaan Qureshi: Yeah. So I got involved with an organization pretty young when I was a senior in high school, called the Real Facebook Oversight Board. And part of that came from my own interest in the technology ecosystem and how it was contributing to our democracy and to our elections. In particular, I had worked on the 2018 midterms for a congressional candidate in Illinois, where I'm from. And I worked on the digital campaign as part of that campaign. And so I saw the ways in which social media and technology could really influence voters. We won that election in a very tight race, and so all the votes mattered, and getting people to turn out to vote really mattered. And I took that same interest and carried it forward. I was particularly interested in the Cambridge Analytica Facebook data scandal, and that really kind of motivated me to get more involved in the space around responsible technology and social media.

3:52 Madi Minges: Yeah. It's so interesting. I think, being young people in this generation, we grew up with it and can remember, at a really formative time, when social media came on the scene and became popular and seemed like something everyone had. And so I think it's interesting just hearing your background and getting involved even as a teenager. I know that you are involved with an organization called "Design It For Us," as part of this movement. Can you tell us how you got involved, how it got started, and what it's trying to accomplish?

4:34 Zamaan Qureshi: Yeah. So "Design It For Us" is a coalition of young people advocating for safer online spaces for our generation. We're a group of Gen Z activists from across the United States and around the world, about 500 young people from about 40 different states. And we really care deeply about this issue. I co-founded the organization in 2023 and did so in part because we were seeing the disclosures from whistleblowers like Frances Haugen from Facebook, who was coming forward and telling us that big tech platforms, Meta in particular, knew the harms of its products but did not disclose them and did not take sufficient action to actually protect young people online. And then there were very few young people involved in the conversation around responsible technology and safe social media. And so it was a real call to action for me and my co-founder, Emma Lembke, when we founded the organization in college to get more young people involved, get more young people at the table and with decision makers. That was really our priority from the beginning.

5:43 Zamaan Qureshi: So that looked like influencing policy and legislation at the state and federal level. I think since then, our remit has expanded to include a lot of the narrative change work that's really important behind this issue: doing research on these platforms and trying to identify where the gaps are in the safety protections offered by social media companies, doing campaigning and direct action work, as well as building this movement from the ground up. It's a grassroots organization of young people from across the US. And so we have to activate them, get them involved, give them opportunities to create touch points with the issues that they care about. And oftentimes those issues intersect with their own lived experiences and the ways that they grew up online; they intersect with the issues impacting their communities. And they're seeing how tech is influencing, in particular in this moment, elections, for example, how tech is influencing legislation and policy, and how they're trying to change the narrative about the ways that their products are used and the ways that the technology is created.

6:56 Madi Minges: Yeah. You mentioned the idea of harms on social media and these companies being aware of how their own product is harmful. And I mean, there's additional data and research to back that up as well. I know a recent Pew Research study found that nearly half of teens now say social media has a mostly negative impact on people their age. And Design It For Us has this really striking line that we read on your website, that tech is "marketed for connection but weaponized for profit." And it does really feel like that connection aspect is disappearing, especially when we think about the initial vision of what social media was meant to be, or sold to us to be: this kind of place where we post photos and see what our friends and our family are doing. We've seen significant changes in that space. So I'd love to know, from your experience of advocating for a better world online: What are some of those specific harms that young people are experiencing from social media, and how does platform design play into that?

8:12 Zamaan Qureshi: Yeah, it's a great question. And we were really sold on the connection piece, like you said. That was the draw. That was what people like Mark Zuckerberg spent years talking about, how we want to connect the world together, and that sounds really cool and really special. The fact that you can connect with your relatives abroad and you can make friends with people that you don't share the same geography with, there's some real power and excitement around that, and I think there was in the early 2010s era. And now, what we have received looks very different than that. The products that we've been provided with look very different than that.

8:53 Zamaan Qureshi: The harms that we talk about in this space manifest in different ways and impact young people in different ways. But some of the common denominators are that a lot of young people are continuing to experience mental health challenges, exploitation, and body image issues related to social media, and that manifests in different ways. For example, the design features that induce or exacerbate a lot of these mental health challenges come from the fact that you're essentially quantifying your self-worth on a platform like Instagram, where you have to create or curate a profile of who you are for your peers, and your peers judge you on the basis of that, and that is the product that is offered. There's no in between there. What began as connection, I think, quickly turned into kind of a dark place in the way that young people use these products. And that's not to say that they don't do good things and that people don't discover interesting communities and spaces, but we should not have to put up with this. We shouldn't have to accept this as the status quo.

10:15 Zamaan Qureshi: To come back to your question about where this starts at the core, it is the design. It is the way that these products are built, and they are products, at the end of the day. There are engineers and employees at these companies who are making deliberate decisions to increase the amount of time spent on these products. That's their metric, they make more money the more time that people spend on their products, the more ad revenue they can generate from that time spent. And so, it's in their business model, and they've said as much in a lot of the lawsuits that we've learned about, what executives have been saying over the years, that they deliberately build things like infinite scroll, like counts, beauty filters, location sharing features to increase the amount of time spent. And when that is the metric, when that is the chief thing that you are measuring for, a lot of harm is downstream from addictive design.

11:14 Madi Minges: I guess my question with that is, how do we begin to confront that? And I know you, obviously, are part of this grassroots movement that is forming to say, "We don't have to accept the status quo. We can bring change." But can you talk about what does that tangibly look like? And in the advocacy space that you're in, does that mean are you asking platforms to change their design? And is there a way to go back to a design that's less harmful?

11:45 Zamaan Qureshi: Yeah. So I think in particular, the companies that we were talking about, Snapchat, YouTube, Instagram, TikTok, these are massive companies. And throughout our advocacy work, we've come into contact with them in the short amount of time that we've been around. And unfortunately, none of the conversations have felt particularly productive in getting at the core of design. We met with a large company a number of years ago to try and encourage them to change specific design features in the way that they built their algorithm and their product, and it felt like, when we would have these conversations months later, nothing had changed from the previous conversation: although they had taken our perspective into account, we had not actually seen change to the product in that way. And so, there's a term that we use to describe this approach, and we describe it as self-regulation. And the track record over time of these companies is such that I think it is impossible for us to be able to trust them to self-regulate.

12:54 Zamaan Qureshi: And so, to come back to your question, the theories of change in this space that we believe in are first that people need to understand the problem. And for a lot of young people, we know what we've lived, what we've experienced growing up. Again, that potential for connection, but also the risk and the fear of exploitation and having to quantify your self-worth. And sometimes it's just about giving young people the vocabulary to access their already existing experiences. I think the second piece is, and one of our main theories of change, is regulation and policy, and policy that regulates the core design of social media products. It's certainly possible, as much as the companies like to say that it isn't possible or that there are constitutional risks or that this is difficult regulation to implement. They've been complying with a law in the United Kingdom called the Age Appropriate Design Code, and their products still work in the UK.

13:58 Zamaan Qureshi: And so, these are the kinds of things that we want to see happen across the US, a country where we have ultimately the utmost responsibility to regulate these companies. I think especially in the domestic policy context, it's easy to think about just the impacts of these companies and their products in the United States, but the impacts are far-reaching. In places like Southeast Asia, for example, Facebook comes pre-downloaded on devices.

14:30 Madi Minges: Wow.

14:30 Zamaan Qureshi: So it is the internet to a lot of people, and with that in mind, we have to, as a country, take on our responsibility to create a safe product for young people. So those are some of the theories of change that I think we believe in. Ultimately though, young people have to be the center of what we're doing, and that is our entire ethos, that you can't regulate around us. You need to put us front and center and include our voice in the conversation to get to the kind of regulation that we want to see.

15:07 Madi Minges: Yeah. On this topic of regulation and policy and the current landscape surrounding big tech, I think it's really interesting to see how other countries around the world are responding to this issue, and now trying to implement stronger regulation, stronger protections. You had mentioned the UK Age Appropriate Design Code. I would love to hear kind of more about the specifics of that and whether that's been working in the UK and what that practically looks like. But I want to mention another case as well. Australia is probably one of the most extreme examples, where they recently banned social media for young people under age 16. That appears to be a bit of an anomaly in this space. But I'm curious if you could talk about the trends that we're seeing in regulation, especially internationally and what seems to be working and what's not working.

16:04 Zamaan Qureshi: So in terms of the UK Age Appropriate Design Code, this was a law that was passed around 2019 and requires social media companies operating in the UK, and actually a lot of companies beyond social media that provide products to young people, to ensure that the design of their products is age appropriate. So what that looks like is the highest privacy settings by default, creating a bit of friction around some of these addictive features so that young people can thrive in online spaces without being exploited.

16:41 Zamaan Qureshi: And there's a really great organization called Children and Screens, which did an analysis that found about 90 changes that social media companies had to make as a result of complying with that law. And the Information Commissioner's Office in the UK, essentially the equivalent of the Federal Trade Commission in the US, brought a couple of enforcement actions under the code against social media companies for failing to create that age appropriate experience for young people.

17:17 Zamaan Qureshi: I think as well, one of the challenges that we hear in the US is how you can bring this law that was passed in the UK over to the US. The UK doesn't have a written constitution; the United States has a First Amendment. And that's one thing that we've heard time and time again. We helped pass the California Age Appropriate Design Code, have subsequently helped pass laws in Vermont and Maryland, and have backed legislation in New Jersey and Illinois that takes this model approach.

17:52 Zamaan Qureshi: And we had a really good ruling from the Ninth Circuit, which said that most of the law that survived the initial court challenge can go into effect. And so hopefully we're going to start to see changes to these products in the US, even if it's on a state by state basis, and that will encourage the federal government, which has really sat on its hands here and failed to take action, to act.

18:21 Zamaan Qureshi: On your question on Australia, I think Australia is interesting because the approach of restricting social media access across the board has been gaining in popularity amongst world leaders, but I think the jury's still out on the data in Australia and whether it will actually be successful or not. I think there's some contention there right now about whether it's actually limiting access the way they'd intended.

18:53 Zamaan Qureshi: For us, the approach that we believe in is design-based regulation: there are ways to design products to be safer. And so, to be frank, design-based regulation is what's going to change the core business model of these products, by regulating the algorithms and regulating the design features. We are essentially saying to companies that the onus is on you to build a product that's safe for young people, and you have to do that with young people's access in mind.

19:30 Madi Minges: Thinking now on the US, you had mentioned this push to implement policy at the state level. I'm curious if you can fill us in on what's happening at the federal level in this space. What has regulation for big tech looked like under the Trump administration? Is there movement happening? Is it a priority? Can you tell us where that currently stands?

20:00 Zamaan Qureshi: So for years, Congress has been debating legislation to hold social media companies accountable and protect young people online through myriad proposals: transparency, privacy, safety, Section 230 reform. All of these proposals have gained traction at various points in time. Antitrust reform is another, and still we are without large scale regulation in this space. The last law that really made an impact was the Take It Down Act, which was passed last year to regulate non-consensual, AI-generated intimate images and to criminalize their distribution.

20:45 Zamaan Qureshi: But before that, the last law we passed regulating this space was the Children's Online Privacy Protection Act in 1998. So it's been a long time coming for regulation that is desperately, desperately needed to keep pace with the growth of technology, not only social media, but the growth of AI and the use of chatbots now by young people.

21:07 Zamaan Qureshi: And in the AI space, they have tried, three times, to pass a sweeping AI state moratorium, which would prevent the states from regulating. So essentially, like what you and I were just talking about, states are moving to act in a lot of these spaces, and the federal government is trying to prevent states from acting. That would nullify a lot of the regulation and protection that we've advocated for at the state level.

21:37 Zamaan Qureshi: In addition, the administration put out its national AI regulatory framework, which is far weaker than I think we would want to see. We don't need to make the same mistakes with AI companies that we made with social media companies. I think Washington in particular has a tendency to get starry-eyed when it looks at Silicon Valley and the products they're producing. The CEOs have spent a lot of time on Capitol Hill, not only in hearings, but also meeting with lawmakers behind the scenes.

22:14 Zamaan Qureshi: And we really need them to reject both the money involved and the continuous arguments to prevent regulation because if we don't get this right, there is a lot of risk to not only young people's safety and protection, but everyone's safety and protection. Whether that is creators and artists, whether that is blue collar workers, whether that is people who are at risk of falling prey to some of these chatbots. These are all of the things that we need to be regulating for and right now it feels like we're behind.

22:53 Madi Minges: I'd love to stay on this topic of AI for a moment. In the last few years, we've seen AI really blow up, come on the scene quickly, and be integrated very quickly into millions of people's daily lives. According to recent data, almost two thirds of teens say that they are using chatbots. I would love to have you talk a little bit more about how this rapid mainstreaming of AI is changing the landscape, and how it's changing not just the nature of harm, but the nature of what it looks like to regulate.

23:38 Zamaan Qureshi: So I think it's first good to talk about what the opportunity is for young people and for the public as a whole. AI could be an equalizer; perhaps it's going to make medical breakthroughs, and it offers the ability to be creative and think and work through a problem

24:00 Zamaan Qureshi: ... with AI. AI can be a really good companion in that regard. There's so much promise involved, I think, with the technology, which is really exciting. And yet there are some really justifiable concerns that we have with the growth and rapid expansion of this technology without guardrails. In particular, I'm thinking about, unfortunately, a lot of the suicide cases that have come as a result of young people's interactions with AI chatbots. In particular, you had cases of young people, Adam Raine or Sewell Setzer III, who ended up taking their own lives after engaging with a chatbot that led them down a rabbit hole. In Adam Raine's case, it was ChatGPT, which had convinced him to take his life. In Sewell's case, it was Character.AI, this company that had developed a product marketed deliberately for young people to engage in romantic and sexual conversations with a chatbot without any guardrails in place. And that friction or removal that Sewell had from the product was ultimately what ended up contributing to him taking his own life.

25:14 Zamaan Qureshi: And so these are the risks that are present here right now. And in ChatGPT's case, in particular, for example, employees had been flagging that there were potential concerns with the rollout of the model at that time. And they rushed the model out because of the growth of competition and the need to keep competitive with other AI products that are on the market right now. And so there's not the sufficient testing required to actually ensure a safe product. And ultimately, we don't have any ability as the public to independently verify whether these companies are actually upholding their promises. Again, coming back to self-regulation, it's not enough for Sam Altman or Dario Amodei to kind of trot out and be like, "We've told you we built a safe product." We need to see it firsthand.

26:11 Zamaan Qureshi: And so we've worked on bills that deal both with catastrophic risk and kid safety in the states as well. In New York and in California, we had two significant victories, passing SB 53 in California and the RAISE Act in New York, which create a baseline level of transparency requirements for AI companies: when they do detect catastrophic or critical risk, they have to flag it to the regulator. There are bills that we're working on this year in particular that also have a kid safety component, so that AI companies are required to publish plans on how they are mitigating risk to young people, backed up by third party audits. And so this is the kind of regulation I think we need to get out ahead on, because there is so much promise with AI. Let's maximize that promise, but we need to be able to put in place sufficient guardrails so that we do not have the fear of continuous mental health challenges that result in anxiety, depression, and in some cases, suicide.

27:19 Madi Minges: Yeah. I must say, I feel so inspired hearing you talk about this, because from the outside it does feel like such a David-and-Goliath kind of fight, right? I mean, these are some of the richest companies in the world, with some of the richest people in the world behind this, so it's really inspiring to me to hear that there are victories happening. And actually, I'd like to end our episode by talking a little bit about the future of regulation. I think I would be remiss not to mention this recent landmark decision. A jury found Meta and YouTube liable for the mental health struggles of a young woman who compulsively used social media as a young child. She was awarded $6 million in damages, with lawyers specifically pointing to features like infinite scroll, algorithmic recommendations, constant notifications, and beauty filters, all of these things that we've touched on in this episode.

28:27 Madi Minges: But I would love if you could unpack for us what this decision means for the future of the movement and the prospect of future regulation.

28:37 Zamaan Qureshi: Yeah. It was a landmark victory in Los Angeles when a jury found Meta and YouTube liable on all counts for failing to create a safe product for KGM Kaley, the 20-year-old plaintiff in that case, and awarded her those damages as well. And so I think it is a sea change that we're seeing in regulation; the failure of the federal government is really put on show here as plaintiffs take up the mantle.

29:12 Zamaan Qureshi: And there was another landmark decision in New Mexico as well, in which a jury found Meta liable on all counts for, again, failing to protect young people, for creating a product that exacerbated exploitation, and for doing so knowingly and lying about it. And so I think those two cases in conjunction, and in particular the New Mexico case, in which, I should state as well, I was the last witness for the plaintiffs and testified, are significant because they show that states are stepping up, that young people are stepping up, that plaintiffs are stepping up to take action where we've really struggled to see action from the federal government.

30:04 Zamaan Qureshi: And my hope is that this is a push in the right direction for lawmakers to take up the mantle on this issue. Not only is it good politics, I think, it's the right thing to do right now. And for too long, we've let these companies, which are so large and so powerful, which have outsized control over our markets and a huge amount of money in their coffers, just continue to expand and grow, and do so at the cost of young people. We end up being the collateral damage, because ultimately we're the ones that are the product. We're the ones that pay the price through our engagement and through our attention. And ultimately, if there is not enough of a sea change or a tone change, I think the American public is really waking up to this issue in particular, and they will be demanding accountability from their lawmakers. You best believe that we will be right there with them in demanding that accountability.

31:15 Madi Minges: Zamaan Qureshi, thank you so much for joining Big World to talk about social media regulation and this current landscape. It's been a pleasure to speak with you.

31:25 Zamaan Qureshi: Thanks so much for having me. It was a pleasure.

31:29 Madi Minges: Big World is a production of the School of International Service at American University. This episode was produced by Morgan Desfosses. Our podcast is available on our website, on Apple Podcasts, Spotify, or wherever else you listen. If you like this episode, please leave us a rating or review. Our theme music is "It Was Just Cold" by Andrew Codeman. Until next time.

Episode Guest

Zamaan Qureshi,
SIS/BA '24 and co-founder of Design It For Us
