Emma Lembke
00:00:00
Make no mistake, unregulated social media is a weapon of mass destruction that continues to jeopardize the safety, privacy and well-being of all American youth. It’s time to act.
Dr. Sanjay Gupta
00:00:21
That’s Emma Lembke. It’s a name you probably need to remember because I have a feeling you’re going to be hearing it a lot. Emma is a college student. In fact, she’s not much older than my own teenage daughters. She and many of her peers have decided to take on a fight that is big, bigger than themselves.
Emma Lembke
00:00:42
Hello, everyone. My name is Emma Lembke. I am originally-
Dr. Sanjay Gupta
00:00:45
In February of this year, Emma traveled to Washington, D.C., and testified in front of Congress for the first time.
Emma Lembke
00:00:52
I am humbled and honored to be here today.
Dr. Sanjay Gupta
00:00:56
The hearing was on how to better protect kids online. And the 20-year-old had a very clear message for U.S. senators and, frankly, for the whole world.
Emma Lembke
00:01:07
Ten years from now, social media will not be what it is today. It will be what members of my generation build it to be. We want to build differently. We want to build it right.
Dr. Sanjay Gupta
00:01:23
As you likely know by now, I’ve spoken to each of my own daughters for this season. I did it mostly because I’m worried about the impact that social media and technology could be having on them and all young people. You see, the thing is, I don’t want to spend the season just talking about young people. I wanted to talk with young people. Why? Because I wanted their perspective. I know that they have many ideas. They know how to make these spaces better. Emma is a prime example of that. You see, in 2020, she founded the Log Off Movement. That’s an organization that aims to raise awareness about social media’s impact on youth mental health. Something I care deeply about. She also heads up Tech(nically) Politics. That’s a youth-led advocacy group calling for stricter regulations that protect young users. Now, all of this activism has landed Emma in the spotlight. It’s also how she found herself facing a panel of lawmakers.
Emma Lembke
00:02:23
I have heard as members of my generation have expressed concern, not just for our own well-being, but for younger siblings, for cousins, and for all those to come after us.
Dr. Sanjay Gupta
00:02:38
Emma is right in the middle of this sort of push and pull between social media companies, the users that want to see change and the lawmakers in charge of reeling in big tech. But what is it going to take to make social media safer? Is it even really possible? Also, should these companies be held accountable, carry some sort of responsibility for the harms young people like Emma and others are alleging? None of this is easy, and yet it is really one of the most important issues of our time. So today we’re going to try and get some answers from a tech journalist who’s going to educate us about a decades old law currently in front of the Supreme Court that could decide the very future of the Internet.
Brian Fung
00:03:26
You know, partly because of Section 230, a lot of the lawsuits that have been brought in relation to tech and mental health have actually not gone anywhere.
Dr. Sanjay Gupta
00:03:37
Look, we’ve all heard about TikTok privacy data concerns, but how worried should we actually be? Are they really spying on us? How do they even do that? And what does it mean for you and your family? I’m Dr. Sanjay Gupta, CNN’s chief medical correspondent. And this is Chasing Life.
Dr. Sanjay Gupta
00:04:04
In the past couple of months, if you’ve been paying attention, you’ve seen social media companies come under fire.
Laura Coates
00:04:10
Turmoil spreading tonight at Twitter, where there appears to be some kind of a mass exodus of workers who are-
Don Lemon
00:04:16
Facebook whistleblower Frances Haugen.
Jake Tapper
00:04:19
So the US Senate just voted unanimously to ban TikTok from government phones. TikTok is a popular app-
Dr. Sanjay Gupta
00:04:28
But for Emma Lembke, this fight goes back even further. It all started, in fact, about eight years ago, when she was first introduced to social media.
Emma Lembke
00:04:38
I remember seeing all of my friends’ attention get pulled away from me, having their eyes looking up at me, having conversations, and getting pulled straight down. And it felt like a drop. So each one would go, and each one would spend more and more time sucked into their phones and screens rather than talking with me in person.
Dr. Sanjay Gupta
00:05:00
But Emma wanted to keep an open mind, so she figured there must be a reason why her friends were so enamored with this technology.
Emma Lembke
00:05:08
But I was an optimistic 11- or 12-year-old, so in my mind I was thinking there has to be something that is so mystical and magical and golden within these platforms that’s taking my friends’ attention away from me in the moment and throwing their attention down at the screen.
Dr. Sanjay Gupta
00:05:27
It did seem magical. You know, when I was a kid, if you had told me that one day we would carry these devices in our hands, that with a swipe of your hand, could conjure up images and sounds from all over the world. I would have thought witchcraft and sorcery. But here we are. Not only does the technology exist, but there seems to be this strange obligation to constantly interact with it. There is this constant pressure that young people have to be online. And as I’ve learned this season, it can be particularly challenging for girls. In fact, for young people like my daughters and Emma, it can be a double whammy feeling the pressures of being a young woman and being a young woman online.
Emma Lembke
00:06:14
I consistently could see how many people liked my video, how much social feedback I was getting, how many people commented, what did they say, how many likes and followers do I have, is my follower to following ratio off.
Dr. Sanjay Gupta
00:06:28
Then one day when she was in the thick of it, all of a sudden something changed in her. It was like the flip of a switch.
Emma Lembke
00:06:36
I remember I heard the buzz of my phone, probably a Snapchat notification, something trying to pull me in, and I instantly had that Pavlovian response to grab for it. And it was in that response, in the millisecond between that buzz and my grab, that I finally hit my breaking point. And I asked myself, how am I allowing these apps to have so much control over me? So I started looking up things from, is social media bad? To, what design features are employed to keep me scrolling? And what I found was a plethora of data and studies and articles seeking to investigate social media’s role in fueling the mental health crisis of young people and in harming young people. And what I found most jarring was the lack of youth representation and youth voices in those discussions.
Dr. Sanjay Gupta
00:07:37
That was Emma’s call to action. Emma claims that there’s something about the way social media draws you in and doesn’t let you go, something that feels manipulative by design. In fact, the idea reminded me of my conversation with Catherine Price earlier this season. If you remember, she had a similar take. Her take is that a lot of these apps are designed to mimic slot machines in order to keep you hooked, in order to keep you pulling the lever again and again and again. Emma simply doesn’t think this is right. In fact, she believes that it’s time for a redesign and she believes that the responsibility to fix it should come from, you guessed it, those who designed it in the first place.
Emma Lembke
00:08:24
That burden should not be on the individual, and it should not be on the parent to hold. I realized that it was not my fault that I was so anxious and depressed as a young person online. It was the fault of designers and of companies who developed addictive technologies and platforms that they knew were harming me and did nothing. For instance, looking at Frances Haugen’s whistleblower report: Instagram knew, Meta had a very real understanding, that for one in three girls, their platforms made their body images worse. And that was kind of the moment when I began to understand that the burden was completely flipped, that I knew I had to engage in top-down advocacy efforts.
Dr. Sanjay Gupta
00:09:20
So she started out small, looking at any legislation that could make a difference. And at the age of 17, she was advocating for changes to policy at the state and then the federal level. Behind all of Emma’s work is this idea that we cannot legislate on behalf of young people without giving young people a seat at the table.
Emma Lembke
00:09:42
Oftentimes I think young people, they do not think that legislative action will effectively solve this issue. One, because of how slow and methodical and chaotic and complicated it can be. But two, because you, in the media, get clips of senators asking, “How do you make money, Mr. Zuckerberg?” You know, like these basic questions that you would think, oh no, as a 12-year-old, I could answer that. What are you doing? You’re the person who’s supposed to protect me. But in going there, in going to D.C., I have become so optimistic, because I see these senators working tirelessly to understand these issues by bringing in young people like me.
Dr. Sanjay Gupta
00:10:32
No doubt. Emma is a remarkable young woman. She’s inspiring, really. She’s right at the age where most people are figuring out who they are, what they want out of life, and just kind of having fun while doing so. And the fact that she’s dedicating these prime college years to fighting for a future that is better for her and other young people. Well, it gives me a lot of hope for the younger generations. I think about this a lot. So many times we have handed our kids a world they did not necessarily desire. Whether it be climate change, civil strife, deep political polarization, and, yes, an inescapable digital world. And each of those times I have seen young people rise up and tackle some of these huge issues. It kind of makes me feel that despite it all, the kids will be all right. It isn’t just talk either. Emma has some specific ideas about ways to improve social media.
Emma Lembke
00:11:32
We have to address two fundamental things. One, we need to increase algorithmic transparency.
Dr. Sanjay Gupta
00:11:38
What does that mean? Well, for Emma, she saw how autoplay would lead her astray. For example, how searching for something as simple as healthy meals led to suggested content pushing her toward disordered eating. It is just so strange, but disordered eating content would fire up the emotional centers of your brain and get you to stick around longer. Candidly, it’s pretty sick and Emma does not want anyone to go through that. She says it’s important for the public to know just how these algorithms work and how they end up recommending content that isn’t good for them.
Emma Lembke
00:12:17
Then I think the second thing we have to address is how do we build in safeguards to protect young people? So it’s looking at regulation via design features, such as stopping autoplay by default, stopping or banning targeted advertising to young people that would feed them those harmful pieces of content. And it is also looking at features allowing minors to be direct messaged by adults that they don’t know. And I think in investigating how to put into place those two things, safeguards and algorithmic transparency, we will see the most effective solutions emerge.
Dr. Sanjay Gupta
00:13:02
Remember, again, Emma is young. She’s 20 years old, but she has lived her life fighting these battles. It’s why these ideas are such a good start. But here’s the thing: too often, great ideas for change run into a legal brick wall. Coming up: why one of those walls could soon come crumbling down.
Brian Fung
00:13:26
What a lot of tech platforms are worried about is if Section 230 has been a shield against lawsuits for 27 years and the Supreme Court decides to make that shield smaller, that kind of by definition, means they’re exposed to more lawsuits.
Dr. Sanjay Gupta
00:13:44
We’ll be right back.
Dr. Sanjay Gupta
00:13:54
And now back to Chasing Life.
Dr. Sanjay Gupta
00:14:00
If you’ve been paying attention to the news in the past couple of weeks, you might have come across a story or many stories about the ongoing legal battles social media companies are currently facing. Like many of you, this topic is one that really hits home because at the center of many of these legal brawls is the very safety of our children. But the truth is, some of the legal lingo can feel confusing even for me. So I decided to sit down with one of my colleagues who lives and breathes social media and tech.
Brian Fung
00:14:32
This stuff can be really challenging to study in any sort of systematic or rigorous, you know, scientific way.
Dr. Sanjay Gupta
00:14:40
That’s Brian Fung. He’s a CNN tech reporter based in Washington, D.C., who’s been on the technology beat for years. I wanted to ask Brian a bit about some of the questions that Emma brought up earlier. Like, why do those algorithms feel manipulative, and the reality slash possibility of holding social media companies accountable? Interestingly, Brian says the answer partly lies in a provision of an old piece of legislation called the Communications Decency Act. You might have heard of it already: Section 230. You know, I read one of the articles you wrote recently about Section 230, and the headline that you put on there really kind of got my attention. It said how the Supreme Court could reshape the Internet as you know it. It’s a big deal. I mean, it sounds like a big deal. I wonder if you can just spend a minute, Brian, telling us about Section 230.
Brian Fung
00:15:35
In the early days of the Internet, there were a number of court cases about whether or not Internet portals could be sued for the content that some of their users put on their platforms. And the courts came down with kind of conflicting answers. And so Congress saw what was happening and said, hey, we don’t want to have this legal mishmash threaten to choke off this potentially really important economic tool before it has a chance to take off. So what we’re going to do is create a special category of legal immunity for websites and people who use the Internet so that they don’t have to be legally responsible for the speech that others put on the Internet. Tech companies, ranging from AOL to Facebook to Snapchat to Twitter, have all used this law to say, hey, we can’t be sued for content that our users post online. And by the way, we can’t be sued also for deciding to moderate content in the ways that we see fit, which is another critical piece of this legal immunity shield that has kind of come under attack in these court cases that the Supreme Court is now considering.
Dr. Sanjay Gupta
00:17:00
Look, I know that a lot of this can feel dense, but it’s hard to overstate just how big these court cases are. Legislative action can be very slow, but when the Supreme Court hands down a ruling, change can then come very quickly. And there’s one case in particular Brian’s been paying close attention to.
Brian Fung
00:17:20
It’s called Gonzalez v. Google. And the case is all about whether YouTube can be sued for algorithmically recommending videos created by the terrorist group ISIS to YouTube users. Historically, 230 has really, you know, been focused on what individual users post on social media sites. But now we’re getting into content recommendation algorithms: are algorithms protected by Section 230? And this question of whether algorithms are protected by 230 is really important at a time when more and more companies are relying on AI to curate and moderate content. So the outcome of this case could determine how many lawsuits companies like Facebook and Google and YouTube might have to face if their users post something that another user doesn’t like.
Dr. Sanjay Gupta
00:18:24
Okay, again, I know that sounds like a lot. And you may be wondering, why do I have to care about how companies moderate content? Why? Because it strikes at the heart of how most of us use social media. Think of it like this: the algorithm recommends things. We interact with those things. It learns what we like, what we don’t like. But what happens, then, when the algorithm spits out something dangerous or objectionable and someone then retweets it or likes it? Should they be held responsible for the damage that causes?
Justice Amy Coney Barrett
00:19:03
Okay. Let me ask you this, I’m switching gears now.
Brian Fung
00:19:07
So then you heard this in a really important exchange between Justice Amy Coney Barrett and one of the lawyers for the plaintiffs who are suing Google.
Justice Amy Coney Barrett
00:19:18
Let’s say I retweet an ISIS video. On your theory, am I aiding and abetting? And does the statute protect me, or-
Brian Fung
00:19:26
Justice Barrett asked, Hey, if I go to Twitter and I retweet something, under your legal theory, you’re saying I’m not protected by Section 230. And the lawyer responded, Yeah, that’s content you’ve created.
Justice Amy Coney Barrett
00:19:43
That’s content I’ve created. Okay.
Brian Fung
00:19:45
And the implication being that if you’ve created this content, then you’re potentially liable for it, and you’re not simply just passing on someone else’s content, even if it’s just a retweet. That’s a really big deal.
Dr. Sanjay Gupta
00:19:57
Just by retweeting something, you are now taking some ownership of that content and you could be held liable for that content.
Brian Fung
00:20:06
That’s right. And so what a lot of tech platforms are worried about is, if Section 230 has been a shield against lawsuits for 27 years and the Supreme Court decides to make that shield smaller, that kind of, by definition, means they’re exposed to more lawsuits. Facebook could have to stop ranking content in its feed, or Google or YouTube might have to stop fighting spam. And so that’s kind of where these arguments are coming from. It’s coming from, how do we avoid litigation? And potentially the answer is, stop moderating content so that you can’t be accused of knowing that it exists, or moderate everything so that you can’t be accused of not doing enough.
Dr. Sanjay Gupta
00:20:53
This hits on, I think, so many sectors of society. And, you know, I think there’s a relevance to everyone. But with regard to lawsuits over this association between these platforms and mental health, are there lawsuits out there that you think have traction?
Brian Fung
00:21:10
You know, partly because of Section 230, a lot of the lawsuits that have been brought in relation to tech and mental health have actually not gone anywhere.
Dr. Sanjay Gupta
00:21:21
And this is key. Remember, in our last episode, and then again earlier with Emma, we heard about the ways that this law has allowed for an online space where misinformation or malicious content can exist, be easily distributed, and then accessed by people in vulnerable moments. Section 230 is a big reason why social media companies have been shielded from legal action, even by families who’ve been harmed by this content.
Kristin Bride
00:21:49
This is my son, Carson Bride, with the beautiful blue eyes and amazing smile and great sense of humor, who will be forever 16 years old. As-
Brian Fung
00:22:02
A good example: there was a congressional hearing a few weeks ago where a mother whose son committed suicide after being bombarded with bullying messages on Snapchat tried to sue the company.
Kristin Bride
00:22:16
After his death, we discovered that Carson had received nearly 100 negative, harassing, sexually explicit and humiliating messages, including 40 in just one day. He asked his-
Brian Fung
00:22:29
And the company was able to have that lawsuit dismissed because they cited 230. So you see a lot of these types of interactions pretty frequently, which is why you’re seeing policymakers, you know, particularly on the left, calling for changes to Section 230 so that these platforms might have to face more of these types of lawsuits.
Dr. Sanjay Gupta
00:22:56
We are going to eventually get some clarity on the fate of Section 230 from the Supreme Court. And Brian says we’ll likely get a ruling in Gonzalez v. Google by the end of June or early July. But for now, lawmakers seem to have focused their sights on another social media platform, one that I’ve been talking to my own daughters about all season: TikTok.
Dr. Sanjay Gupta
00:23:20
By the way, there’s a lot of discussion lately about banning TikTok in the United States. You’ve probably seen some of-
Soleil Gupta
00:23:26
I think that’s pretty stupid.
Dr. Sanjay Gupta
00:23:28
Well they’re worried about the whole junk food thing and they’re worried that-
Soleil Gupta
00:23:30
You know what all the teenagers are going to do? They’re going to set their phones so that they live in Canada and they’re still going to use Snapchat or TikTok.
Dr. Sanjay Gupta
00:23:37
They’ve already figured it out, haven’t they?
Soleil Gupta
00:23:38
Oh, yeah.
Dr. Sanjay Gupta
00:23:40
Now, I got to tell you, Soleil isn’t the only one with strong opinions on this.
Jim Sciutto
00:23:44
Well, there is a new push in Congress for something, and that is to ban TikTok from operating here in the U.S. Several-
Poppy Harlow
00:23:51
Senators are citing fears that the app could be used to spy on Americans by foreign adversaries like China.
Victor Blackwell
00:23:56
TikTok just announced that it will set a daily screen time limit for users under 18. This is one of the most aggressive-
Dr. Sanjay Gupta
00:24:03
TikTok is hugely popular. It was the most downloaded app in the entire world in 2022. But it’s also very polarizing. So I wanted to get to the bottom of it with Brian: just how worried should we be about TikTok?
Dr. Sanjay Gupta
00:24:22
I don’t really use TikTok. I don’t have it on my phone anymore. But I know from my daughters, and I know by following your reporting, that at the same time, TikTok is hugely influential. It’s controversial, but it’s hugely influential. And there’s a lot of news that’s coming out. We know that Canada is banning it on government devices. Certain states have done that in the United States as well. What do you think is driving that with TikTok specifically? Is there something fundamentally different about TikTok?
Brian Fung
00:24:51
Yes. So the concerns driving the TikTok conversation are twofold. One is kind of the mental health conversation you and I were having earlier. That’s not unique to TikTok. And then the other, which is unique to TikTok, is the fact that the company is owned by a company with a significant presence in China. And what U.S. policymakers have said, and increasingly also European and Canadian policymakers, too, is that China has national security laws that force companies with, you know, a presence in China to cooperate with military and intelligence activities. And the fear is that TikTok U.S. user data could find its way into the hands of the Chinese government, because the Chinese government has leverage over TikTok’s parent company. There’s no evidence that the Chinese government has actually accessed this information or used it for intelligence purposes. But if you talk to cybersecurity experts, the potential is there.
Dr. Sanjay Gupta
00:26:05
And I’ve heard that myself over and over again. I have friends in the cybersecurity world who told me to just completely remove the app from my phone because of that potential. But I did want to ask Brian about TikTok’s response. How do they respond to these accusations that their app is a threat to cybersecurity?
Brian Fung
00:26:24
So TikTok has acknowledged some of the concerns that U.S. government officials have raised about the platform, and they’re currently working with the U.S. government on a potential national security deal to mitigate some of those concerns. One step they’ve taken voluntarily is to move the data that it holds on U.S. citizens from servers that it controls in Virginia and in Singapore to third-party servers controlled by Oracle, a U.S. company. And it’s also working with Oracle on policies and procedures that it says will be able to keep TikTok accountable for ensuring that, you know, no U.S. user data is exposed to the Chinese government. So TikTok does seem to be working to meet the concerns of the U.S. government. But I think there are still many lawmakers who, you know, are skeptical that this deal, if it is approved, will be enough to realistically address the risk.
Dr. Sanjay Gupta
00:27:34
When we talk about this for government employees, I guess I can kind of understand it, this idea that they may be dealing with sensitive material or whatever, and they want to be careful. But what about for my daughters? Should my daughters also not be using TikTok for some of these same reasons?
Brian Fung
00:27:54
The concern about TikTok for use by non-government officials has largely to do with concerns about misinformation and disinformation. So if China has access to the inner workings of how TikTok’s algorithm works, could it then, you know, influence TikTok, you know, through its parent company to show you things that benefit Chinese foreign policy or Chinese strategic policy at the expense of U.S. strategic policy? And could that mean election interference? Could that mean, you know, sowing discord among Americans? And then, of course, many young people who use TikTok are themselves someday going to become government officials. And, you know, could the history, the sum total of information that people are allowing themselves to put out there, could that be misused in ways that don’t serve their interests?
Dr. Sanjay Gupta
00:28:56
The concerns about China specifically here, and having influence over the company that owns TikTok, you said there’s no evidence that they’ve been accessing this data, right? But do you think that there’s real validity to these concerns, or is this more anti-Chinese sort of rhetoric, which we’ve heard a lot, you know, during the pandemic? And, you know, the posturing has seemed to reach critical levels sometimes. Is this part of that? Do you think it’s getting wrapped up in that, or are there valid concerns here?
Brian Fung
00:29:25
You know, this is a question that I’ve been grappling with at a very personal level. I’m Asian American myself. My family is Chinese. And so it’s often hard to tell where one ends and the other begins. I think it’s undeniable that there is a legitimate national security risk; that much has been established. The question is what you do with that. And I think particularly now, as you know, historically, the U.S. is in a kind of competition mode with China, and so it’s low-hanging fruit at the moment to take a tough stance against China. And I think that there’s probably some blurring of the lines between what is very much a legitimate national security concern and being critical of China in ways that yield political dividends.
Dr. Sanjay Gupta
00:30:24
I think that’s a pretty helpful explanation, because we do hear so much shouting about this. I think it’s important to lay out what we really do know for sure. Yes, there are concerns that China could one day gain access to U.S. user data or influence TikTok users for political purposes. But to be fair, so far there is no evidence that they’ve actually done this. As with so many of these issues, I just want my daughters to simply be aware of the potential dangers so that they can use these tools more safely. You know, again, when I talk to my daughters, I’m often reminded, Brian, that this is not the world that they wanted. This is the world that we handed them, you know. And now it’s a question of, are there things that can be done to make it safer for young people? If you accept this fact that there are some dangers here, are there pieces of legislation that you’ve seen out there that have merit, that you think could actually make an impact on making the Internet safer for young people?
Brian Fung
00:31:26
There’s no national federal data privacy law. What you do have are various state-level laws, California being the toughest, that govern how companies can handle user data. And so the fact that we don’t have a national federal privacy standard is kind of a glaring missing piece that U.S. lawmakers have been trying to fix for many, many years.
Dr. Sanjay Gupta
00:31:54
When you look at the United States, is this country different than other countries? Leave aside China; maybe talk about European countries or the U.K. There was this U.K. coroner’s court that ruled that Instagram and social media companies were to blame for a teen’s suicide. I think that was last year, a first-of-its-kind ruling. TikTok is also under fire in the U.K. for allegedly failing to protect children’s data. We’re hearing about all these sorts of things in the U.K. That’s accountability, it sounds like, and it’s happening overseas, but not here in the United States. Is it just not happening yet? Do you think that’s a harbinger of what’s to come here? Or is the United States different?
Brian Fung
00:32:36
It’s absolutely clear that the U.K. and the EU have been out front on a lot of these issues. And a big part of the reason is just the way that regulation happens in those places. In the European context, there’s much more of an expectation that the government will impose rules ahead of time rather than try to enforce rules, you know, after the fact. Whereas in the United States, the legal regime is kind of flipped, where corporate accountability happens generally after things go wrong. And there are some signs that that may be beginning to change. And you have, you know, policymakers in Congress pointing to, you know, laws that the EU has passed or using, you know, those laws as a reference for how we craft our own laws. So there’s definitely a lot of cross-pollination happening. The question is whether or not that can be enough to overcome the kind of sclerotic political divisions that affect our government right now.
Dr. Sanjay Gupta
00:33:50
I usually try and steer clear of politics. That’s not what I do. I’m a scientist. I’m a doctor. I’m a journalist. But this goes beyond politics. These kinds of lawsuits or legislative fixes are going to have a direct impact on all of us. They’re going to have a direct impact on my children, all of our children, their futures. And, you know, we all need to care about that. I know it can feel especially frustrating, because there’s a lot we don’t know when it comes to the future of social media. For example, we don’t know at this point if lawmakers can come together to rewrite Section 230. We don’t know if social media companies will make meaningful changes, like Emma Lembke is asking for. And like we’ve been saying all season, we don’t know the full scope of how screens and social media impact our health. A lot of that work is still currently underway. And that is why next week we’re going to head to Seattle and take you inside a lab that is at the forefront of this amazing cutting-edge research into how screens impact early childhood development. Thanks for listening.
Dr. Sanjay Gupta
00:35:06
Chasing Life is a production of CNN Audio. Our podcast is produced by Grace Walker, Xavier Lopez, Eryn Mathewson and David Rind. Our senior producer is Haley Thomas. Andrea Kane is our medical writer and Tommy Bazarian is our engineer. Dan Dzula is our technical director. The executive producer of CNN Audio is Steve Lickteig. And a special thanks to Ben Tinker, Amanda Sealey and Nadia Kounang of CNN Health and Katie Hinman.
Source: www.cnn.com