November 14, 2023
America Dissected
Public Health vs. The Internet: LIVE from the American Public Health Association Annual Conference in Atlanta

In This Episode

America Dissected comes to you LIVE from Atlanta at the American Public Health Association Annual Meeting. Abdul reflects on the ways that the internet is fundamentally reshaping the way we think about place, and its impact on public health. Then he sits down with Ian Bogost, professor, video game designer, and contributing writer at The Atlantic.

 

TRANSCRIPT

 

[music break]

 

Dr. Abdul El-Sayed: All right. Good evening, APHA! [applause] I am so excited uh to be here with you. I’m also super excited to be welcoming our guest today uh, Ian Bogost. Now, we’ll talk a bit more with Ian, but I want to give you some context for why it is that we’re talking about public health and the Internet. Now, like many of you, I’ve given my fair share of public health lectures. And usually when I give those lectures, I try to start from baseline first principles. So what is public health? And I know everybody in this audience is real quick. We’re always like, well, it’s not medicine, okay? It’s different. [laughter] Public health is what we as a society do collectively to assure the conditions for people to be healthy. Now, all of us have heard that definition. We’ve probably heard it many, many, many times, can repeat it ad nauseam. But I want to break it down quickly, because I want to explain why it is that I think we’re getting the Internet wrong. Public health, of course, is not about individual action. It is about collective action. It’s about all of us acting in concert to do a thing. You can’t do public health for individuals. Instead, what you do is public health for communities, for society. And then the last part is that doing public health is about what you do collectively to change the conditions in which communities live. Now, when we think about conditions, we start from the most tangible: the very air we breathe, the water we drink, the sidewalks, or lack thereof, that we walk on, the nature of our public transit. But then we get more esoteric: the nature of laws, whether or not you have a chance to help elect your public officials, whether or not your speech and your bodily autonomy are protected. We all understand this when it comes to what it is that we do. But I want us to think a little bit about what kinds of contexts we take seriously in public health. And that’s where I think we may be failing. See, when we think about context, most of the time we limit the contexts that we admit into public health action to tangible contexts. Places: you should be able to see them and feel them and walk into them. The problem, though, is that I think our definition of context may be too limited. How many of you all here remember a time before you had Internet in your home? All right, that’s most of us, right? I remember that time. We didn’t get Internet in my house until we had America Online in 1996. I was 13 years old. I had never bothered learning how to type until I was trying to AIM message my friends. But I remember that you would walk away from the Internet. I want you to raise your hand if you could access the Internet within 3 seconds right now. Here’s the thing about that. When was the last time somebody told you that you should practice presence? That you should really think about how to make sure that you’re present in the space and the moment that you’re in, that this would be really good for your mental health. Heard that advice before? Why is it that we have to give that advice? How many of us are not actually present in the place that we’re in? Well, none of us and all of us. And that’s exactly it: when we think about being present in your current context, we usually think about trying to keep your mind focused on the place you’re inhabiting. And the reason we say that is because it’s so easy to inhabit what? Another place, and that’s the place in your phone, on the Internet. 
And I think we assume that the Internet of today is just like the Internet from 1996: that it’s a place that you can choose to go to, that maybe you can interact with some of your friends, that it can give you certain kinds of information, and that that’s all it’s limited to. I think that we’re missing out on the opportunity to think through the broader impact of the Internet of today on people’s health. Because here’s the thing. I’m coming to you as somebody who’s about to turn 40. Mmm? And I barely remember a time before the Internet. I think about my siblings. One born in 1992, the other born in 1999. For them, there was not a time before the Internet. And for that reason, the ways in which the Internet space has started to take in, has started to conquer, has started to color our ability to be engaged in the real space mean that we fundamentally have to be asking about the public health implications of the Internet. And I say this in 2023, at a time when the Internet is about to undergo some really profound changes. So today, yes, we’re going to talk about mis- and disinformation. We’re even going to talk about the ways that the Internet can enable us with real time data. Those are all great. But I want you, for the course of this discussion, to take seriously the premise that the Internet is a place like other contexts, and that if we’re serious about engaging some of the biggest challenges of our time, whether it’s teen mental illness or the fabric of our civil society, then we need to start thinking a bit about how it is that public health ought to act on this context we call the Internet. We’ve got an incredible guest for us today. First, I’ve got to read a couple of messages from our sponsors. [music break]

 

[AD BREAK]

 

Dr. Abdul El-Sayed: With that, I would love to introduce our guest today. Ian Bogost is somebody who’s been thinking a lot about the Internet as context. He’s even designed part of it. Ian Bogost is a video game designer, professor and author. He’s written ten books, including How to Talk About Videogames. His work can also be found at The Atlantic, where he’s a contributing writer, and he’s a professor in arts and sciences, film and media studies, and computer science at Washington University in St. Louis. Please give a warm APHA welcome to Ian Bogost. [applause] [music break]

 

Dr. Abdul El-Sayed: So Ian, how many uh public health conferences have you ever been to? 

 

Ian Bogost: I have been to one. 

 

Dr. Abdul El-Sayed: How are you enjoying yourself? 

 

Ian Bogost: So far, so good. 

 

Dr. Abdul El-Sayed: So are we, are we the funnest people you’ve ever been around? 

 

Ian Bogost: You are the funnest people I’ve ever met. [laughter]

 

Dr. Abdul El-Sayed: Maybe you’ve got some boring friends first of all. [laughter] But also, like, we’re a whole community founded on telling people what not to do. 

 

Ian Bogost: Yeah, it’s kind of a drag. It’s like IT. 

 

Dr. Abdul El-Sayed: Yeah. A little bit. A little bit. [laughing] But um [laugh] I want you to just think a little bit about the definition that I shared about public health. And you’ve designed digital spaces? 

 

Ian Bogost: Mm hmm. 

 

Dr. Abdul El-Sayed: I want you to think a little bit about how you think about designing them. Do you think about them as real spaces, and do you think about the ways that people are going to interact with this space vis-à-vis their health? 

 

Ian Bogost: So the spaces are real. Digital spaces are real, but they are distinct from other kinds of spaces. We went through a couple kind of [?] with ourselves on this over the years. First, we thought there was the real world and there was the online world. And when you’re in the online world, there’s something kind of wrong with you. You know, um like you’re in your basement too much, or you’re dialed into Prodigy or America Online and, you know, mom wants to use the phone, or you’re gaming with your friends, or whatever you’re doing. But now all of us are in those spaces all the time, as you pointed out. And so it feels as though they’ve taken over the world. But they are still distinct. Like, those of us in this room today are in this room. And those of us who can also be on the Internet with one another, you can text your friends or your, you know, your parents or whatever. Uh. You’re in a different space. So that can be very confusing, but it’s really no different when you think about it. It’s no different than opening a book or turning on the television. We’re used to moving between what we call mediated experiences, and the Internet is that. But it’s also weirder than that, because you use it to do things. You use it to do your banking, you use it to go to the doctor, you use it to find information, to buy stuff. And increasingly, we do a lot more things through it. So it’s made it even more confusing to think of them as separate. 

 

Dr. Abdul El-Sayed: I think one of the places for me where the rubber hits the road when it comes to the role of the Internet space is in how we think about community. And I think what’s been really challenging in thinking about and watching particularly younger people interact, for whom the Internet has displaced traditional community spaces, is that even if you wanted to opt out as a young person, I don’t actually think that’s possible, because so much of that interaction is mediated online. Like, you walk into a lecture hall before lecture, and I remember being in college and I’d be sitting there: either I’d have a group of friends with me and we’d be talking, hanging out, or I might try and strike up a conversation while we were waiting for lecture. 

 

Ian Bogost: Sure. 

 

Dr. Abdul El-Sayed: Today, everybody’s literally on their phone and, like, that’s the normal thing. And if you tried to strike up a conversation, people would look at you kind of weird. So how is it that the social feature of the Internet has started to, I guess, admix, almost steal from, a lot of the social features in actual lived space? 

 

Ian Bogost: There’s a lot to say here. There’s a lot to unpack. One feature you’re observing is that not just young people, but people of all kinds don’t have other options. Where are you going to go to do things? It’s true if you’re a young person because, you know, we coop you up. You don’t go out and, like, ride your bike or hang out with your friends. And there are sort of these stereotypes of what that looks like. But it’s true in part because you can do it online. You can dial up, and you’re having true social experiences when you play Fortnite with your friends or what have you. It’s not fake. It is real, but it has displaced other forms of socializing. Um. Now, that said, the um the idea that you could just put your phone away, like for this session, maybe you just don’t need to look at your phone if you’re in the audience, or, you know, sometimes you’ll go out to brunch or something with a group and they’re all like, let’s just all put our phones in the center of the table and, you know, let’s be present with one another. It’s simply not the world that we live in anymore. And we can’t pretend as though we’re not being drawn constantly back to those devices. There are companies worth trillions of dollars whose sole purpose is to get you to pick up the phone again that’s in your pocket or purse right now, or that’s in the center of the table. And it’s simply not realistic uh to kind of wag our fingers at it uh as an undesirable activity, as something that’s just like a bad habit, like biting your fingernails. 

 

Dr. Abdul El-Sayed: I really appreciate that point because there’s there’s a couple of features I want to draw out here. The first is it gets to a couple of different modes of of public health action. And anybody who listens to the show knows that one of my big frustrations with what public health has become is that we have ceded this regulatory–

 

Ian Bogost: Right. 

 

Dr. Abdul El-Sayed: –engagement space, and we have limited ourselves to being, at best, informers of risk, with some value-laden proposition attached to that. And we use that value judgment oftentimes to try and coerce people about what they should and should not do, without thinking about the broader structure within which they do that thing. So we tell people, you should eat well and exercise. 

 

Ian Bogost: Sure. 

 

Dr. Abdul El-Sayed: Well good that’s great advice. 

 

Ian Bogost: It is. 

 

Dr. Abdul El-Sayed: Except for if you live in a community that was designed specifically so that you need a car to get around in it, right. It becomes a lot harder to go out of your way to get exercise. 

 

Ian Bogost: That’s right. 

 

Dr. Abdul El-Sayed: If we don’t sell healthy foods and we’ve corrupted our food environment, it’s a lot harder to eat well. And so we’re in this situation where, when we talk about the health implications of phone use, of social media, the challenge is we’re swimming against the tide. And we’ve sort of not thought about, or failed to appreciate, the fact that in the same way, we are working against a set of corporations who have made, like you said, trillions of dollars monetizing eyeballs and eardrums so that we always pick up the phone. 

 

Ian Bogost: Although there’s another side to it too, which is that maybe it’s okay, maybe it’s even good, at least some of the time. And maybe most of the time. I was thinking about this this year. I was trying to remember what people did, what I did, because I remember before phones and the Internet. What did I do during all those times when I would be looking at my phone now? When you walk into the classroom, you’re waiting for your coffee, the bus hasn’t come, the session hasn’t started, and you need something. What would you have done? And what I realized is we were really bored. [laughter] It was super boring. You’d go to the dentist’s office or something and you’d read through the magazine from, like, three months ago that was on the counter. And then when it was done, or when you’d seen everything you wanted, you’d stare at the clock or you’d read the pamphlet that they handed out to you. People would read shampoo bottles just for something to do. Uh. Or you know how you’re working a boring job and waiting for the shift to end. Nothing to do. And now you can get information, sure, you can get misinformation, but you can connect with people. You can talk to your spouse or your friends or your parents. Um. You can read; people read more than they ever have before. So the activities that we’re partaking of when we use those devices aren’t just kind of compulsive. It’s not just, oh, I’m taking part in this compulsion. That’s what I’m doing. It is that, don’t get me wrong. But it’s more than that, too. 

 

Dr. Abdul El-Sayed: So I think, you know, I think the analogy that I want to go back to then is food. All of us need to eat food. The challenge becomes: what happens when our need to eat food is weaponized around creating foods that are artificially cheap, that are not as nutrient rich as they ought to be, and that are designed specifically to compel us to eat them? Right. And that’s the aspect of, I think, nutrition that public health would look upon and say, that’s a real problem. And I guess the question I want to ask you is, we have an Internet, right? An Internet 2.0, we’ll say. Right, Internet 1.0 was AOL. You know, me, 13 years old in ’96. 2.0 was me in 2007 on Facebook. Right. And we’re about to get to 3.0, which is this AI media, and we’ll talk a little bit about that. But part of the problem here seems to be the Internet unto itself is not a problem. You’re right. Like, it allows us to do all kinds of great things. I don’t think any of us, you know, would want to go back to a world where if you wanted a piece of information, you literally had to go to a library. Libraries are amazing places, by the way. But where you’d have to go to a library, talk to a librarian, find an actual physical journal, sit there and read the thing, maybe photocopy it, right. We don’t want to go back to that world. The problem, though, is that it’s a series of incentives. I’d love to hear a bit about, you know, thinking about path dependency, given that we’re about to be at another inflection point. What are the set of choices that led to the Internet that we have, that tends to be compulsive, that tends to um, you know, pull us? It’s not just– 

 

Ian Bogost: Yeah. 

 

Dr. Abdul El-Sayed: I’m hanging out, like, waiting for a doctor. It’s, I’m with the person I love the most in my life and we’re having a lovely dinner, and I want to check my phone because my fingers are getting, right? 

 

Ian Bogost: Right. 

 

Dr. Abdul El-Sayed: That’s the issue. 

 

Ian Bogost: Yeah. And this is a long history. So let me see if I can go through it rapidly. Um. It’s a history of trying to solve a problem of access and trying to solve a problem of scale, and of facing the consequences of those solutions. So back way before the Internet, in the Cold War, there was a dream of making information accessible, kind of an officialist dream. A guy named Vannevar Bush, who was partly responsible for the Manhattan Project, wanted to make scientific information more accessible, or at least loved the idea of making it more accessible. 

 

Dr. Abdul El-Sayed: He hated the library too. That’s what you’re saying. [laughing]

 

Ian Bogost: He came up with this concept of a physical desk that would have a bunch of um microfilms in it, and you could access all that information at your fingertips and make connections between them. Uh. And you could see what connections other people had made. And this was in 1945, when this vision was articulated in a piece for The Atlantic, actually, called As We May Think. Uh and this had enormous influence on people that came later, um on the invention of the idea of hypertext, which then influenced the development of the World Wide Web, but also on the personal computer. So one of the things that we have with the pre-history of the Internet is a desire to make information more accessible, and then a desire to make machines for accessing and managing information, a.k.a. computers, more accessible. Um. In the early days of the personal computer in the 1970s, when this idea was new, computers were the things of government agencies, of large corporations, and the folks who invented the PCs, you know, the Apple I, Steve Wozniak and Steve Jobs, and uh even the folks who made early video game systems at Atari, they saw themselves as part of the counterculture movement. They were taking control of this machinery of the future and putting it in the hands of the people. So that’s about access, right? And then it’s about scaling up that access by turning it into commercial products. And those were the PCs that we got in the seventies and eighties, and then connecting them together um via a system of publishing information that came out of the sciences. Again, that’s where the World Wide Web originated, as a tool for scientists and researchers to share information, um that then became commercialized in the Web 1.0 days, or the Internet 1.0, if you want to call it that. And uh the question then was, what if everything that you do now you could do through a computer, and you were connected to your bank and your electric company and you could look up the menu of the restaurant down the street to see if it was appealing to you? So again, you know, it’s this theme of access and then scaling up that access to more and more people. Then suddenly you could publish. You or I or anyone could have a web page, or later a blog, and you could say whatever you wanted. You could publish video or audio on the Internet, and you were, at least in principle, uh taking away the middleman. You now had direct access to the whole world, or at least the whole world that was online. And all of that was liberating in many ways. And it gave us access to one another and to information. When things really started to shift was when that practice became a kind of practice unto itself, you know what I mean? Um. A computer was typically used to get work done. You know, you were using it to do word processing or something, and then you were publishing that, in a phrase. Even the early Internet was about getting things done. I’m paying my cable bill online. You were getting something done. But that social media era, and then the smartphone era that followed on its heels, was about using computation just for whatever it was, just for the activity of using it. Right. And that’s when you started going on social media. That’s when you started just, you know, kind of using the phone to scroll through things, or even just to play games, to kind of, like, touch the apparatus, to feel what it was like to manipulate it. 
So there was that shift from using these machines and the networks on which they were built to get things done, including to connect with one another, sort of what the early days of social networking were all about: you and I could keep in touch even if we didn’t live in the same town, even if we didn’t call each other on the phone every week. And then, well, now everyone’s a publisher, and we want them to speak as much as possible to as many people as possible. That’s the shift from social networks to social media. That’s when everything changed, and that’s when it became a game of um maximizing content, as we now call it, maximizing engagement, monetizing that engagement, uh and turning it into not just a kind of platform for speech or for activities, uh but for attention, um and a kind of way of life. 

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: So that’s the area to really focus on. Um. If you wanted to find remedies akin to those that you might analogize to in food, in the food chain, it’s there. It’s the quantity of information. It’s the speed at which it can be delivered. It’s the uh the number of people that can see it all at once. Um. And you put all those things together, and it’s just no wonder that it’s explosive. 

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: In a bad way. 

 

Dr. Abdul El-Sayed: So, you know, it’s funny because the analogy for explosive content on the Internet is to a virus. 

 

Ian Bogost: Mm hmm. 

 

Dr. Abdul El-Sayed: And everybody wants to, quote, “go viral.” I kind of always want to remind folks, like, going viral is not a good thing, right? In public health, it’s essentially a bad thing to go viral. 

 

Ian Bogost: Oh, it’s, I mean, Douglas Rushkoff is the guy who came up with that idea back in the nineties. I believe he called it an idea virus. And it was absolutely, in his words, articulated as a bad thing, because viruses are bad. It got completely contorted. 

 

Dr. Abdul El-Sayed: Mmm. 

 

Ian Bogost: Yeah. 

 

Dr. Abdul El-Sayed: I want to think a little bit, though, about the ways that specifically a couple of very, very big corporations cemented in place this kind of marketplace. And I think one of the challenges that I’ve had, particularly as we think about public health and our engagement with the Internet, is that so much of the time, when you ask folks about the impact of the Internet on public health, they’ll usually say one of two things. One good thing, which is, wow, we have so much information at our fingertips. We can collect such great data. And I think those things are true. And I want to ask a little bit more about that, particularly as it comes to AI. But the second is it has become an engine for the proliferation of mis- and disinformation. And I guess the point that you’re making here, and I want us all to think a bit about this, is it did not have to be that way. But it became that way as a function of the kind of marketplace that was built and cemented into place by a couple of very, very large actors. And I think we missed the opportunity to be thinking about all of the externalities of that kind of action, of, like, tech as a potential public health bad. 

 

Ian Bogost: Oh yeah. 

 

Dr. Abdul El-Sayed: And I want to think a little bit about that with you. 

 

Ian Bogost: Yeah. No, we completely missed it. Uh, the intervention had to happen sometime between 2004 and 2008 or so. Um. And in the intervening years, you got social media in all its forms: YouTube, Facebook, um and um you know Twitter, eventually Instagram. Actually, Instagram, I think, is really the moment when it cements itself; I’ll get back to that. And then the smartphone was further [?], so that was the moment for some kind of regulatory intervention, if that’s what you’re referring to. But yeah, we uh, that time came and went, um and you know, the Obama administration uh was kind of a big booster of tech, actually, during that time. So it’s not as though uh the uh you know, the opportunity for regulatory intervention, which you might associate with that sort of administration being in power, that wasn’t enough either. We just completely missed it. 

 

Dr. Abdul El-Sayed: Mm. So thinking now. Right. I want to um ask you, and this is, you know, to all of our public health talkers in the audience. One of the things that happened during the actual viral moment, where, like, an actual virus went viral, uh is that we had come to the Internet assuming that the old rules of conversation would hold. Specifically, that being an authority or an arbiter of the science would command a platform large enough to be able to drown out, right, or otherwise communicate past a lot of the mis- and disinformation. And what we found was that because of the algorithms that had been built in that 2004 to 2008 period and continued to be honed later on, it was actually the conflict itself that drove engagement, and that created an even platform between truth and, like, full-on falsehood, which of course elevated falsehood. And I guess I want to ask you, as we think about, you know, how we engage in this Internet 2.0 moment and going into 3.0, how should we be thinking about the way in which this moment of the Internet actually privileges certain kinds of speech versus others? And how do you communicate in it? 

 

Ian Bogost: [laugh] Did y’all really think that, though? That it was [?]– 

 

Dr. Abdul El-Sayed: We did, we did. 

 

Ian Bogost: Wow. 

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: Okay. 

 

Dr. Abdul El-Sayed: Am I wrong? No. We really thought that, didn’t we? 

 

Ian Bogost: Yeah. 

 

Dr. Abdul El-Sayed: Yeah, we did. I mean, we’re, like, an earnest bunch, right? Like, we assume that, like, good always wins. And that, unfortunately, is not the world we live in. 

 

Ian Bogost: I mean, yeah, there’s so much to say about it. The um, we know that antagonism, oppositionalism, um uh bad vibes, uh that they reach further and faster than good ones, and we kind of knew that already from television and from other forms of media, but it only got amplified uh in the Internet uh age. Um. So, you know, in terms of, like, how we, since we can’t go back and recuperate that moment, we have to face the reality in which we actually live and work from that point forward. I don’t know what I would have done differently as a public health communicator had I been one during the early days of the pandemic. But what was clearly crucial was to understand, okay, the channel is going to have an effect on the message. And surely didn’t we learn that from, like, Kennedy and Nixon, you know? Like, why did we have to learn that lesson now? Uh. It’s a very simple one, and it’s that technologies change the messages that are delivered on them. And so when you say one thing one week and you say another thing the other week, but now they’ve been compressed into little soundbites that are even smaller than the soundbites you can get on network television, it is no wonder that that creates a space for that kind of argument that you’re talking about, that kind of dispute, to undermine the trust of a supposedly trustworthy actor and then to spread that mistrust further, whether for earnest or deceitful ends. We knew that was the case already. By 2020, it was known. 

 

Dr. Abdul El-Sayed: Hmm. We didn’t get the memo. 

 

Ian Bogost: Well, and, you know, it’s easy for me to say now, sitting here today, and maybe nothing else could have happened. Maybe this is a comforting thought to think. Maybe at that moment in time there was nothing we could have done better. The battle had already been lost. We got through it as best we could in that moment, which doesn’t mean that we did a good job, that you did a good job, but now we have to move forward in some way. Now, if I had to pick one thing to change, and not an easy thing to change, one thing to change, it would be to turn down the volume, to downscale where everything had been upscaled. People just aren’t meant to be able to speak this frequently to this many people. It is bad no matter what they’re saying, even if they’re delivering expert public health information. In fact, I don’t think you should be able to do that as often as you want. It has to become more precious. 

 

Dr. Abdul El-Sayed: Hmm. You know, it’s interesting, because there’s an implicit competition between two pieces of what we shared. On the one hand, the whole advent of the Internet, you talked about the Vannevar Bush um idealized moment, the whole point there was that if we could connect faster to more people, that should be a benefit for people. And we’ve gotten to a point now where the cost of communication is virtually nothing. Any of us can, in theory, communicate to everyone all the time. 

 

Ian Bogost: Yep. 

 

Dr. Abdul El-Sayed: And we have not yielded the world that we want. In fact, I worry that because of that, and because you’re right that our minds aren’t great at processing information coming to us from everywhere all the time and understanding how to leverage that in a productive way, we’ve created a world that is actually really quite hard to inhabit, particularly so for people who engage it the most. And I would argue that a lot of the teen mental health challenges that we face are a function of this, you know, weird information environment that we now occupy as a function of the Internet. I guess my question to you is, as we think about what ought to be regulated or should have been regulated, what are the pieces that we should have pushed for? And I’m asking this because we’re about to hit another inflection, and I swore we would move on to AI, I keep saying it, but we’re about to hit another inflection in which our information is not the only information–

 

Ian Bogost: Yeah. 

 

Dr. Abdul El-Sayed: –that’s being created. 

 

Ian Bogost: Yeah. I mean, the point I would make about regulatory opportunities, that is to say, what is possible and what is desirable, what’s at the intersection of what is currently possible and what is desirable. There’s a lot of talk about content-related approaches, whether it’s regulation or management. So, you know, moderation. How do we get all this misinformation offline, or how do we suppress it from being spread if it’s false? And that’s also coming up with AI; AI is sort of already self-censoring. A lot of the stuff that I mess around with when I try out these AI systems, I’m not allowed to do, because someone has intervened. Right. That’s totally wrongheaded thinking. I mean, some amount of it is going to be needed. But to think that, okay, we can continue just barraging the universe with information as much as we want all the time, and we will somehow figure out a way, a viable way, of taking out the false stuff, and, you know, false is unfortunately relative, or maybe fortunately so, because we do have many perspectives to take on different positions, to think that we will somehow be able to control the quality of information, it’s just a pipe dream. It’s impossible. So what can you control instead? Well, you can control the flow. You can control uh how much I can say, how often, to how many people. Um. These are simple ideas, but they’ve really never been tried. They’re also content agnostic. It doesn’t matter what I’m talking about if I can only say it every so often. Right. So those are the levers I would pull on uh if I had my druthers today. But it’s also um, it’s kind of too late, not just in the sense that, you know, the power and wealth of tech companies have overrun us, or that our tolerance for any kind of regulatory or legislative intervention is low. It’s also that all of us have become acclimated to this way of living. Like, do you want to be told, well, you know, maybe you’ve got your tweet for the day. Actually, you’re kind of cut off. 
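To make the “control the flow” idea concrete: what is being described here is essentially a content-agnostic rate limit. The minimal Python sketch below is purely illustrative and not part of the conversation; the window length, the post cap, and the may_post function are all invented for the example.

import time
from collections import defaultdict, deque
from typing import Optional

WINDOW_SECONDS = 24 * 60 * 60     # hypothetical window: one day
MAX_POSTS_PER_WINDOW = 3          # hypothetical cap: "you've got your tweet for the day"

_post_times = defaultdict(deque)  # user_id -> timestamps of that user's recent posts

def may_post(user_id: str, now: Optional[float] = None) -> bool:
    """Allow a post only if the user is under the cap; never inspects content."""
    now = time.time() if now is None else now
    history = _post_times[user_id]
    # Forget posts that have aged out of the window.
    while history and now - history[0] > WINDOW_SECONDS:
        history.popleft()
    if len(history) >= MAX_POSTS_PER_WINDOW:
        return False
    history.append(now)
    return True

The point of the sketch is only that such a limit throttles everyone's volume equally, regardless of what is being said.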

 

Dr. Abdul El-Sayed: Ah, that’d be great. 

 

Ian Bogost: Right. Um. [laughter] I really need it. But uh most people, they would feel censored, right, in a way. That’s probably how they would describe it. Why are you censoring me, or why is it that I can’t do this thing that I was able to do a week ago? It’s a Pandora’s box problem, you know, where you’re trying to stuff something back in. Um. But I still think it’s the most promising way forward. Just turn down the volume on everyone. And if there were less stuff to see, uh then we’d have less misinformation by definition, because we’d have less information, too. 

 

Dr. Abdul El-Sayed: One more question on this. You know, I’m a father. I’ve got a six year old and an almost one year old. And I think about the ways my kids think about the Internet. They have an incomplete picture of what this thing is. 

 

Ian Bogost: Yeah, definitely. 

 

Dr. Abdul El-Sayed: And I remember my daughter one time asking me, she’s like, hey, can you post my picture on Instagram? I was like, absolutely not. How do you know what Instagram is? And uh and she’s like, oh, my friend, right? My friend has an Instagram. I was like, your friends are six. Like, how do you have a? [laugh]. 

 

Ian Bogost: Yeah. 

 

Dr. Abdul El-Sayed: How does your friend have an Instagram? So I think a lot about how we think about um age, and particularly using age as a proxy for cognitive development, and access to this thing that even our adult minds aren’t built to fully engage with. Do you think that there ought to be a clearer, more direct effort uh to limit Internet access? Some, you know, in Utah, for example, um there were uh frankly efforts to ban access to social media for kids. 

 

Ian Bogost: Yeah. 

 

Dr. Abdul El-Sayed: And that’s one extreme, right. And there are other approaches to thinking about this. And of course, every approach to this regulation is going to have some unintended consequence. How do you think about the best way to engage? Because I’ll be honest with you, I look at young people, young people these days anyway, I think about younger folks in my life, and it’s hard not to feel like things are not okay. And so much of it comes from this difficulty of trying to balance a life in real life, a life in digital life, and have your social interactions scored and cued and monetized by some company whose goal it is to keep you there. 

 

Ian Bogost: Yeah, I mean, one of the things that Internet life has done, especially post mobile devices, is it kind of flattens all of our experiences like, like everyone has a phone and so therefore everyone can kind of do the same things that everyone else can do. And if you’re a kid, then maybe you don’t have a phone, you probably have an iPad though. Does your six year old have an iPad? 

 

Dr. Abdul El-Sayed: No. 

 

Ian Bogost: Okay. 

 

Dr. Abdul El-Sayed: I’m a if you haven’t gotten the sense like I’m a um this Internet thing, man. 

 

Ian Bogost: Mm hmm. A lot of parents wouldn’t give their kids a phone, but we’d give them an iPad, because then they can sort of, like, go watch TV. You know, it’s–

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: It’s kind of the age-old problem of what are you going to do with your kids all the time. Um. But anyway, that flattening of experience now means that we all sort of see the same stuff. We all have access to the same ideals, whereas previously you wouldn’t have. You would have had to have been in certain kinds of adolescent or adult spaces in which different kinds of topics and opportunities would have come up, whether that’s being at school or at work or at, you know, at the mall or whatever. These things were embedded in social and physical structures more than they are now. Now they’re mostly symbolic. 

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: And once they became symbolic, it becomes harder to say, well, you know, you shouldn’t have access to this system of meaning that everyone else does and that they draw value and delight from. So it’s more of an uphill battle than it used to be. But also, we seem to be against limiting or controlling anything for anyone uh nowadays. I mean, I’m familiar with the social media uh kind of legis– proposed legislation in Utah you mentioned. But there’s also been recent discourse about control of pornography online. And it’s been interesting to me that there’s been a lot of pushback to that too, to any kind of limits or controls on accessing pornographic material online, what would have previously, in a different era, seemed like fairly reasonable laws about access, um you know, modulo certain details and the legislative and kind of meta-legislative purposes, the politicking rather than the politics. But there’s been a real kind of counter-reaction to that. So I think we may have left orbit on the opportunity to control children’s experiences, or it’s much harder to do uh than it used to be. COPPA, the children’s privacy regulation from the nineties, that is, you know, when you have to, like, check the box that you’re 13, obviously did nothing. What do you do? You check the box. 

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: So one problem is that we failed at making good laws, but the gates were already broken when we put them up. You know, if your six year old went and tried to buy a pack of cigarettes, it would be, like, what are you doing? Right. Like, no. Um, but yeah, they can get on Instagram much more easily. 

 

Dr. Abdul El-Sayed: Yeah. I wonder if, as enough of us who are Internet native age, and our children age, we may have a different approach to this. And I want to move into the conversation about AI. We hosted a great conversation with Dr. Eric Topol, and for folks who haven’t listened to it, I highly recommend it. The interesting thing was, um Dr. Topol is considerably older than I am, and he is a huge believer in what AI is going to do for health and health care. 

 

Ian Bogost: Mm hmm. 

 

Dr. Abdul El-Sayed: And if you haven’t noticed already, I’m a bit of a skeptic, right? Because I kind of came up in this world, and I cannot argue that, on net, right, we are better off than we were when a lot of these tools weren’t there. So I want to jump into the conversation about AI. First, how do you see AI changing the Internet as we know it? And then I want to talk a little bit about the positives, because I know there are some, but also what some of the consequences of artificially generated information in this space are going to be. 

 

Ian Bogost: Okay. So let’s try to define AI a little bit, for the sake of our own sanity, because sometimes when you hear AI, it can mean sort of anything. In general, what I would say is that what we mean by artificial intelligence today is using large-scale information to make predictions. It’s basically about probability. So given enough data on a certain topic, even if you don’t know the answer, you can put it into a machine learning system and it can say, well, based on what we’ve seen in the past, these are sort of the likely next outcomes given a certain set of inputs. That’s oversimplifying things, but that’s sort of what it looks like. So for that to work at all, you must start with a large volume of data. Sometimes that data has been collected by, I don’t know, means that are normal means of collecting information. So, like, health data is an example of a kind of information that we can and do collect um through instrumentation, through experimentation, through, uh you know, if you have, like, a medical practice, then there’s information. You have to keep it private, all that sort of thing. But there are ways of collecting and sharing that information when it’s drawn directly from the sources, from which an [?] or [?] set of experiments, what have you. When it comes to other kinds of information, like what’s the likely next word in the sentence that I’m writing going to be, you need a much weirder dataset and a much larger one to do it. And so everything about the scale of the Internet that we’ve been talking about up until now, all of that information, everything online, everything on Wikipedia and Reddit, everything that you’ve posted to any publicly or even in some cases not publicly available resources, all of that has by now been slurped up into these language models and other sorts of models for generating novel information, as a base, a case base, from which new predictions can be made. This is important because this whole AI era that’s about to open would not have been possible without the age of social media. 
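To make the “likely next word” idea concrete: at its simplest, prediction-from-data can be illustrated by counting which word tends to follow which in a pile of text. The toy Python sketch below is purely illustrative, not part of the conversation and nothing like a real language model in scale or method; the tiny corpus and the predict_next function are invented for the example.

from collections import Counter, defaultdict

# A tiny, made-up corpus standing in for "everything online."
corpus = "public health is what we as a society do collectively to assure the conditions for people to be healthy".split()

# Count, for each word, how often each following word appears after it.
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Return the most frequently observed follower of `word`, or None if unseen."""
    followers = next_word_counts.get(word)
    if not followers:
        return None
    return followers.most_common(1)[0][0]

print(predict_next("public"))  # -> "health", because that is what the data shows
print(predict_next("we"))      # -> "as", the only follower of "we" in this corpus

Real systems replace the counting with neural networks trained on vastly more data, but the underlying notion of probability-from-examples is the same one described above.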

 

Dr. Abdul El-Sayed: The OG, OG 2.0 Internet.

 

Ian Bogost: That’s that’s right. We just wouldn’t have enough data to do it. 

 

Dr. Abdul El-Sayed: And so on that first piece, when you think about tailored models built around uh prediction from data, you can imagine a world where that’s going to make public health research a lot better. Right, I’m an epidemiologist by training, and when I think about how we engage with our modeling, we’re so limited by the data we can identify. 

 

Ian Bogost: Sure. 

 

Dr. Abdul El-Sayed: And I wrote a whole dissertation about using uh predictive modeling to be able to run trials in a fake space. An Internet you created, right? Or a confined space you created. Now, with the computing capacity that we have today, you can imagine doing that on a much more targeted scale. And that seems to me to be somewhat positive. But then there are risks. I want you to think a little bit about the degree to which we are able to use data, or infer from data, and make big public policy decisions. How do you think–

 

Ian Bogost: Yeah. 

 

Dr. Abdul El-Sayed: We ought to be thinking? Because we missed the boat, right? 

 

Ian Bogost: Right. 

 

Dr. Abdul El-Sayed: When you talked about regulation before. The hope is we don’t miss the boat again. Where do you feel like we ought to go? 

 

Ian Bogost: So this is going to be very confusing, I think, for folks who work in and around medicine in particular, because with what I’ll call medical data, forgive me if that’s not the right way to put it. Um.

 

Dr. Abdul El-Sayed: Public health data. 

 

Ian Bogost: Public health data, clinical data, research data, whatever it is y’all do. 

 

Dr. Abdul El-Sayed: Right. [laughing]

 

Ian Bogost: When you have that–

 

Dr. Abdul El-Sayed: That’s the right term. 

 

Ian Bogost: When you have that kind of information, uh it may be more reliable to make predictions based on it, whether it’s used to generate new kinds of information, new kinds of treatments, new kinds of pharmaceuticals, new kinds of therapies, whatever it might be. I really do believe that in medicine we will see, and we are already seeing, enormous advances that are driven by AI as a partner in the research process. So if you hold that in one hand, then in the other you say, oh, but also all of the communication, all the information, all the other stuff that we do that’s not purely clinical, uh or that’s not related to interventions from that pile of data that I didn’t have a name for, that’s all up for grabs in a much worse way. So it’s, like, kind of the worst of both worlds, where you can trick yourself into thinking the AI stuff is good, it’s going to really improve outcomes. But then, oh, hold up, we’re going to shovel that, you know, back through this door, out into the Internet, uh where it can be, you know… I could probably generate smart-sounding, accurate-sounding, not accurate, but professional-sounding public health commentary, like, right now on my phone if I had it with me. I bet I could do it. And someone, maybe even someone in this room, wouldn’t be able to tell the difference at first blush, whether it came from someone who knew what they were talking about or someone who was just asking for something plausible. 

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: So that’s where things really start to go awry. How are we going to hold those two ideas in our heads at the same time? 

 

Dr. Abdul El-Sayed: So it’s like, at baseline, the overhead need for expertise goes away, except when it really comes down to policymaking, it actually makes expertise so much more important, because the cost of generating seemingly plausible-sounding information is almost zero. 

 

Ian Bogost: It’s basically nil. 

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: Yeah. 

 

Dr. Abdul El-Sayed: Um. 

 

Ian Bogost: I mean, and if you have not tried these systems, you know, ChatGPT and so on, you need to try them. There are free versions of most of them, and they’re not that expensive to subscribe to, even if it’s just for a little while. I really encourage everyone alive online today: whatever it is that you do, try it out in these systems and see. It’s pretty good. It’s pretty good at generating something where you think, huh? Yeah, that sounds like someone who knows what they’re talking about. 

 

Dr. Abdul El-Sayed: Until it tells a bafflingly obvious lie. [laughing]

 

Ian Bogost: Sometimes, though, it’s hard to even see the lies because–

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: You’re like, wait, hold on, I really need to think. I need to, like, hold my head to see the lie in this. I really have to, like, find it and look for it. And I think we also knew that that was going to happen, didn’t we? With health information, with other information. Like, think about people Googling, oh, like, something’s weird about my body, I’m going to put it in Google, and then, like, WebMD comes up or something, and then you’re like, I don’t know, like, I have cancer. 

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: Um. Right. And everyone on the planet has had this experience now. So we knew it was already happening, but now it’s going to happen in a much worse way. It’s almost like, you know, we’re going to have Thanksgiving soon. There’s always that, like, overly smart uncle, someone who knows much less than they purport to but can pull it off. You’ll be like, oh, wow, I didn’t know that. But it’s like, no, they’re just making it up. That’s kind of what AI is like. 

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: Like that that uncle. 

 

Dr. Abdul El-Sayed: And so the notion that more and more of the Internet is going to be populated by your obnoxiously confident uncle. 

 

Ian Bogost: Yeah. It’s not a bad shorthand for it. That’s kind of what we’re seeing. 

 

Dr. Abdul El-Sayed: So thinking through that, um how should we, like… you know, getting to where we see this going, the worry that I have, right, is a bad actor, a mis- or a disinformer, specifically a disinformer who really wants to disinform at scale. These tools are profoundly good for that. 

 

Ian Bogost: Do you see why I want to upend the scale? Right. I just don’t see any other answer. The tools are out there. I mean, it’s over, uh and they’re only getting better. And they’re getting better much faster than you think. And it costs enormous amounts of money to make your own models. Or there are some open source models that kind of anyone can train and use that are starting to pop up. That’s why I think that solving the scale problem is the only way. 

 

Dr. Abdul El-Sayed: So really, the answer from your mind is we have to just limit the exposure. 

 

Ian Bogost: That’s it. 

 

Dr. Abdul El-Sayed: The idea is that somehow we are going to stop the content from being created. 

 

Ian Bogost: That’s not going to happen. 

 

Dr. Abdul El-Sayed: Or that we’re going to limit people from getting to the content. Those two are the limits, and they’re–

 

Ian Bogost: It is very difficult. 

 

Dr. Abdul El-Sayed: They’re too difficult to pull off. 

 

Ian Bogost: Yeah. 

 

Dr. Abdul El-Sayed: So the answer here really just has to be about recognizing that there’s going to be disinformation, but any one unit of disinformation, in effect, has to be throttled. 

 

Ian Bogost: Yeah. The less exposure any single unit of disinformation gets, the better. Right. But this is going to take a long time. Look, it took us about half a generation to get where we are with Internet information. And I don’t see any reason to believe it would take any less time to downscale again. 

 

Dr. Abdul El-Sayed: Here’s my worry, [clears throat] is that we do that. Let’s say that that regime falls into place. I worry that, you know, in a world where you know that at any given time you can be exposed to mis- and disinformation that’s artificially generated, it really shakes the underpinnings of trust. Because here’s the thing about it, right? We are in a competitive information environment, and I think that’s what we missed during COVID. We did not realize that it wasn’t just us unilaterally talking to the world and saying, here’s the information you need to know; we were in a competitive media environment, and whether it was our tactics or our strategy or just our approach to competing, we had to make information that was compelling, true, and trustworthy. My worry is that doing that in this world, where, A, by definition, you’re limiting the reach of even the truth, or you’re in this position where some regulator has to decide what truth gets the reach–

 

Ian Bogost: Yeah. 

 

Dr. Abdul El-Sayed: –is that it makes communicating truth in this post-AI world even harder. I wonder how you think about that. 

 

Ian Bogost: We now live in a world of multiple fictional worlds. This is the best I can offer you. It’s almost like being in an alternate timeline, or, like, a fictional universe uh where other things are not just true, but really deeply backstoried. 

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: You can… it’s like all these, like, kind of weird fandoms. 

 

Dr. Abdul El-Sayed: It’s like competing narratives. 

 

Ian Bogost: Of of right? 

 

Dr. Abdul El-Sayed: On everything. 

 

Ian Bogost: Competing narratives of everything that trace back to lore, which is what the Internet loves, you know, a kind of backstory um that can be litigated through argument. 

 

Dr. Abdul El-Sayed: Hmm. 

 

Ian Bogost: So that is the state of affairs. I don’t have an answer for what to do with it, but we can start by diagnosing that that is the state of affairs. And as for lamenting this truth versus falsehood thing, I mean, we just have to get over it. It’s like everyone lives among these kind of intersecting fictions. And what is it going to feel like to communicate, to try to find common ground about important matters, when all of us have, you know, maybe our own little bespoke bubble. 

 

Dr. Abdul El-Sayed: Hmm. 

 

Ian Bogost: Of different narratives that have been created especially for us, or maybe not especially for us, that now we believe. 

 

Dr. Abdul El-Sayed: Yeah. I mean, that is a really sad and uh disheartening [laugh] space within which to sort of transition the conversation. But I do want to take that note seriously, which is to say, if we live in a world that is increasingly going to be about a set of competing narratives, then we have to make the narrative that is backed by some sort of objective evidence that much more compelling. And that’s the thing I worry a bit about: we want to live in a world, as public health communicators and professionals, where the evidence speaks for itself. And the evidence is never going to, it never did, speak for itself. We always had to speak for it. And increasingly now, the evidence is going to need a lot of champions. And the hard part about this is that the information environment, the context, if you will, in which we are going to be communicating about really important information, information that can be leveraged to literally save people’s lives, is becoming harder and harder to communicate into, and easier and easier to manipulate by potentially bad actors. And this is the thing that makes it even more scary to me: for a lot of us, we will be the generation that remembers the before AI, just like we are the generation that remembers the before social media and the infor– the generation that remembers the before Internet. And I worry that if we do not take seriously this competing narrative environment, we will assume that holding up the data is going to be enough. And I worry that we are going to miss our opportunity to really intervene. The other part of this is we’ve got to be invested in the regulatory framework about how information gets moved. And I think you’re right, right? That it’s really tough to police content. Nor is it, frankly, even possible in a world where you can generate as much content as you want for free. But we’ve got to be involved in this conversation. And I think we may be missing the boat, because we still operate in a world where the Internet is something you can toggle out of, you can opt out of, instead of a world that you are fundamentally, by definition, opted into. And the choice that we make is how we opt into it. I would love to take a couple of questions if folks have them. Um. We’ve got two mics on either end. If folks do have questions uh for Ian or me, we would love to take them. I can’t take hands, but I can take questions from the mic. 

 

[Audience Question number 1] All right. My question is around responsible AI, and particularly how we achieve some level of equity and balance when we know that data is often subject to the people who have the power and the sort of levers in generating and thinking about these new systems. How are you thinking about responsibility for equity and bias? 

 

Ian Bogost: Mm hmm. So I hate this phrase, responsible AI, or ethical AI, because I think what it is, is a way to say, I want to have the AI part and I just kind of want to magic away the problems. Like, well, our AI is responsible. Right. So it’s okay. It’s okay now. And you know, the point that uh the data that’s collected, where it’s collected from, who benefits from its collection, but also who is represented and how, all of these are extremely real problems. Um. But also, all of that data has already been, it’s already been slurped up. It’s already in use. We don’t even know what much of it is and how it works. Most of these systems are uh opaque and diff– that’s another concept that comes up sometimes, explainable AI, where you can ask questions about how it’s drawing conclusions and what it’s doing. So we’re in for a world of hurt on this front. We should expect it to get worse before it gets better. But I just kind of encourage you, when you hear those phrases start to come up, it’s not that they’re entirely bad, but what do they mean to the people who are articulating them? Right. I always kind of feel like my wallet’s getting stolen–

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: –when I hear this [mumbling indistinct] huh? 

 

Dr. Abdul El-Sayed: And this is to your point. This is the thing about it: I worry that a lot of the mistakes we made around failing to govern Internet 2.0 came from being too trusting of the people who were building that Internet.

 

Ian Bogost: Yup. 

 

Dr. Abdul El-Sayed: And I worry that we are too willing, because it's hard to understand what's happening, to trust the people building 3.0, whose incentives are entirely around market dominance of a sector that's going to be huge. And so the corners that they're willing to cut to win that race I think are really important. And I also worry that, you know, AI is, if anything, really honest. Humans are not very honest, but AI is honest. It's an honest arbiter of what it has slurped up. And what it has slurped up is all the biased garbage that we've generated on the Internet. And so when AI, right, generates biased stuff, it's just reflecting back our own biases. And, you know, I worry about ceding decision making, because one of the nice things about being human is that you can try to correct bias. I don't know that the machines are going to be as good at it. The last part I think we ought to be really thinking about is that it's going to be really easy, particularly at the individual level uh or the local level, to cede decision making to AI, because it's going to be so much better at pattern recognition. Let me use a medical example here. A chest x-ray will give a physician a certain amount of information, but there is no physician in the world who has seen every chest x-ray ever done. AI can train on that. So it's going to be better at seeing patterns that we can't see. And we are going to oftentimes cede decision making to that AI, whether we say we do or we don't, however the system is designed, because we're not going to second-guess the AI. And I think we need to build a lot better governance around how AI-based decisions are made, governance that empowers us to make decisions that are not just um predictive, right, but normative about the world that we actually want to build. And this is the thing I'm really worried about. I've seen, you know, so often when humans interact with machines, it's too easy to just say, well, the machine said so, right? How often do you see that? Um. Versus saying, well, actually, what do we want here? Um. And how much of this problem can the AI actually solve? Um.

 

Ian Bogost: Next question.

 

[Audience Question number 2] One of my biggest concerns is the next generation's lack of information on certain topics due to some of the regulations going on in the states. So we have children growing up and in schools now who may never be taught in a classroom about African-American history or sexual health and things of that nature. Are you at all hopeful that this proliferation of information could help with that? And lastly, go blue.

 

Dr. Abdul El-Sayed: Go blue. 

 

Ian Bogost: I mean, it's definitely helped. It's definitely helped. Like, that's one of the benefits of having a global information system: even if you don't see it uh in school or, you know, from your parents or whatever it is, through official channels, you're more likely to have access to that information um from others. But you're also likely to have access to all of the other information as well. So uh, I mean, this is a concern that is um in some ways separate from the matter of Internet life, that is to say, the regulatory boot that's coming down on what gets taught in schools or what's appropriate even in college classrooms. But it's related insofar as one of the reasons there's an incentive to drive that kind of uh legislation forward politically is that we've polarized so much more, thanks to all the misinformation that Internet life has generated.

 

Dr. Abdul El-Sayed: You know, what's interesting is, if you use the needle-in-a-haystack metaphor, it's easier to find the needle. It's just a much, much bigger haystack. There are more needles, and way more haystacks, or way, way more hay in the haystack.

 

Ian Bogost: Yeah. 

 

Dr. Abdul El-Sayed: And this is the challenge that I have. Like, when I watch TikTok or Instagram Reels, which is like TikTok for millennials, um like two weeks old. Uh. The degree to which a random young person with a camera and the Internet can find some factual information is really profound. Like, I couldn't imagine having that much information at my fingertips when I was young, and I grew up in the era of the Internet, right? It just wasn't as well sorted, or as plentiful. And on the other side, the degree to which the information has become a cacophony of just narrative competition is even worse. It's like, you know, on the Internet, when you drop a tweet or whatever it is, you think you're communicating to the world. You're not. You're communicating to a group of people, right, that is tailored to be interested in what it is you've communicated. Now sure, can other people find it? Fine. But what you tend to have is people who agree with you and people who profoundly don't. Right. And then everyone else is in a whole bunch of different echo chambers that, like, are talking about Michigan football. Right?

 

Ian Bogost: Yeah. But also the affirmation that whatever it was you had to say was worthwhile. It was worth saying. And it was right that you said it.

 

Dr. Abdul El-Sayed: Right. Right. And that, I think, creates a weird set of incentives around communication. Uh. Next question.

 

[Audience Question number 3] Hi. This has been really interesting for me to listen in on. I'm um with an organization, a nonprofit social enterprise, that builds technology tools for public health. Um. We have a very large repository of information that we've created over the last, like, five years, as well as hundreds of public health data sets. So one thing we are thinking about is what's next for us and whether we want to get into the AI space. So as you guys were discussing um and kind of lifting up the idea of turning down the volume, um how do we be arbiters? I'm wondering, you know, what you would advise an organization like ours that is, you know, actively building that kind of repository and looking for what's next?

 

Ian Bogost: Yeah, it's like the local farming version of information. If we go back to the food chain metaphor: when it's cheap and easy to just, you know, pump corn sweeteners into things, or soy or whatever it is, um and it's efficient and low cost, then you reap what you sow, literally. Um. And I think there's a great, huge, enormous, important market for this kind of, you know, small-scale, bespoke uh information system where we know where the data is coming from. It's curated, its provenance can be demonstrated and traced. Um. The question that no one has really answered yet is how well it works independently from other chunks of information. Because one of the powerful features of um these new generative artificial intelligence systems is that they're able to draw conclusions in such a way that we don't even really know how or why they're drawing those conclusions. Um. And so the relationship between the big tech wholesale systems and these local ones is right now being litigated. And you've seen, perhaps in the last week, OpenAI announced some new products that let you build your own. You can dump your data into it if you so choose, and they have certain caveats about what they are and aren't doing with it. And then others um are building local, um, what are called models, you know, to take data and then allow you to ask questions of it or make predictions with it, and that's happening at a lot of organizations, too. So this whole space is just now sorting itself out. I absolutely think it's worth experimenting, and very carefully experimenting, and understanding the risks, understanding what's happening with the information in particular.

 

Dr. Abdul El-Sayed: I agree with Ian entirely. I agree with you that turning down the volume will have an important effect. I also think the ability to identify mis- and disinformation in public health is worth building. I think it's going to be increasingly hard to do with the proliferation of new data, but I do think it's an interesting question of how to identify it. The other question is also, like, what public health training data are we training–

 

Ian Bogost: Yeah. 

 

Dr. Abdul El-Sayed: –the future of AI on? And I think being able to build and generate that training set, for example, could be really powerful. But it's a whole brand new world. And, you know, at some point, like 20 years on, we're gonna be sitting down and it's like, so, public health 3.0, how did we miss the boat there? But I'm glad smart people are uh thinking about it. So thank you for what you do. I think we have time for two more questions. One here and one here. Go ahead.

 

[Audience Question number 4] First. Hi, Dr. Abdul. It’s good to see you again. So I feel like I heard a little bit of conflicting information that I just wanted to clarify. Um. So leading up to that, first someone said, go blue. And I’m not a sports person, um but I do political advocacy. So it meant something completely different for me. But [laughter] what I do is in the space of public affairs and policy, and I also have an emphasis on the social determinants of health, particularly um I have worked recently on strategic plans for public health departments that look at Black maternal and child health, um which I’m sure people here know the statistic. It’s three and a half to four times that of white maternal and child health in terms of mortality. So people are dying because they don’t know about resources. Um. They don’t know how to combat racism that is interpersonal, that is systemic. And it’s one thing to state the statistic. It’s one thing to try to amplify that message. It’s a completely different thing to get the right information in front of the people before they’re part of that statistic. So just kind of going back to the first question that was asked in terms of looking at limiting information, but then also trying to drown out misinformation. What would you say, I guess, is advice for people looking to shape policy that looks at getting that information out, lowering that statistic, closing the gap of maternal and child mortality, particularly as influenced by racism, while also trying to be safe with the volume of the information that we put out and also receive? 

 

Ian Bogost: Mm hmm. 

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: Mm hmm. Well, let me take a first swing here, because this is not my field, so this is going to be a more general answer. So the opposite of scaling up is scaling down. And the whole tech economy only cares about the highest-scale, highest-leverage solutions, and isn't concerned with, and in fact almost cannot intervene in, what I'll call low-leverage opportunities, which is a terrible way to put it. Which leaves a bunch of white space in the universe where people can do the much harder, much more laborious, and much more thankless work of filling in all the gaps. And unfortunately, from my perspective as an outsider to this work, that's what I think it's going to come down to. It's going to be a lot of um boots-on-the-ground–

 

Dr. Abdul El-Sayed: Yeah. 

 

Ian Bogost: –kind of work, because no one else is going to do it. The good thing is that it won't be all mucked up with scaled-up bad information. The bad thing is that there's no scale to it. It's not high leverage. And so it's hard to do, and expensive. You've got to put people in the right places. And I'm sure there are all sorts of methods that you all know for doing it.

 

Dr. Abdul El-Sayed: Well, I'm actually gonna pick up exactly where you um left off there. First, I really want to thank you for your work. So much of the challenge that we face, and this is an area I work a lot on in my day job in Wayne County, is that as the information ecosystem gets louder, more cacophonous, and more untrusted, people are going to look to older institutions, brick-and-mortar institutions, actual people they can see face to face. And one of the challenges we have is that in the rush to move everything digital, we've actually lost a lot of that. And I think building more of that infrastructure is absolutely critical. So one of the things that we're doing in Wayne County is trying to build brick-and-mortar birthing centers. Right? Birthing is a truly physical thing, right? You can't do it digitally. And so we want a place, an institution, where you can go, talk to the same person, and have a conversation. And I think that's going to be increasingly important as the Internet becomes a less navigable place with fewer and fewer trusted voices that you can engage with. The second thing I'll say is that public policy, especially at the federal level, that addresses the structural inequities implicit in our system is fundamental. Now, a lot of you know, if you ever listen to the podcast, that I believe deeply in Medicare for All. I literally wrote a book about it. And part of the reason I do is because if you are a Black parent in America, chances are, just running the numbers, your experience has been one of being on the wrong side of a health care system that assents to a notion that you can be a second-class health care citizen. Medicaid is a really, really great safety net program. But the problem is, if you've had Medicaid in your life, you know that no provider is excited to see you. And it codes for all kinds of differences in the way that you're treated. And so if that's your experience, and you walk into a clinic, and the expectation, because of the color of your skin, because of the nature of your insurance, is about a certain circumstance in which you became pregnant, about whether or not you have a stable partnership to bring up an infant, all of those things change the way in which you're interacted with. And those things end up pushing you into, right, the information space, because you can't trust the brick-and-mortar institution in front of you. So I think there are really two big parts here. We need to demand federal policy where we stop paying lip service to equity while assenting to a system where people can be rendered second class because they're uninsurable, right, because we think that you have to be employed to get real insurance. And second, we've got to build trustworthy, public, local institutions where people can come and build relationships of care, and where they know: whatever I'm hearing over there, I can trust the people here. And what I worry about is that over the last several decades, we've dismantled the progress toward both of those things. We're not doing local services well enough. Institutions have a tremendous amount of flux because they've been fundamentally disinvested in. And then at the federal level, right, the effort to make sure that no matter who you are in this country, you are an equal health care citizen. Right. That your insurance should buy you access to the best care available. Right.
And that we are training our providers not to discriminate against people because of the color of their skin. Doing those two things together, I think, is going to be critical. And I worry that one of the things we've often done is that we've been quick to offload, right, a lot of our responsibility onto these sorts of digital services, when in the long term they're going to be more and more untrustworthy, particularly for folks who have been marginalized and discriminated against in our institutions for too long. And so we've got to hurry up and catch up, because I worry that the Internet of the future is going to be a more polluted, less trustworthy, less engaging place, and people are going to need somewhere to turn if we don't build those institutions in the real world. I'm really worried about where we're headed, and I appreciate you being a part of the solution. [applause]

 

[Audience Question number 5] Hello. So sorry. I am definitely one of the few people that did not raise my hand when you asked if people remembered life before the internet. Um.

 

Dr. Abdul El-Sayed: Don’t be sorry about that. [laughing]

 

[Audience Question number 5] But I guess, as someone who is pretty green in public health as an MPH student, um you spoke a lot about how there's finger wagging, and how public health personnel, whatever field you are in within public health, kind of face that pushback. Um. And it just kind of relates to your episode about raising up public health. Since we've talked about all of the negatives of social media and AI, how can public health personnel step up and create content that's not just engaging and truthful, but that really grabs attention?

 

Dr. Abdul El-Sayed: So thank you for your question. That is a fantastic question, and I think you're onto something really important. One of the things about your generation of folks, you know, I like to claim that I'm a young person, I can't credibly do that anymore, but one of the things you understand is that you've come up in an information ecosystem where you don't just have to present facts, you have to make content, right? And anybody who's been on social media knows that good content is good content. It has to grab your attention, it has to engage you. It has to hit those little buttons that you look for when you're scrolling on TikTok, or in my case, on Reels, right? And that means that we have to be a lot more compelling about explaining ourselves quickly in the medium that is increasingly going to be the medium of the future. Right. And so I do think that we as a community need to take seriously the responsibility to learn how to communicate in the 21st century. And the problem is we haven't. Right? A lot of public health schools have, you know, health communications programs that are really a lot more about interpreting, right, esoteric figures and graphs than they are about how to make compelling content. And I think if you're going to graduate with an MPH or a Ph.D. in public health, you really need to be able to talk about this stuff in a way that you can push forward in one minute and that captures everything. And one last note, and I just want to say this about our own internal curriculum. Part of the challenge we have as a public health community is that we've become academicized. What do I mean by that? We publish for ourselves more than we care about communicating to people, right, who are not part of our group. So we use big language, right? There's like a fetish for big words for small ideas rather than small words for big ideas. And the problem with becoming overly academic is we've become navel-gazey, right? We're really interested in what other public health people think. And half the time we're stunting for other public health people rather than actually trying to communicate to the public, which, by the way, is the first word of public health. Right. So one of the things we're going to need to be able to do is understand that in an academic environment, comprehensiveness is more important than concision. In any other form of communication, being concise is more important than being comprehensive. Now, how many times have I engaged in a public discussion on public health where somebody else in public health was like, well, you missed this really important detail and you're misinforming the public? I'm like, nah, man, you don't understand. I can sit here and give a whole lecture for two and a half hours. You know how many people are going to watch? [makes fart sound] [laughter] So, no, we have to be able to communicate in the system of communication that exists. We also need to be thinking about how we want to tailor the system of communication so that it allows us to communicate what it is that we want to do. And we have to be about doing both, right? So I really, really appreciate your question. And, you know, any of the school of public health administrators out there, right, like, TikTok for Public Health, I would love to see that course somewhere. Uh huh, all right. I'll leave that there.
Ian do you have anything to [?]–

 

Ian Bogost: Oh, I just want to add a couple of things as an outsider to this community. Um. The first of which is that um all of us live in the present. We live in the present, in the now. We don't live in the past. We have to play the hand that was dealt us. And sometimes it's a good hand and sometimes it isn't. And I hope the picture we've tried to portray of the Internet isn't that it's just a bad hand. It's a very troubled one. Um. But it is where we live now; there's just no denying that. And so we have to work within those constraints, and yes, to change them, but not always to focus only on changing them. It's kind of to face the reality that's in front of us, which I think is germane to what all of you do, as I understand it. And the second thing I want to say is that, I mean, maybe this is obvious, but I don't get this audience all the time. Um. We are counting on you, you know, to get this right. This is important. It's not just, oh, well, it would be nice if we had a TikTok, right? It's that, this is the whole concept, right? Unless there's collective action, individual action doesn't really matter. So it's important. Um. It's important to get it right. Where getting it right just means, like, earnestly and deeply engaging with the methods of interchange that people actually use and how they actually use them, rather than treating them as externalities, something that you could kind of steer around or um avoid or ignore.

 

Dr. Abdul El-Sayed: Yeah, thank you, Ian. And so um I really, really appreciate you joining us today. Um. If we could have one more round of applause for Ian Bogost. Hey, [applause] for your first public health conference, man, you killed it.

 

Ian Bogost: All right. 

 

Dr. Abdul El-Sayed: Thank you. And um that's all we got today. I really, really appreciate you all coming. The show is America Dissected. I hope that you will subscribe. Share with your friends. Leave us a nice review. And if you really love us, we do have merch at our merch store. You can check it out. And with that, thank you all so much. [applause] [music break]

 

Dr. Abdul El-Sayed, narrating: America Dissected is a product of Crooked Media. Our producer is Austin Fisher. Our associate producers are Tara Terpstra and Emma Illick-Frank. Vasilis Fotopoulos mixes and masters the show. Production Support from Ari Schwartz. Our theme song is by Taka Yasuzawa and Alex Sugiura. Our executive producers are Leo Duran, Sarah Geismer, Michael Martinez, and me, Dr. Abdul El-Sayed. Your host. [music break] This show is for general information and entertainment purposes only. It’s not intended to provide specific health care or medical advice and should not be construed as providing health care or medical advice. Please consult your physician with any questions related to your own health. The views expressed in this podcast reflect those of the host and his guests and do not necessarily represent the views and opinions of Wayne County, Michigan, or its Department of Health, Human and Veterans Services.