March 20, 2020

Instagram vs. Reality: How Social Media Handles Pandemic

We hold these WFH truths to be self-evident: that we’ll always wear pajama pants no matter how many times we promise to get dressed, and that we’re spending way more time than usual scrolling through social media.

About that second one...social platforms like Facebook, Twitter, and Instagram have become lifelines for human contact and important headlines in the era of #socialdistancing, putting pressure on Big Tech to clamp down on falsehoods and potentially harmful viral cure-alls. 

But can Zuck & Co. handle it? Are they stepping up to the plate to root out dangerous misinformation? Can we trust anything we see on social media anymore?

In this special episode of Morning Brew’s Business Casual podcast, Bloomberg social media reporter Sarah Frier explains how tech c-suites are navigating what, for many, has become one of the most uncertain periods of their existence...and how those business decisions impact society.

Listen now and let us know what you think.

 

 


Transcript

Note: Business Casual transcripts are generated using speech recognition software and human transcription. They may contain errors, although we do our best to avoid them. Please check the corresponding audio before quoting a transcript in print. Questions? Errors found in a transcript? Email businesscasual@morningbrew.com 

 

[00:00:01] [sound of coffee being poured]

 

[00:00:04] [intro music plays]

 

Kinsey Grant, Morning Brew business editor and podcast host [00:00:08] Hey, everybody, and welcome to Business Casual, Morning Brew’s podcast, answering the biggest questions in business. I'm your host and Brew business editor, Kinsey Grant. And now, let's get into it. 

 

Kinsey [00:00:19] You might have noticed that this episode isn't coming out on our usual Tuesday morning release schedule, but coronavirus’ impact on business and the global economy doesn't care that today isn't Tuesday. So we're again doubling up for the week to bring you the most relevant answers to the most relevant questions in business. And until we better understand what's going on in this uncertain time of pandemic and economic whipsawing, expect more of the same. If you've got more questions about coronavirus’ impact on business or the economy and people in general, let me know. Now, as for today, we're talking about one of the thorniest aspects of any story that goes viral, in both meanings of that word. Social media—it can help us, it can hurt us. It can be our only tether to the rest of the world when we're all practicing social distancing and prudent, self-imposed isolation. But no matter what, the business implications are wide ranging. 

 

Kinsey [00:01:12] I probably don't need to explain to you how deeply social media has become entrenched in our lives. Let's instead talk about how it's changing the first version of today's history in a world ravaged both physically and economically by COVID-19. And what is big tech doing? Where is it falling short? How will it change the way we're interacting with social media in the future? To talk about all of that, I am excited to remotely welcome Sarah Frier to Business Casual. Sarah, thank you so much for joining me.

 

Sarah Frier [00:01:41] Thank you for having me. 

 

Kinsey [00:01:42] I'm really excited to talk to you. You've got a really interesting career. You cover social media at Bloomberg, and you just wrote a book that's coming out in April called “No Filter: The Inside Story of Instagram,” which we are excited to read as well. And you've kind of made yourself an expert voice when it comes to companies like Facebook, Instagram, Snapchat, Twitter, and the big business that those companies are doing, which has, I'm sure, been a very interesting beat to cover for the past several years. 

 

Sarah [00:02:08] My job has changed every year. It's become less like covering a company and more like covering a government. [Kinsey laughs] It really has. I mean, these companies now are in charge of our information flow, in charge of how we undergo elections, in charge of so many aspects of our lives that we never could have anticipated a decade ago. 

 

Kinsey [00:02:32] Right. And I think that that is the hallmark of this entire conversation—is that everything is changing at a pretty quick clip. But we're going to do our best to figure out what's going on today. Like I mentioned before, we are remote. You are on the West Coast. I'm on the East Coast. You guys are kind of in a bit of a lockdown situation, I believe, in the Bay Area, where you are. I don't know if that is what's in store for us here on the East Coast, but we'll do our best. So let's dive right in, shall we? 

 

Sarah [00:02:59] Absolutely. 

 

Kinsey [00:03:00] OK. Cool. So I want to talk shortly about what the specific companies—specific social media and big tech companies—are doing in light of everything going on with coronavirus. But first, let's kind of start broadly. So if you had to say it in, say, one sentence: What should the role of social media be in a pandemic situation like what we're experiencing today—in a perfect world? 

 

Sarah [00:03:25] In a perfect world, social media, first and foremost, is a place to keep in touch with all of the people who are going through the same thing you are going through, which is this extreme isolation, perhaps even medical issues that they're having. Your friends and family are probably all feeling just as lonely and scared as you are. 

 

Sarah [00:03:47] And Facebook has really seen a dramatic surge. Zuckerberg said this Thursday that the site is now seeing spikes of activity, surges in activity, that are higher than they experience at the stroke of midnight on New Year's. So that is how much people are relying on social media right now. But that also means that social media has become a definitive source of information for people. And, in an ideal world, it would be the best, verified information to help you stay safe and take the right steps you need to take, whether it's preparing to hunker down in your apartment or getting tested for the virus should you be unlucky enough to contract it. And I think that is where we're starting to see some cracks—not just because of Facebook, but because of this global panic—everyone sharing information from unverified sources, taking advantage of scared people, and also just piling on and spreading that which scares us. 

 

Kinsey [00:05:05] Right. So if we're saying, you know, Instagram versus reality. Here the Instagram version is—this is our outlet to talk to other people, to share experiences together, to build internet communities. But the reality version is we're also experiencing an overflow of information, not all of which is true and accurate. 

 

Sarah [00:05:24] Even when people want to be helpful, they're sharing rumors of military lockdown in New York City. Or that you can cure the coronavirus by drinking bleach. 

 

Sarah [00:05:36] Do not do that, by the way. [laughter] These are the things that are spreading on social networks, especially in these more hidden communities on Instagram. On Facebook it works a little differently, because everything can go viral through the sharing capability. So you might see things bubble to the surface, and then fact-checkers, hopefully, will eventually debunk them and they'll get removed. But on Instagram, it's a lot different, because it works in hashtag groups where there is no sharing, and people may be spreading lies to each other without a lot of outside scrutiny. 

 

Kinsey [00:06:14] So let's talk about some of the scrutiny. You know, each of these outlets, it seems like, has come up with different ways that they're going to go about battling misinformation. And that's not only with what we're experiencing with coronavirus. They all have different kinds of policies in place in normal times, not just in pandemic times. But how do these social media networks and companies go about actually weeding through that information? It seems like an almost Sisyphean task. Like it's never going to be completed, and we're just going to kind of constantly live in a world in which falsehood is propagated on social media. 

 

Sarah [00:06:53] Well, a couple of things to note there. Facebook and their employees, they're working from home just like you are. And that dramatically complicates the effort to stem the spread of misinformation, because the contractors who normally come into the office to do that work are not able to work outside the specified environments that adhere to Facebook's rules on privacy. So Facebook is redirecting a lot of its content moderation work to employees. 

 

Sarah [00:07:32] And meanwhile, they're just overloaded by the scale of the task. They are finding that they need to augment quite heavily with artificial intelligence. So when you talk about who's reading this stuff, how do they take it down? It's probably an algorithm or a computer that is reading these posts to take them down.

 

Kinsey [00:07:54] Do you think we can rely on that? 

 

Sarah [00:07:56] [laughs] No. I mean, just yesterday there was a huge bug where the spam filter took down a lot of verified information. I should say Wednesday, because you might be listening to this podcast on another day. On Wednesday, there was a huge bug that took down a lot of verified information from news sources like PBS, and Facebook eventually caught it and restored a lot of those posts. 

 

Sarah [00:08:26] But there was a lot of uproar from people who were trying to spread good information about the coronavirus and how to deal with the surge in cases and how to do the best kind of social distancing. And, yes, it was just a disaster. So sometimes these algorithms can be oversensitive and sometimes they can miss things. And what Facebook is hoping is that when they get guidelines from the World Health Organization or the CDC, they can use those as the arbiters of what should and shouldn't stay up. 

 

Sarah [00:09:01] Then they can run some sort of filter to check for those items, like the drinking-bleach "cure," and find everything that adheres to that keyword and remove it en masse. Of course, that means it's going to take a while before new misinformation gets to the level where the World Health Organization or the CDC sees it. And even then, Facebook is only focused on taking down misinformation that causes imminent harm to people, like bad cures or bad advice for what to do. Saying if you drink celery root juice, you'll be cured. Or saying it's fine, you should just go out and hang out with people, because the only way the coronavirus can spread is through some bogus thing. The misinformation that they're not taking down is maybe the more outlandish theories about where the virus comes from. 
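
 

Editor's note: below is a minimal sketch, in Python, of the kind of keyword sweep Sarah describes, where a claim flagged by an authority like the WHO is matched against posts and removed in bulk. The flagged phrases, sample posts, and matching logic are invented for illustration; real platform filters are far more sophisticated.

```python
# Hypothetical phrases flagged by health authorities as dangerous claims.
FLAGGED_PHRASES = [
    "drinking bleach cures",
    "celery root juice cures",
]

def matches_flagged_claim(post_text: str) -> bool:
    """Return True if the post contains any flagged phrase."""
    text = post_text.lower()
    return any(phrase in text for phrase in FLAGGED_PHRASES)

posts = [
    "Stay home and wash your hands!",
    "I heard drinking bleach cures the virus",
]

# Sweep all posts at once, removing every match "en masse."
removed = [p for p in posts if matches_flagged_claim(p)]
kept = [p for p in posts if not matches_flagged_claim(p)]
print("removed:", removed)  # the bleach post
print("kept:", kept)        # the handwashing post
```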

 

Sarah [00:10:02] There were a couple celebrities last week on Instagram that were sharing rumors that Bill Gates was behind the invention of the virus. [Kinsey laughs] And that's the kind of thing that Instagram is just going to turn a blind eye to and say, you know what? That's not the top priority right now. The top priority is stuff that causes harm. 

 

Sarah [00:10:23] And the other top priority, Zuckerberg was saying, is people posting about self-harm and depression. And he said he was very worried that as we isolate ourselves from our loved ones, and as we work alone without our coworkers, especially if you are in a position where it's difficult for you to get groceries, or it's difficult for you to get help, or you don't know a lot of people who can help you—you could see a surge in instances of depression and self-harm. And so Facebook is surging its response to that kind of content and trying to take down posts, because when there is a post that discusses self-harm, it tends to inspire others to think the same way. 

 

Kinsey [00:11:09] Right. And you bring up an interesting point—that it kind of has to be what floats to the top. What is the most important for people's well-being tomorrow? Not necessarily, you know, celebrities making some sort of outlandish comment. It's interesting: there's a word I hadn't heard of until I started preparing to talk to you today. The World Health Organization is calling this a massive infodemic, which I think really paints an accurate picture here. An infodemic is this overabundance of information. Some of it's true, some of it's not, but it makes it harder to parse through all that information to find what's true, find what's reliable, and find the news that's accurate. And, you know, there is just so much that it's almost impossible to get through it. You just have to prioritize what's important today and what can wait until maybe there is more time to go through even more information that's floating around everywhere right now. 

 

Sarah [00:12:05] It's hard to imagine that just a few weeks ago, a top concern of many reporters like me in the US was what Facebook was going to do about political misinformation. Facebook today is saying, you know, with all of the candidates telling lies about each other, that's nothing compared to this kind of misinformation that can cause real-world harm. And so we're just going to have to watch them prioritize. But I would say that it's also indicative of how Facebook built its business. They now have more than 2.5 billion people using Facebook products around the world. Even before this, they only had about 15,000 people moderating content and trying to keep users safe. 

 

Sarah [00:12:57] That was underinvestment in the health of the information we receive. An infodemic waiting to happen. And so there is some responsibility that Facebook should take for not having invested in this, and for sort of thinking that they could transition to AI-based moderation a little bit faster than they could. 

 

Sarah [00:13:20] And now they're going to be forced. Now we've got what we got. Like this is the world we live in. And they're going to hopefully invest a lot more resources in this. But they're catching up. And Facebook invests a lot more in, or at least understands a lot more about, how information spreads on its main network. But remember, this company also owns Instagram, which has more than a billion users. And WhatsApp, which has more than a billion users, and Messenger. 

 

Sarah [00:13:50] So there are all these other ways that people are going to be lying to each other, scamming each other. And we're only going to see them be able to deal with stuff that reaches the surface. 

 

Kinsey [00:14:03] OK. I want to talk more about the direct business implications of that underinvestment in content moderation that you bring up in just a second. But really quickly, let's take a short break to hear from our partner. — And now back to the conversation on social media in the age of coronavirus with Bloomberg’s Sarah Frier. Sarah, we were just talking about how Facebook is not just Facebook. Facebook is also Instagram and WhatsApp and Messenger. And this is a very wide-ranging group of users. It's not just the parents who are still posting on Facebook. It's everybody. And trying to moderate the content that is put out into that ecosystem is a difficult task. But I want to hear more about what you think the implications are beyond just making sure people avoid misinformation or aren’t served misinformation. And when we talk about the future for Facebook and its properties, do you think that the actions it's taking today are going to impact that future in any meaningful way? 

 

Sarah [00:15:01] Well, absolutely. I think that that's the responsibility here. They're going to learn from it, just like they responded to the misinformation that spread in the 2016 presidential election. That was a real learning experience, where they realized that they had a major blind spot in terms of how foreign governments were exploiting the rifts in our society and trying to dig those divides even deeper. And so, yeah, this is going to be an instance of finding a lot more of those blind spots. We're all participating in a constant experiment by Facebook about how best to serve us, and that's how it's always been. This is a company that is constantly testing whether things work, whether we are responding well to whatever they do. And they're going to change their strategies. They're going to adopt new strategies, new products. And we'll have to be along for the ride. 

 

Kinsey [00:16:05] The idea of being along for the ride as they kind of figure this out is an interesting point. I mean, we willingly participate in this constantly in social media in general, but especially in Facebook’s platforms. And we frankly have no idea what's coming for us or what's next. [Kinsey and Sarah chuckle] But I'll just kind of go along for the ride anyway. 

 

Sarah [00:16:24] Absolutely. This is how I think about it, especially when we talk about privacy and Facebook. When we signed up for Facebook, they didn't have these abilities. At least, when I signed up for Facebook, they didn't have the ability to figure out what was in a photo we posted or what was in a video we posted, or to parse what we were typing to the extent that they do today. And so Facebook has really upped the amount of information that they can gather on us, while we're still operating under the agreement we made to use their products years ago. 

 

Kinsey [00:16:59] Right. I want to hear your perspective on some of the other big social media companies. I mean, we saw earlier in the lifetime of social media's role in this pandemic that Google, Microsoft, Facebook, Twitter, Reddit, YouTube, and LinkedIn all released a joint statement basically asking for help with fraud and misinformation during this pandemic. When we see statements like that, you know, "we're doing our best, we're taking the actions that we deem necessary to stymie the spread of misinformation," how much can we take that at face value? What does it actually mean beyond getting a write up in TechCrunch? 

 

Sarah [00:17:37] Well, the companies need to collaborate on this because, basically, when you see something spread on Twitter, chances are it's also spreading on Facebook, it's also spreading on YouTube. And so what they can do is they can take what's called a hash. This is maybe getting too technical, but if you have an image that you know is part of a dangerous meme, you can share a hash of that image with the other companies so that their AI systems can automatically remove it. It's already been vetted by somebody. And that's a lot easier than going through the whole discovery process we talked about, with it passing through the reports and having individual content reviewed one by one. So the collaboration between the companies is likely to make things a lot more efficient, and it's something that they're used to doing, because that's the same way they deal with child pornography, for example. Instead of actually reviewing each of the images, because that would be incredibly scarring and troublesome, they are sharing just a line of code that corresponds with them so that they can all take them down. 
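
 

Editor's note: here is a simplified sketch of the hash-sharing Sarah describes. Real systems use perceptual hashes (for example, Microsoft's PhotoDNA or Facebook's open-source PDQ) that survive re-encoding and cropping; the plain SHA-256 used below, chosen for brevity, only catches byte-identical copies, so treat this as illustrative rather than how the industry databases actually work.

```python
import hashlib

# Shared industry blocklist: hashes of images already vetted as harmful.
# Companies exchange these short strings instead of the images themselves.
shared_blocklist = set()

def image_hash(image_bytes: bytes) -> str:
    """Hash the raw image bytes (a stand-in for a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def share_harmful_image(image_bytes: bytes) -> None:
    """One company vets an image and contributes its hash to the list."""
    shared_blocklist.add(image_hash(image_bytes))

def should_remove(image_bytes: bytes) -> bool:
    """Any participating platform checks new uploads against the list."""
    return image_hash(image_bytes) in shared_blocklist

# Company A vets a dangerous meme and shares its hash...
meme = b"raw bytes of a vetted dangerous meme"
share_harmful_image(meme)

# ...and Company B can now remove the identical upload automatically.
print(should_remove(meme))  # True
```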

 

Kinsey [00:18:46] So once one company finds something troubling, they share with the rest that they've joined forces with and then everybody can get rid of it at the same time. 

 

Sarah [00:18:54] Uh-huh.

 

Kinsey [00:18:55] OK, but I'm curious how these measures are enforced. I mean, say I go on Facebook right now and write the most ridiculous post you could imagine about coronavirus. What's the process like to get that taken off? How are all of these measures enforced? 

 

Sarah [00:19:12] Well, it probably won't be, to be honest. Basically, what would have to happen is your followers would have to rise up against you; your friends and family that love you so much would have to go to the right-hand corner of your post, click on the drop-down menu to report it, go through the options, and say this is inappropriate or this is spam or this is harmful. And they would tell that to Facebook. And if enough people did that on your post, Facebook would automatically start deranking it in the algorithm, so that it would be seen by fewer people. But Facebook doesn't actually remove misinformation unless it's going to cause harm pretty imminently, like the drinking bleach example we talked about. So if the content moderators determine that it reached that level, then they will remove it. But for the most part, it's likely to just be shadow-banned a little bit. 
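
 

Editor's note: a hypothetical sketch of the report-then-derank flow Sarah outlines. The report threshold and deranking factor below are invented for illustration; Facebook has not published such numbers.

```python
from dataclasses import dataclass

REPORT_THRESHOLD = 10  # assumed for illustration, not a real Facebook number
DERANK_FACTOR = 0.1    # show the post to roughly 10% of its usual audience

@dataclass
class Post:
    text: str
    base_score: float = 1.0             # normal distribution score in the feed
    reports: int = 0                    # user reports accumulated so far
    causes_imminent_harm: bool = False  # set by human content moderators

def distribution_score(post: Post) -> float:
    """Remove imminently harmful posts; quietly derank heavily reported ones."""
    if post.causes_imminent_harm:
        return 0.0  # removed outright, e.g., a dangerous fake "cure"
    if post.reports >= REPORT_THRESHOLD:
        return post.base_score * DERANK_FACTOR  # shadow-deranked, not removed
    return post.base_score

rumor = Post("An outlandish coronavirus rumor", reports=12)
print(distribution_score(rumor))  # 0.1: still up, but seen by far fewer people
```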

 

Kinsey [00:20:12] OK. Now what about these big companies working with government organizations or working with the World Health Organization? Do you think that there can be a healthy relationship between social media platforms and government entities? Or do they just have such misaligned interests that they can't actually meaningfully work together on something like stopping the spread of misinformation? 

 

Sarah [00:20:37] Well, it was kind of interesting the other day to see Trump mention in a press conference that he had signed on Google to build this national website for virus testing. Google was like, what, we were? OK.

 

Kinsey [00:20:50] New to us! [laughter]

 

Sarah [00:20:53] We'll do that, then. Sure. So I think that a few things are happening. The government thinks that it can get a lot more cooperation from the companies than the companies may be able to provide. And there are a lot of rumors going around about what that help might look like. Facebook, for example, today had to tamp down some rumors that it's going to share everyone's location with the government to make sure that we are all social distancing. Zuckerberg said that they have gotten no such requests, and they would not do that with the government unless they had an opportunity to let users opt into it first. 

 

Sarah [00:21:30] So, there's a lot of expectation that these companies will step up to help the government in whatever way they need. And the companies are saying, listen, our role should be putting high-quality information in front of people, so help us do that: giving them the right information on where to go, what to do, and how to wash your hands, all those things. And so that's what Facebook announced this week. They want to start putting a coronavirus misinformation—oh, sorry—a positive coronavirus information hub at the top of your newsfeed. And what it's going to have is advice from the World Health Organization, the CDC, and well-credentialed entities like that, as well as some participation from celebrities and academics who can amplify that and create context around it. So if you're not going to listen to the World Health Organization, maybe you'll listen to Beyonce. I don't know if Beyonce is working with Facebook, but it could happen. 

 

Kinsey [00:22:36] And who wouldn't listen to Beyonce? [laughs] 

 

Sarah [00:22:37] Who wouldn’t. So I think that actually is a good strategy, because one way to deal with a surge in predatory misinformation is to provide people with the right information. And that's certainly something that Facebook has been hesitant to do around election misinformation, because they don't want to be the arbiter of truth. They don't want to be the ones that say, OK, this candidate is telling lies and this candidate is not. But with this, it's easier, because they have these other organizations that are saying what is factual, and so they can lean on them and build around it. 

 

Kinsey [00:23:22] OK. I want to talk more big picture about the strategies in just a second. But quickly, let's take a short break to hear from our sponsor. — And now back to the conversation on social media's role in the coronavirus pandemic with Sarah Frier. Sarah, we were just talking about some of the more specific strategies companies are taking to stop the spread of misinformation around the disease. I'm curious, as someone who has been covering this space for some time now, do you think that these companies, Facebook and Twitter and Google, etc., are doing enough right now? Is there more they could be doing or more they should be doing? 

 

Sarah [00:23:58] Well, I think there is absolutely more they could be doing. I think, like I said earlier in our conversation, they have really underinvested in content moderation. But I think it's also not just a matter of content moderation. It's a matter of focusing so much on growth. The internal ethos at Facebook is: the more users, the better. The more users, the more time people spend on our service, the better off we are. And if people are using our service, that must indicate that they're getting value out of it. If somebody is spending 50 minutes a day on Facebook this year as opposed to 45 last year, that means we've made our service better. Well, that's not actually true if people are logging on to stoke fear, get obsessed with things, and maybe get inspired to do something that harms themselves. So I think that the biggest thing that Facebook could do to address misinformation is address the fact that its algorithm amplifies that which causes us to react emotionally. 

 

Kinsey [00:25:15] That's an interesting point, that if you spend more minutes, it’s providing you more value—just simply not true. [laughs]

 

Sarah [00:25:23] Yeah. It's sort of a falsity. And Facebook, fundamentally, as part of its algorithm, shows us things that we're likely to engage with, because it's trying to up those numbers and get us to spend more minutes. They're going to show us things that scare us and things that make us react. And that's just the way it has tended to work for years. If somebody posts something surprising, you're probably going to add an angry face and comment on it and share it with your relatives. And sometimes what you're reacting to or sharing is simply not true. But because so many people are interacting with it, Facebook thinks that it's really important information that you should know, just algorithmically. 
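
 

Editor's note: a toy illustration of the dynamic Sarah describes. A feed ranked purely on predicted engagement surfaces the most alarming post first, regardless of accuracy. All posts and scores below are invented.

```python
# Two hypothetical posts with made-up engagement predictions.
posts = [
    {"text": "CDC handwashing guidance", "predicted_engagement": 0.02, "accurate": True},
    {"text": "MILITARY LOCKDOWN TONIGHT?!", "predicted_engagement": 0.30, "accurate": False},
]

# Rank purely by predicted engagement; accuracy carries no weight at all.
feed = sorted(posts, key=lambda p: p["predicted_engagement"], reverse=True)

for post in feed:
    print(post["text"])  # the false, alarming post prints first
```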

 

Sarah [00:26:06] So I think that is the fundamental problem with misinformation on social media. And it's a different challenge on Instagram, because Instagram doesn't have viral sharing, but they have all of these hashtag-based communities where people may be rallying around misinformation in a way that is less easily discovered. And then on YouTube, there's the algorithmic recommendation of, if you like this video, you might also be interested in that one. It also tends to prioritize videos that will rile you up or get you to watch for more minutes, which tends to prioritize conspiracy theorists and people who want you to be afraid. So I think that we all have to be conscious about how the social networks are built. And once we understand the mechanisms behind them, then we can make appropriate decisions for ourselves about how we want to interact, what we want to share, and how we want to participate in a more informed society. 

 

Kinsey [00:27:16] Right. It's an interesting point that the onus is almost on us, because these companies have made billions and billions of dollars using these algorithms to their advantage and using these setups to their advantage, when it definitely does not serve a real purpose for the user as much as it does for the company itself and the bottom line. So we have to be aware of these algorithms and the way that they work in order to, not necessarily play the system, but understand how the system operates. Which is a big ask, I think, for a lot of people: to understand how these tech companies are reading us on the other side of the phone. It's troubling. But also, I think, an interesting point to bring up, for sure. 

 

Sarah [00:28:00] I think another thing that we need to keep in mind is that when we're using these services, especially social media, we're using them passively. We're logging on to get served whatever people are sharing. And we're not necessarily going to say, I want to find the infection rate for coronavirus. You will just see somebody sharing that, and then suddenly that's the information in your mind. And so we can also be a little bit more active about our pursuit of information and getting it from verifiable sources, and not allow ourselves to just go through that constant scroll and absorb whatever may come out. 

 

Kinsey [00:28:40] That's right. I'm curious about your thoughts on how much we've relied on the internet and on social media in recent weeks, as so many people, and more and more, frankly, every day, are being asked to self-quarantine, are being asked to work from home. How do you think that changes, if it changes, the attention economy that these social media sites and platforms have relied so heavily on? Do you think it's a boon? Do you think it will hurt them in any way? What's your take? 

 

Sarah [00:29:08] Well, I think that there is a positive side that we've seen—the people who are doing happy hours over live video chat or doing live concerts for their fans or doing online yoga lessons when they can't have their studios open. We're seeing a lot of uses of the internet in this time of isolation that are very positive. Like if Facebook went down or Instagram went down, we'd lose a source of connection to each other. And so I think in that sense, you know, I mentioned earlier that traffic on Facebook properties is surging right now beyond what they'd see on a New Year's Eve. So it really is quite clear that people rely on these platforms heavily in times of isolation. I think that it's up to us how we decide to use them. Like if it's for fear mongering and sharing unverified claims from somebody's acquaintance who works in a hospital. I've seen a lot of those text chains. 

 

Sarah [00:30:22] Sharing a text chain from a doctor at a hospital when you don't actually know who that person is, or if they're a doctor, or if someone made it up in order to spread something that they thought would go viral. Being wary of that kind of thing. But then also appreciating the live art classes for children and the ways that people are gathering to sell gift cards to support local businesses that might otherwise go under. And so there are a lot of ways that people are coming together right now. But I think, you know, this attention economy is a really great way to put it. The things that optimize for the most attention tend to be the things that cater to our worst instincts as humans. 

 

Kinsey [00:31:11] Yeah, it's like using your screen time for good, not for spreading [chuckles] information that's false or for fear mongering, which is, again, a big ask. And then lastly, I'm curious to hear if you think that there is anybody who's kind of benefiting from the increase in screen time that so many of us are probably going through right now, given that we do have an election this year. And social media has played an enormous role in the last several elections, but especially in this one, it's become sort of a focal point, given what happened in 2016. Is it a good thing or a bad thing for the campaigns that are actively pursuing the White House right now that we are spending more time on social media? 

 

Sarah [00:31:54] Well, I think that it depends on how they use it. Who's benefiting from the attention we are paying online right now? I mean, probably every video chat service [Kinsey and Sarah laugh], every new virtual—I'm going to have to do a virtual book tour. And that will be really interesting how that works out. [laughter] I think that campaigns are going to move virtual as well. We've already seen a lot of text campaigning, a lot of call-based campaigning. 

 

Sarah [00:32:27] I think that they're going to try to do the same strategies, but you're not going to see the big in-person rallies like you've been seeing, which is a big way that Trump riles up his user base. User base— [Sarah and Kinsey laugh]. Whatever we call ourselves when we're not users. 

 

Sarah [00:32:50] Yeah. I think that all of these events are going to move virtual. And it's this virtual future that everyone in the tech industry has been planning and telling us that we're about to encounter—it's going to just happen to us. 

 

Kinsey [00:33:05] OK. Well, it's definitely an interesting time to be a user or a member of a coalition or whatever we want to call ourselves [Kinsey chuckles], whether we're online or not. But, Sarah, thank you so much for coming by and talking and explaining this to us. I think that there's a lot that remains to be seen with how these social media platforms are going to operate in the next several months as everything is kind of still up in the air. But thank you so much for talking to me and explaining some of this. I really appreciate it. 

 

Sarah [00:33:31] Thanks for having me. 

 

[00:33:33] [sound of coffee being poured]

 

[00:33:34] [outro music starts]

 

Kinsey [00:33:37] Thank you so much for listening to this week's episode of Business Casual. Now, I know a lot of us are going to spend the foreseeable future on the internet. But remember what we learned in this episode: not everything that you see in the trending sidebar is as it seems. If you want to follow some reputable sources on Twitter, I recommend giving us a follow @bizcasualpod and giving Sarah a follow @sarahfrier. That's s-a-r-a-h-f-r-i-e-r. So join the conversation. Let us know your thoughts, and I will see you next time.