Podcast: The problem with ‘Do your own research’

Francesca Tripodi

“Do your own research” sounds like a harmless, even unassailable, suggestion. It’s more complicated than that, says today’s guest, UNC professor Francesca Tripodi, author of the new book The Propagandists’ Playbook. In this episode of Debugger in 10, Tripodi discusses the IKEA Effect in research, why Googling is more like a scavenger hunt, and how algorithmic polarization means the question you ask determines the answers you get.

Tripodi is an assistant professor at the University of North Carolina’s School of Information and Library Science and a senior researcher at the Center for Information, Technology, and Public Life. She loves research. So why is she so concerned about the phrase “Do your own research”? Click play below to listen to our conversation, or click here to listen on Spotify. A transcript of our conversation appears below.

Debugger is available on all major podcast platforms.

—————————————— FULL TRANSCRIPT ——————————————

Francesca Tripodi: So the intent behind doing your own research is not the problem. The problem is that many of us don’t understand how search engine returns are organized, and we also don’t realize that people who are spreading conspiracies and those who are involved in political manipulation have a really, really good understanding of how search engines return information.

And so they seed content and they tag it, and then they suggest that you look for specific things. So it’s not the research itself that’s the problem. It’s that many of us don’t understand the tools and how those tools are being manipulated.

Bob: You use this phrase, the IKEA Effect of misinformation.

Francesca Tripodi: So the IKEA Effect comes from business scholars who find that when people put together furniture themselves, they value it more. And I use this same concept to think about information seeking. It’s also tied to the United States specifically. A lot of the foundational work within the United States is connected to Protestantism, which is bound to this idea that individuals have the same ability to read the Bible or engage with higher texts, without necessarily relying on an expert.

And so if you come to a problem or an idea with this very individualist framework and you search for information yourself, you value what you find more, not realizing that it’s not so much putting together information as it is a scavenger hunt, and that the path they are hoping you take has been highly cultivated ahead of time.

Bob: This is an incredible unveiling of a way of looking at things. You go to Google, you type something, you think you’re just getting raw information, but instead you’re a participant in a scavenger hunt that you don’t even know you’re playing.

Francesca Tripodi: Absolutely. This is a problem not just for Google, though Google is the dominant way that people search for information. With any search engine that you’re using, whether that be DuckDuckGo or even something newer like TikTok, those starting points have a really big influence on where you end up, and people are manipulating that, right? They have a deep understanding of the importance of keywords, and they are suggesting that you go out there and do your own research, not about just anything, but about this in particular. And that is why it’s so problematic.

Bob: So “Do your own research,” but it’s not really your own research.

Do your own research to find what I’ve laid there for you to find.

Francesca Tripodi: Yeah, do your own research.

Bob: People are very familiar at this point, I think, with the concept of search engine optimization when it comes to advertising. You’re talking somewhat about search engine optimization in politics, right?

Francesca Tripodi: Absolutely. So search engine optimization in itself is not necessarily nefarious. A lot of businesses use it to elevate their content and ensure that you click on what they want you to click on. Politics now plays the same game. If it’s about getting information on a candidate out there, or making sure the candidate’s website is one of the top returns when you search for the candidate’s name, I would say that’s search engine optimization used as intended, right? You are trying to make sure that when people search for your name, they find information related to your platform.

That is different than creating a conspiracy, tagging it with the name of a person who otherwise does not really exist in politics, and then using every mechanism possible to drive people to search for that phrase. That effectively tricks the signals, so that search engines think the information being clicked on is very reliable. They often hyperlink it to one another, which also sends a signal to search engines that this is quality content. That is a different kind of gaming, taking SEO, I guess, to the next level.

Bob: And is this how you get to phrases like algorithmic polarization? Algorithmic exploitation?

Francesca Tripodi: So something like algorithmic polarization, I think, is much more about the starting point. What I also argue in my book is that many of us don’t realize that our starting points really drive our returns in ways that we might not necessarily recognize. An example I give of algorithmic polarization is how this is tied to what sociologists refer to as a deep story: those stories that we hear around the dinner table or the campfire that have been told so many times that they seem true, even if they aren’t necessarily true.

And so a big one that I think a lot about is related to immigration. In the United States, do you have a deep story that connects the immigrant experience to one of opportunity and progress? Or do you have a deep story that connects immigration to things that are problematic for society? If you start your search with keywords like “non-citizen voting rights,” you’re going to get information about why people who are here, regardless of their immigration status, should participate in the democratic process. If you start your query with “illegal alien voter fraud,” you are going to get information about the problematic ways in which immigrants might hurt democracy. Those starting points are what I refer to as algorithmic polarization, because you’re effectively building a filter bubble around yourself in ways that you might not even recognize.

Bob: The way that you ask the question impacts the answers that you get.

Francesca Tripodi: Absolutely. We often think of Google as this window into the wider world, or like a helpful librarian, but it’s in fact often a mirror, right, reflecting these biases back to us. At the same time, you can think of it as a magic mirror, in the sense that there are people who have a very sophisticated understanding of how search engines work and are relying on that idea of inquiry and manipulating those returns in ways that people definitely don’t see.

Bob: It’s impossible to have this conversation today in early November, moments after Elon Musk has taken over Twitter, and not ask you what you think about the state of Twitter right now and how that impacts the conversation we’re having.

Francesca Tripodi: Sure. Twitter is very different from a search engine, although Twitter content is very much connected to search engines. The way I think of it is this: we think a lot about how disinformation on social media sites influences the democratic process. I would argue that people aren’t typically seeing things on social media and then just believing them. They are usually seeing things on social media and then searching for more information, or doing their own research on the topic.

My concern is that this is a great way to seed the internet with problematic content. If there’s less content moderation on a social media site, and it allows phrases to trend or problematic personalities to drive or dominate the conversation, then that leaks over into doing your own research, because people are more inclined to check that information. And because of the way search engines work, oftentimes people will just get the same content that people talked about on Twitter or Facebook or whatever the social platform is. But because they’ve gone to another site and done this next step, they’re more likely to be invested in this information now, right? They’re like, “Oh, this is great. Now I have another source telling me this is true,” when it is in fact the exact same source; they just got it on a different platform.

Bob: So … and I’m not saying this has happened or will happen … but on a social media platform that reduces its amount of moderation, you might suddenly see a lot of comments about Holocaust denial or 9/11 conspiracy theories, and that might make you Google those things, and you would find more.

Francesca Tripodi: I would say, with things that have been thoroughly debunked, search engines have effectively downgraded conspiratorial content, right? So there was this big instance where it was found that when people searched, “Did the Holocaust happen?” the information returned to them substantiated that claim. However, that void, if you will, has been filled. There are plenty of people putting out quality content of high integrity that can combat that.

The issue that I see relates to what’s referred to as data voids. When little to nothing exists online, it becomes very easy to fill that gap with low-quality, problematic content. So as more bombastic figures are potentially let back onto platforms like Twitter, it provides them with really great reach to do that seeding, right? To say, can you believe X, Y, Z?

And then, you know, they are talking about X, Y, Z. This is what happened regarding the whistleblower in Trump’s first impeachment hearing. There was a concerted campaign to effectively dox the person who was supposedly the whistleblower. They got that name to trend on Twitter, so if people then searched for that name, they got more information created by those same content creators, furthering the conspiracy.

And it got so bad that Google was actually auto-completing this person’s name at one point. So if you take Twitter out of that equation, it lessens the likelihood that search is going to fill that void. It just creates another outlet for seeding the internet with problematic content.

Bob: So, I know some of this comes from your scholarly research, but a lot of it you learned while working on your book, The Propagandists’ Playbook. Can you talk about the research, including the immersive research, that you did for that book?

Francesca Tripodi: So the research that I did was threefold. I did physical ethnographic observations inside of two Republican groups, a women’s group and a college group. I conducted interviews with people who were part of those groups, or affiliated with them, whom I met at group events. And then, as part of the informed consent process, I asked people if I could connect with them through a professional Facebook account (it’s easier for me because I don’t have a private account), and I was then able to follow the news and information that they liked or shared or commented on. I used the data points from my interviews, my ethnographic observations, and my Facebook observations to create a curated media set of news and information that people in my study described as trustworthy sources of news.

So I took that and then I immersed myself in that news media environment. For four months, I received all of my news and information from sources that those in the study had identified as trustworthy. And I think that media immersion process, which is effectively media ethnography (extending ethnography beyond the physical and into the media environment), was really essential, because it allowed me to have a deeper understanding of the frames and narratives that were being told.

And then, after I did that media immersion, I conducted a more formal content analysis, where I used transcripts from the shows that I had listened to, along with field notes I took while listening and watching TV, for example. Doing that allowed me to have a much more nuanced understanding of the core values this programming is trying to reach, as well as the central themes it is pushing.

Bob: I know you have a whole set of themes that they’re pushing and lessons that you’ve learned from it. And I’m going to suggest that people read your book to get all of them. Is there one or two that you’d really like to highlight?

Francesca Tripodi: I guess the two things that I think are really interesting to highlight are, first, this central narrative around a dangerous left. There is a very concerted effort within right-wing media to push this idea that those on the left have become increasingly hysterical or emotionally driven, that they are run by what the right refers to as feelings over facts. And I think this is important for two reasons. One, it creates a framework that insinuates that information creators on the right are somehow fact-based, whereas we know through content analysis that they are also driving feelings like fear, and many of the facts that they use are heavily manipulated. They might start with a seed of truth, but it has been used out of context.

The other big concern I have from this overwhelming narrative of the hysterical left is it creates an environment whereby violence becomes justified. And then you see this narrative tie back to disinformation campaigns.

So in my ethnographic observations of the Unite the Right rally that happened on August 12, 2017, in Charlottesville, Virginia, as well as the aftermath of the storming of the Capitol on January 6th, there was a concerted disinformation campaign to somehow blame the violence on Antifa, which is effectively a keyword that signals this supposedly hysterical left. Not only was this found to be untrue (those who were perpetrating violence that day were not in fact leftists, but people who had been motivated by President Trump), but we also know that in trying to scapegoat these leftists, it created a window for the kind of stand-your-ground language that justifies violence.

Bob: We recently saw Marjorie Taylor Greene say that Democrats want Republicans dead and that the killing has started.

[Audio clip of Marjorie Taylor Greene at a rally, Oct. 1, 2022] “We’re all targets now for daring to push back against the regime, and it doesn’t stop at a weaponized legal system. I’m not going to mince words with you all: Democrats want Republicans dead. And they’ve already started the killings.”

Francesca Tripodi: Absolutely. That’s another great example, right? Again, this is a statement meant to seem like a fact, with absolutely no source or evidence. Clearly, you do not see overwhelming evidence, or any sort of study that has been conducted, which would argue that anybody is calling for the mass death or extinction of another side.

And this pitting of people against each other is tied to extremist language. I drew on the research of Cynthia Miller-Idriss at American University, who does a lot of really amazing work on radicalization and extremism, and I used her definition of extremism to drive the focus of my study and to better understand how the rhetoric being used is tied to extremist values that elevate white supremacy. And that is concerning.

Bob: I know this is a really complicated question and we only have a minute or so left, but I’m going to ask it anyway because it has to be asked. It’s obvious that governments have a role to play in fighting propaganda, but that notion has been pretty poisoned as well. Are there solutions that the US government can attempt to implement in dealing with this issue?

Francesca Tripodi: Hmm. I would say one thing that we’ve seen happen is the public hearings around January 6th, right? Holding public officials accountable for the lies they tell, whether or not there’s any concrete outcome, and making that visible to the public in the form of a public hearing, is, I think, a really great example of how the government can use its role as an extension of the people to inform the people.

Bob: In public, and transparency is key, right?

Francesca Tripodi: Yes, yes, doing this through public hearings. These public hearings are now archived in places like C-SPAN, but the fact that they were televised on network television is, I think, a really important way that we can ensure that democracy can continue.



About Bob Sullivan
BOB SULLIVAN is a veteran journalist and the author of four books, including the 2008 New York Times bestseller Gotcha Capitalism and the 2010 New York Times bestseller Stop Getting Ripped Off! His latest, The Plateau Effect, was published in 2013 and reissued as a paperback, Getting Unstuck, in 2014. He has won the Society of Professional Journalists’ Public Service Award, a Peabody Award, and the Consumer Federation of America’s Betty Furness Award, and has been given Consumer Action’s Consumer Excellence Award.
