Since the dawn of language, there have been those who wanted to confuse the rest of society by spreading chaos and discord. The ascension of the internet has allowed more access to information than ever before, so surely that means it should be easier to discern truth from falsehood, right? After all, knowledge is power.
As it turns out, the digital age is a double-edged sword. The internet has created a perplexing paradox: the more access we have to all types of information, the less we are certain of what is true.
The real danger comes when citizens do not take the time to reflect on the media they consume. When website owners push out content, they typically do so in order to gain increased viewership. They achieve this by giving people information they want to see — information that aligns with their values. This is known as confirmation bias, the tendency of people to favor information that confirms or strengthens their beliefs.
However, many people are already aware of these biases when consuming information on social media. Paula Silva, a Miami local, expresses this sentiment.
“Sometimes, people read what they want to read, so they go to the sources they like rather than others because it’s like a comfort,” Silva said. “Nowadays, anyone can make information online — anyone can fake a news article. If I want to know if something is true, I ‘Google’ it to see if other sources are actually reporting it.”
Kurt Sampsel, the senior program manager for disinformation with PEN America, has investigated online trends and how the spread of misinformation has rapidly increased over the years. Throughout his career, he has done extensive research on information ecosystems. His efforts aim to defend individuals from disinformation by encouraging them to reflect on where their media comes from. However, he believes the issues that we see today are not easy to fix.
“I think it’s really important that people reflect on who is in [their own] information ecosystem and what perspectives maybe aren’t reflected there,” Sampsel said. “There is a lot to learn from people we might disagree with.”
Researchers have pointed to the phenomenon of echo chambers for years, in which a person’s own biases keep them stuck in an information bubble with views that always align with their own. According to FIU Professor Laura Kurtzberg, a platform’s algorithm can contribute to creating more echo chambers.
“Once they are there on whatever particular site, that site’s algorithms are designed to boost posts that spark outrage and create engagement, drowning out content from professional news outlets in favor of more sensationalist stuff,” Kurtzberg said. “The abundance of people posting misinformation drowns out knowledgeable or reliable information.”
The rise in misinformation can be attributed to the rise of the digital age. As more voices can be heard, more outlandish claims can spread and reach audiences that were once out of reach.
“I always reflect on one of the biggest paradoxes of our time here in the 21st century,” Sampsel said. “We have greater access to information than we’ve ever had before, but the result has not necessarily been a better-informed public than we’ve ever had before.”
This leads to dangerous territory when the idea of free speech is involved. The marketplace of ideas theory is a concept rooted within this right as a way to fuel democracy. It posits that in an open and unrestricted marketplace of ideas, the best and most truthful ideas will naturally rise to the top through debate, discussion, and scrutiny.
The theory was established in John Stuart Mill’s 1859 publication On Liberty. Mill argues against censorship and in favor of the free flow of ideas, asserting that no single idea alone embodies the truth. He claims that the free competition of ideas is the best way to separate fact from fiction. Yet, the digital age contradicts this theory.
The rise in misinformation online has inspired citizens to advocate for more legal protections that do not infringe on the First Amendment. Miguel Torres, another Miami local, expresses this belief as a potential solution.
“My family reads a lot of misinformation all the time on Facebook, conspiracy theories about Biden and Covid. It’s not monitored at all,” Torres said. “There should be a law to hold these platforms accountable and force them to have to monitor misinformation more strongly.”
However, according to Dr. Karla Kennedy, a mass communications professor at FIU, laws that encroach on First Amendment territory are more controversial. Due to the specific language in the Constitution, social media is technically not protected by the First Amendment — at least, not in the way we think.
“Social media is a private industry that is not regulated by the Constitution,” Kennedy said. “It is a private entity owned and operated by a private industry…social media is not regulated by the First Amendment because it’s not a public forum, it’s not a private form that operates as a newspaper or as any type of actual journalistic piece.”
“We really have to understand that social media is not media,” Kennedy continued. “You might get some of your news from it. You might get some information from it, but it’s not verified. It does not follow the tenets of journalism that there haven’t been interviews and there hasn’t been extensive research.”
Before the rise of technology, people gathered in a public square to debate and exchange information, ideas, and points of view with each other. Today, this has been replaced by a digital public square, and the debate often takes place on social media platforms like Twitter (now known as X) and Facebook. Although it would be ideal to hold these companies and people accountable, the way our legal system works makes it challenging to do so.
When the public has no clue what is true and false, it doesn’t only limit our right to free speech and democracy — it limits how much we truly know about the world around us. So, how do we combat it? Sampsel believes it’s not as easy as fighting each source of misinformation.
“We can go around the United States and pick up every beer can alongside the road or every bit of plastic trash, but that’s really not a realistic way to respond to the challenge of environmental degradation,” said Sampsel. “And in the same way, I think the effort to get rid of every single piece of false or misleading information is probably not a realistic way to respond to the challenge.”
Similarly, Dr. Kennedy argues that combating misinformation is a complex issue because it relies on the behavior of each and every individual.
“It is definitely going to be up to society in generations to come as to how we deal with this social media thing,” Kennedy said. “It’s not going to be black and white. It has nothing to do with money. It has nothing to do with skin color. It has nothing to do with education. It has nothing to do with economic status. It just has to do with being a human being and knowing when to put it down. And the government is not going to tell you when to put it out because they really don’t have any right to tell you to put it right.”
We are in uncharted territory. The reality is that there is no definitive answer to solve this issue — not yet, anyway. There are too many complexities at play, especially since many possible solutions require every person on the internet to change their behavior. However, one way we can try to combat misinformation is to emphasize the importance of critical thinking — training our brains to think more deeply, as we did before we relied on this technology for everything.
“I think a good place to put attention is on those individual factors,” continued Sampsel. “And even kind of being hard on yourself to think twice about, you know, reaching for [your] phone for a moment of pleasure. Why am I doing that? What comment? What content am I expecting to see…I think that those are really worthwhile questions for us all to ponder.”