In March, TikTok CEO Shou Zi Chew was hammered during five hours of testimony before Congress. Representatives assailed the app's safety because of its Chinese ownership and expressed admiration for the RESTRICT Act, which aimed to ban the social media platform.
The hearing resulted in online hysteria. CNN’s headline read, “TikTok users are making fun of Congress members for their questions to app CEO Shou Chew.” The Harvard Crimson asked “What the Hell Happened: TikTok Legal Drama.” And Business Insider wrote that “TikTok has no chill: Users are defending CEO Shou Zi Chew over his congressional appearance so vehemently that some have started lusting for him.”
But just two weeks before that frenzy, a far more measured and constructive conversation was held in a hearing of the House Cybersecurity, Information Technology, and Government Innovation subcommittee. Academic and industry artificial intelligence experts discussed the impending AI revolution and the need for guidelines and protections.
It got two headlines.
“You see that a lot in today’s political climate,” said Jennifer Hernandez, the head of analytics at the House Appropriations Committee. “You will hear a lot of the noise, whatever is the loudest or most controversial, that’s what will get covered, not necessarily the more constructive conversations [that] are happening.”
In her role, Hernandez is responsible for assessing, designing and developing data systems that enable data-driven decision-making for the committee. She previously served as the senior data scientist for the City of Miami, a role she took on in 2019, and is an FIU alum.
At both the city and federal levels, data collection informs and facilitates policymaking. In Hernandez’s experience, legislators are often left wanting more information after she presents her findings. They want the data insights because they help them make decisions and also show constituents that they’re doing the work needed, which holds them accountable.
“When I put the information in front of decision makers, they just start asking more questions,” said Hernandez. If someone asks for a number she doesn’t have or can’t collect, they ask what needs to be done to attain it or to change the process, “and that actually creates a really healthy feedback loop,” she said.
Federal regulation and protections are still far on the horizon, but some states have been proactive about considering and enacting measures. Across 25 states and Puerto Rico, almost 140 consumer privacy bills were introduced or considered in 2023 (Florida was not among them).
In recent years, congressional committees have advanced several measures. The “Fourth Amendment Is Not For Sale Act,” introduced in 2021, is a bipartisan effort to stop data brokers from selling personal information to government authorities without a warrant. The “Data Protection Act of 2021” would establish a national agency to oversee high-risk data practices. And the “American Data Privacy and Protection Act,” approved in committee in 2022, would increase oversight of how companies use AI in their businesses.
But the bills have stalled, collecting dust as they await a full House vote.
The nation’s recent hyperfocus on data collection excites Hernandez, who believes data collection and cyber safety are topics everyone should learn about.
“I think all these different hearings, legislation, and conversations that have been going around, are actually extremely useful,” Hernandez said. “When it comes to technology, I think this is one of these fields that everybody should be able to voice their opinion and voice concerns because ultimately, you have to build trust.”
The RESTRICT Act may have gotten people talking about data protection, but it hasn’t brought a solution any closer.
Willmary Escoto is a U.S. policy analyst at Access Now, a civic and social organization that takes a multifaceted approach to defending the digital rights of people in vulnerable communities. Her work focuses on content governance, privacy, artificial intelligence and data protection.
Escoto called the RESTRICT Act a distraction from the actual work that needs to get done. “It’s not going to get to the crux of the problem. It’s a band-aid,” she said. “It’s really a distraction from passing a comprehensive federal data protection law… and a distraction from the fact that we need global surveillance reform.”
With so much constantly being shared in the media, it’s easy for misinformation to catch fire. Proper education about AI is pivotal not just for those in public office who craft legislation, but also for the everyday American.
In the subcommittee hearing, former Google CEO Dr. Eric Schmidt proposed a federal program that would give states or universities money to administer computer and technology education, which students would then pay back through service.
Miami Dade College’s School of Engineering and Technology has implemented the AI For All program, which exposes students to AI and “aims to serve the national interest by increasing community colleges’ capacity to attract and train students in AI.”
In addition to government and schools, companies play a role in advancing knowledge.
Escoto said that companies are responsible for keeping their users as safe as possible online, and doing that means adequately informing users about how the platform works.
“It’s incumbent on [companies] to do everything in their power to protect their service and protect their product and the integrity of their product,” said Escoto. “And if that product is a social media platform that pretty much impacts everyone’s daily lives, then by default you have to protect those users’ human rights and civil rights.” Doing that, she added, requires transparency.