We probably see them and think they’re real Twitter profiles. Or spam. They’re those comments we see online that have little or nothing to do with the topic at hand and clearly want you to click on a link.
Those are bots – computer programs created by humans to pretend to be humans and interact with other users.
The panel was entitled “Robo President: Politics in an Algorithmic World.” On the panel were Tim Hwang, the founder of Pacific Social and a researcher for Data and Society; Meredith Broussard, a data journalist and assistant professor at New York University; Andres Monroy-Hernandez, a researcher at Microsoft Fuse Labs; and Samuel Woolley, a researcher with the Computational Propaganda Research Project.
There are different kinds of bots. There are activist bots, like the Stay Woke Bot created to support the Black Lives Matter movement. There are journalism bots that gather data for reporters. And there are political bots meant to send tweets on behalf of a politician.
Monroy-Hernandez talked about the concept of bots tweeting for politicians, as well as those created to show fake support for a politician.
He said the practice has a long history in Mexico, predating the technology. A political candidate would essentially bribe people to come to campaign events by giving them food; the poor were the main targets, enticed with a free lunch.
“People are transferring this style of support on computers,” he said.
Though this sort of voter deception is a negative use of bots, Monroy-Hernandez said there are many ways different causes and organizations could put bots to positive use.
Broussard teaches data journalism at NYU and created a bot that sorts through data and essentially writes articles for some of the more repetitive stories reporters have to produce.
“When you write some of these stories it feels like some of them are essentially the same,” she said. And some of this technology is already being used. “When earnings reports come out, the AP uses it to come out with a story.”
She also mentioned Quakebot, used by the LA Times and connected to the USGS earthquake feed. Whenever an earthquake happens, she said, Quakebot tweets about it and even writes an article about the event in mere seconds.
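To make the concept concrete, here is a minimal sketch of template-based story generation in the spirit of those systems. The event record, field names and wording are hypothetical stand-ins for a parsed USGS feed entry; this is not the LA Times’ actual Quakebot or the AP’s earnings software.

```python
# A minimal sketch of template-based story generation in the spirit of
# Quakebot and the AP's earnings stories. The event record below is a
# hypothetical stand-in for a parsed USGS feed entry, not real code
# from either newsroom.

TEMPLATE = (
    "A magnitude {magnitude} earthquake struck {distance} miles from "
    "{place} on {day}, according to the U.S. Geological Survey. "
    "The quake occurred at a depth of {depth} miles."
)


def write_quake_story(event: dict) -> str:
    """Fill the story template with the fields of one earthquake event."""
    return TEMPLATE.format(**event)


if __name__ == "__main__":
    # Hypothetical event, shaped like a simplified USGS feed record.
    sample_event = {
        "magnitude": 4.2,
        "distance": 6,
        "place": "Westwood, California",
        "day": "Tuesday morning",
        "depth": 5.6,
    }
    print(write_quake_story(sample_event))
```

The point is that everything variable in the story lives in the data, so a bot can publish within seconds of the feed updating, which is exactly the kind of repetitive story Broussard describes handing off to software.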
Broussard created the Story Discovery Engine while trying to find more information on the education system and standardized testing.
What came from it was an algorithm that, rather than working as a question-answering system providing smart answers (systems of that kind already existed), responds with visualizations of the data.
The program helps journalists come up with creative investigative story ideas.
“The hardest (story) to come up with is high-impact investigative stories,” Broussard said. “Most come from a whistleblower…instead of the reporter’s own inquiry. This leaves the reporter with a data visualization to trigger their own creativity.”
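The idea can be illustrated with a toy version of that workflow: instead of returning an answer, the tool charts the data and flags outliers for the reporter to chase down. The school scores below are invented sample data, and the sketch is only a gesture at the concept, not Broussard’s actual engine.

```python
# A toy illustration of "answering with a visualization": chart the data,
# highlight outliers, and leave the interpretation to the reporter. The
# school scores are invented sample data, not real education records.
import statistics

import matplotlib.pyplot as plt

# Hypothetical average standardized-test scores by school.
scores = {
    "School A": 72, "School B": 68, "School C": 31,
    "School D": 75, "School E": 70, "School F": 94,
}

mean = statistics.mean(scores.values())
stdev = statistics.stdev(scores.values())

# Flag schools more than one standard deviation from the district mean:
# not answers, just starting points for the reporter's own inquiry.
flagged = {name for name, s in scores.items() if abs(s - mean) > stdev}

colors = ["tomato" if name in flagged else "steelblue" for name in scores]
plt.bar(list(scores.keys()), list(scores.values()), color=colors)
plt.axhline(mean, linestyle="--", color="gray",
            label=f"District mean ({mean:.0f})")
plt.ylabel("Average test score")
plt.title("Schools flagged for follow-up reporting")
plt.legend()
plt.show()
```

A chart like this does not say why one school sits so far below the mean or another so far above it; it simply hands the reporter a place to start digging, which is the role Broussard describes.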
Hwang took over and advocated for bots, saying they have a bad reputation but can also be used for good.
“The debate shouldn’t be about automation or not automation, but who controls the automation,” he said.
One of the biggest problems with bots is that they are often used without moderation. The more controversy, the more engagement a site gets, he said, so many bots are not made to hold back.
Bots built to do good things have also unwittingly caused problems, like one bot created to make online purchases that eventually ended up buying illegal drugs.
“We need to have more of a conversation about who is making the algorithm,” Broussard said. “And if they’re making them in a responsible manner.”
Hwang gave a few examples, however, of how bots can do good. They can introduce and connect people online from different social spectrums. They can provide data and information. They can do many customer-service tasks.
For Broussard, her creation is helping journalists do their job.
“When I say I work in artificial intelligence in journalism, it means tech is going to supercharge the reporter,” she said. “Like a jet pack. To do what they do, but better.”
The group also discussed bot literacy: whether the public knows what bots are, and whether people need to be able to recognize them.
Hwang said norms, like other things, change, and bots may be to us in the future what the billboard is to us now. Billboards may have been new and disruptive back in the day, but now they are something ordinary we all understand.
“I wonder whether or not we’re waiting for the norm to settle around bots as well,” he said. “It doesn’t make sense to talk about literacy if the norms have not been settled in some sense.”
Below are brief interviews with Woolley, Hwang and Broussard: