Nobody looks at the world with absolute objectivity. So why should it surprise us that the digital helpers embedded everywhere from our Google results to our Facebook feeds are just a little bit slanted?
Algorithms abound in our digital lives, sifting through virtually endless caches of information in a process that is both instantaneous and mysterious. But technology author and professor Ramesh Srinivasan says it's time to pull back the curtain, so the world's 3 billion internet users know why they're seeing what they see online.
"There are two mechanisms by which algorithms tend to be configured," Srinivasan explains. "One is whoever tends to be the most popular in the digital world tends to be reinforced in their search results."
"Popular," in this case, means a person who has been searched for or linked to the most.
"The other is [when] algorithms are personalized, meaning that the algorithms are measuring the digital activity that we're already engaging in and they're feeding us information based on our existing biases or existing digital appetites," Srinivasan says.
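The two mechanisms Srinivasan describes can be sketched in a toy ranking function. This is purely illustrative, not any real company's code; the field names, weights, and scoring formula are all assumptions made up for the example.

```python
# Toy sketch of the two ranking mechanisms described above:
# popularity reinforcement and personalization. All names,
# weights, and data here are hypothetical.

def rank_results(items, user_interests, personalization_weight=0.7):
    """Score each item by global popularity plus overlap with the
    user's past interests, then sort best-first."""
    def score(item):
        # Popularity: items already widely linked or searched for
        # rise to the top, reinforcing their existing visibility.
        popularity = item["link_count"]

        # Personalization: items matching topics the user already
        # engages with are boosted, feeding existing appetites.
        affinity = len(set(item["topics"]) & set(user_interests))

        return ((1 - personalization_weight) * popularity
                + personalization_weight * affinity * 1000)

    return sorted(items, key=score, reverse=True)

items = [
    {"title": "Mainstream story", "link_count": 900, "topics": ["politics"]},
    {"title": "Niche blog post", "link_count": 40, "topics": ["art"]},
]

# A user who mostly reads about art sees the niche post ranked first,
# even though the mainstream story is far more popular overall.
print([i["title"] for i in rank_results(items, user_interests=["art"])])
# → ['Niche blog post', 'Mainstream story']
```

Note how the same two items would rank in the opposite order for a user with no recorded interest in art: that is the personalization Srinivasan says users rarely see explained.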
While many might have suspected that their internet experiences tend to revolve around their preferences, Srinivasan says few stop to ask whether that's a good thing.
"Part of the problem with personalization is we're not exactly sure what choices are being made in terms of what's being personalized for us," Srinivasan says. "We shouldn't just think of technology as impacting us as individual users, we should think about the communities and cultures that algorithms are being designed for."
Srinivasan says that without this sensitivity, most internet users will be exposed to only a narrow selection of what's really available online, reinforcing certain ideas and perspectives while burying others under mounds of competing results.
"If the codes of technology that dominate people's experiences all over the world are just written in a few small offices, in cubicles, in software design rooms by engineers here in California, our visions and our perspectives are basically being transmitted through these digital networks that are impacting the larger world," Srinivasan says.
On social media, he says algorithms can keep people in their own thought bubbles. But he has a suggestion.
"These technology companies need to go give people a sense of why they see what they see and what other options there might have been," he says. "We need to explain what types of algorithms we're providing to people and why."
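One way to picture the transparency Srinivasan is calling for: alongside each result, the system could emit a plain-language note on why it was selected. The sketch below is a hypothetical illustration of that idea; the function, thresholds, and wording are invented for this example, not drawn from any real platform.

```python
# Hypothetical sketch of per-result explanations, illustrating the
# kind of "why you see what you see" note Srinivasan proposes.
# Field names and the popularity threshold are assumptions.

def explain_ranking(item, user_interests):
    """Return a plain-language note on why an item was shown."""
    reasons = []
    if item["link_count"] > 500:
        reasons.append("widely linked across the web (popularity)")
    shared = set(item["topics"]) & set(user_interests)
    if shared:
        reasons.append(
            f"matches your past activity on: {', '.join(sorted(shared))}"
        )
    if not reasons:
        # The bridging case: content shown precisely because it falls
        # outside the user's usual bubble.
        reasons.append("shown to diversify your feed beyond your usual topics")
    return f"{item['title']}: shown because it is " + " and ".join(reasons)

print(explain_ranking(
    {"title": "Niche blog post", "link_count": 40, "topics": ["art"]},
    ["art"],
))
```

Even a note this simple would tell users which of the two mechanisms, popularity or personalization, put a given item in front of them.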
And Srinivasan highlights another potential benefit: letting people look behind the curtain might even change the timbre of the nation's political dialogue.
"[Are programmers] choosing to expose Ramesh Srinivasan to information coming from Trump supporters? Because I tend not to be a Trump supporter. So these algorithms can start to bridge our divides rather than reinforce our division," Srinivasan says.