Episode 15 / September 8, 2017
In Episode 15 of The Redirect Podcast: How search queries related to natural disasters have changed between Hurricanes Katrina and Irma. Plus, the power of “The Algorithm.”
How Natural Disasters Influence Search Behavior
Since we focus on search day in and day out, sometimes we can’t help but apply our curiosity and methods to current events. Jason compiled a study on how search behavior has changed since Hurricane Katrina in 2005, and discussed it briefly in this week’s podcast. Read the study here: Natural Disasters’ Influence On Search Behavior.
Pat shared insights from an episode of the 99% Invisible podcast called “The Age of the Algorithm.” The episode begins by discussing the April incident in which a man was dragged off an overbooked United Airlines flight after an algorithm selected him as the “least valuable passenger” on board.
As SEOs, we always think about The Algorithm in terms of search and how to “beat” or “learn” it: How do we reach an audience better, and how do we get people to “buy what we’re selling?”
The podcast goes on to explain how algorithms are shaping our world in profound ways that we may not realize:
- Determining our ability to pay back a loan
- Determining what we see on social media
- Sorting through resumes
- Evaluating job performance
- Determining prison sentences
- Monitoring health
The goal of using an algorithm for these processes is to “replace subjective judgements with objective measurements.” But it doesn’t always work. The episode focused on how algorithms can create imbalance in the world outside of search, and how they are still widening gaps and dividing society – most notably in the courts and prisons. Some judges use “recidivism risk algorithms” (which are supposed to assess how likely it is that a person will break the law again) to determine the amount of bail, the length of a sentence, and the likelihood of parole.
The podcast also featured mathematician Cathy O’Neil, author of the book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.
She explains how an algorithm works and what it sets out to do, but notes that “in reality, every algorithm reflects the choices of its human designer.”
“People tend to trust results that look scientific, like algorithmic risk scores. ‘I call that the weaponization of an algorithm … an abuse of mathematics,’” she says. We saw this with social media during the 2016 election. It recently came to light that Russian Facebook accounts spent more than $100k on what amounts to “fake news” ads as part of an influence campaign.
We’ve previously discussed the concept of filter bubbles: We curate what we want to see on social media through our interactions with posts, and the algorithms fill in content that’s relevant to our interests. A study on the “diffusion of moralized content in social networks” (found on reddit) points out that the left and right are generally isolated from each other on social media – specifically Twitter – creating echo chambers for ideological groups.
The Russian influence campaign was able to tap into specific bubbles to spread its content. To be clear, this isn’t a judgment on, or an endorsement of, what they did, but perhaps the Russians simply took a system that was already in place, applied what they knew about targeting, and pushed their content to a massive 300-million-person retargeting audience. While the material they peddled was shady, the system was used the same way any marketer could use it, given the same amount of time and resources.
And on that note… we’ll see you next time!
Thanks for tuning in! To catch future episodes of The Redirect Podcast, subscribe on Soundcloud, iTunes, or Stitcher.