I, algorithm: Can data-driven decision-making lead to dumb results?

November 21, 2016 | 5 Minute Read



Data-driven decision-making is more than just business intelligence dashboards or personalized recommendations in your Netflix queue. It’s now impacting your chance at scoring a new job, finding a date, or signing a lease on an apartment. While big data may be helpful when you’re trying to find the best deal on a plane ticket, is it ever unfair in a way that hurts people? As IT leaders, when do we need to stop looking at the data and focus on the person?

For one anonymous Canadian applicant, a job offer rescinded over his credit score felt like an absurd penalty for being “out of work for a while.” In an interview with the Huffington Post, the Ontario resident expressed frustration that his potential employer took his credit score as a sign he’d steal money. Even TransUnion government relations director Eric Rosenberg has admitted there’s no statistical correlation between poor credit and the likelihood of committing fraud.

Big data is an amazingly valuable tool for finding the right book on Amazon or getting the right dosage of antibiotics from your doctor, but are we beginning to see its limitations as a form of intelligence? Data-driven decision-making in the real world has both positive and negative aspects, especially where human romantic and socioeconomic relationships are concerned.

Can data crack the code on romantic compatibility?

For the 11 percent of American adults who have dipped their toes into the online dating pool, big data is playing a bigger role than ever before. Chances are, your matches are ranked by perceived compatibility, based on how you’ve answered a series of questions about your hopes, dreams, and feelings about horror films.
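To make that concrete, here’s a minimal sketch of how questionnaire-based match ranking might work. The questions, weights, and scoring scheme are invented for illustration; real dating sites use far more elaborate (and proprietary) formulas.

```python
# Hypothetical sketch: rank potential matches by weighted agreement on
# questionnaire answers. Questions and weights are invented, not any
# real site's algorithm.

def compatibility(answers_a, answers_b, weights):
    """Weighted fraction of questions two users answered the same way."""
    total = sum(weights.values())
    agreed = sum(w for q, w in weights.items()
                 if answers_a.get(q) == answers_b.get(q))
    return agreed / total

weights = {"hopes": 3, "dreams": 2, "horror_films": 1}

me     = {"hopes": "travel",    "dreams": "startup", "horror_films": "love"}
match1 = {"hopes": "travel",    "dreams": "startup", "horror_films": "hate"}
match2 = {"hopes": "stability", "dreams": "startup", "horror_films": "love"}

candidates = {"match1": match1, "match2": match2}
ranked = sorted(candidates,
                key=lambda name: compatibility(me, candidates[name], weights),
                reverse=True)
print(ranked)  # ['match1', 'match2'] -- match1 agrees on 5/6 weighted points
```

Note what the sketch makes obvious: the ranking is only as good as the questions asked and the weights assigned to them, which is exactly where chemistry escapes the model.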

While it may seem strange to apply algorithms to the highly chemical and inexact science of attraction, some proponents insist it really works. To an extent, anyway. When CEO Amy Webb famously “hacked online dating” with a 72-point set of criteria for finding compatible matches, she discovered there was more to romance and attraction than she ever thought.

Professor Kang Zhao, a pioneer in dating algorithms, created a self-learning algorithm that looks beyond stated compatibility to match partners by perceived attractiveness. While Zhao has not specified exactly how well this method works, he told the MIT Technology Review that users had a better chance of getting responses from the matches his algorithm suggested.

Algorithms aren’t biased—but they don’t look for “hustle”

For IT pros, hiring algorithms are an application of big data that may hit awfully close to home. The explosion of “people analytics” means that data-driven hiring decisions are no longer reserved for elite organizations like Facebook. Supporters of this approach argue that it can lead to objectively better and more diverse hires than if a human hand-picked each applicant.

The National Bureau of Economic Research found that for 300,000 applicants in “low-skill, service-sector jobs, such as data entry and call center work,” a hiring algorithm was an accurate predictor of employee tenure. This could be because the algorithm is objective, whereas it’s natural for hiring managers to favor candidates they can get along with, whether or not they’re the best fit. But while algorithms may be capable of removing human biases that work against the best candidate, not everyone is sold on the idea. Founder and chief executive of Millennium Search Amish Shah told The New York Times, “I look for passion and hustle, and there’s no data algorithm that could ever get to the bottom of that. It’s an intuition, gut feel, chemistry.”
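Shah’s objection is easy to see in a toy model. The sketch below predicts tenure from a few measurable features; the features, weights, and baseline are entirely invented for illustration (real people-analytics models are fit to historical outcome data). The key observation: “hustle” never appears, because the model can only score what’s in its feature list.

```python
# Hypothetical sketch of an "objective" hiring score: predicted tenure
# as a baseline plus weighted, measurable applicant features. All
# numbers are invented for illustration.

FEATURE_WEIGHTS = {
    "typing_speed_wpm": 0.02,        # relevant to data-entry work
    "years_related_experience": 0.5,
    "referred_by_employee": 1.0,     # 1 if referred, else 0
}

def predicted_tenure_months(applicant, base=6.0):
    """Linear score: baseline tenure plus weighted feature contributions."""
    return base + sum(w * applicant.get(f, 0)
                      for f, w in FEATURE_WEIGHTS.items())

fast_referred = {"typing_speed_wpm": 80,
                 "years_related_experience": 2,
                 "referred_by_employee": 1}
newcomer      = {"typing_speed_wpm": 50,
                 "years_related_experience": 0,
                 "referred_by_employee": 0}

print(predicted_tenure_months(fast_referred))  # 6 + 1.6 + 1.0 + 1.0 = 9.6
print(predicted_tenure_months(newcomer))       # 6 + 1.0 = 7.0
```

The model is consistent and blind to rapport, which is its selling point, but it is also blind to anything its designers didn’t think to measure.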

When big data at work gets a little sketchy

While data is revolutionizing the professional realm in plenty of positive ways, like energy-efficient smart offices, there’s still plenty of controversy. Biometric data collection on employees is one area with hotly contested ethics. From HR’s perspective, knowing what percentage of employees are at risk for health problems could be incredibly helpful. But where does the line between data-driven insight and personal privacy fall?

The City of Houston recently presented its employees with a hard bargain: either participate in a wellness vendor program or pay $300 more for coverage. Though employees revolted, the organization likely won’t be the last to recognize the employer benefits of employee data.

Some experts believe that biometric technology may be moving faster than companies know what to do with. World Privacy Forum executive director Pam Dixon is among those telling employers to proceed with caution before collecting fitness-tracker data or genetic samples from their talent. “The bottom line is,” Dixon told the Society for Human Resource Management, “there are pieces of [biometrics] that are not regulated.”

When algorithms rule (and when they just don’t)

Data has limits. This doesn’t mean you should unplug your Hadoop cluster, or ask your doctor to hand-crunch the numbers for your laser eye surgery. But IT pros need to understand the limitations of big data decisions in the real world, and use this filter for smarter and more ethical choices at work.

Google Flu Trends famously failed to predict influenza outbreaks. This doesn’t mean algorithms can never work for epidemiology; it just means the field is still a work in progress. Any artificial intelligence, new or old, has flaws because it’s built by humans, who are prone to mistakes. And algorithms fail to account for anything outside their programmed logic.

Data-driven decision-making algorithms perform well in a wide range of contexts. In many cases, they’re an effective tool for removing bias and error. However, IT managers need to recognize that technology is only as smart as its human creators, and misapplications of artificial intelligence can lead to some really dumb decisions.
