How NOT to Handle New Technology

Rite-Aid's disastrous rollout of facial recognition offers lessons to insurers as they try to figure out how to use generative AI.


My mantra for innovation has for decades been, "Think Big, Start Small, Learn Fast." Among the many possible ways to get that wrong, large companies seem to go with, "Think Big, Start Big." 

If they're lucky, they just wind up wasting money when an innovation, inevitably, doesn't roll out as envisioned. Less lucky companies also alienate a significant chunk of their customers by presenting them with some clunky new product or process. Really unlucky companies also wind up in legal trouble. 

Well, Rite-Aid hit the trifecta when it broadly rolled out facial recognition technology that was designed to prevent shoplifting. It lost a bunch of money, ticked off a host of customers and wound up in trouble with federal authorities, including for racial bias.

The nuances of facial recognition bear some resemblance to the issues insurance companies are encountering as they seek breakthroughs with generative AI, so Rite-Aid is worth understanding as a cautionary tale.

Let's look at Rite-Aid to see what NOT to do.

Facial recognition technology returns a lot of false positives. It also has inherent racial bias, because the technology does a much better job of recognizing the faces of white people than of Black people or Latinos. And that's today's generation of the technology -- the latest and greatest version. 

Yet Rite-Aid began rolling out the technology way back in 2012, and it did so to hundreds of stores, not in some sort of tightly controlled pilot. 

Like many "Think Big, Start Big" mistakes, Rite-Aid's was driven by a combination of a compelling business need and the promise of a breakthrough technology. 

Rite-Aid, like all pharmacies, has a real problem with shoplifting, a high percentage of which is committed by a small number of people. So you can imagine why Rite-Aid would be susceptible to a sales pitch about magical technology that would let it know when people convicted or suspected of theft entered a store. 

In any case, Rite-Aid jumped right in. It built a database and had the technology alert employees when someone thought to be a match walked into a store. An employee would then follow the person around or call the police, even if they hadn't seen a crime committed. 

The problems should have been obvious to Rite-Aid. According to the Washington Post:

"Huge errors were commonplace. Between December 2019 and July 2020, the system generated more than 2,000 'match alerts' for the same person in faraway stores around the same time, even though the scenarios were 'impossible or implausible,' the [Federal Trade Commission] said. In one case, Rite Aid’s system generated more than 900 'match alerts' for a single person over a five-day period across 130 stores, including in Seattle, Detroit and Norfolk, regulators said."

Once, "employees called the police on a Black customer after the technology mistook her for the actual target, a White woman with blond hair," the article said. "The system generated thousands of false matches, and many of them involved the faces of women, Black people and Latinos."

Rite-Aid, which reached a settlement with the FTC last week that includes a promise not to use facial recognition technology for five years, said it was merely conducting "a pilot" test and noted that it ceased using the technology in 2020. The company filed for bankruptcy protection in October because of losses and opioid-related lawsuits and is shrinking and reorganizing.

Insurance companies would do well to keep Rite-Aid in mind as they consider the opportunities presented by generative AI. The temptations might be more manageable in our industry, because we don't face a problem as sharply defined as shoplifting, one that could seemingly be made to disappear if generative AI worked perfectly. But temptations are still there.

Whatever you're considering, you should certainly think big, because big opportunities are out there. But start small. Think Big, Start Big is a recipe for failure, even if you don't end up with the FTC knocking on your door with an armful of subpoenas.



P.S. If you're interested in reading more about Think Big, Start Small, Learn Fast, my friend and frequent co-author Chunka Mui lays out our rubric here.