For the fainthearted . . .

When algorithms go wrong

A group of three people sits at a glass-topped table on a lawn: a boy leans on his left elbow, holding open with his right hand a book he is reading. A girl sits on the lap of a pony-tailed woman; the two are deep in conversation. Beneath the table a shaggy brown dog looks around expectantly; a tennis ball lies at the woman’s left foot. On the table are croissants and coffee cups. The caption beneath the picture reads,

“Have you explored the potential of sustainable investing? Here’s a glimpse of what it could mean for you. The value of investments can go down as well as up. Your capital is at risk.”

The Instagram post was an advertisement from something called UBS, a Swiss bank based in Zurich. There was no clue as to why it would appear among the posts I see on Instagram, no explanation couched in such terms as, “you are seeing this advertisement because . . . ” It was an odd advertisement: would people like me, scrolling through Instagram pictures, be likely to think, “Those were nice images; I think I’ll invest some money with a Swiss bank while I am here”? Presumably, among Instagram users, there are very affluent people who would contemplate transferring money to Zurich; presumably, there are Instagram users who are very different from those of us who just snap everyday images with our smartphones and share them online with a few dozen others; or perhaps the advertisement simply ended up in the wrong place.

One assumes that a company as big as Instagram has carefully devised software to determine to whom advertisements are shown; it seems unlikely that a UBS advertisement would appear on the screens of the millions of young teenagers who use the Instagram app. But who takes the decisions as to how and when such advertising appears? Algorithms may be employed to implement an advertising campaign, but someone, somewhere must have decided who the target audience should be and attempted to devise an appropriate algorithm. The algorithm seems to have malfunctioned when an advertisement directed at a thirty-something, heterosexual male of very high net worth, with a young family, appears on the wrong screen.

The cost of the odd banking advertisement going astray is infinitesimally small; its appearance on my screen, triggered by something in my profile or activity that fell within the parameters of the algorithm, is of no consequence. Other algorithmic activity can be more problematic: in the past, one bank’s algorithm caused it to decline a payment to a ferry company, and another bank was prompted to hold a payment and send an email warning about online scams after I had made an innocuous purchase from a well-known online trader.

At worst, the malfunctioning algorithms have been a personal inconvenience, but if they can get little things wrong, what about big things? Should we trust computer programmes with decisions regarding security or immigration or health care when they cannot determine that a fifty-seven-year-old male with no money is an unlikely client for UBS? What if a really big mistake were made?
