25 years: Behavioural science & compliance
This video looks at effective algorithms that do exactly what they were created to do, but with consequences beyond what their creators fully envisaged. Christian highlights how technology, particularly what he calls 'awful algorithms', can make bad decisions with severe consequences.
In this video, Christian looks at the times when machines get things badly wrong, how they bring out the worst in people, and how they are already negatively influencing human decision-making and will continue to do so in the future.
Key learning objectives:
Be able to define technology risk
Give some real-world examples of awful algorithms
Explain why algorithms need risk management
By 'technology risk' we don't mean the risk of technology failing, but the risk posed by technology that works exactly as intended. For example, unlike humans, machines aren't sentient (at least not yet), so they'll do exactly what they're told, regardless of context. Unlike people, they cannot question or think about the 'why' of what they're doing. Computers don't do context, and that can lead to bad outcomes.
After a failed terror plot in London in 2017, a news program set out to discover how easy it would be for people to buy bomb-making equipment. What they found was that Amazon’s “frequently bought together” feature, which is designed to recommend additional purchases customers might like to make alongside the item they’re looking at, was inadvertently providing customers with a guide on how to make bombs.
Black powder and thermite, two common ingredients of homemade explosive devices, appeared together in the "Frequently bought together" section on listings for other chemicals used to make explosives.
Furthermore, steel ball bearings (often used as shrapnel in explosive devices), ignition systems and remote detonators were not only readily available; some were promoted by the website on the same page as those chemicals, under "Customers who bought this item also bought".
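To make the mechanism concrete, here is a minimal sketch of how a co-purchase recommender of this kind might work, and one possible mitigation. The co-occurrence counting model and the DENYLIST of flagged pairings are illustrative assumptions for the example, not Amazon's actual system.

```python
from collections import Counter
from itertools import combinations

# Hypothetical denylist of item pairings that should never be surfaced
# together; an illustrative risk control, not Amazon's actual mechanism.
DENYLIST = {frozenset({"black powder", "thermite"})}

def co_purchase_counts(orders):
    """Count how often each pair of items appears in the same order."""
    pairs = Counter()
    for order in orders:
        for pair in combinations(sorted(set(order)), 2):
            pairs[frozenset(pair)] += 1
    return pairs

def frequently_bought_together(item, pairs, top_n=3):
    """Recommend items most often bought with `item`, skipping flagged pairs."""
    candidates = []
    for pair, count in pairs.items():
        if item in pair and pair not in DENYLIST:
            (other,) = pair - {item}  # the other item in the pairing
            candidates.append((count, other))
    candidates.sort(reverse=True)
    return [other for _, other in candidates[:top_n]]

orders = [
    ["black powder", "thermite", "gloves"],
    ["black powder", "thermite"],
    ["gloves", "safety goggles"],
]
print(frequently_bought_together("black powder", co_purchase_counts(orders)))
# ['gloves'] -- the flagged pairing with thermite is suppressed
```

Without the denylist check, the recommender would happily surface thermite alongside black powder, because co-occurrence is all it was told to optimise for.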
Uber’s ‘surge pricing’ algorithm does exactly what it’s told to do — it tracks activity on the platform and increases the price of rides whenever rider demand looks like it will outstrip driver supply. When there’s less demand or increased supply, prices can then readjust.
What the algorithm wasn't programmed to do is understand why there might be an increase in rider demand or a decrease in driver supply. That's not what it's there for.
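Here is a minimal sketch of a demand-responsive pricing rule of the kind described. The demand/supply ratio rule and the capped multiplier are assumptions made for the illustration, not Uber's actual pricing logic.

```python
def surge_multiplier(ride_requests: int, available_drivers: int,
                     max_multiplier: float = 3.0) -> float:
    """Raise prices as rider demand outstrips driver supply.

    Illustrative sketch: the multiplier scales with the demand/supply
    ratio and is capped at max_multiplier.
    """
    if available_drivers == 0:
        return max_multiplier
    ratio = ride_requests / available_drivers
    # No surge while supply meets demand; otherwise scale with the ratio.
    return min(max(1.0, ratio), max_multiplier)

print(surge_multiplier(50, 100))   # 1.0  (plenty of drivers, no surge)
print(surge_multiplier(300, 100))  # 3.0  (demand spike, capped surge)
```

Note that nothing in this rule asks *why* demand spiked; it only sees the numbers.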
This can be a problem. What if, for example, there's a sudden increase in demand because there's been a terrorist incident and people are trying to get away, or because there's a natural disaster? The algorithm will dutifully raise prices at exactly the moment people most need to get to safety.
Technology also needs human oversight. While it can deliver perfect solutions from an economic perspective, it can't make judgments as to whether those perfect solutions are appropriate from a human perspective. Computers can't contextualise. Unless we program them to understand the times when there are exceptions to the rules, we'll get bad outcomes.
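One way to program in that kind of exception, continuing the hypothetical surge example above: an explicit emergency override that a human operator (or an external alert feed) can trigger to suspend surge pricing. The `emergency_declared` flag is an assumption for the sketch, not any real platform's control.

```python
def priced_multiplier(ride_requests: int, available_drivers: int,
                      emergency_declared: bool,
                      max_multiplier: float = 3.0) -> float:
    """Surge pricing wrapped with a human-controlled circuit breaker.

    `emergency_declared` is a hypothetical flag set by human operators
    or an emergency-services feed; it illustrates encoding an exception
    to the rule, rather than trusting the algorithm's view of the world.
    """
    if emergency_declared:
        return 1.0  # exception to the rule: never surge during an emergency
    if available_drivers == 0:
        return max_multiplier
    return min(max(1.0, ride_requests / available_drivers), max_multiplier)

# During a declared incident, the same demand spike no longer raises prices.
print(priced_multiplier(300, 100, emergency_declared=True))   # 1.0
print(priced_multiplier(300, 100, emergency_declared=False))  # 3.0
```

The design point is that the exception lives outside the optimisation: a human decides when the rule shouldn't apply, because the algorithm has no way to know.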
It is inevitable that, for some things, we will need to use algorithms. However, it is important that we think carefully about the risks we're running when we deploy them, and try to mitigate those risks ahead of time.