By Brad Young, from the Money team
The price of the flights, furniture or insurance you buy online may be controlled by semi-autonomous “black boxes”.
These AI-powered pricing algorithms learn how to improve a seller’s revenues by rapidly surveying huge amounts of data – such as the prices of competing products – and adjusting the prices of goods accordingly.
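As a rough illustration of that survey-and-adjust loop – a hypothetical sketch, not any real seller’s system – a very simple repricing rule might look something like this, with every name, price and threshold invented:

```python
# Hypothetical illustration only: a toy repricing rule that undercuts the
# cheapest observed competitor while protecting a minimum margin.
# All values are invented; real systems are far more complex and opaque.

def reprice(cost_pence: int, competitor_prices_pence: list[int],
            min_margin: float = 0.05, undercut_pence: int = 1) -> int:
    """Return a new price in pence: undercut the cheapest rival by 1p,
    but never drop below cost plus a minimum margin."""
    floor = int(cost_pence * (1 + min_margin))      # never sell below this
    if not competitor_prices_pence:                 # no competitor data: hold the floor
        return floor
    target = min(competitor_prices_pence) - undercut_pence
    return max(target, floor)

# Example: our unit cost is 800p; rivals charge 1,049p and 999p.
print(reprice(800, [1049, 999]))   # -> 998, i.e. £9.98, undercutting the £9.99 rival
```

Everything the sketch does hinges on what it observes competitors charging – which is also where the risks discussed below come in.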
If the algorithm decides to undercut a competitor, that means cheaper prices – but what if it learns that the most effective way to increase revenue is to engage in tacit collusion?
This is one of the questions facing regulators like UK energy watchdog Ofgem, which in April launched a consultation after voicing fears artificial intelligence could be used to manipulate the market.
Last year, the Competition and Markets Authority – tasked with battling anti-competitive practices in the UK – warned pricing algorithms may “become a device to facilitate collusion”.
And across the Atlantic, Lina Khan, the head of the US Federal Trade Commission (FTC), warned in April that outsourcing pricing decisions to the algorithms could result in inflated prices.
‘Dark side’ of algorithms
Online shops have always used computer programs to help set their prices, but AI has introduced a new level of processing power and sophistication, using trial and error to price goods better.
“The AIs are basically black boxes. If they find ways to increase prices that can be considered in violation of antitrust law, then we have a problem. Because they have autonomously started engaging in this behaviour, so there are issues of liability,” Professor Emilio Calvano, an economist specialising in algorithms and competition, says.
By black box, Prof Calvano means we can see what information the program has reacted to and how it has reacted, but we cannot determine the intentions of the AI.
It’s impossible to know for sure whether an online seller uses the tech – there is no requirement to report it – but experts believe that if pricing algorithms are not already widespread, they will be; and once they are here, they are here to stay, says Prof Calvano.
“Automating pricing is potentially very good, because it is good when prices react to things such as market conditions,” Prof Calvano, who works at Luiss Guido Carli University in Rome, says.
For example, pricing that responds to demand increases the number of transactions.
But authorities are concerned there may be a “dark side” to the power these algorithms wield.
If multiple sellers delegate their pricing to the same algorithm, there is scope for coordination, with the profit-driven program learning that jointly increasing the prices offered by competing vendors increases the payoff for all of them, says the professor.
Research undertaken by Prof Calvano found that even if pricing is delegated to multiple independent pricing algorithms, they may learn by interacting with one another that they could mutually benefit from price increases.
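Experiments in this line of research typically use simple reinforcement-learning agents. As a heavily simplified, hypothetical version of that setup – not the researchers’ actual code – the sketch below pits two independent Q-learning agents against each other on a toy price grid. Each agent sees only its rival’s last price and its own profit; nothing in the code tells them to coordinate, and whether they end up sustaining high prices depends on the parameters.

```python
# Illustrative toy model only: two independent Q-learning sellers repeatedly
# set prices, observe only their own profit, and update their own Q-tables.
import random

PRICES = [1.0, 1.5, 2.0, 2.5]    # allowed price levels (arbitrary units)
COST = 0.5                       # unit cost for both sellers
EPISODES = 50_000
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1   # learning rate, discount, exploration

def profit(own: float, rival: float) -> float:
    """Toy demand: the cheaper firm sells 10 units, the dearer sells 2,
    and a tie splits the market evenly."""
    if own < rival:
        qty = 10
    elif own > rival:
        qty = 2
    else:
        qty = 6
    return (own - COST) * qty

# One Q-table per agent: state = rival's last price index, action = own price index
q = [[[0.0] * len(PRICES) for _ in PRICES] for _ in range(2)]
state = [0, 0]                   # each agent remembers the rival's last price index

for _ in range(EPISODES):
    actions = []
    for agent in range(2):
        if random.random() < EPSILON:                  # explore a random price
            actions.append(random.randrange(len(PRICES)))
        else:                                          # exploit the best-known price
            row = q[agent][state[agent]]
            actions.append(row.index(max(row)))
    rewards = [profit(PRICES[actions[0]], PRICES[actions[1]]),
               profit(PRICES[actions[1]], PRICES[actions[0]])]
    for agent in range(2):
        rival_action = actions[1 - agent]
        old = q[agent][state[agent]][actions[agent]]
        future = max(q[agent][rival_action])           # value of the next state
        q[agent][state[agent]][actions[agent]] = old + ALPHA * (
            rewards[agent] + GAMMA * future - old)
        state[agent] = rival_action                    # next state: rival's new price

# After training, inspect the greedy price each agent settles on
for agent in range(2):
    row = q[agent][state[agent]]
    print(f"agent {agent} greedy price: {PRICES[row.index(max(row))]}")
```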
Prices changing more often
Artificial intelligence could lead to online prices fluctuating more frequently, according to Tony Boobier, an author on analytics and AI, because it will be able to react faster and more accurately to market conditions.
But he says it is unlikely to fix prices because large corporations will not make their data available to competitors, meaning different programs will be reacting to different information.
“AI will bring greater granularity and accuracy. Perhaps AI will even lead to reductions in costs for some people,” says Mr Boobier.
Difficult to prove
Paolo Palmigiano, head of UK competition, trade and foreign investment at global law firm Taylor Wessing, says there has not yet been a case involving algorithmic tacit collusion “because it would be difficult to prove”.
To show it has occurred between human beings, lawyers must demonstrate that the firms involved knew their agreement would affect other sellers; that they could monitor the agreement; that they could punish deviation from it; and that the agreement could not be jeopardised by competitors or customers.
“Even if pricing algorithms are used, the necessary conditions for tacit collusion may not be met,” Mr Palmigiano says.
“But never say never. I’ve seen everything under the sun in competition law.”
If algorithmic collusion did occur, there would be a “big difference” between it and human collusion, he says.
Human coordination is limited, but orchestrated by computers it could involve “everybody at the same time, very quickly”.
Regulators have found evidence of algorithms facilitating collusion, says Mr Palmigiano, but it hasn’t happened as widely as academics expected.
In 2016, the CMA fined Amazon seller Trod more than £160,000 for breaking competition law by agreeing with another seller, GB eye, not to undercut each other.
Both sellers, which sold posters of Justin Bieber and One Direction, used automated repricing software to monitor the agreement.
Abuse of dominance
The risk posed by AI-powered algorithms to good deals is not limited to collusion.
Large companies with significant market power can abuse it by using algorithms to give their products preferential treatment, stifling competition, Mr Palmigiano says.
In 2017, the European Commission fined Google €2.42bn (£2.05bn) for giving its price comparison service, Google Shopping, an advantage through prominent placement in search results.
Personalised pricing
Mr Palmigiano says there are rumours of another potentially anti-competitive practice, known as personalised pricing: “Depending on your shopping habits, computers will determine the price you find.”
In effect, the price offered to you might be higher because the computer knows you can afford it, based on data it has collected about you.
“There is no evidence at the moment that this has happened, but it is one issue that clearly could come up in the future.”
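Purely as a hypothetical sketch – the lawyer stresses there is no evidence this is happening – personalised pricing could in principle be as crude as scaling a base price by a score inferred from a shopper’s data. Every field and number below is invented for illustration:

```python
# Hypothetical sketch only: a base price nudged upwards by an "affordability"
# score supposedly inferred from browsing and purchase history.
# The weights and fields are invented; no real retailer's system is described.

def personalised_price(base_price: float, affordability_score: float,
                       max_uplift: float = 0.15) -> float:
    """Scale the base price by up to `max_uplift` depending on a 0-1 score."""
    score = min(max(affordability_score, 0.0), 1.0)   # clamp the score to [0, 1]
    return round(base_price * (1 + max_uplift * score), 2)

print(personalised_price(100.0, 0.2))  # -> 103.0 for a low-scoring shopper
print(personalised_price(100.0, 0.9))  # -> 113.5 for a high-scoring shopper
```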
What should be done?
Regulators should audit algorithms to make sure they cannot cause harm, Prof Calvano suggests.
“It is good that these machines are increasingly used, but we should put in place some guidelines to avoid that something goes wrong.”
Mr Palmigiano says the European Commission has been clear: companies are responsible for building the safeguards needed to comply with competition law into the design of their algorithms.
“But let’s be careful about what we regulate because actually, normally, the purpose of this pricing algorithm is to be as competitive as possible,” he says.
Awareness of the risks should be raised among the general public to encourage debate, Mr Boobier says.
“After all, aren’t we all stakeholders in this new data-infused, algorithmic world, so aren’t we all entitled to an opinion?”
The CMA declined to comment, while the FTC did not respond to a request for comment.
Ofgem said its consultation would “help us develop the right regulatory framework and tools to ensure AI use in the sector benefits consumers”.