British watchdogs ask how to better regulate algorithms


UK watchdogs have solicited opinions on the benefits and risks of websites and apps using algorithms under the banner of the Digital Regulation Cooperation Forum (DRCF).

While “algorithm” can be defined as a strict set of rules a computer must follow when making calculations, the term has become a bugbear as lawmakers grapple with the revelation that it is involved in every digital service we use today.

Whether it’s which video to watch next on YouTube, which movie you might enjoy on Netflix, who shows up in your Twitter feed, what autosuggestions appear as you search, or what you might want to buy on Amazon, the algorithm rules them all, and much more besides.
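As a purely illustrative sketch (none of this reflects any real platform’s code), the kind of rule-following a simple recommender performs can be written in a few lines of Python: score each catalogue item by how much it overlaps with what a user has already watched, then surface the highest scorers. The function name, tags and titles below are all made up.

    # Toy, hypothetical "recommendation" algorithm: rank catalogue items by how
    # much their tags overlap with a user's viewing history.
    from collections import Counter

    def recommend(history: list[set[str]], catalogue: dict[str, set[str]], top_n: int = 3) -> list[str]:
        # Count how often each tag appears in things the user has already watched
        tag_counts = Counter(tag for tags in history for tag in tags)
        # Score each item: one point per historical occurrence of every shared tag
        scores = {name: sum(tag_counts[t] for t in tags) for name, tags in catalogue.items()}
        # Highest-scoring items come back first
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

    history = [{"sci-fi", "space"}, {"sci-fi", "thriller"}]
    catalogue = {
        "Moon Base": {"sci-fi", "space"},
        "Cooking Show": {"food"},
        "Cold Case": {"thriller", "crime"},
    }
    print(recommend(history, catalogue))  # ['Moon Base', 'Cold Case', 'Cooking Show']

Real recommendation engines layer machine learning over far richer signals, which is precisely what makes them so much harder to inspect than a toy like this.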

While this all sounds pretty harmless, regulators are aware that algorithms don’t always work to the benefit of consumers.

“Algorithmic systems, especially modern machine learning (ML) or artificial intelligence (AI) approaches, pose significant risks if not used carefully,” said the group. “They can introduce or reinforce harmful biases that lead to discriminatory decisions or unfair outcomes that reinforce inequalities. They can be used to mislead consumers and distort competition.”

The DRCF – consisting of the Competition and Markets Authority (CMA), the Information Commissioner’s Office (ICO) and the Office of Communications (Ofcom) – has drawn up a work plan for the coming year which aims to:

  • Protect children online
  • Promote competition and privacy in online advertising
  • Support improvements in algorithmic transparency
  • Enable innovation in the industries they regulate

However, it’s not as simple as popping the hood on an algorithm and declaring it roadworthy. We’re talking about enormously complex mathematics whose effects often can’t really be evaluated until after the code has done the job it was built to do, which is a problem when you want to root out (un)intentional bias and other nastiness.

We’re also talking about information that the companies which own it regard as their intellectual property, as evidenced by the lawsuits launched over that very point.

While digital secretary Nadine Dorries – who reportedly asked Microsoft “when they wanted to get rid of algorithms” – may not understand them, at least the DRCF has opened the floor to people who… might?

Gill Whitehead, Chief Executive of the DRCF, said in a statement: “The task ahead of us is significant – but by working together as regulators and working closely with others, we intend the DRCF to make an important contribution to the UK’s digital landscape, to the benefit of people and businesses online.

“Algorithms are just one of those areas. Whether you’re scrolling through social media, flipping through movies, or deciding what’s for dinner, algorithms are busy but hidden in the background of our digital lives.

“This is often good news for many of us, but algorithms also have a problematic side. They can be manipulated to cause harm or abused because the companies that integrate them into websites and apps just don’t understand them well enough. As regulators, we need to make sure the benefits outweigh the risks.”

Stefan Hunt, the CMA’s Chief Data and Technology Insight Officer, added: “The CMA, FCA, ICO and Ofcom have already done a lot of work on algorithms, but there is more to do. We are now asking: what more is needed, both from us as regulators and from industry?”

As for what that work might look like, the DRCF has published two discussion papers, one on the benefits and harms of algorithms and the other on the landscape of algorithmic auditing and the role of regulators, which can be read in full here.

Both emphasize: “This discussion paper is intended to encourage debate and discussion among our stakeholders. It should not be taken as an indication of the current or future policy of any of the DRCF’s member regulators.”

It may be that we’re simply too far down the road to do anything useful about harmful algorithms. It’s not as if the tech giants whose proprietary software prints money for them are suddenly going to bin the programming that underpins their products and services.

And while a compsci expert may well be an “algorithm expert,” there’s no guarantee they could look at, say, Twitter’s code and have any idea what any of it means or how it all fits together.

The opportunity to comment is open until June 8th. Opinions should be sent to [email protected]

We wish the regulators the best of luck. And no, nobody’s asking for journalists’ opinions. ®
