The latest policy recommendations for regulating powerful Internet platforms come from a U.K. House of Lords committee that’s calling for an overarching digital regulator to be set up to plug gaps in domestic legislation and work through any overlaps of rules.
“The digital world does not merely require more regulation but a different approach to regulation,” the committee writes in a report published on Saturday, saying the government has responded to “growing public concern” in a piecemeal fashion, whereas “a new framework for regulatory action is needed”.
It suggests a new body — which it’s dubbed the Digital Authority — be established to “instruct and coordinate regulators”.
“The Digital Authority would have the remit to continually assess regulation in the digital world and make recommendations on where additional powers are necessary to fill gaps,” the committee writes, saying that it would also “bring together non-statutory organisations with duties in this area” — so presumably bodies such as the recently created Centre for Data Ethics and Innovation (which is intended to advise the UK government on how it can harness technologies like AI for the public good).
The committee report sets out ten principles that it says the Digital Authority should use to “shape and frame” all Internet regulation — and develop a “comprehensive and holistic strategy” for regulating digital services.
These principles (listed below) read, rather unfortunately, like a list of big tech failures — perhaps especially given Facebook founder Mark Zuckerberg’s repeated refusal to testify before another UK parliamentary committee last year, which led to another highly critical report.
- Parity: the same level of protection must be provided online as offline
- Accountability: processes must be in place to ensure individuals and organisations are held to account for their actions and policies
- Transparency: powerful businesses and organisations operating in the digital world must be open to scrutiny
- Openness: the internet must remain open to innovation and competition
- Privacy: to protect the privacy of individuals
- Ethical design: services must act in the interests of users and society
- Recognition of childhood: to protect the most vulnerable users of the internet
- Respect for human rights and equality: to safeguard the freedoms of expression and information online
- Education and awareness-raising: to enable people to navigate the digital world safely
- Democratic accountability, proportionality and evidence-based approach
“Principles should guide the development of online services at every stage,” the committee urges, calling for greater transparency at the point data is collected; greater user choice over which data are taken; and greater transparency around data use — “including the use of algorithms”.
So, in other words, a reversal of the ‘opt-out if you want any privacy’ approach to settings that’s generally favored by tech giants — even as it’s being challenged by complaints filed under Europe’s GDPR.
The UK government is due to put out a policy White Paper on regulating online harms this winter. But the Lords Communications Committee suggests the government’s focus is too narrow, calling also for regulation that can intervene to address how “the digital world has become dominated by a small number of very large companies”.
“These companies enjoy a substantial advantage, operating with an unprecedented knowledge of users and other businesses,” it warns. “Without intervention the largest tech companies are likely to gain more control of technologies which disseminate media content, extract data from the home and individuals or make decisions affecting people’s lives.”
The committee recommends public interest tests should therefore be applied to potential acquisitions when tech giants move in to snap up startups, warning that current competition law is struggling to keep pace with the ‘winner takes all’ dynamic of digital markets and their network effects.
“The largest tech companies can buy start-up companies before they can become competitive,” it writes. “Responses based on competition law struggle to keep pace with digital markets and often take place only once irreversible damage is done. We recommend that the consumer welfare test needs to be broadened and a public interest test should be applied to data-driven mergers.”
Market concentration also means a small number of companies have “great power in society and act as gatekeepers to the internet”, it warns, suggesting that while greater use of data portability can help, “more interoperability” is required for the measure to be an effective remedy.
The committee also examined online platforms’ current legal liabilities around content, and recommends beefing these up too — saying self-regulation is failing and calling out social media sites’ moderation processes specifically as “unacceptably opaque and slow”.
High level political pressure in the UK recently led to a major Instagram policy change around censoring content that promotes suicide — though the shift was triggered after a public outcry related to the suicide of a young schoolgirl who had been exposed to pro-suicide content on Instagram years before.
Like other UK committees and government advisors, the Lords committee wants online services which host user-generated content to be subject to a statutory duty of care — with a special focus on children and “the vulnerable in society”.
“The duty of care should ensure that providers take account of safety in designing their services to prevent harm. This should include providing appropriate moderation processes to handle complaints about content,” it writes, recommending that telecoms regulator Ofcom be given responsibility for enforcement.
“Public opinion is growing increasingly intolerant of the abuses which big tech companies have failed to eliminate,” it adds. “We hope that the industry will welcome our 10 principles and their potential to help restore trust in the services they provide. It is in the industry’s own long-term interest to work constructively with policy-makers. If they fail to do so, they run the risk of further action being taken.”
source https://techcrunch.com/2019/03/11/online-platforms-need-a-super-regulator-and-public-interest-tests-for-mergers-says-uk-parliament-report/