To Regulate or Not

Response to Call for Evidence for House of Lords Select Committee on Communications inquiry into “The Internet: To Regulate or Not To Regulate?”

Written evidence submitted by Prof. Derek McAuley, Dr. Ansgar Koene, Dr. Lachlan Urquhart of the Horizon Digital Economy Research Institute, University of Nottingham. May 11th 2018.

Summary: the internet is already regulated – could someone please enforce the existing regulation.

1. Horizon[1] is a Research Institute at The University of Nottingham and a Research Hub within the RCUK Digital Economy programme[2]. Horizon brings together researchers from a broad range of disciplines to investigate the opportunities and challenges arising from the increased use of digital technology in our everyday lives. Prof. McAuley is Director of Horizon and was principal investigator on the ESRC-funded CaSMa[3] project (Citizen-centric approaches to Social Media analysis), which promoted ways for individuals to control their data and online privacy, and the EPSRC-funded UnBias[4] project (Emancipating Users Against Algorithmic Biases for a Trusted Digital Economy), which raised user awareness and agency when using algorithmic services. Dr Koene led the research of the CaSMa and UnBias projects. Dr Urquhart is a research fellow in IT law, researching challenges and solutions in regulating emerging technologies.

Questions

1. Is there a need to introduce specific regulation for the internet? Is it desirable or possible?

2. When considering regulation for the internet it is important to distinguish between regulating the internet infrastructure, i.e. the underlying communications infrastructure, and regulating the services that are built on the internet (e.g. media and commerce platforms and services).

3. Regarding the internet infrastructure, the focus should be on facilitation of access, which includes regulator support for an appropriate concept of Net Neutrality – that is, internet communications service providers should not be permitted to discriminate against specific classes of traffic or users in normal operations.

4. For services built on the internet, often referred to as platforms, the primary focus needs to be on appropriate application of existing offline regulation to online service providers and, where that regulation is deemed inadequate, on updating it to close the gap.

5. Regulation (and application of regulation) should focus on the function that is provided, not the medium through which it is delivered. Thus, a business that facilitates chauffeured private car hire services should be regulated the same way regardless of whether the service is arranged via an app (e.g. Uber), a phone call, or telex. Indeed, much existing legislation has already been applied in this way, despite repeated complaints from some service providers that use of the Internet should somehow exempt them from it.

6. A key challenge in regulating these services built on the internet is the international nature of such service delivery, which can cause confusion regarding jurisdiction and, subsequently, the problem of how those affected can seek redress. This is a fundamental issue that has been recognized and addressed in the GDPR by focusing on where the impact of processing occurs, i.e. the location of the data subject. Generally, then, services that target a specific jurisdiction through localization, whether via language or tailored local content, and that generate revenue from such localization, should be required to obey the regulation of that jurisdiction.

7. As an example, the fact that online platforms are increasingly becoming the information gateway for people, especially younger generations who get much of their news from online platforms via mobile devices, raises social and political concerns similar to those around traditional news media. Concerns about media empires with too much dominance in newspaper or TV coverage should apply equally to online platforms, where it is now common for a single provider to dominate a service sector (Facebook for social networks, Google for search). As shown by Facebook's own study of social influence on turnout in the 2010 US congressional elections, published in 2012[5], such platforms have the power to influence voting behaviour.

8. In summary, given the broad uses of internet technologies and the wide range of legislation that already applies, a specific internet regulation could only address those elements that are particular to internet technologies; it could not cover the myriad uses to which such technology can be put. Much of this ground is already covered by more specific regulation, which should be more rigorously enforced and updated as necessary.

2. What should the legal liability of online platforms be for the content that they host?

9. It is necessary to differentiate between the many different roles that online platforms can take. As examples, consider platforms:

  • Engaging in, or facilitating, open or broadcast communication (e.g. YouTube);
  • Offering private person to person, or closed group communication (e.g. WhatsApp);
  • Performing personalization of content (e.g. Facebook).

The test for legal liability must be based on an independent assessment of the role that the platform takes, noting that a platform may simultaneously take on multiple roles – for example, many platforms offer person to person private communications while also engaging in algorithmic, personalized editorial control of third-party contributed broadcast content.

10. A service provider operating as a broadcaster of content, however sourced, should be held to regulations concerning broadcasters.

11. Platforms that provide private communication (whether encrypted or not) between closed groups should be regulated as such in this role, directly in parallel with traditional telephone communication. So, they should not be held accountable for content in such private communications, but neither should they be permitted to process it other than in a manner essential to convey it. Hence, a company that processes the content of email to target adverts should not simultaneously be permitted to claim merely to be a "communications provider".

12. Service personalization is frequently justified by the claim that it improves the customer experience, and it typically involves filtering or recommending the products, services and information the customer is presented with. In practice, however, the algorithms are driven to optimize revenue, and as they become increasingly complex and adaptive, platform providers themselves may not be able to guarantee that they comply with regulations – for example, the personalization may in fact rest on illegal profiling using gender, ethnicity or any of a myriad of other types of sensitive personal information. Such algorithmic content selection is nonetheless an editorial engagement with content, even though it involves no direct human intervention. The platform provider controls how the algorithm is set up and what its prioritization metrics are, and should be held accountable.
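By way of illustration, the following minimal sketch (in Python; the data, group labels and threshold are entirely hypothetical, not any provider's actual audit procedure) shows one simple transparency test a provider could run: comparing an algorithm's output scores across groups defined by a sensitive attribute to detect disparities that warrant explanation.

    # Minimal sketch: checking a personalization score for disparity across
    # a sensitive attribute. All data and thresholds here are hypothetical.
    from statistics import mean

    # Each record: (sensitive attribute value, score the algorithm assigned)
    scored_users = [
        ("group_a", 0.81), ("group_a", 0.78), ("group_a", 0.80),
        ("group_b", 0.55), ("group_b", 0.60), ("group_b", 0.52),
    ]

    def mean_score_by_group(records):
        """Average algorithmic score per group of the sensitive attribute."""
        groups = {}
        for group, score in records:
            groups.setdefault(group, []).append(score)
        return {group: mean(scores) for group, scores in groups.items()}

    by_group = mean_score_by_group(scored_users)
    print(by_group)  # {'group_a': 0.796..., 'group_b': 0.556...}

    # A large gap between groups does not prove illegal profiling, but it
    # is the kind of signal a provider should be able to detect and explain.
    gap = max(by_group.values()) - min(by_group.values())
    if gap > 0.2:  # hypothetical threshold
        print(f"Score gap of {gap:.2f} warrants investigation")

Such checks do not remove the need for accountability, but they demonstrate that providers are in a position to measure the behaviour of their own algorithms.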

13. In-site linked advertising can cause specific problems, especially for sites that are meant to be child-friendly (through child-targeted content filtering), because the advertising content hosted on websites is usually under the control of a third-party ad delivery service (e.g. AdSense), which runs real-time auctions to determine which advert to show. Various ad delivery services do include customization options that allow site owners to tune the type of ads they allow on their site, but often these settings are not used or fail to match the age appropriateness of the site content. Service providers should be held accountable for such contracted third-party content.
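The following minimal sketch (in Python; the bids, categories and filtering rule are hypothetical and do not reflect any particular ad network's API) illustrates why unused site-level settings matter: the auction simply picks the highest eligible bid, so an empty blocked-category list lets age-inappropriate adverts win.

    # Minimal sketch of a real-time ad auction with a site-level category
    # filter. Bids, categories and the filtering rule are hypothetical.

    bids = [
        {"advertiser": "toy_shop",   "category": "toys",     "bid": 0.30},
        {"advertiser": "betting_co", "category": "gambling", "bid": 0.90},
    ]

    def run_auction(bids, blocked_categories):
        """Return the highest-bidding ad whose category the site allows."""
        eligible = [b for b in bids if b["category"] not in blocked_categories]
        return max(eligible, key=lambda b: b["bid"], default=None)

    # Site owner never configured the filter: the gambling ad wins on price.
    print(run_auction(bids, blocked_categories=set()))

    # Filter tuned for a child-friendly site: the toy ad wins instead.
    print(run_auction(bids, blocked_categories={"gambling"}))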

14. In general, a single sweeping internet regulation cannot possibly capture all the roles that service providers take on with regard to content.

3. How effective, fair and transparent are online platforms in moderating content that they host? What processes should be implemented for individuals who wish to reverse decisions to moderate content? Who should be responsible for overseeing this?

15. Noting para. 11, internet infrastructure providers, and "over the top" platforms while performing the role of providing private closed group communications services, should not be required, or indeed permitted, to moderate content.

16. Many platforms currently claim the protections afforded to such communications service providers even when content is made publicly available, preferring not to moderate content in advance but to rely on user take-down requests for illegal or inappropriate content. However, such requests include many that are frivolous or malicious, sometimes aiming simply to censor content with which the reporter disagrees. Hence it is only right that the moderation process should be one of transparent arbitration, which would be greatly helped by wide adoption of a common code of practice and common processes.

4. What role should users play in establishing and maintaining online community standards for content and behaviour?

17. Some online communities are defined by their community standards and decide what is appropriate – indeed many platforms exist to support such community interaction. However, the handling of illegal content is a matter for law not community opinion, and the appropriate role of the community is simply to flag suspected content into a transparent arbitration process.

5. What measures should online platforms adopt to ensure online safety and protect the rights of freedom of expression and freedom of information?

18. Some platforms provide the means to label content as "adult", which is a somewhat blunt distinction – in film, TV and computer gaming[6], age labelling and controls are more nuanced, and online service providers could use similarly rich content labelling schemes – better still if these were adopted as international standards and capable of being applied automatically through appropriate browser settings.
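To illustrate how such automatic application might work, the following minimal sketch (in Python) assumes a hypothetical machine-readable "Content-Rating" response header, loosely modelled on PEGI age bands; no such standard header currently exists, and the scheme shown is purely illustrative.

    # Minimal sketch of machine-readable age labelling. The "Content-Rating"
    # header name and rating scale are hypothetical, loosely modelled on
    # PEGI age bands; no such standard header currently exists.

    def content_allowed(response_headers, viewer_max_age):
        """Client-side check a browser setting could apply automatically."""
        rating = response_headers.get("Content-Rating")
        if rating is None:
            return False  # unlabelled content fails closed for a child profile
        return int(rating) <= viewer_max_age

    # A browser profile configured for a 12-year-old:
    print(content_allowed({"Content-Rating": "7"}, viewer_max_age=12))   # True
    print(content_allowed({"Content-Rating": "18"}, viewer_max_age=12))  # False
    print(content_allowed({}, viewer_max_age=12))                        # False

The key design point is that an international labelling standard would let the policy live in the user's (or parent's) browser settings rather than in each individual platform's product decisions.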

19. In our research, participants in our "Youth Juries" suggested the creation of peer-group advice services to support both parents and children with practical advice on online security based on personal experiences – online platforms could be encouraged to support such initiatives or at least signpost them for users.

20. Regarding the freedoms of expression and information, the previous comments on moderation apply. In addition, the data-driven personalization discussed above can result in what has been called a "filter bubble", where the personalization algorithms limit the information a user sees, imposing an unintended block on freedom of information – again we call for appropriate transparency about this profiling, and the right and ability to remove it.
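The feedback loop behind the "filter bubble" can be made concrete with a minimal sketch (in Python; the topics and the click model are hypothetical): a recommender that favours whatever the user clicked before quickly stops showing anything else.

    # Minimal sketch of a "filter bubble" feedback loop: a recommender that
    # favours past clicks quickly stops surfacing other topics.
    from collections import Counter

    topics = ["politics", "sport", "science"]
    clicks = Counter({"sport": 1})  # one early click on sport

    for round_ in range(5):
        # Recommend the topic with the most recorded clicks so far.
        recommended = max(topics, key=lambda t: clicks[t])
        clicks[recommended] += 1  # the user clicks what they are shown
        print(f"round {round_}: recommended {recommended}")
    # Output: "sport" every round - the other topics never resurface.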

6. What information should online platforms provide to users about the use of their personal data?

21. This is covered extensively in the EU GDPR and the associated UK Data Protection Bill; what is now required is rigorous enforcement by the Information Commissioner. However, public engagement with, and understanding of, such legislation is poor.

22. A key element of modern data protection regulation is the role of technologists, as non-state actors, in regulation through concepts like data protection by design and by default (Article 25 GDPR). How they design technology has regulatory implications and mediates how users behave. However, it is also important to go beyond privacy by design as a compliance tool, towards a mechanism for dialogue with citizens about what values they want embedded in technology, and how. It can be a medium for bringing wider human values into design from the beginning. Such participatory design would greatly aid wider public understanding of how their data is used.

7. In what ways should online platforms be more transparent about their business practices – for example in their use of algorithms?

23. Technologists like to think of their algorithms as neutral, but the modern class of goal-driven big data algorithms will reflect any biases in the selection of data types chosen for processing, as well as biases present in the training data itself. So, yes, online platforms should be more transparent about how they work. They should provide clearer insight into the kinds of data they collect and process about users, including behaviour and activity tracking, as outlined in the House of Commons Science and Technology Committee report on Responsible Use of Data (Fourth Report of Session 2014-15).

24. As set out in paragraph 12 above, service personalization is in practice driven to optimize revenue rather than solely to improve the customer experience, may rest on illegal profiling using sensitive personal information, and remains an editorial engagement with content for which the platform provider should be held accountable. The same transparency obligations therefore apply here.

25. For many online platforms the default business model has become the 'freemium' / free-to-use model supported by advertising revenue. While the visible side of this revenue is the ads shown on the platform, a second source of income is often the sale of statistics about platform user behaviour. Data are commonly gathered through multiple sources, including:

  • storing the information that is posted to the platform (e.g. product reviews);
  • tracking user behaviour on the site (tracking cookies record behaviours such as where the user has clicked and the time between clicks);
  • purchasing data about the behaviour and interests of demographic classes of users.

These data are used to sell targeted ad space to advertisers and to feed the filtering/recommender algorithms that 'personalize' the user experience. Users typically have very little control over any of this data collection. Privacy settings on sites like Facebook primarily stipulate how information is shared between users, not how the platform provider gathers and uses the data. Terms and conditions of online platforms are usually formulated to give the platform provider maximum freedom to use the data as it wishes; for example, they often include vague, broad-stroke clauses such as 'data may be used for research purposes', where the research question is never specified to the user. Users who want to use the services of a platform, or even just part of them, usually have no option but to consent to handing over full control of their data. Various platforms do provide users with comprehensive access to the content they contributed, such as a download of the posts made to G+, but do not provide access to the tracking data that was collected about them.
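A minimal sketch (in Python; the event schema, session identifier and class name are hypothetical, not any platform's actual implementation) of the click tracking described above shows how easily such behavioural data accumulates beyond what users ever see:

    # Minimal sketch of server-side click tracking of the kind described
    # above. The event schema and session id are hypothetical.
    import time

    class ClickTracker:
        """Records where a user clicked and the time between clicks."""

        def __init__(self):
            self.events = []      # list of per-click event records
            self.last_click = {}  # session_id -> previous timestamp

        def record_click(self, session_id, element):
            now = time.time()
            gap = now - self.last_click.get(session_id, now)
            self.last_click[session_id] = now
            self.events.append(
                {"session": session_id, "element": element,
                 "timestamp": now, "seconds_since_last_click": gap}
            )

    tracker = ClickTracker()
    tracker.record_click("sess-42", "product_review_link")
    tracker.record_click("sess-42", "add_to_basket_button")
    print(tracker.events)  # raw behavioural data the user never sees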

8. What is the impact of the dominance of a small number of online platforms in certain online markets?

26. The internet was founded on open standards and interoperable federated services. This offered a landscape for competitive innovation that is now being restricted by isolated "walled gardens" and, for the large players, aggressive acquisition strategies that remove competition before it arises. As noted in the House of Lords inquiry into Online Platforms, such acquisitions do not satisfy the criteria required to be subject to scrutiny by the Competition and Markets Authority (or its equivalents elsewhere), and this could usefully be reviewed.

9. What effect will the United Kingdom leaving the European Union have on the regulation of the internet?

27. Internationally coordinated regulation is required in order to have impact, specifically on large US corporations that have emerged within the US's particular regulatory framework. In this regard the EU is an important player, and the UK has been an important contributor to the EU position; in future the UK will be a minor voice unless it continues to coordinate with and support EU action in this area.

[1] http://www.horizon.ac.uk
[2] https://epsrc.ukri.org/research/ourportfolio/themes/digitaleconomy/
[3] http://casma.wp.horizon.ac.uk
[4] http://unbias.wp.horizon.ac.uk
[5] https://www.nature.com/articles/nature11421
[6] Pan European Game Information http://pegi.info
