by Rob Coleridge and Sabrina Goran, Irwin Mitchell LLP
The Social Dilemma, a recent Netflix documentary about big tech's business model and use of consumer data, has been described by critics as a shocking revelation and a wake-up call. Whether or not that is the case, The Social Dilemma is an example of rapidly increasing public concern about how consumers are affected by big tech's use of data, and consumer businesses are taking note.
Spoiler Alert! The Social Dilemma asserts (in summary) that:
- The business model of social media (and of other tech businesses: search engines, websites, mobile apps, messaging services and electronic games) is summarised in the comment "if you're not paying for the product, you are the product". In other words, these platforms collect data on individual users and use proprietary algorithms to analyse this data and calculate which adverts and content are best suited to each individual user. On the basis of this information, and using real-time bidding (in the time it takes to load a webpage), platforms sell this ad space to advertising agencies or directly to consumer businesses. This is, however, unlikely to be a shock to most people, who may be happy to have interesting adverts and content put before them.
- In addition to targeted content and advertising, platforms use known psychological methods to encourage individual engagement with the platform (and therefore create more advertising time). A classic example is the refresh function, which automatically and deliberately delivers new content on every pull and which, like a slot machine, is said to give users "a glorious and new hit of dopamine every time". Such devices can lead to genuine addiction, and there is growing concern about big tech's use of these methods, particularly where children are concerned.
- The adtech algorithms which calculate the content in which users are interested also suggest new content a user might like and suppress content in which a user is not interested. As a result, users may never see a counterpoint and, in the case of certain interests, are drip-fed one-dimensional content which can be extreme. In some cases this has led to the radicalisation of users and the polarisation of communities. As is often seen in the news, this element of social media can also be used by malign third parties as a strategic means of spreading disinformation.
Does the current UK data protection regime offer sufficient protection to users?
The current data protection regime in the UK is set out in the Privacy and Electronic Communications Regulations (PECR) and the General Data Protection Regulation (GDPR). After a great deal of work over the last few years, these two pieces of legislation are familiar to most businesses. In very brief summary:
- PECR requires user consent for the placing of cookies and similar tracking technologies on users' devices (other than those strictly necessary to provide the service the user has requested), and governs electronic marketing.
- The GDPR requires organisations to provide clear and comprehensive privacy information, including how data is processed and who the data is shared with. It also requires explicit user consent to process special category data (relating to health, religion, political opinions, racial or ethnic origin and other sensitive personal data).
The UK’s Data Protection/Enforcement Authority, the Information Commissioner's Office (ICO), has previously raised concerns about adtech and commented that it “is a systemic problem that requires organisations to take ownership for their own data processing, and for industry to collectively reform”. In June 2019 the ICO published a report setting out various concerns with adtech’s lack of compliance with PECR and the GDPR. In particular:
- The ICO noted that insufficient privacy policies are being used within the adtech industry. Organisations often provide information which is overly complex and cannot be easily understood by users. Indeed, sometimes organisations do not know themselves with whom the data will be shared, so they cannot provide this information to users with any clarity.
- The ICO was concerned about the way the adtech industry collects data. The GDPR broadly requires that personal data is collected lawfully, fairly, and in a transparent manner. There are various lawful bases for processing data, the most flexible of which is the “legitimate interests” basis. This basis can be relied upon if the use of data is proportionate and it has a minimal privacy impact, and if people are generally unlikely to object to the use of their data in this way. The ICO alleged, however, that real-time bidding does not meet the legitimate interest requirements and the only lawful basis for this type of processing of personal data is user consent.
Since the report, these assertions have not been tested or enforced. It is likely that further guidance or regulations are required to ensure the adtech industry is aware of, and complies with, the relevant data protection obligations. However, as a result of Covid-19, the ICO’s investigations into the adtech industry and real-time bidding have been put on hold, and its progress in this regard is likely to depend on the political appetite for change.
In the meantime, and on a more positive note, progress is being made to protect children online. The ICO recently implemented its Age Appropriate Design Code, which applies to any business providing online services and products which are “likely to be accessed” by children in the UK. The code sets standards that businesses must meet by September 2021, with a focus on providing default settings which minimise the amount of data being collected and used in order to safeguard children. Businesses should also provide age-appropriate privacy information which can be understood by child audiences. This does not, however, address concerns about other means by which platforms seek to increase user engagement.
The impact on business
The platforms that come to mind (and which are often referenced in The Social Dilemma) are likely to be the big players with whom consumers regularly interact - Google, Facebook, Twitter, etc. That is not the whole picture, however. Consumers visit a host of other internet sites for news, for their interests and hobbies and, of course and ever increasingly, for shopping. Data about their interaction with those sites can be, and often is, collected, analysed and used for adtech purposes. That is because those sites are often designed and operated by separate platform providers with a data-focused business model. Many of these platforms have huge market power – AWS or Azure, for instance – and their business customers may have no control over how the data collected by these platforms is used, even if that data is collected in connection with that business.
Against this backdrop, consumer concerns are increasing (as is clear from The Social Dilemma), but enforcement of existing consumer data regulation (in terms of the GDPR) is not strong and the likelihood of new consumer regulation is low. Consumer businesses (especially those who subcontract their website, e-commerce or advertising operations) are therefore likely to be looking at how they can use adtech without having their online practices questioned by their customers. This will mean obtaining some control over (or at least some clarity about) how the data collected on their customers is used, so as to manage the risk. In this respect there is hope in the new Platform to Business Regulations 2020.
The Platform to Business Regulations came into force in July 2020 and, for the first time, give businesses (amongst other things) a helping hand in understanding how data associated with the marketing of their products is used.
In summary, the Regulations govern the contracts between any platform and any business, by which the platform provides a service which facilitates the sale by the business of goods or services to consumers – this covers all the platforms mentioned above and more. The Regulations cover many issues (not covered by this article) but at Article 9 they provide that platforms must include in their business terms and conditions a description of the access which the business has to personal and/or non-personal data collected through the services provided by the platform. This must cover:
- whether the platform has access to this data and, if so, to which categories of data and under what conditions;
- whether the business has access to this data and, if so, to which categories and under what conditions;
- whether the business also has access to personal or other data, including in aggregated form, generated through the provision of the platform’s services to other businesses and their consumers and, if so, to which categories and under what conditions; and
- whether data to which the platform has access is provided to third parties and, where such provision is not necessary for the functioning of the platform’s services, the purpose of the sharing and any opt-out available to the business.
For the first time, this gives businesses a right to understand how personal and non-personal data collected in association with their products/services is used. It does not, however, give businesses a right to better terms regarding this data than they can negotiate, which will come down to each party’s bargaining power.
For the purpose of enforcement, it is also worth noting that the Platform to Business Regulations give businesses the right to make collective complaints and bring group claims through trade associations. This remains untested and it will be interesting to see who is willing to bring an action under the Platform to Business Regulations.
In conclusion, in light of increasing public concerns about social media and adtech, there are now better regulatory tools which businesses can use to get on top of this tricky issue. It remains, however, an ever-evolving area of commerce, law and ethics, and a dilemma that needs to be kept under constant review.