Commercial and tech update - December 2020

Welcome to this month’s edition of our commercial and tech update, which covers the interpretation of “goodwill”, guidance for the gaming industry and the need for fairer algorithms.

CoA clarifies meaning of “goodwill”

The Court of Appeal's judgment in Primus International Holding Company & Ors v Triumph Controls – UK Limited & Ors provides useful clarification on the meaning of “goodwill” in contracts. The case concerned a share purchase agreement (“SPA”) entered into between the Triumph Group, as buyer (“Triumph”), and the Primus Group, as seller (“Primus”). One of the documents provided to Triumph during negotiations was a financial forecast predicting that the two target companies, although loss-making at the time, would become profitable. Post-completion, Triumph quickly realised that the target companies were nowhere near achieving the predicted forecast and instead required a cash injection of circa £85 million to keep them afloat. It was not disputed that Triumph had a valid breach of warranty claim; rather, the case turned on whether the claim was “in respect of lost goodwill” and therefore excluded by a specific clause in the SPA.

Triumph argued that “goodwill” means the good name, reputation and connections of a business and that, since it was claiming for overpayment of the purchase price caused by a careless set of financial forecasts prepared by Primus, the exclusion clause did not apply. Primus, on the other hand, submitted that loss of “goodwill” means “a loss of share value, where that value represents the difference between the costs of acquisition and the fair value of its identifiable net assets and/or where that loss of share value is caused by the impairment of the value of non-identifiable assets”, which all parties agreed was an accounting definition of “goodwill”.

The Court of Appeal unanimously dismissed Primus’ appeal, stating that the ordinary legal meaning of “goodwill” is “the good name and public recognition of the business concerned” and describing Primus’ accounting definition of the term as “convoluted”. In arriving at this conclusion, Coulson LJ confirmed that the relevant authorities pointed overwhelmingly to the conclusion that “goodwill” should be given its ordinary legal meaning. In addition, Coulson LJ considered how “goodwill” had been used elsewhere in the SPA on the basis that “it should ordinarily be presumed that language is used consistently within the four corners of an agreement.”

Whilst the judgment provides useful clarification on the meaning of “goodwill”, it also serves as an important reminder that the courts are reluctant to depart from the ordinary meaning of words when it comes to questions of interpretation. Parties should therefore never assume that their own interpretation of a word or term will be favoured and should instead make explicitly clear exactly what meaning is ascribed to a term, especially if the intended meaning is unusual or particularly technical.

Shaping the future of Google and Facebook

On 1 July 2020, the Competition and Markets Authority (CMA) published the final report in its market study examining competition in online platforms that generate revenue from digital advertising. The report assessed the markets for search, social media and digital advertising, finding that Google and Facebook are the largest platforms, with over a third of UK internet users' time online spent on their sites. These conclusions led the CMA to make four recommendations intended to address the sources of market power and the behaviour of platforms with market power.

On 27 November 2020, the Department for Business, Energy & Industrial Strategy and Department for Digital, Culture, Media & Sport jointly published their response to the four recommendations. While this response acknowledges the value that online platforms bring to businesses, consumers and the economy, it also raises questions about the concentration of market power in the technology industry. To summarise:

  • Recommendation 1 - The CMA’s first recommendation was to establish an enforceable code of conduct focussed on fair trading, open choices, and trust and transparency. The code would govern platforms funded by digital advertising and with strategic market status (SMS). The government acknowledges the need for urgent action and agrees that an enforceable code is the best way forward; the response confirms that the Digital Markets Taskforce has already been commissioned to advise on the design and implementation of such a code.
  • Recommendation 2 - The government also supported the CMA’s second recommendation to require the Digital Markets Unit (DMU) to designate platforms with SMS and take responsibility for implementing the code of conduct. The response makes clear that the government intends to consult on the form and function of the DMU in early 2021, with the goal of having the DMU fully established during 2021.
  • Recommendation 3 - The CMA also recommended that the DMU be granted the necessary powers to enforce the principles of the code of conduct, as well as the authority to amend the code in line with evolving market conditions. The government agreed that the code would need to be enforceable but stated that careful consideration would be given to the powers of enforceability, with specific regard to the DMU's powers, the CMA's continued role, the Information Commissioner’s Office’s role relating to data protection and Ofcom’s position in relation to related markets.
  • Recommendation 4 - The CMA’s last recommendation was to grant the DMU necessary pro-competitive powers for data-related interventions, consumer choice interventions and separation interventions. Although the government agrees in principle with this recommendation, their response highlights the potential policy and implementation risks of granting the DMU pro-competitive powers.

Both the CMA's report and the government’s response echo the themes evident in the Platform-to-Business Regulations, released in July this year as part of the EU Digital Single Market strategy. It is clear that the UK government has followed the EU’s example in placing transparency and fairness at the forefront of regulating online platforms.

Moving forward, we expect the government to consult on the recommended reforms early next year and take steps to promote competition in the digital market. With the UK’s forthcoming Digital Strategy, it will be interesting to see how closely the government adopts the CMA's recommendations throughout 2021. Regardless of those decisions, it seems change is on the horizon for key market players such as Google and Facebook.

ASA consulting on new guidance on ads for in-game purchases

The ASA is consulting on a proposal for new formal guidance on advertising for in-game purchases, including “loot boxes” (purchases of randomised in-game items). Many modern video games and apps enable purchases within a game itself, made either directly or through an external platform, and ranging from small to high-value purchases.

Concerns regarding these purchases have been raised by various campaign and research organisations, Government Select Committees, the public and the video game press.

Among these concerns, the ASA has identified three areas relating to advertising regulation where it considers that formal guidance can help mitigate the issues identified. These are:

  1. Clarity of information at the point of purchase: there are concerns over how the value of in-game items is presented and that currently the practice does not clearly indicate how much the items will actually cost. The guidance is intended to address what information needs to be provided to ensure clarity.
     
  2. Responsibility of advertising messages: the nature of in-game purchases, especially loot boxes, can potentially cause harm to consumers, particularly vulnerable ones. A report by the European Parliament identified a correlation between problem gambling and loot box spending, although it is unclear whether loot boxes cause increased gambling or simply attract vulnerable consumers with existing gambling problems. The guidance is intended to ensure that the messaging, language and approaches used to advertise in-game purchases reduce the risk of harm to vulnerable consumers.
     
  3. Truthfulness in advertising of games containing in-game purchases: the guidance seeks to ensure that advertisements clearly indicate that in-game purchases are available and that any purchased content featured in advertisements does not mislead consumers.

The consultation sets out the draft guidance for comment and particularly asks for feedback on (i) how well it addresses concerns about advertising in-game purchases and whether any other points should be included; and (ii) whether the guidance would disproportionately impact the video game industry. It is open until 28 January 2021 and can be found here: https://www.asa.org.uk/resource/consultation-on-ads-for-in-game-purchasing.html.

The algorithmic social dilemma

The Centre for Data Ethics and Innovation released a paper in November 2020 on its review into bias in algorithmic decision-making. The Centre recognises that the risk in the use of algorithms is increasing as the algorithms become more complex. It identifies the biggest risk posed by algorithmic systems that make decisions about individuals as the possibility that they encode existing human biases, although it is not clear whether human biases carry more or less risk than algorithmic biases. The Centre focused its review on four key industries where algorithms have the potential to affect individuals: (i) recruitment; (ii) financial services; (iii) policing; and (iv) local government, and it makes various recommendations drawn from its analysis. This summary will focus on the Centre's key findings in relation to recruitment and financial services.

Recruitment

In recruitment, the Centre recognised that the use of algorithmic tools is growing fast at all stages of the recruitment process; on the other hand, the industry is also experienced in collecting data to assess outcomes, for example diversity statistics. It is also well known that human bias is typically rife in traditional recruitment methods, so algorithmic tools could make processes more objective by removing human biases.

However, because these tools are built on historic or current employment data, there is a risk that algorithmic technologies may entrench existing inequalities. The Centre has therefore recommended that, to help guard against discrimination against certain groups, the Equality and Human Rights Commission should update its guidance on the application of the Equality Act 2010 to recruitment to reflect issues associated with the use of algorithms.

Financial services

The Centre noted that the financial services industry is a much more mature sector that has used data for a long time to support decision-making. The Centre's main focus within financial services was on credit scoring decisions made about individuals. It is well known that specific groups are historically underrepresented in the financial system and, similarly to the recruitment industry, there is a risk that these historic biases could be entrenched further through algorithms.

The Centre recommends the use of data from non-traditional sources to enable population groups who have historically found it difficult to access credit, due to the lower availability of data about them from traditional sources, to gain better access in future. However, the Centre found that the key challenges to further progress in this area include how to source data ethically, alongside a risk-averse banking culture.

Fortunately, the financial services industry has a strong regulatory presence in the form of the Financial Conduct Authority. The Centre therefore recommends that the Government work with regulators to provide clear guidance on the collection and use of protected characteristic data in outcome monitoring and decision-making processes. It also recommends that the Office for National Statistics and the Government Statistical Service work with public-private partnerships, such as the Financial Conduct Authority's and Bank of England’s AI Public Private Forum, and with regulators to promote harmonised principles of data collection and use in the private sector.

A recurring theme across both sectors is the use of historic and current data to reinforce decision-making, which suggests that efforts to benchmark outcomes only after the decision-making process may prove insufficient. However, the various sectors may not have to look too far for inspiration: the Information Commissioner’s Office’s concept of "privacy by design" may be a useful model, raising the question of whether social fairness should likewise be built into the design of any algorithmic tools that make decisions about individuals.