England & Wales: Cybersecurity

The UK National Cyber Security Centre (NCSC) defines cybersecurity as ‘how individuals and organisations reduce the risk of cyber attack’. Cybersecurity’s core function is to protect the devices we all use (eg, smartphones, laptops, tablets and computers), and the services we access – both online and at work – from theft or damage. It is also about preventing unauthorised access to the vast amounts of personal information we store on these devices, and online.[1]

The increasing reliance of businesses and consumers on digital technology, and the growing frequency and impact of cyber attacks, mean that cybersecurity must not be seen as a purely technical matter, largely reserved for an organisation’s IT team; it is now widely recognised as an enterprise-wide risk issue for each business. The 2020 Department for Digital, Culture, Media and Sport survey of cybersecurity breaches,[2] released on 25 March 2020, found that 46 per cent of UK businesses reported at least one cybersecurity breach or attack between 2019 and 2020, and approximately one in five UK businesses that suffered such a breach or attack also reported a ‘material outcome’ of either losing money or data.

For years now, the Information Commissioner’s Office (ICO), the NCSC and the Financial Conduct Authority (FCA) have warned companies that cybersecurity should be treated as a boardroom-level issue. In a 2018 speech, Elizabeth Denham, head of the ICO, cautioned that ‘we have seen too many major breaches where companies process data in a technical context, but security gets precious little airtime at board meetings. . . . If left solely to the technology teams, security will fail through lack of attention and investment.’[3]

The ICO’s security concerns are increasing alongside its enhanced enforcement powers to take action against those in breach of data protection laws, especially those relating to security breaches. As a result, it is essential that organisations keep abreast of the key cybersecurity obligations imposed by UK legislation (summarised below), as well as the potential legal and regulatory implications arising from a data breach. As an overview, and in addition to the overarching risk of reputational damage, the main consequences of a breach include:

  • An organisation may be required to notify a breach to relevant regulators and (if a sufficiently serious risk arises from the loss of or damage to personal data) affected private individuals. That notification would involve disclosing the occurrence of the breach and providing information regarding the nature, consequences and mitigation of the breach, as prescribed by each regulator. Multiple notification requirements may create challenges for an organisation in managing the different forms of notification, the different types of information required by each regulator and the different deadlines, as well as in managing the flow of that information to the marketplace.
  • Regulators have shown an increasing willingness to impose large financial penalties for breaches of specific cybersecurity obligations. Since the implementation of the EU General Data Protection Regulation (GDPR) and the Data Protection Act 2018 (DPA 2018), the ICO is able to impose fines of up to €20 million, or 4 per cent of an organisation’s worldwide turnover, whichever is higher. This represents a considerable increase in its powers under the Data Protection Act 1998 (DPA 1998), under which the upper limit on fines was £500,000. Similarly, under the Network and Information Systems Regulations 2018 (NIS Regulations), introduced alongside the GDPR, the ICO has the power to fine relevant digital service providers and providers of critical infrastructure services up to £17 million. The ICO has shown that it will use these new powers through its recent imposition of substantial fines on Marriott International (£18.4 million) and British Airways[4] (£20 million) for their respective failures to safeguard electronic customer records from cyber attack. The trend of significant regulatory fines is not limited to the ICO: in October 2018, the FCA fined Tesco Bank £16.4 million for failing to secure retail clients’ banking deposits.[5] Incidentally, the FCA was forced to report itself to the ICO in 2019 after accidentally revealing personal information about 1,600 individuals who had lodged complaints against it.
  • Directors should be aware that, under the DPA 2018, they can now be held personally liable for criminal offences set out in the DPA 2018 that are committed by their company under the GDPR.[6]

In addition, there is a fast-increasing risk of civil litigation arising from personal data breaches as individual claimants seek compensation for loss (both pecuniary and non-pecuniary) caused to them by a data controller’s contravention of data protection legislation. For example, in May 2020, an £18 billion class action was filed against easyJet following the ‘highly sophisticated’ theft of 9 million customers’ personal data. This trend follows the decision of the Court of Appeal in Vidal-Hall v Google Inc [2015] EWCA Civ 311, which established that compensation could be awarded under the DPA 1998 to individuals under English law who suffered non-pecuniary loss, such as emotional distress, arising from a breach of their data privacy, in addition to their right to claim for pecuniary loss.

The trend may now accelerate following the decision of the Court of Appeal in Lloyd v Google [2019] EWCA Civ 1599. There, it was held that it was possible to bring a representative action on behalf of a class of claimants who had, as a result of alleged breaches of data protection law, lost control or autonomy over their personal data, which had inherent value, and that there was no requirement under the DPA 1998 (and it is likely to be the same under the GDPR or the DPA 2018) for the claimant to prove financial loss to obtain compensation under data protection law. The United Kingdom’s Supreme Court has granted Google permission to appeal the Court of Appeal’s decision, and the appeal is expected to be heard in late 2020 or early 2021. If the decision is upheld by the Supreme Court, it will potentially allow a wide range of claims to be brought against controllers for breach of data protection law, including as a result of cybersecurity breaches, in a manner that makes economic sense (and that may be funded by litigation funders).

If this were not the case, the limited damages claimable to compensate for each individual’s loss would mean that it would be unlikely to be economically worthwhile to seek redress on a separate, individual basis.

Furthermore, the UK government has commenced a consultation seeking opinions on whether to allow non-profit organisations to make data protection regulatory complaints and bring court claims on behalf of individuals without their consent.[7] At present, the DPA 2018 gives non-profit organisations the right to bring actions on behalf of individuals who have given their permission to be represented, but the uptake of the existing provisions ‘appears to be quite low’.

Some respite has been given to commercial entities in respect of potential liability arising from data breaches. The UK Supreme Court in Wm Morrison Supermarkets PLC v Various Claimants [2020] UKSC 12 held that Morrisons was not vicariously liable for the actions of a ‘rogue’ employee who, acting outside the scope of his employment, deliberately caused a data breach.

Overview of cybersecurity legislation

The following provides a broad overview of the key UK legislation that imposes cybersecurity obligations on companies and businesses.

The Communications Act 2003

Providers of public electronic communications networks (PECNs) and public electronic communications services (PECSs) are subject to the Communications Act 2003 (CA) in respect of their activities in the United Kingdom.[8] PECN and PECS providers must take appropriate technical and organisational measures to manage risks to the security of PECNs and PECSs.[9]

The Office of Communications (Ofcom) regulates the CA. Breach notification requirements include the following: PECN and PECS providers must notify Ofcom of a breach of security that has a significant impact on the operation of a PECN or PECS, and PECN providers must notify Ofcom of a reduction in the availability of a PECN that has a significant impact on the network.[10]

The sanctions for breaching the CA are suspension of the entitlement to provide networks or services, or fines of up to £2 million.[11]

Privacy and Electronic Communications Regulations 2003[12]

PECS and PECN providers are subject to the Privacy and Electronic Communications Regulations (PECR) in respect of their activities in the United Kingdom.

PECS providers must take appropriate technical and organisational measures to safeguard the security of their services. These measures must:

  • ensure that personal data can only be accessed by authorised personnel for legally authorised purposes;
  • protect personal data stored or transmitted against accidental or unlawful destruction, accidental loss or alteration and unauthorised or unlawful storage, processing, access or disclosure; and
  • ensure the implementation of a security policy with respect to the processing of personal data.

PECN providers must comply with reasonable requests from PECS providers made for the purpose of taking the above measures.[13]

The regulator for the PECR is the ICO. With regard to breach notification requirements, PECS providers must:

  • notify the ICO of a personal data breach without undue delay. The ICO website specifies that certain essential facts of the breach must be notified to the ICO within 24 hours of the provider becoming aware of such facts;[14] and
  • if the breach is likely to adversely affect the personal data or privacy of a subscriber or user, notify the subscriber or user of the breach without undue delay unless the provider has demonstrated to the ICO’s satisfaction that it has implemented appropriate technological protection measures which render the data unintelligible to any person who is not authorised to access it, and that those measures were applied to the data concerned in that breach.[15]

Sanctions for breach include fines of up to £500,000.[16] Further, article 31 PECR grants the ICO the same general powers to enforce PECR as under the DPA 1998, including the use of enforcement notices.

GDPR/DPA 2018

Controllers and processors[17] based in the United Kingdom, and those based outside of the European Union that offer goods or services in the United Kingdom or monitor the behaviour of individuals in the United Kingdom, are subject to the provisions of the GDPR and the DPA 2018. Following the end of the Brexit transition period, the GDPR will be brought into UK law by the EU (Withdrawal) Act 2018. After the transition period ends, EU businesses that do not have an establishment in the UK will nevertheless be subject to UK data protection law if they offer goods or services in the UK or monitor the behaviour of individuals in the UK, and vice versa.

Under these laws, controllers must process personal data in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical and organisational measures.[18] Article 32 GDPR extends this obligation to processors, and specifies measures to include (as appropriate):

  • the pseudonymisation and encryption of personal data;
  • the ability to ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services;
  • the ability to restore the availability and access to personal data in a timely manner in the event of a physical or technical incident; and
  • a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.
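
The first two of these measures lend themselves to a brief technical illustration. The following is a minimal, purely illustrative Python sketch of pseudonymisation and encryption of a personal data field, assuming the widely used third-party cryptography package; the keys, field names and values are hypothetical and this is not a statement of what the GDPR requires in any particular case.

```python
# Illustrative sketch only: keyed pseudonymisation plus encryption at rest.
# Assumes the third-party "cryptography" package (pip install cryptography).
import hmac
import hashlib
from cryptography.fernet import Fernet

PSEUDONYM_KEY = b"hypothetical-secret-key-held-in-a-vault"  # hypothetical key
encryption_key = Fernet.generate_key()  # in practice, managed by a key management system
fernet = Fernet(encryption_key)

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed token that cannot be reversed without the key."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def encrypt_at_rest(value: str) -> bytes:
    """Encrypt a personal data value before it is stored or transmitted."""
    return fernet.encrypt(value.encode())

# Hypothetical record illustrating both measures.
record = {
    "customer_ref": pseudonymise("jane.doe@example.com"),
    "payment_card": encrypt_at_rest("4111 1111 1111 1111"),
}
```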

These laws are regulated in the United Kingdom by the ICO. In the case of a personal data breach, controllers must notify the ICO without undue delay and, where feasible, not later than 72 hours after having become aware of the breach, unless the breach is unlikely to result in a risk to the rights and freedoms of natural persons. Where the notification is not made within 72 hours, reasons for the delay must be provided.
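
As a purely illustrative aid to the timing rule (the function name and timestamps below are hypothetical), the 72-hour clock runs from the moment the controller becomes aware of the breach:

```python
from datetime import datetime, timedelta, timezone

def ico_notification_deadline(aware_at: datetime) -> datetime:
    """Latest time to notify the ICO: 72 hours after the controller becomes aware of the breach."""
    return aware_at + timedelta(hours=72)

# Hypothetical discovery time of a breach.
aware_at = datetime(2020, 9, 1, 14, 30, tzinfo=timezone.utc)
print(ico_notification_deadline(aware_at))  # 2020-09-04 14:30:00+00:00
```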

Further, processors must notify the controller without undue delay after becoming aware of a personal data breach.[19] Where a personal data breach is likely to result in a high risk to the rights and freedoms of natural persons, controllers are also obliged to notify data subjects of the breach without undue delay.[20]

Sanctions for breaching the above laws include:

  • for controllers under article 5(1)(f) GDPR: fines of up to €20 million or 4 per cent of the total annual worldwide turnover, whichever is higher;[21] and
  • for processors and for controllers under article 32 GDPR: fines of up to €10 million or 2 per cent of worldwide turnover, whichever is higher.[22]
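
By way of a worked illustration of how the ‘whichever is higher’ caps operate (the turnover figure used below is hypothetical):

```python
def fine_cap(worldwide_turnover_eur: float, fixed_cap_eur: float, turnover_pct: float) -> float:
    """Maximum fine: the higher of a fixed amount and a percentage of worldwide annual turnover."""
    return max(fixed_cap_eur, turnover_pct * worldwide_turnover_eur)

turnover = 2_000_000_000  # hypothetical annual worldwide turnover of 2 billion euros
print(fine_cap(turnover, 20_000_000, 0.04))  # 80000000.0 (80 million euro cap, article 83(5))
print(fine_cap(turnover, 10_000_000, 0.02))  # 40000000.0 (40 million euro cap, article 83(4))
```

For an organisation with a smaller turnover, the fixed figures of €20 million and €10 million would be the operative caps.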

The ICO also has the power to impose an absolute ban on processing any personal data, or a ban on the processing of certain descriptions of personal data in a particular manner or at a particular time.[23] An enforcement notice can only be imposed for certain types of breaches,[24] including breaches of data subjects’ rights and failure to notify the ICO of a personal data breach.

Directors may also be personally liable to prosecution for criminal offences committed by their company under the DPA 2018.[25]

NIS Regulations 2018

The NIS Regulations apply to operators of essential services (OESs) in the energy, transport, health, drinking water supply and distribution, and digital infrastructure sectors that satisfy certain threshold requirements under Schedule 2 of the NIS Regulations. Any OES that provides services in the United Kingdom falls within the scope of the NIS Regulations, regardless of where it is actually based. OESs must take appropriate and proportionate measures to:

  • manage risks posed to the security of the network on which their essential service relies, which measures must, having regard to the state of the art, ensure a level of security of network and information systems appropriate to the risk posed; and
  • prevent and minimise the impact of incidents affecting the security of the network and information systems used for the provision of an essential service, with a view to ensuring the continuity of those services.[26]

Relevant digital service providers (RDSPs) are also subject to the NIS Regulations. Essentially, an RDSP is any provider that:

  • provides an online marketplace, search engine or cloud computing service;
  • has a head office or nominated representative in the United Kingdom; and
  • satisfies certain size and turnover thresholds.

RDSPs must identify and take appropriate and proportionate measures to manage the risks posed to the security of the network and information systems on which they rely to provide, within the European Union, an online marketplace, online search engine or cloud computing service.[27]

Schedule 1 of the NIS Regulations designates sector-specific competent authorities for OESs, and the ICO is the competent authority for RDSPs.

In the case of a breach, an OES must notify its designated competent authority without undue delay and in any event no later than 72 hours after becoming aware of any incident that has a significant impact on the continuity of the essential service the OES provides.[28] RDSPs must notify the ICO without undue delay and in any event no later than 72 hours after becoming aware of any incident having a substantial impact on the provision of any of the relevant digital services.[29]

Fines of up to £17 million may be issued for breaching the NIS Regulations.[30] The relevant competent authority can also serve an enforcement notice prescribing steps that an OES or RDSP must take in order to rectify a failure to fulfil its security duties under article 10 or article 12.

FCA Handbook

Financial services firms regulated by the FCA are subject to the FCA Handbook. The Handbook contains no cybersecurity-specific provisions, but firms are required to take reasonable care to organise and control their affairs responsibly and effectively, with adequate risk management systems.[31]

Under the FCA Handbook, a firm must report material cyber incidents to the FCA. An incident may be material if it:

  • results in significant loss of data or the availability or control of a firm’s IT systems;
  • affects a large number of customers; or
  • results in unauthorised access to, or malicious software present on, a firm’s information and communication systems.[32]

The FCA published an update[33] to its insight document of March 2019,[34] highlighting cybersecurity risks for firms operating in the financial sector. The areas identified as presenting the highest risk were: supply chain risk; increasing social engineering attacks; ransomware; insider threats; and credential stuffing. The FCA also warns of the risks posed by inappropriate and ineffective identity and access management and by sophisticated malicious emails. It recommends that the following areas be reinforced: cloud security; development and security operations; and payment systems security.

There is no upper limit on the fines the FCA may impose.

Overview of recent enforcement cases

Under GDPR

The table below summarises three examples of recent enforcement action under the GDPR.

GDPR Enforcement

| Date | Company | Breach | Fine |
| --- | --- | --- | --- |
| 16/10/2020 | British Airways PLC | Hackers diverted customers away from its website and fraudulently harvested data; ‘poor security measures’; affected 500,000 customers. | £20 million[35] |
| 30/10/2020 | Marriott International Inc | Exposure of nearly 339 million guest records globally; insufficient due diligence when acquiring a subsidiary. | £18.4 million[36] |
| 13/11/2020 | Ticketmaster | Failure to put appropriate security measures in place to prevent a cyber attack on a chatbot installed on its customer payment page and so to protect customer data. | £1.25 million[37] |

In light of increasing privacy concerns and increased regulatory powers under the GDPR, we can expect greater fines and enforcement under privacy laws in the European Union in general. Under the GDPR, the ICO has fined British Airways PLC £20 million following a cyber incident in September 2018 that diverted its website users to a fraudulent site that harvested customers’ data. The breach affected around 500,000 customers, and the ICO noted in its press release that ‘poor security arrangements’ at the company were partly to blame.[38] Similarly, the ICO fined Marriott International, Inc £18.4 million in connection with a cyber attack in which hackers stole 339 million guest records (including those of UK and EU citizens) dating back to 2014, although the fine related only to Marriott’s failure to protect customer data after the GDPR came into force in May 2018. The ICO noted that the company had not undertaken sufficient due diligence when it acquired Starwood Hotels & Resorts Worldwide (where the breach originated) and had not paid sufficient attention to the security of the IT system thereafter.[39] This highlights the importance of both legal and technical due diligence in mergers and acquisitions transactions, and the risks that follow if such issues are overlooked. Both of these fines, at the time of writing, can still be appealed.

The ICO issued 14 penalty notices and five enforcement notices in 2020 (despite having to respond to lengthy and detailed representations in the British Airways and Marriott cases), and it is clear that the ICO will continue to impose large fines where it believes that global conglomerates have failed to take adequate measures to protect their customer records from cyber attack, resulting in data breaches that affect large numbers of individuals. The difference between pre-GDPR enforcement and the position now is that the ICO is able to impose much higher fines under the GDPR, although notably these fines still fall short of the upper limit (4 per cent of an entity’s annual revenue) permitted by the legislation.

In the Ticketmaster case, the fine was about 1 per cent of turnover. The data breach, via a chatbot on Ticketmaster’s online payments page, included unauthorised access to names, payment card numbers, expiry dates and CVV numbers, and potentially affected 9.4 million of Ticketmaster’s customers across Europe including 1.5 million in the UK. Investigators found that, as a result of the breach, 60,000 payment cards belonging to Barclays Bank customers had been subjected to known fraud. Another 6,000 cards were replaced by Monzo Bank after it suspected fraudulent use. The ICO found that Ticketmaster failed:

  • to assess the risks of using a chatbot on its payment page;
  • to identify and implement appropriate security measures to negate the risks; and
  • to identify the source of suggested fraudulent activity in a timely manner.

Global enforcement trends

In general, there is a global trend towards greater enforcement of privacy and data-related breaches. In the United States, for example, the Federal Trade Commission recently imposed a record fine of US$5 billion on Facebook for allowing third-party applications to download users’ data on the Facebook platform.[40] The fine related to the Cambridge Analytica scandal, which affected around 50 million Facebook users and centred on the default ‘opt-in’ setting for a third-party application that harvested users’ friends’ data. Facebook also agreed to stricter controls and regulatory oversight by privacy and consumer protection agencies, including the establishment of an independent privacy committee of Facebook’s board of directors, quarterly compliance certifications and a 20-year commitment to overhaul privacy governance at the company.

Additionally, in August 2019, the state of New York passed the Stop Hacks and Improve Electronic Data Security Act (the SHIELD Act), which took effect in March 2020. The SHIELD Act is a significant development in this area because it imposes specific data security requirements on businesses that own or license private information, stipulates particular breach notification obligations and leaves enforcement to the New York Attorney General. California has also passed two pieces of legislation governing cybersecurity, both of which came into force on 1 January 2020. The first is Senate Bill No. 327 (the Bill), which governs the cybersecurity of internet of things (IoT) devices and requires manufacturers of IoT devices to include ‘reasonable’ security features appropriate for the relevant device. The second is the California Consumer Privacy Act 2018 (CCPA), which largely parallels the European Union’s GDPR and includes a private right of action for data breach incidents in California.

In September 2020, Brazil’s Lei Geral de Proteção de Dados (LGPD), the General Data Protection Law, came into effect. The law has extraterritorial application and applies to any business or organisation that processes the personal data of people in Brazil, regardless of the location of the organisation itself. The LGPD grants rights that are essentially the same as the GDPR’s eight fundamental rights and requires that a data protection officer (DPO) be appointed ‘to be in charge of the processing of data’, although it does not outline when this would be required. In addition, the LGPD has 10 (as opposed to the GDPR’s six) lawful bases for processing. The LGPD is less severe in its application: it does not set a deadline for reporting breaches, nor are its fines as severe as those available under the GDPR.

On 16 July 2020, the Court of Justice of the European Union (CJEU) in Schrems II (Case C-311/18) ruled that the EU/US Privacy Shield scheme (an agreement between the United States and the European Union to facilitate transfers of personal data from the European Union to the United States) is invalid, and that the standard contractual clauses (another permitted method of transferring personal data to countries outside the European Union in accordance with the GDPR) are not an ‘appropriate safeguard’ for data transfers if they are not, or cannot be, complied with in the relevant third country. The consequences of this judgment are still being worked out by regulatory authorities and courts alike (the European Data Protection Board adopted recommendations relating to the decision on 10 November 2020), but one side effect has been an increasing focus on cybersecurity issues when transferring data to the United States: one of the principal factors in determining whether a transfer under the standard contractual clauses is compliant with the GDPR is whether the security of the data (eg, the level of encryption) is such that the federal authorities of the United States cannot access it in a readable form if they intercept it using their rights under US law.

On 6 October 2020, the CJEU issued its judgment in the Privacy International case (Case C-623/17), and in joined cases, La Quadrature du Net and Others (C-511/18), French Data Network and Others (C-512/18), and Ordre des Barreaux Francophones et Germanophone and Others (C-520/18), regarding the EU’s privacy and electronic communications directive (2002/58/EC) (the E-privacy Directive) and tangentially the GDPR.

In its ruling, the CJEU confirmed that EU law precludes member states from adopting national legislation that requires electronic communications service providers, such as internet service providers (ISPs), to carry out the general and indiscriminate transmission or retention of traffic and location data for the purpose of combating crime in general or of safeguarding national security.

The effect of the CJEU’s ruling is that it will be more difficult for the security services to use bulk surveillance technology for the purposes of crime prevention and national security. While such surveillance and bulk data retention is permitted in some circumstances, the CJEU’s ruling makes it clear that UK domestic legislation needs adjustment to align with EU data protection laws. This will also have an impact on Brexit – see below.

The impact of Brexit

Transition period until 31 December 2020

During the transition period, the UK has ceased its membership of EU political institutions, but it continues to follow EU rules and regulations. In particular, the GDPR continues to apply and the ICO has said it will be ‘business as usual’ for data protection. What will happen at the end of the transition period is less certain, particularly without a trade deal in place, but it is likely that similar regimes will continue to apply – see the next section.

After the transition period (from 1 January 2021)

The GDPR is an EU Regulation and will no longer have direct force in the United Kingdom (other than in respect of its extraterritorial jurisdiction). However, the DPA 2018 will in effect mean that the same obligations as those set out in the GDPR will continue to apply in the United Kingdom after the transition period. The main uncertainty is how transfers of data will occur between the United Kingdom and the European Union. In theory, there should be little change: the United Kingdom has the same laws as the rest of the European Union in this regard and has a better track record than many EU member states in implementing and enforcing them. However, members of the European Parliament and the European Commission have expressed concerns about the proportionality of the UK government’s powers to intercept communications and thus process personal data for the purposes of national security, and, if a trade deal cannot be reached, it is not impossible that, as a third country, the United Kingdom will be treated in a similar way to the United States and other third countries whose data protection regimes are not approved by the European Commission. These concerns have been reinforced by the judgment of the CJEU in Privacy International (Case C-623/17), briefly described above. Without some sort of deal, an adequacy decision by the European Commission (ie, a decision that the UK can ensure an adequate level of protection for personal data, thus permitting cross-border data flows outside the EU without any further safeguards being necessary) seems unlikely.

In addition, in Schrems II, in relation to the use of standard contractual clauses, the CJEU ruled that data can be transferred to a third country using this method, but only if the personal data is subject to safeguards under domestic law that guarantee a level of protection equivalent to the GDPR. Given the view of the CJEU in the Privacy International case that UK law does not provide such safeguards, the legality of using standard contractual clauses for the flow of data from the EU to the UK is questionable. This could mean that standard contractual clauses cannot be used to facilitate the transfer of data from the EU to the UK, which would have huge implications for commerce and industry. If this occurs, businesses operating in the EU after 1 January 2021 may not have an easily available mechanism to transfer personal data to the UK.

The ICO has produced a guide and resources for organisations after the transition period, which is accessible on its website.[41] United Kingdom companies that process personal data to which the GDPR applies will also need to appoint a data protection representative in the European Union.

The end of free movement of people, services and data is likely to discourage skilled job-seekers from entering the UK cybersecurity talent pool and (including for the reasons discussed above in respect of personal data) will make cross-border intelligence sharing harder.

The EU NIS Directive has been implemented in UK law, so it will continue to apply. The ICO has said that, from the end of the transition period, UK-based digital service providers offering services in the EU may need to appoint a representative in one of the EU member states in which they offer services. They will need to comply with the local NIS rules in that member state as well.

Europol is the EU’s agency for law enforcement cooperation, and the UK will lose its seat on Europol’s management board once the transition period ends. The UK will also lose access to vital resources such as Europol’s central intelligence database (the Europol Information System, or EIS), the Schengen Information System (SIS) and the collection of passenger name record (PNR) data. There are ongoing discussions as to what relationship the UK will have with Europol, but the most likely outcome is suggested to be a ‘second-tier membership’, similar to the relationship that non-EU countries such as Norway and the United States have. The UK will remain a part of other bodies devoted to cybersecurity, such as the Five Eyes (FVEY), the intelligence alliance of the Anglosphere.

UK criminal enforcement of cyber crime

The relevant agencies

The UK government has established the National Cyber Crime Unit (NCCU) as a division of the National Crime Agency. The NCCU leads and directs the United Kingdom’s response to cyber crime, provides specialist capability to support other law enforcement agencies and coordinates the national response to the most serious cyber threats.

The NCCU works closely with the Metropolitan Police Cyber Crime Unit, a team of specialist detectives within the Metropolitan Police responsible for investigating complex cyber crime and fraud. Local forces will usually have their own cyber unit, but will refer the more serious cyber incidents to either this unit of the Metropolitan Police or the NCCU.

The ICO and the Director of Public Prosecutions (DPP) have the power to prosecute the offences relating to the misuse of personal data set out at sections 170 to 173 of the DPA 2018. The written consent of the DPP is required in order to bring a private prosecution.

Legislation

Computer Misuse Act 1990

Despite its age, the Computer Misuse Act (CMA) 1990 (as amended) is the primary statute in the UK that criminalises acts that facilitate or result in breaches of cybersecurity. The CMA 1990 creates five offences:

  • Unauthorised access to computer material (section 1): for example, using someone’s password to gain access to specific data without their permission. This offence is punishable by, on summary conviction, imprisonment for a term not exceeding 12 months or a fine, or both; and, on indictment, imprisonment for a term not exceeding two years or a fine, or both.
  • Unauthorised access with the intent to commit or facilitate commission of further offences (section 2). The further offences must in themselves carry a sentence of five years’ imprisonment or more (eg, theft), meaning that this section is likely to be used for offences such as unauthorised access to a business’s records to steal customers’ credit card details. This offence is punishable by, on summary conviction, imprisonment for a term not exceeding 12 months or a fine, or both; and, on indictment, imprisonment for a term not exceeding five years or a fine, or both.
  • Unauthorised acts with intent to impair, or recklessness as to impairing, the operation of a computer (section 3): for example, by a denial of service attack or the insertion of malware into a computer program. This offence is punishable by, on summary conviction, imprisonment for a term not exceeding 12 months or a fine, or both; and, on indictment, imprisonment for a term not exceeding 10 years or a fine, or both.
  • Unauthorised acts causing or creating risk of serious damage (section 3ZA). This offence was introduced by the Serious Crime Act 2015 and is an aggravated form of the section 3 offence, designed to cater for serious computer misuse that (for example) damages critical national infrastructure and where the maximum penalty available under section 3 is inadequate. This offence is indictable only, with a maximum of 14 years’ imprisonment for cyber attacks causing, or creating a significant risk of, severe economic or environmental damage or social disruption, and a maximum of life imprisonment for cyber attacks that result in loss of life, serious illness or injury, or serious damage to national security.
  • Making, supplying or obtaining articles (eg, hacking tools) for use in offences under sections 1, 3 or 3ZA (section 3A).[42]

The CMA deals with the borderless nature of cybercrime by conferring broad jurisdiction on the English courts to deal with the above offences, provided a ‘significant link’ can be established between the offence and the United Kingdom. A significant link comprises any one of the following: the accused was a United Kingdom national when the unauthorised act was committed; the accused was in the United Kingdom at the time the unauthorised act was committed; the computer subject to the unauthorised act was in the United Kingdom at the time; or (in respect of section 3ZA) the unauthorised act caused or created a risk of serious damage in the United Kingdom.[43]

Fraud Act 2006

A person found guilty of an offence under section 3A of the CMA 1990 might also be guilty of an offence under the Fraud Act 2006 if the ‘article’ made, supplied or obtained was intended for use in fraud. An offence of making or supplying articles for use in fraud under section 7 of the Fraud Act is punishable by a maximum of 10 years’ imprisonment, while an offence of possession of articles for use in fraud under section 6 is punishable by a maximum of five years’ imprisonment.

DPA 2018

Section 170 of the DPA 2018 creates offences of: the deliberate or reckless obtaining, disclosing, procuring and retaining of personal data without the consent of the data controller; and the sale (or offering for sale) of data obtained in such a manner. Additionally, section 198 provides that directors may be personally liable to prosecution for criminal offences committed by their company.

As noted above, proceedings for offences under section 170 may be instituted only by the ICO, or by or with the consent of the DPP.[44]

Despite proposals made during the passage of the bill that certain offences created by the DPA 2018 be punishable by imprisonment, the DPA 2018 preserves the status quo of financial penalties only. While the Crown and Magistrates’ Courts can impose unlimited fines, there is little authority on the appropriate level of fines for the section 170 offences. Most cases brought by the ICO for offences under section 55 of the DPA 1998 (the precursor to section 170 DPA 2018) resulted in fines in the hundreds or low thousands of pounds.

Other noteworthy developments in cybersecurity offences

In its most recent report on data security incident trends, published in September 2020 and covering the first quarter of 2020/2021, the ICO indicates that cyber attacks accounted for 412 of a total of 1,446 data breach reports; of these, 185 were the result of phishing, 87 of unauthorised access, 61 of ransomware and 16 of malware.[45]

The data indicates that the great majority of incidents reported to the ICO are the result of human error or organisational failings rather than breaches of cybersecurity. This may be the result of businesses becoming savvier when it comes to cybersecurity planning and investing in cybersecurity software, and being, generally, more alert to potential issues. In addition, these figures show that there has been a reduction in the number of breaches being reported to the ICO: in comparison with the equivalent figures for the first quarter of 2019–2020, 1,645 fewer breaches were reported. The reasons for both of these results probably include entities improving their data protection compliance and reporting less in response to guidance published by the ICO, which indicated that there had been some over-reporting. Other explanations could be changes in attacker behaviour, the impact of the covid-19 pandemic and, indeed, changes in the way businesses are approaching their reporting obligations – the latter could, for example, be owing to an organisation’s unwillingness to admit to cybersecurity breaches in light of potentially hefty liability under the GDPR.

The recent penalty notices issued by the ICO also highlight the importance of controllers ensuring that their suppliers adhere to cybersecurity standards. In the Marriott case, the IT system that was compromised was managed on its behalf by Accenture; and in the Ticketmaster case, access to its data was achieved via the third-party supplier of its chatbot function. Indeed, the National Crime Agency has identified the failure to consider suppliers as a potential entry point to a company’s IT systems – and thus as a means of enabling attacks – as a cause of much cyber crime.[46]

Incident preparation and response: practical advice for companies

There are a number of basic policy and infrastructure measures businesses should have in place in order to report breaches and to minimise damage in the event of a cyber attack. Perhaps the most important is a robust breach response protocol.[47] This includes an awareness of, and planning for compliance with, all relevant notification requirements under the GDPR and related legislation, including stock exchange rules on announcements for listed companies, as well as having a cross-disciplinary team of IT, legal, public relations, compliance, insurance and human resources staff and advisers in place to ensure a holistic approach to incident management. Attention should be paid to the relevant rules that apply to ‘processors’ and ‘controllers’ (as defined in the GDPR). The protocol should, in particular, address the following (a minimal illustrative sketch follows this list):

  • who is responsible for identifying a data breach;
  • who the response team members are;
  • how to evaluate and contain a data security breach;
  • how and when to notify the regulator or law enforcement;
  • handling dialogue with those third parties;
  • managing external communications and media enquiries;
  • remediation measures to be taken following a breach; and
  • how and when to notify affected individuals and the reasons for doing so.
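
A minimal sketch of how such a protocol might be captured in machine-readable form so that it can be version-controlled, reviewed and tested alongside the written plan; every role, step and figure below is hypothetical and illustrative only.

```python
# Hypothetical breach-response runbook kept alongside the written protocol.
BREACH_RESPONSE_RUNBOOK = {
    "breach_identification_owner": "Head of Information Security",  # hypothetical role
    "response_team": ["IT", "Legal", "PR", "Compliance", "Insurance", "HR"],
    "containment_steps": [
        "isolate affected systems",
        "preserve logs and evidence",
        "engage forensic advisers",
    ],
    "notification": {
        "ico_deadline_hours": 72,  # GDPR article 33: without undue delay, within 72 hours
        "data_subjects": "without undue delay where the breach poses a high risk",
        "listed_company": "check applicable stock exchange announcement rules",
    },
    "external_communications_owner": "Head of Communications",  # hypothetical role
    "remediation": ["apply fixes", "post-incident lessons-learned review"],
}

def overdue(hours_since_aware: int) -> bool:
    """Simple check against the 72-hour ICO notification window."""
    return hours_since_aware > BREACH_RESPONSE_RUNBOOK["notification"]["ico_deadline_hours"]
```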

Response protocols must remain valid and up to date, so regular ‘stress tests’ of company protocols are recommended, alongside typical IT security tests (eg, penetration testing and similar tools).

Businesses should also assess whether they can be insured against cyber attacks. The cyber-insurance industry is growing fast but there are numerous difficulties involved in insuring against cyberthreats.

Some insurance companies are developing services aimed at quantifying and pricing cyber risk by, inter alia, developing their own underwriting models that incorporate numerous cyber-related variables. Until these products mature, businesses should ensure they review their insurance policies and, where relevant, update policy language to take account of cyberthreats. Indeed, some insurers and security firms may offer response teams and ‘stress testing’ to assess potential damage in the event of a cyber attack, and base insurance pricing on the results of that test.

The authors wish to thank Hui Ying Chee, Reena Patel, Jujhar Dhanda and Lois Child for their assistance in writing this chapter.


Footnotes

4 Specifically, British Airways’ holding company International Consolidated Airlines Group. See https://ico.org.uk/media/action-weve-taken/mpns/2618421/ba-penalty-20201016.pdf.

6 Section 198 DPA 2018.

8 See section 32 for the definition of PECN and PECS.

9 Section 105A CA 2003.

10 Section 105B CA 2003.

11 Sections 105C and 105D CA 2003.

12 Subject to Brexit developments, the PECR 2003 is expected to be replaced in due course by the E-Privacy Regulation. The European Commission’s original intention was for the E-Privacy Regulation to come into force alongside the GDPR on 25 May 2018, but the draft regulation has yet to be finalised. While the GDPR applies to the processing of personal data more broadly, the draft regulation covers cookies, electronic direct marketing, over-the-top (OTT) services and machine-to-machine (M2M) communications. Notably, the draft regulation also introduces GDPR-style fines linked to a percentage of annual worldwide turnover.

13 Article 5 PECR 2003.

15 Article 5A PECR 2003.

16 Article 31 PECR.

17 Article 4, GDPR defines data controllers and data processors as below:

(7) ‘controller’ means the natural or legal person, public authority, agency or other body which, alone or jointly with others, determines the purposes and means of the processing of personal data; where the purposes and means of such processing are determined by Union or Member State law, the controller or the specific criteria for its nomination may be provided for by Union or Member State law;

(8) ‘processor’ means a natural or legal person, public authority, agency or other body which processes personal data on behalf of the controller.

18 Article 5(1) GDPR.

19 Article 33 GDPR.

20 Article 34 GDPR.

21 Article 83(5) GDPR.

22 Article 83(4) GDPR.

23 Sections 149 and 150 DPA 2018.

24 As set out in section 149(2)-(5) DPA 2018.

25 Section 198 DPA 2018.

26 Article 10 NIS Regulations.

27 Article 12 NIS Regulations.

28 Article 11 NIS Regulations.

29 Article 12 NIS Regulations.

30 Article 18 NIS Regulations. The ICO has confirmed that its NIS enforcement powers are separate from those under the DPA 2018, and that it is able to impose fines under both NIS and data protection law if appropriate and proportionate to do so.

31 Principle 3, PRIN 2.1.1.

35 This is considerably less than 1 per cent of global turnover for British Airways PLC, which is £12.26 billion.

36 This appears to amount to around 1 per cent of global turnover for Marriott International, Inc.

42 Section 3A CMA 1990.

43 Section 5 CMA 1990.

44 Section 197(1) DPA 2018.
