Weekly podcast: Uber, Google, and City of York Council vs RapidSpike

This week, we discuss the latest fines for Uber in connection with its 2016 data breach, GDPR complaints against Google, and the other side of the City of York Council ‘hack’ story.

Hello and welcome to the IT Governance podcast for Friday, 30 November. Here are this week’s stories.

The Information Commissioner’s Office has fined Uber £350,000 for the data breach it tried to cover up in 2016. According to the ICO, a “series of avoidable data security flaws” allowed attackers to download the personal details of around 2.7 million UK customers – including their names, email addresses and phone numbers – as well as the records of almost 82,000 UK-based Uber drivers from Uber’s cloud servers. (Worldwide, a total of 57 million drivers and customers saw their personal data compromised.)

The fine – issued under the Data Protection Act 1998, which was in force at the time of the incident – was compounded by Uber’s decision to pay the attackers $100,000 to destroy the data they’d downloaded rather than report the incident.

The ICO’s director of investigations, Steve Eckersley, said:

“This was not only a serious failure of data security on Uber’s part, but a complete disregard for the customers and drivers whose personal information was stolen. At the time, no steps were taken to inform anyone affected by the breach, or to offer help and support. That left them vulnerable.

“Paying the attackers and then keeping quiet about it afterwards was not, in our view, an appropriate response to the cyber attack.”

It’s been a pretty expensive week for Uber. As well as the ICO fine to contend with, it was fined €600,000 by the Dutch Data Protection Authority, the Autoriteit Persoonsgegevens, under the Netherlands’ own pre-GDPR (General Data Protection Regulation) legislation.

However, both fines pale into insignificance compared with the $148 million settlement Uber agreed in September to pay in the US in connection with the breach. Under its terms, the company was also required “to incorporate privacy-by-design into its products”.

According to the Financial Times, Uber said this week that it had made “a number of technical improvements to the security of [its] systems both in the immediate wake of the incident as well as in the years since”.

The two attackers behind the incident were indicted in October for attempting to extort a similar sum from LinkedIn’s online learning portal lynda.com (since renamed LinkedIn Learning).

It’s now been six months since the GDPR came into effect and there’s been precious little regulatory action under the new law – so far. The ICO issued an enforcement notice to Aggregate IQ Data Services Ltd in October as part of its investigation into Facebook and Cambridge Analytica. The first GDPR fine was issued in Austria the same month – a €4,800 penalty against an entrepreneur whose CCTV camera faced a public space – and, this week, the data protection authority in Baden-Württemberg, Germany, fined a local social media provider €20,000.

However, the first big GDPR fine could be on the way. Consumer agencies in seven EU countries have filed complaints against Google with their national data protection authorities for tracking users who have switched off their location history – in breach of the GDPR.

A press release from the European Consumer Organisation BEUC said:

“Google collects users’ location data notably through the features ‘location history’ and ‘web & app activity’, which are integrated into all Google user accounts. The company uses various tricks and practices to ensure users have these features enabled and does not give them straightforward information about what this effectively entails.

“These unfair practices leave consumers in the dark about the use of their personal data. Additionally they do not give consumers a real choice other than providing their location data, which is then used by the company for a wide range of purposes including targeted advertising.

“These practices are not compliant with the GDPR, as Google lacks a valid legal ground for processing the data in question.”

According to Reuters, a Google spokesperson commented:

“Location History is turned off by default, and you can edit, delete, or pause it at any time. If it’s on, it helps improve services like predicted traffic on your commute.

“If you pause it, we make clear that — depending on your individual phone and app settings — we might still collect and use location data to improve your Google experience.

“We’re constantly working to improve our controls, and we’ll be reading this report closely to see if there are things we can take on board.”

Finally, last week we reported that City of York Council had suffered a data breach that potentially affected the personal data of 5,994 locals. According to the BBC, a council app was hacked by an unnamed third party, who accessed residents’ “names, addresses, postcodes, email addresses, telephone numbers and encrypted passwords”.

The council’s reaction was to delete the app, ask for it to be removed from app stores, and advise users to delete it from their devices.

However, what wasn’t clear at the time was that the third party involved was a developer at a technology company, who had contacted the council in line with its own responsible disclosure guidelines after detecting that the app was leaking users’ personal data.

But instead of thanking him for identifying the security vulnerability, the council reported him to the police.

This Tuesday, the company – the digital asset assurance platform RapidSpike – broke its silence. It explained:

“Despite the framing of this vulnerability report by the City of York Council, this vulnerability was disclosed following the Council’s own responsible disclosure guidelines. […]

“It is important to note here that our developer did not do ‘anything to exploit the vulnerability.’ He simply browsed to a page within the app, as any user would.”

It continued: “The Council’s initial statement and subsequent public media reporting left us confused with the portrayal of the issue. There is an established precedent in the UK for legitimate security researchers to disclose vulnerabilities within information systems to relevant security teams. The Council’s positioning of this good-faith disclosure as a deliberate attack flies in the face of the UK Government’s National Cyber Security Centre advice on the matter, and the International Standard framework for vulnerability disclosure.

“We have to say that North Yorkshire Police’s Digital Investigation & Intelligence Unit dealt with the whole situation superbly, and should be commended for their approach. It must also be noted that the security community have also stood up and supported our developer, even without the full story.”

According to ZDNet, a police spokesperson said the force did not regard the incident as criminal. “We recognise the benefits of software vulnerability disclosure as part of a healthy security environment and the researcher has acted correctly,” they said.

“There are times when ‘researchers’ overstep the mark but this is not one of those. We’d rather work with public-spirited individuals and share learning than criminalise people who act in good faith.”

City of York Council subsequently tweeted:

“Whilst we consider we took appropriate measures based upon the facts at the time, we can now confirm that this was a well intended action by the individual concerned and we would like to thank them for raising this matter.”

Well, that’ll do for this week. Until next time you can keep up with the latest information security news on our blog. Whatever your information security needs – whether regulatory compliance, stakeholder reassurance or just greater business efficiency – IT Governance can help your organisation to protect, comply and thrive. Visit our website for more information: itgovernance.co.uk.
