What’s the worst thing you’ve accidentally deleted without meaning to? I can assure you that it wasn’t anywhere near as bad as Google Cloud deleting the entire customer account (including all backups) of UniSuper, a $135 billion Australian pension fund, putting services for its more than 615,000 members on the fritz for nearly two weeks.
As reported by Ars Technica, UniSuper’s decision to pick Google Cloud for its digital services was a big enough deal to get its own press release, with the fund outsourcing crucial infrastructure to the internet giant, much as many companies do with Amazon Web Services.
While UniSuper said that pension payments were not disrupted by the outage, customers were unable to log into their accounts on mobile or desktop, and the fund’s outage update timeline implies that transactions couldn’t be processed during the blackout, with a rollback of data to the end of April required once the dust settled. Thankfully, UniSuper had kept a backup of its cloud data with another service, which helped it fully resume operations.
But it’s still not clear just how this occurred: deleting a multi-billion-dollar client’s account and every backup therein, bypassing all safeguards and knocking out financial services for hundreds of thousands of pensioners for nearly two weeks, is, in the words of Brass Eye, the one thing we didn’t want to happen.
A joint statement from Google Cloud and UniSuper describes the deletion as resulting “from an unprecedented sequence of events whereby an inadvertent misconfiguration during the provisioning of UniSuper’s Private Cloud services ultimately resulted in the deletion of UniSuper’s Private Cloud subscription.” What, that wasn’t crystal clear to you? Basically, “it was a freak accident.”
Software developer Daniel Compton points out that the vague language in Google and UniSuper’s statement does very little to explain how this occurred on a technical level. Compton argues that one likely explanation is some error on UniSuper’s part when using a common cloud infrastructure tool, but Google Cloud seemed to accept blame in a statement issued during the outage, implying the bug was on its own end.
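Neither company has said which tool was involved or what the misconfiguration actually was, so take this as a purely hypothetical sketch of the failure mode Compton describes: declarative provisioning tools (Terraform is the best-known of the breed, though nobody has confirmed it played any part here) work by reconciling what’s actually running against what a config file says should be running, and they’ll happily tear down anything the config no longer mentions. Every name below is made up for illustration.

```python
# Hypothetical sketch of declarative provisioning, not any vendor's real code:
# the tool deletes whatever is deployed but absent from the desired config.

deployed = {"member-portal", "payments-db", "backup-vault"}  # what's running

# The config as intended...
desired = {"member-portal", "payments-db", "backup-vault"}

# ...and after one bad edit (a typo'd rename, a deleted block, the wrong
# environment targeted), the tool now believes a live resource shouldn't exist:
desired = {"member-portal", "payments-db"}

for resource in deployed - desired:
    print(f"DESTROY {resource}")  # a real tool would actually delete it here
```

One mangled edit and the reconciliation loop dutifully schedules a live system for destruction, which is why tools in this class offer explicit guardrails (Terraform, for one, has a prevent_destroy setting) for anything you can’t afford to lose.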
My theory? Some hapless Google employee was chasing a chimpanzee through the server room (you know how those permissive tech offices are, ball pits and whatnot), slipped on a roller skate, and accidentally hit the big red “Delete the Australian pensioners’ data” button. They really should have installed a flip cover on that thing. Whatever the explanation may be, clearly no tech service is too big to fail, and you should keep redundant backups of any data that matters to you, he says, ignoring his 20-year-old, 308-game-strong Steam library.