
Data protection isn’t technological but organic


Humans are the problem.

Way back in 2006, the Council of Europe decided that the 28th of January would be Data Protection Day, or Data Privacy Day if you happen to be outside of Europe. Call it what you will, but truth be told, as we marked the 13th such event last week, data privacy has never been more important, or more lacking.

Despite the EU's best efforts in finally bringing the General Data Protection Regulation (GDPR) to life, I have yet to see any noticeable impact on data protection out there in the real world. Not that I am surprised, given that a whole clutch of US laws and regulations have pretty much failed to protect the privacy of data in any meaningful way. Neither the Health Insurance Portability and Accountability Act (HIPAA) nor the Gramm-Leach-Bliley Act (GLBA), and you may as well throw the EU-US Privacy Shield Framework into the mix, has prevented data breach numbers from rising year on year. The reason is as simple as it is obvious: humans are, well, only human.

Let me explain, and that means understanding what real threat intelligence means when it comes to data protection. The accepted definition of threat intelligence is something along the lines of 'the information used by an organisation to understand the threats facing it', but all too often the focus falls on threat technology: software vulnerabilities, the evolving malware landscape, misconfigured data storage buckets and so on. The irony is that threat intelligence so often fails to recognise that the biggest threat to data protection isn't technological but organic. Threat intelligence could almost be reduced to a single statement, writ large and stuck on every server, laptop and smartphone: people are the problem.

I’m not talking about phishing, social engineering or hacking humans either. I’m talking about good old-fashioned human frailties, which mean that once your data, in whatever form it takes, leaves your hands and lands in someone else’s, then as far as data protection is concerned it’s game over. Seriously. Once you hit send, your email, instant message, document or whatever is no longer in your control, and that spells trouble. Sure, you may well be compliant with whatever framework tickles your regulatory fancy, but just because you’ve done the technologically correct thing and provided end-to-end encryption for the data in transit, that is not the same as understanding the risk that remains. That risk is magnified once the data is with someone else, someone you have no control over and whose actions you have no visibility into. It matters not whether we are talking about a third-party supply chain or the CEO’s secretary: they absorb the risk once they receive the data, but that isn’t the same as absolving you of it.

Absolve: to free someone of guilt, obligation or punishment. It’s a strong yet totally apt word when it comes to data privacy. Just because the data is now in someone else’s hands, on someone else’s network, doesn’t mean you can afford to forget about it. Having taken ‘necessary measures’ to ensure your handling of that data has been ‘as secure as possible’ might save you from a regulatory spanking, but it won’t save your business from the financial or reputational damage that free-running data can cause.

Which brings me back to my point about humans being only human: they make mistakes. According to one in-depth report at the end of last year, human error is a greater data breach risk than cyber-attack. The report found that the most common of these errors were sending data to the wrong person, leaving data somewhere insecure, and the loss or theft of unencrypted devices. I’d add, for good measure, people trying to be more productive by deliberately bypassing clunky in-house tools in favour of faster, more convenient (and less secure) alternatives.

Until and unless everyone from the C-suite down understands the people problem, and really gets the importance not just of knowing that data is being sent securely but of maintaining a view (and control) of what happens once it gets there, Data Protection Day will remain on a par with Valentine’s Day: sure, it’s nice to get a card and some flowers once a year, but it’s far nicer to be showered with love and affection the whole year round…