Everybody wants to be a hero


By Jameeka Aaron, CISO, Auth0

But what are the questions we, as security professionals, should be asking ourselves?

The recent Verkada hack has again thrown hacktivism into the spotlight. From tackling hate crimes to the defence of civil rights, many forms of hacktivism play an essential role in helping stigmatized groups prosper, calling those in power to account, and highlighting poor security posturing on the part of large organizations.

But for security professionals, this recent form of hacktivism also serves as a timely reminder of the bigger ethical questions around how we collect, share, store, and talk about personal data and security vulnerabilities.

This is something that applies to all security professionals. From one-person pen testers to government organizations, we all have a responsibility to think about what we're doing, and why we're doing it.

Everyone wants to do good, but some fail to think of the consequences

In hacking over 150,000 Verkada cameras, the attackers were able to peer into prison cells, gyms, private homes, and elementary schools. Citing "hacktivist" motivations, they released a statement saying the data breach was in the service of freedom of information, anti-capitalism, and "a hint of anarchism", in addition to demonstrating the pervasiveness and security weaknesses of surveillance cameras.

This private data was seen not only by those within the hacktivist group, but also by journalists from Bloomberg, and then by security professionals from Verkada itself. The wider goal was to highlight how easily others could take advantage of this flaw, but many people had their privacy violated in order to make the point.

The Verkada hack reminds us of our dual responsibility as security professionals. Yes, we have a responsibility to protect, but that role is multifaceted. We are not only responsible for protecting the intellectual property of the organizations we work with, but also the confidentiality, integrity, and availability of personal data.

As with many forms of hacktivism, the techniques used are rarely legal: hacktivists are often involved in divulging personally identifiable information (PII), stealing non-public information, or attacking websites and servers.

With this blurring of legality and uncertainty around the impact on individual privacy, the idea that 'the end justifies the means' is a common, but often flawed, philosophy. And it's not confined to hacktivists. There have been a number of cases of federal agencies collecting data without truly considering the impact on those caught in the crossfire.

So what are the questions we should be asking ourselves?

  • How can I demonstrate the risk without adding to it?
  • Can I communicate the vulnerability without exposing the vulnerable?
  • Can the vulnerability be closed prior to exposure?
  • Why am I collecting this data, and do I feel comfortable and able to tell those impacted that reason?
  • Can I use examples with synthetic or other less invasive data?
  • Can I communicate directly with law enforcement / CEOs / trusted media agencies without exposing data to the public that could cause further harm?
  • Do I know how the individuals impacted will be able to get out of harm's way?
  • Who am I going to bring into these discussions who could help? For example: professional networks, journalists, or trusted civil liberties, consumer, and industry groups.

There's not one answer for everyone

Ultimately, unassuming consumers/students/patients/users/customers/volunteers/families/employees are almost always the main victims when the questions above are not considered. And it's often these same people who have the least power to change things, and who are most at risk from exposure.

With the power of those working in security comes responsibility, and this should be treated seriously.

There's no one-size-fits-all approach to these questions; the most important thing is to make sure we are constantly questioning, testing, and learning every step of the way. Don't ignore your gut, but check with the people who would be impacted, particularly those more vulnerable and less tech-savvy than yourself.
