It's time to start punishing election officials for security and privacy violations

It’s not just the federal government that does not demand verifiable elections

Published November 22, 2018 12:59PM (EST)


This article originally appeared on WhoWhatWhy.org.

Following a week of often contentious and confusing recounts, it bears remembering that the US government made a conscious choice not to audit election-related systems — and that there are no plans to audit them in the future.

This hands-off attitude is like choosing to focus on who is kneeling before an NFL game rather than on who is routinely bribing the refs.

It’s not just the federal government that does not demand verifiable elections. On the state level, Georgia went to court to defend its use of hackable voting machines that leave no paper trail. And shortly after a court ruled there was not enough time to switch to a better system, WhoWhatWhy broke the news that a massive vulnerability in the state’s voter registration system would have allowed even amateur hackers to wreak havoc on the election.

I had been asked, as a privacy and security expert, to evaluate this vulnerability, and I was surprised to find multiple vulnerabilities in the system. Not only that, they all fell on the Open Web Application Security Project (OWASP) Top 10 vulnerability list: the lowest-hanging fruit, if you will, in terms of hackability.
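To make “lowest-hanging fruit” concrete, the sketch below is a purely hypothetical illustration, not code from Georgia’s system or any real one, of a broken access control flaw of the kind that sits on the OWASP Top 10: a web endpoint that hands back any voter’s record to whoever puts the right number in a URL, and the single authorization check that would close the hole. The web framework (Flask) and every name in it are my own illustrative choices.

    # Hypothetical sketch only: not code from any real election or voter system.
    from flask import Flask, abort, jsonify, session

    app = Flask(__name__)
    app.secret_key = "demo-only"  # needed for sessions; never hard-code a real secret

    VOTERS = {  # stand-in for a real voter registration database
        1001: {"name": "Alice Example", "address": "1 Main St", "dob": "1970-01-01"},
        1002: {"name": "Bob Example", "address": "2 Oak Ave", "dob": "1985-05-05"},
    }

    @app.route("/voter/<int:voter_id>")
    def voter_record_insecure(voter_id):
        # VULNERABLE (OWASP broken access control): no check on who is asking,
        # so anyone can walk through the IDs and download every record.
        return jsonify(VOTERS.get(voter_id, {}))

    @app.route("/my-record/<int:voter_id>")
    def voter_record_fixed(voter_id):
        # FIXED: the record is returned only to its authenticated owner.
        if session.get("voter_id") != voter_id:
            abort(403)
        return jsonify(VOTERS[voter_id])

A flaw of this class requires no exotic tooling to find, which is the point: it is exactly what a basic independent audit would catch.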

A responsible data controller would have immediately taken the system down and evaluated it.

Instead, Georgia’s Secretary of State Brian Kemp (who was also running for governor) denied the vulnerability, then attempted to divert blame by slinging mud at his political rival and announcing that he had asked the FBI and the Department of Homeland Security (DHS) to investigate the breach.

As a security professional, I can state that this is irresponsible behavior. The onus should be on the data controller. But Kemp was not held to account, and I suspect he never will be.

What was blatantly obvious is that this system had never been audited by an independent security auditor. Anyone with even the most basic security credentials would have discovered vulnerabilities such as the ones alleged.

We also know that this vulnerability was brought to Democratic Party officials only after it had been sent to Secretary of State Kemp, who apparently ignored it. The Democratic officials, who did not have the technical capability to test or confirm the vulnerability, then asked outside experts to assess it.

I was one of them, but it didn’t take my privacy and security expertise to notice the conflict of interest in having the secretary of state, who was tasked with keeping the election secure, also running as a candidate for governor.

It seems obvious that there should be checks and balances in place to ensure that a candidate who also oversees the election does not unduly influence the outcome.

Which brings us back to my experience. One of the things I stressed over and over again to the various reporters who interviewed me was that there are systemic problems that have to be addressed.

Here is what’s needed to protect democracy, and the personal information of citizens who exercise their constitutional right to vote:

1) Real whistleblower protection. One place to begin would be to end the prosecution of whistleblowers such as Edward Snowden, so that media and security researchers will feel safe in disclosing the problems they find.

Until then, any system that claims to protect the data of citizens is a house of cards, and the public should assume most of the systems in place are compromised, unless they have been verified by a qualified and independent expert.

2) Vulnerability reporting. The US has some of the best vulnerability reporting experts in the world, including, for example, my friend Katie Moussouris, who built the vulnerability reporting program at Microsoft and is still blazing a trail in the responsible disclosure of bugs in systems around the world.

What does responsible disclosure mean in terms of vulnerability reporting? It means creating a process that ensures a security researcher gets rewarded instead of punished. The reward is what people in the field call a bug bounty: a payment for providing a vendor with the details of a security problem in its software.

This is the opposite of what we saw in Georgia. There, the people reporting the problem were met with threats instead of rewards. At the moment, reporters and researchers take on a bigger risk by reporting a vulnerability than the data custodian does by allowing the vulnerability to exist in the first place. Until that allocation of responsibility is reversed, we can expect little to change in this area.

3) There need to be consequences for data custodians who don’t protect personal data. The European Union’s new privacy law, the General Data Protection Regulation (GDPR), shows how this can be done. With penalties of up to €20 million, or 4 percent of worldwide annual turnover, whichever is higher, organizations have a strong incentive to protect the personally identifiable information (PII) of individuals. (A back-of-the-envelope illustration of that penalty formula follows this list.)

4) It’s time for the US to consider jurisdictional privacy laws. In Canada, we have both federal and provincial (what you would call state) privacy laws. Such laws need a regulator with the power to investigate, enforce, and impose serious penalties for noncompliance. For example, if a secretary of state can simply use taxpayer money to pay fines for privacy violations, the privacy law is useless.

Getting a jurisdictional privacy law passed will not be easy. It will be possible only if politicians are persuaded to protect the personal information of Americans, rather than treating it like the political football it is today.
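To give a sense of scale for point 3: the GDPR ceiling for its most serious violations is the greater of €20 million or 4 percent of worldwide annual turnover, so the exposure grows with the size of the organization. The sketch below is a simple back-of-the-envelope illustration of that formula; the revenue figure is invented for the example.

    def gdpr_max_fine(annual_worldwide_turnover_eur: float) -> float:
        """Upper bound for the most serious GDPR violations (Article 83(5)):
        the greater of EUR 20 million or 4% of worldwide annual turnover."""
        return max(20_000_000.0, 0.04 * annual_worldwide_turnover_eur)

    # Illustrative only: an organization with EUR 2 billion in annual turnover
    print(f"Maximum fine: EUR {gdpr_max_fine(2_000_000_000):,.0f}")  # EUR 80,000,000

A penalty that scales with revenue, rather than a flat fee an organization can budget for, is what makes the incentive real.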

Kris Constable is a cybersecurity and privacy expert.

