COLUMBUS, Ohio (WCMH) – Everyone knows you have to have good cybersecurity to protect data from falling into the hands of those who would steal it.
Often, that means added security measures that may require software or hardware to block access.
Those security measures are only as good as the people using them, and when it comes to elections, nothing is more important than security.
Ohio Secretary of State Frank LaRose has a passion for elections and making sure they are fair, secure and unimpeachable is his mission.
He has done about as much as he can to ensure that the primary election being held in a few weeks and the general election in November will be accurate, that every eligible Ohioan will be able to vote, and that every vote will be counted.
His efforts and those of lawmakers at the Statehouse are being lauded by a partnership between the University of Southern California's Annenberg School and Google, which is taking a cybersecurity training course to all 50 states.
Their target audiences?
– Campaign professionals, to impress upon them the need for security
– Election officials, to provide best practices — some of which are coming from Ohio
– The public, to educate them on another aspect of election security they are both subject to and play a role in
LaRose pointed out the reality of the situation in Ohio.
“All the work at the board of elections, that’s done in a bipartisan way, where there is two locks and two keys for all these doors,” he said. “I always joke, it’s like those 1980s submarine movies where it takes two keys to launch the torpedo. That’s how it is at the board of elections. The room where the tabulation equipment is stored has a Republican lock and a Democratic lock. The room where the machines are stored has a Republican lock and a Democratic lock. Everything is supervised by bipartisan teams. The machines are never connected to the internet. They’re tested before the election, by bipartisan teams of experts, and they’re audited after the election where we require a post-election audit. All of those safeguards that go into place, our foreign adversaries know that. They know that they can’t actually change the results of an election, or tinker with the numbers or that kind of thing. But what they can do is they can sow doubt.”
That sowing of doubt is what the second half of Monday's training focused on. Aimed partly at campaigns and partly at the public, the information presented showed just how easy it can be to manipulate people by exploiting their biases.
LaRose went on to describe how social media can be used against us.
“This is an example from Texas where a foreign operative, a foreign adversary, created a Black Lives Matter Facebook group and also created a White Supremacist Facebook group and got them both followers and got people engaged in this,” LaRose said. “Then a few months later, they said, ‘We’re gonna have a rally.’ Well, guess what? They scheduled the White Supremacist at the same time and location as the Black Lives Matter rally. What do you think would have happened if that had gone down?”
In that instance, law enforcement sniffed out the duplicitousness of the dual rallies and was able to head off the confrontation.
In some ways, there is a greater risk of election interference coming from social media than through cyberattacks on election equipment and organizations.
According to USC Annenberg fellow for communication security Marc Ambinder, when people ingest information, it affects different parts of their brain. Positive information they agree with triggers a part of the brain that chemically rewards them. At the same time, there is another part of the brain that is triggered by things that disgust people. The reaction likewise rewards the person chemically.
Ambinder said that’s why trying to reason with people over something like a deeply-held belief is extremely difficult to do with facts alone. He added they may ultimately accept the facts to be true but may find ways to rationalize things so they can still believe what they want in spite of them.
Ambinder also said things said loudly and with confidence are more often believed, so messages are being constructed to influence people with that in mind.
“Nobody wants to be fooled by disinformation, so the more that you can sensitize to the fact that they might be fooled and say, ‘Hey, do you really want to fall for this?’ that’s actually a really effective tactic against people who are loud, and people who are certain and people who are bullies.”
Things are getting even trickier, however, as technology opens up the possibility of manipulating videos so convincingly that the deception can be difficult to recognize.
“We have now simple, readily-accessible tools to create really good-looking fake videos,” said USC Annenberg Executive Director for the Election and Cyber Security Initiative Adam Clayton Powell, III. “Candidates are all going to be facing people out there who will create videos of — whether it’s Donald Trump or whoever the Democrats nominate — saying something they never said, doing things they never did.”
Powell said such videos can be difficult for the public to spot because detecting them requires considerable expertise.
Several universities have set up a kind of video fact-checker that will analyze a video to determine its veracity.
The bad news is that information moves so fast today that by the time analysts get hold of a video and can examine it, the damage may already be done.
Powell hopes quick responses to these false videos will help minimize the damage done.
Something similar to this happened not too long ago. A video was posted by someone, claiming it was Iran firing missiles into Iraq at American forces. The video was actually several years old and from another part of the world.
Despite that, the video and the message it carried picked up 150 impressions on Twitter in an hour and was retweeted by verified users, including some who worked for news networks. According to Ambinder, the video migrated to a number of different online platforms within minutes.
Another impact outside influencers want to have on our elections and on the democratic process as a whole is for Americans to simply give up and not participate because they think it’s pointless or too complicated.
They bombard you with disinformation to put you in a state of civic paralysis, according to Ambinder. That disinformation can be amplified by prominent people you trust if they themselves are fooled into thinking it is true.
If you don’t participate in the election process because you have been convinced your voice doesn’t count or matter, then the enemies of democracy have won. That is why it is important to insist disinformation is rejected by everyone.
I asked both major political parties if they would commit to rejecting disinformation spread by anyone, including their own candidates or campaigns, as a way to assure the people of Ohio that the security and fairness of our elections are squarely in mind.
Ohio Democratic Party Chairman David Pepper made the commitment without stipulation.
“Yeah, I’m against that and if someone on my side is doing it, I would, basically, not allow for it.”
Ohio Republican Party Chairman Jane Timken also made a commitment.
“As much as my job is to protect our party brand and our candidates from disinformation and misinformation, I will do that,” she said.
LaRose made a commitment as well, though his statement limited it to calling out any disinformation he sees about our elections and their security.
The one thing everyone involved agrees on is that Ohio voters need to vote, and that sitting on the sidelines because you have been made to feel disenfranchised is not in anyone's best interest.
Here are some tools Ambinder provided. To protect yourself from being duped, he recommends bookmarking these:
Using these tools can help you figure out if the messages you are seeing on social media are true or intentionally created to deceive or manipulate you.