
Disinformation in the US election campaign: How democracy is hacked

Election campaigns are times of wooing and great promises. They always have been. Candidates and parties vie for voters' favor, hiring strategists and spin doctors who gild their strengths and hide their weaknesses. As U.S. President Ronald Reagan did here, in September 1984:

"The rate of inflation has halved in the last four years, now you can look to the future with confidence again. Tomorrow is breaking again in America. And under the leadership of President Reagan our country is prouder and stronger and better."


The commercial secured his re-election. Targeted advertising isn't new either: presidential candidate Barack Obama was already using social media such as Facebook successfully in 2008. In the meantime, however, the machines learn faster, the algorithms have grown more sophisticated, and the data on each individual has become even more gigantic. Nobody uses these tools more resolutely than Team Trump, even between elections. Marcel Schliebs, researcher on the "Computational Propaganda" team at the Oxford Internet Institute:

"During the Trump impeachment, the Trump Campaign placed 14,000 different campaign slogans and images in the Facebook ad library, which is publicly available, targeting very different strata of the population, ages, genders, etc. with different messages A classic example would be that the mother, who has just had a child, is shown a campaign picture on Facebook about some kind of statistic that migrants, for example from Mexico, have kidnapped a child or have become violent Election campaigns get Hillary Clinton to take his guns away from him. "

The Trump people, says Schliebs, understand above all how to use behavioral data: "That means: how do people react to different slogans, and which variations make them more or less successful? The Brexit campaign, by the way, did something very similar."
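What such slogan testing boils down to, in statistical terms, is comparing response rates between ad variants. A minimal sketch with invented numbers, using a simple two-proportion z-test (real campaigns run thousands of such comparisons in parallel on platform tooling rather than hand-rolled statistics):

```python
from statistics import NormalDist

def ab_test(clicks_a: int, views_a: int, clicks_b: int, views_b: int):
    """One-sided two-proportion z-test: does variant B beat variant A?"""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = (pooled * (1 - pooled) * (1 / views_a + 1 / views_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 1 - NormalDist().cdf(z)  # small p-value: B likely better
    return p_a, p_b, z, p_value

# Invented numbers: slogan B draws more clicks per impression.
print(ab_test(clicks_a=480, views_a=60_000, clicks_b=590, views_b=60_000))
```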

Brad Parscale, former head of Donald Trump's election campaign (www.imago-images.de/SMG)

Political messages are perfectly calibrated

Specialized companies with technical and scientific know-how help election strategists calibrate their messages ever more precisely, above all via social media. "We took advantage of opportunities that the other side probably didn't."

After the 2016 election victory, Brad Parscale, a giant with a Viking beard, was celebrated on US television. The data, reported Parscale, until recently Trump's digital guru, dictated the election campaign: fundraising, advertising spending, travel routes, the topics of the speeches. On some days, up to 100,000 variants of a post were played through on Facebook, with words and colors diligently tried out, aided by Facebook staff specially assigned to Trump's campaign headquarters.

"When I spend $ 100 million on a platform, the largest amount ever, it makes sense for their people to be there to make sure we are spending the money right. And they are doing everything right."

Parscale's balance sheet: Twitter is the mouthpiece, Facebook the autobahn on which Trump drives to victory. He operates with vast amounts of data about every single citizen, drawn from the social media giants, from advertising companies, from voter registers and other data collections.

"You put that into a machine and begin to learn how people react in different areas. That is then personalized. These" hard IDs ", these hard identities are matched with phone numbers, e-mail addresses and everything. If all those pieces are put together, you have your audience. You can say, I want to find anyone in this part of Ohio who believes the wall needs to be built. "

One click. Fed in as a table via Facebook's interfaces. Parscale: "Facebook's job is then to scrape all this together and layer it on top of each other. Then you have a little button on your computer: 'This audience'. You use it for all of your advertising."
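Mechanically, "feeding in a table" means uploading a customer list that the platform matches against its own users. Facebook's custom-audience upload, like comparable ad products, expects identifiers to be normalized and SHA-256 hashed before matching. A minimal sketch with invented records and no actual upload call:

```python
import hashlib

def normalize_email(email: str) -> str:
    return email.strip().lower()

def normalize_phone(phone: str) -> str:
    # Keep digits only; real uploads expect full international numbers.
    return "".join(ch for ch in phone if ch.isdigit())

def sha256(value: str) -> str:
    return hashlib.sha256(value.encode("utf-8")).hexdigest()

# Invented voter-file rows: the "hard IDs" are hashed before upload,
# and the platform matches the hashes against its own accounts.
voters = [
    {"email": " Jane.Doe@Example.com ", "phone": "+1 (614) 555-0100"},
    {"email": "john@example.org", "phone": "614 555 0199"},
]

audience = [
    {"email_hash": sha256(normalize_email(v["email"])),
     "phone_hash": sha256(normalize_phone(v["phone"]))}
    for v in voters
]
for row in audience:
    print(row)
```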

First celebrated as a star, then at the center of a data scandal: the managing director of "Cambridge Analytica", Alexander Nix. (AFP / PATRICIA DE MELO MOREIRA)

"Microtargeting" based on a psychological model

Alexander Nix, CEO of Cambridge Analytica, steps onto the huge stage of the "Concordia Summit" in New York in 2016, a rendezvous of politics and business.

"Ladies and gentlemen, dear colleagues. It is an honor for me to speak to you today about the power of data and psychography in elections."

Nix speaks of a global revolution. His company runs a huge database in which 4,000 to 5,000 data points on every adult American can be called up: age, gender, place of residence, lifestyle, attitudes.

"Which car you drive, which products you buy, which magazines you read, which golf club you belong to, which church you go to. And of course personality and behavioral data."

He presents himself as the rock star of microtargeting: "Today we no longer have to guess which creative solution might work. We can use hundreds or thousands of individual data points about our target groups to understand exactly which messages will resonate with which audiences before the creative process even begins."

Initially, social media were seen as the new spearhead of freedom: platforms on which billions of people could step out into the world and form opposition movements against dictators. Today they stand accused of helping paying customers gain control over millions of minds. Critics speak of "cognitive hacking": digital brainwashing, based on huge data resources updated in real time and fueled by artificial intelligence.

This peculiar folk psychology rests on a five-factor model from personality research, often referred to by the English acronym OCEAN. The "Big Five" are the personality dimensions of openness, conscientiousness, extraversion, agreeableness and neuroticism.


Highly explosive scientific knowledge

In the spring of 2013, an article appeared in the Proceedings of the National Academy of Sciences of the United States of America entitled "Private traits and attributes are predictable from digital records of human behavior".

The authors: Michal Kosinski, researcher at the Psychometrics Centre at the University of Cambridge, together with his Cambridge colleague David Stillwell and Thore Graepel of Microsoft Research.

"This study shows the extent to which relatively simple digital recordings of human behavior can be used to automatically and accurately estimate a wide range of personal attributes that people commonly consider private."

The authors had discovered that a handful of Facebook likes is enough to predict many, sometimes very intimate, personality traits: from intelligence to drug use, from political attitudes to sexual orientation. Their tool: the OCEAN model. Their material: personality profiles and Facebook data from 180,000 users. The scientists were well aware of how explosive this was. They warned of "considerable negative effects" if, quote, "commercial companies, government institutions or even Facebook friends" were to use such methods:

"One can imagine situations in which such predictions, even if incorrect, could endanger a person's well-being, freedom or even life."

"Go to jail!" Protest poster on the front door of the Cambridge Analytica company in London (imago / Stephen Chung)

Cambridge Analytica data scandal

Shortly afterwards, Alexander Nix signed a contract with Aleksandr Kogan, then a psychology lecturer at Cambridge University. Kogan had hoarded personality data through a Facebook app called "This Is Your Digital Life". A total of around 87 million Facebook users are said to have been affected. The data was sold to Strategic Communication Laboratories, SCL for short, the parent company of Cambridge Analytica. SCL, founded in 1990, claims to have influenced more than 100 elections in over 30 countries, from Argentina to the Philippines. Founder Nigel Oakes, an advertising man, described his method in the early years as follows: "We use the same techniques as Aristotle and Hitler. We appeal to people on an emotional level to get them to agree on a functional level."


Kogan later told a television station that everything was "super normal".

When the entanglements became known in 2018, Cambridge Analytica and SCL soon went bankrupt. Christopher Wylie, a former employee, acted as a whistleblower and shed light on many of the connections, including Kogan's ties to Russia: "Cambridge Analytica is the canary in the coal mine. We have to look at the digital echo chambers that are being used to generate algorithms that split American society."

Well-calculated appearance: Mark Zuckerberg in front of the US Congress (picture alliance / dpa / Alex Brandon)

Are social media platforms responsible for content?

High-tech giants like Facebook and Google spend millions to exert influence in Washington. They are fighting for the foundation of their business, the data of billions of people: their product. They fear more political oversight, possibly a breakup.

Facebook boss Mark Zuckerberg in particular is under fire. He has hired well-connected Republicans as lobbyists. He has even dined with Donald Trump in the White House.

In the US Congress, Zuckerberg and his colleagues have had to answer accusations that their algorithms are anything but transparent, that they shy away from responsibility for the messages they convey, and that foreign trolls and agents influence the US election campaign through their platforms. Recently, however, it has also become clear how much effort the Trump campaign put into targeting Black voters via Facebook in order to keep them from voting for the Democrats.

The Facebook founder is also increasingly criticized within his own company. At the beginning of August, thousands of employees expressed concern that the company's services, including WhatsApp and Instagram, could be misused to cast doubt on the result of the presidential election in November. Platforms now sometimes label false news. But many studies show that disinformation continues to grow, and not just in the USA. A former Facebook data scientist wrote:

"I know I have blood on my hands."

She listed a number of cases, from Azerbaijan to Bolivia, in which political parties and heads of government had tried to spread false information via Facebook, threaten opponents and mislead the public.

Proof of "non-authentic" accounts is tedious

Such "inauthentic activities", as the experts say, point to paid trolls and other forms of computerized propaganda from home and abroad - a growing problem for all social networks. Countless attempts to influence political attitudes and voting behavior through misinformation, twisted narratives and public opinion are well documented. For example that of the notorious "Internet Research Agency" in St. Petersburg. But the work in detail, reveals Oxford researcher Marcel Schliebs, is extremely arduous.

"On the one hand, there are, for example, properties of accounts that distribute content. So how many followers does an account have? When was it created? But also the immediate environment and topology of the networks in which they move. There are different actors between them big differences. Russian actors are known for having cultivated profiles in the most complex way over the years, in order to be able to generate large organic reach with them. For example, having Facebook or Twitter accounts tweeted or posted on completely non-political topics for years. And then at some point when they are needed or called into action, they start to get political. In the last few months, for example, we have seen China on the other side, which probably wanted to build up capacities very quickly in the context of Hong Kong or around the corona virus which has led China to use very primitive accounts lately, which are on Twitter almost all of them have zero or one follower and very few people follow themselves. "

These pseudo-Americans made in China often hew closely to cliché, according to Schliebs:

"They almost always follow Donald Trump, sometimes Bill Gates, Lady Gaga, Katy Perry and NASA, for example."

How effective are trolling and propaganda, really?

It is even more difficult for scientists to judge what effect all these deception maneuvers actually have on their audience. Another hurdle: the data are mostly owned by corporations or intelligence services, which often show little enthusiasm for independent research. And then there is an ethical dilemma:

"If we wanted to build the perfect research design that explored this problem - sort of a gold standard experiment, in a way - we would have to set up a perfect randomized experiment where a randomly selected segment of the population got some kind of misinformation, for example on their Facebook account , is displayed, we then, in an American context, match their Facebook account with their 'Voter Registration File' and then see whether we were able to influence their voting decision through the false information, but if we do what I just said "Let it pass in review, we notice that we have crossed virtually every legal and ethical red line that we as responsible scientists have, of course."

Methods à la Cambridge Analytica, so to speak, this time for a good cause? Unthinkable.

Renée DiResta, head of research at the Stanford Internet Observatory, says every successful foreign intervention starts from existing fractures and problems in the target society:

"Ultimately, the goal will be to undermine trust in the legitimacy of the elections. When it comes to information operations, one must note that these figures and their material are well received because they address existing social divisions, legitimate displeasure, and the low level of trust in institutions Government."


"Social Bots" - Can Disinformation Be Automated?

The question that makes many people nervous: To what extent will disinformation be automated in the future? Could improved artificial intelligence soon create genuinely clever, adaptive bots that also multiply rapidly? So far, fake identities have often been cobbled together from lists of names, stolen photos and minimal résumés. Marcel Schliebs says the problem is both overestimated and underestimated.

"I believe that human-operated, manually-operated accounts and automated, algorithmic accounts, which we also often refer to as bots, are usually in a synergy relationship."

On the one hand, bots have so far rarely been able to act truly creatively. On the other, their sheer mass is underestimated.

"The latest example: a case from China, where Twitter deleted a total of 23,000 accounts that regularly spread Chinese state propaganda on topics such as the coronavirus, but also Hong Kong and other topics of geopolitical relevance for China. But that's behind this network of 23,000 fake accounts There were still hundreds of thousands of automated accounts whose role was solely to further disseminate this man-made content and, for example, to retweet it. "

Facebook estimates that around five percent of its nearly three billion active users are fake, even though the company shuts down billions of fake accounts every year. A machine-learning system called "Deep Entity Classification" now handles much of the deletion. In March 2020, the company said the system had just shut down 6.6 billion accounts.

Trust in reliable information is deliberately destroyed

Kate Starbird, a researcher at the University of Washington, gave a lecture at Stanford in October 2019 entitled "Beyond Bots and Trolls":

"Disinformation reduces our ability to recognize what or whom we should still trust. Those who no longer know where reliable information can be found run the risk of losing their ability to act. We do not know what to do because we are no longer sure are what's going on. And so lose our ability to make decisions based on a knowledge of the world when we no longer trust our knowledge of this world. So disinformation is not simply wrong information. Sometimes it is not wrong at all. To be effective, such information layers truth on top of lies, often with a true or plausible core. They add new details and leave others out to form a specific narrative for a specific strategic goal. "

Starbird and her colleagues came across the topic by chance. They had actually been studying framing in the debate about police violence and "Black Lives Matter". But just as their study was finished, in October 2017, Twitter published a list of accounts from the troll factory in St. Petersburg: over 3,000 accounts that had sent around three million English-language tweets. Starbird clicked through the list and recognized a number of the actors. The team immediately got to work and discovered that trolls had tried to cheer on Black Lives Matter activists as well as their often racist opponents, like puppeteers. Then they began to analyze the content of the mountains of tweets:

"I can't really recommend this kind of work. It took me years, now I finally feel better. I finished it after a sabbatical. But it was really disorienting."

The QAnon movement spreads bizarre conspiracy myths (imago / Zuma Press)

"Ecosystem" of disinformation develops its own dynamic

Many on the team, she reports, were disturbed by how skillfully the trolls had managed to infiltrate and instrumentalize both sides of the debate through a mixture of facts, half-truths, exaggerations and deliberate lies. Like many researchers, Starbird speaks of an "ecosystem" of disinformation in which various actors work together:

"So it was not orchestrated. It happened organically, was only sometimes, when it seemed useful, strategically cultivated and reinforced by Russian and other outlets. But they did not create the conspiracy theories. The activities may have been loosely linked to state disinformation. What here." Happening: People in these online communities are now so embedded in these kinds of teachings or ways of thinking about the world that when they get new information, they see new events, they immediately begin to incorporate them into their guiding ideas Event a new theory based on these issues and take the event as further evidence of this great conspiracy of powerful people manipulating everything. On their behalf, the mainstream media also lie. "

Almost like a perpetual motion machine. Some of these constantly regenerated conspiracy patterns are picked up by online influencers, by established media and by political actors. Even the crudest stories, like that of QAnon, enjoy careers lasting years. "Q" is said to be an anonymous savior inside the Washington apparatus, helping Trump achieve ultimate victory against the "deep state" and the Democrats' pedophile rings. We are all vulnerable, says Kate Starbird.


Regulate political advertising on the internet like on TV and in the press?

Ultimately, committed appeals from scientists now argue, what is at stake is the survival of democracy in the digital age. MIT researchers Sinan Aral and Dean Eckles, for example, called in an article in Science for more research to save democracy. Their orientation, to put it bluntly: less bean counting, more attention to effects and counter-strategies.

"Gaining a scientific understanding of the effects of social media manipulation on elections is an important civic duty. Without this, democracies remain vulnerable. The sooner we start a public discussion about balancing privacy, freedom of speech and democracy, the sooner we can find a way around." recognize forward. "

So the demand to regulate the new social media, including from science, is growing louder. The thought leader Jaron Lanier already calls them "mass behavior change machines".

Nina Jankowicz, a fellow at the Woodrow Wilson International Center for Scholars, calls for new rules of the road for these media: "We really need to put some bumpers on this bowling alley before things get too out of hand."

Facebook and co., says the researcher, will never control themselves. That is the task of society and the state: "We will actually lose the information war if we continue on this path of politicizing common sense."

A start would be to subject the distributors of political advertising on the internet to the same standards as radio, television and print. There are bills to that effect in the US Congress. But so far they are just that: drafts.

"Trump's false allegations are more effective than all trolls"

Other scientists are concerned above all with the corrosive effects of radicalization: the complete loss of confidence in politics, the media and democracy. The lies of the elite, says Brendan Nyhan, professor of public policy at the University of Michigan, are more powerful than any troll.

"Anyone concerned about misinformation should worry most of all about those who come from our political elites and reach far more people. The President of the United States has made more than 9,000 false claims since he took office. Nobody is in the media The world is more present than him. The exposure to false claims by the President of the USA is orders of magnitude higher. "

Most recently, the corona pandemic created a tsunami of misinformation about the origin and spread of the virus, about countermeasures and vaccines. The new term: "infodemic". Many experts fear that in crisis situations rumors and false reports spread much faster than facts, and that democratic societies are gradually losing the common basis of empiricism and observation, and with it the basic consensus that supports them.


Some researchers are trying to take a closer look at the right-wing discourse networks in the USA: mainly conservatives, libertarians, Christian fundamentalists and white nationalists. The interrelationships are complex, the transitions fluid, the channels ever more numerous. New "micro-celebrities" keep emerging, some of them with enormous reach.

Social media algorithms "reward" extreme messages

Guillaume Chaslot used to work at YouTube, Google's video platform, to which 400 hours of new "content" are uploaded every minute, and on which a billion hours of material are consumed every day:

"I think most people don't realize how powerful AI - artificial intelligence - is today. We believe this is a thing of the future, maybe in a robot. In fact, there is AI around us for a long time. On YouTube, for example AI recommends over 70 percent of all things viewed. On Facebook we have AI on several levels. One, for example, curates our timeline. It doesn't show us everything that our friends like so much. It chooses. "

The decisive factor here: the engagement metrics. Watch time on YouTube; time spent, shares and likes on Facebook. Customers are to be emotionally absorbed, tied to the screen, and shown as many advertising messages as possible. The extreme is rewarded.

"That has side effects. A cult, for example, is by definition engaging. People in a cult spend a lot of time with their leader. This is the perfect model for Facebook. Facebook is designed in such a way that it will always favor cults - because they are really good at it We see how well conspiracy theories like QAnon, which are similar to sects in many ways, work on Facebook. Because they captivate people, generate lots of shares, likes and redirects. Even if they have nothing to do with reality. "

The Wall Street Journal reported on internal Facebook research from 2016, according to which almost two-thirds of all joins to extremist groups on Facebook came about after the in-house algorithm had recommended those groups.

"64 percent of all additions to extremist groups happened because of our recommendation tools."

Guillaume Chaslot: "I don't think these companies see democracy as a value, as something important. They are taking very small steps to protect it. Facebook, for example, has announced that it will no longer accept new political advertisements in the last week before the election. To me, that looks like a very, very small step towards protecting democracy."

Are the algorithms better today?

"The fundamental problem is not the sophistication of the algorithm, but the question: What is it optimized for? A very basic algorithm optimized for democracy, for example, would do a lot more for democracy than a very complex algorithm based on user participation is optimized. "

Both US election campaign camps rely on their own app (www.imago-images.de/AppStore)

US election campaigners rely on their own campaign apps

US election campaigners are now trying to emancipate themselves from the big data corporations, for example with campaign apps like "Trump 2020". The app asks users for name, telephone number, e-mail address and zip code, and as it is used it constantly generates new data of interest to the strategists. Again, the main point seems to be to find out as much as possible about users and their contacts, and to send them messages that are as effective and as tailored as possible. Joe Biden's app, "Team Joe", is by contrast much less hungry for data. The Trump app, writes Technology Review magazine, "has a much longer list of access requirements. It wants to read your contacts and requires permission to find out your location, to read the status and identity of your phone, to pair devices via Bluetooth, and to read, write or delete any SD cards in the device."

"Technology Review" reports that attempts to locate people using Bluetooth transmitters, hidden in front yard posters or at churches, are becoming more and more common. The new technical term: "Geo-Propaganda".

For its "culture war", its often religiously charged final struggle for power in the USA, the Trump campaign seems to be legitimate - also technically - every means. Again and again, their multipliers emphasize that November is really about everything, about God or the devil, fame or downfall, greatness or socialism, truth or lies.

The Republican YouTuber Charlie Kirk recently painted the consequences of a Trump defeat for his supporters, a kind of doomsday vision: Trump would be locked up, the churches closed, the accounts blocked. All upright Republicans would be fair game.

"If we lose, if the president loses, they will come for us all. They will take your children, they will show up everywhere."

Easy to manipulate: voting computer in a US polling station (AP)

Voting computers are extremely susceptible to manipulation

And then, amid all the software, there is another problem that is often forgotten: the hardware of the US election offices.

"I don't think I've ever run into an issue more difficult than the security and integrity of elections."

In August, at the Black Hat conference, a rendezvous of computer security experts in Las Vegas, held online this time, Matt Blaze, professor at Georgetown University, drew a devastating conclusion: the nearly 178,000 election offices in the United States use very different devices for registration, voting, counting and transmitting results. There are, says Blaze, "enormous attack surfaces".

"What they all have in common: Our confidence in the election result increasingly depends on our confidence in the integrity of the mechanisms we use to achieve it."

Especially since election results are sometimes extremely tight. In 2016, Donald Trump's lead over Hillary Clinton in the states of Michigan, Pennsylvania and Wisconsin was between 0.23 and 0.77 percent. Voting computers on which citizens cast their votes via touchscreen are considered particularly treacherous. Many are out of date and prone to bugs, hacks and viruses.

At the US hacker conference DEF CON, right next door, voting computers are subjected to a stress test every year in a "Voting Village". So far, Blaze reports, every device has been cracked.

"The people who spend money to influence how you vote are far more resourceful than those who try to make sure things are going well, that everyone has an opportunity to vote, and that the votes are counted correctly become."

Blaze's conclusion: computers can help, but voting must be done on paper. The ballot box is best. In a poll at the Black Hat conference, more than two-thirds of participants said any form of electronic voting was risky. Only disinformation, 71 percent thought, was even more dangerous.