Who takes ethical responsibility for social media influence?
November 30, 2017
There’s a storm brewing, and social media is at its eye. For a decade, businesses and politicians have increasingly used both Twitter and Facebook as convenient mouthpieces. Twitter’s values declare: “We believe in free expression and think every voice has the power to impact the world.” Facebook’s mission is to “bring the world closer together.” Unfortunately, their common quest to enable free speech in a connected world has recently come under scrutiny. Global influence is being brought to bear on domestic matters, and there is the strong possibility that voters in the 2016 presidential election were manipulated through these platforms.
Facebook was the first social media channel to be put under the spotlight. Its platform features an increasingly effective model for targeting specific groups, globally, with the right advertising message. There is uproar because foreign actors appear to have used this very system to influence opinion. Facebook founder Mark Zuckerberg asserted, in the days following the election, that there was no question of this happening — that ‘voters make decisions based on their lived experience’ — but eight months later, concrete evidence emerged of Russian activity on the platform. The criticism levelled at Facebook for allowing this to happen verges on the hypocritical. The ease of advertising has been its attraction to many. The rise of social media marketing has allowed brands to push the boundaries and experiment on their markets in real time. Until now, it seems that all this was okay in the public eye: innocent, positive, ‘good’ use of the medium. The realization that advertisers could have ulterior motives has been a rude awakening.
Facebook has been scrambling to keep pace with the reaction to its own revelation that thousands of ads were placed by profiles leading back to the Internet Research Agency based in St Petersburg, a source known to have promoted pro-Kremlin propaganda. Observers have found the explanations given by Facebook to be wanting: its credibility has been damaged, said Casey Newton in The Verge. The news that $100,000 had been invested by this advertiser seems trivial when set against the $90 million spent by the Trump campaign alone on digital advertising, but targeted social media promotion differs so much from traditional political advertising that its effect on the population may be disproportionate. The implications are serious. Our lived experience increasingly includes our social media interactions. Our decisions are influenced by what we read.
There is a significant difference between the treatment of commercial and political advertising, which raises questions of ethics and freedom of speech that neither Facebook nor the government can answer. Federal Election Commissioner Ellen L. Weintraub has called for regulations to increase transparency. But what is to be regulated? There are already some clear rules that have served us well for decades.
Commercial advertising has long-standing ethical guidelines and regulatory requirements which have been gradually adapted to work in the online space. The American Advertising Federation’s Institute for Advertising Ethics lays down eight crystal-clear principles for commercial advertisers. These require high ethical standards and the common objective of truth; insist on the ethical creation and dissemination of commercial information; and make the point that if consumers don’t know they are reading an ad, then the ad is unethical. We’re used to seeing ‘Advertisement’ printed in the header of a magazine page, and ‘Sponsored’ displayed in the corner of an ad on Facebook and other social news feeds, making users aware of paid content. This is important, because knowing that something is an advertisement changes its credibility to the reader.
The Federal Trade Commission has already published guidelines about ‘native advertising’, which closely resembles the material that surrounds it online: in short, ads that form a seamless part of our news feed on any social media channel. Seeing the ‘Sponsored’ label should make us more skeptical about the content we are viewing and the motivation of the advertiser.
Unfortunately, critical thinking on social media is not our greatest skill. As we scroll through the echo chamber of our closed group of like-minded friends, the ads that are targeted to our type of people reinforce our beliefs and prejudices. Our unconscious bias is fed by the confirmation that we’re in line with the community’s opinions. It is this tendency that makes native advertising so dangerous and so attractive to anyone who wants to influence thinking, whether they are selling running shoes, or spreading disinformation.
Facebook has made changes in response to the recent criticism. The Russian ads were placed across 470 fraudulent profiles and pages, and there has been a visible crackdown on fake accounts. Advertisers must now have an established profile and a month of activity before they can promote a post. This is welcome, as fake advertising has not been restricted to political matters. I have lost count of the number of ‘win an RV/ win a holiday’ posts that I’ve reported when friends fall for false promises, and like and share fake pages. My news feed will become cleaner as a result. It’s a start, but the efforts so far have been described by the Guardian’s Julia Carrie Wong as putting lipstick on a pig. Is this a fair assessment? It certainly puts the scale of the problem into focus.
In the management of commercial advertising, Facebook has been adhering to the letter and spirit of the existing regulations. However, these do not apply to political communications. The Federal Election Commission sets out requirements for disclaimers in political communications that are similar to those in commercial regulations. Political communications are assumed to come from or refer to a candidate or party. The Federal Communications Commission requires cable operators, satellite television (DBS) providers, and broadcast radio and satellite radio licensees to post their public and political files to the FCC’s online public inspection file database. Social media is not mentioned, nor is the practice of influencing and reinforcing bias, which sways public opinion without referring directly to a candidate. Anyone would think that this was a new phenomenon.
In the context of elections, it is.
Since Facebook and Twitter were first created, there have been only three presidential elections in the United States. Who could have predicted in 2008 that two young tech start-ups would have a significant influence on 2016’s voters? Even in 2012, there was little understanding of the behavioral changes that were already underway: Facebook’s global user base has doubled from one billion users to two billion since 2012.
Twitter’s active user base is smaller, currently estimated at 328 million, but its potential to harbor manipulative and fake accounts is greater because it allows a high degree of anonymity. Twitter is now facing scrutiny after evidence that its platform may have been used even more extensively than Facebook’s in the Russian influence campaign, according to the New York Times. Representatives of both Twitter and Facebook, alongside Google, which has attracted criticism for returning links to fake news in search results, testified before a Senate Intelligence Committee hearing in November.
Free speech is a democratic necessity and a human right, and social media is its facilitator. Facebook, Twitter and their ilk are the home of absolute free speech, unfettered except for the policing of abuse and the censoring of posts that violate societal norms. Commentators are divided over whether the value of a free speech arena is compromised by the ease with which bots and trolls are able to manipulate the system. Despite criticism over fake news and the current advertising and influence scandals, the digital giants are wary of actions that may open them up to accusations of bias. In terms of regulating content, they are damned if they do, and damned if they don’t.
Traditional media outlets are already clear on their responsibilities. Reuters aims for “independence, integrity and freedom from bias.” The New York Times delivers “content of the highest quality and integrity” to fulfill public trust. The editorial control that maintains these ethical standards is simply not part of the social media model.
Twitter has revealed very little about Russia’s use of its platform or the reach and impact of activity which is still ongoing. Facebook originally said it would not supply any details of the Russian ads under scrutiny. It’s probably more accurate to say that it was struggling to find the information. Facebook has since managed to collate details of more than 3,000 of the Russia-linked ads and will pass them to Congress. The platform is set up to maintain the confidentiality of commercial advertisers and to make the advertising process smooth and seamless without human intervention in approvals: Facebook does not keep reliable records of advertising activity, targeting, or reach. There is no provision to distinguish political advertising, still less the subtle, manipulative content intended to influence thinking and reinforce beliefs.
It seems that the influence of social media has been overlooked by regulators and providers alike. We are dealing with an unprecedented shift in the behavior of a society, not simply that of individuals, as a direct result of the development of social media. Whether this was foreseen by the founders is immaterial: they have created the monster and must now find a way to manage it.
Kate Baucherel BA(Hons) FCMA is a digital strategist specialising in emerging tech, particularly blockchain and distributed ledger technology. She is COO of City Web Consultants, working on the application of blockchain, AR/VR and machine learning, for blue chip clients in the UK and overseas. Kate’s first job was with an IBM business partner in Denver, back when the AS/400 was a really cool piece of hardware, and the World Wide Web didn’t exist. She has held senior technical and financial roles in businesses across multiple sectors and is a published author of non-fiction and sci-fi. Find out more at katebaucherel.com