Editor’s Note: This piece was originally intended to shine a spotlight on the chronic disinformation that plagues Wikipedia as a consequence of a practice called astroturfing. However, our research led us into the troubling waters of the opioid and heroin epidemic ravaging our communities. Just as we were discovering the extent to which OxyContin’s manufacturers had engaged in widespread astroturfing simply to hide the danger of their prescription painkiller, we were also coming to understand more fully the scope of a rampant addiction crisis that is claiming an American life once every 19 minutes. As we explore the connection between Wikipedia’s astroturfing problem and the opioid epidemic, the power of informational accuracy should become inescapably clear. Indeed, with the number of deaths due to heroin and opioid overdose increasing by an alarming 14% between 2013 and 2014, lives depend on this accuracy.
Wikipedia has an astroturf problem.
Boy, that sentiment takes me back.
When I was a kid and the Internet was still just some obscure military tech, the word “Astroturf” had a distinctly negative connotation. I was raised in Philadelphia, where we live and mostly die by our sports teams. Disappointment is part of our DNA.
For more than 30 years, our Phillies and Eagles played in a place called Veterans Stadium, a big, gray, concrete, urinal-cake-shaped monstrosity with a 700 level where bad things happened.
Well, bad things also happened on the field, and Astroturf was a big reason why. When the stadium was built in 1969, it became one of the early outdoor test subjects for artificial grass. It failed that test constantly and repeatedly for the better part of the next three decades. The synthetic lawn was reportedly riddled with uneven seams and—according to every outfielder who ever dove for a fly ball and every quarterback who ever had his face drilled into the ground—was hard as cement.
The Astroturf in Veterans Stadium was notorious for doling out injuries, often of the career-ending variety. But it was hard to get rid of. We were stuck with it. For some reason, Philly’s athletes endured Astroturf right into the start of the 21st Century.
Well, like the Astroturf that brought Veterans Stadium general disrepute for most of its years on this earth, there is an unnatural layer of synthetic material permeating the online knowledge-lode known as Wikipedia. The Internet’s most extensive and most-frequented source for information has its own Astroturf problem. And this layer of artifice may prove even more dangerous and difficult to uproot than the stuff once carpeting Veterans Stadium.
What is Internet Astroturfing?
So let’s start by addressing the obvious question.
According to The Guardian, astroturfing is an attempt to create the impression of grassroots support for a policy, individual, or product that probably otherwise lacks such support. The practice is considered inherently misleading and dishonest.
In other words, it’s a great way to control your online image if you are a politician whom most people dislike, a retail operation with a less-than-stellar record on labor rights, or a pharmaceutical firm whose clinical trials are under lock and key.
From Russia with Love
In 2012, Russian President Vladimir Putin provided a powerful demonstration of this practice in action. According to a story broken by prominent hacktivist group, Anonymous, Putin was the willing subject of widespread astroturfing when his administration dispatched and financially compensated a broad-based network of online users to both distribute messages supportive of his regime and to obscure or contradict criticism.
Operatives for a youth organization called Nashi were instructed, among other tasks, to leave pro-Putin remarks in the comment sections of articles otherwise critical of the Russian president, to “dislike” anti-regime videos on YouTube, and to engage in Web-based smear tactics against opposition leadership.
This is astroturfing at its most sinister, which is really the only way that the Putin regime does things.
Wikipedia’s Battle for Natural Grass
Perhaps less sinister but also potentially more dangerous is the astroturfing on Wikipedia.
In an essay titled “Selling Community: Corporate Media, Marketing and Blogging,” Gavin Stewart identifies Wikipedia as among the most problematic examples of astroturfing on public grounds, largely because such a wide cross-section of online users draw their information therefrom.
Stewart describes a project from 2007 called WikiScanner, in which a graduate student named Virgil Griffith dispatched a program capable of identifying the Internet protocol (IP) addresses behind anonymous edits on Wikipedia. When these IP addresses were traced back to corporations and politicians, three common editing patterns emerged:
- Wholesale removal of text critical of the editor;
- Replacement of negative or neutral adjectives with adjectives of relatively similar meaning but more positive connotation; and
- Insertion of negative information into the pages of political competitors.
Griffith cited a few specific examples of the astroturfing that his WikiScanner spotted, noting for instance that the CIA was responsible for adding large portions of content to its own page and that Congressmen were particularly prone to editing their own pages to remove campaign promises that they had failed to keep.
Griffith also pointed to Walmart’s page, which once stated that Walmart employees are paid roughly 20% less than those working for competitors. The page’s language had since been revised to state that the average pay at Walmart is double the minimum wage. Naturally, Griffith’s WikiScanner program determined that the edit came from a computer owned by Walmart itself.
In another instance, Griffith’s WikiScanner observed that a computer owned by somebody identified as a member of the “Democratic Party” was responsible for supplanting the word “popular” on Rush Limbaugh’s Wiki entry with the word “idiotic.” Of course, in this case, there is no evidence that the individual was paid to make the edit in question, nor that the edit in question might not be verifiable with some additional research. But it does underscore the issue that one’s editorial motives on Wikipedia may not always be pure. Certainly, the editing party’s affiliation is of consequence.
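Griffith’s basic method is simple enough to sketch in a few lines. Anonymous Wikipedia edits are logged under the editor’s IP address, so attribution reduces to checking each IP against known organizational address ranges. In the sketch below, the organizations, CIDR blocks, and edit summaries are all invented for illustration; the real WikiScanner drew its mappings from public IP-to-organization databases.

```python
import ipaddress

# Hypothetical organization address ranges. WikiScanner used real
# IP-to-organization databases; these CIDR blocks are made up.
ORG_RANGES = {
    "ExampleCorp": ipaddress.ip_network("198.51.100.0/24"),
    "ExamplePAC": ipaddress.ip_network("203.0.113.0/24"),
}

def attribute_edit(ip_string):
    """Return the organization whose range contains this IP, if any."""
    ip = ipaddress.ip_address(ip_string)
    for org, network in ORG_RANGES.items():
        if ip in network:
            return org
    return None

# Anonymous edits are recorded under the editor's IP address,
# so a revision list like this is enough to attempt attribution.
anonymous_edits = [
    ("198.51.100.7", "removed paragraph on labor dispute"),
    ("192.0.2.44", "fixed typo"),
    ("203.0.113.200", "inserted negative claim on rival's page"),
]

for ip, summary in anonymous_edits:
    org = attribute_edit(ip)
    if org:
        print(f"{ip} ({org}): {summary}")
```

The whole trick, in other words, is a lookup table: the edits were never truly anonymous, only unexamined.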
None of this should be taken to suggest that Wikipedia is a static organization that passively allows its content to be distorted. The stated goal of the site, which came to life in 2001, is to facilitate the continuing development of a compendium of human knowledge that is at once entirely democratic and governed by some degree of quality control. To that end, the site’s Terms of Use expressly prohibit behaviors including:
- Intentionally or knowingly posting content that constitutes libel or defamation;
- With the intent to deceive, posting content that is false or inaccurate;
- Attempting to impersonate another user or individual, misrepresenting your affiliation with any individual or entity, or using the username of another user with the intent to deceive; and
- Engaging in fraud.
In a 2013 press release, Wikipedia reaffirmed this position while announcing that it had banned more than 200 users who were guilty of manipulating the site’s content in exchange for monetary compensation. Wikimedia Foundation Executive Director Sue Gardner said:
“We consider it a ‘black hat’ practice. Paid advocacy editing violates the core principles that have made Wikipedia so valuable for so many people. What is clear to everyone is that all material on Wikipedia needs to adhere to Wikipedia’s editorial policies, including those on neutrality and verifiability.”
The Smoking Gun
Still, none of this makes the work of detecting such practices any easier. This was amply demonstrated in 2012, when volunteer editor DocTree was tapped to review the veracity of the page for a company called CyberSafe. Though DocTree’s area of expertise was ornithology, the birder agreed to assist the Wiki community as it targeted entries for possible deletion.
CyberSafe’s page was one such entry. As DocTree reviewed the page, he found that most of its links were to sources of only limited relevance, and that there was, in fact, little on the page that was verifiable. To DocTree, this suggested a page that might not have been created with the most dutiful attention to truth and objectivity. It did not, however, immediately imply some large-scale information conspiracy.
But with the page earmarked for elimination, DocTree noted something peculiar. Suddenly, multiple users had emerged to defend red-flagged passages by way of the “Talk” links found in the page’s “View History” tab (more on said tab later). It occurred to DocTree that for a bunch of random concerned citizens, these “users” were strangely inclined toward similar phrasing and language. What began as five such users soon revealed itself as a small mountain of user accounts. According to The Daily Dot, DocTree had inadvertently stumbled on a hornet’s nest, buzzing with 323 confirmed user accounts and another 84 under suspicion.
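The tell that tipped off DocTree, many supposedly independent accounts using strangely similar phrasing, can itself be approximated programmatically. Below is a minimal sketch that scores pairs of talk-page comments by word overlap (Jaccard similarity) and flags pairs above a threshold. The usernames, comments, and threshold are all hypothetical; real sockpuppet investigations lean on much richer signals, such as edit timing and IP evidence.

```python
def word_set(text):
    """Reduce a comment to its set of lowercase words."""
    return set(text.lower().split())

def jaccard(a, b):
    """Word-set overlap: |A intersect B| / |A union B|."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

# Hypothetical talk-page comments; the near-identical phrasing of the
# first two mimics what gave away the CyberSafe defenders.
comments = {
    "UserA": "this page meets notability guidelines and sources are reliable",
    "UserB": "this page meets notability guidelines and the sources are reliable",
    "UserC": "looks like spam to me, few of these links are relevant",
}

THRESHOLD = 0.6  # arbitrary cutoff for "suspiciously similar"
suspicious_pairs = []
names = sorted(comments)
for i, u in enumerate(names):
    for v in names[i + 1:]:
        score = jaccard(word_set(comments[u]), word_set(comments[v]))
        if score >= THRESHOLD:
            suspicious_pairs.append((u, v, round(score, 2)))

print(suspicious_pairs)
```

One flagged pair proves nothing on its own; it was the sheer volume of such matches, hundreds of accounts, that turned DocTree’s hunch into a case.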
At the end of its investigation in November of 2013, the Wikimedia Foundation issued a cease and desist to a company called Wiki-PR. Until that point, the PR firm had operated largely out in the open. Indeed, even at the time of this writing, the company’s website identifies it as “the easy way to accurately tell your story on Wikipedia.”
The cease and desist targeting Wiki-PR was part of a far-reaching crackdown on distorted user content. Through the reconnaissance of its volunteer editor community, Wikimedia was able to confirm that Wiki-PR had created more than 300 editor accounts to dispatch its commercially-biased content.
By the by, there’s a Wikipedia page that you can visit if you want to level the claim that somebody is using multiple Wiki user accounts for improper purposes. The page explains that the practice of using multiple online identities for the purposes of deception is called “sockpuppeting” (which I feel compelled to mention here because it’s really funny).
So yeah, technically Wiki-PR was guilty of widespread sockpuppeting for the purposes of astroturfing Wikipedia’s content. (Just for fun, jump into a hot tub time machine and attempt explaining that sentence to somebody from the 1980s.)
For more on sockpuppeting, check out Wikipedia’s own extremely lengthy rap sheet on the serial sockpuppeteer who first appeared under the username Bambifan101. Capsule review: Bambifan101 uses more than 200 different user account names to torture Wiki writers and editors, specifically on pages pertaining to Disney, juvenile, and tween media content. Hilarity ensues.
Both the Wiki-PR and Bambifan101 cases reveal just how difficult and labor-intensive it is for the Wiki community to effectively police its content. Though IP addresses make it more than feasible to track down supposedly anonymous users, the sheer scope of this undertaking presents a considerable logistical challenge. According to its own stats, Wikipedia is host to well over 5 million entries, with roughly 1,100 new pages created and another 1,000 deleted every single day.
Whether it is for the purposes of economic gain as with Wiki-PR, or for the simple and perverse pleasure of being a nuisance, as with Bambifan101, sockpuppeting is the low-grade polymer at the seams of Wiki’s Astroturf. (Sorry for that sentence but this whole online deception business is lousy with mixed metaphors.)
Bambifan101’s work is readily detectable. He or she has a trademark tendency to engage in rancorous, argumentative, grammar-butchering and typo-ridden discourse within a specific subject area. One could argue that the obvious connective tissue between Bambifan101’s many sockpuppet accounts is the user’s desire to defy Wiki’s enforcement arm. It ultimately reveals that anyone with more insidious intent would have no problem creating new accounts and planting new Astroturf seeds. Those with money to gain (and spend), even less so.
With a little subterfuge, users like Wiki-PR and Bambifan101 can readily work around their respective banishments (and probably do), meaning the gatekeepers of Wikipedia must remain on high alert.
So Much For Good Intentions
Putting aside the basic fact that pulling up existing Astroturf and tracking the activities of its sockpuppet groundskeepers are both logistically challenging undertakings, there is also a philosophical discussion to be had on the subject.
Marketing and dissemination of truth are not inherently at cross purposes. Naturally, one would hope for the former to be substantially dependent on the latter. But truth can pose a genuine challenge to image-shaping. I’m pretty confident that if the Subway sandwich chain could expunge Jared Fogle from their Wiki, they’d do it faster than you can say five-dollar footlong.
But that jack is not going back in the box.
Obviously, no amount of editing is going to make that information disappear. But let’s say your negative press is a little subtler. Maybe the health inspector paid a visit to your restaurant the day you ran out of hairnets and your not-so-great inspection grade seeped onto your Wiki page and ballooned into an entire paragraph about how gross your place is.
Firms like Wiki-PR are paid to load your page with positive information and, where possible, to suppress the less-than-favorable stuff. A few sensible revisions, an added paragraph here, a deleted adjective there, and your hairnet scandal could be disappeared beneath a heap of actual positive health inspection results.
Many of Wiki-PR’s client companies claim to have been misled by the firm about the legal particulars. For most, the service was merely a fast and affordable way to create a company Wiki page and to ensure its maintenance. On the surface, this isn’t all that illicit.
And Wikimedia acknowledges as much, even going so far as to encourage companies to take part in monitoring their own pages for factual inaccuracies. But Wiki also concedes that this presents a bit of a dilemma. Naturally, no company is inherently unbiased or objective in the way it presents itself.
To this extent, Wikipedia amended its policy somewhat in 2014, establishing a rule requiring that content editors disclose whether or not they are being paid for their efforts. This did not materially change the consequences facing those guilty of astroturfing: banishment is still Wikipedia’s brand of capital punishment.
The greater point of the new policy was both to put sockpuppets on notice that their actions would be closely monitored and to attempt to create a context in which commercial entities could rightfully participate in the process of formulating and protecting their own information.
There remains a massive gray area—one which Wikipedia actively acknowledges in its FAQ on Paid Contributions Without Disclosure page. For instance, if you work for a university and are paid as a professor, but occasionally like to dip into the school’s page and voluntarily address gaps or inaccuracies thereupon, all good. As long as you aren’t being paid to do exactly that, you don’t even have to disclose it.
If you work for the university’s PR department, you’re on the honor system to disclose your role.
Regardless of its effectiveness, the call for disclosure is at least partially a recognition by Wikipedia that it simply can’t control its content without some level of voluntary cooperation from the commercial sphere.
To this extent, founder Jimmy Wales has said that he wants companies to participate in controlling the information that pertains to them. Indeed, this input should be seen as essential to getting the facts right. But how this input is contributed is of critical importance. Wales said in 2012 that “I am opposed to people who are paid advocates being allowed to edit in article space at all, and extremely supportive of paid advocates being given other helpful paths to assist in our work usefully and ethically.”
Any commercial entity that wishes to change its own content must abide by certain conditions. Specifically, the preferred way to correct the record on your page is to reach out to a Wikipedia editor to discuss the desired revisions. If you feel you absolutely must do it yourself, you should plan on disclosing your role as the editor and your relationship to the company in question, including any payment arrangement that might underlie your motive to edit.
Wikipedia, Opiate of the Masses
Of course, these conditions really only apply when the party in question wishes to behave honestly. We can agree that this probably isn’t always the case. It is safe to assume, for instance, that Putin’s astroturfing campaign was not an accidental misreading of anybody’s Terms of Use.
But as bad intentions go, Putin’s was merely to shape public opinion. If it is at all possible to convince you that Vladimir Putin is a nice guy, then here’s a campaign to do it.
In a sense, this type of disinformation isn’t really even that harmful insofar as it doesn’t substantially change behavior, even when it does a good job of distorting reality. It’s not like you could have done anything to stop Putin from committing whatever heinous acts he intended to suppress.
But what about the kind of information that actually influences behavior? What about the stuff you’d have no way of knowing, or informing yourself about, or formulating an opinion upon, without a little bit of online research?
Look, it should go without saying that one ought not to make important health decisions based on information acquired through Wikipedia. It should go without saying that real medical advice can only come from consultation with a physician whom you know and trust. It should go without saying that there is more to diagnosing your mysterious rash than just Googling it.
All of this should go without saying, but it doesn’t.
People most definitely Google and Wiki in search of medical information under the assumption that it beats calling for an appointment, getting in a car, making a copay, or incurring an unforeseen out-of-pocket expense. And sure, this could incline us to digress into a far broader discussion about our generally broken healthcare system. But that’s neither here nor there.
Here, under the auspices of confirmation bias, the reluctant patient will seek out information that best confirms the narrative he or she has already begun to formulate.
Wikipedia is especially excellent for rewarding confirmation bias so long as one chooses to read it the right way. Drug company astroturfers have proven particularly adept at exploiting this excellence. Again, it isn’t easy to make something like Jared Fogle go away if you’re living through a PR nightmare at a particular hoagie chain. On the other hand, it isn’t quite as hard, if you’re a pharmaceutical company, to suppress unfavorable clinical trials, to obscure lobbying efforts aimed at influencing medical professionals, or to downplay certain side effects of a given drug.
In essence, it isn’t quite so hard to disseminate disinformation with the capacity to influence public health decisions. Consequently, it is possible to use Wikipedia to encourage dangerous, even downright deadly behavior just to make a buck (or a few billion).
Drug company Purdue Pharma could tell you all about it (and I’m guessing there are some court-sealed depositions out there in which they actually do).
The precipitous rise in the prescription of opioid painkillers connects irrefutably to increases in the occurrence of opioid addictions, opioid-related overdoses, and consequent heroin addiction and overdose. An article in The Huffington Post reports an incredible acceleration in the number of painkiller prescriptions written, from 76 million in 1991 to 207 million in 2013. In 2014 alone, OxyContin and other opioid-based painkillers earned their parent companies something in the range of $9 billion.
As these drugs have risen in popularity and visibility, their risks have become increasingly apparent. The Centers for Disease Control (CDC) notes that 2014 saw the highest number of fatal prescription drug overdoses on record at 19,000. Fatal heroin overdoses, at 10,500, had more than tripled since 2010. As the maker of OxyContin, Purdue Pharma agreed in 2007 to pay $600 million in fines for misleading the public about the highly addictive drug’s associated risks.
As it happens, the CDC has been mounting its own informational campaign aimed at better informing the public of the risks associated with opioid prescriptions. This January, the agency had planned to finalize new prescribing guidelines for painkillers that, while not legally binding, would have substantially curbed the laxness with which physicians currently prescribe OxyContin, Vicodin, and other painkillers whose users are highly prone to addiction, abuse, and accidental overdose.
The CDC’s efforts were aggressively derailed by groups with names like the U.S. Pain Foundation and the American Academy of Pain, which both sound either like totally legitimate advocacy groups or like villainous pro-wrestling coalitions. They are neither. They are organizations funded entirely by the pharmaceutical industry in order to create both legal and rhetorical obstacles to the CDC’s end goal, which is to save lives from addiction and accidental death.
The above-mentioned groups are actually part of a broader network called the Pain Care Forum, which enjoys direct funding from Purdue Pharma and its industry counterparts and which succeeded in producing what it called its own “consensus guidelines.” The basic gist of these guidelines was that placing any additional barriers on the acquisition of opioids would both stigmatize patients and prevent them from accessing necessary pain treatment.
So effective was the drug industry’s real-world astroturfing campaign that it compelled the FDA to intervene before the CDC could finalize its guidelines. The CDC ultimately retooled its approach, releasing new guidelines just this past March. But the mere fact of their delay underscores the drug industry’s powerful capacity to produce high-stakes consequences through disinformation.
Now, in light of these facts, take a good look at Purdue Pharma’s Wikipedia entry. At first glance, we can’t necessarily say for certain that astroturfing has occurred here (which kind of underscores one of the biggest challenges in preventing it).
Still, it defies likelihood that it hasn’t. So obviously, some suspicion is warranted here. To satisfy this suspicion, one need only click on the upper right hand tab that says “View History.” This gives you a backstage pass, as it were, to the goings on betwixt those who have edited the entry and those who provide oversight of said editors.
A little digging makes it quite clear that Purdue Pharma’s entry has been the subject of some editorial wrangling. In reality, there are numerous instances that demonstrate beyond a reasonable doubt that both sockpuppeting and astroturfing have occurred, including evidence that now-banned user accounts were used to make changes that have since been reversed.
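This kind of digging can also be scripted. The MediaWiki API behind every Wikipedia page exposes the same revision history as the “View History” tab. The sketch below parses a payload in the shape of that API’s response and flags edits made by accounts on a watchlist; the revision data and the watchlist here are invented for illustration, not drawn from the real page history.

```python
import json

# A live query against the MediaWiki API would look something like:
#   https://en.wikipedia.org/w/api.php?action=query&prop=revisions
#       &titles=Purdue%20Pharma&rvprop=user|timestamp|comment&rvlimit=50&format=json
# Here we parse a canned payload in the same shape as that API's response.
# The revisions below are invented, not real page history.
payload = json.loads("""
{"query": {"pages": {"12345": {"title": "Purdue Pharma", "revisions": [
  {"user": "Nancy0515", "timestamp": "2011-04-12T15:02:00Z",
   "comment": "expanded Controversy section"},
  {"user": "203.0.113.9", "timestamp": "2011-03-03T09:11:00Z",
   "comment": "rephrased controversy wording"},
  {"user": "DocTree", "timestamp": "2012-07-01T12:00:00Z",
   "comment": "reverted promotional language"}
]}}}}
""")

WATCHLIST = {"Nancy0515"}  # hypothetical list of since-banned accounts

flagged = []
for page in payload["query"]["pages"].values():
    for rev in page["revisions"]:
        marker = "  <-- watchlisted account" if rev["user"] in WATCHLIST else ""
        print(f'{rev["timestamp"]}  {rev["user"]}: {rev["comment"]}{marker}')
        if rev["user"] in WATCHLIST:
            flagged.append(rev["user"])
```

Note how anonymous editors appear in the history as bare IP addresses, which is precisely what made the WikiScanner-style attribution described earlier possible.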
For instance, we can see that, in April of 2011, a user by the name of Nancy0515 played fast and loose while editing the ‘Controversy’ subsection. Indeed, checking in on the page’s history, one finds this incredible paragraph, which was substantially altered by Wikipedia shortly after its insertion:
Even though OxyContin has a high risk of abuse, the patient is fully warned about these side effects. Those that choose to use them in illicit ways know what they are getting themselves into and continue down that path.… Joe Levy of Apopka said he took 240 milligrams of OxyContin for three months and did not become addicted. “I don’t know how I would have survived without it,” he said. “Please don’t intimidate the doctors, to a certain point, from prescribing OxyContin and make people suffer needlessly.”
“Please be aware valid pain patients such as myself have great concern,” said Fred Brown, who said he has used OxyContin and other therapies to treat pain caused by failed back surgeries.
While Joe Levy from Apopka (whoever and wherever that is) has nice things to say about OxyContin, it isn’t entirely clear why his opinion belongs in a Wikipedia entry.
Wikipedia wasn’t particularly convinced that his opinion mattered either. In the “Talk” tab of the page’s edit history, a Wiki editor has written that the “Entire paragraph reads like it was written by a Purdue Pharma employee.”
Quite so. This explains why every word of this has since been removed. But what becomes evident with even closer inspection is that traces of Astroturf are frequently left behind in the fight to plant real grass. Sentences clearly authored by Purdue Pharma peek their heads out intermittently through the entire entry, like guilty little opiate-addicted prairie dogs.
Behold the three paragraphs under the subheading “Controversy.” The section includes what appears to be fair, even-handed, and well-sourced reporting on the risks of addiction associated with OxyContin. Conspicuously, one paragraph ends with what would otherwise appear to be a non sequitur. It indicates that, “Nevertheless, strong analgesic drugs remain indispensable to patients suffering from severe acute and cancer pain.”
True enough. And yet, what is it doing in the “Controversy” section of the entry? It’s almost as if an editor here had a particular interest in undercutting the concerns outlined as controversial. Also notable, the citation for this claim leads to a World Health Organization page from 1990. That the source was authored six years before the advent of OxyContin would seem to call its legitimacy and relevance into question.
Dig into the history for this section and you’ll find that, in March of 2011, a user working anonymously and identified only by IP address edited the following passage: “Even though the drugs have helped numerous patients who require it alleviate pain successfully, the negative impact it causes are the main argument behind the controversy behind the company’s medication production.”
When the editor was done, the passage read, “Nevertheless, strong analgesic drugs remain indispensable to patients suffering from severe pain.” (The reference to cancer was added in a later editorial entry.)
The very same edit struck this sentence entirely from the page: “OxyContin abuse is a large problem in this country. Over the past five years, opiate abuse has become a national epidemic, especially increasing among street and recreational users.”
These edits remain very much in place today.
The very next subsection, “OxyContin-related lawsuits,” once again seems to be fair, objective, and properly-sourced in its reporting of various legal cases against Purdue. However, at the end of the first paragraph in this section, there is a distinctly out-of-place sentence which is otherwise enveloped by a preponderance of evidence against the company. It states that “the company has since implemented a comprehensive program designed to assist in detection of the illegal trafficking and abuse of prescription drugs without compromising patient access to proper pain control.”
The positively spun claim earned the dreaded “Citation Needed” stamp from Wiki editors.
As a point of fact, this section points directly to a 2014 lawsuit, still pending, in which two California counties—Orange and Santa Clara—are claiming damages against nine major pharmaceutical firms for over-marketing prescription painkillers and using deceptive tactics to aid in their proliferation.
Among the suit’s claims is that “Defendants created campaigns—including literature, websites, community groups, and programs—related to chronic non-cancer pain from illnesses such as low back pain, shingles, migraines, osteoarthritis, phantom limb pain, fibromyalgia and multiple sclerosis.”
This lawsuit implicates misuse of outlets like Wikipedia and may shine the light of day even more harshly on questionable insertions like the above-noted from Purdue Pharma’s entry. But it also happens to be part of a much longer list of astroturfing offenses—both virtual and corporeal—in which physicians on drug company payrolls produced supportive journal studies, served sympathetically on medical boards, and even developed some of the tools by which prescribing doctors currently evaluate patient needs.
Returning to the idea of confirmation bias, those isolated Astroturf edits on Wikipedia suddenly become the missing link between fact and fiction. They add the stamp of Wiki-thority to an idea that seems true enough based on your preliminary reading.
This underscores the greater danger of astroturfing on Wikipedia. In a respect that cannot be overstated, this type of obfuscation threatens its very existence. In order for Wikipedia to be more than the sum of the infinite junk sites that now saturate the Web (and which haunt Wiki’s citations), it must learn to defend itself against the subversion of truth. But as persistent pests like Bambifan101 demonstrate, those with the will and motive to remain active and one step ahead of editors will find a way to do so.
What are We Really Looking At?
Now, with everything you read and do on the web, you have to carefully weigh the relationship between truth and money, dysfunctional as it is. Wikipedia is very much a demonstration of this admonition, even if it aspires not to be.
This aspiration is to its credit, and all evidence suggests that Wikipedia will always venture its best efforts at controlling the integrity of its content. But the open-source premise at its very heart guarantees that such efforts will also always be undermined by those with either the economic incentive or basic desire to do so.
So what to do as the end user? Wikipedia is an amazing way to settle a barstool debate. It certainly beats the imperfect old-fashioned method of attempting to yell louder than your rhetorical sparring partner. But one should approach with care and caution any subject of greater importance than that.
If you want to continue to enjoy Wikipedia, you have to know that you aren’t always playing on real grass. Verify its sources, back up its claims, or at least acknowledge its imperfection and its vulnerability to bias.
Forgive me for once again strolling Memory Lane—which is actually an industrial lot of decaying warehouses in South Philly—but I remember when they finally got rid of the Astroturf in Veterans Stadium. They tore it all up and replaced it with something called NexTurf. It was marginally less dangerous but ultimately, still fake. Two years later, inevitably, the City Fathers imploded the Vet and the fake grass that came with it.
Nobody missed it.
Here’s hoping that Wikipedia’s penchant for real grass ultimately grows over its Astroturf problem so that its implosion does not also become inevitable.