How can we best engage experts in the project of building and improving Wikipedia? This question has guided my work for many years, and the work of Wiki Strategies from the beginning. When the Wikimedia Foundation hired me in 2009, it was to take on that question in an academic context. We opened our project plan with these words:
Subject-matter experts have always been valued Wikipedia contributors, and a key goal of this initiative is to facilitate their collaboration with the Wikipedia editing community.
That project, the Wikipedia Public Policy Initiative, became the model for the Wikimedia movement’s formal engagement with academia.
The academic variety, of course, is only one kind of expertise. Practitioner expertise is also important: archivists and museum curators, for instance, have a great deal to contribute to Wikipedia. Expertise resulting from any of a variety of personal or professional activities is also valuable. In The Sponsored Point of View, a panel discussion I helped convene in 2012, experts from Consumer Reports, Wikipedia, ProPublica, and the Foundation for Integrity and Responsibility in Medicine addressed an issue we described as follows:
Hidden financial conflicts of interests create the possibility that scientific research, expert recommendations and journal articles are biased by secondary interests…
Expertise is not easily measured, and reasonable people might disagree about a certain person’s level of expertise on a topic, or about the significance of various perceived conflicts of interest (COI). But never mind the details: expertise exists, and it is important. The purpose of Wiki Strategies is to help subject matter experts, in academia and beyond, contribute ethically and effectively to Wikipedia.
As I will explore below, engaging with experts necessitates careful consideration of COI. As Wikipedians, we should not neglect the thinking around COI in other fields, and we should not limit our attention to those who engage in work on Wikipedia itself.
Experts are vital to Wikipedia’s success
Knowing a subject means more than just knowing facts. Expertise can and should guide crucial judgments about how the facts fit together, and about what facts have been sufficiently established. That’s why reporters seek out experts to interview, why courts of law involve expert witnesses, and why continuous improvement organizations like the National Quality Forum explicitly value expertise.
Wikipedia — which has, from the beginning, largely modeled its processes on those of longstanding institutions — values expert contributions no less than those institutions do. It would be a thin, uninteresting, and inaccurate encyclopedia if its articles were written exclusively by those without expertise — even though its many years as the world’s top source of curated information decisively prove that non-experts can play vital, game-changing roles as well.
But conflict of interest looms large
A core consideration with subject matter experts, however, is conflict of interest (COI). Frequently, those with the most expertise in a topic have close ties to the topic — financially, emotionally, or otherwise. So, when somebody has a COI in relation to a task, should that disqualify them from acting? Does their involvement irrevocably taint the integrity of that task?
Typically, no. In most cases, the key is to properly disclose the COI, and to manage it responsibly and transparently.
Organizations typically address COIs by establishing processes that mitigate the impact they might have on the integrity of their work. Much has been written, for instance, about how courts should handle expert witnesses’ COI (example). Big companies publish documents covering how to responsibly manage a conflict of interest. There are even jobs entirely devoted to managing COIs. A COI (whether actual or perceived) must be properly managed, and stakeholders should have insight into how it is managed.
The National Science Foundation, for example, lists several techniques that might be used to manage or eliminate a COI in its grant programs (under item 4):
- modification of a plan
- severance of relationships
But there’s a word for COIs that trump expertise
Of course — as the more drastic of the NSF’s options, like severance of relationships, suggest — there are COIs that go far beyond the ordinary, and that must be avoided altogether in order to protect the integrity of a project or an institution.
If a judge were to make a habit of golfing and making business deals with an attorney arguing in his court, for instance, he would have a hard time convincing the public of his integrity — no matter how diligent and principled his approach. When Dick Cheney was nominated for Vice President of the United States, many considered his deep ties to defense contractor Halliburton a disqualifying factor; while he arguably addressed that by assigning his future stock earnings to charity, I don’t remember anybody arguing that the COI could have been responsibly managed if he had taken no such action.
Such COIs are identified, in the kind of documents referenced above, as “unmanageable.” The prospect of an unmanageable COI may indeed disqualify somebody from taking a certain position.
I looked into a few definitions of unmanageable COI. Organizations considering the concept explicitly identify it as a rare case. It typically arises where an individual wields a great deal of influence, or where similarly unusual circumstances apply. The American Academy of Neurology, for instance, states that “disclosure is the appropriate remedy for mitigating most instances of conflict of interest.” The 2004 paper “Implementation of financial disclosure policies to manage conflicts of interest” studied the determinations of various COI committees, and found that only 2% of the cases considered were deemed unmanageable.
COIs on Wikipedia are manageable — if disclosed
In the context of Wikipedia, the most important way to manage a COI is for each editor to at least make a sincere effort to manage his or her own behavior. The purpose of a Wikipedia article is, generally speaking, not to favor one position over another, but to explore the shared factual foundation that underlies competing positions; and to represent the existence of competing views, with good references. When a Wikipedia article (or a key paragraph in an article) hits the mark, people of competing views are satisfied with it, and readers gain the information they need to begin forming an opinion of their own.
In order to participate in moving Wikipedia articles toward that ideal, one needs to have some self-awareness around a topic, and learn to refrain from pushing one’s own views too hard — and to seek agreement from people one might ordinarily regard as adversaries.
That’s what Wiki Strategies does: We help editors manage their own COI.
Of course, many editors have a hard time doing that. But when they have disclosed a COI, those around them can keep them in check. In an extreme case, an editor with a COI can be blocked entirely from participating in Wikipedia (or at least, from doing so within the rules — many blocked editors have tried to return under an anonymous account, though they often undermine themselves with their own patterns). But many less extreme measures are frequently employed. Those arguing with a COI editor might seek out colleagues with no particular stake in that topic to help sort through decisions; they might seek input from a WikiProject, or seek more formal mediation or arbitration; and the results of arbitration might include stronger measures like a “topic ban” or an “interaction ban.” One of the hallmarks of Wikipedia is that an intervention need not exceed the needs of a situation; the wide array of available interventions means that an appropriate resolution, one that permits expert input without allowing a COI to dominate a decision, is always available.
Wikipedians should always watch for COIs beyond Wikipedia’s own pages
It is a good thing that Wikipedia has long had a guideline around COI, and ample discussions about how to handle it. There are many cases where I disagree with other well-known figures in the Wikipedia space about specifics (as many of my blog posts will attest); but regardless, frequent and robust consideration of the issue is essential.
But one thing concerns me: debates around COI in relation to Wikipedia often seem to focus too heavily on the COI of Wikipedia editors. I think this happens as a rather natural result of an ever-increasing body of content, and an ever-shrinking body of committed editors to watch over articles; those trends can create a stressful environment, and it’s no wonder that we are sometimes cranky and quick to judge individual editors with a COI.
As our panelists discussed back in 2012 at the Sponsored Point of View event, COIs infect the source materials we draw from in many insidious ways. Sometimes, Wikipedians do a great job in their detective work, identify such COIs, and write better Wikipedia articles as a result.
And sometimes, the Wikipedians doing the best work in tracking down and correcting this kind of ingrained COI have COIs themselves.
For that reason, perhaps above all, we should always fight to stay true to our ideal of being “the encyclopedia anyone can edit,” and to find ways to embrace editors who happen to have a COI, provided they make the effort to manage it responsibly.