The Real Reason Miriam Adelson Deserves a Wikipedia Page

A new AI program claims it can fix Wikipedia’s gender bias by identifying – and writing articles about – deserving female scientists. But does data contain its own bias?

Miriam Adelson and her husband. Credit: Moti Milrod

“The biggest problem with Wikipedia,” an avid editor involved with the online project once told me, “is that people write it and that people read it.” The belief that technology will help humanity overcome the inherent biases of the human mind is commonplace among the Wikipedia community. Last week, Wikipedia’s techno-idealists got a taste of the power artificial intelligence can have, when a tech firm called Primer launched a program that writes Wikipedia articles automatically, suggests timely updates to existing entries, and even identifies deserving ones that still need to be written.

The program, called Quicksilver, can help solve Wikipedia’s gender problem, the company claims, by finding female subjects worthy of Wikipedia articles but currently lacking them. As proof of the program’s ability, Primer created 100 sample articles, written entirely by the program, about female scientists it identified as appropriate. One of the first subjects it proposed: Miriam Adelson (née Farbstein), an Israeli-born physician who made a name for herself as an expert on addiction treatment and is the wife of the casino mogul and right-wing mega-donor Sheldon Adelson.

The fact that she was chosen highlights how the program can successfully identify women in the scientific world who may have been shunned by Wikipedia for political or social reasons that have more to do with their husbands than with their own professional work. At the same time, the choice of such a wealthy and powerful woman also shines a light on the limitations of big data in upending the existing social-power hierarchy.

Using machine learning, Quicksilver was “trained” on 30,000 existing Wikipedia articles about scientists. Through them, it “learned” how a Wikipedia article is structured and written. Perhaps more important, using that large body of Wikipedia articles as its data set, the program also deduced what scale and scope of academic work seem to justify giving a scientific figure a Wikipedia article.

After being fed more than 200,000 academic articles and given access to online news publications, Quicksilver quickly produced a list of 40,000 academics it deemed deserving of a Wikipedia article but who didn’t yet have one.

The question of who deserves an entry in Wikipedia is a tricky one – especially when it comes to living people. Formally, Wikipedia has what it calls the “general notability guidelines” – criteria, decided on by the community of editors, that govern who is sufficiently notable for an entry. Actors, artists, writers, CEOs – there are “GNGs” for almost every field and profession, and all have to meet Wikipedia’s “central notability criterion,” which was put in place to prevent the encyclopedia that can be written by anyone from turning into a platform for self-promotion. Who doesn’t want their very own Wikipedia page?

For example, though I write extensively about Wikipedia, not enough has been written about me or my work to make me “notable” enough to justify an article about me – per the GNG for journalists.

Feminists active on Wikipedia have long claimed that the online encyclopedia’s guidelines are biased against women and help preserve the gender imbalance embodied in the idea that history is literally “his story.” Feminist groups on Wikipedia like “Women in Red” also claim that the task of “de-gendering” knowledge is further hindered by the lack of female editors on the platform. This, they claim, creates a situation whereby Wikipedia’s community of volunteers, because it is male dominated, is less likely to write up articles about women or change GNG criteria to accommodate women – scientists and others – who have been pushed out of the history books.

Primer claims it can fix this by doing away with the human element. The bots, it suggests, being blind to gender, can judge scientists by objective criteria alone, unhindered by human prejudice.

The idea, however, that big data does not contain biases – a common assumption among computer scientists and the tech community – seems to be misplaced.

For example, in a number of widely reported cases, facial-recognition programs had a harder time recognizing black faces because the algorithms powering them were written by predominantly white engineers and trained largely on images of white faces.

Quicksilver also provides a good example of how data can preserve certain biases, intentionally or unintentionally: The initial data fed into the program was that of academics from the world of computer science, skewing the results in favor of that field from the outset. Moreover, a large number of those Quicksilver proposed for articles were American figures from the world of IT, suggesting that the initial data set provided by the San Francisco-based company reflected its own location as much as its engineers’ backgrounds.

Adelson, too, proves an interesting example. The “GNG” for academics in all fields, also called Wikipedia’s “prof test,” requires that a living scholar’s research have a “significant impact in their scholarly discipline [or] outside academia.”

It is doubtful Dr. Adelson’s academic work would meet Wikipedia’s criteria without the powerful soapbox provided through her marriage, if only because the research center that publishes her work, the Adelson Clinic, happens to be funded by her and her husband. There are thousands of scientists who publish extensively and still fail to meet Wikipedia’s notability guidelines – because they have received little press coverage, for example.

Moreover, though Quicksilver recognized Adelson for her own merits, Wikipedia’s community ultimately deemed her notable only because of her wider social significance.

“Miriam Adelson is an Israeli American philanthropist, doctor and political donor. She married Sheldon Adelson in 1991, and has since become a prominent Republican party donor. She is the current publisher of the [newspaper] Israel Hayom” – is how the current version of the Wikipedia article opens, with little else said about her academic work.

This was not the first time Miriam Adelson had a Wikipedia article written about her: In 2012, an article was proposed but quickly merged into that of her husband. A similar process appears to be occurring this time around as well, with successive edits casting her more and more in the political shadow of her husband – the two are now even pictured together.

The article written by Quicksilver for Dr. Adelson – and later uploaded to Wikipedia – reveals that though the data may be blind to her money and power, the knowledge it reflects is not. There are no technological solutions to social problems, and solving Wikipedia’s gender problems will take more than artificial intelligence.