Twenty years ago, Wikipedia emerged as an interesting online project aimed at crowdsourcing and documenting all human knowledge and history in real time. Skeptics worried that much of the site would contain unreliable information and often pointed to errors.
Now, however, the online encyclopedia is often cited as a place that helps combat the false and misleading information that spreads elsewhere.
Last week, the Wikimedia Foundation, the group that manages Wikipedia, announced that Maryana Iskander, a social entrepreneur in South Africa who has worked in nonprofits for years fighting youth unemployment and advocating for women’s rights, will become its chief executive in January.
We spoke with her about her vision for the group and how the organization works to prevent false and misleading information on its sites and across the web.
Give us an idea of your direction and vision for Wikimedia, especially in such an information-packed and polarized world.
Wikimedia projects, including Wikipedia, rest on a few key principles that I think are important starting points. This is an online encyclopedia. It is not trying to be anything else. It is certainly not trying to be a traditional social media platform. It has a structure run by volunteer editors. And as you know, the foundation has no editorial control; this is a community-managed project that we support and enable.
The lessons to be learned, not only from what we do but from how we continue to iterate and evolve, begin with this radical idea of transparency. Everything on Wikipedia is cited. It is debated on our talk pages. So even when people hold different points of view, those debates are open and transparent to everyone, and in some cases they genuinely allow for the right kind of back and forth. I think that is needed in such a polarized society: you have to make room for back and forth. But how do you do that in a way that is transparent and ultimately produces a better product and better information?
And the last thing I will say is that this is a very humble and honest community. Looking ahead, how do we build on those qualities in terms of what this platform can continue to offer society and the free access to information it provides? How do we make sure we reach the full diversity of humanity, in who is invited to participate and who is written about? How do we really ensure that our collective efforts better reflect the Global South, more women, and the full diversity of human knowledge?
What is your view on how Wikipedia fits into the widespread problem of disinformation online?
Many of the key features of this platform are very different from those of traditional social media platforms. Take false information about Covid: the Wikimedia Foundation partnered with the World Health Organization, and a group of volunteers organized around what is called WikiProject Medicine, which focuses on creating medical articles and then watching them very carefully, because these are exactly the topics where you want to guard against misinformation.
Another example is the task force the foundation created, again trying to be very proactive ahead of the U.S. elections. [The task force supported 56,000 volunteer editors watching and monitoring key election pages.] The fact that there were only 33 reversions on the main U.S. election page was an example of how to focus on key topics where misinformation poses real risks.
Then there is another example that I think is really cool: a podcast called “The World According to Wikipedia.” In one of the episodes, a volunteer is interviewed who has made it his job to be one of the lead watchers of the climate change pages.
We have technology that alerts these editors whenever changes are made to any of the pages, so they can see what the changes are. There is even the option to temporarily lock a page if there is a risk of misinformation slipping in. No one wants to do that unless it is absolutely necessary. The climate change example is helpful because the talk pages behind those articles carry enormous controversy. Our editor says: “Let’s debate it. But this is a page I watch and follow closely.”
A big debate on social media platforms right now concerns the censorship of information. Some claim that biased views are amplified on these platforms while more conservative views are taken down. As you think about how to handle these debates once you lead Wikipedia, how will you make judgment calls, given your background?
For me, what is inspiring about this organization and these communities is the set of pillars established on day one, when Wikipedia was founded. One of them is the idea of presenting information from a neutral point of view, and that neutrality requires understanding all sides and all perspectives.
That goes back to what I said before: have the debates on the talk pages, but then come to an informed, documented, verifiable conclusion in the articles. I think that is a fundamental principle that could offer others something to learn from.
Since you come from a progressive organization that fights for women’s rights, how much have you thought about the weaponization of false information? Some might say your background could influence your judgment calls about what is allowed on Wikipedia.
I would say two things. The really relevant aspects of my past work are leading volunteer-driven movements, which is probably much harder than most people think, and playing a deeply operational role in figuring out how to build the systems, culture and processes that I think will be relevant for an organization and a set of communities trying to grow their scale and reach.
The second thing I will say is that I am on my own learning journey, and I invite you to join me on it. I choose to engage with others on the assumption of good will and to relate in respectful and civil ways. That does not mean others will do the same. But I think we should hold on to that as an aspiration and as a way to be the change we want to see in the world.
When I was in college, I did most of my research on Wikipedia, and some of my professors would say, “You know, that’s not a legitimate source.” But I used it constantly anyway. I was wondering whether you have any thoughts on that!
I think most professors now admit that they sneak onto Wikipedia to look things up themselves!
You know, this year we celebrate Wikipedia’s 20th anniversary. On the one hand, it was this thing people made fun of and said would never go anywhere. And now it has become arguably the most referenced resource in all of human history. I can tell you from my own conversations with academics that the narrative around citing and using Wikipedia has changed.