[00:00:02] Speaker A: Welcome to the 228th episode of the Atlas Society Asks. I'm Jag, the CEO of the Atlas Society. I'm very excited to welcome back Ashley Rindsberg, novelist, journalist, and author of The Gray Lady Winked: How the New York Times's Misreporting, Distortions and Fabrications Radically Alter History. He's recently turned his focus to Wikipedia as the latest victim in the far left's long march through the institutions. It's a process I've been dismayed to watch, among other reasons because Wikipedia's co-founder Jimmy Wales is someone I've long admired. So Ashley, welcome back.
[00:00:47] Speaker B: Thank you so much for having me. I appreciate it.
[00:00:50] Speaker A: Yes. And thank you for accommodating us. I know that you are over in the UK, so it's late there.
We are going to get into what's been going on at Wikipedia, the weirdness there. But first I wanted to take a moment just to catch up with you as it's been a hot minute. You first joined us back in February of 2022 and there's been a few things going on in the world and going on in your life since then. So last time we spoke you were living in Israel. Now you're joining us from London. Did you just move there or are you there temporarily? Catch us up.
[00:01:31] Speaker B: Yeah, we moved here in 2022. I'd been in Israel, my wife as well, for quite a long time, and it was sort of time for a change, especially post-Covid. So we are now UK-based, enjoying London, and also confronting some of the interesting challenges here regarding free speech and some of the changing political dynamics. So an interesting time to be alive, as they say.
[00:02:03] Speaker A: Yeah. So I've been following events in the UK on X, and it conveys a kind of dystopian picture of police persecuting ordinary citizens for wrongspeak on the Internet while actual violent murderers are let out of jail after ludicrously light sentences. Are we in the US getting the full picture, or is it being sensationalized by the hot takes we see online?
[00:02:33] Speaker B: Yeah, this is one of those topics that I really think is not being sensationalized. It is actually pretty reflective of the reality. With some of these cases you hear about, the high-profile people who are being arrested, or who get a knock on the door from the police, or when you see a clip of someone getting one of these home visits by the police to ask questions about something they posted online, you get the sense that maybe this is just isolated. There are a lot of people in the UK, and maybe a few of them have been incorrectly singled out. But more and more, what we're actually hearing about is regular people, and we're talking all walks of life, from blue-collar people right up to, as I heard recently, a highly trained medical professional, and an investor in another case, where these people are not just getting knocks on the door with a few intimidating questions from the police. They're actually being arrested, and they're being arrested for a tweet, for private Facebook messages to other people that were considered either hateful or inflammatory or inciting.
And this is now becoming sort of a norm, a new normal within the UK.
[00:03:51] Speaker A: Wow. Out of control, A new abnormal. Now, in our last interview, we focused on your book on the New York Times. You were pretty pessimistic about the prospects for any course correction at the paper. In the past, well, almost three years, have we seen any significant changes or has it been pretty much what you expected?
[00:04:15] Speaker B: Yeah, the big change at the Times is that they replaced their executive editor. The previous editor was, I would say, more on the harder progressive side of things and pushed the paper in that direction. Obviously that was part of the cultural context at that moment, where people were really interested in those stories. It seemed to be the way the culture was moving, and I think the Times was trying to capture that energy.
Since then, they've replaced him with a much more traditional editor who was a longtime reporter on, and in, China.
So that's kind of a shift away from some of the internal domestic stuff and more looking at the world more broadly.
And surprisingly, the world is not as interested in woke politics as the blue cities of the United States seem to have been, at least. So that's been a big shift, but he's got some significant challenges in really turning that ship around.
[00:05:22] Speaker A: Yeah. Now, with regard to some of the other papers, at least as far as the owners are concerned, both the Washington Post and the LA Times refused to endorse a presidential candidate in the past presidential election. And the latter's owner, Dr. Patrick Soon-Shiong, said, quote, if it's news, it should be just the facts, period. If it's an opinion, that's maybe an opinion of the news, and that's what I now call a voice. Also, we want voices from all sides to be heard, and we want the news to be just about the facts. So what do you make of that? Might there still be an opportunity for these other outlets to right their ships?
[00:06:06] Speaker B: I think the case with the LA Times might be a little bit different than the Washington Post. The LA Times, I think, also falls into this sort of hard, progressive, quite leftist take on affairs. And what we heard from the daughter of the paper's owner was that the reason they withheld the endorsement of Harris was not because they wanted to appear politically neutral, but because it was a protest vote, so to speak, against what they perceived to be Harris's support of Israel during the Gaza war. They were protesting that. So that actually came from the far left, if you read it that way, and if you believe what his daughter said in the media.
And it's possible it could go either way; it seems believable to me. But with the Washington Post, my read was that Bezos was really, actually, trying to stay out of politics. I think he got a sense of which way the wind was blowing, both politically and culturally, and decided the prospect of him, his business interests, and the newspaper being in Trump's crosshairs after the election was just not worth it. It's kind of like an asymmetric bet, putting out an endorsement that everybody expects.
There's very little upside there, but there's a whole lot of downside: he could have his businesses entangled in regulation, he could have the paper targeted, or, even more simply, the broader American public, the potential readership for the Washington Post, looks at the paper the way we look at a lot of these other papers and says, okay, these guys are in the tank, no big surprise there. So this might have been an attempt by Bezos to change the weather within the paper, just to reset things a little bit.
[00:08:02] Speaker A: Well, we wish him success. Now, sticking with Israel and Gaza for a second: of course, since we last spoke, a big development, tragically, was the horrific October 7th Hamas terrorist attacks, killing over 1,200 Israelis, the largest massacre of Jews since the Holocaust. Were you in Israel at the time, and how did those attacks impact you, your friends, your family?
[00:08:29] Speaker B: We were in London. The morning we found out was the Jewish holiday of Simchat Torah, and these rumors just started to fly that morning. Simchat Torah is a holiday that really revolves around joy, around being happy. And these rumors started to fly. I was told early in the morning that something like 20 or 30 hostages had been taken to Gaza, and I was shocked by that. Even knowing that number, and obviously this was a small fraction of what was actually the case, my sense was that this was a complete game changer, that whatever had happened or was going on that morning would spark a potential regional war that would last for months or years. I think everyone around me, including myself, was extremely shaken by it. There was a sense of total disorientation and a feeling that the ground beneath your feet was crumbling a bit. Everything you had assumed about Israel, about the security of Jewish people in the world, was overturned in an instant. Since then, fortunately, things have improved.
I think Israel has restored its border, its security, and probably some amount of deterrence in the region, after it has gone after Hamas in Gaza and Hezbollah in Lebanon and, to an extent, the Iranian regime. But I don't think any of us take that for granted at this point. I think we're all still living in a place where we realize the world is not as secure as we once imagined it to be.
[00:10:16] Speaker A: Right.
So speaking of October 7th and its aftermath, that's a good jumping-off place to talk about what's happened to Wikipedia. Last month you wrote a piece for Pirate Wires, and I'd also like to hear about that new media company and its orientation, direction, and mission. But your piece was, quote, How Wikipedia's Pro-Hamas Editors Hijacked the Israel-Palestine Narrative, in which you found, quote, a coordinated campaign led by around 40 Wikipedia editors working to delegitimize Israel, present radical Islamist groups in a favorable light, and position fringe academic views on the Israel-Palestine conflict as mainstream, over the past years, intensifying after the October 7th attack. So what are some examples of that? And we're going to put the link in there, because I think this is a piece that everybody needs to read.
[00:11:23] Speaker B: Yeah, this was a really shocking piece of investigation that I did.
And it really touches on what's going on in Wikipedia generally, which is that people have understood that you can hack Wikipedia articles, and you do that just by having these little clusters of editors, these swarms of editors, who are ideologically aligned and coordinating in some fashion. You can change articles. And Wikipedia has such a close connection with Google that nearly every Google topic search, or let's say at least 80% of them, puts a Wikipedia article as the first result, which means millions of people are exposed to whatever idea these small groups of editors are able to seed into the articles. In this case, what they were able to do was make something like 850,000 edits across nearly 10,000 articles in the Israel-Palestine space. They were doing things like downplaying, or even removing, allegations of rape on October 7th. They were whitewashing very credible, in some cases corroborated, accusations of Iranian human rights abuses and crimes; they would just go through and remove dozens and dozens of mentions of these kinds of crimes by the Iranian regime. They were whitewashing Hezbollah, removing references to terror and terrorism by Hezbollah from lots of different Wikipedia articles. And they were also making a very concerted effort, across hundreds of articles, to sever any ties between the Jewish people and Israel. The idea there would be to delegitimize Israel and try to show that the Jewish people don't really have any place in Israel whatsoever. Case by case, some of these things can appear very trivial. You can look at them and kind of shrug and think, doesn't seem that big a deal.
But when you take it as a whole, when you think about the scale of the thing, we're talking about nearly a million edits, then you start to understand that all these edits add up to a whole that's greater than the sum of its parts, because they're really shifting the entire landscape. And again, because all this stuff ends up on the front page of Google, not just the front page but the top result, millions and millions of people around the world are absorbing a perspective that actually ties back to groups of radical editors. And for all we know, because they're all anonymous and there's no way to know who they are or where they come from, they could also be tied to other interests and other agendas that are hidden.
And obviously this is an enormous problem and it's something that very few people are talking about.
[00:14:14] Speaker A: Yeah, because one of the things I found myself asking as I was reading about the sheer scope of such a coordinated editing campaign, and this shadowy group, Tech for Palestine: you pointed out that just two of these editors contributed 15,000 and 12,000 edits on Palestine-Israel articles in just the past three years, putting them in the 99.975th percentile of editors by number of edits. So we don't know who they are. Are they volunteers? Are they paid? I mean, the volume of such edits really would seem like a full-time job.
[00:14:57] Speaker B: A lot of editors on Wikipedia, especially in very contentious areas, are volunteers, but they're very, very dedicated. And that's what gives them an edge over other people. Let's say you have a reader who might even be an expert in that area and says, hold on a second, this is obviously wrong. What that expert will do is create an account on Wikipedia, go in, make an edit, and think, okay, I fixed it. But that person doesn't have the same depth of experience with Wikipedia procedure. It's almost this parliamentarian system of rules: you need to understand how it works, you need to understand how to communicate. And you also don't have these alliances with other editors, because what the dedicated editors end up doing is gang up, or team up, and wage these edit wars together. They'll just exhaust an editor.
This happened in a number of cases. There would be one person standing up for something, as in one example involving the Grand Mufti of Jerusalem, a very famous Palestinian leader from the early 20th century and one of the most important Palestinian figures in history. There was a debate about whether or not there should be a photo of the Grand Mufti touring a concentration camp, because the Mufti had made an alliance with Hitler and worked very closely with him. One editor said, we should include the photo, this is a very important photo, there's a news article about it in Tablet magazine, and news media coverage is sort of the threshold for whether or not to include something. And three other editors jumped in, three editors who were part of this group of 40 that had coordinated with each other extensively and intensively. And the three of them, over the course of days, just wore this one guy down until he literally threw his hands in the air and said, okay, we're just going to stop. That's only one isolated case, but that's the template: two of us, three of us together, especially out of the 40, where we can rotate through the cluster to make it seem as if we're not actually coordinating, and you can then hide the activity from Wikipedia administrators.
You can really win the vast majority, if not all, of these types of debates over controversial topics like that one. So that's the kind of stuff we're seeing going on. It's very, very hard to regulate this through, let's say, the Wikipedia administrative apparatus. They don't have a handle on it. They're under-resourced and understaffed, because they're all volunteers. So these people have identified a vulnerability, and they are now exploiting it.
[00:17:42] Speaker A: I don't know, it sounds to me like that these volunteer editors, some of them that are coordinating this, especially when you see them going beyond Israel and Palestine and actively trying to water down the human rights abuses by the Iranian regime that if there was a way to trace the money, I'm just saying it kind of stinks. And apologies everybody. I am, as you can see, not in Malibu, I am in New York. So again, sorry for any of the background noise.
So how does this behind the scenes coordination violate Wikipedia's own stated principles and operations of value?
[00:18:31] Speaker B: There are lots and lots of policies that prevent, or are intended to prevent and prohibit, this kind of activity. One is called canvassing: you're not supposed to be coordinating to push an ideological view, or any kind of view, even in a way that doesn't necessarily rise to the level of explicit coordination. They don't have to be on an email chain. But if they're working together to push a viewpoint, that tilts the balance of influence on an article, and it kind of short-circuits the whole point of Wikipedia, which is that this is supposed to be truly crowdsourced knowledge. You're supposed to be bringing in a variety of viewpoints, not just the viewpoint of, let's say, 30 or 40 people who happen to agree with each other on this issue. There are a number of other policies regarding point of view that are very important to the site. You're supposed to have a neutral point of view, and if you are pro-Palestinian and working to push that agenda, obviously that's a violation of that policy, which is an important policy on the site. So there are probably at least four or five different policies and guidelines being violated here. I think people on the site who are either conversant with it or expert within the Wikipedia ecosystem understand that this is absolutely a violation. Since that article came out, at least one investigation has been started, particularly with regard to the Tech for Palestine group, because that was explicit. This is a case where people were out in the open, using Discord to communicate with each other in order to do this kind of canvassing activity. They were saying, okay, let's edit this article, let's go after these articles, you do this, I'll do that. And it was led by one veteran editor who was helping coordinate all the newbies and the less experienced people.
And she, at least we presume it to be a she, her username is Ivana, is now under investigation by Wikipedia's arbitration committee. So, you know, these things take time. Wikipedia is a bureaucracy in many ways, and there are a lot of procedures, a lot of rules to be followed, and a lot of committees that have to be consulted for things to get to the point where a decision is actually made. Things can take a lot of time. And when you have that kind of bottleneck, especially on a website that has nearly 7 million articles, you're naturally going to either wait a long time for decisions to be made or restrict what can be decided upon. There's just only so much the arbitrators can handle. And that's what we're seeing today; that's how we got here.
[00:21:26] Speaker A: So we are going to get to our audience questions. I do just want to get some of the, you know, meat on the bone here.
You actually did a previous investigation into Wikipedia for Pirate Wires this summer, resulting in another exposé, quote, How the Regime Captured Wikipedia, about the cultural revolution inside Wikipedia, which pivoted it from a decentralized database of all of the world's knowledge to a top-down social activism and advocacy machine.
Now, you mentioned previously how, you know, you capture Wikipedia and then that's what shows up on Google. Can you nerd out a little bit about that relationship between Wikipedia and Google and that dynamic?
[00:22:18] Speaker B: Yeah, that's core to everything that happens on Wikipedia. If Wikipedia were just a massive website where you had to, you know, click over to the second or third page of Google on a search result and then kind of find the article, nobody would care. We wouldn't even be talking about this. The fact that any given article ranks as the first result on the first page is exactly why this matters. And for Google, what this means is that as Google was growing... I mean, today of course we think of this global juggernaut with pretty much infinite money that can do whatever it wants, but even as recently as 10 or 15 years ago, Google was not that. What Google really, really needed, and still needs today, is high-quality content, or at least content that can be perceived as high-quality, trustworthy, and relatively neutral. That's why we use Google. If you search something like the Ukraine war and you get some crazy hyperpartisan blog from either side, you're going to dismiss it. You say, this is not really what I'm looking for. But if you get this really nicely formatted Wikipedia article that is published under all these policies regarding neutrality and point of view, all these lofty ideas, and it's crowdsourced and doesn't really belong to anyone, then you think to yourself, okay, this is actually working. This is good. Google is giving me good stuff. I like Google.
What that meant for Google is that they didn't have to pay anybody for this stuff, because Wikipedia is volunteer-based; it's not owned by any for-profit structure. Google got this amazing source of free content, and Wikipedia got this incredible source of free traffic. They both did really well by that. Then, around 2016, and of course this is in the wake of Trump's election, when misinformation and disinformation start becoming buzz terms among the establishment, Wikipedia starts to make this shift, led by Katherine Maher, who was at the time the CEO of the Wikimedia Foundation, the NGO that owns Wikipedia. She starts to develop a plan called the Wikimedia Movement Strategy, or Wikipedia 2030, which is really about saying Wikipedia needs to be central to this new shift in how we understand information: we're looking at information as a potential threat, but we definitely understand it as a source of immense power in today's world, and we need to take an approach that leans into this idea. What they do is start developing a partnership with the Tides Foundation, or the Tides Center, which has a number of different foundations and funds within it. The Tides Foundation itself is a billion-dollar fund, and it's not the only one within Tides. Tides is a very, very progressive activist organization premised on social change and bringing about social justice within American society.
The Wikimedia Foundation stitches itself into Tides financially by creating an endowment that they want to raise $100 million for. They nest it inside the Tides Foundation, and they also bring the general counsel over from Tides to be the general counsel at the Wikimedia Foundation. So you have this very tight weaving of the two organizations together. At that same time, Google did something extremely unusual. The way the Tides Foundation works is that it's a donor-advised fund, which means people can donate into Tides and then Tides will distribute the money to various other NGOs without anyone else knowing where that money went. So you could say, I like this cause, but I don't want anyone to know I'm donating to it; I donate to Tides, and Tides distributes the money on my behalf.
Google, during that same period, 2017, '18, '19, starts making massive donations to Tides, and we're talking about 10x anything it had ever done before. Google would usually donate on the order of one or two million dollars to any given NGO. At that moment, they start donating 40, 70, and 90 million dollars per year to Tides, and then it drops off again afterwards. So right around that time, something happens. And it all lines up with this movement strategy at the Wikimedia Foundation, where they're looking at this Trump win, seeing it as a major global threat, seeing information as a new form of warfare being waged around the world, and understanding that Wikipedia has more ability to influence people through information than virtually any other single entity on Earth, except maybe Google or Facebook or two or three other absolutely massive platforms. But Wikipedia is right up there; it's for sure a top-10, maybe top-5 website by traffic. And this is what we saw happen through that crucial period.
[00:27:36] Speaker A: Wow. All right, so we're going to get to some of these questions. This one's on a slightly different topic, but one that you probably know more about than others. My Modern Gal asks: any thoughts on the court case surrounding the Internet Archive? Do you think this is an act of censorship? So I'm not aware of this court case. Maybe you are.
[00:27:58] Speaker B: Yes, I think I am. I think that's the Hachette case, with the publisher, because the Internet Archive has been publishing books; it scans a lot of books. A lot of these books are in the public domain, but some of them are not. And the argument the Internet Archive makes is that it's sort of fair use: it's not publishing these books to make money, it doesn't charge people for them, it's doing it as a service so that people can use these books. It's premised on the idea that the Internet Archive's core mission is to provide universal access to all human knowledge. And I know that because I used to work there; it was my first job after college. But, you know, I think this is one of these thorny issues. The Archive and a lot of the organizations it's allied with are really trying to push back against some of the, I won't say egregious, but the overreach in copyright law, where you have big companies like Disney, for example, where Mickey Mouse is core to the Disney brand, but Mickey Mouse was created, whatever, 100 years ago and technically should be out of copyright, in the public domain. But I don't believe it is, because these companies keep finding novel ways to keep things in the private domain. So this is that back and forth, that tension between companies and NGOs like the Internet Archive. I think it's a healthy thing that we have that kind of discourse. I think it's good that there's pressure coming from both sides, from the private sector and also from places like the Archive.
[00:29:40] Speaker A: All right, Alan Turner has a question, to see if you have a take on whether Google should curate search results, or whether it should be more of an interest algorithm based on what is most relevant.
[00:29:58] Speaker B: Great question.
I think that, you know, once we start curating, and I think we're seeing that more and more on Google, it seems to be politically curated. We had this thing right before the election with "where to vote." If you put in something like "where to vote for Harris," you would get a very precise location around you: you can go here, here, here, here. And if you did the same with "where to vote for Trump," you didn't get anything like that. You got some really vague stuff, and you didn't get Google suggesting you should go vote at a particular location near you. So these kinds of things can become very dangerous. Once Google starts curating, it also opens up a huge legal liability: once Google starts making decisions in an editorial fashion, it is no longer just a platform that other people publish on; it itself becomes, or could be considered, a publisher, which opens it up to liability. It could be sued by someone saying, you published, and are responsible for the publication of, these various kinds of information, which were false or led to an outcome that had some adverse financial effect, and you are now liable for the damages we suffered. So Google is going to have to deal with this. I think this is all happening in the context of AI, and we're really starting to see the shift away from searching through an index like Google and more towards chat, where it's question and answer. Google is also starting to experiment with that as well.
[00:31:39] Speaker A: I had not been aware of that case, that if you searched "where can I go vote?" and they knew you were a Harris voter, you would get precise geolocated suggestions, but if you were more along the lines of a Trump voter, it'd be like, I don't know, maybe just don't.
Okay. Jackson Sinclair has a really interesting question, and it kind of gets to our, you know, bailiwick as a pro-capitalism organization. He asks: if Wikipedia were a for-profit entity, do you think its original mission statement and quality standards would be better maintained?
[00:32:23] Speaker B: Probably not, but I think there would be some improvement in certain areas. For example, you would at least be able to provide more resources and more human power to things like the arbitration committee, which right now, I think, is down to something like 10 or even fewer active members. And these are the people making the final decisions on who gets banned, or what happens to a certain page, or whether an editor has been misbehaving.
Ten people for a site with 7 million articles is obviously inadequate, but there's not much they can do about it, because the bureaucracy is so frozen. They have bylaws, and there's a way things are done on Wikipedia, and those things are very, very difficult to change. Whereas in a private company you have more flexibility, more adaptability; you could say, why can't we just create a panel of experts on Israel-Palestine and have them govern the whole topic area? Which in a way seems tempting, but it can also lead to exactly the kind of things we saw with Twitter back in the day, where Twitter was a private company and had the resources, but they were still handing this stuff, in-house, to people who were ideologically biased and who were then working with the United States government, with the FBI, with the Democratic Party, to censor and shut down certain stories. So I don't think there's any necessary safeguard in the privatization of something like Wikipedia. I do think competition, though, would provide more of a guardrail. If we were able to see a Wikipedia alternative, that's where Wikipedia might start to realize, oh, wait a second, we're not the only game in town; we need to clean up our act, because these guys over there are growing fast and they might one day eat our lunch. Right now we don't really have that. There are some good alternatives; there's something called Justapedia, which is a good site, but they don't have the 20 years of growth that Wikipedia has enjoyed, and they don't have this alliance with Google. So competition, I think, is a good answer. I'm not necessarily sure privatization is.
[00:34:32] Speaker A: All right, here is an earlier question from a regular, Kingfisher 21, from when we were talking about the craziness going on over in the UK. He asks: Ashley, what are your thoughts on news outlets in the UK? Do any of them stand out as better or worse than the New York Times?
[00:34:51] Speaker B: Yeah, I think the UK, you know, has a very active, thriving media ecosystem. It's known for that, for Fleet Street. And I always like what's going on at the Telegraph. I think they're great, I've written for them, they seem to be on point. They're not afraid to speak their mind sometimes. The Times is also really good. I mean, obviously these are traditionally considered right of center. The Guardian, I think, is completely captured and always has been. It's always been a tool of social activism and social justice from its very earliest days. And that's part of the reason it was created, to pursue those kinds of ideals.
And we have some younger, newer publications popping up, like UnHerd, which I've written extensively for in the past as well, which is a great publication. It's led by Freddie Sayers, who is incredibly sharp as an editor. They're pushing ahead. They recently, I believe, merged with the Spectator, which I think is going to be a good thing for both publications. And we're seeing this kind of empowering of publications that are a little bit less orthodox than the traditional British publications, and a sort of widening of the spectrum of points of view that are generally available to the media-consuming public here.
[00:36:17] Speaker A: So now that you've given this love to some of these other outlets, I'm a subscriber to Pirate Wires. So tell us a little bit about this new media outlet and when you joined and what its focus is.
[00:36:34] Speaker B: Yeah, Pirate Wires is one of these outlets that's doing something really different, not just in the stuff that we cover, which is definitely the case, but also in the tone. Mike Solana is the founder and editor in chief, and he is someone who has really been immersed in everything digital for so long. He worked and still works at Founders Fund with Peter Thiel and has a real sense of where the culture is going. And I think this is something that he started to understand by seeing that the culture is shifting onto tech itself. So we're seeing culture emerge on X and on other social platforms, and that's actually where the action is, that's where politics is happening. We saw this with Trump in the election, where Trump went on Rogan and went on X to talk with Elon, and people are really there and the energy is there. And that's something that Mike Solana understood very early and wanted to wrap his arms around, and put it into a publication that's also doing investigative stuff. We have a daily publication that does really short takes on what's going on in the news, which people absolutely love because it gives a bit of a different perspective, and it does it in language that's a little more lively and fresh and is willing to push the boundaries a little bit compared to a traditional publication. I've been there for about a month or so, so pretty new. But I've been writing for them for a bit longer than that, with a lot of the Wikipedia stuff and some other stuff on censorship, on media, and on what's going on in Europe having to do with the overregulation of tech. So we're definitely off to the races. I definitely encourage everyone to take a look, to at least subscribe to the daily newsletter, which is free. And, you know, from there, you can make a decision if you want to subscribe and give it a shot with a two-week free trial.
But it's a great publication, a lot of good writers, and a lot of really good art, too, which I appreciate.
[00:38:39] Speaker A: Yeah. I would say if you love Twitter/X and you like some of the style and banter, then I think you'd also find that Pirate Wires is a great product for you. So in your coverage of Wikipedia, one of the examples that you use is one that many people might have seen happening in the news but maybe didn't understand was actually happening first on Wikipedia. And that was the example of Harris as border czar disappearing from their list of presidential czars. Can you unpack what happened there?
[00:39:22] Speaker B: Yeah, this is one of these cases that could go either way. In this particular case, the White House, when they announced her role, whenever that was, was very careful not to call her a border czar. And that's, I think, because they didn't want that kind of responsibility for the situation at the border attached to anyone in the administration; they realized that they're actually not going to manage it, they're not going to solve it. So they didn't want her to be called a czar. So a lot of the media early on surrounding the announcement said specifically, she's not a czar, she's not the border czar. But some of the media did say she is the border czar. There was an Axios piece and a couple of others from outlets considered to be reliable sources on Wikipedia that described her as the border czar. So this could fall in either direction.
Someone put her on a list of border czars, and she was subsequently taken off, for the reason that some of the media said she was not the czar. But what was interesting about this is that it worked out in Harris's favor and not the other way. So it wasn't clear cut, but nonetheless the outcome was the one that was favorable to the Democratic Party and to the Harris campaign. It was the outcome that they would have wanted themselves. And that was an indicator of exactly what's going on on Wikipedia: when there is gray space, the decision somehow falls toward a more blue outcome.
There are other examples of this and there's some stuff with Trump that happened in the same vein.
[00:41:10] Speaker A: All right, well, what are some of the ways in which bias gets baked into Wikipedia citations? For example, you mentioned how they have a list of which sources are to be considered reliable and which are to be considered unreliable. And the reliable category includes not just all of the legacy media, including the very unreliable New York Times, but also far-left outlets like Mother Jones and Jacobin and the Nation.
And the unreliable category includes not just Fox News, but also the Wall Street Journal. Who makes these determinations?
[00:41:51] Speaker B: Yeah, and Al Jazeera is coded green for reliable as well. That's owned by Qatar. And China Daily was given a yellow rather than a red. So a lot of the conservative publications are coded red for generally unreliable, where China Daily, which is an actual propaganda outlet owned by the Chinese state, is considered somewhat reliable. So the way this came about is really just one guy, I think in 2018 or so, who made the list, and that's it. Everyone just kind of started to accept it. There's a little caveat to the story, though, which is that that one guy, who goes by Mr. X on Wikipedia, was one of the more aggressive figures on the site, trying to shut down debate about political topics that was being raised by people who were more conservative. I did another article called How Wikipedia Launders Regime Propaganda, and that was about one particular user called Atsme. And Atsme was, I would say, more conservative and was trying to push back against some of the evident bias in articles related to Trump, such as claiming that Trump is an outright liar and making lists of his lies, rather than saying he's been accused of lying or using language related to falsehoods; they were pushing a very hard line. And a lot of that was Mr. X. And Mr. X eventually had Atsme banned from the American politics topic space, so she could no longer post at all in that space. And it was Mr. X who created this reliable sources list, where he just wrote down who he liked: the New York Times is green, the Daily Mail is red. And it continued to expand; other people continued to expand upon it as well. The community as a whole has never formally accepted it. It's not that there was a vote or anything, or that there was any power structure within Wikipedia that said, this is the list we're going to use now.
It's just this one guy with a political agenda who did it, and everyone kind of went along with it.
[00:44:02] Speaker A: So let's say you're a volunteer editor and you're going in and either creating a page or making changes to a page. Does the system allow you to cite a Wall Street Journal article, or is it just frowned upon, or can it be challenged? How does that work in practice?
[00:44:26] Speaker B: The Journal itself, I think part of it is coded red for unreliable; it may be the opinion page, it's not the entire thing. But a lot of the other publications on the right and conservative publications are just considered completely unreliable, such as the Daily Mail, for example. The Daily Mail did a lot of very, very important reporting on the origin of COVID and that it might have come from the lab. They did it almost before anybody else. But if you were to use that reporting and put it into an article to say, actually, there is evidence that there could have been a lab leak, and here's the evidence.
That citation would be grounds for removing that claim. So someone could go and say, we're reverting your edit; this is not a legitimate edit because you have sourced it to a source that is considered generally unreliable. So it has an incredibly important effect on the way articles are shaped. And again, when this trickles up to Google, the claims that people are seeing on the front page of Google are the ones dictated by sources that are considered reliable, not those that are considered unreliable.
[00:45:37] Speaker A: Where are the co founders in all of this? I mentioned admiration for Jimmy Wales. He's somebody who in the past described himself as an objectivist.
Are they still active? Have they, you know, weighed in on what's going on?
[00:45:58] Speaker B: Yeah, Jimmy, Jimmy still has a presence.
He is sort of, you know, here and there on the site. He'll weigh in on certain topics.
He, I believe, is on a sort of top-level oversight board at the Wikimedia Foundation. But I'd say, from the look of it, and I don't want to say too much because I don't exactly know his direct day-to-day involvement, it's a bit hands off. Today the Wikimedia Foundation is run by a CEO.
There's a structure within it; there are professionals at the Wikimedia Foundation doing fundraising and a lot of other stuff.
And then on the site itself, on Wikipedia itself, things are supposed to run independently. So things are supposed to run according to the rules and the policies of the site. The Arbitration Committee is supposed to have the final say on decisions related to the content on the site, though there are exceptions to that. And the admins on the site are supposed to be part of that governing structure as well. So it seems like Jimmy has taken a step back. I'd be very curious to know exactly what he's doing. I know he's got a book coming out soon, but it seems like it's not really a direct, hands-on role either at the WMF or at Wikipedia day to day.
[00:47:19] Speaker A: Well, now that you're both in London, I hope that your paths cross and that you're able to have a productive exchange, and you could tell him that his old pals at the Atlas Society miss him and want him back. So at the end of our last interview, we were talking about the New York Times, and I asked you about your forecasts and prognostications. What is your take on the future of Wikipedia? Will it, like the legacy media, lose credibility and become increasingly irrelevant? Or will the hundreds of millions in its endowment, in part funneled by Google through the Tides Foundation, ensure that it will play an ever-increasing role in our information ecosphere?
[00:48:12] Speaker B: I think the challenge and the strength Wikipedia has right now is the way that information is changing: the way we consume information, the way that it's produced and distributed. So we've lived in this indexed world of information, where Google sort of rules everything.
We're seeing a shift away from that, as I kind of mentioned before, towards the LLMs, towards chat. And to the extent that Wikipedia is able to keep up with that change and is not crowded out by competitors, I think it can persist. I mean, right now it's pretty clear that Wikipedia is feeding a lot of information into various LLMs.
They're training on its data. It has developed some APIs that let some of these companies do this for money; they pay for it. So if it continues to evolve that way, I think it still is going to play a pretty important role in our lives. But my sense is that at some point someone is going to innovate in that space: in how we gather the information that's out there, how we put it into a good format, how we ensure it's credible and accurate, and then how we deliver it to the many people who want it. I don't think it's going to be too long before somebody does that and Wikipedia loses its primacy of place in this entire ecosystem.
[00:49:46] Speaker A: So speaking of those LLMs and ChatGPT and Grok and Gemini, have you played around with them much and do you have any preferences or insights into which are better, which have potential?
[00:50:02] Speaker B: Yeah, I mean, I use ChatGPT quite a lot, probably like a lot of people out there, and I still think it's really great. It's obviously not perfect. I know Gemini has had some issues, but what we're seeing with Gemini now is that if you search on Google for certain types of searches, you're actually going to get an AI overview at the top of the screen. So rather than some kind of summary taken from a web page, Google is actually generating that top-level overview, which I think is very cool, and I think they're doing a good job. Obviously they had a major stumble when they released Gemini and it was producing images of, like, the founding fathers of America that, in Gemini's view, were black.
They corrected it. Hopefully they're going to continue to correct stuff. I haven't used Grok as much. I think the advantage of Grok, obviously, is that it doesn't have as many trust and safety guardrails, which is good in some sense, because there are some things that you want to be able to play around with and create. I think there's an issue around artistic expression, where you say, I do want to create an image of a public person in whatever context, because that's satire. That's the basis of satire. We've been doing this for hundreds of years, and a lot of the other platforms will just tell you, no, I can't do that, even if it's a public figure. So, you know, the fact that there are at least now four or five major chat-based interfaces is a good thing, because it's competition. These guys are going to be challenging each other and pushing each other, and they're going to go where the market is. So I think that is a great thing. It's an obviously transformative technology. We've only seen the first little inklings of what this can actually do and how it can impact our lives. But it's going to continue to evolve in ways that are going to astonish us, just as we've been astonished over the last two years since ChatGPT came out.
[00:52:06] Speaker A: Well, I know we promised that we would give you a hard stop at the top of the hour, so we're going to do that. But before I let you go, I mean, you've just had such an amazing career, from, you know, the Wayback Machine and the Internet Archive, to writing short stories and nonfiction, and your journalism. So I would just love to know what is next for you, particularly at Pirate Wires. Any topics that are catching your interest? And might we see an Ashley Rinsberg as the White House correspondent for Pirate Wires? That would be my vote.
[00:52:53] Speaker B: Thank you. Yeah, I'm going to continue on the Wikipedia front, probably as long as I keep digging up good stuff, and I think there is more of that story to tell. I'll continue to think and write about information today, the news media, and where it's going. And I'm also starting to think about, you know, how can I use technology myself to solve a lot of these issues, particularly with media. How can we use technology to improve media reliability? Because, you know, I take the view that you don't want to throw out the baby with the bathwater. And the United States has such an incredible tradition of journalism that goes way back to the founding fathers. I mean, at the beginning of my book I have this little quote from Thomas Jefferson that says, you know, if I had to choose between a government without newspapers or newspapers without a government, I would choose the latter. He would rather have journalism with no government than government with no journalism.
So we need to fix it and we need to do better. And I think technology is giving us the ability to do better. I think this new wave of culture we're seeing evolve on the Internet where people want to know things that they don't know and they want to dig and they want to investigate and they want to find out for themselves is incredibly positive. And it's a sign of the health of the democracy. So diving into that space a bit more and being a little more hands on with the technology and building stuff that can make an impact is, I think, where I'm going.
[00:54:23] Speaker A: Well, we will continue to follow you. Folks, you can follow Ashley on X. And thank you, Ashley, again, for not just this interview, but the very important work that you're doing. I mean, at the Atlas Society we're promoting Objectivism, which means we want to get to objective truth. And if these sources of our information, not just at Wikipedia but, by default, what's showing up at Google, are being manipulated for partisan reasons, then this is important information that we need to have. So thank you. Thanks to everybody who joined. I loved seeing a lot of your love in the chat, and I invite you to turn that love into a donation of whatever amount. If you've never donated to the Atlas Society, your donation will be matched by our board of trustees. So you can check that
[email protected] donate, and next week I will be off, but Atlas Society founder David Kelley will host a webinar with senior fellow Rob Tracinski to discuss our latest Pocket Guide to Free Speech. We'll see you there.