Comments

The following is an automatically-generated compilation of all talk pages for the Signpost issue dated 2022-11-28. For general Signpost discussion, see Wikipedia talk:Signpost.

Book review: Writing the Revolution (513 bytes · 💬)



Back to 2022-11-28

Discuss this story

CommonsComix: Joker's trick (529 bytes · 💬)



Back to 2022-11-28

Discuss this story

Concept: The relevance of legal certainty to the English Wikipedia (10,283 bytes · 💬)



Back to 2022-11-28

Discuss this story

  1. As far as I know, admins are held to the same standards as anyone else. If anything, a bit higher: an admin's behavior is much more likely to be a topic of broad discussion than that of a random user.
  2. There are plenty of places, including in actual government, where lifetime appointments subject to recall are the norm.
  3. Vague talk of cabals on the Internet isn't worth the paper it (isn't) written on.
  4. Hyper-legalism is a troll's paradise. - Jmabel | Talk 22:33, 19 December 2022 (UTC)[reply]

Disinformation report: Missed and Dissed (2,486 bytes · 💬)



Back to 2022-11-28

Discuss this story

Essay: The Six Million FP Man (15,794 bytes · 💬)



Back to 2022-11-28

Discuss this story

It was originally meant to be someone else interviewing me, but holidays and timing made availability an issue. It was either do this or get very annoyed at an offer being rescinded. Adam Cuerden (talk)Has about 8.2% of all FPs. Currently celebrating his 600th FP! 20:22, 28 November 2022 (UTC)[reply]

Featured content: A great month for featured articles (0 bytes · 💬)

Wikipedia talk:Wikipedia Signpost/2022-11-28/Featured content

From the archives: Five, ten, and fifteen years ago (0 bytes · 💬)

Wikipedia talk:Wikipedia Signpost/2022-11-28/From the archives

In the media: "The most beautiful story on the Internet" (4,947 bytes · 💬)



Back to 2022-11-28

Discuss this story

  • We as The Signpost are here to report about what the media report about things. They said the movement "shielded" SBF. "Shielded" is pretty strong language coming from a top-tier national newspaper, and that's why the item exists in this column. If they were saying such things about other movements with a Wikipedia nexus, that would show up here too. ☆ Bri (talk) 16:55, 28 November 2022 (UTC)[reply]
    I have nothing against the Signpost inclusion! I was commenting about the original WaPo article, not objecting to the Signpost's coverage of this, which is only proper. Apologies for any lack of clarity. CharredShorthand (talk) 16:58, 28 November 2022 (UTC)[reply]
As long as WP:BLP is followed, I think we can say what we want. Adam Cuerden (talk)Has about 8.2% of all FPs. Currently celebrating his 600th FP! 22:57, 28 November 2022 (UTC)[reply]
It seems kind of juvenile to me as well. My theory is that novel takes are a market; when somebody does something shitty, it creates a large demand for takes about why they suck. The obvious reasons are, well, obvious, so they are written into takes within the first couple days. But there is still demand, so the market responds by creating more: this is how we get all of those opinion pieces about how some mass murderer listened to heavy metal music, or drank Pepsi, or whatever. They don't make a lot of sense, but so it goes. jp×g 05:38, 29 November 2022 (UTC)[reply]
Effective altruism seems a bit of a buzzword to me, but I get the idea of get rich, then donate. The Andrew Carnegie route, as it were. Has the whole "but who will do the charity work you fund?" problem if everyone does it, but it's fine. However, if it's true the movement worked to promote itself and its fellow members on Wikipedia, that's where it's notable for us. We may have buried the lead, though, in my efforts to make sure there was context. Adam Cuerden (talk)Has about 8.2% of all FPs. Currently celebrating his 600th FP! 05:47, 29 November 2022 (UTC)[reply]

Interview: Lisa Seitz-Gruwell on WMF fundraising in the wake of big banner ad RfC (2,105 bytes · 💬)



Back to 2022-11-28

Discuss this story

Bizarre bug. I guess this is karmic punishment for fixing the other ones earlier this month! jp×g 16:08, 28 November 2022 (UTC)[reply]

No questions about Tides? That's... a bit disappointing. That's been one of the more opaque parts there, and it would be good to see some light shed on that. Seraphimblade Talk to me 21:36, 28 November 2022 (UTC)[reply]

I was trying to cover that with the question about the endowment, but if there was anything I missed, it might be worth putting a question here in the discussion and pinging Lisa and/or Julia? WMF certainly seem to be in dialogue mode at the moment... The Land (talk) 08:29, 29 November 2022 (UTC)[reply]
So there's an overall principle of "Our Movement will make decisions at the most immediate or local level wherever possible" which suggests that many things the WMF does ought to be decentralised. It's not specified anywhere that fundraising (as opposed to, say, making grants or recognising affiliates or indeed anything specific) should be among them, but the idea of the WMF continuing to raise 95% or whatever of movement funds seems fairly inconsistent. The Land (talk) 08:29, 29 November 2022 (UTC)[reply]

News and notes: English Wikipedia editors: "We don't need no stinking banners" (11,346 bytes · 💬)

Page views item is misleading

This note is very misleading:

Page views in September for all wikis have been up in relation to the past months with 23,657,615,038 views in comparison to 23,033,712,122 in August. Wikipedia has been growing in popularity since January of 2016, when page views were 20,865,413,322.

As the small print under the linked chart warns: By default, this data shows page views from automated traffic as well as human traffic. To focus on human user page views, please use the Agent type filter.

Filtered to human page views, there is no such "growing in popularity since January of 2016". In fact the October 2022 number (16,043,576,424) is lower than the October 2016 number (16,573,593,158). And this view shows that there has been an increase of over 2.7 billion in monthly "Automated" views from March to October 2022.

(In fairness to Helloheart, this is easy to overlook. It's a common mistake to go with the default view of https://stats.wikimedia.org/ ; I've seen several other people do the same. The Foundation team that maintains this tool should have fixed this and other usability issues years ago.)

Regards, HaeB (talk) 17:21, 28 November 2022 (UTC)[reply]
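As an aside for readers who want to reproduce HaeB's comparison: below is a minimal sketch (the script name, User-Agent string and date range are placeholders, not anything mentioned above) that pulls the same monthly totals from the public Wikimedia Pageviews REST API, whose agent path segment corresponds to the "Agent type" filter in stats.wikimedia.org.

```python
# Minimal sketch: compare human-only and total monthly page views across all
# Wikimedia projects. The "agent" path segment is the equivalent of the
# "Agent type" filter discussed above.
import requests

BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/aggregate"
HEADERS = {"User-Agent": "SignpostPageviewCheck/0.1 (example script)"}  # placeholder UA

def monthly_views(agent, start="2016010100", end="2022110100"):
    """Return {YYYYMM: views} for all projects and all access methods."""
    url = f"{BASE}/all-projects/all-access/{agent}/monthly/{start}/{end}"
    resp = requests.get(url, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return {item["timestamp"][:6]: item["views"] for item in resp.json()["items"]}

human = monthly_views("user")        # human traffic only
total = monthly_views("all-agents")  # humans plus spiders and other automated traffic

for month in sorted(human):
    print(month, f"human={human[month]:,}", f"total={total.get(month, 0):,}")
```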

Re: Fundraising

Not really a comment on the discussion, but a datapoint: every year there are a lot of complaints in the Archive of Our Own community about fundraising to support the website & its staff. Enough complaints that a defense of fundraising for that website has been posted every year on Tumblr. This leads me to wonder if complaints about fundraising are common at other community-based projects (such as, for example, Internet Archive), & if not how they avoided this blowback. In some ways, we Wikipe[d|m]ians can be very provincial in our outlook. -- llywrch (talk) 18:47, 28 November 2022 (UTC)[reply]

This might be of interest. --Firestar464 (talk) 10:21, 29 November 2022 (UTC)[reply]
@Llywrch: I've seen the complaints on Tumblr as well, though I don't think they come close to the same very specific objections to fundraising methods and the doling out of said funding. Some objections definitely seem to be in the "don't donate to people who support/host insert-thing-here I find objectionable!" camp, and the list of things that are 'objectionable' (on a website where you can really specifically tag content for what's in it and then...avoid that stuff) and 'shouldn't be allowed' is invariably silly (some people apparently can't separate fiction and reality in their heads). Those complaints, I don't think AO3 worries about; they're not based on anything reasonable to worry about.
But I have seen some that genuinely wonder what they do with all that money. It's true that they generally exceed their donation goals, but it's worth noting that their owner, the Organization for Transformative Works, has several major avenues that money goes down: legal advocacy (necessary for copyright-derived works, hello Anne Rice), archival services (the Open Doors project imports fanfic collections that can no longer be supported independently), an academic journal, as well as their own wiki and a directory of resources.
I don't think they have the same problems that we do, and I don't remember them ever having had to make a post about criticism with their funding; for a website that hosts derivative works of non-free content, providing solid legal support that doesn't leave creators trying to figure out a cease and desist themselves is incredibly necessary work. There's a writeup of a reddit post on the subject here on fanlore.org that does a pretty good rundown of all the arguments.
What probably helps also is that AO3 doesn't write their fundraising banners like we do...—Ineffablebookkeeper (talk) (((ping)) me!) 12:41, 30 November 2022 (UTC)[reply]
From what I've seen of the AO3 banners, they are indeed very different from the Wikipedia ones, and it seems like the complaints they get are not generally about the form or content of the banners. The Internet Archive is an interesting example because their banners are like ours—indeed, they frequently appear to be virtual copy-and-pastes (their current banner starts with "Please don't scroll past this" and talks about "the one in a thousand users that support us financially"). But IA is not primarily based on user-generated content the way AO3 or Wikipedia is, so the "why do they need my money?" question is more obviously answered.
Also, fan fiction is a traditionally social and participatory medium, and the stereotypical fan fiction consumer is extremely online; wank is not unexpected. I'm not sure who the (stereo)typical IA user is.
Finally, tangentially related, the Organization for Transformative Works (which is a membership nonprofit) just had their board elections. Some interesting comparisons to be drawn. —Emufarmers(T/C) 22:27, 1 December 2022 (UTC)[reply]

Re: "We don't need no stinking banners"[edit]

Quoted audit report data in the Notes section is misrepresentative, misleading

Hi, I’m from the Foundation’s Finance team. I want to note that the meaning of the “investment income (loss), net” section of our annual financial audit is explained in the audit report FAQ section. In essence, the $11,665,241 number in the audit report represents the value of the assets we held as of June 30, 2022 vs June 30, 2021, which did indeed go down; this was a period of global economic downturn that affected most people. Most of this number is an unrealized loss, meaning that we held onto most of our assets and will continue doing so until markets improve, rather than selling at a loss.

In the article, this data is presented as unexplained, with implied questions about whether Foundation finances are mismanaged or wrongly reported. Neither implication is correct.--SLangan (WMF) (talk) 20:46, 29 November 2022 (UTC)[reply]

@SLangan (WMF): Thanks for your note. However, all it says at the place you link to is this:
"Investment income (loss), net" is primarily interest, dividends, realized gains/losses, and unrealized gains/losses earned on the Wikimedia Foundation's cash and investment portfolio. During this audit period, some of the Foundation's cash was invested in mortgage backed securities, U.S. Treasury securities, corporate bonds, and stocks (Note 3). It is the Foundation’s investment intention to preserve capital, income and liquidity over‎ capital appreciation, which has higher volatility."
This is completely generic and hardly likely to enable the average community member to understand what caused the loss, all the more so as there was an economic downturn in 2020 as well, and WMF investment income remained several million in the black.
What I actually say in the article above is "At the time of writing, the Wikimedia Foundation had not responded to questions about the precise circumstances responsible for the negative result." This is indisputably true. I asked on the mailing list on Nov 10:
Dear WMF Finance staff, I inquired over a week ago on Meta-Wiki why the WMF is reporting a negative investment income (–$12 million). There has been no answer to date.[1] I am a layperson, but how can an investment income be negative? Would you mind sharing what this is about? I was also surprised to find that the reported increase in net assets for the 2021–2022 financial year was "only" $8.2 million. The third-quarter F&A tuning session published in May (based on data as of March 31) forecast a far higher surplus, with an increase in net assets of $25.9 million.[2] Would you mind sharing what happened in the fourth quarter to reduce the surplus by so much?
And I asked a week before that (twice) on Meta whether more information could be shared, and received no reply in either place.
You replied on Meta today, which is greatly appreciated, but frankly I am still none the wiser as to which of your assets specifically went down by so much, and why and how a projected $25.9 million increase in assets at the end of the year (I am reading that right, aren't I?) was reduced to just an $8 million increase in the space of three months.
I am not implying anything. I am asking you to help us understand. Regards, --Andreas JN466 21:28, 29 November 2022 (UTC)[reply]
Please also note my reply on Meta. Best, Andreas JN466 13:56, 30 November 2022 (UTC)[reply]
No you are not reading it correctly. The projected $25.9 million and $8 million you quoted represent very different calculations, across different types of reporting and different periods of time. They should not be compared directly. We have answered more fully on the audit report talk page. --SLangan (WMF) (talk) 20:19, 30 November 2022 (UTC)[reply]
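For readers wondering, as Andreas did, how an investment income can be negative: the following is a purely illustrative sketch with made-up numbers (not the WMF's actual portfolio or figures) of how an unrealized mark-to-market loss ends up in an "investment income (loss), net" line even when nothing is sold.

```python
# Hypothetical numbers only, for illustration of the accounting concept.
portfolio_value_2021_06_30 = 100_000_000  # hypothetical value at start of fiscal year
portfolio_value_2022_06_30 = 88_300_000   # hypothetical value at end of fiscal year
interest_and_dividends = 300_000          # hypothetical realized income during the year

# The drop in market value counts as an unrealized loss even if the assets are held.
unrealized_loss = portfolio_value_2022_06_30 - portfolio_value_2021_06_30
investment_income_net = unrealized_loss + interest_and_dividends
print(f"{investment_income_net:,}")  # -11,400,000: negative despite positive cash income
```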

Taylor & Francis[edit]

"The Wikipedia Library has made Taylor & Francis academic journals automatically available for anyone with a Wikimedia unified login" – Anyone? Even if you don't meet the criteria? Nardog (talk) 04:58, 10 December 2022 (UTC)[reply]

Obituary: A tribute to Michael Gäbler (1,939 bytes · 💬)



Back to 2022-11-28

Discuss this story

Truly great pictures.--Dutchy45 (talk) 18:49, 28 November 2022 (UTC)[reply]

Op-Ed: Diminishing returns for article quality (23,738 bytes · 💬)



Back to 2022-11-28

Discuss this story

Hey casualdejekyll, thanks for reading. As the article states, "[a]rticle quality is important, as a method to achieve our mission". The text doesn't say we shouldn't improve our articles – of course we should. It just comments on how article quality, past a certain point, relates to reader retention. /Julle (talk) 08:23, 2 December 2022 (UTC)[reply]
As for the content, I want to comment that I've often compared Wikipedia's decorated articles with classical Encyclopedia Britannica articles. The latter were mostly much better structured, had good subtitles and an optimized length. And I've asked myself: why couldn't we use those well-known techniques to collapse the abundance of details in order to get an optimized length, at least in our "excellence"-decorated articles? But I'm afraid there's no willingness in broad parts of our community to enter a path of innovation like this.
Last but not least, I'd like to emphasize the broader context of Wikipedia in an increasingly messed-up Western society. The public educational system in America is not very good; even in rich Germany it's not good and is underfinanced. Maybe Scandinavia is better off in that respect compared to most of the world. The cost-of-living crisis we are facing (inflation of 7-12% in core Europe) doesn't make it easier to appreciate classic standards of text quality. Isn't the TikTok mania, like other hypes before it, also evidence that most individuals nowadays are psychologically struggling to attract notice instead of fulfilling standards of quality like pre-neoliberal generations before? Only in older (1960-1980) feature films or literature can we find a much slower pace of everyday life. But the postmodern lifestyles of today leave the good old path of the achievements of the Enlightenment. --Just N. (talk) 16:16, 1 December 2022 (UTC)[reply]
Thank you for the kind words. /Julle (talk) 08:23, 2 December 2022 (UTC)[reply]

To the point that convenience often supersedes quality, look no further than the shift from land lines to cell phones. AT&T (COI note: I worked for that company for 15 years) put a lot of effort into improving long distance & land line quality, only to be thwarted by the adoption of the more convenient but lower-quality audio of cell phone calls. It is true that cell phone quality has improved over the years, but this was not much of a factor in the general adoption of the technology. Peaceray (talk) 22:11, 2 December 2022 (UTC)[reply]

And quality can continue to improve long after adoption – as in the case of Wikipedia. /Julle (talk) 23:26, 2 December 2022 (UTC)[reply]

Opinion: Privacy on Wikipedia in the cyberpunk future (3,712 bytes · 💬)



Back to 2022-11-28

Discuss this story

Since The Signpost invited Ladsgroup to give us permission to publish his essay, I didn't want to modify it before publication. But it is worth mentioning that the stylometry techniques discussed, whether AI assisted or not, can be used to link real-world identity to on-Wiki identity by comparison to off-wiki writing such as blog posts or other published material. Something to think about? ☆ Bri (talk) 17:16, 28 November 2022 (UTC)[reply]

The third paragraph of the second section addresses that, I think. I'd go on to add that it's not just oppressive governments or big companies that could do that, but also basically anyone with a grudge and a target - could be Kiwifarms, 8chan, terrorists, extremist partisans, and other such undesirables (in fact, I'd say they're even more dangerous than Big Brother or Big Tech). This is definitely something everyone should be careful about, both within Wikipedia and elsewhere on the Internet. Doxxing and IRL harassment based on Wikipedia edits has already happened, in relation to the 2020 Delhi riots article - a well-respected editor was doxxed and harassed by the far-right "news" site OpIndia.
Stay safe, people... W. Tell DCCXLVI (talk to me!/c) 18:19, 28 November 2022 (UTC)[reply]
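As a purely illustrative aside on the stylometry techniques discussed above: the sketch below (toy texts and standard scikit-learn calls, not any tool mentioned in the essay or comments) shows the basic principle of comparing writing samples by character n-gram profiles. Real de-anonymisation attacks are far more sophisticated than this.

```python
# Toy illustration of stylometric comparison: character n-gram profiles
# compared by cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical samples: two on-wiki comments and one unrelated off-wiki blog post.
documents = [
    "I reverted the edit because the source given does not support the claim.",
    "Reverting again; the cited source simply does not support that claim.",
    "My favourite recipes all start with caramelised onions and far too much butter.",
]

# Character 3- to 5-grams capture habits such as punctuation, spelling and phrasing.
vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5))
profiles = vectorizer.fit_transform(documents)

similarity = cosine_similarity(profiles)
print(similarity.round(2))  # the two wiki comments score far closer to each other
```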

Recent research: Study deems COVID-19 editors smart and cool, questions of clarity and utility for WMF's proposed "Knowledge Integrity Risk Observatory" (0 bytes · 💬)

Wikipedia talk:Wikipedia Signpost/2022-11-28/Recent research

Technology report: Galactic dreams, encyclopedic reality (5,829 bytes · 💬)



Back to 2022-11-28

Discuss this story

Let's forget about the print editions of The Signpost please! And maybe we should still define AI as artificial ignorance. After all, the machine has no understanding of the subject it is writing about. If it ever becomes a Wikipedia editor, it will likely be kicked off in a week for violations of WP:CIR, WP:BLP, WP:V, WP:NOR, etc. Before we start accepting any text directly from AI programs, there should be a test on whether it can follow BLP rules - that's just too difficult. Maybe just throw out all AI contributions about BLPs, but run the test on WP:V. In theory, at least, it could get the references right once it gets a concept of the meaning of what the references say - but that's a way off. Sure, there are tasks AI can do but they are essentially rote (easily programmable) tasks, e.g. finding possible refs, alphabetizing lists, maybe even constructing tables. Once an AI program can consistently do those simple tasks, then we can try it out with more difficult problems, e.g. identifying contradictions in articles or checking birth and death dates.

ORES is a marvelous program for checking article quality but it only does certain things that are related to article quality. I'm guessing the factors it considers are the number of words, refs, sections, illustrations, and quality tags placed by editors. Maybe even incoming links and outgoing links. It can count real good and the added value is the correlation of these counts to real-person evaluation of quality. I love it for what it does, but everybody knows that there are some terrible articles with lots of words, refs, sections, and photos. Smallbones(smalltalk) 16:51, 29 November 2022 (UTC)[reply]

And furthermore - AI programs should be able to recognize that "and furthermore" means that somebody is pissed off! and that "it counts good" is a criticism, not a compliment!
Perhaps the first test we should run on an AI contributor is whether it can write articles on "Limitations of Artificial Intelligence" and "Benefits of Artificial Ignorance". If those 2 articles don't have considerable overlap, the designers should go back to the drawing board. Smallbones(smalltalk) 17:17, 29 November 2022 (UTC)[reply]
@Smallbones: Well, GPT-3 had this to say when I prompted it with your comment:
I don't understand why you would suggest that AI can only do rote tasks, when in fact AI has been used for a variety of tasks, from playing chess to recognizing faces. Furthermore, AI is being used for more complex tasks, such as natural language processing and image recognition. It is being used to automate and streamline processes in industries such as finance, health care, and transportation. And it is being used to create new products and services. So, I think the idea of artificial ignorance is misguided. AI should be judged by its performance, not its level of understanding. If an AI can do something better than a human, then it should be used. If it can't, then it should be discarded. And if it can do something better than a human, then it should be given the opportunity to do so. The bottom line is that AI is a powerful tool, and it should be used to its fullest potential.
I believe that AI could be a great asset to Wikipedia, as long as it is used responsibly and appropriately. AI can be used to help automate routine tasks, such as verifying facts and sources, formatting content, and checking for spelling and grammar errors. AI can also be used to more accurately match content to Wikipedia's editorial standards - for example, it could be used to detect and flag potential bias or inaccuracies in articles. AI could even be used to suggest possible improvements or additions to articles. Ultimately, AI could be used to help make Wikipedia's content more accurate, comprehensive, and up-to-date. However, AI should not be used in place of human editors. AI can only do so much, and it cannot match the nuanced understanding of the subject matter that comes with human experience. AI should be used as a supplement to human editors, not as a replacement.
Straight from the horse's mouth... jp×g 03:16, 1 December 2022 (UTC)[reply]
I liked Smallbones's "and furthermore" comments. -Nizil (talk) 04:44, 7 December 2022 (UTC)[reply]
I think the problem with AI is that we are not able (if only because of the potential risks, not to mention systematic errors in training) to let an AI have its own ideas, so that it could, for example, correct systematic errors in training data on its own. The programming interrogator is everything to the AI. If there were a real artificial intelligence with broader understanding, in the best (but also bad) case the programmer would be some type of god to it; in the worst case, some type of O'Brien, who tells it that 2+2 equals 5. Habitator terrae (talk) 22:49, 7 December 2022 (UTC)[reply]
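As an aside on the ORES article-quality model Smallbones mentions above: the following is a rough sketch of querying it, assuming the ORES v3 REST endpoint as documented at the time; the revision ID is a placeholder, and readers should verify the endpoint against current documentation.

```python
# Rough sketch: ask the ORES articlequality model for a prediction on a revision.
import requests

def article_quality(rev_id, wiki="enwiki"):
    url = f"https://ores.wikimedia.org/v3/scores/{wiki}/{rev_id}/articlequality"
    headers = {"User-Agent": "SignpostOresExample/0.1 (example script)"}  # placeholder UA
    resp = requests.get(url, headers=headers, timeout=30)
    resp.raise_for_status()
    score = resp.json()[wiki]["scores"][str(rev_id)]["articlequality"]["score"]
    return score["prediction"], score["probability"]

prediction, probabilities = article_quality(1122334455)  # placeholder revision ID
print("Predicted class:", prediction)        # e.g. "Stub", "C", "GA", "FA"
print("Class probabilities:", probabilities)
```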

Tips and tricks: (Wiki)break stuff (2,301 bytes · 💬)



Back to 2022-11-28

Discuss this story

Thanks for a great article. I can vouch for the importance of taking a break. In my case I felt I was pretty much driven away from the project in the middle of 2019 by the events that led to this arbitration case and came back, tentatively at first, some 18 months later, not knowing quite what to expect. I was delighted to see that things were ticking over much as before and the community had dealt robustly with (most of) the problems I had been struggling with. Wikipedia is bigger than any one of us and will be here long after we've all gone. It can cope without any one of us for a while. If you need a break - for any reason - take it. Wikipedia will still be here and your contributions will still be welcome when you get back. WaggersTALK 16:33, 28 November 2022 (UTC)[reply]


Last Wikibreak, I wrote an operetta. A few breaks before that, though, was what killed off WP:Featured sounds. Apparently I was needed back then. Adam Cuerden (talk)Has about 8.2% of all FPs. Currently celebrating his 600th FP! 22:30, 28 November 2022 (UTC)[reply]

I took numerous wikibreaks in 2021 as I was going through tribulations in my life that year, before settling into semi-retirement since then. I'm still editing, but at a much lower activity level than in late 2020 and early 2021. Wikibreaks and semi-retirement helped me regain most of my focus towards the important issues that I am facing in my life first and foremost. I have remained stressed ever since, sometimes losing focus, and I didn't get the mental help I wanted, but at least I'm trying. MarioJump83 (talk) 12:34, 6 December 2022 (UTC)[reply]

Traffic report: Musical deaths, murders, Princess Di's nominative determinism, and sports (809 bytes · 💬)



Back to 2022-11-28

Discuss this story