The following discussion is an archived debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA. The result of the discussion was Approved.

Operator: Hasteur (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)

Time filed: 00:32, Wednesday July 24, 2013 (UTC)

Automatic, Supervised, or Manual: Automatic

Programming language(s): Python/PyWikipedia

Source code available: [1]

Function overview: Nominate for Speedy Deletion, under CSD:G13 (stale/abandoned), Articles for Creation submissions that have not been modified in 6 months. Notify the creator of each AfC submission that their submission is being nominated. Log nominations in a userspace page for auditing after the fact.

Links to relevant discussions (where appropriate): Wikipedia talk:WikiProject Articles for creation#Proposed: A Bot to traverse old AfCs and nominate for deletion

Edit period(s): Initially the bot will be triggered every hour, but once the backlog has been burned down the trigger period will be extended to 4 hours.

Estimated number of pages affected: 79,000 AfC pages initially, but after the backlog has been burned down, it will traverse the new AfC submissions that have slipped into eligibility (estimated at 200 pages a day).

Exclusion compliant (Yes/No): Yes

Already has a bot flag (Yes/No): No

Function details: Working on a maximum threshold of 100 nominations per run of the bot. If there are more than 150 nominations in the CSD:G13 section, the bot will terminate its run early (as we don't want to overflow the admin corps with loads of CSD nominations). AfC pages need to be edited, and the author should be notified if their AfC submission has been nominated (in the same way that an author is notified when a user uses Twinkle to nominate a page for G13 deletion); therefore I do not see an appropriate reason for Exclusion.
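For illustration, a minimal sketch of the throttle described above. The constants mirror the figures given in the function details, but the function and parameter names are hypothetical and not taken from the linked source [1]:

    MAX_NOMS_PER_RUN = 100   # cap on nominations per bot invocation
    CATEGORY_CEILING = 150   # terminate early past this many pending G13 noms

    def run_once(candidates, category_count, nominate):
        """Nominate pages until the per-run cap is hit or the CSD:G13
        section grows past the ceiling.  category_count and nominate are
        callables supplied by the bot framework (assumed here)."""
        nominated = 0
        for page in candidates:
            if nominated >= MAX_NOMS_PER_RUN or category_count() > CATEGORY_CEILING:
                break  # don't overflow the admin corps with CSD nominations
            nominate(page)  # tag the page and notify its creator
            nominated += 1
        return nominated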

Discussion

What's wrong with it taking 79 days? Considering this stuff has been sitting there for years now, managing to remove it in three months would be a very worthwhile accomplishment. This is not one of the things there is a rush about. There are at least two more critical backlogs: the Copyvio problems page, and the oldest unreviewed new articles. DGG ( talk ) 04:14, 27 July 2013 (UTC)
So if I accept the 50 article limit, will you be the on-call admin to clean out the backlog each time it fills up? 50 nominations is trivial compared to the throughput an individual user could achieve when they set their mind to it. You have been on the deleting end of several of the nominations I have made recently, and you object to a slightly higher demand on the CSD bandwidth? Hasteur (talk) 05:03, 27 July 2013 (UTC)
I assume the "oldest unreviewed new articles" you are referring to is Category:Pending AfC submissions. I would note that the oldest unreviewed AfC submission is 5 days old (and within the project's tolerances). Could reviewers do better on making sure that items moved out of the AfC space into article space are assessed better? Sure. Is that part of CSD:G13 and nominations for it? No. Hasteur (talk) 05:03, 27 July 2013 (UTC)
I would certainly lend my own efforts to actually deleting tagged articles if this were passed. I gave up tagging by hand because I won't be part of a pointlessly painful and ineffective process, but what's proposed here is probably about right--the minimum force necessary to actually address an issue I consider to be serious. --j⚛e deckertalk 15:27, 29 July 2013 (UTC)
And I would agree with "79 days isn't that long". If we could actually reach steady-state in 180 days, I'd be thrilled. --j⚛e deckertalk 15:31, 29 July 2013 (UTC)
"activist editors" -- do you mean people who would nominate other articles for deletion that need deletion even more than these, because they're in mainspace? I would think concentrating on those first would be a good thing to do, and anything that would slow down our removing them from mainspace is unconstructive. DGG ( talk ) 04:09, 27 July 2013 (UTC)
In that context, activist editors are people who are not the bot but are also crawling through the potential G13 articles (or ones that just became 180 days stale). These pages never hit article/main space. How the admins choose to prioritize the backlog is entirely their decision; however, having more than 50 entries in the overall CSD category causes a backlog notice to show up on the admin dashboard. Nudging admins to take care of backlogs is why they're entrusted with the supply closet keys. Hasteur (talk) 05:03, 27 July 2013 (UTC)
Not true! Just because there is a backlog does not mean you drop everything to deal with it. How many active admins do we have today? What about 6 years ago? How much has the encyclopedia grown in that time?
It was observed that stale AfCs have been scraped out of Wikipedia and replicated onto illegitimate mirrors. In the discussions leading up to CSD:G13 becoming a valid reason, there was discussion about how we would go about cleaning up the stale submissions. The idea for a bot was endorsed, but never created. I simply decided to pick up the idea and run with it after it was pointed out in the AfC discussion. Hasteur (talk) 15:16, 27 July 2013 (UTC)
I'm sorry, I don't know exactly how the admin backlogs work, but if the bot can identify pending speedy deletions by type, shouldn't admins have that same ability? Wouldn't they be able to completely halt the bot by simply not processing any of the G13 nominations? If the bot will not push the backlog above 150, then leaving the current 150 G13 nominations in the queue should stop the bot, or am I reading this completely wrong? VanIsaacWS Vexcontribs 09:51, 27 July 2013 (UTC)
@Vanisaac: There's Category:Candidates for speedy deletion as abandoned AfC submissions, which is where the G13 nominations land, and Category:Candidates for speedy deletion. If admins refuse to process the pending nominations in the category, the next step is for someone who is tracking the category to post a nag notice on WP:AN to remind the admins that these valid nominations, which the community has endorsed, are waiting and should be dealt with. Hasteur (talk) 15:06, 27 July 2013 (UTC)
So the concern is that it would flood the general candidates for speedy deletion category. I'm wondering if a compromise could be to still process the larger number, so that a couple of ambitious admins wouldn't run out of G13 nominees before another run, but list them only in the G13 CfSD category and just make it a subcat during the initial run through the backlog. It wouldn't clog up the main CfSD category while we're processing this horde of abandoned content, but would still allow us to process nominations at a good enough rate that the limiting factor is how fast the admins can move through the backlog. VanIsaacWS Vexcontribs 20:04, 27 July 2013 (UTC)
@Vanisaac: The way the categories (and the associated template pages) work is that ((db-g13)) adds the categories Category:Candidates for speedy deletion as abandoned AfC submissions and Category:Candidates for speedy deletion to the page. That category is tied into Category:Candidates for speedy deletion and ((CSD-categories)). If it is explicitly desired, I could see a forked template, specific to the initial burndown by the bot, that does not include the main CSD category, so that while a page will show up in the AfC subcategory and the count pages, it does not blow the entire backlog out of the water. I'd like to get some feedback from administrators/BAG before we take such a drastic step, as (if I understand correctly) changing the nomination template on this task would require a new BRFA since it would no longer follow the description. Hasteur (talk) 23:34, 27 July 2013 (UTC)
Why couldn't it just check for ((Db-afc-notice)) on the user talk page? VanIsaacWS Vexcontribs 23:28, 27 July 2013 (UTC)
@Vanisaac: A couple of problems with that. First, the documentation says that the warning needs to be ((subst))-ed to work properly; taking a look at the "invocations" of the template [2], we see that it's only been "used" by linkage/transclusion/redirection 1 time, yet I know that the template has been used a great many times beyond that (for example: User talk:Wolfjc, where I nominated via Twinkle). Second, without maintaining a running tally of which editors have been nagged in the past 48 hours, the computational complexity becomes enormous. Also, what happens if during a run the bot nominates one submission by an author and then comes across another submission by the same author? Hasteur (talk) 23:45, 27 July 2013 (UTC)
I'm certainly not arguing against this bot in any way. I specifically supported it in the original G13 discussion at CSD, and I created the refund/G13 template, as well as significantly contributing to Db-afc-notice, so I am 100% behind this. I just want to make sure we do due diligence and ensure we've thought of as much as possible before we start implementing things. VanIsaacWS Vexcontribs 10:40, 28 July 2013 (UTC)

Comment Maybe we are approaching the problem the wrong way. Note that G13 is a black-and-white criterion; no human judgement is needed to determine whether or not the article meets it. Rather than having a bot nominate en masse and trusting the admins to clean up the G13 backlog, what we need to do is have regular editors patrol the eligible category and tag submissions for G13, and then have an adminbot perform the deletions.

This approach has the distinct advantage that the admin corps would not be affected at all. Editors would be free to plod through the backlog at their own pace, without quotas, maximum G13 counts, or other complications.

The problems I see with this approach are twofold. The first is that if a submission meets two or more criteria, including G13, one of the other criteria should generally be the one used. It is much more important to know that it is a deleted copyvio, for example, than a mere old draft. The second is that if there is a bug in the adminbot, the resulting issues could plausibly be much more severe.

Food for thought. Tazerdadog (talk) 05:39, 31 July 2013 (UTC)

Implementation Notes

In what sequence are you planning to do them? If there is a known sequence, those who want to save articles will be able to go through before the bot. (I suggest it would be most reasonable to start with the oldest.)
I still have major doubts about this, and will probably oppose it, but I could give my reasons better if I knew how you plan to do it. I suggest that even if one does not want to bother oneself with seeing which G13s can be rescued, there is at least one other critical step in evaluating a G13: check whether there is already an article, perhaps under a variant title, because if there is, there is about a 50% chance that it will be bad enough to need deletion, and this is an excellent chance to identify it. DGG ( talk ) 04:03, 27 July 2013 (UTC)
During the initial phase I intend to run the bot starting with the oldest categories and moving forward. I know that as of the last time I checked, 2008 was cleared and January 2009 was cleared. G13 (and this bot) only focuses on the stale AfCs. I would imagine that other tasks, such as "Articles with few inbound links", would cover those AfCs that were moved into article space, as opposed to the declined/stale AfCs this bot targets. Once the titanic backlog is burned down, the bot would run every 4 hours (to nominate only the ones that just became eligible for G13). As was said in multiple places, "A submission that had potential 6 months ago that never moved forward is probably still going to have potential 6 months from now and will probably have a new submission by then". Hasteur (talk) 04:21, 27 July 2013 (UTC)

To test all the use cases I am requesting the following trial cases (I understand if I need to break my allotment down to exercise these test cases)

  1. Hitting the CSD:G13 category limit (i.e. Pre-existing nominations + My nominations = 50)
  2. Hitting the Bot invocation limit (i.e. My Nominations = 50)
  3. Hitting no limit (i.e. My Nominations < 50 and My Nominations + Pre-existing nominations < 50)

Please let me know if I'm being far too verbose and anxious about this. I really think this is a good thing for the community. Hasteur (talk) 15:57, 27 July 2013 (UTC)

Hasteur, you are doing a great job of balancing all the various opinions and concerns. Be as verbose as you want. Storage bytes are cheap. —Anne Delong (talk) 17:41, 27 July 2013 (UTC)

2013-07-31: Ok, I've done some restructuring in the logic of this bot to tie it in with Wikipedia:Bots/Requests for approval/HasteurBot 2. The workflow is this:

  1. The task 2 bot goes out across all the old AfC submissions and sends a notice to the creator that the article is eligible for G13 and could be deleted soon, to attempt to get them to do something about the submission. This seeds a database with potential nominations that the bot could make.
  2. This task will only nominate submissions for which at least 30 days have passed since the notification date. The bot will nominate up to 50 in a single pass. If there are already nominations in the CSD category, it will deduct that many from the nominations that the bot will put up, so that the bot does not push the category membership over 50. The bot will update its database to indicate the datetime at which it nominated the article.
  3. A third backend process queries Wikipedia to perform maintenance on the g13_records database: removing rows where the page has been edited and is therefore no longer potentially G13 eligible, rows where the submission was turned into an article or a redirect to an existing article, and rows where the submission was deleted (a sketch of this cleanup pass follows below). I do not know if I need to get BRFA approval for this task, because it does not edit Wikipedia but only queries the servers.
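For what it's worth, a rough sketch of what that third cleanup pass might look like. The g13_records name comes from the description above; the table schema, column names, and the eligibility callback are assumptions, not the actual HasteurBot code:

    import sqlite3

    def prune_g13_records(db_path, still_eligible):
        """Drop rows whose page has been edited, turned into an article
        or redirect, or deleted since the creator was notified."""
        conn = sqlite3.connect(db_path)
        rows = conn.execute("SELECT page_title FROM g13_records").fetchall()
        for (title,) in rows:
            # still_eligible() stands in for a read-only query against the
            # live wiki API that re-checks the page's current state.
            if not still_eligible(title):
                conn.execute("DELETE FROM g13_records WHERE page_title = ?",
                             (title,))
        conn.commit()
        conn.close()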

I will be pinging all the users who have responded so far to this proposed bot in hopes that the previously stated objections can be resolved. Hasteur (talk) 13:18, 31 July 2013 (UTC)

Correction: I see you do intend to have the bot do it in 2 or more batches. I think that's a good idea, & not just for the test period. When the daily number is increased, multiple small batches would seem the best way to increase it--I think it would remove the objections to doing 100 in one day. DGG ( talk ) 18:55, 31 July 2013 (UTC)
Once the nom-bot has items seeded into its database, it would take 50 minus the existing membership of the G13 nominated category during each pass (running on an automated schedule, every 2~4 hours?), so that at most any single invocation of the bot would dredge up 50. It'll keep feeding the category as long as there's space and there are ripe "nudged" submissions in its database. The 30-day window slides along as the days pass. Hasteur (talk) 19:20, 31 July 2013 (UTC)
  • Perhaps a maintenance category for pages that have been submitted, declined, and have been stale for 60 days (1/3 of the way to abandoned) for editors to look through and try to find an adopter for. Something we can certainly discuss at WP:AFC. Hasteur (talk) 18:43, 31 July 2013 (UTC)
And this will also give us a chance to get some G11s. Most reviewers are very reasonably reluctant to immediately delete G11s unless they are outrageous, because they might get fixed; but for them, 60 days is plenty. DGG ( talk ) 18:55, 31 July 2013 (UTC)

I skimmed a bit of the above discussion, so pardon me if I'm asking something already answered. It seems the issue brought up is overloading admins with work. So if we have a clear X-month cut-off for the declined AfC submissions, can the bot just delete the pages once the author has been notified and reasonable time has passed? Do they still need human review if they were reviewed once already (declined)? (Are we not being too optimistic about future improvements for 80k abandoned pages?) If that's an issue, could the bot delete only submissions with specific concerns (like an ad) and tag the others? Besides all that, I don't see major objections to a controlled G13ing (I'd like to see task #2 running first though). —  HELLKNOWZ  ▎TALK 21:02, 31 July 2013 (UTC)

For me, I don't mind overloading admins, but I do care about there otherwise being no live eyes on a G13'd page, so at least an admin should look before deleting. I agree with points 1 and 3 of the proposed implementation, as I am noticing quite a few people responding to the speedy delete notification they get. They then ask for a WP:REFUND. Graeme Bartlett (talk)
I'd rather have a human (and preferably one who is good at evaluating consensus) make the final delete decision than a bot. Task #2 has to run to seed the database before task 1 can be run. The idea is to only nominate articles that meet the 180 days for G13 and have been given 30 further days before the bot nominates. It's conceivable that shortly after the bot nudges (and gets the article in the category), a user could come along and request the G13 by hand (or via Twinkle). At that point it's out of the bot's hands. Hasteur (talk) 00:06, 1 August 2013 (UTC)
Case in point: Wikipedia talk:Articles for creation/Accountkeeper, which was nudged here. I didn't see the article go up for CSD:G13, so I don't know which regular user snagged it, but this demonstrates that it's conceivable that organic users could get to these candidate articles before the bot does. Hasteur (talk) 20:19, 1 August 2013 (UTC)
My concern is just to get the stale drafts - some of which might contain copyvio or BLP violations - tagged with G13 in a timely manner so that they are queued for the decision to delete, which remains with an admin. I fully support the orderly way Hasteur proposes to deal with the large "trash heap" of older-than-6-months drafts. In short, I fully support the implementation as proposed. Roger (Dodger67) (talk) 11:36, 1 August 2013 (UTC)

Trial

Approved for trial (1 run (for now)). Please provide a link to the relevant contributions and/or diffs when the trial is complete. With a reasonable number of CSDs and user pre-notifications (task 2). Let us see how this performs and if any issues are raised. The task itself lends nicely to being done by bot. I am fine with the technical details. And it seems workload and review concerns are addressed as well as they could be in this case. —  HELLKNOWZ  ▎TALK 21:04, 1 August 2013 (UTC)

Ok here is my trial plan:
  1. Modify the notice_date in the backend database to make sure that Wikipedia talk:Articles for creation/ Atlanta Bomb Squad Dance Production qualifies for the query, since the seeding occurred yesterday (sketched below).
  2. Modify the query to explicitly be only 1 article
  3. Run the nomination bot, which will nominate the submission and notify the user that the nomination has taken place.
I will report back with the diffs after the trial has been conducted. Hasteur (talk) 21:27, 1 August 2013 (UTC)
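Step 1's back-dating might look something like the following one-off; the database file, table, and column names are assumptions, as only the notice_date manipulation itself is described above:

    import sqlite3

    # Hypothetical one-off tweak for the trial: back-date the notice so the
    # nomination query picks the page up immediately.
    conn = sqlite3.connect("g13_records.db")
    conn.execute(
        "UPDATE g13_records SET notice_date = datetime('now', '-31 days') "
        "WHERE page_title = ?",
        ("Wikipedia talk:Articles for creation/ Atlanta Bomb Squad Dance Production",),
    )
    conn.commit()
    conn.close()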
By "1 run" I meant 1 batch (hitting the CFD limit), not 1 page. :) —  HELLKNOWZ  ▎TALK 21:32, 1 August 2013 (UTC)[reply]
Ok, then what I will do (because I know I don't have that many in my list) is manipulate the records in the sqlite database for everything I nudged in the task 2 job yesterday. I don't know if the 8 records I have will hit the max-DBG13 limit, but the query is fairly straightforward in the code. Trying to follow the rules of the trial as closely as possible. Hasteur (talk) 21:38, 1 August 2013 (UTC)
BRFAs are not as WP:BURO as we make them out to be. Trial is for you to convince us and yourself that it works both technically and within consensus, without unexpected results. The exact details are up to you; I'm deliberately vague with the number of pages to edit. If you need to post more notices via task 2, that is fine. If you need time, that's fine. 8 records isn't terribly many; I was thinking closer to 50 or so (whether at once or in batches). —  HELLKNOWZ  ▎TALK 21:45, 1 August 2013 (UTC)
Trial complete. Ok, here's the postmortem from 00:09 to 00:44:
One code flaw in the logic dealing with committing the cursor for updating the nomination time caused an interruption from 00:12 to 00:17. Also, the edit summary was not being asserted on user talk page notifications, so that was corrected. I notified the affected users myself.
Toolserver appeared to hang and prevented me from getting a definite response on the commit on the DB cursor between 00:25 and 00:41. I notified the user affected by the talk notification myself. Also corrected a defect where anything not in the User talk namespace was marked as a minor edit. I do not consider a CSD nomination a minor edit and have forced all edits coming out of the g13_nom_bot.py script to be non-minor.
Since the bot is doing CSDs, maybe not flag the CSDing edits as "bot" as well? —  HELLKNOWZ  ▎TALK 18:52, 4 August 2013 (UTC)
The framework I'm using asserts bot to the API. Hasteur (talk) 02:19, 5 August 2013 (UTC)
The code cut off at 40 nominations, as I configured it to (I always err on the side of caution). Hasteur (talk) 00:54, 2 August 2013 (UTC)

Approved for extended trial (a few more batches/runs, at your discretion). Please provide a link to the relevant contributions and/or diffs when the trial is complete. Just to make sure the entire process is satisfactory and no new immediate stuff pops up. —  HELLKNOWZ  ▎TALK 18:52, 4 August 2013 (UTC)

Trial complete. Ok, I ran some more through the system. Everything looks good. Hasteur (talk) 02:34, 5 August 2013 (UTC)

((BAGAssistanceNeeded)) Pinging BAG. The potential G13 category has filled out quite nicely (currently 21k articles in it, and some are being handled manually by regular editors), so in reality the only thing left is to arm the periodic firing portion of this script (targeted for every 4 hours) to attempt to nominate up to the 50-candidates active limit. Without me prematurely advancing the notification date in the database records, we're looking at no earlier than September 5th before the current collection of records I've notified on will be nominated. In the meantime, I'm crawling through the AfC submissions by date, advancing through the months one at a time. Hasteur (talk) 22:20, 8 August 2013 (UTC)

Personally I oppose this task. For that reason I will take no action here as a BAG member. Without reading all of the above in detail, I see no point in tagging everything that is already in a category for deletion (albeit slowly). Admins know the category is there and know that its contents are applicable for deletion. In my mind there is no point in double-handling the work. Notifying the creator is a nice touch, but why not just do this for all of the items in the category rather than also tag them for deletion? ·addshore· talk to me! 16:34, 10 August 2013 (UTC)
@Addshore: That's the thing: these articles are in the "G13 eligible" category, but not in the "Candidates for Speedy Deletion as abandoned AfC submissions" category. Ideally we'd like a real-life editor to look at each and every potential page in the eligible category, but with the giant mass that currently exists, this is not feasible. One task of the bot notifies the article creator that their article is eligible for G13 and encourages them to do something about it. If the article is still not edited 30 days later, this task will then nominate it for G13 (and notify the creator that the page has been nominated). This task specifically nominates only enough articles to hit 50 in the Candidates for Speedy Deletion category, which is part of the admin backlog. The bot's purpose (if somewhat longer in form, so as to respond to and nullify the previous objections) is to take the same action that a user would if they were using Twinkle to nominate the page for deletion. Please reconsider, and do not respond to tickets if you're going to admit that you haven't read the consensus and negotiation I've had since proposing this task. Hasteur (talk) 18:10, 10 August 2013 (UTC)
@Hasteur: Are the two categories not just basically synonymous? Indeed, ideally we would like an editor to look at every article before it gets deleted, so why not allow that to happen? There doesn't seem to be any need to rush at all; AFCs that are sitting there are not doing any harm. In the grand scheme of things 21k articles isn't that great an amount; I have seen people tend to far more orphaned articles than that in a year, and there is no reason this might not also happen with these AFCs. I stand by what I have said: notifying all of the creators of the articles that are now eligible is good. I still don't think I can support a bot nominating them all for deletion. ·addshore· talk to me! 18:34, 10 August 2013 (UTC)
@Addshore: *grumbles* Category:G13 eligible AfC submissions represents potential G13 nominations that have met the basic criteria for G13. Category:Candidates for speedy deletion as abandoned AfC submissions represents nominated G13s. 21k was the previous count, which is now 24.4k and still rising. The bot's behavior (which I observe you still haven't read all the appropriate information on) is to go ahead and make the G13 nomination on the strict logic that the last edit date was more than 180 days ago and it has been at least 30 days since the page creator was notified that their page could be deleted. Yes, it's entirely possible that regular editors could take activist roles and clean out the nominations prior to hitting the 30-days-notified mark; however, a backstop process needs to be authorized so that the ones where we have made significant effort in contacting the creator are dealt with. We're getting (on average) about 300 new AFC submissions a day; without any cleanup happening to resolve these abandoned and stale pages, AfC and Wikipedia as a whole are going to be drowned in a flood of sub-standard AfC pages that get scraped out and mirrored to illegitimate mirrors. I am further being activist about getting this approved due to Wikipedia:Bots/Requests for approval/HasteurBot 2 being effectively approved, but held up on this task being approved. Hasteur (talk) 19:43, 10 August 2013 (UTC)
I have read the above and my opinion is still the same. Either we should just admit that the majority of the AFCs in the category are probably never going to be checked by a user and delete them all, or we should presume/hope they will be and leave the category as it is for users to review articles and nominate them for deletion. Again, I see the pinging of authors as a good thing, just not the nomination. ·addshore· talk to me! 19:59, 10 August 2013 (UTC)
I'd like to resubmit my idea for an adminbot in light of addshore's concerns. G13 is a black-and-white criterion. You can argue whether something is blatant advertising; you cannot really argue whether a submission is G13 eligible. We should have regular editors patrol this category and tag articles for G13, and then have an adminbot do the deletions. Every article would get looked at by a human (albeit a non-admin). The adminbot confirms G13 eligibility, and then does the deletion after warning the user and waiting an appropriate length of time. The tagging editor would be responsible for the deletion, and would be so noted in both the talk page warnings and the deletion log. The admins aren't overwhelmed, and every page is looked at by a human. Addressing Hasteur's complaints about this when I suggested it above:
The page is G13 eligible, as checked by this hypothetical adminbot. Admins have broad consensus to delete these pages. Therefore, I must reject your premise that the deletion is particularly controversial.
The bot would refer them to the editor who added the tag in its messages. It might even use a special signature like "Added on behalf of User:tagging user by User:hypothetical adminbot (timestamp)".
It is (or should be) the tagging editor’s job to look for copyvio prior to tagging. However, such articles will slip through. I contend that this is a relatively minor problem that can be adequately handled by the admin corps.
I’m sure we can find someone who does have the sysop bit to run it.
The adminbot could do the same: notifying the editor, waiting thirty days (during which anyone could make an edit), and then deleting if the G13 criterion still holds.

Tazerdadog (talk) 07:57, 13 August 2013 (UTC)

With respect to Tazerdadog, the following responses should be noted:
  1. Nominating an article for CSD by automation is a relatively uncontroversial action, whereas actually deleting by automation is controversial (as evidenced by the fact that Wikipedia:Bots/Requests for approval/7SeriesBOT 3, operated by an admin, was turned down).
  2. The viewpoint has been expressed several times, by both editors and admins, that they do not want an automated process deleting articles. They want a human to sit down and evaluate nominated articles.
  3. I would have low confidence in any admin-bot operator who did not write the code they were running.
  4. I've already made significant effort in negotiating a tentative consensus on this, and therefore it is not wise to just blow the entire described procedure out of the water by changing the requirements.
For these reasons I am declining to take onboard such a radical change in requirements. I have no objection to another editor writing a G13-deleting bot, but I do not think that it will ever attain consensus to do such tasks. Again I refer to the description of what bots are allowed to do: "A bot is an automated or semi-automated tool that carries out repetitive and mundane tasks to maintain the ... English Wikipedia". Hasteur (talk) 12:00, 13 August 2013 (UTC)

Break

The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.


(Sorry, I haven't had time to read the whole thread.) @Hasteur: would it be possible for your bot to tag articles which are included in the categories for submissions declined as "joke", "test edit", "hoax", and so on, so as to get the uncontroversial stuff out first? These categories include enough work for the admins, and meanwhile we would have enough time to discuss and find a solution/consensus on how to proceed with the rest of the articles. mabdul 14:44, 14 August 2013 (UTC)

Mabdul: So that I understand the request correctly, you're asking that for all articles in the "Declined as joke/test edit/hoax/blank" categories, the bot:
  1. Bypass the 180 day check
  2. Bypass the 30 day notified check
  3. Nominate for CSD:G2 pages that were declined as "test edit" or "blank"
  4. Nominate for CSD:G3 pages that were declined as "joke" or "hoax".
I'm uncomfortable with these types of bypass for the simple reason that a submission could have been declined for a questionable reason, after which the editor went back and fixed the problem (invalidating the decline reason) and then didn't come back. If this is a reasonable action then I think a limited-duration bot request can be crafted up, but the task would have an unreasonably high error rate as compared to a straight G13 nomination (and the process I've crafted so far). Hasteur (talk) 14:58, 14 August 2013 (UTC)
No, you misunderstood me. I never intended to rewrite our BRFA to bypass any checks.
Most of these mentioned submissions are very old and thus would be totally valid if nominated (after a check) under G13.
I would start with a more selective way of tagging before going through every G13 eligible submission...
As these are uncontroversial, nobody will have anything against it. While this mess is being cleaned up, we can safely discuss how to resolve the rest of the submissions. mabdul 15:05, 14 August 2013 (UTC)
Mabdul: The problem still stands that while they may have been declined at one point during their history as blank/test/joke/hoax, there's the distinct possibility that we could receive a false positive due to the author making some improvements but not taking the final step of re-submitting the page. Let's table this sidebar discussion to WT:AFC (or some other appropriate venue) to establish the nuts and bolts of the proposal before proposing it as a new task for the bot. Ok? Hasteur (talk) 15:16, 14 August 2013 (UTC)
And why not checking there wasn't any non bot edit after declining? mabdul 15:45, 14 August 2013 (UTC)[reply]
Mabdul: That's very broken English, but I think the question is "Why do you not check whether there was any non-bot edit after declining?" The problem with that is we would then have to go into each revision to find the revision where the decline was enacted, and then by hand see which users have edited since then. Again, this is getting into the nuts and bolts of a subtask/edge case. Let's stop the discussion here and open one at WT:AFC regarding how we should handle this exception case. Pending a significant objection, I'm going to close off this sub-discussion in 1 hour, as I'm trying to get the cats herded together so that they can walk through the eye of a needle for approval. Hasteur (talk) 15:50, 14 August 2013 (UTC)
The discussion above is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Sorry, the above has brought one more small question to mind! What happens if there is an AFC, the user gets notified, and after 29 days the AFC page gets an edit? Would the bot still nominate it for deletion? ·addshore· talk to me! 19:16, 14 August 2013 (UTC)
As long as the page has been edited before the bot comes around to nominate it (which invalidates the base G13 rule and the double-check rule ensuring we've really waited the 180+30-day minimum), the bot will not nominate the page. When the bot is lining up to nominate, it checks the timestamp of the latest revision to verify that the rule is still valid. If it's not, the bot removes the "Page-Creator" record from the database and moves along with no hard feelings [3]. If the article shows up 6 months later back in the danger zone, the bot is more than happy to do the cycle again. Hasteur (talk) 19:26, 14 August 2013 (UTC)
Lovely! ·addshore· talk to me! 19:32, 14 August 2013 (UTC)
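A minimal sketch of the final re-check Hasteur describes above, assuming timestamps are handled as UTC datetimes (the names and the wiring around them are hypothetical; only the 180-day rule comes from the discussion):

    from datetime import datetime, timedelta

    STALE_AFTER = timedelta(days=180)  # the base G13 staleness rule

    def still_nominable(last_edit_utc, now=None):
        """True only if the page has gone untouched for the full 180 days;
        otherwise the bot would drop its database record and move along."""
        now = now or datetime.utcnow()
        return now - last_edit_utc >= STALE_AFTER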

Ok, TLed the BAG assistance requested. The bot will get approved in the fullness of time, even if that is just shortly before the heat death of the universe. Hasteur (talk) 00:43, 17 August 2013 (UTC) ((BAGAssistanceNeeded)) Hellknowz, I think the discussion and the after-discussion and the after-after-discussion have died down. Can we move onward to approval, or detach task 2 and get that approved? I'm starting to be pinged on my talk page about HasteurBot showing up on recent changes when bots are supposed to be filtered out. Hasteur (talk) 17:08, 18 August 2013 (UTC)

 Approved. After some deliberation, I am going with an approval on this one with the process as it is now. There is no arguing the pages in the original category are stale and most are unsuitable for inclusion, so a systematic and controlled pruning is a beneficial task. Working through the CSD-eligible category and moving pages to actual CSD appears to be the best venue to do so. Concerns were expressed about extra workload, extra review work, and "double" nominations, but as with most other bot work we have to compromise between full automation and human review. I believe the current method of notifying editors and then tagging for deletion after a history check is suitable, and consensus seems to be that this works; at least the outstanding issues are addressed. A few editors expressed that the bot should delete the pages outright, but as the trial edits and botop response indicate, some of the CSDed articles get rescued, and we cannot have a bot delete false positives. A few other suggestions, ideas, and process changes were proposed, but while some were taken onboard, others either exceeded this BRFA's scope or added extra issues that are best addressed separately (as in potential future BRFAs). I tried my best to weigh whether I should ask the botop to pursue these, but at this point I am fine with the way it is, although I wouldn't mind future improvements, such as not pinging indef-blocked users, checking against existing articles, or correlating for blatant copyvios, etc. As a final note, I'd like to stress again that this is dealing with new editors and a contentious venue (deletion), so please exercise good WP:BOTCOMM throughout. —  HELLKNOWZ  ▎TALK 18:48, 18 August 2013 (UTC)

The above discussion is preserved as an archive of the debate. Please do not modify it. To request review of this BRFA, please start a new section at WT:BRFA.