Operator: OsamaK (talk · contribs · SUL · edit count · logs · page moves · block log · rights log · ANI search)
Time filed: 10:42, Sunday July 29, 2012 (UTC)
Automatic, Supervised, or Manual: Automatic
Programming language(s): Python
Source code available: The script will be based on pywikipedia, but it's too simple to be published.
Function overview: To make it easier to access articles about websites, the English Wikipedia currently has many redirects from website URLs (e.g. twitter.com, thepiratebay.se, megaupload.com) to their corresponding articles (e.g. Twitter, The Pirate Bay, Megaupload). This bot will create more of these redirects.
Links to relevant discussions (where appropriate):
Edit period(s): One time.
Estimated number of pages affected: ~ 1,000
Exclusion compliant (Yes/No): Not applicable.
Already has a bot flag (Yes/No): Yes
Function details: The bot will depend on this list to create more redirects, making it easier and quicker to link to and search for articles about websites.
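Since the script itself was not published, the following is only a hypothetical sketch of the task as described above, written against pywikibot (the successor of the pywikipedia framework mentioned in the request); the function names are illustrative, not the bot's actual code:

```python
def redirect_wikitext(target):
    # A redirect page on the English Wikipedia is a single line of wikitext.
    return "#REDIRECT [[%s]]" % target

def create_redirects(pairs, summary="Bot: redirect from website URL to its article"):
    """Create one redirect page per (url, target) pair, skipping pages that
    already exist, as promised in the discussion below. Sketch only."""
    import pywikibot  # imported here so the pure helper above is testable offline
    site = pywikibot.Site("en", "wikipedia")
    for url, target in pairs:
        page = pywikibot.Page(site, url)
        if page.exists():
            continue  # never overwrite an existing page or redirect
        page.text = redirect_wikitext(target)
        page.save(summary=summary)
```

The edit summary and the exists-check mirror the commitments made in this request; everything else is an assumption.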
How was the list generated? I notice a few problems with it. For example, there are a lot of pages where the URL and the name are exactly the same (e.g. ebay.com, for one), or where only the case is changed (e.g. Break.com, SAYNOTO0870.COM). In most cases (!), Wikipedia will automatically redirect different casings of the article title to the correct page, so there is no need for many of those.

Additionally, a lot of the redirects have already been created (e.g. Tagged.com -> Tagged); is it possible to remove these from the list too, so we get a better idea of what edits the bot will actually carry out? Another problem I noticed is that some URLs appear twice (you have fab.com twice, once pointing to "Fab.com" and the other to "Fab (social network)"; you also have musicomh.com twice); these would need to be fixed. Similarly, you have sahibinden.com and Sahibinden.com both pointing to Sahibinden.com; as mentioned, due to the casing there is no point creating either of these.

I'm slightly confused by this entry:
"FOK!" fok.nl local
What is "local" doing in there? The following entry seems to be an invalid URL:
"Gorilla vs. Bear" gorillasvsbear.net
In summary, my thoughts at the moment are that the list is going to need a lot more work for this bot to work out. - Kingpin13 (talk) 15:24, 29 July 2012 (UTC)
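The cleanup steps requested above (drop case-only and identical entries, collapse duplicates, flag URLs with conflicting targets) can be sketched as a simple filter. This is a hypothetical helper, not part of the bot's actual code:

```python
def clean_redirect_list(pairs):
    """Filter (url, target) pairs per the review comments above.

    Drops entries where the URL differs from the target only by case
    (MediaWiki already resolves those), collapses exact duplicates, and
    flags URLs that map to more than one distinct target for manual review.
    Returns (kept, conflicts)."""
    by_url = {}
    conflicts = set()
    for url, target in pairs:
        if url.lower() == target.lower():
            continue  # identical or case-only difference: redirect adds nothing
        key = url.lower()
        if key in by_url and by_url[key] != target:
            conflicts.add(key)  # same URL, two different targets: needs a human
        else:
            by_url[key] = target
    kept = sorted((u, t) for u, t in by_url.items() if u not in conflicts)
    return kept, sorted(conflicts)
```

Entries that already exist as redirects would still need a separate on-wiki check (e.g. the exists-check in the bot itself), since that cannot be decided from the list alone.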
The list was generated from the |url= fields in article infoboxes. In the Gorilla vs. Bear example, it was the |url= field that was wrong (both the list entry and the url field are now fixed). The bot will not overwrite existing redirects, but the duplication with different letter cases was an actual mistake; I have removed all duplicated entries from the final list.--OsamaK (talk) 19:31, 29 July 2012 (UTC)

Approved. As I said above, this is a pretty straightforward task. Seems to be implemented well by OsamaK, and the number of redirects to be created is small enough that any problems can be dealt with without too much effort. - Kingpin13 (talk) 21:50, 31 July 2012 (UTC)
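The |url= extraction mentioned above can be sketched as a wikitext scan. The regex below is illustrative only; a real implementation would more robustly parse templates (e.g. with the mwparserfromhell library) and normalize values such as bracketed external links:

```python
import re

def infobox_url(wikitext):
    """Pull the |url= value from an infobox (hypothetical helper).

    Handles only the plain '| url = example.com' form; bracketed links,
    templates, and comments inside the field would need real parsing."""
    m = re.search(r"\|\s*url\s*=\s*(\S+)", wikitext)
    return m.group(1) if m else None
```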