From DoomWiki.org

Revision as of 20:28, 27 March 2018

Archived discussions

External link editing

Hi Xymph.  I'm not sure about deleting items from Sources sections [1].  That probably means it was the source of some content in that article specifically (e.g. the readme, or some trivia in the DW reviews).    Ryan W (usually gone) 13:31, 24 February 2017 (CST)

Well, you should know, as you created the initial revision which already contained the link. :) Upon further reading, it is related to the "Inspiration and development" mention about "Nathrath's Fab Four". That just wasn't clear, as other Sources entries often have more info like this or use a reference to link it to the place where it's used. With neither, I got the impression the idgames link was used just like here, and that seemed redundant as indicated by my removal reason.
I'll fix them up, including the mix-ups where M5 (in the other collection) didn't get a sources link – nor did M2 – while M6 (not in it) did. ;) --Xymph (talk) 15:00, 24 February 2017 (CST)
Thank you.  FTR I was prepared to hear reasoning for not listing certain sources, which has happened in the past.  And I should know, but I don't, because it was such a disorganized period that I'm thankful to have remembered the difference between square and curly brackets each time.  :>
The general point about footnotes is a good one (but a completely separate project, I think!).  Long story short, the situation you describe does confuse people, but it is a huge and probably non-bottable task to fix it.  I personally would be OK with uniform use of inline references if everyone had put some thought into the question and decided they were good.  I don't believe that is currently the case.    Ryan W (usually gone) 15:28, 24 February 2017 (CST)

Fixes

Thank you for all your fixes. Ducon (talk) 07:38, 21 May 2017 (CDT)

No problem, we all build on previous contributions. If you're going to do more map series, I can help create the map skeletons bot-wise. It improves consistency and leaves you more time to write information and descriptions, and add screenshots (which are still needed for a gazillion map pages). --Xymph (talk) 07:44, 21 May 2017 (CDT)
I’m taking screenshots for the top 100 WADs and for the cacowarded maps, so I create pages for such maps. Ducon (talk) 07:49, 21 May 2017 (CDT)
I'm working on the same, so we can collaborate, like we did implicitly yesterday :) and previously on IC2005. After you create an overview page like Vae Victus 2, you could leave it to me to create the individual map pages with all the basic info that the bot scripts can add. Then you resume with the screens and other info. Of course I'm not always available to do that immediately, so it takes perhaps a little planning ahead (and occasionally patience) for this to work for you too. --Xymph (talk) 08:12, 21 May 2017 (CDT)
Great. I’m going to play Classic episode 2 (https://www.doomworld.com/idgames/levels/doom/Ports/a-c/class_e2). Ducon (talk) 13:59, 21 May 2017 (CDT)
Fine, so create the WAD overview page with sufficient info, then I'll follow with the map pages. And for future WADs you gonna play, creating the overview page will again be my cue – no further posts here necessary. Have fun! --Xymph (talk) 14:25, 21 May 2017 (CDT)

Woodexial's Secrets

I'm using GZDoom's automap instead of an editor and it shows only 7 secrets instead of 8. Is Sector 188 the one with a cacodemon and a backpack in a pit? --ZeroTheEro (talk) 19:20, 22 May 2017 (CDT)

Yes, 188 is the sector just before that pit. I did only a fly-thru, clipping on/off as needed to get to places and trigger the secrets, but the automap in ZDoom does show x/8 secrets too. I'm not sure how to make it reveal the sector number, nor why you numbered most of them differently than they are stored in the map. --Xymph (talk) 02:31, 23 May 2017 (CDT)
To put it simply, the sector numbers didn't match the secrets because I have no idea how to check them. I'm just guessing at the sector numbers, since I list the secrets by gradual progression. Please correct me if I am wrong. --ZeroTheEro (talk) 04:28, 23 May 2017 (CDT)
I see. But in skeleton pages the secret sectors are simply in numeric order rather than play progression. I'd rather not be correcting them all the time though. ;) To match the numbers to sectors in the map, I use DeePsea: switch to sector mode with the mouse scroll-wheel, use the 'j' key to bring up the jump dialog, and enter the number to be taken to that highlighted sector. There may be other/better ways in other editors. --Xymph (talk) 04:44, 23 May 2017 (CDT)
Thanks, yo. --ZeroTheEro (talk) 13:23, 23 May 2017 (CDT)
Hi Xymph.  Not to open another can of worms here  ;>  but you mean that bot-generated skeletons use numeric order.  IME human editors adding descriptions tend to reorder them along the base walkthrough route, in hopes that's clearer for the newbies.    Ryan W (usually gone) 11:30, 26 May 2017 (CDT)
That wording is indeed more accurate, and what I meant in general. Although it has also happened that a contributor added the sector numbers manually without descriptions, e.g. from DeePsea's search results for sector type, and those would also be in numeric order. I'd still call those skeleton lists, but once a contributor has added descriptions I'd no longer consider that section a skeleton. And of course the order can then be any that is best suited for the pertaining map. --Xymph (talk) 13:11, 26 May 2017 (CDT)

XymphBot and skeletons (again)

Hi Xymph,

I'm just now beginning to examine some bot edits in more detail.  It's still amazing how many features you managed to add in a short time.  Anyone who has started a few dozen pages from scratch will appreciate the massive efficiency gain (with respect to both actual writing and interlarded decompression :D)

I notice that you sometimes return to address an issue manually, e.g.:

  1. erasing a footnote
  2. removing a category
  3. adding a template that was already present
  4. transposing a heading with a paragraph

You've forgotten more about programming than I'll ever know  :>  so perhaps you added validations to the code as needed, or the bot itself alerted you when existing text didn't match the standard form.  But if that wasn't possible (because string parsing is very hard), I would like to help by finding any remaining similar issues.  Is there a way to narrow down relevant edits of each type, e.g. by time stamps or canned edit summaries?

Yes, I've complained about gameplay articles attracting hoi polloi revisions because of the low barrier to entry, but those still deserve to be evaluated in context by a human.

Anyway, I hope that makes some sense, and thanks again for caring about our project.    Ryan W (living fossil) 01:33, 12 July 2017 (CDT)

Thanks for the com(pli)ments. Also, I'm glad to see you becoming more active again. I don't follow "interlarded decompression" though, is that one of your Dr Who terms? ;)
Regarding manual touch-ups: yes sometimes the pattern matching is too tricky, but usually it's a trade-off between effort and gain.
1. Footnotes between/below Things tables are very rare so I simply didn't bother trying to code for them, as it would be harder than manually restoring them.
2. On most map pages the Statistics section is followed by other (stub-)sections. The MW API – and thus dmmpstBot.php – allows updating a single (sub)section, but category lines don't form their own section and are simply included in the last one. So in case of those Heretic pages the cats got stripped off by the bot edit because I didn't code for the possibility of any being in the Statistics section, as pages usually have External links headers and such in between. Additionally, conventionsBot.php warns me of category lines embedded somewhere else than at the end of pages. Lastly, the bot removals of the unnecessary "Levels by name" cats were actually convenient, as those could be dropped from all Heretic pages, while the firemace cat only needed to be re-added on 20 of them.
3. That happened because mapviewBot.php didn't take the empty line before the existing mapspots template into account. It's too long ago to remember exactly, but glancing over the code it could be that mapviewBot.php handles it correctly now. I must admit that sometimes I reviewed the interactive diffs too hastily and submitted edits needing manual clean-up which I then addressed in the pertaining script.
4. This page stood out because of the non-standard walkthrough formatting, but I didn't bother cleaning it up beyond the header move so that skeletonBot.php and mapviewBot.php wouldn't complain.
For cases 1 and 2, since dmmpstBot.php shows me the diff before actually submitting the edit, I know where to return to and manually restore things, and I also manually review all pages in a series afterwards. Thus I'm pretty sure I didn't leave (m)any loose ends of issues caused by bot edits, but I may have slipped up on occasion. I can't think of a pattern by which to locate them though.
For me the fun is both in programming and thus helping myself/others avoid tedious jobs (that were evidently hard to execute consistently and completely), and improving DW's content and presentation of factual information. After a year and a half, I do consider it my project too. :)
Some statistics: in my archive are 166 .ini files (including three for all single-level Doom, Doom II and Heretic PWADs) covering 3499 map pages.
So, many thanks for the offer of help and sorry for not being able to point you directly in the desired, uhh, direction. If you're still undeterred after all the above, here's how you (or anyone) can help. However, currently I'm immersed in INFO.c tools programming and subsequently transferring their output to the wiki, so combined with real-life interference it may take a while before I can return to bot-based map page updates. --Xymph (talk) 07:32, 12 July 2017 (CDT)
Thank you; that is quite informative.  There's certainly no rush, and I hope the bold text didn't imply otherwise.  (I did reread the discussions first, so I'm glad I hadn't overlooked something obvious in your updates.)
I'm learning that "active" is a floating-point quantity  :>  so e.g. I will hesitate to join a project systematically playtesting megawads.  That said, I already had your list bookmarked because some of those articles appeared hastily during an RFC, which I suspected would cause trouble for somebody.  Now that I know it's XymphBot, its previously successful PWADs can serve as examples to follow.
Re "decompression" — AFAICT it's a common experience that long, tedious jobs may be followed by an unrelated mindless task to clear one's head, before one can return to a different demanding activity.  It's hard to plan for, especially with real-life concerns present, so it's another reason bots are great.  :D    Ryan W (living fossil) 16:44, 12 July 2017 (CDT)

←←←

Hi again.  I've changed (hopefully improved) Template:Map skel slightly; does that affect XymphBot at all?  I dimly remembered that it did, but now I can't find the original discussion (oops...), only several cousins:

If I'm barking up the wrong tree again, just say so.  :>   I did not have to change any headings to match our newest articles, which presumably means you and the other "map regulars" had already applied your usual thoroughness.  In theory there might be additional tricks available, e.g. reading the slot automatically from the page title, but I would like to keep the markup simple because non-regulars sometimes create map pages (especially after Cacowards).    Ryan W (living fossil) 00:31, 17 July 2017 (CDT)

You may be remembering the remark at the end of Template:Things. But DMMPST includes its own copies of the templates it uses, so there is no direct consequence. However for consistency, I similarly dropped the ftp: stub and added the Adding custom music wikilink; and adopted the dummy NavboxTemplateGoesHere line for which I had to update navboxBot.php as well. Thanks for the – indeed – improvements. --Xymph (talk) 08:28, 17 July 2017 (CDT)

"Revert misguided spots formatting"

Hi Xymph.  This function appears not to detect multiple letters within the same parentheses.  For example, here the text ''(J, K)'' was unchanged.    Ryan W (living fossil) 06:52, 16 August 2017 (CDT)

Good catch, thanks. Fixed for all Ultimate Doom maps, found none in the Doom2 map pages. Let me know if you find some in another map series. --Xymph (talk) 08:20, 16 August 2017 (CDT)

November featured article thread

Which "both articles", RyanW?   — Hmm, probably just being sloppy with verb tenses again.  "Both articles will have been featured when all is said and done, so there's no harm done to the original goal of showcasing the good work in both".  Does that make sense?    Ryan W (living fossil) 17:49, 10 November 2017 (CST)

I don't dare to say 'no' anymore at this point. :-P Yes --Xymph (talk) 02:38, 11 November 2017 (CST)

Report on Wikipedia linking?

Hi Xymph,

Ideas for future scripts, anyone?   — okay, what about a list of all links to Wikipedia meeting certain criteria?

There was a brief IRC discussion about knock-on effects of Wikipedia deletions.  It can be argued that any game-related article aside from the main AAA releases is at risk.  I would like to remediate this somehow, and as Quasar pointed out, that is much easier before the article is deleted.  :7

It is of course your decision whether this sounds interesting (and whether you have time!).  I have vague ideas about implementation details, which can and should be finalized through general discussion.  I am asking you first, however, because if we don't have a bot then my initial proposal will change.  :D

Thanks for reading, as always.    Ryan W (living fossil) 17:03, 18 March 2018 (CDT)

Time, indeed, is of the essence :) and now that my .exe/iwad project is done, I wanted to turn to other things on my to-do list, on DoomWiki and elsewhere. But this sounds like a multi-day project rather than a multi-month one, so it's doable. Can you elaborate on what the script should do, across which pages against what criteria, producing what kind of output? --Xymph (talk) 06:51, 19 March 2018 (CDT)
I can respond, but it creates no obligation on your part.  :>
Output would be a one-to-many mapping like this:
Wikipedia page    Linked from
GeForce 6 series  Aspect ratio
                  Category:Node builders
                  Christoph Oelckers (Graf Zahl)
                  Doom 3
PTV (Family Guy)  Aspect ratio
                  File:Familyguy.jpg
                  Shareware
On any page where "encyclopedic" content is expected, sources might be added, so the territory would be namespaces 0, 4, 6, 10, 12, 14 except
  • Doom Wiki:Central Processing*
  • Doom Wiki:RFC*
  • Help:Spam/Archive
Criteria provide the most room for debate.  If I did this manually, I would want two things:
  1. The ability to identify every link, even if added indirectly (interwiki, transclusion, etc).
  2. A conservative first pass to filter out less likely candidate links.  Total volume is a bit too large to digest otherwise.
I assume #1 can be addressed by requesting pages in HTML format.  #2 is far more heuristic; options include
  • For each Wikipedia article, read Wikipedia's category system to identify related areas (gaming, computing, scripture, ...).
  • List only links formatted locally as references (citation tags, specific templates, ...).
  • List only links under whitelisted section titles.
I would choose the last approach.  The second would be very incomplete (indeed one motivation is to clean up sloppy citations from a decade ago).  The first would be beating the bot's head against the scale and nonlinearity of Wikipedia's categories, at least until it got blocked.  I did manage to compile promising titles already:
  • Additional information
  • Bibliography
  • Credits
  • Discography
  • Encyclopedia
  • External link
  • External links and references
  • External links:
  • External links
  • Footnotes
  • General info
  • General source material
  • Info
  • Information
  • Misc.
  • Misc
  • Miscellaneous
  • More information
  • Note
  • Notes
  • Other links
  • Other notes
  • Other references
  • Other
  • Publications
  • Published works
  • Reference/glossary section
  • References
  • Related articles
  • Related links
  • Related
  • Resources
  • See also
  • Source
  • Sources
  • Useful pages
In this context, a "section" ideally comprises all text until the next header at the same level.  Stopping at the next header at any level would likely be 99% accurate, however.
This post represents only my personal hypotheses, subject to later consensus-building and any needed compromises regarding what is or is not feasible to code.    Ryan W (living fossil) 17:34, 19 March 2018 (CDT)
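(As an editorial aside, the whitelisted-section approach described above could be sketched roughly as follows. This is purely hypothetical Python for illustration — the actual bot scripts are PHP, and the whitelist here is an abbreviated subset of the full list above. It implements the ideal behavior of stopping at the next header at the same or higher level, rather than the 99%-accurate shortcut of stopping at any header.)

```python
import re

# Abbreviated, hypothetical subset of the section-title whitelist above;
# matching is case-insensitive and tolerates a trailing colon.
WHITELIST = {"sources", "references", "external links", "see also", "notes"}

def whitelisted_sections(wikitext):
    """Yield (title, body) for each whitelisted section of a page's wikitext.

    A section runs from its header up to the next header at the same or a
    higher level (fewer '=' signs); lower-level subsections stay inside it.
    """
    # Match headers like "== External links ==" and capture level and title.
    headers = [(m.start(), len(m.group(1)), m.group(2).strip())
               for m in re.finditer(r"^(={2,6})\s*(.+?)\s*\1\s*$",
                                    wikitext, re.MULTILINE)]
    headers.append((len(wikitext), 0, ""))  # sentinel for the final section
    for i, (pos, level, title) in enumerate(headers[:-1]):
        if title.lower().rstrip(":") not in WHITELIST:
            continue
        end = len(wikitext)
        for npos, nlevel, _ in headers[i + 1:]:
            if nlevel and nlevel <= level:  # same or higher level closes it
                end = npos
                break
        yield title, wikitext[pos:end]
```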
Further idea: this probably needs a throttle to avoid mass image uploads, which link pages such as Fair use through templates, and in practice have section structures varying by user.  If the search returns more than 25 results (say) for the same external link in the same namespace, just write [26 pages in namespace 6] or something.  If it's clearly an unusual case, I'll deal with it manually.    Ryan W (living fossil) 23:18, 22 March 2018 (CDT)
The past two days of work on the script resulted in a fair amount of progress. The last run went over 5 of the 6 namespaces, excluding main – I don't have an idea of the link volume in the latter yet. Because runs over that many pages take a long time, the script logs its progress like this:
Attempting to log in... Success.
collecting Wikipedia namespaces:  34

processing 'Doom Wiki'...
processed:    43 pages

processing 'File'... 100 200 300 400 500
processing 'File' from File:32in24_MAP09_hilltop.png... 100 200 300 400 500
processing 'File' from File:AnimFace.gif... 100 200 300 400 500
processing 'File' from File:Cchest4_MAP13.png... 100 200 300 400 500
processing 'File' from File:CommunityChest-map12-mud.png... 100 200 300 400 500
processing 'File' from File:Congestion1024-map19-end.png... 100 200 300 400 500
processing 'File' from File:DTWID-LE_E2M8_map.png... 100 200 300 400 500
processing 'File' from File:Doom3_weapons_07.jpg... 100 200 300 400 500
processing 'File' from File:Doom_v0.3_thestore.png... 100 200 300 400 500
processing 'File' from File:E3M4_map.png... 100 200 300 400 500
processing 'File' from File:Eternity-e2m1-s.png... 100 200 300 400 500
processing 'File' from File:H2H-Xmas_MAP31_map.png... 100 200 300 400 500
processing 'File' from File:Heretic-e3m4-snail.png... 100 200 300 400 500
processing 'File' from File:Interception_MAP04_map.png... 100 200 300 400 500
processing 'File' from File:Kansams_Trial_E2M7_map.png... 100 200 300 400 500
processing 'File' from File:MM_MAP02_Secret.png... 100 200 300 400 500
processing 'File' from File:MementoMori2-map04-down.png... 100 200 300 400 500
processing 'File' from File:NDCP-map23-torches.png... 100 200 300 400 500
processing 'File' from File:Origwad_2.png... 100 200 300 400 500
processing 'File' from File:Psx-command-control-start.png... 100 200 300 400 500
processing 'File' from File:Requiem-map27-end.png... 100 200 300 400 500
processing 'File' from File:Scythe2_MAP03_map.png... 100 200 300 400 500
processing 'File' from File:Stardate20X6.png... 100 200 300 400 500
processing 'File' from File:TheEvilUnleashed-e3m5-cage.png... 100 200 300 400 500
processing 'File' from File:Vile_Flesh_MAP11_map.png... 100 200
processed: 12260 pages

processing 'Template'... 100 200 300 400 500
processing 'Template' from Template:MAYhem_1500_21-30... 100 200 300 400
processed:   940 pages

processing 'Help'...
processed:     8 pages

processing 'Category'... 100 200 300 400 500
processing 'Category' from Category:Easter_eggs... 100 200 300 400 500
processing 'Category' from Category:MrGuyTodd_levels... 100 200 300 400
processed:  1491 pages
Some info about what listWikiLinks.php does:
  • init namespace lists and patterns; at first I used an API call to get Wikipedia's namespace list, then hard-coded it to save a call; in the exception list, Template:Wikipedia and Template:Wp were also needed
  • the regex patterns are: '{{wp\|(.+?)(?:\|.+?)?}}', '\[\[Wikipedia:(.+?)(?:\|.+?)?\]\]', and '\[https?://en.wikipedia.org/wiki/(.+?)(?: .+?)?\]'
  • from all selected namespaces, lists of pages are queried at 500 per batch and each page is then fetched and searched for the link patterns; this is the most time-consuming part, so a progress marker is logged every 100 pages
  • all data is collected in an array (keyed by Wikipedia path) of arrays (with DoomWiki paths), which are all sorted with flags SORT_NATURAL and SORT_FLAG_CASE
  • the tables are generated by traversing the data array for each Wikipedia namespace; for entries with more than 20 DW links, a more value is appended
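(For readers following along, the collect-and-sort structure described in the last two bullets might look like this. Python for illustration only — the actual listWikiLinks.php is PHP and uses sort flags SORT_NATURAL | SORT_FLAG_CASE; all names here are hypothetical.)

```python
from collections import defaultdict

# Wikipedia path -> list of DoomWiki paths that link to it.
links = defaultdict(list)

def record(wp_path, dw_path):
    """Register one Wikipedia link found on one DoomWiki page."""
    links[wp_path].append(dw_path)

def sorted_links(max_shown=20):
    """Yield (wp_path, shown_dw_paths, hidden_count) rows for the table.

    Entries with more than max_shown DoomWiki links get truncated, with a
    count of the remainder appended as a "more" value.
    """
    for wp in sorted(links, key=str.casefold):  # case-insensitive sort
        dw = sorted(links[wp], key=str.casefold)
        yield wp, dw[:max_shown], max(0, len(dw) - max_shown)
```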
Filtering of WP links is not implemented yet, and I'm not sure it would be wise. For one thing, quite a few WP links occur in the intro (section 0) of a page, so that should be in your list too. For another, perhaps the link volume isn't overwhelming; I am now running the script on main space only to find out.
About avoiding mass uploaded images: the script could indeed skip filenames ending in "_map.png" and "_title.png" to save processing time. Or what (else) did you have in mind?
Do you now want to launch a general discussion before I continue? --Xymph (talk) 05:41, 27 March 2018 (CDT)
I added the main space data. Okay, yes, it's voluminous. :) --Xymph (talk) 06:51, 27 March 2018 (CDT)
There's so much progress here, people can't help being inspired to further suggestions!  :>   I hope you won't interpret it as criticism if I make suggestions before testing any bibliographic edits — it's a big job and I don't want people to have to revert me later.
Regarding mass uploaded images, you have apparently handled this, by counting links in template space once per template instead of once per transclusion.
You're probably correct to parse wikitext instead of HTML; it provides more fine-grained results.  Note that:
  • [[wiki: also points to Wikipedia.
  • I'm a regex newbie, but the above doesn't seem to include MediaWiki's case insensitivity on the first character.  E.g. {{wp and {{Wp are the same.  fair use and Fair use are the same.  Further, interwiki prefixes are completely case-insensitive.
  • Similarly, Blake Stone: Aliens of Gold and Blake_Stone:_Aliens_of_Gold are the same.
  • I endorse Gez's idea of a third column: direct links are far more likely to be unmarked sources, at least on the articles I've seen.
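(For illustration, the case and underscore normalizations suggested in the bullets above could be folded into the quoted patterns roughly as follows. This is a hypothetical Python sketch — the actual script is PHP — and it additionally escapes the dots in the URL pattern, unlike the quoted original.)

```python
import re

# Rough equivalents of the three quoted patterns, compiled case-insensitively
# because the {{wp}}/{{Wp}} template name and interwiki prefixes are
# case-insensitive in MediaWiki.
PATTERNS = [
    re.compile(r"\{\{wp\|(.+?)(?:\|.+?)?\}\}", re.IGNORECASE),
    re.compile(r"\[\[wikipedia:(.+?)(?:\|.+?)?\]\]", re.IGNORECASE),
    re.compile(r"\[https?://en\.wikipedia\.org/wiki/(.+?)(?: .+?)?\]"),
]

def wikipedia_targets(wikitext):
    """Return normalized Wikipedia titles linked from a page's wikitext."""
    targets = set()
    for pat in PATTERNS:
        for m in pat.finditer(wikitext):
            # Underscores and spaces are equivalent in MediaWiki titles.
            title = m.group(1).replace("_", " ").strip()
            # The first character of a title is case-insensitive.
            if title:
                title = title[0].upper() + title[1:]
            targets.add(title)
    return targets
```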
If the consensus is not to filter mainspace links (time for that general discussion  :D)  then there will be many more ideas and many more possible mass edits to implement them.  I personally still intend to focus on the task in the OP, and would heuristically skip entries clearly outside it (Blu-Ray, Lava).  If no one else made use of the remainder, then anything I missed would simply remand to a more general link rot project in the far future.
Hope that makes some sense, and thanks as always for your patience.    Ryan W (living fossil) 19:56, 27 March 2018 (CDT)

Hey, a suggestion: would it be possible to have a third column listing the link types used? More generally, a bot capable of looking through the wiki pages to find any and all instances of direct links to domains that have interwiki entries would be useful for maintenance. --Gez (talk) 06:25, 27 March 2018 (CDT)

Sure, you mean whether a DW page uses the wp template, an interwiki link, or an https link for a WP link? What should the third column look like, for example? And what if more than one method is used for the same WP link on a page?
The general suggestion would be a separate bot script. --Xymph (talk) 06:42, 27 March 2018 (CDT)
I figure the third column could just contain a simple marker, like D for direct link, I for interwiki link, and T for template link. If more than one method is used, then you get more than one marker. If there's a direct link and a template link, then it'd show DT, for example. --Gez (talk) 07:03, 27 March 2018 (CDT)
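(Gez's marker scheme is simple enough to pin down in a few lines. A hypothetical sketch — the method names are invented, and D/I/T are the markers as suggested above:)

```python
# One letter per link method found for a given Wikipedia link on a page:
# D = direct https link, I = interwiki link, T = template link.
MARKERS = {"direct": "D", "interwiki": "I", "template": "T"}

def method_markers(methods):
    """Combine the set of methods used for one WP link into a marker string.

    Emitted in a fixed D/I/T order so e.g. direct + template yields "DT".
    """
    return "".join(MARKERS[m] for m in ("direct", "interwiki", "template")
                   if m in methods)
```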

Interwiki links

Copied from Wikipedia topic:
More generally, a bot capable of looking through the wiki pages to find any and all instances of direct links to domains that have interwiki entries would be useful for maintenance. --Gez (talk) 06:25, 27 March 2018 (CDT)

To verify, this script needs to search pages for the URLs in the Interwiki table (up to the $1 parameter), and show which interwiki prefix should be used instead? --Xymph (talk) 12:26, 27 March 2018 (CDT)
Yeah, I figure that should be enough. The only potential subtlety would be to watch out for http/https, i.e. there could be an http direct link to a site that has https in the interwiki table (or vice-versa). --Gez (talk) 13:42, 27 March 2018 (CDT)
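(A minimal sketch of that check, in hypothetical Python — the interwiki table excerpt is invented — which handles the http/https subtlety Gez mentions by accepting either scheme regardless of which one the table entry stores:)

```python
import re

# Hypothetical excerpt of the wiki's interwiki table: prefix -> URL with $1.
INTERWIKI = {
    "wikipedia": "https://en.wikipedia.org/wiki/$1",
    "doomworld": "https://www.doomworld.com/$1",
}

def interwiki_suggestions(wikitext):
    """Find direct links matching interwiki URLs; suggest the prefixed form.

    Returns (matched_url, suggested_interwiki_link) pairs. The scheme is
    matched as https? so http links are caught even if the table has https,
    and vice versa.
    """
    suggestions = []
    for prefix, url in INTERWIKI.items():
        tail = url[url.index("://") + 3 : url.index("$1")]
        # Stop the target at whitespace, ']' or '|' to respect wikitext links.
        pattern = r"https?://" + re.escape(tail) + r"([^\s\]|]+)"
        for m in re.finditer(pattern, wikitext):
            suggestions.append((m.group(0), "[[%s:%s]]" % (prefix, m.group(1))))
    return suggestions
```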
If we're talking about maintenance, the reverse of this (an analogue of Special:LinkSearch for interwikis only) would occasionally be useful as well, e.g. when a site goes down permanently.  Again a low priority, unless someone finds a live example.    Ryan W (living fossil) 20:00, 27 March 2018 (CDT)

Templatable links

Moved from Wikipedia topic:
And while I'm on that general topic, a third script could search for direct links that could be templated -- I'm thinking about stuff like direct links to forums or archives, notably. There's no hurry, though. --Gez (talk) 07:03, 27 March 2018 (CDT)

Okay, one thing at a time. :) --Xymph (talk) 12:26, 27 March 2018 (CDT)
This one should be uncontroversial, but "no hurry" is an understatement.  :>   IMO a bot performing style edits would have to sweep at regular intervals, and I don't assume Xymph or anyone else can commit to that (wishful thinking solution: someone in the Doom community figures out how to host bots as autonomous processes, like the WMF does).    Ryan W (living fossil) 17:04, 27 March 2018 (CDT)