Wikipedia:Wikipedia Signpost/2007-12-03/Software issues
Software bug fixed, overuse of parser function curtailed
This week, a bug introduced in a new revision of MediaWiki occasionally caused an internal parser marker to be rendered within templates and wikitext; the bug was quickly fixed, but left a number of articles affected. Meanwhile, developers put a limit on the usage of the {{#ifexist:}} parser function; overuse of this function (thousands of calls for a single page view, in some cases) had led to unusually heavy traffic on the database servers.
UNIQ bug
A new preprocessor is being developed; the preprocessor is the part of MediaWiki that processes transclusions and some related constructs before a page is parsed. This work has led to some unusual problems involving 'strip markers', which the parser uses internally to handle certain constructs (notably <nowiki> and <ref> tags, among others). The problems have had one obvious symptom: strip markers, instead of remaining a purely internal concept in the parser, have shown up in rendered pages and sometimes even in wikitext as strings starting with UNIQ and ending with QINU. This has led to two bugs, which were set to maximum priority in the bug tracker.
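To illustrate how these markers arise (the exact format of the marker varies between MediaWiki revisions, so the string below is indicative only): when the parser encounters a construct such as

<nowiki>''this text should not be italic''</nowiki>

it temporarily replaces the whole construct with a unique placeholder along the lines of

UNIQ1a2b3c4d5e6f7890-nowiki-00000001-QINU

parses the remainder of the page, and then substitutes the original content back in place of the placeholder. When that final substitution fails, readers see the raw placeholder instead of the intended content.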
The first known bug related to this, bug 12056, was discovered on some non-Wikimedia wikis; among other things, it broke the use of <charinsert> (which is used to create the box of characters visible under the edit box). This particular problem was fixed in r27871.
However, a second bug with similar symptoms (bug 12154), more serious yet unpredictable and therefore not noticed as quickly, has caused problems on Wikimedia wikis, including here on the English Wikipedia (Village pump discussion); it caused strip markers to show up in the rendered view of many pages, and worse, some of them ended up saved in the wikitext of pages. User:Splarka/UNIQ list is a list of the pages known to be permanently affected by this bug.
The software update that caused the second bug was applied at 19:10 UTC on 29 November; it was reverted 18 minutes later because of the widespread problems it was causing with the <ref> tag. Developers then ran a server-side script (analogous to a bot) to help fix some of the permanent problems caused by the bug.
A potential fix for the bug has been submitted to the code base as r28004; this fix has apparently worked on other wikis, although developers have not yet confirmed that it is a final fix for this problem.
#ifexist limit
The {{#ifexist:}} parser function (which checks whether a given page exists) has been causing some concern among developers recently. The function is in such heavy use in some templates and on some pages that it was causing problems for the database servers. After some discussion on the developers' mailing list, Werdna blanked Template:Highrfc-loop (part of {{usercheck}}), with the edit summary "This page is making hundreds of requests from the database for every parse, tying up the database servers for no good reason. Treat as developer intervention. Use Special:Prefixindex instead". ({{usercheck}} has since been modified to use Special:Prefixindex.) This sparked a discussion at the village pump; one concern raised there was that Wikipedia:Protected titles might cause similar problems.
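For readers unfamiliar with the function, a typical call looks like this (the page title here is purely illustrative):

{{#ifexist: Wikipedia:Sandbox | text shown if the page exists | text shown if it does not }}

Each such call requires a database lookup for the named title, so a template that loops over many candidate titles in this way (as Highrfc-loop reportedly did) issues one query per call every time a page using it is parsed.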
To prevent similar problems in the future, developers have implemented a limit of at most 100 #ifexist: calls per page; this caused many templates across Wikipedia to break.[1] (The limit was temporarily raised to 2000 for one week to give editors time to fix affected pages; it will then be reduced back to 100.) As with the template limits (which serve a similar purpose, disallowing pages so complicated that they strain the servers), an HTML comment is placed in the HTML source of each page to show how close it comes to the limit. A second village pump discussion covered the change and which other pages might have problems (including the technical village pump itself; {{archive list}} turned out to be responsible for about half the ifexists there). Developers have compiled a list (note: large page with duplicates) of pages that will fail to render properly under the new limit. Users are advised to change or simplify the coding of any page or template that runs into the new limits.
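As an illustration of where to look, the counter appears alongside the existing template-limit figures in a comment near the end of a page's HTML source, roughly along these lines (the exact wording and the figures shown here are indicative only):

<!--
Pre-expand include size: 65234/2048000 bytes
Post-expand include size: 31780/2048000 bytes
Template argument size: 12095/2048000 bytes
Expensive parser function count: 37/100
-->

A page whose count approaches the second number in the last line is at risk of breaking once the limit is enforced.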
Discuss this story