Board Election results, next steps

Unfortunately, I didn't make it onto the Wikimedia Foundation board; you can see the full results. In the first round I was behind by ~280 votes, which is pretty close considering nearly 6,000 votes were cast!

I already said it, but I'm really thankful to everyone who supported me, whether you campaigned for me or just cast a vote. If there's anything I can do to help you out, you know how to find me :-)

What's next? First, it's time for me to get back to work: I only have a 4-month backlog of bugs and patches to get through. And then I'll continue agitating for change to make the WMF and Wikimedia a better place.

P.S. I'm also planning to be much more involved with the awesome people in Wikimedia New York City in the coming year.


Thank you for supporting my WMF Board candidacy

While we're waiting for the results (September 21st at the earliest), I want to thank everyone who helped with my WMF Board candidacy.

From the people who initially gave me the confidence to run, to the people who helped me get through the Affiliate round, and then the people who ran Get Out The Vote efforts for the community voting period or even just told me that they were supporting me: thank you.

A very special set of people were there at every stage of the way; I can't really express in words how much I appreciate each of you. <3

Globally, voter turnout was down: only 5,955 votes this year compared to last year's 6,873. Yet, in each group of users we tracked, turnout was solidly up:

Group                                                  2021     2022     Diff
Stewards                                               47.37%   63.16%   15.79 points
mediawiki.org admins                                   44.26%   67.74%   23.48 points
en.wikipedia.org admins                                22.35%   26.41%    4.07 points
NYC meetup list                                        26.92%   33.33%    6.41 points
Signers of NPP letter                                  29.82%   46.35%   16.52 points
[[Category:Wikipedians who use Discord (software)]]    14.45%   29.24%   14.79 points
[[Category:Wikipedians who use Internet Relay Chat]]   16.94%   31.40%   14.47 points

Pretty incredible. Now it's at least two more weeks of anxious waiting and hopefully some time for actual wiki editing!



Kiwix in Debian, 2022 update

Previous updates: 2018, 2021

Kiwix is an offline content reader, best known for distributing copies of Wikipedia. I have been maintaining it in Debian since 2017.

This year most of the work has been keeping all the packages up to date in anticipation of next year's Debian 12 Bookworm release, including several transitions for new libzim and libkiwix versions.

  • libzim: 6.3.0 → 8.0.0
  • zim-tools: 2.1.0 → 3.1.1
  • python-libzim: 0.0.3 → 1.1.1 (with a cherry-picked patch)
  • libkiwix: 9.4.1 → 11.0.0 (with DFSG issues fixed!)
  • kiwix-tools: 3.1.2 → 3.3.0
  • kiwix (desktop): 2.0.5 → 2.2.2

The Debian Package Tracker makes it really easy to keep an eye on all Kiwix-related packages.

All of the "user-facing" packages (zim-tools, kiwix-tools, kiwix) now have very basic autopkgtests that can provide a bit of confidence that the package isn't totally broken. I recommend reading the "FAQ for package maintainers" to learn about all the benefits you get from having autopkgtests.
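For illustration, a very basic autopkgtest can be a single smoke test declared in debian/tests/control. This is a sketch of the format, not necessarily the exact test the zim-tools package ships; the command shown is an assumption:

```
Test-Command: zimdump --help
Depends: zim-tools
Restrictions: superficial
```

The "superficial" restriction tells the infrastructure that the test only proves the binary starts, not that it works correctly, which is still enough to catch broken dependencies or linkage.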

Finally, back in March I wrote a blog post, How to mirror the Russian Wikipedia with Debian and Kiwix, which got significant readership (compared to most posts on this blog), including being quoted by LWN!

We are always looking for more contributors; please reach out if you're interested. The Kiwix team is one of my favorite groups of people to work with, and they love Debian too.



A belated writeup of CVE-2022-28201 in MediaWiki

In December 2021, I discovered CVE-2022-28201: it's possible to make MediaWiki's Title::newMainPage() go into infinite recursion. More specifically, if the local interwikis feature is configured (not used by default, but enabled on Wikimedia wikis), any on-wiki administrator could fully brick the wiki by editing the [[MediaWiki:Mainpage]] wiki page in a malicious manner. It would require someone with sysadmin access to recover, either by adjusting site configuration or manually editing the database.

In this post I'll explain the vulnerability in more detail, how Rust helped me discover it, and a better way to fix it long-term.

The vulnerability

At the heart of this vulnerability is Title::newMainPage(). The function, before my patch, is as follows (link):

public static function newMainPage( MessageLocalizer $localizer = null ) {
    if ( $localizer ) {
        $msg = $localizer->msg( 'mainpage' );
    } else {
        $msg = wfMessage( 'mainpage' );
    }
    $title = self::newFromText( $msg->inContentLanguage()->text() );
    // Every page renders at least one link to the Main Page (e.g. sidebar).
    // If the localised value is invalid, don't produce fatal errors that
    // would make the wiki inaccessible (and hard to fix the invalid message).
    // Gracefully fallback...
    if ( !$title ) {
        $title = self::newFromText( 'Main Page' );
    }
    return $title;
}

It gets the contents of the "mainpage" message (editable on-wiki at MediaWiki:Mainpage), parses the contents as a page title and returns it. As the comment indicates, it is called on every page view and as a result has a built-in fallback if the configured main page value is invalid for whatever reason.

Now, let's look at how interwiki links work. Normal interwiki links are pretty simple: they take the form of [[prefix:Title]], where the prefix is the interwiki name of a foreign site. In the default interwiki map, "wikipedia" points to https://en.wikipedia.org/wiki/$1. There's no requirement that the interwiki target even be a wiki; for example, [[google:search term]] is a supported prefix and link.

And if you type in [[wikipedia:]], you'll get a link to https://en.wikipedia.org/wiki/, which redirects to the Main Page. Nice!
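This prefix-to-URL expansion can be sketched in a few lines of Python. This is a simplified model for illustration, not MediaWiki's actual interwiki code, and the map below holds just one of the default entries:

```python
from typing import Optional

# Simplified model of interwiki link expansion: the prefix selects a URL
# template, and the title is substituted for the $1 placeholder.
INTERWIKI_MAP = {
    "wikipedia": "https://en.wikipedia.org/wiki/$1",
}

def expand_interwiki(link: str) -> Optional[str]:
    """Expand a "prefix:Title" link via the interwiki map, or None if unknown."""
    prefix, _, title = link.partition(":")
    template = INTERWIKI_MAP.get(prefix.lower())
    if template is None:
        return None
    return template.replace("$1", title)

print(expand_interwiki("wikipedia:Foo"))
# https://en.wikipedia.org/wiki/Foo
print(expand_interwiki("wikipedia:"))
# https://en.wikipedia.org/wiki/  (empty title: the target wiki redirects this
# to its main page)
```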

Local interwiki links are a bonus feature on top of this to make sharing of content across multiple wikis easier. A local interwiki is one that maps to the wiki we're currently on. For example, you could type [[wikipedia:Foo]] on the English Wikipedia and it would be the same as just typing in [[Foo]].

So now what if you're on English Wikipedia and type in [[wikipedia:]]? Naively that would be the same as typing [[]], which is not a valid link.

So in c815f959d6b27 (first included in MediaWiki 1.24), links like [[wikipedia:]] (where the prefix is a local interwiki) were made to resolve explicitly to the main page. This seems like entirely logical behavior and achieves the goal of local interwiki links: to make a link work the same, regardless of which wiki it's on.

Except it now means that when trying to parse a title, the answer might end up being "whatever the main page is". And if we're trying to parse the "mainpage" message to discover where the main page is? Boom, infinite recursion.

All you have to do is edit "MediaWiki:Mainpage" on your wiki to be something like localinterwiki: and your wiki is mostly hosed, requiring someone to either de-configure that local interwiki or manually edit that message via the database to recover it.
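The cycle can be demonstrated with a toy Python model. The function names and data structures here are illustrative, not MediaWiki's; the point is only the shape of the loop:

```python
# Toy model of the bug: parsing a bare local-interwiki title resolves to
# "whatever the main page is", and finding the main page means parsing a
# message that contains... a bare local-interwiki title.
LOCAL_INTERWIKIS = {"localinterwiki"}
MESSAGES = {"mainpage": "localinterwiki:"}  # the malicious on-wiki edit

def parse_title(text: str) -> str:
    prefix, _, rest = text.partition(":")
    if prefix in LOCAL_INTERWIKIS and rest == "":
        # Bare local interwiki: resolve to the main page.
        return new_main_page()
    return text

def new_main_page() -> str:
    # No recursion guard: just parse the configured "mainpage" message.
    return parse_title(MESSAGES["mainpage"])

try:
    new_main_page()
except RecursionError:
    print("wiki is hosed: infinite recursion")
```

In PHP there is no such recursion limit safety net by default, so the real-world symptom is a crashed request on every page view.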

The patch I implemented was pretty simple, just add a recursion guard with a hardcoded fallback:

    public static function newMainPage( MessageLocalizer $localizer = null ) {
+       static $recursionGuard = false;
+       if ( $recursionGuard ) {
+           // Somehow parsing the message contents has fallen back to the
+           // main page (bare local interwiki), so use the hardcoded
+           // fallback (T297571).
+           return self::newFromText( 'Main Page' );
+       }
        if ( $localizer ) {
            $msg = $localizer->msg( 'mainpage' );
        } else {
            $msg = wfMessage( 'mainpage' );
        }

+       $recursionGuard = true;
        $title = self::newFromText( $msg->inContentLanguage()->text() );
+       $recursionGuard = false;

        // Every page renders at least one link to the Main Page (e.g. sidebar).
        // If the localised value is invalid, don't produce fatal errors that
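The same guard pattern can be modeled in Python. Again, this is a sketch of the technique rather than MediaWiki's PHP, with illustrative names; on re-entry the guard short-circuits to the hardcoded fallback instead of recursing:

```python
LOCAL_INTERWIKIS = {"localinterwiki"}
MESSAGES = {"mainpage": "localinterwiki:"}  # the malicious on-wiki edit

_recursion_guard = False

def parse_title(text: str) -> str:
    prefix, _, rest = text.partition(":")
    if prefix in LOCAL_INTERWIKIS and rest == "":
        return new_main_page()
    return text

def new_main_page() -> str:
    global _recursion_guard
    if _recursion_guard:
        # Parsing the message fell back to the main page itself (bare local
        # interwiki), so return the hardcoded fallback instead of recursing.
        return "Main Page"
    _recursion_guard = True
    try:
        return parse_title(MESSAGES["mainpage"])
    finally:
        _recursion_guard = False

print(new_main_page())  # Main Page
```

Resetting the guard in a finally block keeps the function usable for subsequent calls even if parsing throws, matching the intent of the PHP patch's paired assignments.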

Discovery

I was mostly exaggerating when I said Rust helped me discover this bug. I previously blogged about writing a MediaWiki title parser in Rust, and it was while working on that I read the title parsing code in MediaWiki enough times to discover this flaw.

A better fix

I do think that long-term, we have better options to fix this.

There's a new, somewhat experimental, configuration option called $wgMainPageIsDomainRoot. The idea is that rather than serve the main page from /wiki/Main_Page, it would just be served from /. Conveniently, this would mean that it doesn't actually matter what the name of the main page is, since we'd just have to link to the domain root.
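Based on the description above, enabling it should be a one-line site configuration change (a sketch; I'm assuming the default is off):

```php
// LocalSettings.php: serve the main page from the domain root ("/")
// instead of /wiki/Main_Page. Experimental.
$wgMainPageIsDomainRoot = true;
```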

There is an open request for comment to enable such functionality on Wikimedia sites. It would be a small performance win, give everyone cleaner URLs, and possibly break everything that expects https://en.wikipedia.org/ to return an HTTP 301 redirect, like it has for the past 20+ years. Should be fun!

Timeline

Acknowledgements

Thank you to Scott Bassett of the Wikimedia Security team for reviewing and deploying my patch, and Reedy for backporting and performing the security release.