Originally posted on mastodon.technology.
We had an extended family Zoom call yesterday. My uncle had difficulties joining with audio, so my cousin (in a different location) started FaceTiming with him on her phone and held it up to the computer screen so he could see us and talk. It was cute but also really frustrating.
For the past 5ish years, I've been working on a project called libraryupgrader (LibUp for short) to semi-automatically upgrade dependency libraries in the 900+ MediaWiki extension and related git repositories. For those who use GitHub, it's similar to the new Dependabot tool, except LibUp is free software.
One cool feature that I want to highlight is how we are able to fix npm security issues, generally in under 24 hours, across all repositories with little to no human intervention. The first time this feature came into use was to roll out the eslint RCE fix (example commit).
This functionality is all built around the npm audit command that was introduced in npm 6. It has a JSON output mode, which made it straightforward to create an npm vulnerability dashboard for all of the repositories we track.
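LibUp is written in Python, and stripped down, the data collection for the dashboard looks something like the sketch below. This is not LibUp's actual code; the advisories, module_name, and metadata field names are based on my reading of npm 6's audit output format.

```python
import json
import subprocess

def audit_repo(path):
    """Run `npm audit --json` in a checked-out repository and summarize advisories."""
    # npm audit exits non-zero when vulnerabilities are found, so don't check the return code
    result = subprocess.run(
        ["npm", "audit", "--json"],
        cwd=path, capture_output=True, text=True,
    )
    report = json.loads(result.stdout)

    # Each advisory describes one vulnerable package plus its severity
    for advisory in report.get("advisories", {}).values():
        print(f'{advisory["severity"]}: {advisory["module_name"]} - {advisory["title"]}')

    # Summary counts (info/low/moderate/high/critical) feed the dashboard
    return report.get("metadata", {}).get("vulnerabilities", {})
```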
The magic happens in the npm audit fix command, which automatically applies semver-compatible upgrades. The one thing I'm not super happy about is that we're basically blindly trusting the response given to us by the npm server, but I'm not aware of any free software alternative.
LibUp then writes a commit message (mostly by analyzing the diff), fixes up a few things since we tend to pin dependencies, and then pushes the commit to Gerrit to pass through CI and be merged. If npm is aware of the CVE ID for the security update, that will also be mentioned in the commit message (example). In addition, each package upgrade is tagged, so if you want to e.g. look for all commits that bumped MediaWiki Codesniffer to v26, it's a quick search away.
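Setting aside the commit-message generation and pinning logic, the core loop is conceptually something like this simplified sketch. It is not the real LibUp code, and the commit message text is just a placeholder.

```python
import subprocess

def fix_and_push(repo_path):
    """Apply semver-safe npm security fixes and push them to Gerrit for review."""
    def run(*cmd, check=True):
        return subprocess.run(cmd, cwd=repo_path, check=check,
                              capture_output=True, text=True)

    # Let npm apply whatever semver-compatible upgrades resolve the advisories.
    # Its exit code isn't a reliable success signal, so don't check it.
    run("npm", "audit", "fix", check=False)

    # Bail out if nothing actually changed
    if not run("git", "diff", "--name-only").stdout.strip():
        return

    # Commit the change; the real commit message is generated by analyzing the diff
    run("git", "add", "package.json", "package-lock.json")
    run("git", "commit", "-m", "build: Updating npm dependencies to address security issues")

    # Push to Gerrit for CI and code review (refs/for/master is Gerrit's review ref)
    run("git", "push", "origin", "HEAD:refs/for/master")
```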
Lately LibUp has been busy fixing the minimist prototype pollution advisory in a bunch of dependencies: gonzales-pe, grunt, mkdirp and postcss-sass. It's a rather low-priority security issue, but it now requires very little human attention because the work has been automated away.
There are some potential risks - someone could install a backdoor by shipping an intentional vulnerability in the same release that fixes a known/published security issue. LibUp would then automatically roll out the new version, exposing us to the backdoor. This is definitely a risk, but I think our strategy of pulling in new security fixes automatically protects us more than the potential downside of malicious actors abusing the system (also because I wouldn't absolutely trust any code pulled down from npm in the first place!).
There are some errors we see occasionally and could use help resolving: T228173 and T242703 are the two most pressing ones right now.
Originally posted on mastodon.technology.
The latest #MediaWiki security update has hit #Debian - https://lists.debian.org/debian-security-announce/2020/msg00053.html
Only buster users need to update as stretch did not contain the vulnerable code (yay?).
tl;dr: The Spartan Daily picked up best student newspaper honors for the first time and had its best awards season ever. Inside Scoop is a column about the operation of the Spartan Daily, San Jose State's student newspaper.
In 2016, I made the decision to go back to school and pursue a degree in journalism. I had briefly dabbled in it in middle school, but really had no idea what I was getting myself into.
I started at De Anza College as a member of the La Voz staff. After a quarter covering the student government beat, I moved up to serve as news editor. I regularly felt that putting out a paper every two weeks was incredibly difficult ... not realizing what was waiting for me at San Jose State University.
I spent the Fall 2018 semester on SJSU's broadcast program, Update News, mostly getting familiar with the campus. And then, in January 2019, I began my stint as a staff writer on SJSU's flagship publication, the Spartan Daily. I quickly learned that putting out a paper 3 days a week was basically a real job. Every moment I wasn't in class, I'd be running off to conduct an interview or finish typing up a story before my deadline. I started staying late as the editors put together the paper - I was fully hooked.
The Daily basically rotates staff every semester, so in April the advisers and some of the outgoing editors selected me as the next executive editor (our fancy name for the editor-in-chief). I wasn't actually present in class when they played Taylor Swift to announce my selection - I was at a robotics tournament in Houston. Oops.
I spent the summer interning in New York, slowly plotting and planning how exactly to run the Spartan Daily. There were some things we had done well while I was a writer, but other things I wanted to redo entirely.
Thankfully, I wasn't embarking on this journey alone. Victoria, my managing editor, was technically #2 in the leadership hierarchy, but it ended up becoming a partnership. Early on I disregarded her advice a few times - and generally came to regret it. I'd like to think I very much learned my lesson.
We were backed up by a great team of editors. I've previously written about how we put the team together, but the main thing I want to emphasize is that the editors were picked to create a cohesive team, rather than picking the most skilled person for each role. Add in our staff writers and it really felt like we were a family. Most everyone understood that we won or lost as a team. And that's exactly what happened.
For the 2019 calendar year, the Spartan Daily was recognized as the best student newspaper in California by the California College Media Association (CCMA) and then again by the California News Publishers Association (CNPA).

Left to right: Nick (Spring 2019 executive editor), Jana (Spring 2019 managing editor), Victoria (Fall 2019 managing editor), me (Fall 2019 executive editor). Photo by Professor Craig.
This is probably one of the most team-based awards that I've had my individual name on. It's impossible for me to overstate how much every single person on the Daily staff contributed to this award. It felt incredibly fulfilling and validating, with a bit of vindication mixed in, to know that all of the work we put in paid off in being named the best student newspaper in the state.
On top of that, the Daily picked up a host of individual awards, wrapping up basically our best awards season ever. Here's the full list:
- Pinnacle Awards: 2nd place best sports investigative story (Lindsey)
- ACP: 2nd place best in-depth news story (Lindsey)
- ACP: 5th place best breaking news photo (Lindsey)
- ACP: honorable mention best newspaper inside page (Marci)
- ACP San Francisco Best of Show: 2nd place best newspaper special edition (for Fighting 'fake news')
- ACP San Francisco Best of Show: 4th place people's choice: newspaper
- ACP San Francisco Best of Show: 4th place people's choice: overall
- Hearst Journalism Awards: 2nd place Hearst Enterprise Reporting (Lindsey)
- CCMA: 1st place best newspaper (Nick, Jana, Kunal, Victoria)
- CCMA: 1st place best podcast (Vicente)
- CCMA: 2nd place best news series (Erica, Brendan, Jozy, Nathan, Chris)
- CCMA: 2nd place best editorial (Jonathan, Kunal)
- CCMA: 2nd place best news photograph (Lindsey)
- CCMA: 3rd place best sports photograph (Melody)
- CCMA: 3rd place best photo series (Brendan)
- CCMA: 3rd place best newspaper inside spread design (Lindsey, Kunal, Marci)
- CCMA: 3rd place best social media reporting (Spartan Daily staff)
- CNPA: 1st place general excellence (Spartan Daily staff)
- CNPA: 1st place best enterprise news story (Lindsey, Jana, Mauricio, Kunal)
- CNPA: 1st place best illustration (Nachaela)
- CNPA: 3rd place best enterprise news story (Christian)
- CNPA: 4th place best enterprise news story (Chelsea, Vicente)
- CNPA: 4th place best news photo (Mauricio)
- CNPA: 4th place best illustration (Cindy)
The list has never been this long before. And while the CCMA and CNPA awards are only statewide, for ACP, Pinnacle and Hearst we competed against colleges all across the country.
I would be remiss if I didn't thank our two advisers, Richard Craig and Mike Corpos, for supporting us throughout this entire experience. I knew that both of them would always have our backs, no matter what. Even that one time I walked into the newsroom and told them, "I'm going to be served sometime this week." The same applies to my adviser from La Voz, Cecilia Deck, who really helped me get started in the first place.
Originally posted on mastodon.technology.
Privacy is a right - not an earned privilege. Read more about how the proposed EARN IT Act would threaten that right in my column in the latest Spartan Daily: https://issuu.com/spartandaily/docs/sd031220all (page 2)
mwparserfromhell is now fully on wheels. Well...not those wheels - Python wheels!
If you're not familiar with it, mwparserfromhell is a powerful parser for MediaWiki's wikitext syntax with an API that's really convenient for bots to use. It is primarily developed and maintained by Earwig, who originally wrote it for their bot.
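If you haven't used it before, here's a quick taste of the API (the wikitext snippet is just something I made up for illustration):

```python
import mwparserfromhell

text = "{{Infobox person|name=Ada Lovelace}} She was a [[mathematician]]."
wikicode = mwparserfromhell.parse(text)

# Templates and links come back as objects you can inspect and modify
for template in wikicode.filter_templates():
    if template.has("name"):
        print(template.get("name").value)   # Ada Lovelace

for link in wikicode.filter_wikilinks():
    print(link.title)                        # mathematician

# Edits round-trip back to wikitext
wikicode.filter_templates()[0].add("birth_date", "1815")
print(wikicode)
```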
Nearly 7 years ago, I implemented opt-in support for using mwparserfromhell in Pywikibot, which is arguably the most used MediaWiki bot framework. About a year later, Merlijn van Deen added it as a formal dependency, so that most Pywikibot users would be installing it...which inadvertently was the start of some of our problems.
mwparserfromhell is written in pure Python with an optional C speedup, and to build that C extension, you need to have the appropriate compiler tools and development headers installed. On most Linux systems that's pretty straightforward, but not so much for Windows users (especially non-technical ones, which describes many Pywikibot users).
This brings us to Python wheels, which allow for easily distributing built C code without requiring users to have all of the build tools installed. Starting with v0.4.1 (July 2015), Windows users could download wheels from PyPI so they didn't have to compile it themselves. This resolved most of the complaints (along with John Vandenberg's patch to gracefully fallback to the pure Python implementation if building the C extension fails).
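As an aside, if you want to know whether you ended up with the C tokenizer or the pure Python fallback, the parser module exposes a flag for that. The attribute name below is from memory, so double-check it against the docs:

```python
import mwparserfromhell

# True when the C tokenizer was built and imported successfully,
# False when the pure Python fallback is in use
print(mwparserfromhell.parser.use_c)
```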
In November 2016, I filed a bug asking for Linux wheels, mostly because it would make installs faster. I thought it would be just as straightforward as Windows, until I looked into it and found PEP 513, which basically specified that the wheels needed to be built on CentOS 5 to be portable to most Linux systems.
With the new GitHub Actions, it's actually pretty straightforward to build these manylinux1 wheels - so a week ago I put together a pull request that does just that. On every push it builds the manylinux1 wheels (to test that we haven't broken manylinux1 compatibility), and then on tag pushes it uploads those wheels to PyPI for everyone to use.
Yesterday I did the same for macOS because it was so straightforward. Yay.
So, starting with the 0.6.0 release (no date set yet), mwparserfromhell will have pre-built wheels for Windows, macOS and Linux users, giving everyone faster install times. And, nearly everyone will now be able to use the faster C parser without needing to make any changes to their setup.