Malus.sh does one thing: you upload your dependency list, and AI rewrites the corresponding open source projects, outputting "legally distinct" code that sidesteps every license obligation. This isn't a proof of concept; it's a working tool. The premise that open source licenses rest on, the literal copying of code, is being dismantled by AI.

What This Is

Malus.sh bills itself as "AI clean-room (open source laundering as a service)." The operation is simple: upload a dependency manifest such as package.json, requirements.txt, or Cargo.toml, and the AI reimplements each project from its documentation, APIs, tests, and behavioral specifications, producing a functionally equivalent version with entirely different code. The result: no attribution, no copyleft (copyleft licenses require derivative works to remain open source), and no license inheritance obligations.
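To see why the input side is trivial, consider what a dependency manifest actually contains. A package.json, for instance, reduces to a flat list of package/version pairs in a few lines. This is an illustrative sketch of that extraction, not Malus.sh's actual code:

```python
import json

def extract_dependencies(manifest_text: str) -> dict[str, str]:
    """Return {package: version_spec} from a package.json manifest.

    Covers the two standard dependency sections; a real tool would
    also handle peerDependencies, lockfiles, etc.
    """
    manifest = json.loads(manifest_text)
    deps: dict[str, str] = {}
    for section in ("dependencies", "devDependencies"):
        deps.update(manifest.get(section, {}))
    return deps

sample = """{
  "name": "app",
  "dependencies": {"left-pad": "^1.3.0"},
  "devDependencies": {"jest": "^29.0.0"}
}"""
print(extract_dependencies(sample))
# → {'left-pad': '^1.3.0', 'jest': '^29.0.0'}
```

Everything downstream of this list (the documentation, tests, and API surfaces the AI rewrites from) is public by definition, which is exactly what makes the scheme cheap.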

The author calls this a satire project, but it is also fully functional. The satire is this: open source licenses constrain how you use, copy, modify, and distribute code. But if AI rewrites the code in a different language with a different implementation, what is left for the license to constrain? Copyright protects expression, not ideas or methods, and AI rewriting lands precisely in that gap.

Similar cases are already happening: a Rust reimplementation appeared the day after the Claude Code source leaked, and OpenClaw spawned over a dozen language variants shortly after going open source. "Open source avoidance" used to require a lengthy rewrite to erase traces; AI has compressed that cost to near zero.

Industry View

A talk at FOSDEM 2026 posed a sharp question: if AI can rebuild 90% of the open source supply chain in a short time, what does that mean for the ecosystem? The core issue is not technical capability but incentives. The AGPL (a strong copyleft license) was designed to stop SaaS companies from building cloud services on open source projects without contributing source back; if companies can instead run an "AI clean-room rewrite," the license becomes a dead letter. Open source shifts from public infrastructure to a free library of product prototypes.

But there are dissenting voices. Legal practitioners point out that in litigation, a clean-room defense requires strict process isolation: you must prove that the AI's training data did not contain the original code, and the current opacity of large-model training data makes that showing nearly impossible. In other words, whether "AI rewriting" legally constitutes independent creation is far from settled. Deliberate license avoidance could still be found "substantially similar" in court, paradoxically increasing infringement risk.

Impact on Regular People

For enterprise IT: the compliance review logic for open source components needs updating. Checking license declarations alone may no longer suffice; you also need to assess whether AI-rewrite avoidance exists anywhere in the supply chain. That is a new audit cost.

For individual careers: developers' core competitiveness shifts further from "being able to write code" toward "being able to define problems and design systems." Code itself is increasingly easy to copy and rewrite; problem definitions and architectural decisions are not.

For the consumer market: users will notice little in the short term. But in the long term, if the open source ecosystem's feedback loop is destroyed, the motivation to maintain high-quality open source projects may decline, and software quality will eventually suffer.