Dependency management at the cost of independence
Today is Sunday. I woke up to a message on our Slack about a response to a CVE (Common Vulnerabilities and Exposures). Even though I work in product, curiosity got the better of me and I joined the call where our DevOps team was handling the issue.
The team was responding to a major security incident involving a popular GitHub Action called “tj-actions/changed-files”. This action, which helps developers identify which files have changed in a pull request, had been compromised. Someone had gained write access to the repository and injected malicious code that dumped environment variables, including sensitive secrets, into build logs. What made this attack particularly nasty was that the attacker retroactively moved every version tag to point at the compromised code, so even teams who thought they were safely using older, stable versions were affected.
For many teams, this meant an urgent need to audit workflows, rotate credentials, and find replacement actions.
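The workflow audit itself is mostly mechanical, which makes it scriptable. Here is a minimal sketch of the idea (my own illustration, not our team’s actual tooling; the filename audit_pins.ts is made up): it flags any workflow step that references an action by a mutable tag or branch instead of a full commit SHA, since movable tags are exactly what this attacker exploited.

```typescript
// audit_pins.ts: an illustrative sketch, not any team's real tooling.
// Flags workflow steps that reference an action by a mutable ref (tag or
// branch) rather than a full commit SHA. Tags can be silently repointed,
// as they were in this attack; a 40-character commit SHA cannot.
import { readdirSync, readFileSync } from "node:fs";
import { join } from "node:path";

const workflowDir = ".github/workflows";
const fullSha = /^[0-9a-f]{40}$/;

for (const file of readdirSync(workflowDir)) {
  if (!/\.ya?ml$/.test(file)) continue;
  const lines = readFileSync(join(workflowDir, file), "utf8").split("\n");
  lines.forEach((line, i) => {
    // Matches e.g. `uses: owner/action@v4`; local actions (./...) have no
    // @ref and are skipped.
    const m = line.match(/uses:\s*([^\s#"']+)@([^\s#"']+)/);
    if (m && !fullSha.test(m[2])) {
      console.log(`${file}:${i + 1}  ${m[1]}@${m[2]}  (not pinned to a commit SHA)`);
    }
  });
}
```

Run it from a repository root with a TypeScript runner (e.g. npx tsx audit_pins.ts); anything it prints is a candidate for SHA-pinning.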
Rather than pepper my colleagues with questions, I went straight to Hacker News and immediately found a thread about the incident. What interested me most was the discussion of the broader security issues at play. It reminded me of “Tools and Weapons” by Brad Smith and Carol Ann Browne, which I read a couple of years ago and which speaks to exactly these problems.
Lessons from “Tools and Weapons”
In “Tools and Weapons,” Smith and Browne presciently warned about the double-edged nature of our technological progress. The same innovations that empower us can be weaponized against us. They emphasized how the interconnected nature of software creates a vast attack surface where vulnerabilities in one system can cascade through others.
One passage that stuck with me was Smith’s call for a “Digital Geneva Convention” to establish norms in cyberspace. When I read it back then, it seemed somewhat abstract, but watching our team contain the fallout from a compromised dependency made it concrete. We had a similar supply-chain issue a couple of months ago due to a compromised npm token. The book makes more and more sense to me now.
Smith also emphasized collective responsibility in cybersecurity, arguing that technology companies must take responsibility for the security of their products throughout their lifecycle. Today’s incident, like others before it, shows exactly what happens when we rely on tools without clear accountability structures around them.
Food for thought from the Hacker News thread
Reading through the thread, I was struck by several thought-provoking points that went beyond just this specific incident:
The illusion of trust: We build our software on foundations we don’t fully understand. One commenter shared how they opened eslint’s dependency tree and found it pulled in over 1,200 dependencies once dev dependencies were included. How can anyone realistically audit that? (Getting a raw count, at least, is easy; see the first sketch after this list.)
The paradox of updates: We’re constantly told to “keep your software updated” for security reasons, but updates themselves are becoming attack vectors. Do we pin exact versions and risk sitting on known vulnerabilities, or track the latest releases and risk pulling in a malicious update? (For GitHub Actions at least, pinning to a full commit SHA, as in the audit sketch above, removes the tag-rewriting risk, though not the risk of a compromised new release.)
Security through isolation: Several developers shared how they’ve returned to using virtual machines for development work or containerizing everything. One person mentioned they now do all their web browsing in a VM too.
The economics of open source: Someone made the point that many of these issues stem from economic and social problems, not just technical ones. When maintainers of critical open source projects aren’t sustainably funded, they sometimes sell to entities with questionable motives.
Defense in depth vs. convenience: There seems to be a growing realization that we’ve optimized too much for developer convenience at the expense of security. Tools like Deno, which deny permissions by default, were mentioned favorably despite their initial friction (see the second sketch after this list).
The death of fun: One comment that particularly resonated described this situation as “the death of fun” in programming. Like when a city becomes crime-ridden enough that you have to lock your car when going into a grocery store. Has building software become so fraught with security concerns that the joy is being leached out of it?
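On the first point, getting even a raw count of what’s installed is instructive; auditing it all is the hard part. Here is a minimal sketch, assuming npm’s standard node_modules layout (pnpm’s symlink layout would need different handling); the filename count_deps.ts and the approach are mine, not the commenter’s.

```typescript
// count_deps.ts: a rough way to see how many packages are actually installed,
// by walking node_modules recursively. Assumes npm's standard layout.
import { readdirSync, statSync } from "node:fs";
import { join } from "node:path";

function countPackages(dir: string): number {
  let names: string[];
  try {
    names = readdirSync(dir);
  } catch {
    return 0; // no node_modules at this level
  }
  let count = 0;
  for (const name of names) {
    if (name.startsWith(".")) continue; // skip .bin and friends
    const full = join(dir, name);
    if (!statSync(full).isDirectory()) continue;
    if (name.startsWith("@")) {
      count += countPackages(full); // scoped packages (@org/pkg) nest one level deeper
    } else {
      count += 1 + countPackages(join(full, "node_modules")); // count nested deps too
    }
  }
  return count;
}

console.log(`installed packages: ${countPackages("node_modules")}`);
```

Pointed at an eslint install with dev dependencies, a count on the order of the commenter’s 1,200 is the audit surface in question.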
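And on the Deno point, the friction shows up in even a trivial script, because network access has to be granted explicitly at run time. A minimal sketch (hypothetical file fetch_status.ts):

```typescript
// fetch_status.ts: fetch is a built-in global in Deno, so no imports are needed.
// Plain `deno run fetch_status.ts` will prompt for (or, non-interactively, deny)
// network access at the fetch call. Granting it is explicit and can be scoped:
//
//   deno run --allow-net=example.com fetch_status.ts
//
const res = await fetch("https://example.com");
console.log(`status: ${res.status}`);
```

That extra step is exactly the kind of default-deny posture the thread argued we traded away for convenience.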
Personal reflections
As someone in product with a deep interest in engineering, I’m reconsidering how we make decisions about our development stack and dependencies. Are we asking the right questions when evaluating new tools? Do we have visibility into our full dependency chain?
I’ve always championed developer velocity and best-in-class tools, but maybe there’s wisdom in the old-school approach of doing more with less (jQuery? couldn’t resist) and carefully vetting what we bring into our ecosystem.
Trust but verify isn’t enough anymore; it’s time to verify, then trust.