This is the story of how an afternoon causing mischief with a little bug in a video game led me on a quest to purge a potentially dangerous exploit from software used by businesses and governments around the world, and also taught me a practical lesson on why you should report issues you find in code, even if they seem small at first.
It all started one afternoon this past Fall in an undergraduate class on entrepreneurship I was taking at Harvard Business School. We were learning about how to work effectively in small teams by participating in one of the Business School's simulations, which is used to teach professional executives lessons on teamwork.
We were put in groups of five and told that our goal was to work together as a team to (virtually) summit a mountain. Each of us was given a link to the simulation website and once we logged into the site we were greeted with a screen telling us about the unique role our assigned character was to play within the team. One person was the leader, one was an experienced mountaineer, one was a medic, and I was of course given the exceedingly useful role of environmentalist 😒🌳
The entire game was run through a set of forms on the website, but there was also an ever-present chatbox at the top of the page, which let you send instant messages directly to other players or to the whole team. As my teammates squabbled over virtual supplies, I started to mess around with the chatbox and quickly realized that it was not escaping any HTML input I entered.
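For context, the fix for this class of bug is well known: escape user-supplied text before rendering it into the page, so the browser treats it as text rather than markup. I don't know what stack the simulation actually ran on, so here is just a minimal sketch in Python, with a hypothetical `render_chat_message` helper standing in for whatever the site used:

```python
import html

def render_chat_message(sender: str, text: str) -> str:
    # Escape user-controlled strings so characters like < > & "
    # are rendered as literal text instead of parsed as HTML.
    return "<li><b>{}</b>: {}</li>".format(html.escape(sender), html.escape(text))

# A malicious chat message is neutralized:
payload = "<script>alert('bonus points!')</script>"
print(render_chat_message("environmentalist", payload))
```

With escaping in place, the injected `<script>` tag arrives in teammates' browsers as inert text (`&lt;script&gt;...`); without it, the browser would execute the script, which is exactly the behavior I stumbled onto.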
Endowed with this new power, I sent code to my teammates that made a message pop up on their screens informing them that bonus points would be awarded if the trash on the mountain was cleaned up and the person with the role of medic (who was being particularly combative) didn't make it to the summit. The virtual trash was quickly collected and the medic abandoned. I dedicated the remainder of the game to turning my teammates against each other one by one with the lure of bonus points, and reveling in the chaos that ensued.
This bug in the simulation site definitely let me cause a lot of mayhem (all in good fun of course), but it was in no way a severe issue. The problem was internal to the game, so in the worst case a rogue student could use it to mess with classmates, but it couldn't be used to steal important information or hijack accounts like other notable cross-site scripting bugs. I figured that the Business School professor who created the game had a student throw the site together in a late, caffeine-fueled night, and the bug was a one-off oversight.
I could have just let it be, but I knew that if there was an exploit like this in live code I had written I would want to be informed of it. So I decided to do the responsible thing and report the bug after class.
This was when things got interesting...
I wrote an email to Harvard Business Publishing, which owned the rights to the game and licensed it out. I told them that I had found a cross-site scripting issue in the software and asked to be put in touch with whoever was in charge of maintaining the simulation so that I could give them the details of the exploit. They responded by informing me that the simulation was actually created by a third-party software development firm and that they would pass along my message so that the firm could follow up directly.
Something didn't sit right about this. This bug wasn't introduced by a careless student, but by a professional developer at an established firm.
Now you may think: what's the big deal? This is just a game; nothing really bad can happen if there are bugs in it. The issue isn't really the buggy game, though. The problem is that when a professional software firm makes a mistake like this, it often is not a one-off. This company likely had a quality assurance process that failed to catch the bug, and it may even be reusing the broken code in projects for other clients, where it could open a genuine security hole.
I waited two weeks for a follow up email from the firm that built the simulation but received nothing, so I sent another message to Harvard Business Publishing to get the name of the firm and contact details for someone in charge. They sent me the company's name and the email of the CTO.
Fortunately the CTO was very responsive when I informed him of the bug and he quickly moved to verify and fix it. He confirmed that the company creates its software by using a platform consisting of a common set of tools and modules, meaning that they naturally copy components like the instant messaging system across their products, but he assured me that all instances of the bug would be fixed.
Because the company builds its software by reusing modules, a bug in any one of its products is likely to exist in many of them. Furthermore, it turns out the company doesn't just build classroom simulations on this platform. It also creates software to share and visualize data for "businesses and government agencies around the world", and looking through the "examples" section of its site, many of its business-oriented applications appear to include inter-user messaging functionality. This means the low-risk bug in my class simulation may be a high-risk bug in live applications that handle sensitive government or business data and implement the vulnerable messaging system.
Every programmer makes mistakes, and bugs like this cross-site scripting issue are all but inevitable. A quality assurance process may even reasonably miss such a bug, perhaps because the team was up against a deadline and it made sense to skip testing code destined for a low-risk part of the system. However, if you build your software modularly, it then only takes a failure to retest that buggy component before reusing it elsewhere for the vulnerability to multiply into places where it can actually cause damage.
Don't get me wrong, reusing code is an amazing thing that lets us build systems much more quickly and often more correctly. This story is just a real-world reminder of how, thanks to the huge benefits of reusing code, almost none of the software we use exists in a vacuum, and so a bug in a silly video game can actually lead to a serious exploit in a sensitive government system.