Recently, I asked Robert L. Read, who left his role as Director of Development at Planview, Inc. to become a Presidential Innovation Fellow in 2013 and helped create 18F and 18F Consulting, for his thoughts on modernizing government legacy systems. Rob is currently Chief Scientist at Skylight, a government contracting firm, and is creating Public Invention, a charity for public open-source invention whose first project is a polymorphic robot. Rob has a PhD in Computer Science and speaks Esperanto fluently. He co-founded the PIFF and actively works with Agile Gov Leadership and Engineers Without Borders USA. Rob speaks on innovation, Agile methods, programming, and public invention.
What are methods for legacy transformation that you employed or recommend governments employ? For example, what is the strangler pattern?
I’ve tried to summarize my thoughts in a talk to the Child Welfare Digital Services Project of California, which you can see on YouTube. In short:
- Create a single map of your entire system.
- Break your project into modules using APIs and rewrite them one at a time.
- This is the Strangler Pattern articulated by Martin Fowler. The idea is to gradually replace each component so that eventually the original system is gone, but you never interrupt service or risk a catastrophic launch that can fail.
- Rely on Automated Tests and automated deployment/rollback to decrease the risk of updating each step.
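The routing logic at the heart of the Strangler Pattern can be sketched in a few lines. This is a hypothetical "strangler facade": a thin layer that sends requests for already-migrated modules to the new system and everything else to the legacy system. The module names and backend labels are purely illustrative.

```python
# Hypothetical strangler facade: route each request to the new system if
# its module has been migrated, otherwise fall through to the legacy system.

LEGACY = "legacy-system"
MODERN = "modern-system"

# Modules are rewritten one at a time; this set grows with each release,
# until the legacy system handles nothing and can be retired.
MIGRATED_MODULES = {"billing", "notifications"}

def route_request(module: str) -> str:
    """Return which backend should handle a request for the given module."""
    return MODERN if module in MIGRATED_MODULES else LEGACY
```

Because the facade is the only place that knows which system owns which module, each migration step is a small, reversible configuration change rather than a big-bang cutover.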
What is an automated test? And, why is it important to implement automated testing when undergoing a legacy modernization?
An automated test is a piece of code that tests a particular piece of code, possibly part of a legacy system. By creating a suite of automated tests, you can produce a reliable blueprint of behavior that a vendor can work to in replacing a module. If you don’t have automated tests, it is almost impossible to deploy a module without a lot of painful problems and bugs. If you do have good test suites, you will likely still have bugs, but they become much easier to identify and fix. Modern deployment techniques decrease the risk of roll-out by letting you deploy in an automated way and roll back the deployment if needed.
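One common way to build such a "blueprint of behavior" is a characterization test: it pins down what the legacy code currently does, so a replacement module must reproduce it. The fee rule below is invented purely for illustration; the point is the pattern, not the arithmetic.

```python
# A minimal characterization test. The expected values are captured by
# running the legacy code, not derived from a spec -- that is the essence
# of the technique: the test records current behavior as the contract.

def legacy_calculate_fee(amount: float) -> float:
    """Stand-in for a legacy routine whose behavior we want to preserve."""
    return round(amount * 1.05, 2)  # a 5% surcharge, as the old code does

def test_fee_matches_legacy_behavior():
    assert legacy_calculate_fee(100.0) == 105.0
    assert legacy_calculate_fee(19.99) == 20.99

test_fee_matches_legacy_behavior()
```

In practice such tests would run under a framework like pytest on every commit, so a vendor rewriting the module finds out immediately when the new code diverges from the old behavior.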
During your recent presentation at the Agile Sacramento Meetup, you talked about the importance of working first on APIs, next on GUIs, then persistence, and finally business logic when approaching a legacy modernization. Can you explain why a government agency should follow this order? Also, can this order be rearranged?
Yes, it can be rearranged, but the point is to provide the highest return on the investment of your time. It is really true that interfaces matter more than code, and so focusing on defining clean Application Programming Interfaces (APIs) is almost always the best thing to do at an enterprise level. In a modern system these tend to be web services, but that is really not essential; the essential thing is a clearly defined interface so that the code behind it can change without propagating a “code cancer” throughout the whole system.
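The idea that callers depend on the interface rather than the code behind it can be sketched with an abstract contract. Everything here is hypothetical (the `CaseLookup` name, its method, and both implementations); the point is that client code never changes when the implementation is swapped.

```python
# A clearly defined interface: callers depend only on this abstract
# contract, so the implementation behind it (a legacy database, a new
# web service) can be replaced without rippling changes through the system.

from abc import ABC, abstractmethod

class CaseLookup(ABC):
    """Contract for looking up a case record; implementations may vary."""

    @abstractmethod
    def find_case(self, case_id: str) -> dict: ...

class LegacyCaseLookup(CaseLookup):
    def find_case(self, case_id: str) -> dict:
        # In reality this would query the legacy system.
        return {"id": case_id, "source": "legacy"}

class ModernCaseLookup(CaseLookup):
    def find_case(self, case_id: str) -> dict:
        # In reality this would call the new web service.
        return {"id": case_id, "source": "modern"}

def case_summary(lookup: CaseLookup, case_id: str) -> str:
    # Client code is written against the interface, not an implementation.
    return f"case {case_id} via {lookup.find_case(case_id)['source']}"
```

Swapping `LegacyCaseLookup` for `ModernCaseLookup` requires no change to `case_summary` or any other caller, which is exactly what keeps the "code cancer" from spreading.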
I believe GUIs should be built on top of APIs, and often improving the GUI has the greatest return on investment for the end-user once you have clean APIs.
The persistence layer should be the easiest to replace but is the layer which benefits the users the least.
What are some of the common trade-offs governments must make when considering a legacy transformation?
Mega-launches lead to mushroom clouds. That is, attempts to deploy huge systems all at once often fail catastrophically. The size of the modules and which modules to replace is important. I would argue for making the modules as small as possible. Ideally you should release new code every two weeks with an automated deployment system. Moving to such an approach may require a change in mindset for governments. Kent Beck teaches to make the smallest possible meaningful improvement, and that is really critical.
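The deploy-then-verify-or-roll-back loop that makes frequent small releases safe can be sketched as follows. The health check and version bookkeeping here are illustrative, not any particular tool's API.

```python
# Sketch of an automated deployment with rollback: promote the new version
# only if it passes a health check; otherwise keep serving the old version.

from typing import Callable

def deploy(new_version: str, current_version: str,
           health_check: Callable[[str], bool]) -> str:
    """Deploy new_version; roll back to current_version if it is unhealthy."""
    # ... push new_version to the servers here ...
    if health_check(new_version):
        return new_version      # promotion succeeds
    # ... automated rollback happens here ...
    return current_version      # service keeps running on the old code

# Example: a failing health check triggers an automatic rollback.
running = deploy("v2.0", "v1.9", health_check=lambda v: v != "v2.0")
```

Because the rollback path is automated rather than a late-night manual scramble, releasing every two weeks (or more often) becomes a low-risk routine instead of a mega-launch.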
Another important trade-off is how much energy to spend on design and talking to users vs. pure development. Basically, I would argue half your energy, in terms of persons, salaries, or time, should be spent focusing on the user, and many organizations achieve much less than this ratio.
Do you have a common lesson learned by working on legacy modernization?
I have many, but the two most important are:
- Magic happens when you get actual users in the same room with developers. If you haven’t done this and you are an executive empowered to do it, please do so based on a formal Agile process of some kind.
- Rapid prototyping based on modern open-source software can really change the game in terms of showing people how easy it can be to make dramatic improvements to a legacy system if you have access to an API for at least a fraction of the functionality.
To this end, what were some of your favorite moments as a Presidential Innovation Fellow? Were there any projects you’re especially proud of?
I loved being a Fellow. It was the most meaningful work I have ever done. I think the best moments were when we demoed prototypes made very rapidly and really expanded people’s sense of what was possible in government.
I was very proud of the Prices Paid system, which is only usable by government buyers, so it hasn’t been reported on much. It only achieved a fraction of what could have been achieved, but it was good enough to be taken over and used by the GSA. I also enjoyed just brainstorming and helping others in government, which is a very important part of what Fellows do. An example has been written up [on the 18F site].
All of the Fellows carried a lot of techniques and practices into 18F, and I am very proud of being an early influencer of 18F.
The Presidential Innovation Fellows program is a model for how industry and entrepreneurs can do a “tour of duty” in government and really cross-fertilize ideas; I highly recommend it and similar programs to anyone who has a chance to take advantage of them.