The Cheap Fix to HealthCare.gov

Fixing the problems with HealthCare.gov need not be expensive. To be effective, however, the effort should not be a repair at all. Instead, the entire site needs to be rebuilt from scratch. The cost, according to at least one industry-leading software architect, would probably be only $4-$5 million, and the work would take about nine months. The public evidence surrounding the site's failure indicates that developers did not perform realistic testing as the system was being built, and instead waited until deployment, which proved catastrophic.

"This should have never happened," said Jeffrey Palermo, who has designed and built successful systems for large businesses. "And the approach to solve the problem probably shouldn't be to go back and try to fix what has been created. That might not even be possible. There needs to be a re-start and a new system built from scratch."

According to Palermo, who has built two successful software companies, HealthCare.gov was not designed to operate independently of the systems to which it was connected. The website, as a result, inherited the limitations and performance characteristics of the older, less capable systems operated by the government and other vendors. Those systems were never architected to handle the kind of traffic experienced by Internet-scale websites like HealthCare.gov.

"I think there was a central fallacy in the operation," said Palermo. "That was the idea that it would be possible to connect that many people at once to government servers and other vendors that operated the systems providing backend data to HealthCare.gov. The health care site's servers could manage the traffic, but when they began connecting to other data servers at the government and elsewhere, it all collapsed."

Palermo, who worked on critical commercial transaction systems at major enterprises like Dell, suggested that flawed architecture in the design of HealthCare.gov was responsible for the mess. The correct approach would be to cache the information about the various health care plans directly on HealthCare.gov instead of interfacing with another system just to show options to the user. If that data were held on the site's own servers, the buyer could make all decisions without any information being transferred to or from outside systems. No connection would be necessary until a decision was made to purchase, and new and modified plans could be reloaded on a schedule. In addition, data submitted by users could be accepted quickly by the website even though fully processing that data takes longer.
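The pattern Palermo describes has two halves: serve plan data from a local cache that is refreshed on a schedule, and accept user submissions immediately while deferring the heavy processing. A minimal sketch of that idea, assuming hypothetical names like `PlanCache` and `accept_submission` (nothing here reflects the actual HealthCare.gov codebase):

```python
import queue
import time

class PlanCache:
    """Holds plan data locally so browsing never touches backend systems."""

    def __init__(self, loader, refresh_seconds=3600):
        self._loader = loader            # callable that fetches plans from backing systems
        self._refresh = refresh_seconds  # reload on a schedule, not per request
        self._plans = loader()
        self._loaded_at = time.monotonic()

    def plans(self):
        # Serve cached data; reload only when the refresh window has passed.
        if time.monotonic() - self._loaded_at > self._refresh:
            self._plans = self._loader()
            self._loaded_at = time.monotonic()
        return self._plans

# Submissions are acknowledged immediately; full processing (eligibility
# checks, enrollment) happens later by draining this queue.
submissions = queue.Queue()

def accept_submission(form_data):
    submissions.put(form_data)  # no backend call on the user's request path
    return "received"
```

The key design choice is that the user-facing request path never waits on an external system: reads hit the cache, writes go to the queue.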

"This isn't exactly a new concept," said Palermo. "It's the way Amazon and Dell and other large commercial sites function. Data from online orders are managed on their sites and their servers and then they are placed in queues to be processed by backend servers after the consumer decides to buy. And in order to decide what to buy, the user sees a cached product catalogue at the website level so you don't have to go to slower backend systems just to browse products before you buy."

Palermo, presently the CEO of Clear Measure, Inc. in Austin, TX, said Dell's operation represents a good example of how HealthCare.gov ought to function. One set of back-end servers manages product catalogues, a separate group of web servers takes in data from online purchases, and yet another set of back-end servers processes order fulfillment. The website does not maintain a direct link to the back-end systems; it queues data to ensure the site itself never slows down.
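That separation of tiers can be sketched with a queue between them: the web tier only enqueues, and a separate fulfillment worker drains the queue, so a slow backend never blocks the website. This is a minimal illustration with hypothetical function names, not a description of Dell's actual systems:

```python
import queue
import threading

orders = queue.Queue()
fulfilled = []  # stand-in for the backend fulfillment system

def web_tier_place_order(order):
    orders.put(order)        # returns immediately; no backend call here
    return "order accepted"

def fulfillment_worker():
    # Runs on a separate tier; the web tier never waits on this loop.
    while True:
        order = orders.get()
        if order is None:    # sentinel to stop the worker
            break
        fulfilled.append(order)  # stand-in for calling the backend system
        orders.task_done()

worker = threading.Thread(target=fulfillment_worker)
worker.start()
web_tier_place_order({"sku": "plan-123"})
orders.put(None)
worker.join()
```

The thread here stands in for a whole fleet of back-end servers; the point is only that the two sides communicate through the queue, never directly.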

This is, apparently, not what the federal government created. Because all of the information about the available health care plans was known in advance, sound design logic would have placed as much of that information as possible on the government site, avoiding external connections until a transaction was needed or until data about the purchaser had to be retrieved from a government server. Even that data can be placed in a queue and delivered later to the external government or vendor system at a rate that does not exceed its functional capacity.
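Delivering queued data "at a rate that does not exceed" a backend's capacity amounts to rate-limiting the consumer side of the queue. A minimal single-threaded sketch, with the hypothetical helper name `drain` (real systems would use a proper worker process and retry logic):

```python
import queue
import time

def drain(q, deliver, max_per_second=10):
    """Hand queued items to a slow external system, one at a time,
    never faster than its stated capacity."""
    interval = 1.0 / max_per_second
    while not q.empty():     # fine for a single-threaded demo
        item = q.get()
        deliver(item)        # hand one record to the external system
        time.sleep(interval) # throttle to the backend's capacity
```

Because the website only writes into the queue, it stays fast regardless of how slowly `drain` is allowed to run.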

"So much of the money spent on that system went into other things," explained Palermo. "Managing various teams and gathering health care information. The site should have been the easy part. And I'm serious; I could put together a team of about 15 people and have a new and functional HealthCare.gov done in nine months for about $4 million. Who knows what the government will spend on a repair that may never work?"

The Obama administration, nonetheless, has reportedly awarded a lucrative repair contract to the same company that built the original flawed system. Quality Software Services, Inc., which is linked to a major campaign donor of the president's, created the dysfunctional data operations on the site and has now been given a contract to fix it.

Bad technology does not mean the policy behind HealthCare.gov is bad. But bad technology can destroy good policy.