If the vision of information technology offered by a Dulles Corridor start-up resembles the sleek elegance of an iPhone, government computing can look more like a dial-up modem.
The Census Bureau inked a $600-million contract for hand-held computers to conduct the 2010 count, then had to go back to paper when the project went over budget and fell behind schedule. The FBI put $170 million into one failed attempt to computerize its case files and has since plowed more than $400 million into a successor effort with disappointing results. But the worst failures may be those that don't make headlines: the obsolete hardware and software inflicted daily on government workers who use far better technology at home.
To help fix IT in Washington--an $80-billion annual expense--President Obama turned to the District, appointing Vivek Kundra, DC's chief technology officer, as the nation's first chief information officer.
Kundra, 37, began his government career as Arlington's director of infrastructure technologies from 2001 to 2003, then served as Virginia's assistant secretary of commerce and technology from 2006 to 2007, with two private-sector stints in between. But he made a name for himself innovating in DC.
Starting in March 2007 as then-mayor Adrian Fenty's chief technology officer, Kundra switched DC government offices to Google's cloud-based services and pushed agencies to publish data such as crime statistics in an easily searchable form on the Web.
By hiring Kundra, President Obama hoped to see a similar reboot of federal IT. To some degree, Kundra delivered--terminating overdue projects, consolidating data centers, creating cloud-computing strategies, and pushing a growing selection of government statistics and metrics to information portals such as Data.gov.
But he also hit obstacles. An August survey of government IT professionals by the online network MeriTalk gave Kundra only a B grade, with better marks for inspiration than for implementation.
A more accurate grade might be "incomplete." In August, Kundra left his post as federal CIO for a fellowship at Harvard's Kennedy School and Berkman Center for Internet & Society. This winter, he took a position as executive vice president of emerging markets at Salesforce.com, a San Francisco firm that provides Web-based marketing, customer-service, personnel, and other services to businesses.
Over coffee at a hotel in Rosslyn, he talked about what he's learned.
What was it like to move from city and state government to the federal level?
What you're instinctively overwhelmed with in the beginning is the scale of the federal government. You're looking at 12,000-plus major IT systems. And then you look at the budget--$80 billion in technology spending. But even the federal government doesn't violate the laws of physics, so to speak, when it comes to information technology. It's more zeroes, but the problems are very similar.
What about the people?
There was actually amazing talent within the federal government. I think a lot of people make the mistake of focusing on just political appointees or top management. You have some of the most dedicated workers buried away in this vast bureaucracy.
Why do so many of these government IT contracts not work?
On the government side, you basically have a bunch of people who are promoted because they've been there a long time. It has nothing to do with skill sets. That is one of the reasons, as part of the reform effort, we put in place a program-manager career track. If the customer isn't really smart, the vendor is going to take advantage of that.
Folks would go into these bids knowing they would lose money initially but would make money on change orders. It's almost like a business strategy.
One example of inefficiency you found is that the government was running more than 2,000 data centers. So the idea is to close 800 of them.
The ultimate vision would be to actually get to three digital Fort Knoxes. It's a 10-to-15-year journey to move in that direction. The reality is that there are a lot of legacy systems across the federal government that need to be consolidated. At the same time, there needs to be political will when it comes to Congress to make sure they're comfortable with shutting down these data centers.
Wouldn't you still need one in each congressional district?
Victory would be if you had one in every state so you ended up with 50. That would be huge.
What did you learn about selling government agencies on publishing their data--such as White House visitor records, FDA recall orders, and seismic records--for everybody else to reuse?
They want to make sure that as the information is liberated, the consequences of that don't come back and hurt them.
That is why when we launched the Data.gov platform, it started with only 47 data sets. And sort of going agency by agency, department by department, person by person, making that case, what you saw was a tipping point--where people could see amazing applications created by third parties.
When people would actually see that the value generated was far greater than the downside, over the course of this two-plus-year period we actually saw agencies begin to turn around. But even with 400,000-plus data sets, there are plenty that need to be liberated.