Information Technology and Social Progress

Government, industry and academia are investing in research that will allow advances in computing power to continue. It's time to harness this remarkable capacity to create a brighter future for all Americans and to write the next chapter in the history of American innovation.

It's widely appreciated that the information technology industry makes a vital contribution to America's economy, both in its own right and through its capacity to boost productivity by helping firms slash the time required to develop new products, tap the expertise of their employees, participate in global markets and improve the management of their supply chains.

That's why President Obama's national innovation strategy includes a number of policies to improve the competitiveness of the IT and other knowledge-intensive sectors, such as making more spectrum available for broadband wireless services, expanding and making permanent the Research and Experimentation tax credit, reforming export controls and encouraging high-growth entrepreneurship.

But the Administration is also interested in the extent to which IT can be used to tackle societal problems. Imagine, for example, advances in information technology that increase the ability of people with disabilities to participate in the workplace and interact with any Web site; mobile "apps" that use speech technology to reduce the huge gap in vocabulary size and school readiness between kids from rich and poor households; a game as compelling as World of Warcraft, that allows American students to outperform their peers in Shanghai and Singapore in math and science; a digital tutor that dramatically reduces the time needed for an unemployed worker or high-school dropout to gain the skills and credentials needed for a high-paying job; new technologies for promoting healthier behavior that are much more effective than traditional broadcast media; and wireless devices that reduce the costs of managing chronic diseases such as diabetes and asthma while improving health outcomes.

There is no doubt that the power of information technology has been underutilized by the public and social sectors. Consider, for example, that traditional social services have costs that increase in proportion to the number of people being served, while digital technologies -- though they may have high fixed costs -- typically have low marginal costs to reach more people. That's why Silicon Valley start-ups can, with modest amounts of capital, develop software and online services that are used by tens of millions of people within months. And it's why IT-based social services have such extraordinary potential to improve people's lives.
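The cost-structure argument above can be made concrete with a quick sketch. The figures below are purely illustrative assumptions, not numbers from this post; the point is only the shape of the two curves:

```python
def traditional_cost(n_people, cost_per_person=500):
    """Traditional social services: cost grows in proportion to people served."""
    return n_people * cost_per_person

def digital_cost(n_people, fixed_cost=2_000_000, marginal_cost=1):
    """Digital services: a high fixed cost, but a very low marginal cost per user."""
    return fixed_cost + n_people * marginal_cost

# At small scale the digital service looks expensive; at large scale it wins decisively.
for n in (1_000, 100_000, 10_000_000):
    print(f"{n:>10,} people: traditional ${traditional_cost(n):,}  digital ${digital_cost(n):,}")
```

With these hypothetical parameters the digital service overtakes the traditional one at roughly 4,000 users, and at ten million users costs a few hundred times less.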

In fact, Internet companies have learned a great deal about how to use techniques such as "data analytics" and rapid, low-cost experimentation to continually improve their services even as productivity in conventional public-sector domains remains stagnant or even declines. We should aspire to use technology to deliver continuous improvement in socially important areas such as education through, for example, the application of new knowledge about what works in personalized instruction to develop online courses that improve the more students use them.
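"Rapid, low-cost experimentation" in practice usually means randomized A/B tests: show two variants of a service, measure which performs better, and keep the winner. A minimal sketch of the statistical core, using a simple two-proportion z-test and made-up numbers (the data and rates here are hypothetical):

```python
import math

def ab_test(conversions_a, n_a, conversions_b, n_b):
    """Two-proportion z-test: does variant B's success rate differ from A's?"""
    p_a, p_b = conversions_a / n_a, conversions_b / n_b
    p_pool = (conversions_a + conversions_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se  # z-score; |z| > 1.96 is significant at the 5% level

# Hypothetical experiment: 5.0% vs 6.0% course-completion rate, 10,000 users each
z = ab_test(500, 10_000, 600, 10_000)
print(round(z, 2))  # → 3.1, well past the 1.96 threshold
```

The same loop of instrument, randomize, measure, and iterate is what could let an online course "improve the more students use it."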

Entrepreneurs routinely mix and match hundreds of different technologies and capabilities, such as cloud computing, mobile devices, social media, machine learning, online marketplaces, simulation, data visualization, question-answering and platforms for mass collaboration, to address particular markets. Why not use this "combinatorial innovation" (as Google chief economist Hal Varian has called it) not just to make a difference in these niches but also to improve the human condition in new and creative ways?

Realizing the promise and potential of information technology to achieve America's societal goals is a top priority for this Administration. This year, for example, the President's budget includes $90 million to create an organization within the Department of Education that would do for education what DARPA (Defense Advanced Research Projects Agency) has done for the military.

Progress in this area will require close collaboration between the public and private sectors. Individuals and organizations from government, industry, philanthropy, academia and non-profits will need to come together to answer questions such as:

· What problem are we trying to use IT to solve, and is it plausible that IT can make a real difference?

· What metrics should be used to evaluate the effectiveness of an IT-enabled solution?

· What are the roles and responsibilities of different stakeholders in the design, development, evaluation, and scale-up of IT applications with social benefits?

· If some applications of IT have high social returns and modest or uncertain private returns, can companies attract financing from "impact investors" who are generally willing to accept a lower financial return if the social return is sufficiently compelling?

· How can the public sector be a better customer for IT applications that help address societal challenges? How might different public sector users (e.g. large school districts, state employment agencies) pool their demand?

One promising sign of the technology industry's growing interest in pursuing these collaborations is the formation of ConvergeUS, a non-profit organization launched by the leading industry group TechNet with the goal of accelerating technology-based social innovation. ConvergeUS intends to organize annual summits that bring together technologists, social entrepreneurs and subject-matter experts to explore the potential of technology to help meet societal needs in areas such as early childhood education, healthcare and military families.

The potential has never been greater. We are continuing to see dramatic advances in our ability to store, process and transmit information. Forty years ago, Intel's first microprocessor had 2,300 transistors; today's microprocessors have over 2 billion transistors. Government, industry and academia are investing in research that will allow these advances in computing power to continue for decades to come. It's time to harness this remarkable capacity to help create a brighter future for all Americans. Doing so could create the next chapter in the extraordinary history of American innovation.
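The forty-year transistor comparison tracks Moore's Law, the rule of thumb that transistor counts double roughly every two years. A quick sanity check of the arithmetic (the two-year doubling period is the standard rule of thumb, not a figure from this post):

```python
transistors_1971 = 2_300            # Intel's first microprocessor, the 4004
years = 40
doubling_period = 2                 # Moore's Law rule of thumb: double every ~2 years

projected = transistors_1971 * 2 ** (years / doubling_period)
print(f"{projected:,.0f}")          # → 2,411,724,800: roughly 2.4 billion,
                                    #   consistent with "over 2 billion" today
```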

Tom Kalil is Deputy Director for Policy at the White House Office of Science and Technology Policy and a Senior Advisor at the National Economic Council.
