There's a mystery lurking at the heart of the American labor market that is confounding today's managers and leaders. The U.S. workforce is more efficient than ever, with productivity rising 102 percent from 1970 to 2012. But as anyone working in social media will tell you, this very same group of productive people has also driven the incredible growth in website hits, Twitter activity, and Facebook usage, thanks to procrastination and boredom at the office.

According to Gallup, 70 percent of American workers say they dislike or even hate their job, and 20 percent are so dissatisfied that they "roam the halls spreading discontent," costing an estimated $450-$550 billion in lost productivity each year. At the root of much of that discontent is an issue of trust -- according to a new survey by the American Psychological Association, only about half of employees feel their company is open and upfront with them, and a quarter of people simply don't trust their employer.

So what is to be made of this seeming incongruity? How are workers simultaneously so productive and yet so unhappy with their jobs? It's the paradox of big technological leaps: innovations greatly expanded what it's possible for workers to do -- from automated assembly lines to managers who can suddenly text, email or Skype their team whenever needed. But the social contract between employees and employers that governs what is reasonable to expect of workers -- not to mention the legal contracts concerning work hours and compensation -- hasn't yet caught up. For example, that doubling of productivity from 1970 to 2012? During that same period, the median income for people working full-time rose only about 7 percent. The news has been even worse more recently -- from 1996 to 2012, productivity rose by 40 percent while median wages flatlined, and new data indicates that the American middle class has not only shrunk but fallen far behind the middle class of other developed nations.

Against this backdrop, 'big data' approaches to labor issues (operating under various noms de guerre, like "HR Analytics," "Workforce Science," and even the "Quantified Self Movement") are being hailed as a potential savior for both unhappy workers and corporations. But while these approaches hold a lot of promise, managers need to make sure their big data initiatives are undertaken with workers and workplace morale in mind. That's not just ethically admirable; it's business savvy: more efficiency is a great thing, but not if it leads to negative externalities, like convincing good workers to quit, or turning a garden-variety unhappy employee into an "insider threat" who might embezzle company funds or steal consumer credit card information.

The New New Thing

Google's "People Operations" department has used advanced statistics to rebuild its entire hiring, firing, training and promotion processes. Progressive, data-driven companies like Netflix and Zappos have shown that added employee perks like office nap rooms and unlimited vacation time produce workers who are not only happier and less likely to quit, but also much more efficient.

At RedOwl Analytics (the data analytics software company where Renny is a co-founder and Joe is a senior data analyst), we've helped companies identify workers who may not be particularly good at self-promotion, but are integral to the flow of communication, quietly connecting key players and improving team dynamics. A recent study we conducted highlighted areas of a company where employees were feeling burnt out and discouraged, so that managers could intervene to address morale problems.

But some early successes do not a Glorious Revolution make. Cash-strapped public schools paying to monitor students' personal social media accounts and companies tracking their employees' physical fitness can look extreme, and the headline-hungry media is always happy to raise an alarmist red flag about "Big Brother" surveillance. Fortunately, this isn't the first time new technology and data-driven management ideas have been tested. Two particularly relevant periods in American business history -- the "Scientific Management" movement of a century ago, and the post-World War II "Organization Man" era -- can provide some great lessons in what can go wrong, and what can go right, when companies use statistics to cut spending and improve efficiency.

The Old New Thing

A century ago, the stopwatches, slide rules and efficiency studies of the world's first management consultant, Frederick Winslow Taylor, were hailed as a coming economic miracle. Taylor and his "Scientific Management" disciples were lionized in the progressive press, consulted with the likes of President Theodore Roosevelt and future Supreme Court Justice Louis Brandeis, and accumulated enough accolades and acolytes to make even Steve Jobs or the most sought-after TED-talkers of today blush.

Taylor's work provided a number of important insights into manufacturing techniques and how to improve efficiency. But Taylor himself was contemptuous of the workers he studied -- routinely firing people, insulting their intelligence and comparing them to "dray horses and donkeys" in his books. Progressive Era politicians and ordinary Americans became so suspicious of "Robber Baron" industrialists trying to squeeze a few more dollars out of already exploited factory workers that the movement was stonewalled. In the end, "Tayloritis" was so reviled that today we associate it more closely with the dystopian absurdism of Charlie Chaplin's "Modern Times" or Lucille Ball's job at a chocolate factory than with savvy, progressive management techniques. Taylor understood numbers, but he never understood the workers that his statistics ultimately relied upon.

Despite the problems with Taylor's methods, though, many of his ideas were sound, and they regained influence during and after World War II, with modern management gurus like Peter Drucker calling Taylor's work "the most powerful as well as the most lasting contribution America has made to Western thought since the Federalist Papers." The 1950s and '60s were the age of the Organization Man in American business, with a business degree on his office wall ("Taylor is the mortar...of every American business school," Harvard historian Jill Lepore wrote in the New Yorker) and reams of data on his desk.

By this point the field had been rebranded with monikers like Operations Research and Systems Analysis, and the stat-heads did come into occasional conflict with workers. When a group of Air Force statisticians including future Secretary of Defense Robert McNamara went to work for the Ford Motor Company, they were derisively nicknamed the "Quiz Kids" (later changed to "Whiz Kids") by Ford managers because they were constantly distracting workers with their endless questions. By and large, though, their work was a success. Manufacturing techniques, supply-chain management and marketing were revolutionized, and American companies became the envy of the world, not only in size but also in production quality and price.

The Organization Men were so much more effective than the Scientific Managers in part because they learned from the externalities that Taylor never understood, like worker morale, and took employee concerns into account. In 1911, Frederick Taylor demanded that metalworkers at a government arsenal double their output or face his wrath. Instead they declared a strike, gained sympathy from reporters, and secured a Congressional investigation that got Taylor and his methods banned from government facilities. A half-century later, one of the great inefficiencies that Whiz Kid Robert McNamara -- the first president of Ford not named Ford -- went to work eliminating was the enormous cost of labor strife. Strikes, along with more informal tactics like work slowdowns and sick-outs, were tremendously expensive for car companies. Better to give in occasionally to union demands, McNamara and his less high-handed number-crunchers realized, and let union reps have some say in new initiatives, than to engage in constant, costly battle with them.

Modern managers need to keep these two examples in mind when they take on new data-driven HR initiatives or employee monitoring systems. Employees aren't stupid, and information is cheap. In a climate in which only half of people think their employer is honest with them, how can employees be expected to have faith in complex data-collection initiatives that management institutes with little or no consultation? "People Analytics" holds much promise for the American workplace, but only if we remember and are guided by the principle that the people still need to come before the analytics.
