Michael C. Mankins, who has more than 25 years of consulting experience advising business leaders, is well placed to write the article below. It challenges us to think again before we invest in the next ‘new technology’. He argues that we may already have reached a tipping point, where we have to ask whether a new technology will actually help people get more work done or not.
Twenty years ago, new office technologies like email and teleconferencing contributed to a dramatic boost in productivity. Information flows accelerated. Collaboration with coworkers became easier and easier. Productivity grew significantly faster during the 1990s and early 2000s than in previous years.
Today, productivity growth has declined appreciably. Since 2007 it hasn’t even kept up with inflation.
What happened? The financial crisis, sure, but that’s not all. Companies have continued to invest in new technologies for white-collar workplaces, but the benefits are no longer visible. In fact, we may have reached a tipping point where each new investment in office technology must be carefully assessed against a simple test: will it actually help people get more done, or not?
The roots of this conundrum lie in a seemingly innocent piece of technological wisdom known as Metcalfe’s Law. Robert Metcalfe is a giant of the technology world, co-inventor of Ethernet and cofounder of 3Com, a company later acquired by Hewlett-Packard. He postulated that the value of a network increases with the square of the number of users. One fax machine, for example, is worthless. Two fax machines are worth only a little. But a network that includes thousands of fax machines is worth millions, because now all those people can send documents to one another.
Metcalfe’s Law, however, has a dark side: as the cost of communications decreases, the number of interactions increases exponentially, as does the time required to process them. The impact can be seen in the workplace. Thirty years ago, when executives or managers got a phone call while they were away, they received pink slips from their secretaries saying that someone had called. A busy exec might get as many as 20 on an average day, or about 5,000 a year. Then came single-user voicemail, followed by multi-party voicemail (the pre-email version of “Reply All”); the cost of leaving a message thus declined, and the number of messages left rose accordingly, perhaps to 10,000 a year. Then, finally, came today’s layers of networks — phone, email, IM, and so on — in which the cost of communicating with one person or many hundreds of people is virtually nil. Not surprisingly, the number of messages has burgeoned, perhaps to 50,000 a year (figure 1).
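Both sides of Metcalfe’s Law can be sketched in a few lines of code. The network values come straight from the n-squared rule (counted here as the number of possible pairwise connections); the message-volume figures are the article’s own rough estimates for one busy manager, not measured data.

```python
# Metcalfe's Law and its dark side, using the article's illustrative numbers.

def network_value(n):
    """Value grows with the square of the number of users --
    proportional to the n*(n-1)/2 possible pairwise connections."""
    return n * (n - 1) // 2

# Two fax machines allow one connection; a thousand allow ~half a million.
print(network_value(2))      # 1
print(network_value(1000))   # 499500

# The dark side: as the cost per message falls, message volume explodes.
# Rough per-manager trajectory described in the text:
messages_per_year = {
    "pink slips (secretary)": 5_000,
    "voicemail": 10_000,
    "email/IM/phone layers": 50_000,
}
for era, volume in messages_per_year.items():
    print(f"{era}: {volume:,} messages/year")
```

The same quadratic growth that makes the network valuable also multiplies the interactions each member must process, which is the trade-off the article turns to next.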
The same principle applies to meetings. It used to be that setting up a meeting with five managers was tricky. The meeting organizer’s assistant had to contact the assistants for the other participants. The assistants checked their bosses’ calendars — most were not kept electronically — and then agreed upon a time and location. All that took a good deal of time and effort. With the introduction of Microsoft Outlook and other calendar programs, the cost of setting up a meeting plummeted. As a consequence, the number of meetings has increased and the number of attendees per meeting has exploded. Some 15% of an organization’s collective time is spent in meetings — a percentage that has increased every year since 2008.
My colleagues at Bain and I have studied these effects using people analytics and data mining tools. We combed through email, IM, calendar and other data to understand how organizations spend their collective time. We then combined this data with information on headcount and productive output to understand the impact of technology in the workplace. Here’s what we found:
A typical front-line supervisor or midlevel manager works 47 hours per week. Of this time, he or she devotes 21 hours to meetings involving more than four people and another 11 hours to processing e-communications. (This doesn’t count the emails sent during meetings, a common practice in many companies.) So the manager has only about 15 hours a week left for other work.
And that’s not all. If you deduct time periods of less than 20 minutes between meetings or processing emails as “unproductive time” — it’s hard to start and complete most tasks in less than 20 minutes — then you are left with a sad truth: The average manager has less than 6½ hours per week of uninterrupted time to get work done.
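The arithmetic behind these figures can be checked with a quick calculation. Only the first subtraction is derivable from the numbers given; the 6½-hour figure comes from Bain’s analysis of gaps between calendar entries, so it is taken here as stated.

```python
# Reproducing the manager time-budget arithmetic from the text.

work_week = 47   # hours worked per week
meetings = 21    # hours in meetings involving more than four people
e_comms = 11     # hours processing e-communications

remaining = work_week - meetings - e_comms
print(remaining)  # 15 hours per week for everything else

# Bain counts gaps shorter than 20 minutes as unproductive,
# leaving roughly 6.5 hours of truly uninterrupted time.
uninterrupted = 6.5
print(f"{uninterrupted / work_week:.0%} of the week")  # about 14%
```

Put another way, barely a seventh of the average manager’s week is available for focused work, which is the "sad truth" the paragraph above describes.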
Meanwhile, the number of interactions required to accomplish anything has increased. A recent CEB study found that 60% of employees must now consult with at least 10 colleagues each day just to get their jobs done, while 30% must engage 20 or more. The result? Companies take more time to do things. For example, it takes 30% longer to complete complex IT projects, 50% longer to hire new people, and nearly 25% longer to sign new customer contracts. And that’s just in the last five years.
A Luddite’s Proposal
The original Luddites, nineteenth-century English textile workers protesting labor-saving machinery, feared that the new technology would take their jobs away. Some of today’s office technologies have the opposite effect: they encourage workers to behave in ways that are wasteful and unproductive, such as scheduling unnecessary meetings.
In our work with clients, we recommend that organizations consider two factors in assessing technology investments:
1. What impact will the new technology have on organizational time? Will the technology actually enable people to do more in less time, or does it merely make work and collaboration easier? Investments that reduce the cost of interactions but do not themselves save time should be viewed skeptically. Unless an organization is highly disciplined in its management of time, the dark side of Metcalfe’s Law will trample whatever benefits the new technology might promise.
2. Could better rules eliminate the need for further investment? Today, many investments in new technology are essentially workarounds for bad behaviors or poor procedures for sharing information. Were customer, financial and operational information readily available to all, for instance, the need for crowd-sourcing or reconciling data sets would be reduced significantly. Leaders should carefully assess whether to accept a bad behavior as given and invest in new technology to cope with it, or instead change the dysfunctional behavior.
Technology can have enormous benefits in the workplace. But it’s fair to ask whether we have reached the point of diminishing returns in some areas. Call me a Luddite, but I believe we have. If leaders consider the unintended consequences of new technology on collaboration and workforce productivity, they may well reject many new investments.