‘Big Brother’ managers should turn the lens on themselves
Blue-collar workers are used to being monitored 24/7. Lean manufacturing facilities often have productivity screens right by work stations, displaying progress in real time so managers can gauge whether staff should get a bit extra at the end of the day or be docked for not working hard enough.
Likewise, low-wage service staff working in restaurants or retail chains often have their lives upended by algorithmic software that co-ordinates their schedules to customer demand, making it difficult to look after children or plan very far in advance. Upper-level white-collar workers have historically enjoyed more humane evaluation methods; but now, thanks to surveillance capitalism, their progress is being tracked minute by minute, too.
The number of employers using data surveillance software to monitor employees has doubled since the start of the pandemic. Nearly two-thirds of medium to big companies in the US (and many elsewhere) now use such systems, which do everything from monitoring email and web browsing, to tracking workers’ location and movement, to recording what keystrokes and eye movements they make, or when their screens go dark.
In some ways, this is the next step from existing workplace software programs such as Google Workspace or Microsoft 365, which gather certain kinds of data but don’t track keystrokes or take screenshots. Such programs are part of the surveillance economy, no question, but they may obscure individual workers’ identities or limit the time period in which data can be tracked.
Either way, the rise of workplace surveillance represents what Microsoft chief executive Satya Nadella has called a new “productivity paranoia” on the part of employers. Clearly, the work-from-home trend and many employees’ reluctance to give it up has managers desperately seeking new performance metrics. But digital Taylorism isn’t the way forward, for three reasons.
First, just as office face time was an imperfect productivity metric, tracking keystrokes is too. White-collar jobs, particularly the kind that won’t eventually be done by technology, tend to be about creative thinking, relationships, teamwork and soft skills. Indeed, the very productivity-enhancing things that managers cite when trying to get people to come back into the office, such as accidental idea exchanges and trust-building at the water cooler, are exactly the activities that can’t be tracked by surveillance software.
Second, while there’s no proof these metrics do a good job at gauging productivity, research shows they increase stress and resentment. A recent Brookings blog post on the topic cited a case in which a retail store worker used an office computer to check her personal email and bank account occasionally (who hasn’t?), and subsequently found out that another employee had seen the information. When she informed her employer, she was fired.
While you could argue she simply made a bad judgment call to use a work computer for a personal task, the anecdote reflects something bigger, which is that there are barely any lines between work life and personal life these days. Post-pandemic, with employees often logging in to corporate systems from their own computers at home, or doing weekend Zooms because they can, there is a need for stronger protections about how and where employees can be monitored.
Privacy and labour advocates would like to see clear notifications for employees about when surveillance software is being used (some US states, such as Connecticut, Delaware and California, already have this in place). The EU mandates this via GDPR, though in practice many workers simply tick the consent box, since they can’t do their jobs if they don’t sign off on data monitoring. I’d argue that surveillance of employees working from home should be illegal, and that surveillance in the workplace should be done only for limited time periods, with full transparency and a clear purpose (measuring the success of a new project, for example).
Perhaps the best way for managers to improve productivity would be to turn the data lens on themselves. Managerial meetings are ripe for study: consider that the number of meetings attended by employees rose 13.5 per cent during the pandemic, even though research shows that 70 per cent of all meetings keep employees from doing more productive work. Much of the blame goes to newbie managers, who — perhaps driven by their own desire for visibility — hold almost a third more meetings than more seasoned peers.
Aside from cutting meetings, companies may do well to cut managers. One recent NBER paper by Daron Acemoglu, Alex He and Daniel le Maire found that business managers in particular (meaning MBA types, rather than those with sectoral expertise) tend to lower worker wages without increasing output, investment or employment growth. The authors conclude these types of bosses do exactly what they are taught to do at business school, which is to cut labour, cut costs and maximise share price.
But they also find that in doing so, such companies tend to lose the most highly skilled workers, who leave for greener (and perhaps kinder) pastures. Perhaps the lesson here is that managers themselves are often the key differential in corporate productivity. Perhaps they should practise less surveillance, and more self-examination.
Letters in response to this article:
Productivity gap is harder to spot among laptop class / From Jonny King, Tokyo, Japan
Don’t heap the blame on beleaguered managers / From Hilary Sutcliffe, Director, Society Inside, London SE21, UK
Copyright The Financial Times Limited 2022