A detailed article from a British perspective, discussing the darker side of the emergent ‘Gig Economy’ we see all around us (think Uber, Lyft, Airbnb, etc.). Using complex algorithms to match people who need a service with those who can provide it on demand has brought tremendous improvement and opportunity to all parties.
But what about the downside of being accountable to an algorithm rather than to real people? How do you factor in relationships? What happens when you are injured and have no insurance coverage? Or when you must accept nearly every ride offered, even ones that feel unsafe or run at a loss, or risk being dropped as a provider?
Having driven for Lyft these past several months (and before that, briefly, for both Lyft and Uber), I can appreciate the points being made here.
This protest outside the UberEats office in south London on August 26 is one of the first industrial disputes to hit the city’s so-called gig economy. It is a strange clash. These are workers without a workplace, striking against a company that does not employ them. They are managed not by people but by an algorithm that communicates with them via their smartphones. And what they are rebelling against is an app update.
It’s hard to spread the word when you don’t even know who your colleagues are. But the couriers have an idea. They open their apps as customers and order food to be delivered to them. As UberEats couriers arrive with pizzas at the place their app has sent them, the strikers tell them about the protest and urge them to join in. Algorithmic management, meet algorithmic rebellion.
Using algorithms to monitor performance is associated with companies like Uber and the gig economy, but also harks back to the ‘scientific management’ of Frederick Winslow Taylor a century ago. John Gapper discusses the return of this trend with Sarah O’Connor.
Tanaka says his algorithm is also a more reliable boss because it is better at forecasting demand: employees tell the app when they are available to work, then receive their schedules in advance. They are not subject to the chaotic scheduling common in many retail jobs, where employees are sometimes sent home early because the shop is unexpectedly quiet, or told to wait by the phone.
Better for the employer? Or better for the workers? Both, argue Tanaka [and Netessine]. Unlike human store managers, algorithms do not use hours to reward the people they like, or the people they are related to, or the people who look like them. Tanaka is forever fighting battles against the “incredible biases” of managers who want to tweak his algorithm’s carefully calibrated schedules. “The manager comes in and says, ‘Look, my favourite person is not working when I’m working, so I want to muck with the schedule’ and it’s like, ‘Oh my god, just by doing that you lost a few percentage points in sales!’”