An algorithm has betrayed it
We are already used to the price of a flight changing depending on when you search for it, where you search from, or how many times you have visited the page, and to an Uber or Lyft ride costing more when it is raining or when it is three in the morning. Companies have been adjusting their prices dynamically for years based on what they know about you. What you may not have known is that some companies are starting to do exactly the same thing with the salary and bonuses they pay you.

The phenomenon already has a name: "surveillance wages." And it is no longer limited to delivery workers or drivers in the so-called gig economy: it is also beginning to reach human resources systems, where it conditions salary increases, access to incentives, and even the minimum base salary you would be willing to accept given your economic needs at that moment. A report from the Washington Center for Equitable Growth warns that the practice is spreading to everyday sectors such as healthcare, customer service, logistics and retail.

How the algorithm that sets your salary works

The mechanism is simpler than it seems. Companies use artificial intelligence tools that collect real-time data about each worker from public sources and social media: how often they accept shifts, how quickly they respond to offers, what they were paid in previous jobs, or whether they have outstanding loans and credit card debt. With all of this, the system estimates the minimum wage that person would accept for the job and offers exactly that.

According to Nina DiSalvo, policy director of the labor group Towards Justice, "some systems use signals associated with financial vulnerability, such as data on whether a potential employee has applied for a quick loan or has a high credit card balance, to infer the minimum salary a candidate might accept."

The result is that two people doing exactly the same job can be paid very different amounts, without either of them knowing it or being able to do anything about it.
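To make that logic concrete, here is a minimal, hypothetical sketch in Python of how a system like the ones described above could turn those signals into an individualized offer. Every signal name, weight and threshold below is invented for illustration; this is not the method of any real vendor, only a way to show how two candidates for the same job can end up with different numbers.

```python
from dataclasses import dataclass

# Hypothetical illustration: the signals, weights and thresholds are invented
# for this sketch and are not taken from any real vendor's product.
@dataclass
class WorkerSignals:
    shift_acceptance_rate: float    # 0.0-1.0, share of offered shifts accepted
    avg_response_minutes: float     # how quickly the worker answers offers
    previous_hourly_pay: float      # pay reported in earlier jobs
    has_recent_quick_loan: bool     # proxy for financial vulnerability
    credit_card_utilization: float  # 0.0-1.0, balance relative to credit limit

def estimate_offer(signals: WorkerSignals, market_rate: float) -> float:
    """Estimate the lowest hourly offer the worker is likely to accept."""
    offer = market_rate

    # Someone who accepts almost every shift and answers within minutes is
    # read as having few alternatives, so the offer is pushed downward.
    if signals.shift_acceptance_rate > 0.9:
        offer *= 0.95
    if signals.avg_response_minutes < 5:
        offer *= 0.97

    # Financial-vulnerability signals push the estimate down further.
    if signals.has_recent_quick_loan:
        offer *= 0.93
    if signals.credit_card_utilization > 0.8:
        offer *= 0.95

    # Never offer more than the worker's previous pay or the market rate.
    return round(min(offer, signals.previous_hourly_pay, market_rate), 2)

if __name__ == "__main__":
    vulnerable = WorkerSignals(0.95, 3.0, 16.50, True, 0.90)
    comfortable = WorkerSignals(0.60, 45.0, 19.00, False, 0.20)
    # Same job, same market rate, two different offers.
    print(estimate_offer(vulnerable, market_rate=18.00))   # prints 14.65
    print(estimate_offer(comfortable, market_rate=18.00))  # prints 18.0
```

The point of the sketch is the asymmetry: the candidate never sees the signals or the weights, only the final figure.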
A model that penalizes those who most need to work

The problem is that surveillance wages do not only affect job seekers. As the report 'Prohibition of Surveillance Prices and Wages', prepared by several US labor organizations, reveals, once hired the worker continues to be monitored, and the system adjusts their pay depending on how they react to the company's requests: whether they accept last-minute shifts, work more hours than usual, or see their personal finances deteriorate. The algorithm interprets all of this as a sign that the employee needs the job, and can take advantage of it to offer less money. The more vulnerable the worker, the more exposed they are.

The Washington Center for Equitable Growth study analyzed 500 AI companies dedicated to workforce management and identified 20 vendors whose products carry a high risk of producing algorithmic pay discrimination. Sixteen of those 20 vendors integrate their products directly with payroll or workforce management platforms, giving them continuous access to each employee's most sensitive data.

Opacity as part of the design

One of the most worrying features of this model is that workers do not know what data is being used against them or what variables go into calculating their salary, only that those variables are no longer experience, skill or productivity. The algorithms that determine pay are, for the most part, black boxes: neither the employees, nor the unions, nor regulators have access to their internal logic.

Joe Hudicka, author of the book 'The Revolution of AI Ecosystems', describes it this way in remarks collected by MarketWatch: "We know the concept of the glass ceiling. But at least there we have some visibility through it. This wage-surveillance ceiling is made of iron."

A study from Cornell University's Worker Institute found that 42% of digital platform workers in New York reported having been paid less than agreed, with no clear way to complain, precisely because their activity is controlled by algorithms they cannot audit. Researcher Veena Dubal, of the University of California, has spent years documenting how these platforms adjust pay downward on an individualized basis, for example by no longer assigning rides to a driver who is about to reach a productivity target.

The legal response that is beginning to take shape

Faced with this situation, legislators are starting to move to curb the practice. In the US, the state of Colorado is processing bill HB26-1210, one of the first initiatives in the country that specifically seeks to regulate the use of algorithmic tools that set wages based on the surveillance of personal data.

In Spain, the Rider Law, aimed at regulating labor relations between delivery platforms and self-employed couriers, also amended article 64.4 of the Workers' Statute to require access to the algorithm that manages working hours and the assignment of orders, precisely to prevent this kind of practice. The new European pay transparency rules point in the same direction, preventing two people who do the same job from receiving very different salaries.

In Xataka | Against all odds, AI is reactivating employment. Don't get excited: if you are young, it is increasingly difficult to get hired

Image | Unsplash (Towfiqu barbhuiya)