Amazon insisted its developers use its AI tools to work. Now they're mostly fixing what the AI breaks
AI was supposed to make us work less and work better. At Amazon's offices, the reality is proving very different. According to several of the company's employees, internal tools like Kiro are causing a rebound effect: developers spend more time fixing the defective code those tools generate than writing their own.

Out of the frying pan, into the fire. The situation is ironic because, as some engineers point out, they are trying to dig out of a problem caused by AI by using more AI. Dina, a software developer in New York, joined Amazon two years ago, and her job started out as writing code. Lately, however, she was not so much writing code as fixing the code that Amazon's programming AI, called Kiro, broke. According to her, the model was misleading and frequently generated bad code. Days after speaking with The Guardian for that report, Dina was fired.

Layoffs at full speed. Amazon's case is especially stark given the company's recent wave of mass layoffs. In recent months it has cut 30,000 people, about 10% of its corporate workforce. Amazon denies that these layoffs have anything to do with AI, but CEO Andy Jassy has suggested in internal communications that the efficiency gains from process automation will allow the company to operate with leaner teams. That quietly contradicts the company's official line.

Career suicide. Many of the employees interviewed by the outlet said they felt they were committing a kind of career suicide. Their current job is to document processes in detail and correct system errors, in essence preparing for their own replacement by machines. These forced training phases are making employees feel their time at Amazon has an expiration date.

Maybe I don't even need a hammer. We have seen what is happening at Amazon before. According to the employees interviewed, the deployment of AI tools has in many cases been chaotic.
They have been forced to use "half-baked" tools born of hackathons, without anyone adequately evaluating whether those tools really provided the right solution. One engineer put it plainly: you can't look at every problem and think about how to apply the hammer you happen to have. The first thing is to know whether the problem really needs a hammer at all.

Service outages. AI integration issues are also reportedly behind Amazon service outages. Internal reports link at least two of those incidents to code changes made with AI tools. The changes were not properly reviewed, and although the company can blame "human error" for these problems, the origin is the usual one: delegating critical decisions to systems that are not yet 100% reliable.

Amazon knows who uses AI. There is another disturbing element in Amazon's adaptation to the AI era: managers are monitoring, down to the millimeter, what employees do with AI. What the company was already doing in its warehouses to measure worker performance and productivity is now happening in its offices too. There are dashboards where team leads track who uses AI and how often, and in some teams the goal is for at least 80% of the workforce to use these tools weekly, whether they are useful or not.

Promotions. How much you use AI tools can also be decisive for internal promotion at Amazon. Documents have surfaced in which candidates are explicitly asked how they have used AI to increase their impact. The message is clear: if you do not embrace this technology, however deficient it may be, your chances of moving up get very complicated.

Low morale. Among the employees surveyed, the prevailing feeling was demoralization across teams. In fact, more than 1,000 workers signed a petition against this aggressive deployment of AI tools.
For them, the company culture is changing, and what is now demanded is working more hours with fewer resources, under the excuse that outside competitors are "hungry."

In Xataka | The role of future developers was supposed to be "reviewing" the code that AI wrote. Claude just buried that idea.