Anthropic has announced new weekly limits for its Claude plans. They have already come into force and especially affect Claude Code, its AI programming tool. The measure comes after some users were running the tool continuously, 24 hours a day, consuming resources equivalent to tens of thousands of dollars on the most expensive $200-per-month subscriptions.
There is saturation and the numbers don't add up. The most intensive developers are hammering Anthropic's servers, saturating them with usage that goes far beyond what the company had originally planned. Some programmers have configured Claude Code to run permanently in the background, automating tasks and generating code non-stop. As a result, Claude Code has suffered partial or total outages at least seven times in the last month, according to Anthropic's status page.
New limits. That said, users will now have limits that reset every seven days, in addition to the current five-hour limits. Users on the Pro plan ($20 per month) will be able to use between 40 and 80 hours a week with Sonnet 4. Those on the $100 Max plan will have between 140 and 280 hours with Sonnet 4 and up to 35 hours with Opus 4. The $200 Max plan will allow up to 480 hours with Sonnet 4 and 40 hours with Opus 4. Anthropic says that less than 5% of its users will be affected by these changes.
It is not an isolated case. Other companies in the sector are going through similar situations. Cursor, the popular AI programming tool, changed its pricing strategy in June to rein in the most intensive users of its $20 plan, although it later had to apologize for not communicating the changes well. Replit, another rival in the AI-assisted programming space, implemented similar measures the same month.
It is clear that AI companies are experimenting with their plans and studying how their users actually use them. That prices are changing this soon is a symptom that they are still taking the measure of the market while looking for a way to make the business profitable.
Hidden limits. One of Claude's problems is that it does not offer a usage counter that lets users know how much they are consuming in real time. Developers have to go in "blind" until they run straight into the limit, which generates frustration, especially among those paying for the most expensive subscriptions. In addition, the company has detected that some users are sharing and reselling accounts, which makes the use of those accounts even more intensive.
An experimental business model. Anthropic has pledged to offer "other options" for intensive use cases in the future, but for now the priority is to keep the service stable for most users. Max plan subscribers will be able to buy additional usage at API prices once they exceed their limits. Keeping a generative AI service like Claude running is a challenge, especially if you want to guarantee maximum performance for all users. That is why we are seeing so many changes to these plans, and it will not be unusual to see the same in the rest of the alternatives.
Cover image | Anthropic
In Xataka | Some believe the best AIs get dumber over time. It's not madness