It also protects you from phishing and data leaks

The Internet has become a vital tool, to the point that we spend several hours a day browsing. Doing so as safely as possible is essential to safeguard our data, especially now that data leaks and identity theft are so common. Common sense helps, but turning to an additional tool is a move that can save us more than one problem. One of the most outstanding options is Kaspersky, especially its Premium plan, reduced right now to 34.99 euros a year.

Kaspersky Premium – Annual
* Prices may have changed since the last review

Your safety, and your family's, at another level with Kaspersky Premium

Kaspersky is a company that has been a benchmark in Internet security for years. Its antivirus side is the one that almost always gets the attention (for PC as well as macOS, iOS and Android), but the reality is that it offers much more, especially in its Kaspersky Premium plan, the most complete of all.

One of the key points of this service is its 'Identity protection'. Thanks to it, the service detects when anyone tries to access our data remotely. We also get a kind of digital wallet where we can store our ID, cards or other sensitive information to avoid any type of identity theft. And that is not all: we will also receive a notification if our data or phone number appears in a data leak.

Speaking of identity and impersonation, its 'Anti-phishing' is also worth highlighting. It has a system capable of identifying phishing links in emails or on web pages, automatically blocking those it detects as fraudulent, so we avoid clicking on one by accident.

Kaspersky Premium – Annual
* Prices may have changed since the last review

This Kaspersky service normally costs 79.99 euros per year, but right now a 56% discount leaves it at the 34.99 euros a year mentioned above for the first year.
In addition, this subscription includes a free year of Kaspersky Safe Kids, ideal if we have children at home and want their Internet browsing experience to be as safe as possible.

Note: some of the links in this article are affiliate links and may generate revenue for Xataka. Offers may vary in case of unavailability.

Images | Kari Shea on Unsplash (edited), Kaspersky

In Xataka | Wi-Fi in-depth configuration guide: everything you need to know to improve your connection
In Xataka | These are the two things I recommend doing if you want to improve the security of your accounts and aren't sure where to start

So far, Arm has only designed chips. Now it will manufacture one for data centers, and Meta will be its first big client

There is a lot of money invested in data centers. Maybe too much. And that is a juicy pie that many companies want a slice of. One of them is Arm, which throughout its history had limited itself to designing chips that others then licensed and manufactured. Now it will go further, and AI is the main reason.

Arm's first chip. As revealed by the Financial Times, Arm will unveil its first chip this summer. Until now, Arm had stayed out of a race in which Intel, AMD, Nvidia, Qualcomm and Apple have been the protagonists, but now it wants to enter that fight.

Data centers in the spotlight. SoftBank, which holds a majority stake in Arm, has ambitious plans in the data center segment. It has allied with OpenAI and other companies for the colossal Stargate project, which will theoretically see them invest 500 billion dollars in data centers in the US. And the company wants Arm to be part of that project with its own chips running the servers of those data centers.

Meta as a great first client. This chip is expected to be a server CPU, created so that it can then be customized by companies such as Meta. The FT indicates that Mark Zuckerberg's company is in fact already the first big client of Arm in this initiative. TSMC could be in charge of manufacturing the chips, according to sources close to the plans.

Acquisition of Ampere in sight. These days there is talk of alleged advanced negotiations that could end with SoftBank buying Ampere. The firm specializes precisely in using the Arm architecture in multi-core chips aimed at data centers.

A threat to Intel and AMD. These two companies (especially Intel) and the x86 architecture have always dominated the server and data center segment. However, chips with Arm architecture have begun to be an interesting option, and the alleged arrival of Arm's own chips could further boost their market share.

And a blow for Qualcomm.
Qualcomm's SoCs are a reference in our phones, but the company has also been working on server chips for some time. In recent months Qualcomm and Arm were locked in litigation that finally closed a few days ago, and it remains to be seen what impact this Arm project will have on the Qualcomm roadmap; apparently Qualcomm was also in negotiations with Meta to supply chips for its data centers.

Image | Arm

In Xataka | The West tries to block China in technology while handing it its greatest opportunity in decades

GPT-4.5 is proof that using more GPUs and more data no longer pays off

In the last two years we have seen how the companies developing AI models have shown an almost limitless voracity. They bet on scaling: using more data and more GPUs to improve those models. However, there has been a surprise: it turns out that this strategy no longer works.

GPT-4.5 will be the last of its lineage. We have always associated ChatGPT with the traditional models "that do not reason", although recently it also gives access to reasoning modes. Even so, its current base is GPT-4o, and that model will have one last successor. It will be GPT-4.5, which will not be renewed. That is precisely the interesting thing.

Scaling no longer helps much. As experts like Gary Marcus point out, GPT-4.5 seems to be the confirmation that spending more and more money on scaling, using ever more GPUs and data to train models, no longer makes sense. OpenAI's hope was Orion, which aimed to be GPT-5, but it is not: it is (probably) GPT-4.5.

Hitting a wall. The jump in performance and capability was never what was expected, which resulted in the deceleration of AI. At least, of the generative AI that does not reason. That approach seems to have collided with a wall and can no longer improve. We are facing a total change of focus toward reasoning models.

It is happening to everyone. GPT-4.5 is OpenAI's acceptance of this new reality, but many other AI companies are in the same situation. New versions of the models "that do not reason" are simply not arriving. Grok 3 has not arrived and xAI is falling behind, but we have not seen a successor to Claude 3.5 either, and we don't know what Anthropic is working on. Google just presented Gemini 2.0, but the leap in capabilities with respect to Gemini 1.5 is not spectacular, at least if we don't count its reasoning version, Flash Thinking.

Told you so. Experts like Yann LeCun, head of AI at Meta, have long warned that this strategy of "more data and more GPUs" had an expiration date.
Ilya Sutskever, OpenAI co-founder and now running his own AI startup, also made it clear months ago. For him, the massive pre-training of an AI model on a huge unlabeled dataset so that the model detects patterns and structures on its own no longer offers much advantage, even when pushed to ever larger scales.

So why spend so much money? If traditional models can no longer advance through scaling, the question is obvious: why are companies investing billions of dollars in data centers? The answer has several parts. First, scaling is still useful for improving models, making them behave better and make fewer errors.

Data centers make sense. But there is also inference: the gigantic infrastructure companies are investing in is not so much for training models with the traditional approach, but so that hundreds of millions or even billions of people can end up using AI continuously in their day-to-day. That is the current bet.

Long live the models that reason. The AI slowdown people have been talking about for a while is not a slowdown "of all AI" but, as we say, of the traditional generative models that did not reason. The new models such as o1, DeepSeek R1 or Gemini 2.0 Flash Thinking are clearly the trend: increasingly precise, with answers of ever higher quality that really help us trust them. To do work for us almost "blind".

We will keep seeing advances in AI for a while. AI still has a long way to go. The scaling approach (more GPUs, more data, this is war) may no longer make much sense, but there are other paths. Many. And reasoning models are just one of them.

Image | Amazon

In Xataka | OpenAI wants to be the new Google with GPT-5: you will ask, and the AI will decide how to answer you

The most valuable company in the world has no interest in the data center investment fever

Big tech companies are investing absolute fortunes in data centers. Almost all of them seem to be preparing to meet the colossal demand for computing capacity for AI tasks. And yet, there is a curious and surprising exception.

Money, money and more money. In recent days, several large technology companies have announced their investment plans for 2025. In all cases, astronomical budgets for the construction of AI data centers: Microsoft spoke of 80 billion dollars, Google of 75 billion, Meta of 65 billion, and at the end of the week Amazon exceeded all those figures and promised to invest 100 billion dollars in this area. To get a rough idea of what that figure means, a comparison: Bulgaria's GDP was 102.4 billion dollars in 2023 (Spain's, 1.62 trillion).

Apple. At Apple, however, things are very different: its planned capex for 2025 will remain almost flat compared to 2024, at around 12 billion dollars. It is a remarkable figure, of course, but far from its competitors in the field of AI.

And it is the one with the most money. It is also interesting to compare these companies' investment against their market capitalization. Amazon is making the strongest bet here: its capex is 4.12% of its current market capitalization, followed by Meta and Google. Microsoft is somewhat more cautious (2.63%), but it is striking once again that Apple, the world's largest company by market capitalization, has a capex of only 0.35% of its total value.

Wall Street rules. These companies' decisions seem heavily influenced by the behavior of stock markets. The presentation of financial results and short-term forecasts has caused a kind of contagion effect: "if we don't pour money into data centers, shareholders will punish us hard", the big tech companies seem to have concluded with those colossal figures.

The great beneficiary is Nvidia.
Meanwhile, the great beneficiary of these colossal investments is likely to be Nvidia, which will receive a good part of that money if the current trend continues. There will undoubtedly be other beneficiaries in this area, but Nvidia's own capex is comparatively tiny, both relative to these companies and relative to its current market capitalization (3.18 trillion dollars). It is estimated that in fiscal year 2025 it will be about 3 billion dollars, a quarter of Apple's, for example. In 2024 its capex was barely 1 billion dollars.

Apple moves at another pace with AI. The Cupertino firm clearly seems to have a very different strategy from that of its rivals in this segment. It has barely offered any news since it presented Apple Intelligence, whose options are limited and whose rollout is being slow. The feeling, even internally, is that at Apple they are clear followers, and that for the moment this will not change.

Image | Ekapol with Midjourney

In Xataka | Is Tim Cook Apple's Ballmer, and Nadella Microsoft's Jobs?

Housing prices in Spain are already higher than at the peak of the bubble. But the data has a small catch

That Spain has a housing problem is nothing new. Even so, it is interesting to find data that specifies just how complicated it has become to access a home in this country. The Property Registrars have just released a figure that is especially valuable, and it shows that those who want to buy a flat today face the highest prices in the historical series, even above those of the 2007 real estate bubble. However, it is wise to handle those figures with perspective.

Prices above 2007. The real estate market is in a full price escalation. That is no surprise either, and it can be seen in the ascending curves that charts like those of the Idealista portal have been showing for a long time. This week the College of Registrars published a report that offers an extra, more interesting reading. The sustained price increase has pushed the average housing price in Spain to "a new historical maximum", surpassing the values recorded in 2007, before the real estate bubble burst. According to the data collected by the registrars, the average cost during the last quarter of 2024 stood at €2,164/m2.

Table from the registrars' study on the evolution of housing prices.

Getting into the details. The dossier allows us to delve deeper and see, for example, how the markets for new and second-hand housing are responding; the latter has by far accumulated the largest volume of sales operations. In fact, of the almost 637,000 transactions recorded by the professional body throughout 2024, just over 505,000 involved used properties, 6.4% more than the previous year. New housing accounted for "only" 132,000 sales, although that figure reflects a considerable increase over 2023, of 21.6%. As for prices, at the end of 2024 the square meter of new housing averaged 2,338 euros, while that of used properties was quoted at 2,133.
In the first case the price exceeds what was being paid between 2006 and 2007, before the property bubble burst. The same is not true of second-hand housing, which, although it has become more expensive, would still remain slightly below those pre-crisis values.

Prices with fine print. The study by the registrars of Spain does indeed show an average housing price at historical highs, above the 2007 level, but the big question is: does that mean homes are more unaffordable today than before the real estate bubble? To answer that question, you have to take into account the difference between nominal values, which measure the current prices of each moment (in this case 2024), and real values, which account for the effect of inflation. A few days ago, after the publication of the registrars' dossier, Daniel Fuentes, PhD in Economics, warned on X that the report shows nominal prices, so the comparison between 2007 and 2024 spans 17 years, and their inflation. The nuance does not detract from the report's data, but it should be borne in mind when making comparisons with prices prior to the 2007 property crisis.

Click on the image to go to the tweet.

More perspectives. In fact, the registrars are not the only ones who have published more or less up-to-date information on Spanish real estate prices. In April 2024 the Bank of Spain released a report that clearly reflected the differences between nominal and real values. According to its calculations, by 2023 average housing prices had recovered since 2014 with a nominal revaluation of 56%; but in real terms that rise was much lower, 30%. What does that mean? That in the first case 2023 prices were about to exceed those of 2007 (they were 2% below); but in the second, taking real values into account, the 2023 average was still 28.5% below that of 2007.
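The nominal-vs-real distinction can be made concrete with a quick calculation. This sketch uses only the percentages quoted from the Bank of Spain report (+56% nominal, +30% real over 2014-2023); the implied cumulative inflation is derived from them, not taken from the report:

```python
def real_growth(nominal_growth: float, cumulative_inflation: float) -> float:
    """Deflate a nominal growth rate by cumulative inflation over the same period."""
    return (1 + nominal_growth) / (1 + cumulative_inflation) - 1

def implied_inflation(nominal_growth: float, real_growth_rate: float) -> float:
    """Cumulative inflation consistent with a given nominal/real pair."""
    return (1 + nominal_growth) / (1 + real_growth_rate) - 1

# Bank of Spain figures for 2014-2023: +56% nominal revaluation, +30% real.
inflation = implied_inflation(0.56, 0.30)
print(f"Implied cumulative inflation 2014-2023: {inflation:.0%}")  # → 20%
```

The same deflation explains why a nominal "new historical maximum" in 2024 can still sit below the 2007 peak once 17 years of inflation are stripped out.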
The Tinsa IMIE indicator (monthly real estate market index) for December also shows that, although values have clearly grown, the general index is still below the one recorded in 2007.

Graph from the 2023 annual report on the housing market in Spain. Graph from the Bank of Spain on the evolution of nominal and real housing prices. Evolution of housing prices in Spain as reflected in the October report.

Different situations. That current prices approach or exceed those of 2006 and 2007 does not mean that the current scenario is the same as that of the years before the property crisis. "The demand after the bubble was speculative; the current one is demographic," says Fuentes in his tweet. In a conversation with El País, Santiago Carbó, professor of economic analysis, also ruled out a crisis similar to that of the bubble. The reason: Spain's great current challenge is the mismatch between supply and demand, rather than indebtedness, as was the case then. The problem, Carbó adds, is that finding affordable housing is becoming "increasingly difficult", and it will be hard for the measures adopted by the Government to make their effects felt this year or even the next. "The incorporation of thousands of flats into the market would be needed to relieve this tension, and that takes years."

Is there more? Yes. The registrars' study sheds light on three other equally interesting factors: the differences between regions, the number of mortgages signed, and the weight of foreign buyers. Regarding the first issue, the gaps between regions, the report shows that very different realities coexist in the Spanish market. For example, while Madrid led housing costs with €3,780/m2, in Castilla-La Mancha and Extremadura the average did not even reach the €1,000 barrier.

The weight of the foreign buyer... and the banks. The report also shows that the vast majority of purchases are made with bank financing.
Throughout the last quarter of 2024, 124,000 …

Some researchers claim to have created an AI as good as OpenAI's and DeepSeek's for $50. And the figure is real

The training cost of the most advanced artificial intelligence (AI) models is in the spotlight. And understandably so. The irruption of the Chinese company DeepSeek's model, which presumably had a moderate training cost, has called into question the strategy and investments deployed so far by OpenAI, Google and Microsoft, among other companies.

A brief review before moving on: those responsible for DeepSeek argue that the infrastructure they used to train their model comprises 2,048 Nvidia H800 chips, and that training this 671-billion-parameter model cost 5.6 million dollars. However, some analysts argue that these figures do not reflect reality. The report prepared by SemiAnalysis maintains that the infrastructure DeepSeek used to train its AI model actually comprises approximately 50,000 Nvidia GPUs with the Hopper microarchitecture. According to Dylan Patel, AJ Kourabi, Doug O'Laughlin and Reyk Knuhtsen, at least 10,000 of these chips are Nvidia H100 GPUs, and at least another 10,000 are H800 GPUs. The remaining chips, according to these analysts, are the cut-down H20 models.

The 's1' model adds fuel to the fire

On January 31, a group of researchers from Stanford University and the University of Washington, both in the US, published in arXiv, the open-access repository of scientific articles, a paper in which they claim to have trained an AI model with reasoning capability and performance comparable to OpenAI's o1 or DeepSeek's models, for an investment of just under $50. At first glance it seems impossible. With that money, a priori, it is absolutely unfeasible to train an artificial intelligence model, let alone an advanced one capable of competing head-to-head with those of OpenAI or DeepSeek. However, it is true. To understand how they achieved it, we need to look into the strategy they devised.
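The arithmetic behind the headline figure is straightforward. The paper reports a run of 16 H100 GPUs for under 30 minutes; the hourly rental rate below is an assumption in the ballpark of typical H100 cloud prices, not a figure from the paper:

```python
def rental_cost(num_gpus: int, hours: float, usd_per_gpu_hour: float) -> float:
    """Total cost of renting cloud GPUs for a training run."""
    return num_gpus * hours * usd_per_gpu_hour

# s1's reported setup: 16 H100s for about half an hour.
# The ~$6/GPU-hour rate is an assumption, chosen for illustration.
cost = rental_cost(num_gpus=16, hours=0.5, usd_per_gpu_hour=6.0)
print(f"Estimated cost: ${cost:.2f}")  # → $48.00
```

Only 8 GPU-hours in total: that is why "just under $50" is plausible for a short fine-tuning run, and why it says nothing about the cost of training a model from scratch.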
On the one hand, those 50 dollars represent the cost of renting the cloud computing infrastructure they used to carry out the training. That makes sense if the time invested is very short.

's1' was built on the free Qwen2.5-32B model developed by the Chinese lab Qwen

But there is something else. Something very important. Their reasoning model, which they have called s1, was built on the free Qwen2.5-32B artificial intelligence model developed by Qwen, Alibaba's Chinese lab. And its reasoning process is inspired by Google's Gemini 2.0 Flash Thinking model. They did not start from zero at all. An interesting note: the s1 model is available on GitHub, together with the data and code these scientists used to train it.

On the other hand, the training process lasted less than 30 minutes, using only 16 Nvidia H100 chips from the cloud computing network these researchers used. Hence the cost of somewhat less than 50 dollars. However, there is another piece of data worth not overlooking: the s1 reasoning model was generated by distillation of the Gemini 2.0 Flash Thinking experimental model. Distillation is, broadly speaking, a machine learning technique that transfers the knowledge base of a large, advanced model to a much smaller and more efficient one. This strategy saves a lot of resources, although it cannot create models from scratch.

Beyond the much-touted 50 dollars of cost, the really important thing is that, as we have just seen, it is possible to fine-tune very competitive models for a much more restrained investment than those made so far by the large technology companies.

Image | Luis Gomes

More information | arXiv | GitHub

In Xataka | Samsung is preparing to hit TSMC where it hurts most: the manufacture of AI chips

The new gold rush is data centers, and SoftBank has seen it coming perfectly

In 2025 the technology giants will devote a lot, a lot of money to data centers. Microsoft speaks of 80 billion dollars of investment, and Google of 75 billion. Not to mention the Stargate project, which plans to go further and invest 500 billion dollars, with SoftBank and OpenAI as protagonists. And precisely SoftBank seems to be preparing an important move to take advantage of this genuine gold rush.

Ampere in the spotlight. As Bloomberg reports, SoftBank is in advanced negotiations for the acquisition of Ampere. The firm designs and manufactures processors for data center servers, and does so using Arm technology.

A 6.5-billion valuation. Oracle holds a 29% stake in Ampere, and the current estimate of its value is around 6.5 billion dollars. In 2021 SoftBank already negotiated a possible investment that valued Ampere at 8 billion dollars, but since then competition and the evolution of the semiconductor market have changed that assessment.

Arm would gain a great ally. On Wednesday Arm presented its fiscal results. Although revenue grew 19%, it warned that it would not reach the upper end of the 2025 revenue forecast it had previously given. That caused a 6% fall in its shares. The company, which is the pillar of the mobile segment, has long tried to make the leap to the server and AI market, and Ampere would be an interesting ally in that area.

Promising chips. Ampere presented its AmpereOne processors last year, and this year they are expected to reach a 256-core version. This chip, manufactured with 3 nm photolithography, will have an important competitive advantage: efficiency, something crucial for data centers.

Attracting data centers. Precisely, having more efficient chips that consume less energy is increasingly important for data centers, which threaten to "swallow" up to 1,000 TWh of electricity in 2026.

A truce between Arm and Qualcomm? There is one more interesting factor.
Qualcomm is currently an Ampere ally, since both are jointly developing an inference chip. The curious thing here is that Arm was involved in a legal battle with Qualcomm. However, Cristiano Amon, Qualcomm's CEO, indicated on Wednesday at an investor conference that Arm had withdrawn its notice of breach of the license agreement. That announcement seems like it could be related to the negotiations between SoftBank and Ampere: if Arm "has made peace" with Qualcomm, a possible Ampere-Arm-Qualcomm trio would offer important advantages in competing in a market as complex as this one.

Image | Miki Yoshihito

In Xataka | Qualcomm dodges a lethal bullet and is winning its case against Arm: that is crucial for the future of its PC chips

Spain was going to invest a fortune in data centers. And then DeepSeek arrived

Data centers looked like the new gold rush in Spain. Recent data showed how investments from various big tech companies promised to energize this market. Artificial intelligence drove all those efforts, but these days some companies are rethinking what to do. The reason is, of course, DeepSeek.

DeepSeek. The arrival of the DeepSeek V3 model in November, and of DeepSeek-R1 just a few days ago, has put all these investments into question for a single reason: it may not be necessary to spend so much money. This startup's Chinese models seem to show that the same (or more) can be achieved with much less.

"Unrealistic". As Expansión has revealed, Spain could attract more than 43.7 billion in investment by 2030. However, sources in the sector have told that financial newspaper that some multi-billion projects to build large data centers in Spain were "unrealistic", and there is talk of figures that were not even backed by funding.

Market adjustment. The search for efficiency may bring a certain adjustment to the market in light of these analyses. Both investment funds and venture capital firms may now show more prudence when investing. SpainDC forecast the arrival of 58 billion euros in the data center sector by 2030.

But there will be (a lot of) investment. Although it seems clear that budgets for various projects will be reviewed, the need to build data centers will continue to exist. Demand, driven by the rise of AI and cloud services, will be remarkable.

Great plans. In our country, ACS, for example, has plans to invest 60 billion euros globally (12 billion in Spain). Merlin, another of the leading companies in this sector, announced the promotion of two mega-campuses in Extremadura with 1 GW of capacity each. Repsol, also cited by Expansión, has plans to invest 4 billion in Spain in this area.

Long-term optimism, but short-term caution.
The movements of these days, after the impact caused by DeepSeek, seem to have pushed many companies to recalibrate their short-, medium- and long-term plans. However, it is impossible to know what the medium-term impact of DeepSeek will be. Some experts pointed out in The Independent that they were not entirely convinced of DeepSeek's real efficiency, but they admit that it will certainly force a potential rethinking of data centers.

Image | Amazon

In Xataka | We have calculated how much money the big tech companies are spending on data centers. The numbers are dizzying

LaLiga wants your biometric data before you enter the stadium. Europe has something to say

The Spanish football association is back at it with its intention of implementing facial recognition and fingerprint systems at stadium entrances, despite the fact that European regulations prevent it, so it continues to work on ways to achieve it.

The current situation. Javier Tebas, president of LaLiga, reiterated on Tuesday in Congress his desire to apply biometric identification in stadiums. "Inside the stadiums there are people who are banned from entering, but the data protection agency has told us that it goes against European regulations," he said, according to Marca. This battle is not new. In January 2024, the Spanish Data Protection Agency (AEPD) already issued a warning to LaLiga for putting out a contract to develop a facial recognition system for access to grounds.

Between the lines. LaLiga's motivation is clear: reinforcing its fight against violence and racism in stadiums. The organization has filed more than 40 criminal complaints and reported more than 560 intolerant chants since 2013. What LaLiga would seek by implementing biometrics is to prevent those sanctioned with a stadium ban from attending matches using a ticket paid for by someone else or a free admission.

The obstacle. The European General Data Protection Regulation expressly prohibits the processing of biometric data except in very specific cases, such as the explicit consent of the person concerned or a need of essential public interest. There are some precedents in Spain, but with many nuances. This is the case of Osasuna, which implemented facial recognition in 2022 at El Sadar, but as a voluntary system complementary to the traditional season ticket. The key lies precisely in its optional character: members can choose to register their face to access the stadium or keep their usual method. Even so, that did not spare the club a recent fine of 200,000 euros.
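Systems like Osasuna's do not compare photographs: they compare embeddings, fixed-length numerical vectors derived from a face, using a distance or similarity measure. A minimal sketch of that comparison step follows; the vectors, their tiny dimensionality and the decision threshold are all made up for illustration (real systems use embeddings of hundreds of dimensions and calibrated thresholds):

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 4-dimensional embeddings.
enrolled = [0.12, -0.48, 0.33, 0.80]     # stored when the member registers
probe_same = [0.10, -0.45, 0.35, 0.78]   # new capture of the same face
probe_other = [-0.60, 0.20, 0.55, -0.30] # a different person

THRESHOLD = 0.9  # illustrative decision threshold
print(cosine_similarity(enrolled, probe_same) > THRESHOLD)   # True: match
print(cosine_similarity(enrolled, probe_other) > THRESHOLD)  # False: no match
```

Because only the vector is stored and the mapping from image to vector is not reversible in practice, vendors argue this satisfies privacy requirements; the AEPD's objection is about the legal basis for processing, not the storage format.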
The technology, developed by a Pamplona-based company, converts facial images into numerical codes that cannot be converted back into images, thus complying with privacy requirements. Consulted by Xataka, LaLiga refers to an October statement and to Tebas's words in Congress. In that press release, the association stressed that "biometrics would be a crucial tool for the detection and eradication of violent and racist behavior in stadiums". In fact, Miguel Ángel Aguilar, prosecutor for hate crimes, supported this idea by suggesting "the reform of the data protection law to allow biometric identification, obtain evidence and convict the people who utter the insults in the stadiums".

Going deeper. The AEPD has been blunt: when there are less intrusive alternatives to verify identity, such as showing the ticket together with the ID, it is "hardly justifiable" to implement biometric systems. In addition, the agency warns that there must first be a legal basis that justifies the data processing, not the other way around. For now, LaLiga continues to work with the Interior Ministry to try to find a route that allows these systems to be implemented, although the current regulatory framework seems to leave it as an aspiration that is difficult to materialize.

Featured image | Gregorio Cavana on Unsplash

In Xataka | I have run for weeks with glasses that project data into my field of vision. The future of running will have to wait

OpenAI took everything it wanted from the Internet to train its AI. Now it accuses DeepSeek of stealing its data

DeepSeek's AI models are really good. The comparative benchmarks we published yesterday show it, putting them on the level of ChatGPT, Claude or Gemini. That has unleashed praise, but also suspicion. There are people who do not believe that training DeepSeek cost just 5.6 million dollars, and now OpenAI is also accusing DeepSeek of something else.

DeepSeek, you are using our data without permission. OpenAI spokespeople have told the Financial Times that they have found evidence that "distillation" techniques were applied by DeepSeek to OpenAI models.

What is "distillation" in AI? Yesterday we discussed how DeepSeek's developers used a large number of techniques to achieve such an efficient model. Reinforcement learning stands out among them, but it is also known that they use distilled models. In this technique, a smaller "student model" is taught to behave like a larger, more advanced "teacher model". Data from the "teacher model" is used so that the small model becomes faster and more efficient, yet equally intelligent at specific tasks.

Use not allowed. Model distillation is a common practice in the industry, but OpenAI's terms of service prohibit its models from being used for this purpose. Specifically, users may not "copy" any of its services or "use output (from OpenAI models) to develop models that compete with OpenAI".

OpenAI and Microsoft have already investigated this. According to Bloomberg, both companies last year analyzed accounts that were being used to exploit their chatbots and that apparently belonged to DeepSeek developers. They used OpenAI's API, but there were suspicions that they had violated the terms of service by taking advantage of that access to distill its models.

Many do it. David Sacks, responsible for AI on Donald Trump's team, raised the alarm about what was happening and said there was evidence that DeepSeek had used OpenAI data.
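The student/teacher mechanics described above can be sketched in a few lines. This toy example (logits and temperature chosen by us purely for illustration) shows the core of classic distillation: soften the teacher's output distribution with a temperature, then train the student to match it instead of matching hard labels:

```python
import math

def softmax(logits, temperature=1.0):
    """Convert logits to probabilities; higher temperature gives a softer distribution."""
    exps = [math.exp(l / temperature) for l in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy between the teacher's softened targets and the student's prediction.

    Minimizing this pushes the student's distribution toward the teacher's.
    """
    teacher_p = softmax(teacher_logits, temperature)
    student_p = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(teacher_p, student_p))

teacher = [8.0, 2.0, 1.0]  # confident teacher output for one example
student = [4.0, 1.5, 1.0]  # smaller student's current output

# Softening spreads probability mass onto the "wrong" classes too: that extra
# information about class similarities is what the student learns from.
print(max(softmax(teacher, temperature=1.0)))  # hard: near 1.0
print(max(softmax(teacher, temperature=4.0)))  # soft: noticeably lower
print(distillation_loss(teacher, student))
```

In the accusation against DeepSeek the "teacher" would be an OpenAI model queried through the API, so the teacher logits are replaced by sampled outputs, but the training objective is the same idea.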
Spokespeople for the company led by Sam Altman indicated that "we know that companies from the People's Republic of China, and others, are constantly trying to distill the models of leading US AI companies".

The thief assumes everyone else steals too. The irony here is that OpenAI has had no scruples about collecting Internet data to train its models, also violating the terms of service of those platforms. Last year it was discovered, for example, that it transcribed a million hours of YouTube videos to train GPT-4. Timnit Gebru, famous for her controversial dismissal from Google, commented on LinkedIn that OpenAI "must be the most insufferable company in the world". And she continued: "They can steal from the entire world and swallow all possible resources. But no one can give them a taste of their own medicine, not even a bit."

If it's on the Internet, it can be used, right? Other companies do exactly the same, shielding themselves behind the "fair use" argument. They collect any public content on the Internet without asking users or platforms for permission. Not only that: it is suspected that in many cases these models are trained on copyright-protected works, something that has resulted in numerous lawsuits.

Image | Xataka with Grok

In Xataka | The next phase of AI is not about who invests more but who invests less
