A study analyzed the power of LED car headlights. The conclusion is what all drivers already know

I hate traveling at night, almost as much as driving in the rain. It had been a while since I had taken a national road, but a few days ago I had to, and what had to happen happened: I was dazzled on more than one occasion. Car headlights have evolved tremendously in a short time, and LEDs have become the norm in new vehicles. The problem is that there are more and more signs that we have gone too far with their evolution. And a new report puts a percentage on how dangerous they can be if they are not properly calibrated… or if the car using them is an SUV.

In short. Whether because you have a new car or because you have upgraded the headlights of an older one, headlights are one of the most appreciated elements on any journey. Others see you better, you see better, and they are one of the most important safety factors behind the wheel. If the height is correct and they are well calibrated, they are a pleasure; but when they are not, they dazzle oncoming drivers, or you. At that point, safety goes out the window for a few seconds.

The British Department for Transport (DfT) has published the results of a study on glare caused by LED lights. Its conclusion? They represent a road safety problem, altering the habits of drivers in the United Kingdom. We could extrapolate it perfectly. Basically, between October 2024 and early 2025, the DfT combined objective measurements in real conditions with surveys of 1,850 drivers. The results are devastating: 97% of respondents say they are frequently dazzled, and 96% say that headlight glare is a road safety problem.

Analysis. On the one hand, we have those statements from drivers, who were asked how often they felt dazzled by the headlights of oncoming vehicles. On the other, the objective analysis: the DfT used luminance cameras and ran the data through a machine learning algorithm to identify the variables at play at high glare levels. They found a strong correlation between higher luminance levels and reports of glare in some test vehicles (logical, on the other hand). They also found that road factors play a part, such as uphill stretches or right-hand curves, moments when drivers' eyes are most exposed to the beam of oncoming headlights.
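As an illustration of the kind of relationship the DfT reports, here is a minimal sketch, with invented placeholder numbers rather than the study's data or code, correlating measured luminance with the share of drivers reporting glare:

```python
import numpy as np

# Hypothetical illustration only: luminance readings for six test vehicles
# and the share of surveyed drivers reporting glare for each. Both arrays
# are invented placeholders, not DfT measurements.
luminance = np.array([1200, 1800, 2500, 3100, 4000, 5200])    # cd/m^2, assumed
glare_share = np.array([0.12, 0.21, 0.35, 0.48, 0.61, 0.74])  # fraction reporting glare

r = np.corrcoef(luminance, glare_share)[0, 1]
print(f"Pearson correlation: {r:.2f}")  # close to 1 = strong positive association
```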
In the end, these are things that no study needs to confirm for anyone who has ever driven at night; what is interesting about this one is the consequences and the "culprits."

Impact. More than half of respondents said that this glare-related discomfort has caused them anxiety about driving at certain hours, which is why they have reduced night driving or abandoned it altogether. And more than 20% say they would like to use the car less at night because of it, but have no other option. Beyond that indirect impact, the statistics estimate that glare has contributed to about 290 accidents annually. And the effects depend on age: a 50-year-old takes nine seconds to recover from glare, while a 16-year-old takes just one second, which adds another risk factor on the road for older drivers.

SUV. Beyond this, the study also found that larger vehicles, such as SUVs, are the ones most associated with glare in the surveys. This is logical: they are taller, their headlights are more aligned with the eyes of oncoming drivers (especially those in lower cars) and, since it seems that every new car is an SUV, they are also the ones with the most up-to-date lights.

The problem of retrofit. This English term refers to the modification of an existing component. In short: updating a car with new parts and better technology, such as fitting improved brakes, installing a new infotainment system or swapping the original halogen headlights for LED ones. You can buy new 'bulbs' even on Amazon, and many are approved, but there are two problems: units that are poorly adjusted and units that are installed illegally. The British administration has identified illegal conversions as a problem: housings designed for halogen bulbs do not behave the same with new LED units, causing dangerous glare. The British MOT, the equivalent of Spain's ITV, has intensified its scrutiny of the sale of these kits, with heavy fines for violators.

No simple solutions. They estimate that around 800,000 vehicles fail their annual inspection due to headlight alignment problems. And although these are UK numbers, this is a global problem (in Spain, 22% of serious inspection failures have to do with the lights), which suggests that, perhaps, we have gone too far with the power of our cars' headlights. The solution is not clear. The report recommends periodic glare checks and rethinking how luminance is measured in modern headlights, but this will have to be studied. In the end, it is something we all suffer at one time or another although, as our colleagues at Motorpasión point out, motorcyclists face an added problem: reflections on the visor itself.

Image | Alexander Jawfox

In Xataka | The "made in China" business of the DGT's V-16 beacons: homologating the same product 24 times and selling it under different brands

Someone has analyzed 136 million buildings threatened by rising sea levels. And there are reasons to worry

One of the biggest threats we face as a society is, without a doubt, rising sea levels: a slow process, but one that could end up redrawing the mental maps we have of world geography, leaving the coastal areas of some regions completely flooded. A study has tried to shed light on this by analyzing, building by building, the flood risk in the Global South. And the result is alarming.

The study. Published in npj Urban Sustainability, it is the first to analyze the impact at this scale in Africa, Southeast Asia, and Central and South America. "Sea level rise is a slow but unstoppable consequence of global warming that is already impacting coastal populations and will continue for centuries," explains Natalya Gomez, co-author of the study.

The numbers. The study analyzes the exposure of buildings to different levels of local sea level rise (LSLR), regardless of a specific time scale. This allows the findings to remain relevant as climate projections are updated. And the data is quite compelling. First of all, with just 0.5 meters of sea level rise, 3 million buildings would be submerged. That much is already inevitable, even if the most ambitious emissions cuts on the table are applied. With a five-meter rise, a scenario that could occur within several hundred years if emissions do not stop, the exposure would skyrocket to 45 million buildings. And in the most extreme case, a 20-meter rise in LSLR, the figure would reach 136 million buildings.

How it was done. To achieve this level of detail, the scientific team combined several cutting-edge technologies. They used the Google Open Buildings V2 database, which identifies the location and outline of billions of buildings by analyzing satellite images. This data was cross-referenced with FABDEM, a global digital elevation model that, thanks to machine learning, removes the height of trees and of the buildings themselves to obtain the true elevation of the "bare ground," which is crucial to avoid underestimating flood risk. Finally, they adjusted the calculations with a global tidal model to reflect the water level at high tide, providing a more realistic estimate of the danger.
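To make the pipeline more tangible, here is a minimal sketch of that exposure count under strong simplifications: each building is reduced to a point with a bare-earth elevation (FABDEM-style) and a local high-tide level from a tidal model. The class, field names and sample numbers are illustrative assumptions, not the study's data or code.

```python
from dataclasses import dataclass

@dataclass
class Building:
    lon: float
    lat: float
    ground_elevation_m: float  # bare-earth elevation from the DEM
    high_tide_m: float         # local mean high water from a tidal model

def exposed_count(buildings: list[Building], lslr_m: float) -> int:
    """Count buildings whose ground sits below high tide plus a given rise."""
    return sum(b.ground_elevation_m < b.high_tide_m + lslr_m for b in buildings)

# Two invented sample buildings: one low-lying, one on higher ground.
sample = [
    Building(3.38, 6.45, 1.2, 0.9),
    Building(3.40, 6.47, 6.5, 0.9),
]
for rise in (0.5, 5.0, 20.0):
    print(f"LSLR {rise} m -> {exposed_count(sample, rise)} of {len(sample)} exposed")
```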
Uneven impact. The risk is not the same in all regions: the study reveals that in the early stages of sea level rise, Africa is the continent with the highest number of buildings affected. However, as LSLR intensifies, Southeast Asia quickly comes to dominate the flood figures. A key finding is the non-linear nature of the threat. Building loss is relatively high below two meters of LSLR, but accelerates dramatically between 2 and 4 meters. Professor Jeff Cardille, co-author of the study, points out that "we were surprised by the large number of buildings at risk from relatively modest long-term sea level rise." This means we are not facing a problem that worsens gradually, but one that could reach tipping points with devastating consequences. Many of these buildings sit in low-lying, high-density areas, affecting entire neighborhoods and critical infrastructure such as ports, refineries and cultural heritage sites.

Planning. Beyond the global warning, the study seeks to be a useful tool. The researchers have created an interactive map, available through Google Earth, that allows policymakers and urban planners to visualize which regions face the greatest exposure. On this map you can see, building by building, the risk of ending up below sea level as a consequence of climate change.

A global problem. Although this study focuses on the effects in Africa or Asia, the reality is that this is a problem that affects us all. As the study points out, we all depend on food, goods and fuel that pass through ports and coastal infrastructure exposed to this rise in sea level, so disruption of that infrastructure can have serious economic consequences globally. That is why this tool can guide climate adaptation strategies, such as building protective infrastructure, adjusting land-use planning or, in some cases, the planned relocation of communities. As Maya Willard-Stepan, lead author of the study, concludes: "We cannot escape at least a moderate amount of sea level rise. The sooner coastal communities start planning, the more likely they are to continue to thrive."

Images | Chris Gallagher, Marc Pell

In Xataka | In the midst of climate change, cities only have one question to answer: become a sponge or a mousetrap

Romantasy, the bestselling genre, analyzed by its best Spanish authors

The talk on romantasy at Comic-Con Málaga did not have crowds queuing from six in the morning (although, like every talk there, it filled its room), but it did draw several dozen readers who came to listen to the interesting reflections of a trio of outstanding authors of the genre. A good symbol of what romantasy is: it does not grab headlines and it is not at the center of events like this one, but it sells thousands and thousands of copies outside the media radar, captivating more and more followers along the way.

At Comic-Con Málaga we had the opportunity to attend a talk with three of the most important authors of the genre in Spain: the very young Lucía Cerezo, author of the 'Phoenix and Dragon' saga, and the more experienced Iria G. Parente and Selene M. Pascual, authors of some twenty works such as 'Pétalos de papel' ('Paper Petals') or the 'Time Keeper' series, almost all of them fantasy but not all strictly romantasy.

Together they offered a somewhat demystifying take on the genre. Iria makes it clear from the start: "We have been writing fantasy, fiction and romance and sometimes, why not, mixing all those subgenres." And she adds: "When we started publishing, we were already writing fantasy with its touches of romance. When we were younger and were fans of 'Memorias de Idhún', nobody described it as romantasy, but clearly, if 'Memorias de Idhún' were published today, what would it be called? Romantasy."

With this they make it clear that the mixture, in "more or less balanced" parts, of fantasy and romance, which is how romantasy was defined at the start of the talk, is a label that describes a style, but one that has always existed. "The themes have not changed, but the way the themes are named has," says Selene. She also recognizes, though, that tastes are changing: "Yes, it is true that there has been a boom in a type of fantasy that puts romance at the center of the plot; nowadays you see more and more that the romance carries the plot, and that the romance eats the fantasy a bit."

Iria G. Parente traces the origin of the term and attributes to it a completely industrial origin: "It is a label born from the publishers, from Bloomsbury, which is the one that published Sarah J. Maas." Commercial maneuver or not, all of them agree that this new genre has managed to vibrate at a frequency many readers were looking for. Iria and Selene sum it up with their own experience: "Our book that best fits romantasy is 'Pétalos de papel', a story in which the romance prevails over everything else, about a girl and a boy who live in different worlds and communicate through a book. It was a bit of an allegory of what we have all dreamed of at some point when we pick up a book and think we would like to be part of it."

They are also aware of the importance social networks have had in spreading the term. Selene says that "after the pandemic, people were at home without knowing what to do, and decided to talk about books with people who liked the same genres. 'Romantasy' began to be used, and it grew until it acquired its own category." However, the networks have a darker side, and it pushes Spanish authors into the background. According to Parente, "books in English often arrive with the marketing already done.
Because when you see a new book in stores, you say: 'Wow, I have seen it recommended 7,000 times on TikTok, I have come across 7,000 videos about it, I am going to buy it.'"

The problem with labels, they acknowledge, is that "fashions change from time to time," and the categories can also be a trap. Selene says she has noticed a certain tendency toward "tropification, that is, I have to add three specific tropes to my book or it will not sell." Genres have always used clichés, but many of them "now seem to be mandatory. If the girl does not put the dagger to the boy's neck, it's wrong. If it is not enemies-to-lovers, we are not interested," and that can be more a restriction than a welcome novelty.

But all of them are very clear about the appeal of this bestselling genre. Lucía states that "many readers seek escapism, and romantasy gives you that: a fantasy world, a love story that, if you want, lets you read and switch off. But also, if you prefer, you can do a deeper reading; there is usually a political background, something that speaks to you about the real world and everything that is happening, and that somehow gives you hope." And there they hit the nail on the head: "I think romantasy also has this foundation of hope, that you can have a better world, and somehow people want that." And romantasy delivers it, beyond labels.

In Xataka | Brandon Sanderson's fantastic emporium works thanks to 70 people who help him put order in his books

Australia has been analyzing teleworking since before the pandemic. Its conclusions dismantle the arguments for the return to the office

Although teleworking is no longer companies' preferred option, or at least not in its full-time variant, remote work remains at much higher levels than those recorded before the pandemic. That shows that, in a way, teleworking is here to stay in very specific contexts. Australia has been observing the real impact of teleworking for four years, and the consolidated data contradicts old prejudices. "Working from home makes us happier," assure the authors of a study by the University of South Australia, laying the foundations of a new, more flexible and productive working model.

Time flexibility: the new office perk. The Australian study is especially revealing because it began before the pandemic and the rise of teleworking and ran for four years, which gives a much more defined picture of how remote work has changed the way we work, and with what consequences. According to the study, the possibility of choosing where to work from has improved both the mental and physical health of workers, although there is still some friction with corporate culture. According to a report by the International Labour Organization (ILO), the flexibility provided by teleworking now rivals the emotional salary with which companies try to attract and retain the best employees, replacing other benefits.

More satisfied employees. The data collected by the study reveals that before the pandemic, the average Australian worker spent about 4.5 hours per week just commuting to the office. Recovering that time means that those who work from home enjoy "ten extra days of free time a year compared with those who go to the office" (the arithmetic is sketched below), dedicating 33% of that time to leisure, which implies "more opportunities to be physically active and less sedentary." This factor was also highlighted by the academic study carried out in Spain by the La Caixa Foundation, whose authors note that long commutes "usually go hand in hand with worse mental health and with lower scores in the assessment of our own health." Thanks to teleworking, employees have gained "hours of rest to sleep and, for example, have breakfast more calmly," which helps reduce stress levels among staff. In turn, this recovered time is also reflected in healthier habits, such as cooking at home or a higher consumption of fruit, vegetables and dairy. The result has been a more varied and healthy diet, with less dependence on the ultra-processed foods that require less preparation time.
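A quick back-of-the-envelope check of that "ten extra days" figure, assuming the 4.5 hours per week saved apply across all 52 weeks of the year:

```python
# Commute time saved per year, converted into full 24-hour days.
hours_per_week = 4.5   # average weekly commute before the pandemic
weeks_per_year = 52    # assumed: the saving applies every week

hours_saved = hours_per_week * weeks_per_year   # 234 hours
days_saved = hours_saved / 24                   # ~9.75 days
print(f"{hours_saved:.0f} hours ≈ {days_saved:.1f} full days per year")
```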
Positive, as long as it is by choice. If the experience of "forced" teleworking during the 2020 lockdowns showed us anything, it is that teleworking is not for everyone. Since this study makes it possible to compare employees' situation before and after the massive arrival of teleworking, it also reveals how that change of working model affects workers. The researchers found that well-being and mental health improved especially when teleworking was chosen voluntarily, while "when employees work from home out of obligation, mental health and well-being tend to get worse."

Productivity in question. One of the main arguments companies give for the return to the office has been the supposed fall in productivity associated with teleworking. Here, the researchers attribute the problem more to an inability to assign tasks and manage the new model than to any direct effect of teleworking. "In many cases, managers who claim that teleworking reduces productivity are responding more to a lack of management than to a real performance problem," the researchers say in their conclusions. The conclusion after four years of monitoring is unequivocal: work performance and productivity tend to stay stable or, in most cases, improve when working from home. These results coincide with other research that decouples falls in company productivity from teleworking.

Distance does affect team cohesion. Large corporations like Amazon wielded the argument of team cohesion to impose the return to the office. In that regard, the study prepared by the Australian researchers recognizes that "the connection with colleagues is difficult to reproduce at a distance," and warns about the risk of losing cohesion in work teams. But, as some return-to-office strategies have shown, the problem can be mitigated by providing efficient communication channels. A recent study published in the journal Nature revealed that this team cohesion problem currently persists under the hybrid model when consistent communication patterns are not established.

In Xataka | A Barcelona company wanted to try the four-day week. It ended up firing an employee for having two jobs

Image | Unsplash (Rodeo Project Management Software)

Some astronomers analyzed the "sound of the Big Bang." Now they believe the Earth sits in a void 2 billion light years across

Cosmology has a huge problem. It is known as the Hubble tension, and it suggests that the nearby universe is expanding faster than the distant, primitive universe says it should. Something does not fit. Now, a disquieting study offers a solution.

The big problem of cosmology. The Hubble tension is one of the biggest headaches of modern physics. On the one hand, we have the measurements of the cosmic microwave background (CMB), the oldest light in the universe. When the standard cosmological model (ΛCDM) is applied, these observations yield a Hubble constant of 67.4 km/s/Mpc. On the other hand, when the expansion of the universe is measured using nearby objects such as standard candles (a type of supernova), a significantly higher value is obtained: about 73 km/s/Mpc. This discrepancy, which the most recent data places at a tension of more than 5 sigma (a level that in particle physics is considered a discovery), refuses to disappear.

A disquieting explanation. A new study, pre-published on arXiv, proposes a solution as elegant as it is depressing: the discrepancy is not in our measurements, but in our location. According to cosmologists Indranil Banik and Vasileios Kalaitzidis, we could be living near the center of a gigantic cosmic void, a "bubble" 2 billion light years in diameter with a density 20% lower than the universal average. The proof, they claim, is in the "sound of the Big Bang."

A local void. The idea of the local void is not new: it is known as the KBC void (for Keenan-Barger-Cowie, the astronomers who proposed it based on galaxy counts). If our galaxy, the Milky Way, were in a region with less matter than normal, the gravity of the denser surrounding areas would "pull" everything outward. This effect, added to the general expansion of the universe, would cause nearby galaxies to move away from us faster than normal. "This would give the appearance of a faster local expansion rate," explains Indranil Banik, author of the new research. The Hubble tension would thus become a local phenomenon, with no need to revolutionize the entire cosmological model.

The sound of the Big Bang as proof. What the new study by Banik and Kalaitzidis contributes is a much more fundamental test based on baryon acoustic oscillations (BAO). Although we call them "the sound of the Big Bang," they are not sound waves we can hear. They are the traces left by the pressure waves that propagated through the superdense plasma of the primitive universe. These waves were "frozen" about 380,000 years after the Big Bang and created a characteristic pattern in the distribution of matter. This pattern works as a cosmic ruler about 500 million light years long, which astronomers use to measure the expansion of the universe at different eras.

The results. The team analyzed 20 years of BAO measurements and compared them against two scenarios: on the one hand, the homogeneous standard model, without a void; on the other, the model that includes the KBC void. The results, presented at the Royal Astronomical Society's National Astronomy Meeting 2025, are blunt. According to the study's statistical analysis, the model with a local void fits the data spectacularly better. While the standard model sits in tension with the observations at 3.3 sigma, the void models reduce it to just 1.1 to 1.4 sigma.

Calmly. The researchers consider it "demonstrated" that a void model is about 100 million times more likely than a model without a void.
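To put that "100 million times more likely" figure in context: a common way to read such a number is as a likelihood ratio between the two fits, exp(Δχ²/2). This reading is our assumption, not something the article states, but it gives a feel for the size of the fit improvement involved:

```python
import math

# If "100 million times more likely" is a likelihood ratio between the
# void model and the homogeneous model, L_void / L_std = exp(delta_chi2 / 2),
# then the implied chi-square improvement is:
ratio = 1e8
delta_chi2 = 2 * math.log(ratio)
print(f"implied delta chi-square ≈ {delta_chi2:.1f}")   # ≈ 36.8

# Conversely, a ~37-point chi-square improvement reproduces the figure:
print(f"exp(37/2) ≈ {math.exp(37 / 2):.2e}")            # ≈ 1.1e+08
```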
However, it is a preliminary study that has not yet gone through peer review. Previous studies have set very strict limits on the existence of such an influential void, concluding that it is not enough to explain the entire Hubble tension; they also propose early dark energy as a solution. But Banik's work offers some of the strongest evidence to date that the Earth could be in a very lonely region of the universe.

Image | Greg Rakozy (Unsplash)

In Xataka | The James Webb and Hubble telescopes agree on the expansion of the universe. And physics fails to explain why

The OCU has analyzed about thirty tub ice creams. The results are not excellent

Summer is here and more than one of us is looking forward to an ice cream binge. Ice creams are rich in sugars and fats, generally ultra-processed foods and therefore not especially healthy. Despite this (or perhaps precisely because of it), choosing the healthiest option available can be a good idea.

New report. The Organization of Consumers and Users (OCU) has published the results of its analysis of the tub ice creams we can find in supermarkets. Its verdict is not optimistic: too many additives, "some not recommended," and too little dairy fat.

32 ice creams. In its analysis, the OCU studied 32 tub ice creams like the ones conventionally found in supermarkets. They chose "family format" tubs in three flavors (vanilla, chocolate and caramel). The analysis focused on store-brand ice creams, although it also included specimens from two "leading brands." As the organization explains, the analysis consisted of an assessment based on labeling, nutritional quality (including energy content, the fats and sugars used, and composition), degree of processing, and a tasting test. The latter accounted for 50% of the final grade assigned to each product.

Bad marks. According to the organization, only 10 of the 32 selected products passed the tests, with four of them "standing out for their quality." The 28 that did not reach that level, the OCU explains, went overboard with flavorings, concentrates, colorings or syrups, which "mask the lack of product quality." According to the OCU, the low quality of the ingredients was reflected in how the products tasted. "It could hardly be otherwise, because these are ultra-processed products, poor in dairy fat and rich in flavorings, syrups and additives, some not recommended," the organization explains in its statement.

Too many additives, too little dairy. As for additives, the OCU identified a total of 20 (an average of 4 per product). Of these twenty additives, the organization singled out four that the OCU itself classifies as "not recommended": E442, E471, E472c and E14XX. Only one of the analyzed products could be described as "additive-free." The OCU also paid attention to the fats present in the products, noting that only eight of the ice creams analyzed used exclusively dairy or cocoa fat.

Chocolate better, caramel worse. In the results published by the OCU, a pattern emerges: among the ice creams analyzed, the chocolate ones scored better, while the caramel ones scored lower. "Caramel ice creams are the worst rated (as well as the most caloric)," the organization details.

A not-so-healthy product. There is no healthy ice cream, which is why it is recommended that this much-desired summer food be consumed only in moderation. Ice creams are foods rich in sugars and fats, regardless of the type of sweeteners and fats used. Choosing one option or another can depend on our tastes and on what we want to prioritize in our diet.

In Xataka | Ice cream is going to get very expensive this summer. There are two unexpected culprits: coconut oil and diesel from the Philippines

Image | Titopasini

A study has analyzed which cars lose the most value after five years, and the answer is clear: electric cars

You were younger then, but there was a time when buying a Tesla was a round business. Demand was so high and vehicles so scarce that there were people willing to pay more for a used Tesla than to pay 10,000 euros less, order one from the company and wait a few months for it to arrive. The funny thing is that you were not that much younger: it happened in 2022, first in the United States and then in Spain. Who would have told us then that the company would now be dealing with painful sales figures, in which it is hard to discern how much is temporary, tied to the renewal of the Tesla Model Y, and how much is definitive. Especially in countries where the issue is very sensitive, like Germany.

But, obviously, this is not the usual situation. In fact, if your idea is to buy an electric car and change it soon (after three or four years), it is very likely a bad decision. Because, in general and barring very specific circumstances like the one above, the electric car is the type of car that depreciates the most.

A car for many years. If you are thinking of getting an electric car, there are two especially interesting formulas. The first is renting. Although it is a formula in which you pay more than you would in loan installments for a purchase, it is a good option if you are not very sure whether the electric car is for you and do not want a very long-term commitment. The second option is to buy an electric car and keep it as long as possible. If a car does not give problems, this is always the best formula to save money; but in the specific case of an electric car, the more kilometers you do, the more often you charge the battery at low power, and the longer you keep the car, the better the result. This is because, as we explained in this article, if you can charge the car at home you can save about five times more money on fuel than with a gasoline car in day-to-day use, especially if the use is intensive or almost exclusively urban. To this must be added that, as the kilometers and years pass, a combustion car accumulates services, oil changes, filters and, ultimately, replacements of all kinds of moving parts that simply do not exist in an electric car. Any taxi driver can confirm that this is a good saving.

But the electric car is a problem if you want to change vehicle or technology after a few years. Because, according to iSeeCars, a portal specializing in second-hand sales in the United States, the electric car is the type of vehicle that depreciates the most. According to its calculations, an electric car loses 58.8% of its value after five years. The figure contrasts with those recorded for other types of vehicles: a hybrid loses 40.7% of its value over five years and, on average, a car loses 45.6%. The data comes from the portal itself and the cars sold there, tracking 800,000 vehicles sold between March 2024 and February 2025. If the car is not electric, the type that depreciates the most is the luxury car. When both factors combine, the result is dire. The Jaguar I-Pace is the car that has depreciated the most over the last five years, at 72.2%. It is followed by the BMW 7 Series (67.1%) and the Tesla Model S (65.2%). Among the 10 cars that depreciate the most we also find the Nissan Leaf and the Tesla Model X; the rest are luxury vehicles.
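As a worked example of what those percentages mean in money, here is a minimal sketch applying the reported five-year losses to a hypothetical 40,000-euro purchase price (the price is an assumption for illustration, not part of the iSeeCars data):

```python
# Illustrative only: a hypothetical 40,000-euro purchase price, with the
# five-year value losses reported by iSeeCars applied to it.
price = 40_000  # euros, assumed figure

five_year_loss = {
    "Electric car": 0.588,
    "Average car": 0.456,
    "Hybrid": 0.407,
    "Jaguar I-Pace": 0.722,
}

for car, loss in five_year_loss.items():
    resale = price * (1 - loss)
    print(f"{car}: loses {loss:.1%}, resale value ≈ {resale:,.0f} euros")
```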
It makes a lot of sense. Although it may seem like bad news, the high depreciation of an electric car makes a lot of sense. And, in fact, the data is better than in previous years. iSeeCars points out that the same study in 2023 put the five-year depreciation of electric cars at 49.1%. That figure was lower because the price of second-hand cars shot up during the COVID-19 crisis and the subsequent supply chain shortage, yet it was still the highest of any type of car. And back in 2019, the depreciation reached 67.1%. That this figure has fallen suggests that more buyers are willing to get a second-hand electric car: a sign that there are more electric cars on the market and that buyers trust the technology more.

Although buying a second-hand electric car should not be very different from buying a combustion one (in fact, mechanically, it should be easier), it is logical that those who have never had an electric car are reluctant to adopt the technology through a second-hand vehicle. In addition, the rapid innovation in the sector, as more competitors arrive and push prices down, makes electric cars lose value faster than combustion cars. The promise of new, more ambitious batteries and recharge times that until recently were almost impossible to imagine is especially relevant to those willing to make the jump to the electric car but who prefer to wait a bit for the new models. Cars bought now depreciate more because the qualitative leap of buying a new electric car will be greater than that of replacing a used gasoline car with a new one. It is normal that, with the technology in full development, the current car becomes obsolete more quickly. It is something that happens in all kinds of markets until a technology reaches maturity.

Photo | HAVEREDAS

In Xataka | We do not trust the second-hand electric car: its value does not stop falling and it is a problem for the industry

Some researchers have analyzed what the abstracts of scientific papers say. There is quite a lot of "clickbait"

Academic articles, the peer-reviewed papers published in scientific journals, are one of the pillars of today's science. These articles usually have a more or less fixed structure, with introduction, results, conclusions and discussion, plus a section dedicated to the methodology used. One element that is never (or practically never) missing from this type of article is the abstract.

The abstract is a kind of summary of the article's content. It is a key piece whose purpose is to serve as a bibliographic guide for those searching for a study, so this short text must properly answer the question: what is this article about? But beyond this basic function, the abstract often acts as a summary of the article itself, including information on the methods, results and conclusions of the experiment or study. Many scientific articles are restricted access, protected by a paywall (the price of a single article can run to several tens of euros), but abstracts are openly available.

Scientific articles, including this short introductory text, go through several editorial and scientific reviews, so one would expect abstracts to be faithful representations of the article and the study behind it. The problem is that, sometimes, they are not.

In the late 1990s, a group of researchers analyzed the discrepancies between article abstracts and their content. The team analyzed more than 260 articles (44 from each of six prominent scientific journals) published in 1996 and 1997. They studied two ways in which these abstracts could be wrong: through inconsistencies with the body of the article, or through the omission of relevant information. The results varied by journal (they found that between 18% and 68% of articles presented problems). They concluded, in their own abstract, that inconsistent or missing data in these summaries were "common, even in large-circulation medical journals." The study was published in 1999 in JAMA, one of the journals analyzed in it.

25 years have passed since the publication of the JAMA study and almost 30 since the publication of some of the articles it analyzed. Science has changed a lot in those 25 years. However, some later studies indicate that the problem persists. In 2016, a group of researchers compiled and analyzed the studies carried out in this field. This literature review, published in BMC Medical Research Methodology, found that the median level of inconsistency reported by these studies was 39%, although the variability was high: it ranged between 4% and 78%. Since not all errors are equally severe, the review also looked at the studies that distinguished serious inconsistencies from milder ones, and observed that the median in this case was somewhat lower, but still considerable: 19%. Later studies, like one published this year in The American Journal of Surgery, continue to show this trend in the scientific literature.

What is happening, then? Are scientists falsifying their data? Or are we simply witnessing a significant accumulation of errors? We know that article abstracts are decisive in attracting citations from other academic articles, and that this metric is key to the evaluation of the authors' scientific work. But the publication of an article may sometimes depend on its results being novel.
That is why there is an incentive to overstate some results and qualify them later. A non-significant result can make journal editors or future readers lose interest in the article, regardless of the real quality of the study. The so-called publication bias (which refers to the fact that studies with positive results are overrepresented in the scientific literature) is also a product of this interest in novelty.

Academic clickbait. The titles of articles have also come under scrutiny in recent years. Consciously or unconsciously, a striking title can be decisive in making us more or less interested in a study. In 2016, a study published in the journal Frontiers in Psychology echoed this phenomenon, looking at how the wording of titles affected the reach of a study. Gwilym Lockwood, author of the study, analyzed more than 2,000 academic articles and observed that titles framing their findings positively had better metrics than average. On the other hand, he also found that works resorting to wordplay performed worse. Titles containing questions, meanwhile, did not deviate significantly from the average.

The problem of abstracts is one of many facing scientific publishers, pressed by scandals of various kinds, from "paper mills" to problems with the fees charged for publication or for access to their content. Artificial intelligence is one of these problems, but perhaps also a potential solution. In recent months, and after the occasional scandal, scientific publishers have been integrating artificial intelligence tools into the publication process, beyond whatever role these tools may have played in the research itself. Artificial intelligence has the capacity, among other things, to generate more "objective" summaries, or to detect and correct possible errors and discrepancies between texts and their abstracts.

In Xataka | This is how bad science infiltrates the international scientific debate: it is not just the great scandals, more than 50,000 questionable articles are incorporated every year

Image | Sonia Radosz

Some researchers analyzed 280 samples of bottled water. Only one of the brands was free of microplastics

Better taste and smell, and health reasons. Those are the two main reasons why people drink bottled water, according to a study by the Autonomous University of Barcelona. Spain is, in fact, the third European country in bottled water consumption (up to 107 liters per inhabitant). That clashes with one thing: bottled water is not only much more expensive than tap water, but we now know that it also contains micro- and nanoplastics in amounts far greater than previously estimated.

The original study. Researchers from Columbia University analyzed three popular bottled water brands in the United States (whose names have not been disclosed) in search of micro- and nanoplastics. To do so, they used a new technique called stimulated Raman scattering microscopy, based on probing samples with two simultaneous lasers tuned to resonate with specific molecules. Focusing on seven common plastics, the researchers developed an algorithm to interpret the results. As Wei Min, co-inventor of the technique and co-author of the study, puts it: "It is one thing to detect, and another to know what you are detecting."

The findings. On average, this study found that a liter of bottled water contains 240,000 detectable plastic fragments, between ten and 100 times more than previous estimates. Specifically, the researchers report finding between 110,000 and 370,000 plastic fragments per liter, of which 90% were nanoplastics. In this regard, it is worth recalling the difference between micro- and nanoplastics:

Microplastics: particles between 100 nanometers and five millimeters in size.

Nanoplastics: particles 100 nanometers or smaller.

The most frequent plastics. To no one's surprise, one of the most common plastics was polyethylene terephthalate, better known as PET. It is the material many bottles are made of. "It probably gets into the water as pieces flake off when the bottle is squeezed or exposed to heat," say the researchers, who cite another study suggesting that particles can also detach when the cap is repeatedly opened and closed. And although the presence of PET is common, this plastic is surpassed by polyamide, a type of nylon that "probably comes from the plastic filters used to supposedly purify the water before bottling it," says Beizhan Yan, a co-author of the study. Other common plastics found by the researchers were polystyrene, polyvinyl chloride and polymethyl methacrylate.

And the rest? The technique used covers the seven most common plastics, but there are many others. As Columbia University explains, "the seven types of plastic the researchers looked for accounted for only about 10% of all the nanoparticles found in the samples; they have no idea what the rest are. If they are all nanoplastics, that could mean tens of millions per liter."

And what about the water sold in Spain? That is what a study by the CSIC and the Barcelona Institute for Global Health (ISGlobal) set out to discover. They developed a technique to quantify particles between 0.7 and 20 micrometers, as well as the chemical additives released into the water, and for this study they analyzed 280 samples from 20 commercial water brands. Only one of the brands contained no microplastics, but all of them, all 280 samples, contained plastic additives.

More specifically. The result is that, on average, a liter of water contains 359 nanograms of micro- and nanoplastics, an amount comparable to that found in tap water in a previous study by the same group. "The main difference we find is the type of polymer: in tap water we find more polyethylene and polypropylene, while in bottled water we have mostly detected polyethylene terephthalate (PET), although also polyethylene," said Cristina Villanueva, ISGlobal researcher and author of the study.

Quite a lot of microplastic. Considering that we drink two liters of water a day, the authors estimate "an intake of 262 micrograms of plastic particles per year."
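The arithmetic behind that estimate is easy to check, assuming two liters a day at the average concentration reported:

```python
# Yearly intake implied by the averages reported by the CSIC/ISGlobal study.
ng_per_liter = 359      # average micro- and nanoplastics found per liter
liters_per_day = 2      # daily water intake assumed by the authors
days_per_year = 365

ng_per_year = ng_per_liter * liters_per_day * days_per_year
print(f"{ng_per_year:,} ng/year ≈ {ng_per_year / 1000:.0f} micrograms/year")
# 262,070 ng ≈ 262 micrograms, matching the authors' estimate
```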
As for additives, 28 plastic additives were detected, mostly stabilizers and plasticizers. According to the researchers, "our toxicity study showed that three types of plasticizers posed a higher risk to human health and, therefore, should be considered in risk analyses for consumers."

Images | Jonathan Chng on Unsplash

In Xataka | The US has decided to abandon paper straws because everyone hates them. The problem is the alternative: plastic

In Xataka | After the failure of the yellow container, the government has reached a conclusion: it is time for returnable bottles

*An earlier version of this article was published in February 2024

Some researchers have analyzed the impact of sugary drinks on global health. What they found left them alarmed

Sugary soft drinks conquered the world years ago. Thanks to their flavor and marketing strategies, soft drinks became the very image of globalization. Little by little we became more aware of the health hazards of excessive consumption of these drinks, so much so that some European countries even created a soft drink tax. With it, the consumption of free sugars was reduced in certain cases. But a new study reveals that intake is still very high in many countries. So high that there is an alarming link between the habitual consumption of these drinks and millions of new cases of type 2 diabetes every year.

A sugary pandemic. The trigger has been a study by Tufts University, in the United States. Reviewing the beverage data of the Global Dietary Database, which gathers more than 450 surveys on the consumption of sugary drinks and a sample of 2.9 million people from 184 countries, the researchers ran into some eloquent figures. According to this study, sugary drinks are linked to approximately 1.2 million new cases of cardiovascular disease and 2.2 million new cases of type 2 diabetes. Every year, worldwide: something surprising if we take into account how normalized and integrated these drinks are in every society.

The reasons. It is nothing new that sugary drinks are linked to type 2 diabetes, obesity and other disorders. The reason is that they are digested quickly, causing very pronounced blood glucose spikes without providing essential nutrients. They are empty calories, like those in a beer, but with a much larger amount of sugar. This absorption process, repeated over time, contributes to weight gain and, more importantly, to insulin resistance, which leads to the metabolic problems behind the aforementioned diabetes and cardiovascular diseases.

Many cases, but… what does that mean? According to the study, 80,000 deaths per year from type 2 diabetes and 258,000 from cardiovascular diseases are related to soft drinks.

Latin America and Africa. In countries that have fought in recent years to promote healthier diets and lifestyles, including through taxes like the ones we mentioned a few lines above, sugar consumption has decreased, but that is not happening all over the world. In fact, the researchers have focused on two territories: Latin America and Africa. According to the data, in Mexico the habitual consumption of these drinks is associated with almost a third of new cases of diabetes. In Colombia, the percentage rises to almost half. And in South Africa, about 28% of new cases of diabetes and 15% of new cardiovascular episodes are related to these drinks. The explanation they have found is simple: in countries and communities with lower average incomes, little access to information and more limited preventive medical care, cases skyrocket.

Not all drinks. Now, which drinks are we talking about? The study focused on sugary drinks with added sugars and at least 50 kilocalories per 240 milliliters of product. That includes soft drinks, energy drinks, fruit juices with added sugar, punches, and even flavored waters with added sugar. Outside the focus are milk (which also contains sugar), 100% natural juices without additives, and zero-calorie products, these being sweetened drinks without added sugars.
Of course, these drinks may come under the spotlight of later studies, since the researchers point out that, even without added sugars (only those naturally present, or none at all), excessive consumption can also have negative health effects.

Solutions. Laura Lara-Castor, the lead author of the study, comments that "urgent, evidence-based interventions are needed to curb the consumption of sugary drinks worldwide before more lives are shortened by their effects." Dariush Mozaffarian, another of the authors, believes that, above all, much more attention should be paid to the countries of Latin America and Africa. Mozaffarian sees this as a real epidemic and considers that, "as a species, we need to address the problem of sugary drinks." Now, as with almost everything, the study emphasizes high and constant consumption of this type of drink; under normal conditions, a sporadic soda (while not healthy) is a treat we can afford. In the end, the study does not want to put the focus on individual responsibility, but on a collective one that involves governments and health systems. And, perhaps, on the most complex actor of all: the industry that makes these drinks.

Image | Xataka

In Xataka | There was a time when Coca-Cola had 'cocaine'. The reason it no longer does is surprising: racism
