The spectacular technology that finally turns fencing into a spectacle
The tip of a foil in a professional fencing competition moves faster than the human eye can follow, and that limitation has condemned the sport to a secondary role in broadcasts for decades. A Japanese studio has been working since 2012 on an answer that combines 4K cameras, deep learning and augmented reality. This April 25 it debuts in its first professional competition, in Los Angeles.

It’s difficult to follow. Fencing has rules like right of way in foil and saber, used to decide who wins the point when both fencers land a touch at the same time, which forces the spectator to track the weapon’s movements in fractions of a second. According to Rhizomatiks’ official technical documentation, the tip of the weapon occupies just a few pixels even when captured in 4K, and the blade flexes so much that classic image-processing methods cannot track it reliably.

How it works. The visualization system is called Fencing Visualized and was born from an alliance between the Japanese studio Rhizomatiks, directed by Daito Manabe (known for collaborations with Björk, Perfume and the closing ceremony of Rio 2016), the agency Dentsu Lab Tokyo and the fencer Yuki Ota, the first Japanese Olympic medalist in the discipline. The idea germinated in earlier work with dancers, in which the team used motion capture and high-speed cameras to draw graphics over the performers’ bodies on stage. The official phase began in 2013, and the concept already appeared in Tokyo’s bid video for the 2020 Games. Early versions of the system depended on retroreflective markers attached to the weapon: in 2014 it was tested live during the Yuki Ota Cup, and in 2017 the marker balls were replaced with reflective tape so as not to hinder the fencer.

From reflective markers to deep learning. The real technical leap came when the team set out to bring the system into official competitions without interfering with the athletes’ regulation equipment, chiefly the weapons.
Therefore, in 2016 they rewrote the motion detection around deep learning. According to engineer Kye Shimizu, the solution is a multistage network based on YOLO v3, fed by 24 4K cameras placed on both sides of the piste, whose detections are cross-referenced to estimate the position of the tip. This marker-free version debuted as an exhibition at the 71st Japan National Championships in 2018 and appeared in official competition a year later. The next milestone was Tokyo 2020, where the technology was deployed on site during the Games. That Olympic exposure is, in fact, what has made it sellable to other competitions.

Money. The American premiere on April 25 follows the commercial logic of the World Fencing League (WFL), the event’s organizer: a professional league founded in late 2025 by three-time Olympian Miles Chamley-Watson. The competition brings together twelve athletes in mixed teams competing for a $100,000 prize pool. The WFL itself describes the installation as an AI-powered blade-tracking system meant to let new viewers understand the action instantly. In other words, the league wants every exchange to be intuitively understandable as a television show for all audiences, no knowledge of the rulebook required.

More visions of the future. Fencing Visualized is not an isolated case: there are systems like Hawk-Eye in tennis and cricket, Second Spectrum as the official optical-tracking provider for the NBA and the Premier League, or semi-automated offside in football. But the tiny tip of a saber is a more demanding tracking problem than a ball. This vision of the future also fits the trend the IOC has been promoting for years with Alibaba Cloud and Intel, which turned Paris 2024 into the first end-to-end 8K broadcast with multi-camera 3D replay.
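The pipeline above is described only at a high level, but its final step, combining per-camera detections of the tip into one position, is a classic computer-vision problem: with calibrated cameras, a point seen in two views can be recovered in 3D by linear (DLT) triangulation. Here is a minimal numpy sketch under assumed, illustrative camera parameters; nothing below reflects Rhizomatiks’ actual calibration or code.

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Estimate a 3D point from its pixel position in two calibrated views.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: (u, v) pixel coordinates of the blade tip in each view.
    """
    # Each view contributes two linear constraints on the homogeneous point.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Least-squares solution: right singular vector with smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean

def project(P, X):
    """Project a 3D point into pixel coordinates with camera matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Toy setup: two 4K cameras 1 m apart, looking down the same axis (assumed).
K = np.array([[2000.0, 0.0, 1920.0],   # illustrative focal length and
              [0.0, 2000.0, 1080.0],   # principal point for a 3840x2160 sensor
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])               # camera at origin
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])  # shifted 1 m

tip = np.array([0.2, 0.1, 5.0])  # "true" tip position, 5 m from the cameras
recovered = triangulate_dlt(P1, P2, project(P1, tip), project(P2, tip))
print(np.round(recovered, 6))  # recovers the 3D tip position
```

With exact detections the point is recovered to numerical precision; in practice each camera’s detection is noisy, which is presumably why many views on both sides of the piste are combined rather than just two.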
The Los Angeles 2028 Games look like a natural stage for this system to be woven into the sport’s broadcast coverage.

In Xataka | We have been living for years with robots that beat us at chess. Now we have robots that beat us at tennis