The Olympics are more than just a sporting event—they are a showcase of human stories that inspire and captivate us. As media technology evolves, so does our ability to share these stories with audiences around the world. Artificial intelligence is one of the technologies transforming how we experience sports media, especially at the Olympic level. AI can help us tell the story of each game or match, convey the historical significance of the event, and keep fans engaged even after the event is over.
“AI automation holds the potential to revolutionize workflows, driving efficiencies across production and editing processes – for example, through automatic highlights generation and generative assisted editing. Moreover, AI has the potential to reduce the broadcast footprint through lower power consumption and physical space.”
Olympic AI Agenda, April 2024
From Past to Present
Ever since the modern Olympics began in 1896 (Athens), people have been fascinated by the games. And with each new era, the latest technological innovations were used to bring the games to viewers across the globe. The first coverage of the event came through newsreels in theaters. Live radio coverage began in 1924 (Paris), 100 years ago. You might be surprised to learn that television coverage began in 1936 (Berlin), with a kind of closed-circuit system that showed the games in public places near the stadium. It wasn’t until the 1960 games (Rome) that broadcasts reached other countries.
The Olympic Broadcasting Services (OBS) was established in 2001 with the core mission of acting as the host broadcaster for the games, delivering the sights and sounds to viewers all over the world. It began fulfilling that mission at the 2008 games (Beijing) and has been responsible for the main infrastructure and media-related services for every Olympics since. OBS has innovated constantly with each event, maintaining quality and reliability during a period of dramatic change in media technology.
AI is part of this change. Depending on how narrowly we define the technology, AI was first used by OBS at the 2018 games (PyeongChang), where it supported content tagging, recommendations, and language translation. It was also used in several other areas of the games, such as timekeeping systems and biometric analysis of athlete performance.
When in Paris…
According to OBS, the two-week 2024 Games will be the most technologically advanced yet. The games will be fully produced in native UHD HDR, with immersive 5.1.4 sound, using more than 1,000 camera systems and 3,600 microphones.
OBS will produce more than 11,000 hours of content and process more than 3,000 UHD and HD feeds within the International Broadcast Center (IBC), managing more than 80 different distribution feeds. The IBC covers about 40,000 square meters, a 13% reduction from the Tokyo 2020 Games (held in 2021). A total of 36 different venue broadcast compounds will be supported.
Amazingly, OBS does all this with only about 160 full-time employees. The core group expands to more than 8,000 people from more than 110 countries during the Games themselves. This is an incredible organizational and technological achievement that only grows more sophisticated with each Games.
Of course, AI is also playing a bigger role at this scale of event production and distribution. AI will be used for auto-clipping of content and descriptive metadata tagging. The technology will provide transcriptions and translations of interviews for journalists and help them find content. It will also be used to generate captions and subtitles in real time for live coverage. Data gathered by biometric and other sensors deployed throughout the event will be processed to provide unprecedented information to viewers, often in real time.
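As a rough illustration of what such an auto-clipping and tagging workflow might look like, here is a minimal Python sketch. The `transcribe` and `translate` functions are hypothetical placeholders for whatever speech-to-text and machine-translation services a broadcaster deploys, and nothing here describes OBS’s actual systems.

```python
from dataclasses import dataclass, field

@dataclass
class InterviewClip:
    clip_id: str
    audio_path: str
    athlete: str
    sport: str
    transcript: str = ""
    translations: dict[str, str] = field(default_factory=dict)
    tags: list[str] = field(default_factory=list)

def transcribe(audio_path: str) -> str:
    """Hypothetical placeholder for a speech-to-text model or service."""
    raise NotImplementedError

def translate(text: str, target_lang: str) -> str:
    """Hypothetical placeholder for a machine-translation model or service."""
    raise NotImplementedError

def tag_clip(clip: InterviewClip, target_langs: list[str]) -> InterviewClip:
    """Transcribe an interview, translate it, and attach searchable metadata."""
    clip.transcript = transcribe(clip.audio_path)
    for lang in target_langs:
        clip.translations[lang] = translate(clip.transcript, lang)
    # Crude keyword tags so journalists can search for the clip later;
    # a production system would use a much richer entity and metadata model.
    keywords = [w.strip(".,!?").lower() for w in clip.transcript.split() if len(w) > 7]
    clip.tags = [clip.athlete.lower(), clip.sport.lower()] + keywords
    return clip
```

The point is the shape of the pipeline: every clip ends up carrying its own transcript, translations, and searchable tags, which is what lets journalists find content quickly.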
Perhaps most impressively, it will be used to create automated highlights for potential distribution to several different platforms, including those using vertical formats. These will be generated on demand at any level of interest, from a country to a sport to an individual athlete, and can be tailored to mood or many other factors that producers may want to consider.
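A hedged sketch of how such an on-demand highlight request might be represented and fulfilled, again in Python and purely illustrative: the `HighlightRequest` and `Clip` types and the excitement score are invented for this example and do not reflect any actual OBS interface.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HighlightRequest:
    # Any field left as None means "don't filter on it".
    country: Optional[str] = None
    sport: Optional[str] = None
    athlete: Optional[str] = None
    mood: Optional[str] = None          # e.g. "dramatic", "celebratory"
    max_duration_s: int = 90
    aspect_ratio: str = "16:9"          # "9:16" for vertical platforms, applied downstream

@dataclass
class Clip:
    duration_s: int
    country: str
    sport: str
    athlete: str
    moods: tuple[str, ...]
    excitement: float                   # model-assigned score between 0 and 1

def build_highlight(request: HighlightRequest, candidates: list[Clip]) -> list[Clip]:
    """Pick the most exciting matching clips that fit within the time budget."""
    def matches(c: Clip) -> bool:
        return (
            (request.country is None or c.country == request.country)
            and (request.sport is None or c.sport == request.sport)
            and (request.athlete is None or c.athlete == request.athlete)
            and (request.mood is None or request.mood in c.moods)
        )

    selected: list[Clip] = []
    used = 0
    for clip in sorted(filter(matches, candidates), key=lambda c: -c.excitement):
        if used + clip.duration_s <= request.max_duration_s:
            selected.append(clip)
            used += clip.duration_s
    return selected
```

Because every field of the request is optional, the same selection logic could serve a national broadcaster, a single sport, or a fan following one athlete.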
2026 (Milan) and Beyond
AI technologies of all kinds will continue to help media companies produce major live events with richer and more sophisticated viewing experiences. Even in the areas where AI is already in use, it is easy to see the potential for much more. For example, clips and highlights packages could be generated for smaller audiences, or even an individual viewer, in a reasonably cost-effective way.
These highlight-generation technologies could incorporate even more relevant stats and help producers and viewers find more of the interesting “gems” hidden in the content. They can also help production teams fill gaps in the action with real-time context setting, or with information about the sport, scores, or other key storylines. They can bring real-time information with context directly to presenters and allow them to provide even more texture for audiences.
AI can also help production crews keep an eye on what’s happening outside the venue and maintain a more complete, 360-degree view of the “story of the games.” Audiences always want more, and we’re reaching the limits of what’s feasible for a human production crew, so working AI into production and creative workflows will be crucial to delivering better experiences. It will bring otherwise hidden stories to those who want to see them. AI is unlikely ever to be used to alter the “reality” a fan sees; instead, it should serve as an enabler that enhances how we produce the content.
The potential for malicious uses of generative AI to create fake or distorted content is an important concern, and one that is being taken seriously by the IOC (International Olympic Committee), OBS, and broadcasters around the world. OBS has been explicit in its commitment not to tamper with the video.
One technology I would anticipate playing a role in this regard at coming games is C2PA (Coalition for Content Provenance and Authenticity), an open technical standard that gives publishers, creators, and consumers the ability to trace the origin of different types of media. This will be critical in many areas beyond sport.
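To make the provenance idea concrete, here is a deliberately simplified Python sketch of the underlying pattern: bind a manifest of claims to a hash of the media bytes, then sign it so any tampering can be detected. Real C2PA manifests are embedded in the media file and signed with X.509 certificates and public-key cryptography; the shared-key HMAC below is only a stand-in to keep the example self-contained.

```python
import hashlib
import hmac
import json

def create_manifest(media_bytes: bytes, claims: dict, signing_key: bytes) -> dict:
    """Bind a set of provenance claims to the exact bytes of a media file."""
    manifest = {
        "claims": claims,  # e.g. producer, capture device, edit history
        "content_hash": hashlib.sha256(media_bytes).hexdigest(),
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return manifest

def verify_manifest(media_bytes: bytes, manifest: dict, signing_key: bytes) -> bool:
    """Return True only if neither the media nor the claims have been altered."""
    unsigned = {k: v for k, v in manifest.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected, manifest["signature"])
        and hashlib.sha256(media_bytes).hexdigest() == manifest["content_hash"]
    )
```

If either the media bytes or the claims are altered after signing, verification fails, which is exactly the property broadcasters want for authentic footage.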
Between the Games
We can start thinking now about how AI will help tell the story between the games. It might not be obvious, but 17 days of competition isn’t enough to tell the lifetime story of how an athlete got there. When the flame goes out, preparation for the next games begins right away. AI tools that look beyond the content broadcasters generate, into social platforms and the broader “conversation,” can help audiences experience the whole journey to the games and beyond.
The legacy for competitors, for cities, and for fans, whether they were present at these epic moments or watched on TV far away, can be joined up through the careful application of AI, keeping fans activated throughout the cycle and maintaining interest.
At its core, the Olympic games are a human event—an AI cannot compete in the games and can never tell these stories in a way that bonds emotionally with viewers. AI’s roles are becoming clearer, however. In addition to enabling greater scale and sophistication, it is time to see AI as a technology that brings more of the “truth” to viewers. AI can assist us in bringing out powerful stories by helping us find and present more relevant and informative content than ever.
John Footen is a managing director who leads Deloitte Consulting LLP’s media technology and operations practice. He can be reached at usmediamatrix@deloitte.com.
Source: tvtechnology.com