The IOC and its major rights holders like NBC and Discovery — which have paid billions for the privilege — will be hoping to immerse viewers in wall-to-wall Olympic action these next two weeks. Broadcast coverage of the competition is critical to these organizations, all the more so this year without the color of fans cheering at the venues.
Olympic Broadcasting Services (OBS) has prepared well, though, arming itself with a fleet of innovations intended to enhance coverage and attract Gen Z viewers. On the face of it, this could be the most immersive Games yet. Let’s look at a few of the ways it plans to do this.
UHD HDR
First and foremost, the host feed is being delivered in 4K Ultra HD and HDR for the first time. There’s no doubt this upgrade in picture quality will enhance the viewing experience. Better color, better contrast, better pixels. Quite how many viewers will see it, though, is open to question. NBCU, for example, is distributing some UHD feeds, but these will be upconverted from HD. The BBC isn’t taking UHD either, declaring it too expensive to backhaul from Tokyo. Discovery is showing UHD, but only on one channel, available to viewers in Europe via Discovery+.
UHD Audio
UHD also means an audio upgrade to a standard 5.1.4 configuration “to enable viewers to have a more realistic audio experience” by adding overhead mics. But with no spectators in the venues, this immersive experience may not be one OBS want to shout about. It could just be the sound of proverbial tumbleweed, or shouts in an echo chamber. On the other hand, we may actually hear the coaching commands, the outbursts from athletes and the referees’ directions in ways we simply cannot when they are drowned out by crowds.
Multi-Camera Replay Systems
Between 60 and 80 robotic 4K cameras are at select venues, including those hosting gymnastics, athletics, BMX freestyle, street skateboard, sport climbing and volleyball. The feeds will be stitched together to create multi-cam replay clips — an effect OBS say is similar to the bullet-dodging sequences in The Matrix.
2D Image Tracking
Video tracking technology will help viewers follow the position of the athletes across sports, including the marathon and race walks, road cycling and mountain biking, triathlon and canoe sprint. Instead of GPS positioning or wireless equipment, OBS’ 2D image tracking relies purely on image processing. A ‘patch’ (a square) is defined on selected video frames to identify each of the athletes/boats. The computer then creates a ‘label’ attached to each of the identified athletes/boats, which is maintained even as the image changes. This captured data is then made available to a graphics rendering platform for on-screen presentation. Additional data captured using more traditional GPS positioning can be combined with the ‘labels’ to show athletes’ identities, their speed, distance to the finish or position relative to the leader.
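OBS hasn’t published implementation details, but the patch-and-label idea can be illustrated with plain template matching. The Python sketch below is an illustrative assumption, not OBS’ system: the function name, the 0.6 confidence threshold and the initial patch boxes are all invented for the example. It cuts a patch for each athlete or boat from the first frame, re-locates it in later frames, and yields per-label positions of the kind a graphics renderer could consume.

```python
# Minimal sketch of patch-based 2D image tracking (not OBS' actual code).
import cv2


def track_patches(video_path, initial_patches):
    """initial_patches: {label: (x, y, w, h)} patch boxes drawn on the first frame."""
    cap = cv2.VideoCapture(video_path)
    ok, first = cap.read()
    if not ok:
        return
    # Cut a template 'patch' for every athlete/boat from the first frame.
    templates = {
        label: first[y:y + h, x:x + w]
        for label, (x, y, w, h) in initial_patches.items()
    }
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        positions = {}
        for label, patch in templates.items():
            # Normalised cross-correlation finds where the patch best matches.
            result = cv2.matchTemplate(frame, patch, cv2.TM_CCOEFF_NORMED)
            _, score, _, top_left = cv2.minMaxLoc(result)
            if score > 0.6:  # confidence threshold chosen for illustration
                positions[label] = top_left
        # The 'labels' and positions would be handed to the graphics platform.
        yield positions
    cap.release()
```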
Biometric Data
Biometric data will be captured just for the archery contest, made possible by the athletes’ stillness. Four cameras will be trained on their faces to analyze any slight changes of skin color generated by the contraction of blood vessels. Through an on-screen graphic, audiences will be able to witness the heartbeat variations and adrenaline rush experienced by the archer’s body as they shoot their arrow.
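The underlying technique is generally known as remote photoplethysmography: subtle changes in blood volume slightly alter the skin’s color, most visibly in the green channel. OBS and its vendor haven’t disclosed their exact method, so the Python sketch below is only a hedged illustration of the principle, with the function name, the RGB input format and the 42–180 bpm search band all assumed for the example.

```python
# Hedged sketch of camera-based heart-rate estimation (remote
# photoplethysmography); not the broadcaster's actual implementation.
import numpy as np


def estimate_heart_rate(face_frames, fps):
    """face_frames: sequence of HxWx3 RGB crops of the archer's face."""
    # Blood-volume changes modulate skin colour, most visibly in green,
    # so track the mean green value of the face region per frame.
    green = np.array([frame[:, :, 1].mean() for frame in face_frames])
    green = green - green.mean()            # drop the constant skin tone
    spectrum = np.abs(np.fft.rfft(green))   # frequency content of the signal
    freqs = np.fft.rfftfreq(len(green), d=1.0 / fps)
    # Assume the pulse lies between 0.7 Hz (42 bpm) and 3.0 Hz (180 bpm).
    band = (freqs >= 0.7) & (freqs <= 3.0)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                      # beats per minute
```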
360-Degree Replays
Intel’s True View technology will come into play during basketball games. Thirty-five 4K cameras are mounted at the concourse level of the Saitama Super Arena to capture volumetric video that, once processed, renders 360° replays, bird’s-eye views and freeze frames from any perspective on the court. OBS will produce up to 10 True View clips for every basketball game.
Virtual Reality
OBS plans 110 hours of live immersive 180-degree stereoscopic and 360-degree panoramic coverage from the Opening and Closing Ceremonies, as well as from select sports like beach volleyball and gymnastics — sports chosen based on the possibility of getting cameras closest to the athletes.
Discovery subscribers will be able to view the VR coverage, as will users of the NBC Olympics VR by Xfinity app, which includes watch parties for Oculus friends.
12K Video Wall
A 50-meter-wide screen will display 12K-resolution footage of the sailing events that spectators have traditionally watched from nearby piers with binoculars. Floating on the water of Enoshima Yacht Harbor, the screen will give spectators the sensation of the races being held right in front of their eyes. If there were any spectators…
5G and AR Glasses
The total absence of spectators — including VIP decision makers — is a blow for Intel, Alibaba and other tech sponsors that have created demos of immersive experiences for people to enjoy at the venues. These include wearable glasses at the swimming venue delivering AR graphics over 5G, and video replays available to golf fans at the Kasumigaseki Country Club, also over 5G.
5G won’t be a complete washout though. Footage from the Opening and Closing Ceremonies will be contributed over 5G in a test for future Olympic events.
Mobile First
Digital publishers can draw on Content+, a repository of up to 9,000 clips and short-form assets. This includes behind-the-scenes content from the venues, purposefully filmed with smartphones. Artificial intelligence is also being trialed as a means to automate the logging and clipping of all this video and hence distribute that medal-winning celebration to social media in an instant.
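OBS hasn’t described its Content+ tooling, but the clipping half of the job is straightforward to picture. The sketch below assumes an upstream detector has already timestamped the moment of interest and simply cuts a short clip around it with ffmpeg; the function name and the pre/post padding are illustrative, not OBS’ pipeline.

```python
# Illustrative sketch of automated highlight clipping, not OBS' pipeline.
import subprocess


def cut_highlight(source_file, event_time_s, out_file, pre_s=5, post_s=10):
    """Cut a short clip around a detected moment, e.g. a medal celebration."""
    start = max(0, event_time_s - pre_s)
    duration = pre_s + post_s
    subprocess.run([
        "ffmpeg", "-y",
        "-ss", str(start),    # seek to just before the detected event
        "-i", source_file,
        "-t", str(duration),  # total clip length
        "-c", "copy",         # no re-encode, so the clip is ready in seconds
        out_file,
    ], check=True)
```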
Bringing AI to Timekeeping
Omega, the Official Timekeeper of the Tokyo Olympic Games, has provided official timekeeping services to the Olympics since 1932. In the decades since, Omega has focused not only on improving its accuracy (at the London Games in 2012, the company introduced its Quantum Timer with a resolution of one millionth of a second, 100 times greater than previous devices), but also on developing innovative techniques for monitoring new Olympic sports such as skateboarding, climbing, surfing and beach volleyball.
According to Wired, Omega Timing’s R&D department spent the past four years training its in-house AI system to learn beach volleyball. “In volleyball, we’re now using cameras with computer vision technologies to track not only athletes, but also the ball,” Alain Zobrist, head of Omega Timing, told Wired. “So it’s a combination where we use camera technology and artificial intelligence to do this.”
The AI was trained to recognize various shot types and the ball’s flight path. This data is combined with information from gyroscope sensors placed in the players’ clothing, processed in less than one-tenth of a second, and fed live to broadcasters for use in commentary or on-screen graphics.
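Omega hasn’t published how the two data streams are joined, so the sketch below is only an assumed illustration of the fusion step: a vision-derived shot classification and a gyroscope-derived jump measurement are merged into a single event and published to broadcasters inside the one-tenth-of-a-second budget mentioned above. All names and fields are invented for the example.

```python
# Assumed illustration of the vision/gyroscope fusion step; Omega's
# actual system is not public.
import time
from dataclasses import dataclass


@dataclass
class ShotEvent:
    player: str
    shot_type: str          # e.g. "spike" or "block", from the vision model
    ball_speed_kmh: float   # estimated by tracking the ball in the video
    jump_height_cm: float   # derived from the gyroscope in the player's kit


LATENCY_BUDGET_S = 0.1  # "less than one-tenth of a second"


def fuse_and_publish(vision_result, gyro_sample, publish):
    """Merge one vision result with one gyroscope sample and publish it."""
    start = time.perf_counter()
    event = ShotEvent(
        player=gyro_sample["player"],
        shot_type=vision_result["shot_type"],
        ball_speed_kmh=vision_result["ball_speed_kmh"],
        jump_height_cm=gyro_sample["jump_height_cm"],
    )
    publish(event)  # e.g. pushed to the broadcast graphics feed
    elapsed = time.perf_counter() - start
    assert elapsed < LATENCY_BUDGET_S, "fusion exceeded the latency budget"
    return event
```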