IGBS: Broadcasting the most spectacular Rugby World Cup in history
For the very first time, World Rugby, the sport's global governing body, decided to entrust the Rugby World Cup™ 2019 to a specialised host broadcaster instead of relying on a local provider. The result, enjoyed by a global audience, was the most technologically advanced transmission of the championship to date. David Shield, co-project director at IGBS (International Games Broadcast Services), gives us a complete overview of how the technical production of the tournament, held in Japan, was carried out.
Text by Sergio Julián
The Rugby World Cup™ is a truly ambitious competition: seven weeks, 12 stadiums… What was the biggest technical challenge of the event?
Sourcing the OB vans. Japan had a limited number of large OB vans capable of accommodating the requirements of up to 34 cameras, and those that did exist were being used for unilateral coverage by the Japanese rights-holding broadcasters, NHK, NTV and JSports. As a result, IGBS employed a ‘flyaway’ model, with nine technical kits being flown in and assembled in temporary buildings at venues. Even with nine kits there were numerous internal moves required in order to cover the 12 venues and keep one kit available for a ‘disaster recovery’ venue.
What is the difference between the Rugby World Cup™ and other similar sporting events?
The schedule of a rugby tournament is longer than that of a football tournament with the same number of teams, because teams require at least five days to recover between matches. The pitches need recovery time as well. This means longer on-site operations for our teams and specific planning for cost-efficient resource management.
Japan, as you know, drives technological change and implementation. NHK Labs are a great example of this. Did you have the opportunity to try any new technology for this event?
NHK provided 8K coverage of selected matches, with Japanese graphics derived from the same data source as the 4K and 2K match coverage. Canon provided clips from their Stadium Vision 360° coverage at Yokohama Stadium.
I would like to ask about the basic production resources for the matches. How many cameras do you use?
Either 23 or 28 cameras were used per match during the group phase, depending on the specifications of the venues; some stadiums, for example, were simply not structurally able to accommodate the cable camera system. This rose to 32 cameras for knockout-phase matches, thanks to the addition of four corner-flag cameras. For the semi-finals and the final, the 32-camera plan rose again to 34, with two additional SSM (super slow motion) cameras on the reverse side at the five-metre line.
I’ve seen that a Spidercam is part of your basic coverage. Have you included other innovative resources in the production of the games?
Having a cable camera system available at 34 of the 48 games allowed us to implement some augmented reality graphics for team line-ups, stadium identification, half-time and full-time score graphics.
What is your standard of production? Are you working in HD or 4K?
All matches were produced in multiple formats, with UHD/SDR (4K), 1080p and 1080i all available.
What resources do you use for the pre-match coverage? I’m thinking about the images of the fans arriving at the stadiums, etc.
Sony FS7 ‘cine-style’ cameras were deployed around the concourse of the stadium and the surrounding areas to capture engaging images of the fans arriving and the pre-match atmosphere in the host city. These cameras give high-end visuals and a cinematic feel to the coverage. Furthermore, from the quarter-finals onwards we used LiveU units to provide live or near-live shots of teams departing their hotels and of the fan zones.
An important part of current sports broadcast productions are graphics. I’ve seen some amazing videos with tracking techniques. What graphics systems are you implementing? How are you applying those technologies?
Augmented Reality (AR) graphics were deployed to enhance the pre-match coverage on the World Feed, with images overlaid on the pitch and cable camera operators framing the wide shot to give the effect maximum impact.
Stats and big data are also taking an increasingly important role in these productions. Are you working in this field?
A dedicated social media offering used match and player statistics, both from the event and historically, to generate engaging assets that allowed broadcasters to extend their interaction with audiences beyond the match days and continue the debate across their social media channels. This content was delivered in a fully-customisable format, allowing them to tailor the material for their market.
Live streaming is important, but coverage through social networks and/or OTT is also extremely relevant. How are you working in that area?
As part of the social media package, 360° Virtual Reality (VR) clips were made available.
There are plenty of new technologies coming that will redefine the way we consume sports. I’m thinking about 8K, HDR, 360° or 5G. Are you doing any tests of these technologies at the World Cup?
NHK provided 8K coverage of selected matches. IGBS had 360° cameras embedded with its ENG crews. NTT Docomo used RWC footage to demonstrate the low latency of their 5G delivered pictures.
Could you name a particular technical issue that you had during the technical production of the event, and how you solved it?
Typhoon Hagibis provided an opportunity to show how we could maintain production of the matches outside the typhoon’s path, even whilst we had to close the IBC completely.
In your opinion… what does the future of sports broadcasting look like? What trends will be established for future World Cups?
This is the first Rugby World Cup™ for which World Rugby has contracted a specialised host broadcaster rather than relying upon coverage from the domestic licensee. This is increasingly the model for federations that want to take more control of their output.