100 players and 25 teams with all their movement, kills, ammo and equipment available as real-time data.
In the chaos of the final circle, play is fast and furious.
Straight after the game, though, it is fascinating to trace the routes the winning teams took and how they got there; but there is no time to pause for long, because another game is about to begin.
The client wanted the ability to see the whole map superimposed as a 3D AR object in their studio and be able to trace the movement of teams throughout the map whilst the presenter and players could talk the audience through the key moments.
Solution
We took our Ignition framework and built a control system to allow operators to select the key moments and play through them on cue.
Data
Fanview provided an API onto PUBG's data set in the form of JSON feeds on multiple HTTP endpoints. For this graphic we didn't need every element of the data (for example, which type of grenades a player had, or which weapon they were using - all of which was available), but we did need the data relevant to the graphic. This is where the original specification of the graphic's functionality is so important. We worked closely with the client to understand the specification well, and to see how the technology could best tell the story.
So, from the JSON endpoints we only took the data relevant to us:
- Flight path of the plane that drops the players
- Team routes & landing positions
- Position of the safe zone over time
- Location labels
- Kill positions
All of these would need to be presented with in-vision animations.
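The filtering described above can be sketched in a few lines. This is a hypothetical illustration only: the event types, key names and feed shape are invented for the example and are not the real Fanview API.

```python
import json

# Only these event types feed the graphic; everything else in the
# telemetry (weapons, grenades, ammo...) is ignored.
RELEVANT_EVENTS = {"plane_path", "team_route", "safe_zone", "kill"}

def extract_graphic_data(raw_feed: str) -> dict:
    """Reduce a full telemetry feed to the elements the graphic uses."""
    events = json.loads(raw_feed)
    graphic = {"plane_path": [], "team_routes": {}, "safe_zones": [], "kills": []}
    for ev in events:
        kind = ev.get("type")
        if kind not in RELEVANT_EVENTS:
            continue  # drop data the graphic does not need
        if kind == "plane_path":
            graphic["plane_path"].append(ev["position"])
        elif kind == "team_route":
            # accumulate each team's route as an ordered list of positions
            graphic["team_routes"].setdefault(ev["team_id"], []).append(ev["position"])
        elif kind == "safe_zone":
            graphic["safe_zones"].append(
                {"centre": ev["centre"], "radius": ev["radius"], "t": ev["time"]}
            )
        elif kind == "kill":
            graphic["kills"].append(ev["position"])
    return graphic

# A tiny sample feed: the weapon pickup is available but irrelevant, so
# it never reaches the graphic.
feed = json.dumps([
    {"type": "plane_path", "position": [0, 0], "time": 0},
    {"type": "weapon_pickup", "item": "AKM", "time": 5},
    {"type": "team_route", "team_id": 7, "position": [120, 340], "time": 10},
    {"type": "kill", "position": [130, 350], "time": 12},
])
data = extract_graphic_data(feed)
```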
Graphics
Fanview provided a Cinema 4D file for the map, which we imported into the chosen renderer, Ross Video's XPression. From there we created all the necessary texture effects and 3D objects to draw the paths in real time.
Idonix added:
- Team markers
- Safe zone circles
- Team routes & animations
- Location labels
- Texture effects
- 3D splines to create the paths
- 3D model of the aeroplane (from a Cinema 4D file supplied by Fanview)
Scripting
A good deal of scripting was needed within XPression to create everything we needed: placing the overlay objects such as kill locations, place names, flight paths and team routes, and setting the data into each object; setting the player images, team names and place names; and triggering animations so overlays animated in and out when needed. All of this was done in VBScript.
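The production logic lived in VBScript inside XPression, but the pattern is language-neutral: each overlay object is placed, has its data set into it, and is animated in and out on cue. A minimal Python sketch of that pattern, with all names invented for illustration:

```python
class Overlay:
    """A placed in-vision object, e.g. a kill marker or place name."""

    def __init__(self, kind: str, position):
        self.kind = kind          # "kill", "place_name", "team_route"...
        self.position = position  # map coordinates for the marker
        self.fields = {}
        self.visible = False

    def set_data(self, **fields):
        # Player images, team names, label text etc. are pushed into
        # the object before it is taken to air.
        self.fields.update(fields)

    def animate_in(self):
        # In XPression this would trigger the object's take-in animation.
        self.visible = True

    def animate_out(self):
        self.visible = False

# Place a kill marker, set its data, and cue it in.
kill_marker = Overlay("kill", position=(412.0, 873.5))
kill_marker.set_data(team_name="Team 7", player_image="player_07.png")
kill_marker.animate_in()
```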
Workflow
Post-match analysis had to be available quickly, so Ignition allowed operators to cue up the teams they wanted to focus on. Using the preview in Ignition, they would step through and confirm each animation before playing it on air.
An operator could:
- Make a new sequence
- Choose to add place names
- Choose to add flight path & colour
- Choose to add safe zones & stages
- Select teams
- Add a pop-up video player at the final location with gameplay video from a live input
The interface provided data-confidence monitoring, showing a red mark if any data was invalid or missing.
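The data-confidence check amounts to validating every field a sequence needs before it can be played to air. A hypothetical sketch, with field names invented for the example:

```python
# Fields a sequence needs before it is safe to play out; names are
# illustrative, not the real Ignition schema.
REQUIRED_FIELDS = ["flight_path", "safe_zones", "team_routes", "landing_positions"]

def confidence_marks(sequence_data: dict) -> dict:
    """Return 'green' or 'red' per required field for the operator UI."""
    marks = {}
    for field in REQUIRED_FIELDS:
        value = sequence_data.get(field)
        marks[field] = "green" if value else "red"  # missing or empty -> red
    return marks

marks = confidence_marks({
    "flight_path": [(0, 0), (8000, 8000)],
    "safe_zones": [{"centre": (4000, 4000), "radius": 1500}],
    "team_routes": {},          # empty: the feed gave us nothing -> red
    "landing_positions": None,  # field dropped entirely -> red
})
```

An operator would only take the sequence to TX once every mark showed green.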
Once the sequence was prepared it was ready to play out to TX with confidence.
James also notes that the challenges resulted in some extra features not originally planned:
“Because we weren't in-studio and were working remotely in different time zones we added a lot of user-friendly settings in Ignition that would assist in setting up and configuring the map. Things like being able to move, scale and rotate the map to line it up with the physical set, and also to toggle on and off guides and markers, all without needing to touch XPression meant the client could do it themselves.”
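The map-alignment settings James describes boil down to a standard 2D transform: scale, rotate, then offset each map vertex so the AR map lines up with the physical set. A minimal sketch, with parameter names assumed for illustration:

```python
import math

def align_point(x, y, offset=(0.0, 0.0), scale=1.0, rotation_deg=0.0):
    """Scale, rotate about the origin, then translate one map vertex."""
    r = math.radians(rotation_deg)
    xs, ys = x * scale, y * scale
    # Standard 2D rotation about the origin.
    xr = xs * math.cos(r) - ys * math.sin(r)
    yr = xs * math.sin(r) + ys * math.cos(r)
    # Translate into studio-floor coordinates.
    return (xr + offset[0], yr + offset[1])

# Operator settings from the Ignition UI: double the scale, quarter turn,
# then nudge the map to match the physical set.
corner = align_point(1.0, 0.0, offset=(2.0, 3.0), scale=2.0, rotation_deg=90.0)
```

Exposing just these three settings in Ignition meant the client could realign the map themselves, without touching XPression.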
Some months later came challenge number two: