
Guitar Hero Live

In recent years the convergence between film and games has been much heralded. Certainly the tools and techniques pioneered in both worlds have been used to capture actor performances and achieve photorealistic rendering and real-life camera moves like never before. But with the new Guitar Hero Live from FreeStyleGames (FSG) and Activision, that film/games cross-over has been taken to yet another level of co-operation. The game's live mode now puts the player in front of filmed band members and audiences, who express both their delight and displeasure with the performance by reacting in real time.

To achieve that new style of gameplay right as the player struts their stuff, a myriad of techniques were employed. Real band members were performance captured, which then drove camera moves acquired in multiple layers on a high-speed Bolt Cinebot motion control rig in front of partial concert crowds. The gamemakers then called on Framestore to fill out the scenes with CG backgrounds and additional CG crowds, plus deal with transitions between the warm and cold audiences. This visual effects approach was no small feat. The resulting lengthy shots sometimes ran to 12,000 frames and saw Framestore produce more than a petabyte of data for the project.

fxguide talked to key members of the teams behind the work from FSG and Framestore.

Why go live?

A screenshot from Guitar Hero Live shows the game highway and live action crowds.

Guitar Hero is already one of the most popular – and immersive – games, so why change it? “It began in discussion between our creative director Jamie Jackson and Eric Hirshberg, the CEO of Activision,” recounts Freestyle art manager Mike Rutter. “It was about trying to re-imagine Guitar Hero and what makes it special. Eric had the idea of turning the camera around and looking out at the crowd. We were trying to re-create that space and that slight fear or trepidation you have when you’re about to get up on stage and play in front of a lot of people.”

Added to this concept was the idea of setting Guitar Hero Live within the landscape of music festivals. “They’re a big part of summer here in England,” says Rutter. “So we wanted to represent that and show the different styles of music and different people all in the same place altogether.” Freestyle team members visited several festivals for reference – and ‘expensed the tickets back!’ notes Rutter. Ultimately the two festival areas became SoundArm Music Festival in the UK countryside and Rock the Block in the U.S.

Finding a point of view

Shooting on set with the Bolt Cinebot.

Just what a Guitar Hero Live player would see as they were looking out over the audience, and how that footage would be acquired, involved a degree of trial and error. There were some initial experiments involving helmet-mounted and hand-held cameras and Steadicams, before a motion control setup was utilized. “We would try to get a natural head rotation and movement,” explains Rutter, “but if you re-create human motion too closely it can sometimes cause motion sickness. You’re trying to find a balance between a realistic experience and something that’s fun to watch and engage in.”

At this stage, a decision was also made to feature both the positive and negative reactions from the band and audience, which meant the repeatability of camera moves became important to switch between those moods. “We then started testing a variety of motion control rigs but eventually settled on the Bolt through the VFX Co in London,” says Framestore visual effects supervisor Pedro Sabrosa. “The camera moves had to strike a balance between creating a POV that was believable but not too hectic for the viewer. The Bolt was nimble, robust and relatively compact for the kind of moves we were doing.”

Although it solved the problem of fluid camera moves and repeatability, the motion control setup still had its own set of challenges. “Because the Bolt rig is quite big and needs its track on stage,” notes Rutter, “you have to take into account things like where’s the track, where is its range of motion, how far can you see around a corner? What needs to be greenscreen and what needs to be a real set?”

The first attempt at capturing the action with the Bolt rig also showed the team that even more realistic POV motion was required. “That first time,” recalls Rutter, “the camera operator was behind the camera controlling it through what was effectively a remote control joystick. It didn’t really give you a sense of human movement. It felt more like being an observer or a camera, and you needed to feel part of the band.”

A solution was found in motion capturing the performance first instead. “We would do full performance capture with previs geometry so that we actually had a camera operator as the guitarist, as a member of the band performing with the band, capturing that performance from their point of view,” outlines Rutter. “Then we would track the camera and do full motion capture on that rehearsal performance, so it really was a person on stage moving and reacting to what was happening. Once we had that captured, our animation team could then clean up or smooth out any kind of unwanted jitter or noise that made for uncomfortable viewing. We had as much control as we needed to get that kind of immersive feel to be the guitarist in the band.”
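Cleaning that “unwanted jitter or noise” out of a captured POV camera is, at heart, a curve-smoothing problem. The sketch below is a minimal illustration of the idea rather than FSG's actual tooling: it runs a Gaussian filter over per-frame camera channels, with the filter width standing in for how aggressively an animator might smooth. (Euler rotation channels would need care near wraps; quaternion filtering is safer, but the principle is the same.)

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def smooth_camera_channels(channels, sigma_frames=4.0):
    """Smooth per-frame camera curves (e.g. tx, ty, tz, rx, ry, rz).

    channels: dict of channel name -> 1D array of per-frame values.
    sigma_frames: Gaussian width in frames; larger removes more jitter,
    but also more of the operator's intentional motion.
    """
    return {name: gaussian_filter1d(np.asarray(values, dtype=float),
                                    sigma=sigma_frames, mode="nearest")
            for name, values in channels.items()}

# Hypothetical example: a noisy pan captured during the mocap rehearsal.
frames = np.arange(240)
noisy_ry = 10.0 * np.sin(frames / 60.0) + np.random.normal(0, 0.4, frames.size)
smoothed = smooth_camera_channels({"ry": noisy_ry}, sigma_frames=6.0)
```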

How the shoot went down

The Bolt camera enabled repeatable and controllable moves.

Once a methodology had been found for acquiring the live action footage, it followed a defined set of events for each song. First, the band and camera operator would rehearse the performance and camera move, which were then edited into a sequence of three to five continuous song performances.

Next, that same performance would be repeated on the motion capture stage. “The stage floor was marked out to match the dimensions of the real stage as well as markings for the chosen location of the motion control track,” explains Sabrosa. “The camera operator then had a specific area in which to work based on the length of the track and the reach of the Bolt rig arm. The captured camera move was then tweaked to fine-tune timings and framing, and to apply any constraints based on the motion control rig. From these sessions we would get a POV camera move for the motion control rig and a previs for the songs.”
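One way to picture those rig-based constraints is as a post-process that keeps every frame of the captured move within the Bolt's reach from its track. The snippet below is a simplified, hypothetical version of that check, assuming a straight track segment and a single maximum arm reach; the real rig has far more nuanced limits (joint ranges, speed, acceleration).

```python
import numpy as np

def clamp_to_rig_reach(cam_positions, track_start, track_end, max_reach):
    """Clamp camera positions to within `max_reach` of a straight track.

    cam_positions: (N, 3) array of per-frame camera positions (metres).
    track_start, track_end: endpoints of the motion control track.
    max_reach: furthest the arm can place the camera head from its base.
    """
    a = np.asarray(track_start, dtype=float)
    b = np.asarray(track_end, dtype=float)
    ab = b - a
    clamped = []
    for p in np.asarray(cam_positions, dtype=float):
        # Nearest base position on the track for this frame.
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        base = a + t * ab
        offset = p - base
        dist = np.linalg.norm(offset)
        if dist > max_reach:  # outside the arm's reach: pull the camera back in
            p = base + offset * (max_reach / dist)
        clamped.append(p)
    return np.array(clamped)
```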

With the rehearsal and motion capture in hand, things moved on to the studio, or rather a couple of studio locations in London and Essex. Filming then took between two and four days for a set of three to five songs. The shoot lasted six months overall to allow for prep, rehearsals, motion capture and final capture. One of the important considerations was how much of the studio to build and how much would be covered by greenscreen, a task aided by an early previs effort from Freestyle. This also determined where the Bolt rig and the attached camera would be set up and how the song was performed. “The track for the Bolt rig would be laid out differently for every set to vary the position of the guitarist’s POV,” says Sabrosa. “The stage lighting, music and the Bolt all ran in sync with timecode. The band rehearsed extensively with the Bolt rig on the stage to make sure they were able to hit their marks in sync with the rig but without getting hit by it or tripping over the track.”

Live crowds, of course, were crucial. The band was filmed with the first three to six rows of crowd extras making up between 100 and 400 people, all reacting to the guitarist’s performance. “Each song had to be repeated for both positive and negative reactions from the band and audience,” notes Sabrosa. “An additional two to four passes of crowd were filmed without the band, moving the crowd further back from the stage, as well as changing their dress and individual positions.”

Live action crowds.

“The performance of the crowd in the additional passes had to be timed perfectly to the foreground layer of crowd so their performance was in sync throughout,” adds Sabrosa. “Additional VFX passes were filmed to help with comp and lighting. Each song was filmed one at a time but the motion control moves were designed to have all three to five songs run as one continuous set. Transitions such as wipes and whip pans were built into the camera moves to allow for multiple takes to be cut together in one song and help connect the different songs. At the start of each set there was an intro section which usually took place backstage while the band were getting ready for the gig. This was filmed handheld but our camera operator Leon would time the end of his move to connect to the beginning of the Bolt move.”

Almost as if it was a character or crew member itself on set, the Bolt rig was a crucial part of the overall performance, and had a special team attending to its every need. “The main operator ran the moves through Flair motion control software,” states Sabrosa. “Our director, Giorgio Testi, and operator could see the live feed from the camera and the previs running in sync. A script supervisor would read out pre-planned stage directions to the band throughout the performance to help them hit their marks and performance beats. As the camera move was locked it became more about the band and audience delivering their performance in sync with the predefined camera move. Follow focus was done live for every take.”

As noted, the Bolt was employed to enable repeatable moves, something that would aid in transitions and in camera tracking and matchmoving. Specific points had been designed into the camera moves to allow seamless cuts between positive and negative audience reactions, such as where the camera pointed at the ceiling or the floor with little surrounding action. As planning continued, however, it became clear that these moments occasionally felt forced. The team also found, notes Sabrosa, “that when playing the game you weren’t always aware that you had switched. So it was decided before we went into the main production to have a quick flare/light transition which could happen at any point in a song and allowed the player’s experience to flow a lot better.”
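That flare/light transition is essentially a brief, bright crossfade: ramp a flare element up, swap which audience plate sits underneath while the screen is mostly washed out, then ramp back down. The toy snippet below illustrates the idea on plain numpy image arrays; the plate and flare inputs and the timing values are placeholders rather than anything from the production.

```python
import numpy as np

def flare_switch(plate_positive, plate_negative, flare, t,
                 switch_at=0.5, flare_peak=0.5):
    """Blend between two audience plates behind a flare, for one frame.

    plate_positive, plate_negative, flare: float RGB images in [0, 1].
    t: normalised transition time in [0, 1].
    The plate swap happens at `switch_at`, when the flare is brightest,
    so the cut is hidden by the wash of light.
    """
    base = plate_positive if t < switch_at else plate_negative
    # Triangular flare intensity: 0 at the ends, 1 at flare_peak.
    intensity = 1.0 - abs(t - flare_peak) / flare_peak
    intensity = max(0.0, min(1.0, intensity))
    return np.clip(base + intensity * flare, 0.0, 1.0)  # additive flare over the plate
```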

Crowd control

Framestore extended the live action crowds with digital extras.

The on-set crowds proved an interesting challenge for the production. They had to be directed for all sorts of music and all sorts of reactions. “When you go to festivals and have different types of music, you are expecting quite different experiences – say heavy metal compared to a rock band,” says Rutter. “There are different levels of excitement and disappointment. So it was very much a feature of filming to get the right reaction. Most people naturally found the positive performance easier because it’s more comfortable – it’s more how you’re used to behaving when you go to music festivals and gigs. The skill of acting came through in the negative performance because it’s more uncomfortable to do.”

On set, the plan was to film multiple passes of crowd action, comp these together and augment with CG crowds and environments. “We originally tried to have a real person within a 10 to 12 meter radius of the camera to avoid having to build very detailed CG doubles,” states Sabrosa. “But we had to adapt to the changing scope and creative direction of the project as we went along. Before every shoot we would plan the most appropriate layout of the different crowd passes in Maya based on the number of extras we had and the camera move involved. This allowed the DOP to plan the different layouts of the green screen and lighting and the 1st AD to schedule these passes into the shoot day. As the crowd passes were filmed in the same light these passes in most cases came together in a rough form quite quickly. I did on-set comps of all layers to make sure there were no glaring sync issues between the camera, lighting and audience performance on the individual layers.”

Digital crowd generation relied on guidance from early tests done by Freestyle, with Framestore then adopting Golaem whole-heartedly to generate the crowds. The studio started by producing agents for audience members from in-house motion capture. “At this point,” says Framestore CG supervisor Alan Woods, “we still didn’t know the songs that would be used so we took an educated guess and worked to that. We knew we were creating a tight and compact crowd. So we took steps to restrain our performers’ movements so they wouldn’t dance in a way that would cause problems in the sims. For example, we would say, don’t put your arms out too far to the sides or the front. Keeping the crowd density and look right whilst having them dance and react was a tricky problem to try and solve.”

A few front rows of crowds were captured on greenscreen sets which had to play both excited and angry.

Prior to the shoot, Golaem was used to block out small crowds of people for test shots. In Maya, Framestore artists also blocked out the positions of the real actors that would be filmed – this helped visualize the shoot and make the most of the studio space and the crowds on hand. “Once we had footage in from the shoot (we shot lots of witness cameras of the stages and the real crowd),” says Woods, “we could start to try and match the feeling of the crowd better. There were moments in the songs where we had to match very specific movements like arms swaying or a big co-ordinated jump. We went back into the mocap suite to make sure we had coverage on all these actions and moments. We ended up with hundreds of clips of different reactions, ambient motions, claps – lots of claps! – air punching et cetera. All the things you’d expect to see in an amped up crowd – and a downbeat one as well!”

“Getting the feel of the crowd right was hard,” adds Woods. “Too much energy and it broke the illusion and started to look noisy. We had to feather the animation the further it got from the camera. Making each crowd feel unique was also tricky. The crowd in the metal set for example are more energized and reactive than the crowds in the folk set.”
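The feathering Woods describes, scaling crowd energy down the further agents sit from camera, maps naturally onto a per-agent attenuation factor. The sketch below is a generic illustration rather than Golaem's API: each agent gets an energy multiplier from a smoothstep falloff between a near and far distance (both values here are invented for the example).

```python
import numpy as np

def energy_falloff(agent_positions, camera_position, near=15.0, far=80.0):
    """Return a per-agent animation-energy multiplier in [0, 1].

    Agents within `near` metres of camera keep full energy; energy eases
    off towards `far`, where agents settle into low-key ambient motion.
    """
    d = np.linalg.norm(np.asarray(agent_positions, float)
                       - np.asarray(camera_position, float), axis=1)
    x = np.clip((d - near) / (far - near), 0.0, 1.0)
    return 1.0 - (3.0 * x**2 - 2.0 * x**3)  # inverted smoothstep

# The multiplier could then scale clip playback amplitude, or bias distant
# rows towards calmer motion clips, keeping the near crowd the most reactive.
```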

Framestore’s toughest challenge for the crowds, and indeed the entire project, was the long frame ranges – 12,000 frames on the longest shot. “We had no cuts to speak of and we spent long periods of time looking at the crowd,” says Woods. “We decided to sim it all as one take to ensure consistency of agent positioning. Once we hit the larger stages the size of the crowds being sim’ed also started to become an issue. We were simulating around 70-100k agents for long frame ranges. We ended up with around nine layers of crowd; the last four were so small we could re-use them between the positive and negative passes.”

The Golaem sims were then sent to Framestore’s proprietary procedural crowd tool, fMob. “Here,” explains Woods, “we could tweak the sim, remove bad agents and ones we didn’t like. We could also reposition agents and fill gaps that didn’t work once in camera. We then rendered it through Arnold via our in house scene generator. Rendering the whole frame range of a shot was expensive and had to be actively managed. We ran a lot of quick QC renders with simple ambient occlusion rendering to check the motion of the crowds and get a sign off. We did check sections of the song in isolation but to really get a feel for the song you had to see the whole thing rendered. We listened to the same songs over and over a lot!”

Turning the lights on the audience

Lighting the venue and the crowds became an important consideration as this is what the player would see.

The crowds fleshed out several exciting venues, each designed by Freestyle, which provided Framestore with XML-based layout files. “All of the modelling and texturing was done at Freestyle and so a way for the two pipelines to talk to each other was needed,” explains Woods. “We ended up ingesting each asset separately through an automated process. This process would bring in the model, textures and any reference renders from Freestyle. This would then get automatically turntable rendered and published to the database. We ended up with around 500+ different assets ranging from a rubbish bin to a massive skyscraper. Lookdev palettes were then created for each asset so they could be rendered.”

Framestore then crafted its own layout files based on the original XMLs supplied by Freestyle. “This again was scripted to be as easy as possible,” says Woods. “Any assets in the layouts that had to give a performance, for example lighting pods or stage lights, were separated out so they could be rigged and animated. We could then build a Maya scene which pulled all the different components in along with their palettes.”
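Ingesting a layout like this boils down to walking the XML and turning each entry into an asset reference plus a transform. The exact schema used between Freestyle and Framestore isn't public, so the snippet below invents a minimal one purely for illustration: an `<item>` element carrying an asset name and translate/rotate/scale attributes.

```python
import xml.etree.ElementTree as ET
from dataclasses import dataclass

@dataclass
class Placement:
    asset: str
    translate: tuple
    rotate: tuple
    scale: tuple

def parse_layout(path):
    """Parse a hypothetical layout XML into asset placements.

    Assumed (invented) schema:
      <layout>
        <item asset="rubbish_bin_01" t="1.2 0 -3.5" r="0 90 0" s="1 1 1"/>
      </layout>
    """
    def vec(s):
        return tuple(float(v) for v in s.split())

    root = ET.parse(path).getroot()
    return [Placement(item.get("asset"),
                      vec(item.get("t", "0 0 0")),
                      vec(item.get("r", "0 0 0")),
                      vec(item.get("s", "1 1 1")))
            for item in root.iter("item")]
```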

Then, the sets had to be lit. As Woods puts it, “lighting the environments was an interesting proposition because at any real concert the visual interest is on the stage and you are viewing that from the crowd. We had to flip that around and make it interesting to look the other way!”

Many different lights made up the final image – there might be lightbulbs on food stalls, party lights in a tent or uplights on trees. “The small UK stage in the forest was particularly tricky as we really wanted to feel that there was a bigger festival going on behind the trees,” adds Woods. “We looked at lots of reference of real festivals and gigs to try and see how a lighting designer might approach the challenge of lighting a forest or a city. The city environment ended up with thousands of lights and hundreds of assets; it was a beast.”

The aim, as always, was to make the lighting as realistic and immersive as possible. One method that was used to achieve realistic stage lighting involved capturing the light used during the band shoots. “We discovered early on that we could hook up a laptop to the back of the lighting desk and intercept all the data packets which controlled the lights using TouchDesigner,” explains Woods. “This was simple 8-bit channel data which we could turn into curves in Maya. Each light had a different specification with regards to how many channels it used and what they did. For example, a simple dimmer light would have one channel controlling intensity but one of the more flashy lights could take 32 channels controlling rotations, intensity, colour, gobo selection et cetera. By creating Maya rigs which had channel hookups for these attributes we could drive the lights and their performance without animating them for thousands of frames. That meant we could take a light from the stage and move it somewhere into our 3d environment and have it perform in the same way as the real one on the stage.”
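What Woods describes, recording the 8-bit channel values coming off the lighting desk and replaying them as animation curves, can be sketched independently of TouchDesigner or Maya. Below, a hypothetical channel map says which recorded channels drive which attributes of which light rig, and each per-frame value becomes a keyframe; in production those keys would be written onto the rigged lights' channel hookups.

```python
# Hypothetical channel map: recorded lighting-desk channel -> (light, attribute).
# A simple dimmer uses one channel; a moving head might use many.
CHANNEL_MAP = {
    12: ("dimmer_stage_left", "intensity"),
    40: ("movingHead_01", "pan"),
    41: ("movingHead_01", "tilt"),
    42: ("movingHead_01", "intensity"),
}

def channels_to_keys(frames_of_channel_data):
    """Turn per-frame 8-bit channel captures into keyframe lists.

    frames_of_channel_data: list of dicts, one per frame,
    mapping channel index -> raw 0-255 value.
    Returns {(light, attribute): [(frame, normalised_value), ...]}.
    """
    keys = {}
    for frame, channels in enumerate(frames_of_channel_data):
        for channel, raw in channels.items():
            target = CHANNEL_MAP.get(channel)
            if target is None:
                continue  # channel not mapped to any rigged light
            keys.setdefault(target, []).append((frame, raw / 255.0))
    return keys
```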

The lengthy frame ranges were, once again, tough from a lighting point of view. “The normal workflows and procedures to run out a shot were all subject to change,” says Woods. “Normally if you wanted to control a portion of a render you might run an extra pass for that area/asset or an id pass. But each new pass we added to the render could easily add on 10,000 CPU hours! Our CPU hour render estimates for the shots were easily upwards of 150,000 per shot. So we made extensive use of techniques like RGB lighting passes for all interactive light sources and let comp pull the lighting around. It was simply too expensive to risk burning the timing of the lights into the renders. We had to maintain maximum flexibility in comp.”
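The RGB lighting-pass trick packs three light groups into a single render: each interactive source is rendered into the red, green or blue channel, and comp later pulls each channel out, re-tints it and re-times it without another multi-thousand-CPU-hour pass. A minimal numpy illustration, with made-up gains and tints:

```python
import numpy as np

def recombine_rgb_light_pass(rgb_pass, tints, gains):
    """Rebuild lighting from an RGB-packed pass in comp.

    rgb_pass: (H, W, 3) float image; each channel holds one light group's
    contribution (rendered pure R, G and B respectively).
    tints: three (3,) RGB colours used to re-tint each group.
    gains: three scalar multipliers, which comp can animate over time
    instead of re-rendering the lights.
    """
    out = np.zeros(rgb_pass.shape[:2] + (3,), dtype=float)
    for i in range(3):
        out += gains[i] * rgb_pass[..., i:i + 1] * np.asarray(tints[i], float)
    return out
```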

By any measure, the show was a monster for Framestore to render and wrangle – using a petabyte’s worth of generated data. That was thanks to the frame ranges involved, the shortest being 6,000 frames and the longest 12,000. “The project was done at 30 fps and so we automatically had 20 per cent more frames to render vs a normal film project,” describes Woods. “Per frame render times were not the issue on this show, lighting did a great job of stripping out any unnecessary rendering. It was simply the number of frames which caused the issues.”

“An average shot,” adds Woods, “might have 10 to 15 passes, so 10 passes x 7500 frames on average = 75,000 frames to render! It was a monumental rendering task. We had to schedule the rendering very carefully to be able to get through the passes/shots we wanted that evening. Over the course of the project I think we ended up doing around 20 million CPU hours worth of rendering which would be over 2,200 years of rendering on a single processor.”
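Those figures hang together as straightforward arithmetic; a quick back-of-envelope check, using only the numbers quoted above:

```python
# Back-of-envelope check of the render figures quoted above.
passes_per_shot = 10
avg_frames_per_shot = 7500
frames_to_render = passes_per_shot * avg_frames_per_shot   # 75,000 frames per shot

total_cpu_hours = 20_000_000
years_single_cpu = total_cpu_hours / (24 * 365)             # roughly 2,283 years

print(frames_to_render, round(years_single_cpu))
```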

A new game highway

The new game highway.

On top of all the complicated live action/CG backgrounds, Freestyle determined to reinvent the game highway which players follow in order to strut the right notes. “We tried many different routes,” says Rutter, “such as having different buttons and actually not even having a guitar and using something like Kinect to capture your performance. We settled on this 3×2 layout on the guitar where you have two rows of three buttons. It adds more challenging game play and some realistic chord shapes.”

Asked whether a natural next step might be a virtual reality (VR) or augmented reality (AR) implementation of Guitar Hero, Rutter identified some obvious problems with such a game. “There’s a few challenges with VR and AR,” he says. “It feels like a natural step for games because it’s already in 3D. There’s not many hurdles to overcome to turn your scene from 3D into virtual reality. From that perspective it’s an easy step. But Guitar Hero has the split gameplay, the highway and the gems are coming at you in 3D effectively and you’ve got the scene behind you. In our experience, too, it’s difficult when you’re trying to lead someone around. If you’ve got the freedom to move yourself but then you have to get someone from backstage onto stage, you can throw people off quite quickly.”

Perhaps that’s not quite a ‘No’ for a VR or AR version of Guitar Hero. Clearly, Freestyle deeply care about the gameplay for users and about taking the game forward. Indeed, for Live, Rutter notes that “when we were looking to innovate the gameplay, we stripped it right back and started again and asked ourselves: What is it that makes playing Guitar Hero fun? What is it that’s cool about being a guitarist?”

It certainly looks like they found the answers.

Guitar Hero Live launched October 20 in the US and October 23 in the UK. There’ll also be an iOS version.