Building Gargantua – CERN Courier (2024)

Oliver James of DNEG, which produced the striking black hole in the film Interstellar, describes the science behind visual effects and the challenges in this fast-growing industry.

Gargantua: a variant of the black-hole accretion disk seen in the film Interstellar. Credit: DNEG/Warner Bros. Entertainment Inc./CQG 32 065001

Oliver James is chief scientist of the world’s biggest visual-effects studio, DNEG, which produced the spectacular visual effects for Interstellar. DNEG’s work, carried out in collaboration with theoretical cosmologist Kip Thorne, led to some of the most physically accurate images of a spinning black hole ever created, earning the firm an Academy Award and a BAFTA. For James, it all began with an undergraduate degree in physics at the University of Oxford in the late 1980s – a period that he describes as one of the most fascinating and intellectually stimulating of his life. “It confronted me with the gap between what you observe and reality. I feel it was the same kind of gap I faced while working on Interstellar. I had to study a lot to understand the physics of black holes and curved space–time.”

A great part of visual effects is understanding how light interacts with surfaces and volumes and eventually enters a camera’s lens. As a student, James was interested in atomic physics, quantum mechanics and modern optics. This, together with his two other passions – computing and photography – led him to his first job in a small photographic studio in London, where he became familiar with the technical and operational aspects of the industry. Missing the intellectual challenge offered by physics, in 1995 he approached the Computer Film Company – a niche studio specialising in digital film that was part of the emerging London visual-effects industry – and secured a role in its R&D team.

Suddenly these rag-dolls came to life and you’d find yourself wincing in sympathy as they were battered about

Oliver James

A defining moment came in 2001, when one of his ex-colleagues invited him to join Warner Bros’ ESC Entertainment in Alameda, California, to work on The Matrix Reloaded and The Matrix Revolutions. His main task was rigid-body simulation – no trivial matter given the films’ many fight scenes. “There’s a big fight scene, called the Burly Brawl, where hundreds of digital actors get thrown around like skittles,” he says. “We wanted to add realism by simulating the physics of these colliding bodies. The initial tests looked physical, but lifeless, so we enhanced the simulation by introducing torque at every joint, calculated from examples of real locomotion. Suddenly these rag-dolls came to life and you’d find yourself wincing in sympathy as they were battered about.” The sequences took dozens of artists and technicians months of work to create just a few seconds of the movie.

DNEG chief scientist Oliver James started out in physics. Credit: DNEG

Following his work at ESC Entertainment, James moved back to London and, after a short period at the Moving Picture Company, joined Double Negative (renamed DNEG in 2018) in 2004. He had been attracted by Christopher Nolan’s film Batman Begins, for which the firm was creating visual effects, and it was the beginning of a long and creative journey that would culminate in the sci-fi epic Interstellar, which tells the story of an astronaut searching for habitable planets in outer space.

Physics brings the invisible to life
“We had to create a new imagery for black holes; a big challenge even for someone with a physics background,” recalls James. Given that he hadn’t studied general relativity as an undergraduate and had only touched upon special relativity, he decided to call Kip Thorne of Caltech for help. “At one point I asked [Kip] a very concrete question: ‘Could you give me an equation that describes the trajectory of light from a distant star, around the black hole and finally into an observer’s eye?’ This must have struck the right note, as the next day I received an email – it was more like a scientific paper that included the equations answering my questions.” In total, James and Thorne exchanged some 1000 emails, often including detailed mathematical formalism that DNEG could then use in its code. “I often phrased my questions in a rather clumsy way and Kip insisted: ‘What precisely do you mean?’” says James. “This forced me to rethink what was lying at the heart of my questions.”

The result for the wormhole was like a crystal ball reflecting each point of the universe

Oliver James

DNEG was soon able to develop new rendering software to visualise black holes and wormholes. “The director wanted a wormhole with an adjustable shape and size, so we designed one with three free parameters: the length and radius of the wormhole’s interior, and a third parameter describing the smoothness of the transition from its interior to the exterior,” explains James. “The result for the wormhole was like a crystal ball reflecting each point of the universe; imagine a spherical hole in space–time.” Simulating a black hole represented a bigger challenge as, by definition, it is an object that doesn’t allow light to escape. With his colleagues, he developed a completely new renderer that simulates the path of light through gravitationally warped space–time – including gravitational lensing and other physical phenomena that take place around a black hole.
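Those three knobs can be pictured with a toy shape function – an illustrative stand-in, not the exact function published in CQG 32 065001 – in which a cylindrical throat of radius rho and half-length a flares smoothly out into flat space over a width W:

```python
import math

def wormhole_radius(ell, a, rho, W):
    """Toy wormhole shape: circumferential radius r as a function of the
    proper distance ell along the wormhole's axis.  A cylinder of radius
    rho and half-length a (the interior), flaring out over a smoothing
    width W.  Illustrative only -- NOT the published Dneg shape function.
    """
    s = max(abs(ell) - a, 0.0)  # proper distance past the cylinder's end
    # Smooth flare: r = rho at the throat, growing linearly (flat space)
    # far away, with W controlling how gradual the transition is.
    return rho + W * (math.sqrt(1.0 + (s / W) ** 2) - 1.0)

# Inside the interior the radius is constant; outside it grows smoothly.
print(wormhole_radius(0.0, 1.0, 1.0, 0.5))   # 1.0 (at the throat)
print(wormhole_radius(10.0, 1.0, 1.0, 0.5))  # ~9.5 (approaching flat space)
```

Shrinking W sharpens the join between interior and exterior, which is exactly the “smoothness” lever the director could turn.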

Quality standards
On the internet, one can find many images of black holes “eating” other stars or of stars colliding to form a black hole. But producing an image for a motion picture requires totally different quality standards. The high quality demanded of an IMAX image meant that the team had to eliminate any artefacts that could show up in the final picture, and consequently rendering times ran up to 100 hours, compared with the typical 5–6 hours needed for other films. Whereas most astrophysical visualisations aim primarily for fast throughput, the team’s major goal was to create images that looked like they might really have been filmed. “This goal led us to employ a different set of visualisation techniques from those of the astrophysics community—techniques based on propagation of ray bundles (light beams) instead of discrete light rays, and on carefully designed spatial filtering to smooth the overlaps of neighbouring beams,” says James.
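At the core of any such renderer is a null-geodesic integrator. DNEG’s production code traced bundles of rays backwards through the spinning (Kerr) geometry; as a minimal single-ray illustration, the sketch below integrates one light ray in the equatorial plane of a non-spinning (Schwarzschild) black hole and recovers the classic weak-field bending angle 4GM/(c²b):

```python
import math

def deflection_angle(M, b, dphi=1e-4, max_steps=200_000):
    """Bend angle (radians) of a light ray passing a Schwarzschild black
    hole of mass M with impact parameter b (geometrized units G = c = 1).
    Integrates the photon orbit equation u'' = 3*M*u**2 - u, where
    u = 1/r, with a hand-rolled RK4 stepper in the angle phi."""
    def rhs(u, v):
        return v, 3.0 * M * u * u - u

    u, v = 0.0, 1.0 / b        # launched from infinity, aimed with offset b
    phi = 0.0
    for _ in range(max_steps):
        k1u, k1v = rhs(u, v)
        k2u, k2v = rhs(u + 0.5 * dphi * k1u, v + 0.5 * dphi * k1v)
        k3u, k3v = rhs(u + 0.5 * dphi * k2u, v + 0.5 * dphi * k2v)
        k4u, k4v = rhs(u + dphi * k3u, v + dphi * k3v)
        u_new = u + dphi * (k1u + 2 * k2u + 2 * k3u + k4u) / 6.0
        v_new = v + dphi * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
        phi += dphi
        if u_new < 0.0:        # ray has swung past and escaped to infinity
            # interpolate the zero crossing of u; a straight ray sweeps
            # exactly pi, so the excess is the gravitational deflection
            phi_cross = phi - dphi + dphi * u / (u - u_new)
            return phi_cross - math.pi
        u, v = u_new, v_new
    raise RuntimeError("ray captured by the hole or step budget exceeded")

# Weak-field check: for b >> M the bend angle approaches 4M/b.
print(deflection_angle(M=1.0, b=100.0))   # ~0.041 rad (4M/b = 0.04)
```

The film’s renderer differed in two essential ways this sketch ignores: it worked in the Kerr metric of a spinning hole, and it propagated finite-width ray bundles with spatial filtering rather than discrete rays, to avoid the flickering artefacts an IMAX frame would expose.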

A moderately realistic accretion disk gravitationally lensed by a black hole (a) with its colours Doppler- and gravitationally shifted, (b) with its specific intensity shifted in accord with Liouville’s theorem, and (c) showing what the disk would truly look like to an observer near the black hole. Credit: DNEG/CQG 32 065001

DNEG’s team generated a flat, multicoloured ring to stand in for the accretion disk and positioned it around the spinning black hole. The result was an image of the warped space–time around the black hole, accretion disk included. Thorne later wrote in his 2014 book The Science of Interstellar: “You cannot imagine how ecstatic I was when Oliver sent me his initial film clips. For the first time ever – and before any other scientist – I saw in ultra-high definition what a fast-spinning black hole looks like. What it does, visually, to its environment.” The following year, James and his DNEG colleagues published two papers with Thorne on the science and visualisation of these objects (Am. J. Phys. 83 486 and Class. Quantum Grav. 32 065001).

Another challenge was to capture the fact that the film’s camera could be travelling at a substantial fraction of the speed of light. Relativistic aberration, Doppler shifts and gravitational redshifts had to be integrated into the rendering code, influencing how the disk layers would look close to the camera as well as the colour grading and brightness changes in the final image. Things get even more complicated closer to the black hole, where space–time is more distorted: gravitational lensing becomes more extreme and the computation takes more steps. Thorne developed procedures for mapping a light ray and a ray bundle from the light source to the camera’s local sky, and produced low-quality images in Mathematica to verify his code before giving it to DNEG to create the fast, high-resolution render. This was used to simulate all the images to be lensed: fields of stars, dust clouds, nebulae and the accretion disk around Gargantua, Interstellar’s gigantic black hole. In total, the movie notched up almost 800 TB of data. To simulate the starry background, DNEG used the European Space Agency’s Tycho-2 star catalogue, containing about 2.5 million stars; more recently the team has adopted the Gaia catalogue, containing 1.7 billion stars.
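Taken in isolation, each of these camera-frame effects is compact to state. The sketch below (an illustration of the physics, not DNEG’s code) evaluates the special-relativistic Doppler factor and aberration for a camera moving at speed β, plus the gravitational frequency shift between static observers in a Schwarzschild field:

```python
import math

def doppler_factor(beta, cos_theta):
    """Special-relativistic Doppler factor for a camera moving at speed
    beta (in units of c).  cos_theta is the cosine of the angle between
    the camera's velocity and the line of sight TO the source, in the
    rest frame.  Observed frequency = factor * emitted frequency."""
    gamma = 1.0 / math.sqrt(1.0 - beta * beta)
    return gamma * (1.0 + beta * cos_theta)

def aberrated_cos(beta, cos_theta):
    """Aberration: cosine of the same line-of-sight angle as seen in the
    moving camera's frame -- sources crowd towards the flight direction."""
    return (cos_theta + beta) / (1.0 + beta * cos_theta)

def gravitational_shift(r_emit, r_obs, r_s):
    """Frequency ratio nu_obs / nu_emit between static observers at radii
    r_emit and r_obs outside a Schwarzschild hole of horizon radius
    r_s = 2GM/c^2.  Light climbing outwards (r_emit < r_obs) is
    redshifted, so the ratio is below 1."""
    return math.sqrt((1.0 - r_s / r_emit) / (1.0 - r_s / r_obs))

# A camera flying at 0.5c straight towards a star sees it blueshifted
# by sqrt((1 + beta)/(1 - beta)):
print(doppler_factor(0.5, 1.0))   # ~1.732, i.e. sqrt(3)
```

In the renderer these factors act per ray: aberration reshapes the camera’s sky, while the combined Doppler and gravitational shifts rescale each ray’s colour and brightness.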

Creative industry
With the increased use of visual effects, more and more scientists, including mathematicians and physicists, are working in the field. And visual effects are no longer vital only for sci-fi movies; they are also integrated into drama and historical films. Furthermore, a growing number of companies create tailored simulation packages for specific processes. DNEG alone has grown from 80 people in 2004 to more than 5000 today. At the same time, this growth means that software needs to be scalable and adaptable enough to serve a wide range of artists, James explains. “Developing specialised simulation software that gets used locally by a small group of skilled artists is one thing, but making it usable by a wide range of artists across the globe calls for a much bigger effort – to make it robust and much more accessible.”

In March, the Future Circular Collider study in collaboration with CERN’s EP department organised a colloquium at CERN with DNEG’s creative director Paul Franklin, chief technology officer Graham Jack and chief scientist Oliver James. Credit: N Caraban/CERN

Asked whether computational resources are a limiting factor for the future of visual effects, James thinks any increase in computational power will quickly be swallowed up by artists adding extra detail or creating more complex simulations. The game-changer, he says, will be real-time simulation and rendering. Today, video games are rendered in real time by the computer’s video card, whereas visual effects in movies are almost entirely created as batch processes, with the results cached or pre-rendered so that they can be played back in real time. “Moving to real-time rendering means that the workflow will not rely on overnight renders and would allow artists many more iterations during production. We have only scratched the surface and there are plenty of opportunities for scientists.” Machine learning also promises to play a role in the industry, and James is currently involved in R&D that uses it to enable more natural body movements and facial expressions. Open data and open access are also growing areas in which DNEG is actively involved.

“Visual effects is a fascinating industry where technology and hard science are used to solve creative problems,” says James. “Occasionally the roles get reversed and our creativity can have a real impact on science.”
