How 'Star Trek' Changed Visual Effects History
From miniatures and glitter to cutting-edge computer graphics, the franchise has led the way
The Star Trek TV and film series haven’t just chronicled the voyages of the Starship Enterprise; they’ve also served as a showcase for cutting-edge and innovative visual effects techniques. From the models and miniatures of the original TV series, to some of the earliest computer graphics seen on film in Wrath of Khan, to the new ways of achieving complex effects in this weekend’s Star Trek: Beyond, Inverse takes a look at the biggest VFX innovations in Trek history.
Models and Glitter
In the mid-1960s, special effects in films and TV were the domain of miniatures, matte paintings on glass and optical compositing – all on real film. The effects in Star Trek: The Original Series were no different, but the sheer number of shots made for the show, and the need for them to be of “film quality,” required some neat innovation.
The Howard Anderson Company was the original effects studio behind this work, its best-known contribution being the build of the miniature USS Enterprise. The ship was actually realized as two scale models of different sizes, which were then combined with starfield backdrops and planet paintings.
Howard Anderson also devised an effect that would pervade the Trek series for years to come: the transporter, in which crew members were beamed from ship to planet or just about anywhere else. The futuristic effect was achieved in a surprisingly low-tech way for the Original Series, with aluminum powder and old-school optical compositing.
Here’s how it worked: First the person or persons being transported were filmed standing in position. Then they stepped out of frame while the camera captured an empty set. What was then needed was a “mask” of the figures being beamed — essentially an outline of them. A further element required was the glittering or beam effect (this is where the aluminum powder came in). It was photographed separately by dropping the powder from above and lighting it with an intensive light against a black background.
When the original piece of film was combined with the mask (which is like a hole in the piece of film) and the glitter element, the glitter effect showed up only over the outline of the figures. The image of the figures from the original piece of film was faded out, or in, to finally reveal the materialization/dematerialization effect, and this was all achieved with film cameras and optical printing.
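The optical recipe above maps neatly onto modern digital compositing. Here is a toy sketch in Python, an illustration of the same logic only, not the Howard Anderson Company’s actual pipeline: small grayscale arrays stand in for the pieces of film, and the masked glitter plus cross-fade mimic the optical printer steps.

```python
import numpy as np

# Toy grayscale "frames" (values 0.0-1.0). In the optical process these were
# separate pieces of film combined on an optical printer; here they are arrays.
H, W = 4, 6
empty_set = np.full((H, W), 0.2)                      # plate of the empty set
figures = np.full((H, W), 0.2)
figures[1:3, 2:4] = 0.8                               # crew standing in position
glitter = np.random.default_rng(0).random((H, W))     # aluminum-powder element

# The "mask" is an outline of the figures: 1 inside the silhouette, 0 elsewhere.
mask = (figures != empty_set).astype(float)

def transporter_frame(t):
    """t runs 0 -> 1: the figures cross-fade into the empty set while the
    glitter element, confined to the mask, peaks mid-transition."""
    person = (1 - t) * figures + t * empty_set
    sparkle = 4 * t * (1 - t) * glitter * mask
    return np.clip(person + sparkle, 0.0, 1.0)

dematerializing = transporter_frame(0.5)
```

At t = 0 the frame is simply the crew in position; at t = 1 it is the empty set; in between, the glitter shows up only over the outline, just as in the optical version.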
The Pixar Connection
An early Trek film boasts a sequence that is one of the most significant in digital visual effects and computer graphics lore. The terraforming planet sequence, known as the Genesis Demo, in Star Trek II: Wrath of Khan holds the reputation as the first major use of computer graphics in a feature film.
It was created by the Lucasfilm Computer Division, a separate part of Lucasfilm that George Lucas had established to research digital editing and computer graphics applications. It worked alongside the main Lucasfilm effects outfit, Industrial Light & Magic, and would ultimately, and famously, become Pixar.
But before Toy Story, there was the Genesis sequence, containing the kind of CG never before seen on film. This was a big deal. Very few films before Wrath of Khan had made use of computer graphics at all for effects work, and none at this level of complexity; at the time, only miniatures were usually trusted to deliver enough fidelity and creativity for an effects shot.
Alvy Ray Smith, who conceived and directed the Genesis sequence, told Inverse that although it was a “first,” his team was “very confident” they could pull it off. “Not that we were justified in that confidence, but that seems to be the nature of youth,” he adds. “Of course we could do it! But to be fair, I did design the piece to play to our particular strengths at the time.”
Those strengths were in utilizing fractals, particle systems, texture mapping, bump mapping, digital painting and a moving camera (aimed at mimicking a typical planetary flyby that had become popular then on television). They were all computer graphics techniques in their early days, but the project had the benefit of having some of the brightest minds at Lucasfilm on the job.
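Of those techniques, the fractals are perhaps the easiest to demystify. Carpenter’s mountains were built by recursive subdivision; the one-dimensional version of the idea, midpoint displacement, fits in a few lines. This is a sketch of the general technique, not Lucasfilm’s actual code.

```python
import random

def fractal_ridge(levels=6, roughness=0.5, seed=42):
    """1-D midpoint displacement: start with a flat line, then repeatedly
    insert midpoints nudged by a random offset that shrinks at each level.
    The same recursive-subdivision idea, extended to 2-D, produced the
    fractal mountains of the Genesis sequence."""
    random.seed(seed)
    heights = [0.0, 0.0]          # the two endpoints of the ridge line
    amplitude = 1.0
    for _ in range(levels):
        refined = []
        for left, right in zip(heights, heights[1:]):
            mid = (left + right) / 2 + random.uniform(-amplitude, amplitude)
            refined += [left, mid]
        refined.append(heights[-1])
        heights = refined
        amplitude *= roughness    # finer detail gets smaller displacements
    return heights

ridge = fractal_ridge()           # 2**6 + 1 = 65 height samples
```

The randomness is also why the mountain heights were unpredictable enough to collide with the camera path, as Smith recounts below.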
The Genesis Demo became a 67-second long shot showing a torpedo hitting the planet, exploding, causing a shockwave and then showing geological and Earth-like structures appearing. It would be rendered on VAX computers (remember, this is 1982), with some frames taking just five minutes to produce while others required five hours.
Smith says the sequence was “the first use of computer graphics in a successful motion picture” and that he is still incredibly proud of the work. He notes that if you watch closely, there’s a section where the spacecraft clips the fractal mountains on one frame, owing to their randomly generated heights. So Loren Carpenter, the artist behind the fractal graphics, went into the database of imagery, by hand, and “found the offending frame and carved a notch in the mountain at the right place for that one frame. If you watch for that frame, it’s rather obvious and gets a guaranteed chuckle out of an audience who has been told about the fix beforehand.”
How VFX on television changed forever
Before the late 1980s and early 1990s, most episodic television shows were still shot on film, and any effects were mainly done with miniatures; optical compositing was also done on film. Star Trek: The Next Generation (TNG), which began airing in 1987, changed everything. TNG was still shot on film, but the footage was then transferred to video tape. As the series continued, digital tape formats also began to be used. This process enabled a new wave of video-based visual effects to be employed.
Why was this a big deal? Speed. Having footage on video meant things could happen faster with editing and adding effects. It was a hugely necessary step, since in TNG’s first three seasons, each episode would require about 60 to 70 VFX shots, a significant amount back then. And while TNG was still being made, the Star Trek: Deep Space Nine (DS9) series also entered production, requiring a further ramp up of visual effects.
Eric Alba was a visual effects associate on the last two seasons of TNG, two seasons of DS9, and one season of Star Trek: Voyager, and contributed to the film Star Trek: Generations. That means he was there for this entire transition in the TV and film effects world, and he says it “forever influenced the way VFX were made for television.”
But that doesn’t mean all of TNG’s visual effects were only done with video systems. Indeed, spacecraft models were still physically built and filmed as they had been done for decades. However, TNG would advance the art of motion control photography. One development was in the form of using UV black lights on fluorescent orange screens (instead of blue screens) for ship matte passes. “Before that,” explains Alba, “people would use 8 by 4 foam core boards, for contrast, around the ship in multiple passes to do silhouette — white on black — passes, which never worked because the cards would kick light and spill on the ship model.”
There would also be a move during the making of TNG and DS9 to incorporate fully CG versions of the spacecraft and other elements that would normally have been done with miniatures. Although other shows like Babylon 5 and seaQuest DSV had also pioneered digital effects for television, it was these Star Trek series that perhaps gave CG on TV its greatest exposure.
Another important development that might not sound as glamorous, but that Alba highlights as a crucial advancement, was the adoption of digital disk recorders (DDRs) on TNG. “These were 30-second disk buffers of our transferred film footage that allowed us to ping-pong layers of elements on top of each other with no degradation. We could layer hundreds with no worry of compromising fidelity or quality of a final composited shot. Yikes, that was ambitious!” Alba confesses.
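Alba’s “ping-pong” layering is, in essence, repeated digital compositing: because each composite is just arithmetic on stored pixel values, stacking layers costs nothing in quality, unlike analog tape generations. A rough sketch of the idea using the standard “over” operator, with hypothetical element data rather than the actual DDR workflow:

```python
import numpy as np

def over(fg_rgb, fg_a, bg_rgb, bg_a):
    """Porter-Duff 'over': composite a premultiplied foreground onto a background."""
    out_rgb = fg_rgb + bg_rgb * (1.0 - fg_a)
    out_a = fg_a + bg_a * (1.0 - fg_a)
    return out_rgb, out_a

# Stack a hundred element layers (ships, beams, star fields...). Each pass is
# pure arithmetic on stored values, so there is no generational loss.
rng = np.random.default_rng(1)
rgb = np.zeros((2, 2, 3))
alpha = np.zeros((2, 2, 1))
for _ in range(100):
    layer_a = rng.random((2, 2, 1)) * 0.1           # mostly-transparent element
    layer_rgb = rng.random((2, 2, 3)) * layer_a     # premultiplied color
    rgb, alpha = over(layer_rgb, layer_a, rgb, alpha)
```

On tape, each of those hundred passes would have been a copy of a copy; digitally, the final frame is as clean as the first.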
“I think TNG is a seminal series,” adds Alba, “if only because they had to create a capture, shoot and post approach that was fast, modular, scalable and executable in a two to three week production schedule. Considering that they had to produce 26 episodes a year is a large task. And then when DS9 became a second show in parallel production we now had to produce 52 hours of television a year in the same window of air dates.”
J.J. Abrams’s Star Trek Re-imagined Lens Flares
Much has been made of J.J. Abrams’s use of lens flares in his first Star Trek film. It’s fair to say they are dominant in many scenes: when the Enterprise passes in front of a sun or star, for example, or when the ship’s lights are visible as it whizzes past camera, or even on the bridge of the Enterprise as crew members move about. But they are not without motivation; Abrams’s intention was always to suggest that this is the way light and camera lenses behave in the real world.
So real, in fact, that many flares seen in the film were simply the ones captured on set (albeit occasionally achieved with flashlights shone into the lens from just off-camera). They were enhanced by the use of anamorphic lenses, which “squeeze” the film frame and lend a particular look to the final image.
Of course, many of Star Trek’s scenes were not capable of being shot for real — they were digital creations by Industrial Light & Magic, which had to replicate the live action look of lens flare aberrations to Abrams’s taste. In doing so, compositors at the visual effects studio analyzed the way light flared in anamorphic lenses and replicated that in digital form, even creating a tool called SunSpot to achieve it.
ILM sequence supervisor Todd Vaziri was one of the artists behind the lens flare work, and Millimeter Magazine at the time reported on his work on the SunSpot system, explaining that it “essentially combines off-the-shelf software, certain proprietary ILM tools, photographed elements, and several custom paint elements to painstakingly match the flares captured on the negative.”
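The core trick, generating a synthetic flare element and adding it over the CG frame, can be sketched simply. The snippet below is a bare-bones toy stand-in, not ILM’s SunSpot: it builds a horizontal streak, the signature anamorphic artifact, and composites it additively.

```python
import numpy as np

H, W = 64, 96  # frame size for this toy example

def anamorphic_streak(src_x, src_y, intensity=1.0):
    """A synthetic flare element: wide horizontal falloff, tight vertical
    falloff, giving the stretched horizontal streak that anamorphic
    lenses famously produce around bright sources."""
    ys, xs = np.mgrid[0:H, 0:W].astype(float)
    streak = np.exp(-((xs - src_x) / 40.0) ** 2) * np.exp(-((ys - src_y) / 1.5) ** 2)
    return intensity * streak

frame = np.zeros((H, W))                                       # stand-in CG plate
flared = np.clip(frame + anamorphic_streak(48, 32), 0.0, 1.0)  # additive composite
```

The production version matched photographed flare elements on the negative rather than a simple Gaussian, but the compositing principle is the same: the flare is an element layered over the image, not a property of the image itself.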
In visual effects terms, the lens flares were perhaps not technically revolutionary, but they served a key part of the storytelling – always a goal for VFX artists. Vaziri himself suggests on his website, FXRant, that the “flares give the film a unique flavor of spontaneity and intensity, paradoxically giving the film a documentary-style grittiness, as well as a fanciful, otherworldly, abstract quality.”
A New Chance to Show Off Old Favorites
If there’s one kind of visual effect that has evolved with the Trek TV and film franchise, it’s warp speed. Advances in design, computer graphics and compositing techniques have allowed warp speed to become a more visceral, immersive, and impressive visual effect over the years.
In the latest incarnation, Justin Lin’s Star Trek: Beyond, the warp speed effect evolved even further when the film’s visual effects supervisor Peter Chiang suggested the idea of showcasing a more scientific approach.
“Right from the outset I kind of presented the idea of folding space and gravitational lensing,” he told HD Video Pro. Previous warp speed effects had a streak-type look, he notes, but this time the effect was developed as folding space with a ‘warp bubble’ appearance. Luckily, the visual effects studio behind the warp speed effect was Double Negative, which had already carried out significant gravitational lensing research and development for the black hole and wormhole VFX in Christopher Nolan’s Interstellar.
So, like previous films before it, Star Trek: Beyond offers a new chance to show phenomena seen before. And soon filmmakers and effects artists will have another opportunity to continue boldly going where they haven’t gone before with a brand new TV series set to air in January of 2017.