CGI - Computer-Generated Imagery
Computer-generated imagery (CGI) is the application of the field of computer graphics (or, more specifically, 3D computer graphics) to special effects. CGI is used in movies, television programs, commercials, and printed media. Video games most often use real-time computer graphics (rarely referred to as CGI), but may also include pre-rendered "cut scenes" and intro movies that are typical CGI applications; these are referred to as full-motion video (FMV).
CGI is used for visual effects because the quality is often higher and the effects are more controllable than other, more physically based processes, such as constructing miniatures for effects shots or hiring extras for crowd scenes, and because it allows the creation of images that would not be feasible with any other technology. It can also allow a single artist to produce content without the use of actors, expensive set pieces, or props.
The recent accessibility of CGI software and increased computer speeds have allowed individual artists and small companies to produce professional-grade films, games, and fine art from their home computers. This has brought about an Internet subculture with its own set of global celebrities, clichés, and technical vocabulary.
2D CGI was first used in movies in 1973's Westworld, though the first use of 3D imagery came in its sequel, Futureworld (1976), which featured a computer-generated hand and face created by then University of Utah graduate students Edwin Catmull and Fred Parke. The second film to use this technology was Star Wars (1977), for the scenes with the Death Star plans. The first two films to make heavy investments in CGI, Tron (1982) and The Last Starfighter (1984), were commercial failures, causing most directors to relegate CGI to images that were supposed to look as if they had been created by a computer. The first real CGI character was created by Pixar for the film Young Sherlock Holmes in 1985 (not counting the simple polyhedron character Bit in Tron); it took the form of a knight composed of elements from a stained-glass window. Photorealistic CGI did not win over the motion-picture industry until 1989, when The Abyss won the Academy Award for Visual Effects. Industrial Light and Magic (ILM) produced photorealistic CGI visual effects for the film, most notably a seawater creature dubbed the pseudopod, featured in one scene. CGI then took a central role in Terminator 2: Judgment Day (1991), when the T-1000 Terminator villain wowed audiences with liquid-metal and morphing effects fully integrated into action sequences throughout the film. Terminator 2 also won ILM an Oscar for its effects.
It was the 1993 film Jurassic Park, however, that revolutionized the movie industry: its dinosaurs appeared so lifelike, and its CGI and live action were integrated so flawlessly, that it marked Hollywood's transition from stop-motion animation and conventional optical effects to digital techniques.
The following year, CGI was used to create the special effects for Forrest Gump. The most noteworthy effects shot was the digital removal of actor Gary Sinise's legs. Other effects included a napalm strike, fast-moving Ping-Pong balls, and the feather in the title sequence. With Forrest Gump, CGI entered mainstream movies.
2D CGI increasingly appeared in traditionally animated films, where it supplemented the use of hand-illustrated cels. Its uses ranged from digital tweening of motion between frames to eye-catching quasi-3D effects such as the ballroom scene in Beauty and the Beast.
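Digital tweening of the kind described above can be illustrated with a simple linear interpolation between two keyframe positions. This is a deliberately minimal sketch: production animation systems use spline curves and interpolate many channels (rotation, scale, color), not just a 2D point.

```python
def tween(start, end, t):
    """Linearly interpolate between two keyframe points for 0 <= t <= 1."""
    return tuple(a + (b - a) * t for a, b in zip(start, end))

# Generate the in-between frames for a point moving from (0, 0) to (100, 50).
key_a, key_b = (0.0, 0.0), (100.0, 50.0)
frames = [tween(key_a, key_b, t / 4) for t in range(5)]
print(frames)  # [(0.0, 0.0), (25.0, 12.5), (50.0, 25.0), (75.0, 37.5), (100.0, 50.0)]
```

The animator supplies only the two keyframes; the computer fills in the intermediate drawings, which is the labor-saving point of the technique.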
In 1995, the first fully computer-generated feature film, Pixar's Toy Story (distributed by The Walt Disney Company), was a resounding commercial success. Additional digital animation studios such as Blue Sky Studios (Fox), DNA Productions (Paramount Pictures and Warner Bros.), Onation Studios (Paramount Pictures), Sony Pictures Animation (Columbia Pictures), Vanguard Animation (Walt Disney Pictures, Lions Gate Films and 20th Century Fox), Big Idea Productions (Classic Media) and Pacific Data Images (DreamWorks SKG) went into production, and existing animation companies such as The Walt Disney Company began to make the transition from traditional animation to CGI.
Between 1995 and 2005, the average effects budget for a wide-release feature film skyrocketed from $5 million to $40 million. According to one studio executive, as of 2005 more than half of feature films had significant effects.
In the early 2000s, computer-generated imagery became the dominant form of special effects. The technology progressed to the point that it became possible to include virtual stunt doubles that were nearly indistinguishable from the actors they replaced. Camera-tracking software was refined, allowing increasingly complex visual effects that were previously impossible. Computer-generated extras also came to be used extensively in crowd scenes, with advanced flocking and crowd-simulation software. The timeline of CGI in movies shows a detailed list of pioneering uses of computer-generated imagery in film and television.
CGI for films is usually rendered at about 1.4–6 megapixels. Toy Story, for example, was rendered at 1536 × 922 (1.42 megapixels). The time to render one frame is typically around 2–3 hours, with ten times that for the most complex scenes. This time has not changed much in the last decade, as image quality has progressed at the same rate as improvements in hardware: with faster machines, more and more complexity becomes feasible. Exponential increases in GPU processing power, as well as massive increases in parallel CPU power and in storage and memory speed and size, have greatly increased CGI's potential.
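The figures above translate directly into back-of-the-envelope arithmetic. The resolution and hours-per-frame numbers below come from the text; the runtime, frame rate, and render-farm size are purely illustrative assumptions.

```python
# Megapixels for Toy Story's stated render resolution (from the text).
width, height = 1536, 922
megapixels = width * height / 1_000_000
print(f"{megapixels:.2f} MP")  # 1.42 MP

# Rough total render time for a hypothetical feature, using the
# ~2-3 hours-per-frame figure from the text.
frames = 24 * 60 * 77          # 24 fps, assumed 77-minute runtime
hours_per_frame = 2.5          # midpoint of the 2-3 hour range
farm_machines = 100            # assumed render-farm size
total_hours = frames * hours_per_frame / farm_machines
print(f"~{total_hours / 24:.0f} days of wall-clock time on {farm_machines} machines")
```

The arithmetic makes clear why feature CGI is rendered on large parallel farms rather than single workstations: at hours per frame, a film's hundred thousand-plus frames would take decades on one machine.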
In 2001, Square Pictures created the CGI film Final Fantasy: The Spirits Within, which featured highly detailed and photographic-quality graphics. The film was not a box-office success, however, and after creating one more film using a similar visual style (Final Flight of the Osiris, a short film which served as a prologue to The Matrix Reloaded), Square Pictures closed down.
Developments in CGI technologies are reported each year at SIGGRAPH, an annual conference on computer graphics and interactive techniques attended by tens of thousands of computer professionals.
Developers of computer games and 3D video cards strive to achieve the same visual quality on personal computers in real-time as is possible for CGI films and animation. With the rapid advancement of real-time rendering quality, artists began to use game engines to render non-interactive movies. This art form is called machinima.
Creating characters and objects on a computer
3D computer animation combines 3D modeling with programmed movement. Models are constructed out of geometric vertices, faces, and edges in a true 3D coordinate system. Objects are sculpted much like real clay or plaster, working from general forms to specific details with various sculpting tools. A bone/joint system is set up to deform the 3D mesh (e.g., to make a humanoid model walk). In a process called rigging, the virtual marionette is given various controllers and handles for an animator to manipulate. The character "Woody" in Pixar's movie Toy Story, for example, uses 700 specialized animation controllers. In the 2004 film The Day After Tomorrow, designers had to create forces of extreme weather entirely with the help of video references and accurate meteorological data.
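The vertex/face representation and the bone/joint deformation described above can be sketched in a few lines. All names and numbers here are illustrative, not any studio's actual data format or rig.

```python
import math

# A mesh is a list of 3D vertices plus faces that index into it.
vertices = [
    (0.0, 0.0, 0.0),
    (1.0, 0.0, 0.0),
    (1.0, 1.0, 0.0),
    (0.0, 1.0, 0.0),
]
faces = [(0, 1, 2), (0, 2, 3)]  # two triangles forming a quad

def rotate_about_joint(vertex, joint, angle_rad):
    """Deform a vertex by rotating it about a joint in the XY plane --
    the basic operation behind a bone/joint skeleton."""
    x, y, z = vertex
    jx, jy, _ = joint
    dx, dy = x - jx, y - jy
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (jx + dx * c - dy * s, jy + dx * s + dy * c, z)

# An animation "controller" is, at bottom, a named parameter like this
# angle that an animator keyframes to drive the deformation.
joint = (0.0, 0.0, 0.0)
posed = [rotate_about_joint(v, joint, math.radians(90)) for v in vertices]
```

A real rig chains many such joints in a hierarchy and blends each vertex's position across several joints (skinning); the hundreds of controllers on a character like Woody are higher-level handles layered on top of this machinery.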
For the 2005 remake of King Kong, actor Andy Serkis was used to help designers pinpoint the gorilla's prime location in the shots, and his expressions were used to model "human" characteristics onto the creature.
One of the less obvious CGI effects in movies is digital grading. This is a computer process in which sections of the original image are color-corrected using special processing software. A detail that was too dark in the original shot can be lit and enhanced in this post-production process. In The Lord of the Rings, digital grading was used to drain the color from Sean Bean's face as his character died.
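A toy version of that color-drain effect is a per-pixel blend toward gray. This sketch assumes RGB pixels as plain tuples; real grading software operates on whole frames, within artist-drawn masks, and with far more sophisticated transforms.

```python
def desaturate(pixel, amount):
    """Blend an RGB pixel toward its luminance; amount=1.0 gives full grayscale."""
    r, g, b = pixel
    # Rec. 601 luma weights, a common approximation of perceived brightness.
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    return tuple(c + (luma - c) * amount for c in (r, g, b))

# Illustrative values only: drain most of the color from a warm skin tone.
skin_tone = (210, 160, 140)
drained = desaturate(skin_tone, 0.8)
```

Brightening an underexposed detail, the other grading operation mentioned above, is the same idea with a different per-pixel transform (e.g., a gain or gamma curve applied inside a mask).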