Computer-generated imagery (CGI) has taken over Hollywood in the last 50 years. Gone are the days of ketchup being substituted for blood in schlocky movies. No longer do directors destroy massive, expensive set pieces to get a single shot, unless they’re Christopher Nolan. Why would a studio bother spending the money to hire those with the specialized knowledge necessary to create convincing practical effects?
This wasn’t always how movies were made. The first use of CGI in a live-action feature film came in 1958’s “Vertigo,” though it was limited to the opening credits. CGI wasn’t used within a film’s story until 15 years later, with the release of “Westworld.” These early CGI films were unlike the CGI films of today: they used CGI as a purely post-production tool, and scenes were constructed without any expectation that CGI would replace or alter major elements. Now, directors take into account what can be digitally replaced or altered when designing sets, costumes, and non-human characters. They can even start filming without a finished script, because the setting, costumes, and story can be switched at any time; they’re filming on a green screen, after all.
Part of the reason for this was the quality of CGI at the time. As anyone who has seen a movie from the 1970s through the early 2010s that leaned a little too hard on CGI knows, the technology was not developed enough to completely replace practical effects. That didn’t stop many movies, especially in the 2000s and ’10s, from liberally using CGI as a wholesale replacement for sets, costumes, and non-human characters. A classic example of the flaws of early CGI can be found in the remake of “A Nightmare on Elm Street.” The original, released in 1984, used practical effects exclusively, creating scenes that still hold up today. The 2010 remake, by comparison, relied almost entirely on CGI for its death scenes and gore. The remake is widely disliked, sitting at 14% on Rotten Tomatoes, with critics calling out “an overuse of unforgivable CGI” in their reviews.
It is fair to say that CGI has come a long way in the last decade and a half, with movies embracing it more and more, and usually without the visual assault on the viewer common in the early 2000s. The piece of the story often left out, but one that those who advocate against CGI overuse should acknowledge, is that modern CGI, when implemented well, is hard to spot. Like plastic surgery, you only notice CGI when it’s bad. That is the expectation for CGI, and how it is implemented in the majority of movies. The trouble comes from movies that care more about saving money than about making a good product for the viewer: they skip practical effects, then rush post-production to cut computing costs. The result is effects like those seen in “The Flash” (2023) and “A Nightmare on Elm Street” (2010).
The real problem with the overuse of CGI in modern films is not that it makes films look worse (films that cut corners with CGI would cut corners in other ways if they had to); it’s the death of the art of practical effects. From costuming, which must achieve accuracy in historical pieces or create fantastical outfits for sci-fi epics, to gore, which gets complicated fast given how the human body actually works, immense skill and talent go into each role on a set. These artists have typically learned their techniques from others in the industry or invented them outright. No schools focus on teaching these skills: colleges might offer a few courses on behind-the-scenes work, but those tend to focus on managerial roles or stagecraft. The only way to learn is to land a role in the industry under someone who already knows the craft. As the film industry turns away from practical effects, those jobs become fewer and farther between. And because many of these roles, such as lighting, set design, and costuming, are trades, their skills are less likely to be recorded and preserved for future artists.
Practical effects should be preserved for two central reasons: they force directors to finish scripts and planning before filming, since the shots they get are there to stay unless they can pay for reshoots, and they keep alive the skills of generations of artists, which will be lost without consistent use.
This doesn’t mean that CGI should be thrown out entirely and replaced with practical effects. There are movies, such as “Oppenheimer,” that market themselves on their exclusive use of practical effects, including for the dropping of the atomic bomb. While that is a marvel of filmmaking, it is not a reasonable expectation for the average film. “Oppenheimer” had a budget of around $100 million, $35 million more than the average film budget. The gulf only widens when you consider that the average budget of a horror movie is just $25 million.
Instead of aiming for entirely practical effects, horror movies, and all movies, can employ a combination of CGI and practical effects: designing scenes around practical work and turning to CGI only in post-production to finalize the visuals. This was done to great success in “Weapons” (2025). The movie combines practical gore with CGI enhancements to create enthralling visuals that haunt the viewer long after the credits roll, all on a budget of only $38 million.
- Chloe Ballew