The Basics of VFX

The other day, while waiting for a friend at the cinema, I stood in front of a screen on which a bunch of trailers were playing. New York is getting blown up again, some ghost sailors hope to catch a quirky pirate dead or alive, and humans are trying to save the world from robot juggernauts.

As people were coming out of different screenings, I couldn’t help but overhear some typical comments: “Man, those special effects were so sick!”. It seems like films are over-saturated with visual effects nowadays (there’s actually a difference between the terms “visual effects” and “special effects”, more on that later).

But hang on a second. There are old films with visual effects, so these aren’t anything new. Surely the technology back in the day wasn’t as advanced as it is today, so how did they do it? And how do they do it nowadays anyway? And where? By whom? Could I maybe do it too? What, for a living? For real?

Let’s start with that difference between the terms “visual effects” (VFX) and “special effects”, the latter commonly referred to as “practical effects”. Generally, practical effects are those that can be achieved while the scene is being captured, whereas VFX are normally done in post-production and cover effects that would be impossible to achieve in the real world. Besides creating astonishing viewing experiences, this is the first of the three main reasons why VFX are needed. The second arises when there is a practical way of filming the required scene, but doing so would put someone at risk. The third has to do with cost-effectiveness: sometimes it is more practical to use VFX than to film a scene because of issues of scale, location or both, for example when recreating period settings such as World War II or Victorian London.

While we are used to seeing flashy, fancy VFX in action and science-fiction films, they are actually used in almost every film, precisely for the reasons mentioned above. Some argue that these invisible effects are the most impressive of all, because they are integrated so seamlessly that the audience never notices them. A breaking glass, a gunshot, or even an entire car or crowd can be examples of this.

So, how are they done?

The compositing software primarily used for VFX is called Nuke. As an outsider who knew nothing about VFX, I used to believe it was all done in Adobe After Effects; however, Nuke is the industry standard for film. While After Effects is also an industry standard, it is predominantly used for motion graphics, another term not to be confused with VFX. The main difference is that After Effects is layer-based while Nuke is node-based, which makes it quicker to work with. Nuke is also a bit more complicated to use and learn, but offers more functionality than After Effects.
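To make the “node-based” idea more concrete, here is a rough sketch of how a simple composite might be assembled through Nuke’s Python scripting interface (the file paths are invented for the example, and the exact knob settings would depend on the shot). Each operation is a node wired into a graph rather than a layer in a stack.

```python
import nuke  # only available inside Nuke's own Python interpreter

# Read the live-action plate and a CG element (paths are illustrative)
plate = nuke.nodes.Read(file="/shots/sc010/plate.####.exr")
cg    = nuke.nodes.Read(file="/shots/sc010/cg_element.####.exr")

# Colour-correct the CG so it sits better in the plate
grade = nuke.nodes.Grade()
grade.setInput(0, cg)

# Composite the graded CG "over" the background plate
merge = nuke.nodes.Merge2(operation="over")
merge.setInput(0, plate)   # B input: the background
merge.setInput(1, grade)   # A input: the foreground element

# Write the finished comp to disk
out = nuke.nodes.Write(file="/shots/sc010/comp.####.exr")
out.setInput(0, merge)
```

Because every step lives in a graph, changing a node upstream automatically updates everything downstream, which is a big part of why node-based compositing scales so well on complex shots.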

Another big part of VFX is CGI (computer-generated imagery), covering 3D animation, lighting and rendering, for which software such as Maya, Houdini or RenderMan is used. Fun fact: the industry mostly doesn’t run Windows or macOS; most studio machines run Linux.

Who creates the VFX then?

On a global scale, most top VFX houses have their headquarters or branches in the US, the UK (mainly London), Canada (mainly Vancouver and Montréal) and India. In London specifically, most companies are based in Soho. There are also different branches within VFX, with some companies specialising in film and others in TV, commercials or animation. Some of the best-known companies are MPC, Double Negative, Framestore, Milk and The Mill, to name a few.

Going into further detail, a VFX workflow is made up of different steps: match moving, rotoscoping, compositing, matte painting, lighting, rendering and texturing are just a few of them, and each is normally tied to a role of the same name. There are additional roles, such as VFX Producer, VFX Supervisor or Technical Director, and of course, runner. It is worth noting that in bigger companies these roles often specialise in a very particular task, for example hair and fur on creatures and people, or fire and smoke, whereas in smaller companies artists end up doing more varied work.

Now, what do I need to get my foot in the door and how?

Learning to use the software is obviously a major requirement; what is less obvious is that the ideal core skills also include a good understanding of mathematics, design, computer science and physics. With that core knowledge in place, it can be applied to any piece of software, no matter how the tools and technology evolve. For anyone passionate about VFX looking to strengthen their skills, I would recommend reading (and owning) the book “The Art and Science of Digital Compositing” by Ron Brinkmann, often referred to as “the Bible of VFX”.
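To give a flavour of the kind of maths that underpins compositing (the sort of material Brinkmann’s book covers), here is a minimal NumPy sketch of the classic “over” operation, which blends a semi-transparent foreground onto a background. The pixel values are made up purely for illustration.

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Composite a premultiplied foreground 'over' a background (Porter-Duff)."""
    # fg_rgb is assumed to be premultiplied by its alpha, the usual convention in compositing
    return fg_rgb + bg_rgb * (1.0 - fg_alpha)

# A 50%-opaque red pixel composited over a solid blue background
fg_alpha = 0.5
fg_rgb   = np.array([1.0, 0.0, 0.0]) * fg_alpha   # premultiply the foreground
bg_rgb   = np.array([0.0, 0.0, 1.0])
print(over(fg_rgb, fg_alpha, bg_rgb))             # -> [0.5 0.  0.5]
```

Every compositing package implements some version of this formula, and understanding why it works is exactly the kind of transferable knowledge that outlives any single piece of software.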

Another ability that is just as important as any technical skill is teamwork. A single VFX project can last a year, 18 months or even two years, so it is very important to be a good team player who is friendly and easy-going. Since projects can get very stressful, it is essential to be pleasant to deal with and able to produce quality work under pressure.

As for how to get your foot in the door, there are several options. If you don’t have a lot of experience (you should still have a presentable showreel), you can try applying for a runner position at a company. This is one of the lowest positions in film, and you have to be prepared to make lots of tea and coffee for a long time. On the bright side, you will likely find yourself amongst talented artists, and sooner or later, if you work hard and demonstrate interest and initiative, you’ll end up being given tasks actually related to your field of work.

Another alternative to consider is the training schemes that different companies run, such as MPC Academy or Envy Academy; Double Negative has its own Graduate Trainee Scheme, and Framestore runs several courses and training programmes, to name a few.

If you have a bit more experience and feel confident about your skills, you can try applying for junior positions, which open up regularly, as companies are always on the lookout for new talent.

Last but not least, freelancing is another alternative, as companies often hire people for specific tasks on a project-to-project basis.

The VFX industry is a very exciting one, with lots of new potential coming its way thanks to ever-developing advances in VR, 360-degree cameras, virtual cameras and tools such as Lytro. It is also diversifying into areas outside of film, adapting the previsualisations and simulations normally run in pre-production for design purposes in industries as diverse as automotive, jewellery, construction and medicine.

However, the industry might not be suitable for everyone. Stressful deadlines, long hours and frequent relocation can be constants, which can translate into a poor social or family life. But if you have the drive and passion to pursue a career in this industry, finding yourself working on a big-budget film or the next Hollywood blockbuster is a very real and stimulating possibility.

Six Ways Film & Television is Embracing Women and BAME

 

The most prominent debate in film and television today is the representation of women and BAME people. In this article, I hope to present schemes and organisations dedicated to creating diversity and equality in the industry.

 

ONE: NFTS Directing Workshop

The National Film and Television School provides teaching and training for those wishing to work in film and television. It runs several diplomas, master’s degrees, certificates and short courses.

This new initiative for directors has been launched by the NFTS with the aim of increasing the number of women, BAME people and people with disabilities working as directors.

The six selected directors will take part in a two-day introduction in March, followed by an intensive four-week workshop over the summer, culminating in the production of a short film.

The course is free and the deadline is 19th February.

Apply here: https://nfts.co.uk/directing-workshop


TWO: Creative Access

Founded in 2012, Creative Access aims to provide young BAME people with paid training opportunities in creative companies and to support them into full-time employment.

With over 200 media partners offering opportunities, including ITV, the BBC and Channel 4, the organisation is paving the way towards an industry that truly reflects British society.

Want to sign up? Check out the website here: https://creativeaccess.org.uk/

 

THREE: Women In Film & TV UK Mentoring Scheme

Women In Film & TV is a membership organisation run by women supporting women working in the creative media in the UK.

Every year they run a mentoring scheme designed for women with more than 5 years’ experience looking to take a significant step in their career. Over six months participants receive six hours of mentoring contact with an industry figure. There are also seminars, training workshops and networking opportunities.

Free to apply and participate. Find out more here: https://wftv.org.uk/mentoring/

FOUR: BAFTA

In 2019, BAFTA will be adding the BFI Diversity Standards to the eligibility criteria for the Outstanding British Film Award and Outstanding Debut by a British Writer, Director or Producer.

This decision has been controversial in the industry, with some parties believing it is a step too far and restricts filmmaking. In my opinion, it is a bold and much-needed move towards creating an inclusive and equal industry. My only issue with it is that such a move is even needed in the 21st century to promote diverse filmmaking.

Considering that not a single non-white actor was nominated in the acting categories at the 2015 Oscars, a change in the criteria for these awards is definitely overdue.

FIVE: Directors UK

In 2016, Directors UK released a 10-year study of women directors in film, revealing the shocking finding that only 13.6% of all directors working in the last decade were women.

They aim to use the findings of this study to improve the industry for women by campaigning for these 3 specific goals:

  1. 50% of films backed by UK-based public funding bodies to be directed by women by 2020.
  2. Development of the Film Tax Credit Relief system to require all UK films to take account of diversity.
  3. An industry-wide campaign to inform and influence change.

Find out more here: https://www.directors.uk.com/campaigns/gender-equality-in-uk-film-industry#support-our-campaign

 

SIX: Game Changers

In 2016, BFI Film Forever and Creative Skillset launched a workshop called Game Changers specifically for women and BAME filmmakers.

The workshop was run by Kymberlie Andrews, a master trainer and communication coach, and the aim of the two-day programme was to boost confidence, teach pitching and help participants make contacts with like-minded individuals.

For me, this workshop changed my game by opening my eyes to my personality strengths, which has affirmed my future career goals.

Hopefully this opportunity will be renewed for 2017, but only time will tell!

Find out more here: http://gamechangeruk.com/

 

If you know of any opportunities for women and BAME in film and TV then please comment below!


Game Engines and MoCap Suits – The Rise of Instant Digital Visual Effects

Animation director, VFX supervisor and director/producer Mondo Ghulam talks about creating digital visual effects using game engines such as Unreal and motion capture suits. Ghulam has worked on several high-end games, as well as film and television productions, and has over 20 years’ experience; he is currently focusing on his first passion: filmmaking.

The Use of Game Engines and Motion Capture Suits – A Common Practice?

In this day and age, on-screen storytelling can take advantage of a wide range of technologies used to create visual effects in film and video games. Two of the most striking methods adopted by filmmakers are the use of game engines and motion capture. Ghulam highlights the economic and creative progress these technologies have brought to VFX in films at nearly every budget level. As early as A.I. Artificial Intelligence (Steven Spielberg, 2001), a game engine was used to combine live-action elements with a virtual set.

 

MG: ‘In order to give everyone an idea of what they were looking at [on the virtual set], they were tracking the camera in real time and then displayed a low detail version of the background from a game engine that was synched to the camera. It meant that they could look through a monitor and see a composite of the live action against the fully 3D CG background in its early state. So they can set composition, they can work out exactly where to place the camera and how to move it.’

Ghulam explains that game engines have advanced so far that their rendering quality is undeniably high while production costs remain comparatively low.

MG: ‘A project that I’m interested in doing will use a game engine because it will be long format and animated, and I don’t want to have to look for the money to render everything out. So with a game engine, you don’t need to do that, it will just play in real time. You just record it straight out of the computer, and all you might need to do is to buy some graphic cards that will power that to a high degree.’

Another key technological element in visual effects is motion capture. As one of the Kickstarter backers of a new inertial mocap suit, Ghulam talks about how quickly body motion can be recorded with this compact piece of technology.

MG: ‘Ultimately if you can take the heat, you can wear this under a set of clothes, and it works wirelessly. So I can be sitting in this room, doing this interview whilst being recorded for my body motion in regular clothing.’

The animator points out that a full motion capture studio, with dedicated rooms, expensive equipment and a large team of VFX artists and technicians calibrating the environment before shooting, will produce a high-quality end product, but a single person in a suit shows the level of compactness and mobility this technology has already achieved.

MG: ‘Eventually there will be no suits, it will be any room that you like, and things like Xbox Kinect are kinda already doing that. You can sit on your sofa and you can track your movement, it can tell where your hands are, it can see your face, it can even recognize your voice and distinguish you from someone else sitting there with you.’

Accessibility and Limitations - Possibilities vs. Perception

Talking about the rapid shift from large, powerful computers running expensive software to substantially more compact game engines, as well as the internet as a key element for sharing and accessing information, Ghulam describes the technical curve of the past three decades. Game engines have been around for quite a while, yet the possibility for individuals to use them easily and economically has not.

MG: ‘In 1994 there was one magazine, that we could get hold of in Glasgow once every quarter, which had articles dedicated to little 3D projects but ultimately that was it. Now any laptop will run, for example, Unreal who have thousands of excellent videos all very well aggregated within sections, so if you’ve never even touched a game engine before, there is a whole series of very easy to watch videos that will walk you through it and get you doing things, that two hours before, you had no concept of, or any idea that you might even be able to do.’

Despite the progress, there are limitations. One barrier, the animator believes, is perceptual. He argues that although high-impact visuals can still take a huge bite out of a film’s budget due to their cost and complexity, minor corrections and replacements, for example, can now be done by almost anyone at a much more economical level.

MG: ‘I’m gonna make a bold claim here, but you could subscribe to Adobe Creative Cloud, download Adobe After Effects, spend a couple of days looking at tutorials, learn how to track, put an element in and get your shot. These are pretty common tools now that had once been the preserve of a very few people.’
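Ghulam is talking about After Effects, but the underlying idea, tracking features in the footage and pinning a new element to them, can be sketched with free tools as well. Below is a rough illustration using OpenCV rather than After Effects; the file names and placement coordinates are invented for the example, and a production version would need error handling and proper edge blending.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("plate.mp4")      # invented file names, purely for illustration
element = cv2.imread("sign.png")

ok, first = cap.read()
prev_gray = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
h, w = first.shape[:2]
eh, ew = element.shape[:2]

# Corners of the element in its own image, and where we place it in the first frame
src_quad = np.float32([[0, 0], [ew, 0], [ew, eh], [0, eh]])
dst_quad = np.float32([[100, 100], [100 + ew, 100], [100 + ew, 100 + eh], [100, 100 + eh]])

# Features to track, plus a running homography from the first frame to the current one
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200, qualityLevel=0.01, minDistance=10)
H_total = np.eye(3)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Sparse optical flow: where did last frame's features end up in this frame?
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good_old = pts[status.flatten() == 1]
    good_new = new_pts[status.flatten() == 1]

    # Estimate this frame's motion and accumulate it relative to the first frame
    H, _ = cv2.findHomography(good_old, good_new, cv2.RANSAC)
    H_total = H @ H_total

    # Warp the element so it sticks to the tracked background, then paste it over the frame
    moved_quad = cv2.perspectiveTransform(dst_quad.reshape(-1, 1, 2), H_total).reshape(-1, 2)
    M = cv2.getPerspectiveTransform(src_quad, moved_quad.astype(np.float32))
    warped = cv2.warpPerspective(element, M, (w, h))
    mask = warped.sum(axis=2) > 0
    frame[mask] = warped[mask]

    cv2.imshow("comp", frame)
    if cv2.waitKey(1) == 27:             # press Esc to stop
        break

    prev_gray, pts = gray, good_new.reshape(-1, 1, 2)
```

The point is not the specific tool: a couple of days with tutorials for any of these packages gets you tracking and inserting elements, just as Ghulam describes.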

Another set of limitations can be found in facial capture. Whereas motion capture suits are far advanced in their ability to transfer an actor’s movements instantly and accurately, capturing facial movements and expressions is a different story. Ghulam notes that although your face can be recorded, there are still overarching problems with deadness and with capturing not only the physicality but also the psychology of human facial expressions. At the moment, he adds, animating emotions to a believable extent is still the prerogative of high-end productions.

MG: ‘When we are talking about Dawn of the Planet of the Apes (Matt Reeves, 2014) and Avatar (James Cameron, 2009), there is a percentile at the end of the quality scale that just pushed it beyond anything that anyone else was doing at that particular time. And that takes lots of talented people, there is no automated way to get to where any of those films ended up. It takes a lot of really good artistry […] You look into the eyes of those characters and they are real.’

Creating a Virtual Skin - Futurism or Future?

Looking at the processes that have shaped virtual realities and storytelling in the last few decades, many of the things we associate with futurism are already possible in the here and now. Ghulam uses the Xbox 360 and Kinect as prime examples of the level of accessibility motion capture has reached, given that those devices can capture 3D performances and data in the comfort of a living room at very low cost. The animator believes that although, at that low-budget level, the characters are somewhat inaccurate and the resolution is low, in the near future actors will be able to scan their bodies and save their digital selves. Twenty years later, when a role requires a younger version of them, they will rehearse the part, put on a motion capture suit and virtually perform in their younger skin.

MG: ‘There is no reason why this couldn’t be done now, in fact. What we lack is the library of all these great actors that are alive today. We don’t have their 360-degree images from 30 or 40 years ago.’

Ghulam explains that technology which can record the facial performance of an actor and translate it into a virtual character without prior calibration already exists; Industrial Light & Magic (ILM) developed a program that studies the face of an actor who presents themselves to the camera.

This form of capture has also manifested itself in the idea of face-exchange.

MG: ‘There was a research paper earlier this year, it’s a different kind of technology, but they can record your face, then they can record my face, and then I can puppeteer your face with my face.’

The animator highlights the many facets of such highly developed technology and notes that futuristic elements conveyed in early sci-fi films, such as identity theft and the stealing of a person’s face, have potentially become very real. Whilst recognising rapidly improving technology as vitally important to filmmakers, Ghulam is very clear that without dedicated people there would be no progress. He highlights the importance of having skilled and motivated artists, whether on the creative or the technical side, on a VFX or VR project.

MG: ‘A very common question I get asked comes from actors: “This stuff is gonna replace all of us, isn’t it?” And I would say: “You’re in the middle of the floor, there are a hundred cameras around you and 40-odd people behind the scenes. All that technology and all that effort is about capturing something that only you can do. Which part of this tells you that you’re being replaced?”’

Ghulam strongly emphasises the convergence of storytellers, technicians and actors on visual effects productions and the importance of individual input. He stresses that, with the popularity of video tutorials, digital artists can gain and improve their skills at a rapid pace, valuable knowledge is shared instantly, and evolving communities of artists are trying new technologies and pushing their development to new levels. He points to the real-time cinematography demo of Hellblade (Ninja Theory, 2016), a video game whose team captured a character’s performance in real time in the Unreal Engine by leveraging the unique qualities of the actress.

MG: ‘I’m not trying to paint a utopia here. Obviously there are competitive aspects to this, but there has always been a lot of people (in the CG community) willing to share, and it’s these people who are making it possible. It’s also someone that will step on the set and say: “I can do that”, although their minds might be screaming that it is impossible […] but there is nothing quite like seeing it all work in the end. You’re creating things, or helping to create things, that could not be created anywhere else in that way.’

Making the Magic Happen- What’s Next?

Mondo Ghulam is currently working on several short film projects as writer, director and VFX artist, and is planning on moving into feature film production very soon.

www.mondoghulam.com