Wednesday, January 14, 2015

Innovative Cinematography - Kadri Nikopensius 2015

Introduction

Today's cinematography is very diverse in the sense that it has been developing rapidly towards digital cinema in recent years. Digital cinematography and computer-generated imagery go hand in hand. One aspect of making cinema now is that film-makers are thinking out new ways to make the creative workflow even more efficient in terms of preproduction. The director and cinematographer work to imagine what their vision would look like, and in the process, new technological ways of working are invented, for instance digital previsualization of the set, prelighting and previsualization of the actors' movements.

In the process of preproduction and tests before the shoot, the cinematographer, in collaboration with technical advisors, special-effects supervisors and the director, thinks of ways to realize the visuals and the story the best way they possibly can. If something seems impossible to make, there is a whole team behind the film helping to make the impossible become reality. In the process, new technology and innovations are invented, for instance pre-programmed camera rigs with smooth heads, or moving sets with moving rigged lights and projections.

The other side of film-making is using already-existing technical innovations. The cinematographer and the team use existing camera solutions as tools to tell a story or to create an artistic whole. Newer technologies in cinematography are tools to manipulate the image: to capture film in low-light situations or with open apertures, to film with bluescreen, greenscreen or projection, to capture images in 3D, to capture film in huge formats for stabilizing or resizing later, or to capture images in narrow spaces where you could never put a 35mm camera. Directors of photography use these possibilities and are creatively inspired by these technical innovations.

In recent years there have been arguments among film-makers about whether it is better to use celluloid film stock for image capture or digital cameras. There are countless arguments about what is considered "better". There are cinematographers who prefer celluloid film over digital because of its colour and dynamic range, while others argue that digital cameras are more flexible in tight spaces and low-light conditions. Of course, these are only a few of the considerations cinematographers have in mind when they choose their medium, the tool of the trade: the camera. But what kinds of cameras are really used in different productions nowadays?



Tools of the Trade

The following chapter gives an insight into several film productions, their nature and the cinematographer's choice of camera.

According to D. Kaufman's article in the September 2014 issue of American Cinematographer magazine, the cinematographer Neville Kidd won a Creative Arts Emmy Award for cinematography last year for Sherlock: His Last Vow. Kidd says he used two Arri Alexas for the majority of the show, as well as a Canon Cinema EOS C300. According to Kaufman, Kidd also used a Vision Research Phantom camera for some high-speed work. Kidd says, "To depict Sherlock's mindset, we attached the C300 to Benedict [Cumberbatch] with a body rig." According to the article, Kidd shot most of the show with Cooke lenses, but also used extreme wide angles, Zeiss 10mm and 12mm lenses, to convey Sherlock's mindset at the right moments. (D. Kaufman, September 2014)

Neville Kidd used two camera models for his film: the Alexa and the Canon C300. Shooting for TV and DVD distribution, he could reduce his format size to HD, so he had plenty of cameras to choose from. The choice of the Alexa seems understandable, because the Alexa is known for its durability and colour sensitivity. Having used both cameras myself, I can understand why the Canon C300 was one of the choices: it can produce sharp, full-sensor HD images in good light conditions while being lightweight and easy to monitor.

Ben Davis, the cinematographer of Guardians of the Galaxy, notes that he used two Arri Alexa cameras on the film, with a third and fourth added on occasion, D. Bankston writes in his article about the cinematographer. "If it's three, I'll work a wide and a tight down one axis and the third camera at 90 degrees. Over the years I've become much more adept at lighting for multiple cameras. [...] I don't mind working with two cameras at 90 degrees from each other," Davis says. The cinematographer chose lenses with anamorphic artifacts and aberrations because he liked the look they gave to the image, stating, "We had a 50mm that said T2.8 on it, but it was more like a T4 and had a lot of edge distortion. I liked the look of it." (D. Bankston, September 2014)

Ben Davis also worked on Arri Alexas, shooting multiple cameras at once. Usually multiple cameras are not that favourable; their use mostly depends on the collaboration of the director and the cinematographer. The cinematographer has to be very clear and fast with his or her lighting decisions when using multiple cameras. The subject has to be lit from several angles, which makes the lighting hard for the gaffer and the DP: where to set the lights? Sometimes lighting has to give ground to other considerations, such as time and performance. In the end, how the actors perform is the most important thing, so a multiple-camera setup is mainly used to capture moments between the actors.

The film Gravity was shot by Emmanuel Lubezki, who captured most of the live-action material with Arri Alexa Classics and wide Arri Master Prime lenses. In an article about Gravity written by Benjamin B, the cinematographer stated, "The Alexa allowed me to shoot ASA 800 native, and it still looked great if I pushed it to 1,200, which made it possible to use the LED sources."
According to Benjamin B's article, Lubezki shot the actors inside a big LED Box where the crew put the camera on a modified Mo-Sys remote head. The remote head was attached to a large, motion-controlled robot arm that could be moved around the actor in a preprogrammed trajectory.
Benjamin B wrote that in addition to using the digital camera, Lubezki shot one scene on 65mm, using an Arri 765 and Kodak Vision3 500T 5219, to create an accent against the rest of the picture. (Benjamin B, November 2013)

Gravity has been one of the most innovative films of recent years in terms of on-set technology and production workflow. Lubezki used Arri Alexas for his production mainly because he was planning to light the actors in a very specific and delicate way. He wanted to use LED light sources, so he needed a light-sensitive camera. Putting the camera on a motion-controlled remote head is also an impressive technological way to imitate the feeling of weightlessness.
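As a small aside of my own (the figures come from Lubezki's quote above; the arithmetic is mine, not the article's), rating the Alexa at ASA 1,200 instead of its native 800 is a push of just over half a stop:

```python
import math

# Exposure-index push expressed in stops: each full stop doubles the
# ASA rating, so the push is log2(rated / native).
native_asa = 800
pushed_asa = 1200
push_in_stops = math.log2(pushed_asa / native_asa)
print(round(push_in_stops, 2))  # 0.58
```

Just over half a stop of push is modest, which is consistent with Lubezki saying the image "still looked great" at 1,200.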

Nine digital camera systems were used on Rush, the film photographed by Anthony Dod Mantle, as described in an article by M. Hope-Jones: "Cameras and recording formats were as follows: Arri Alexas captured ArriRaw on Codex recorders and (as an initial backup) in ProRes 4:4:4:4 to SxS cards; Canon C300s captured in MPEG2 8-bit CanonLog 1920x1080 24p to CF cards; Indiecam GS2K and POV cameras captured in 10-bit raw in 4:2:2 uncompressed QuickTimes to Hyperdeck Shuttle SSDs; the Phantom Flex (used by a splinter unit) captured in CineRaw 12-bit variable resolution and frame rate on CineMags; the V.I.O. POV.HD captured in QuickTime H.264; the Red Epic (used for visual-effects plates) captured in Redcode 5K 5:1; and the Canon 1D and GoPro cameras (both rarely used) captured in QuickTime H.264." The cinematographer states, "Three people staffed our mobile lab: a data wrangler, a DIT and a colorist. At the peak of production, when the race unit was filming, we had to double our manpower to have two shifts of three people working around the clock." (M. Hope-Jones, October 2013)

In a full-length feature film like Rush, I can imagine how difficult it must be to handle all these camera formats and sizes and put them to work on one timeline. But considering the nature of the story, I understand the choices Anthony Dod Mantle made in selecting the right cameras for the film. The Arri Alexa and Canon C300 are represented, as in Neville Kidd's production. Red Epics were used for visual-effects plates and, presumably, the Phantom Flex for slow-motion effects shots. As for the small HD cameras, they were used for the simple reason that other cameras couldn't be: they fit in tight spaces or could be fixed to speeding F1 cars.


Fincher's Gone Girl was photographed by cinematographer Jeff Cronenweth, who used Red Epic Dragons for the production. In M. Goldman's article about the film, Cronenweth's team pointed out several reasons why the Red Dragon was their primary choice. Cronenweth's camera operator Rosenfeld stated, "There were a few occasions in Cape Girardeau when we shot so deep into dusk that most cameras would not have handled it." He continued, "The low-light capability of the camera was outstanding, and the images we got in those conditions looked beautiful."
What is more, the cinematographer explains in the article that the Dragon's 6K sensor gave them additional image information to better control, manipulate, reposition and stabilize the frames in postproduction. That thought was also confirmed by Peter Mavromates, Fincher's longtime postproduction supervisor: "We ended up with a 5K extraction out of a 6K field [that will] be distributed in 4K and 2K. But the 4K and 2K are better when you front-load the quality, which we were able to do with the 6K sensor." (M. Goldman, November 2014)

Cronenweth filmed Fincher's Gone Girl with Red Dragons, presumably because of the 6K sensor and its great performance in low-light situations. The 6K sensor allows the film-maker to better manipulate the footage in postproduction. It is possible to resize the images whether the film is delivered in 2K or 4K resolution, and the 6K capture lets the director reframe and stabilize the image if he or she wants to emphasize something on the screen or to reduce the movement of the camera. These postproduction decisions involve the cinematographer as well: whether to change the image that the cinematographer originally framed and moved is a joint decision of the director and the cinematographer. I will discuss this phenomenon later in the article.
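As a rough, hypothetical illustration (the widths below are generic 6K/4K/2K figures of my own, not the production's exact frame sizes), the reframing headroom that a larger capture leaves over a delivery format is easy to quantify:

```python
# Headroom for repositioning or stabilizing in post: the extra capture
# width, expressed as a fraction of the delivery width.
def reframe_headroom(capture_width, delivery_width):
    return (capture_width - delivery_width) / delivery_width

# A generic 6K-wide capture against 4K and 2K delivery widths:
print(reframe_headroom(6144, 4096))  # 0.5 -> 50% extra width over 4K
print(reframe_headroom(6144, 2048))  # 2.0 -> 200% extra width over 2K
```

The same arithmetic explains Mavromates's point about front-loading quality: even after a 5K extraction, the frame still oversamples a 4K or 2K deliverable.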
For Oblivion, shot by Claudio Miranda, a Sony F65, Red Epic-M and Red Epic-X were used, as described in J. Holben's article about the making of the film. "We wanted to stay away from bluescreen and do as much in-camera as possible," says Miranda. In the article he says that neither he nor the director likes the limitations of bluescreen composites on a set, and they didn't want to end up in a situation where most of the set was CGI. Instead, Miranda had the idea of being old school in a modern way, using front projection with powerful video projectors to create the sky all around the set. (J. Holben, May 2013)

Claudio Miranda used the Sony F65 and Reds for his film. The chosen digital cameras offered the right light sensitivity, which mattered because the film-makers wanted to use projection screens. Had they not used projection, they would have gone with some other digital camera (if CGI had been involved) or a 35mm camera (for built sets). Miranda states in the ASC article, "That's one of the wonderful things about shooting digitally: you can work in very low-light situations and get some beautiful images. I love film, but I know I couldn't have shot Oblivion on film." (J. Holben, May 2013)
Using projection is fascinating in that it offers a lot more than bluescreen or greenscreen. It also affects the lighting: in fact, you can light with the projection itself. And then there is the experience: the actors can see the environment and give a much more naturalistic performance.

The cinematographer of Resident Evil: Afterlife, Glen MacPherson, talks about shooting his film in 3D in an interview by J. Hemphill. The cinematographer states that it is a trend to shoot 3D films digitally, because film-makers can monitor the 3D live on set. Regarding camera choices he stated, "We worked with the Pace Fusion 3-D rigs with Sony F35 cameras and Master Primes." The cinematographer explained why he was using Master Primes: they were sharp and had the right speed, which mattered especially because the Pace rig loses a full stop of light through the mirror. He also used a Phantom Gold for slow-motion shots. MacPherson stated, "I liked the sensitivity of the F35 cameras and the full size sensor. A lot of people will tell you you need depth of field when shooting 3-D, but I like the look of shallow depth when I want to use it in the story." According to the cinematographer, his team was the first, in 2010, to use long lenses like the 100mm and 150mm Master Primes on a 3D mirror rig. "A lot of rigs have mirrors that have trouble resolving anything over 50mm, but Pace has an 'organic' mirror that allowed us to use the longer lenses," MacPherson says.
In the article he mentioned that he liked the fact that the technology is handled all the way through post, so he can concentrate on using it creatively. (J. Hemphill, December 2010)
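A small sketch of my own connects MacPherson's full-stop mirror loss with Davis's earlier remark about a lens marked T2.8 behaving like a T4: each stop halves the light, and T-numbers scale with the square root of the light ratio, so T2.8 to T4 is almost exactly one stop.

```python
import math

# Stops of light lost when a marked T-stop effectively behaves like a
# slower one. T-numbers scale with the square root of the light ratio,
# hence the factor of 2 in front of log2.
def stops_lost(t_marked, t_effective):
    return 2 * math.log2(t_effective / t_marked)

print(round(stops_lost(2.8, 4.0), 1))  # 1.0
```

This is why MacPherson needed fast, sharp primes on the mirror rig: the rig itself already costs a stop before the lens is even considered.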
Glen MacPherson indeed uses his technology creatively, because he is shooting 3D on the set. He likes to make his own decisions in shooting 3D imagery: how shallow or deep something looks on the screen, what is in focus and what is not. Many film productions released in 3D are converted in postproduction houses, where these decisions are no longer made by the cinematographer. Films where 3D is considered already in preproduction and on the set are definitely more refined than 3D conversions made in post houses.

According to an article about the making of Interstellar written by I. Stasukevich, the film was shot with both 65mm IMAX and 35mm Panaflex Millennium XL cameras. Approximately 60 to 70 minutes of the film's 170-minute run time was filmed in 15-perf 65mm with the Imax MSM 9802. The article states that the cinematographer, Hoyte van Hoytema, focused his large-format research on composition and operability. For instance, he said, "Your principles of framing are simpler. The Imax image is 1.43:1, so it's more of a square. Because of the size, the experience is more visceral than observational, so you end up composing much more in the center of the frame." Van Hoytema marveled at how beautiful the IMAX medium is, with so much depth, clarity and size, and wondered what it would be like to do more intimate things with close focus and a short depth of field. He said, "It's beautiful how the Imax lenses render faces. They're like big-format still portraits."
(I. Stasukevich, December 2014)

IMAX is the largest format in cinema today: its 15-perf 65mm film stock is often estimated to exceed a horizontal resolution of 18K. The IMAX negative can be printed to 70mm for distribution in IMAX theaters and to 35mm for ordinary theaters, and transferred to DCPs and other deliverables such as DVDs. On his previous productions, Christopher Nolan worked with Wally Pfister on Inception, The Dark Knight Rises, The Prestige and Memento, where they formed a shared film language.
According to an interview with W. Pfister published on the DP/30 channel on YouTube in 2011, Pfister talked about the director Christopher Nolan and his constant desire to deliver his films in large-scale formats such as IMAX. In the middle of one of their productions together, Nolan wanted to shoot with an IMAX camera handheld, so he could follow the actors. At the time, Pfister was sceptical about handholding the IMAX, because it is a really heavy camera with a heavy film load and a huge body. (DP/30, February 2011)
The Dark Knight Rises, in 2012, was the last production on which Nolan and Pfister worked together. On Interstellar, Nolan was already working with Hoyte van Hoytema, who offered solutions for handheld IMAX and for attaching the IMAX camera to the actor's body.


Digital Capture – An Introduction


In the following chapter I will discuss digital film capture, drawing on the examples above that used digital cameras such as Arri Alexas, Red Dragons, Red Epics, and the Sony F65 and F35. There are many reasons why the digital medium was used for those specific films. For example, digital capture can offer flexibility in the editing room and in colour grading: once there is sufficient material, the director can resize, stabilize or reframe the image if needed, and the colorist can manipulate a 6K image to bring out more clarity or colour. The digital medium also offers a more direct workflow when it comes to computer-generated imagery. There are specifically engineered high-end digital slow-motion cameras, such as the Phantom Flex, which are widely used for slow-motion capture. For films and stories that need especially small or lightweight cameras, it is possible to use head-mounted cameras or GoPros, which can be attached almost anywhere in tight conditions. Digital cameras are efficient in low-light situations and are favoured mainly for that: they offer beautiful image quality in low light with enough information in the blacks. An interesting use of projection also appears in the productions discussed below, where digital cameras capture the actors' performance in an environment created and illuminated by pre-shot footage on a projection, without the need for a separate key light on any actor. All these technological innovations inspire cinematographers to express their visuals through this technology.



Technical Possibilities That Broaden Creative Realizations


There are a number of film productions that use film technology to serve their creative ideas. For instance, in Kaufman's article about last year's Emmys, it was written that Neville Kidd was able to do things in Sherlock that might have looked contrived in a normal drama, including being extreme with framing, camera tricks and rigs. "There aren't many dramas where you can do that, because ordinarily, it takes you away from the story," Kidd says. "But with Sherlock, it adds to the storytelling." (D. Kaufman, September 2014)

Previously I wrote about Cronenweth's and Fincher's collaboration on Gone Girl, where they used Red Dragon digital cameras to capture 6K footage in order to manipulate the image in postproduction. According to a roundtable interview with Cronenweth and other cinematographers, the reason for manipulating the image was for the sake of Gone Girl's story. (THR, December 2014) In Goldman's article about the film, Fincher said, "For us, collecting 6K was simply a way to get to the most pristine 4K, because then we could do all the stuff we wanted to do in post to emphasize the performances we liked." Cronenweth adds in the article, "This has been Fincher's methodology all along: to use the system best equipped to help us get the most appealing images through color science and resolution." (M. Goldman, November 2014)
The collaboration between the cinematographer Cronenweth and the director Fincher is long-standing, and Cronenweth seems happy with its outcome. On the other hand, it seems that every year brings more technical innovations to the film industry, and film-makers keep thinking about how to tame that technology for their own purposes. Decades ago, when only celluloid film cameras were used, the cinematographer was already one of the director's most important collaborators, and still is, but the role has changed over the years. When shooting with a celluloid film camera, the cinematographer was the only person with the right to look into the camera's viewfinder and decide whether the take was good or not, because that was one of their responsibilities. The cinematographer was able to make decisions that are now made by someone else in postproduction. The mentality was one of mutual respect and trust.
According to assistant editor Tyler Nelson in Goldman's article about Gone Girl, they edited the film in Adobe Premiere, which gave them the ability to transcode their media to a smaller format and edit those files on a smaller, 1920x800 timeline. (M. Goldman, November 2014) This is a common workflow for editing high-resolution raw material efficiently. In the article, Nelson continued, "That allowed us to start editing by seeing only what Jeff was framing for on set, and to manipulate the framing, meaning we could move the image up, down, left, right and so on without having to re-transcode the edit media." (M. Goldman, November 2014)
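To see why reframing chosen on a small proxy timeline maps cleanly back to the full-resolution media, here is a hypothetical sketch of my own (the function name and the 6144-pixel capture width are illustrative assumptions, not details from the article): an offset picked on the proxy simply scales by the ratio of the frame widths.

```python
# A reframing offset chosen on the proxy timeline, converted to
# capture-resolution pixels by the ratio of the two frame widths.
def scale_offset(offset_px, proxy_width, capture_width):
    return offset_px * capture_width / proxy_width

# A 64 px nudge on a 1920-wide proxy of an assumed 6144-wide capture:
print(scale_offset(64, 1920, 6144))  # 204.8
```

Because the mapping is a pure scale factor, the editors could reposition freely on the small timeline and apply the same moves to the raw media at conform time, without re-transcoding.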
There is always a reason, be it intuitive or explained by some methodology, for the cinematographer to compose and frame a shot. It is extremely important how the camera is positioned in relation to the actor: how the eyeline is set, what the frame size is, how the framing is composed. These decisions matter for the sake of the story and for the artistic whole. It is still the cinematographer who has the right to decide which kind of framing or camera movement to use. If that decision is made as your creative input, but your creative decisions are then redone by someone else, then the result is, in a way, no longer your own artistic creation.


What Works Well in Shooting 3-D


When it comes to making their own decisions, cinematographers tend to prefer deciding themselves about the visuals they control.
The cinematographer Glen MacPherson talked about 3-D film-making in Hemphill's article in the ASC magazine. Speaking about Resident Evil: Afterlife, he said, "There was never any talk of converting to 3-D in post. Neither Paul nor I are big fans of that process. I like to use 3-D on set as a creative tool. I don't like the idea of shooting it flat and handing it off to a 3-D post house, leaving those decisions to someone else." MacPherson stated, "You learn very quickly what works and what does not in 3-D. There are all sorts of things to look for while operating the camera. Objects on the edge of the frame can be very annoying in 3-D. In extreme cases, an object can be in one eye but not the other. The camera operator is seeing a 2-D image from one eye and may not know he has something on the edge of the other eye. These are all things we look out for collectively. Anderson loves symmetry in his compositions, and that works very well in 3-D. Crazy shaky camera movements do not work well at all. Very fast cutting can be annoying unless you reduce your I/O to almost zero, close to 2-D. Our approach to the action sequences was to slow down the editing pace and play a lot of it in slow motion." (J. Hemphill, December 2010)

Long shots and slow-motion sequences work well in 3-D; the viewer is simply more focused on the movements of the central character or object. Gravity was also released in 3-D, and the choice of having very few cuts, including an extremely long opening shot, had much to do with that release format. The decision was made jointly by the director Alfonso Cuarón and the cinematographer Emmanuel Lubezki. In Benjamin B's article for the ASC magazine the film-makers described what they had learned from making their previous films together. Lubezki states, "The main thing about the plano sequencia [long sequence in Gravity] is that it is immersive. To me, it feels more real, more intimate and more immediate. The fewer the cuts, the more you are with [the characters]; it's as if you're feeling what they're going through in real time. This is something Alfonso and I discovered on Y Tu Mamá También and Children of Men." (Benjamin B, November 2013)



Advantages in Shooting With Projection


Projection screens are in fact a very versatile way of working in the studio when film-makers want to create an environment that can't be achieved with set decoration, and they save a lot of postproduction hassle. There are many advantages to using projection. For instance, it is possible to light with the projection screens themselves when powerful projectors are used. Furthermore, projection creates an atmosphere for the actors and makes it a lot easier for them to react to situations, because they know which context they are in.
Earlier I gave the example of how Oblivion's cinematographer Claudio Miranda used front projection in his film. Projection was also embraced by Christopher Nolan and Van Hoytema during the production of Interstellar.
According to I. Stasukevich's article, Nolan raised the possibility of front projection as an alternative to computer-generated imagery, and in collaboration with the visual-effects supervisor Paul Franklin they decided to try it. Van Hoytema described, "We have sequences where the spacecraft dips toward a planet, and we could move the content dynamically outside the windows while rotating the light coming through the windows." The article notes that the projection elements were fine-tuned over the course of production and that most of them were still enhanced with visual effects in postproduction. On the other hand, Van Hoytema says they always wanted to make it as good as they could in front of the camera, (I. Stasukevich, December 2014) as has been Christopher Nolan's principle on most of his films.

Claudio Miranda followed the same principle on Oblivion, wanting to stay away from bluescreens and greenscreens and to focus on using projection on the set. The director and the DP wanted to avoid a situation where they had to build the environments in CGI. According to J. Holben's article for the ASC magazine, the film-makers used a 270-degree projection around the entire set, and more than 60 layers of video were combined to create a final blended image resolution of 18,288 x 1080 pixels. (J. Holben, May 2013) The projection was bright enough to light the whole set, with bounce needed only occasionally for a close-up. In the article the DP says, "I actually used the light from the projections for much of the lighting in the sky tower. It gave us a huge source that was very beautiful natural light. In some cases, we'd use some additional bounce to bring that light closer [to an actor], but that was it." (J. Holben, May 2013)
The projection screens were big enough to cover most of the exterior environments on the set, but they also had to be rearranged according to the story and the time of day. As for the footage on the projection screens, Miranda describes it as well: "We sent a crew out to Hawaii to shoot sky and cloud plates with three Red Epics, and those were stitched together to create 15K motion plates for the projectors."
The good thing about using projection is that you can use reflective materials: costumes, set-design elements, shiny floors. Films shot with greenscreens and bluescreens, which are digitally manipulated later with special effects and CG added, need test shoots for costumes, make-up and skin tones, so that they don't clash with the keyed background and don't include unsuitable reflective surfaces. In Holben's article Miranda talks about using projection: "This meant our production designer, Darren Gilford, didn't have to compromise in his design for the set — we could have all the glass and shiny surfaces we wanted!" (J. Holben, May 2013)




Technical Innovations: Digital Prelighting and Preprogrammed Camera Moves


Last but not least among the achievements of digital cinematography is the film Gravity. It is thus far the most technically innovative of the films listed here, in terms of both creative progress and the technical side of cinematography. It won seven Academy Awards in 2014, among them the Academy Award for Best Achievement in Cinematography. As for the film itself, practically the only real elements that were shot are the actors: their faces and the space-suit imitations on them. Everything else in the film is pure CGI.

According to an article about the making of Gravity by Benjamin B., the cinematographer Emmanuel Lubezki was thoroughly involved in every stage of creating the film's real and CGI images. In collaboration with his special-effects team, digital gaffers and the director he created a visual simulation for all the actors' movements, lighting and camera movements in the film. “Working with a lot of digital gaffers, I was able to design the lighting for the entire film,” says Lubezki, having a team working with him at all time. (Benjamin B, November 2013)

Because the film was shot with a 3-D release in mind, some main principles of shooting 3-D were observed. As mentioned in the 3-D chapter, the movements and actions were shot in very long takes, especially at the beginning. The shots are extremely fluid and seem very real.


The senior visual-effects producer Charles Howell explains in Benjamin B's article for the ASC magazine, "I think there were only about 200 cuts in the previs animation, [whereas] an average film has about 2,000 cuts. Because these shots had to be mapped out from day one, many of the lengthy shots didn't really change in the three years of shot production. Because we did a virtual prelight of the entire film with Chivo [the cinematographer], the whole film was essentially locked before we even started shooting." The visual-effects supervisor Tim Webber explains how they got the idea to build a virtual simulation of the whole film. "We needed the freedom of a virtual camera," says Webber, "so we created a virtual world and then worked out how to get human performances into that world." (Benjamin B, November 2013)

In large-scale film productions involving computer-generated imagery, separate production advisory companies are hired to help the film-makers create a virtual animation of the planned film. The key people for this kind of planning are the visual-effects supervisor and the special-effects advisors. Usually it is best to start with storyboards, which then lead to a 3-D animation generated in software, to make clear exactly what the cinematographer and the director want. After generating the animation, the visual-effects supervisor, director and cinematographer arrange a test shoot based on that animation to see whether it really works with the required equipment and technology as intended. The Gravity workflow was similar, but much more refined in how detailed the planning was in the animation process: everything was fixed by the time the film-makers went to the shoot, including the choreography, very precise lighting and the camera movements.

The cinematographer explains how they managed to plan the camera movements. “The camera moves are really complex, but we started in the most simple way — first with storyboards, and then with a bunch of puppets and toy versions of the International Space Station and the space shuttle Columbia,” Lubezki states in Benjamin B's article.
The cinematographer notes that in addition to naturalism, he wanted the camera to have an elastic feel, stretching back and forth from one extreme to another. He explains in the article, "We wanted to keep a lot of our shots elastic — for example, to have a shot start very wide, then become very close, and then go back to a very wide shot."

As for the lighting of the film, it was very complex in the sense that it had to match the previsualized CG lighting and its environment exactly; otherwise the footage would have looked non-naturalistic, and therefore artificial and uncanny. It was extremely important for Lubezki that the actors' faces looked realistic. “I had the idea to build a set out of LED panels and to light the actors’ faces inside it with the previs animation,” Lubezki explained in the article. “When you put a gel on a 20K or an HMI, you’re working with one tone, one color. Because the LEDs were showing our animation, we were projecting light onto the actors’ faces that could have darkness on one side, light on another, a hot spot in the middle and different colors. It was always complex, and that was the reason to have the Box.” This was the lighting strategy on the LED box set.

The actors' movements had to match the previsualized animation exactly, because the motion-control camera was programmed to specific moves. According to Benjamin B's article, an Arri Alexa digital camera was mounted on a modified Mo-Sys remote head, which in turn was attached to a large, motion-controlled robot arm that could be moved around the actor along a preprogrammed trajectory. Webber adds in the article that they worked on adapting their technology to make the system more flexible, for instance by adding the ability to adjust the speed of the preprogrammed moves to the actors’ performances. Further innovations were made to the motion system: a special remote head was added for camera operator Peter Taylor to operate, and the Mo-Sys-based remote unit was adapted into a smaller, lighter system so that it would block less light. (Benjamin B, November 2013)
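To make the idea of a speed-adjustable preprogrammed move concrete, here is a minimal sketch in Python. This is purely illustrative: the actual Mo-Sys and robot-arm control software is proprietary, and the function and variable names below are invented. The idea is simply that a preprogrammed move is a list of timed keyframes, and a single speed factor stretches or compresses playback so the move can follow the actor's performance.

```python
# Illustrative sketch only; names are hypothetical, not the real Mo-Sys API.
# A "move" is a list of (time, position) keyframes, position being (x, y, z).

def interpolate_move(keyframes, t, speed=1.0):
    """Return the camera position at real time t, with the preprogrammed
    move played back at the given speed factor (1.0 = as programmed)."""
    t_prog = t * speed  # map real time onto the programmed timeline
    # Clamp to the ends of the move.
    if t_prog <= keyframes[0][0]:
        return keyframes[0][1]
    if t_prog >= keyframes[-1][0]:
        return keyframes[-1][1]
    # Find the surrounding keyframes and interpolate linearly between them.
    for (t0, p0), (t1, p1) in zip(keyframes, keyframes[1:]):
        if t0 <= t_prog <= t1:
            u = (t_prog - t0) / (t1 - t0)
            return tuple(a + u * (b - a) for a, b in zip(p0, p1))

# A toy two-second move from (0, 0, 0) to (2, 0, 1), played at half speed:
move = [(0.0, (0.0, 0.0, 0.0)), (2.0, (2.0, 0.0, 1.0))]
print(interpolate_move(move, 2.0, speed=0.5))  # halfway along the move
```

At half speed, two seconds of real time cover only one second of the programmed timeline, so the camera is halfway through its move; a real system would also interpolate the head's pan, tilt and roll in the same way.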

These technological innovations, such as preprogrammed Mo-Sys remote heads and robot arms, animated previsualization of the actors' movements, and very precise previsualization of the lighting of the whole film, mark a huge step in cinematography. It is common to decide early in pre-production how many shots there will be and how long they will be, but it is remarkable how precise these decisions had to be for the film to work that realistically. With very specific planning of the choreography and the CGI, and by choosing the right tools, the film-makers were able to pull off visual achievements that had not been seen in cinema before. The ideas in these previsualizations were still only ideas, and some of them could have turned out to be impossible, but that is the last thing film-makers are willing to accept. So these creative ideas were used to drive technological innovations that made the ideas happen, which is, in my opinion, an amazing thing to do.



Film Capture on IMAX Format and Anti-CG


Two of the film productions on my list used celluloid film. One of them was Gravity, where Emmanuel Lubezki shot the film's very last scene on 35mm film stock. The other is Christopher Nolan's epic Interstellar, which was captured on 65mm 15-perf film stock and presented in 70mm IMAX format.
First things first: according to an article from ASC written by I. Stasukevich, Van Hoytema was provided with large-format Hasselblad lenses from Imax, and Dan Sasaki supplied the cinematographer with two custom lenses, a 50mm T2 close-focus lens and an 80mm T2 Mamiya (originally made for The Dark Knight Rises). Hoytema was also fitted out with handgrips and a shoulder pad for the IMAX camera, and a shortened MSM 9802 viewfinder was a necessary change in order to handle the 75-pound camera with a 1,000' magazine more easily.
Visual-effects supervisor Paul Franklin claimed in the article that much of Interstellar’s imagery was photographed with miniatures, which means that very little computer-generated imagery was needed. “We saved digital for the stuff we could not do any other way, like a wormhole or a 4,000-foot mountain of water,” says Franklin in the article.
The incredible thing about the heavy IMAX cameras used to shoot Interstellar is that the film-makers could hard-mount them directly to the actors, as Nolan had wanted to on his previous films. Van Hoytema explains, “We built body mounts that were either suspended or placed on a pivoting rig, housing both the actor and the Imax camera. The whole rig could then descend on cables through the zero-G sets.[...] It let us [capture] visceral angles that are normally only possible with a GoPro camera, but in 15-perf 65mm!” (I. Stasukevich, December 2014)



Conclusion


I set out to compare the different cameras cinematographers use for their films. Most of the DPs on my list used digital cameras, because most of their work involved either computer-generated imagery or other cinematic innovations, such as slow motion or modern-day 3-D. Creative visual tools such as projections or shooting in low-light conditions give digital cameras an advantage. Digital cameras offer a convenient workflow, and the images can be easily manipulated: resized, reframed or stabilized. The choice is still a matter of agreement between the cinematographer and the director, and a matter of taste, but the camera itself is a creative tool as well.

Thanks to the film-makers' ideas and the process of imagining what their visions would look like, new technological ways of working are invented. For instance, digital previsualization of the set, prelighting and previsualization of the actors' movements are used in the course of pre-production. New innovations are added to already existing technologies, for instance remote controls for preprogrammed camera heads, or modifications that make film cameras easier to handle. Directors of photography and directors either influence innovations with their creativity or are inspired by innovations to get creative. There are thousands of cameras to choose from, and they are all tools for creativity. Anthony Dod Mantle used nine cameras on Rush, all of them digital. In Christopher Nolan's and Van Hoytema's case, the creative tool was IMAX 65mm on rigs that seemed impossible to handle; they wanted visuals that were as realistic as possible, with colours as realistic as celluloid film could offer, all for the sake of the creative output. The arguments among film-makers over whether it is better to capture images on celluloid film stock or with digital cameras converge on the same thought in the end: it is best to use whatever suits the film production you are working on.
In the end, does it really matter whether it is analog or digital? The choices are made for the sake of the artwork, and that is, again, very subjective: a creative choice.




References


[1] Debra Kaufman. Primetime Prestige. ASC, September 2014 (accessed January 10, 2015)
[2] Douglas Bankston. Space Cases. ASC, September 2014 (accessed January 10, 2015)
[3] Michael Goldman. Questionable Circumstance. ASC, November 2014 (accessed January 10, 2015)
[4] Iain Stasukevich. Cosmic Odyssey. ASC, December 2014 (accessed January 10, 2015)
[5] Jim Hemphill. Q&A With Glen MacPherson, ASC. ASC, December 2010 (accessed January 10, 2015)
[6] Benjamin B. Facing the Void. ASC, November 2013 (accessed January 10, 2015)
[7] Mark Hope-Jones. Full Throttle. ASC, October 2013 (accessed January 10, 2015)
[8] Jay Holben. Surviving the Future. ASC, May 2013 (accessed January 10, 2015)
[9] DP/30 YouTube Channel. Interview with Wally Pfister. February 3, 2011 (accessed January 10, 2015)
[10] THR YouTube Channel. The Cinematographers Roundtable. December 2, 2014 (accessed January 10, 2015)
