Filmmaking

Parts

Film production consists of five major stages:[1]

Development – The first stage, in which the ideas for the film are created, the rights to books or plays are bought, and the screenplay is written. Financing for the project has to be sought and the project greenlit.
Pre-production – Preparations are made for the shoot: the cast and film crew are hired, locations are selected, and sets are built.
Production – The raw elements for the film are recorded during the film shoot.
Post-production – The images, sound, and visual effects of the recorded film are edited.
Distribution – The finished film is distributed and screened in cinemas and/or released to home video.

Development

A screenwriter then writes a screenplay over a period of several months. Once all parties have met and the deal has been set, the film may proceed into the pre-production period.

Pre-production

Main article: Pre-production

In pre-production, every step of actually creating the film is carefully designed and planned.
Master shot

Historically, the master shot was arguably the most important shot of any given scene. All shots in a scene were somehow related to what was happening in the master shot. This is one reason some films from the 1930s and 1940s are considered "stagey" by today's standards. By the 1960s and 1970s, the style of film shooting and editing had shifted to include radical angles that conveyed more subjectivity and intimacy within scenes.[1] Today, the master shot is still an extremely important element of film production, but scenes are no longer built around the master shot in the way they were when professional filmmaking was in its infancy.

Footnotes

Bibliography

Ascher, Steven, and Edward Pincus.

Special effect

[Image caption: A methane bubble bursting]

The illusions or tricks of the eye used in the film, television, theatre, video game, and simulator industries to simulate the imagined events in a story or virtual world are traditionally called special effects (often abbreviated as SFX, SPFX, or simply FX). Special effects are traditionally divided into the categories of optical effects and mechanical effects. Mechanical effects (also called practical or physical effects) are usually accomplished during the live-action shooting. Since the 1990s, computer-generated imagery (CGI) has come to the forefront of special effects technologies.

Developmental history

Early development

In 1856, Oscar Rejlander created the world's first "trick photograph" by combining sections of 30 different negatives into a single image. The "stop trick" was not only the first use of trickery in the cinema; it was also the first type of photographic trickery possible only in a motion picture.

Color era

Planning and use
Storyboard

A storyboard is a graphic organizer in the form of illustrations or images displayed in sequence for the purpose of pre-visualizing a motion picture, animation, motion graphic, or interactive media sequence. The storyboarding process, in the form it is known today, was developed at Walt Disney Productions during the early 1930s, after several years of similar processes being in use at Walt Disney and other animation studios.

Origins

The storyboarding process can be very time-consuming and intricate. Many large-budget silent films were storyboarded, but most of this material was lost during the reduction of studio archives in the 1970s. The form widely known today was developed at the Walt Disney studio during the early 1930s. Storyboarding became popular in live-action film production during the early 1940s and grew into a standard medium for the previsualization of films.

Usage

Film

Theatre

A common misconception is that storyboards are not used in theatre.
Motion graphic design

Motion graphic design is a subset of graphic design that applies graphic design principles in a filmmaking or video production context (or another temporally evolving visual medium) through the use of animation or filmic techniques. Examples include kinetic typography and graphics used as film titles, opening sequences for television, spinning web-based animations, and three-dimensional station identification logos for television channels. Although this art form has been around for decades, it has taken quantum leaps forward in recent years in terms of technical sophistication.

Technology

Creating motion graphics smoothly now requires more than a few tools and practices. Applications such as Maxon Cinema 4D include integrated motion graphics tools, such as the native MoGraph module, and Softimage's ICE can be used for similar purposes.

See also
Continuity editing

Continuity editing is the predominant style of film editing and video editing in the post-production process of filmmaking of narrative films and television programs. The purpose of continuity editing is to smooth over the inherent discontinuity of the editing process and to establish a logical coherence between shots.

Common techniques of continuity editing

Continuity editing can be divided into two categories: temporal continuity and spatial continuity. Within each category, specific techniques either preserve or deliberately break the sense of continuity. An ellipsis is an apparent break in natural time continuity as it is implied in the film's story. Diegetic sound is sound that is understood to have actually occurred within the story during the action being viewed. The match-on-action technique can preserve temporal continuity where there is a uniform, unrepeated physical motion or change within a passage. The montage technique, by contrast, implies no real temporal continuity whatsoever.

Further reading
Compositing

[Image caption: Four images of the same subject with original backgrounds removed and placed over a new background]

Compositing is the combining of visual elements from separate sources into single images, often to create the illusion that all those elements are parts of the same scene. Live-action shooting for compositing is variously called "chroma key", "blue screen", "green screen", and other names. Today, most, though not all, compositing is achieved through digital image manipulation.

Basic procedure

All compositing involves the replacement of selected parts of an image with other material, usually, but not always, from another image.

Typical applications

In television studios, blue or green screens may back news readers to allow the compositing of stories behind them, before being switched to full-screen display.

Physical compositing

In physical compositing the separate parts of the image are placed together in the photographic frame and recorded in a single exposure.

Matting
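As a rough illustration of the matte-and-replace procedure described under "Basic procedure" above, the sketch below keys a green backing out of an RGB image and substitutes pixels from a background plate. It is not from the article: the function name, the threshold value, and the simple green-dominance rule are illustrative assumptions standing in for a production keyer.

```python
import numpy as np

def chroma_key_composite(foreground, background, threshold=60):
    """Replace green-dominant pixels in `foreground` with `background`.

    Both inputs are H x W x 3 uint8 RGB arrays of the same shape.
    A pixel is treated as green-screen backing when its green channel
    exceeds both red and blue by more than `threshold`.
    """
    fg = foreground.astype(np.int16)
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Boolean matte: True where the backing colour should be replaced.
    matte = (g - r > threshold) & (g - b > threshold)
    result = foreground.copy()
    result[matte] = background[matte]
    return result

# Tiny synthetic example: a red square shot against a pure-green backing,
# composited onto a solid blue background.
fg = np.zeros((4, 4, 3), dtype=np.uint8)
fg[...] = (0, 255, 0)          # green backing
fg[1:3, 1:3] = (200, 0, 0)     # the "subject"
bg = np.full((4, 4, 3), (0, 0, 255), dtype=np.uint8)
print(chroma_key_composite(fg, bg)[:, :, 2])  # blue channel shows the swap
```

The hard threshold keeps the sketch short; real keyers compute a soft matte so that hair, motion blur, and semi-transparent edges blend rather than cut.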
Widescreen

For television, the original screen ratio for broadcasts was 4:3 (1.33:1). In the late 2000s, 16:9 (1.78:1) TV displays came into wide use. They are typically used in conjunction with high-definition television (HDTV) receivers or standard-definition (SD) DVD players and other digital television sources. With computer displays, aspect ratios wider than 4:3 are also called widescreen.

Film

History

Widescreen was first used for the film of The Corbett-Fitzsimmons Fight in 1897. In 1930, after experimenting with a system called Fanthom Screen for The Trail of '98 (1928), MGM came out with a system called Realife. By 1932, the Great Depression had forced studios to cut back on needless expense, and it was not until 1953 that wider aspect ratios were again used in an attempt to stem the fall in attendance due, partially, to the emergence of television in the U.S.

Types

Masked (or flat) widescreen was introduced in April 1953.

Television

Computer displays
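To make the ratio arithmetic above concrete, the following sketch (illustrative only; the function and the pixel dimensions are assumptions, not from the article) computes how material of a given aspect ratio fits onto a display, yielding pillarbox bars when the source is narrower than the display and letterbox bars when it is wider.

```python
def fit_into_display(src_aspect, disp_w, disp_h):
    """Scale a source aspect ratio into a display, preserving its shape.

    Returns (image_w, image_h, bar_px): the largest image that fits and
    the combined thickness in pixels of the black bars (letterbox if the
    source is wider than the display, pillarbox if it is narrower).
    """
    disp_aspect = disp_w / disp_h
    if src_aspect > disp_aspect:           # wider than display -> letterbox
        w, h = disp_w, round(disp_w / src_aspect)
        return w, h, disp_h - h
    else:                                  # narrower -> pillarbox
        w, h = round(disp_h * src_aspect), disp_h
        return w, h, disp_w - w

# 4:3 (1.33:1) SD material on a 1920x1080 (16:9) display -> pillarboxed.
print(fit_into_display(4 / 3, 1920, 1080))   # (1440, 1080, 480)
# 2.39:1 scope film on the same display -> letterboxed.
print(fit_into_display(2.39, 1920, 1080))    # (1920, 803, 277)
```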
CinemaScope

CinemaScope is a filming and projection process that anamorphoses (compresses) the image when it is shot and de-anamorphoses it when it is projected. The compression ratio is 2. CinemaScope does not refer directly to an image format but to an anamorphic process, which can be used in 35 mm as well as 16 mm, with different image ratios. The SMPTE defines the standards for the projection aperture of what is called "scope": 35 mm CinemaScope with its lens. The image is compressed during shooting and then stretched during projection so that it is no longer distorted and recovers its panoramic format.

Technique

History

The optical system is based on the Hypergonar, invented in 1926 by the Frenchman Henri Chrétien. In 1955, during the Le Mans race, CinemaScope was used by England.
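Because the squeeze factor is 2, the on-screen aspect ratio is simply the captured ratio multiplied by 2. The sketch below is an illustrative aside, not part of the article; the function name and the 4:3 example frame are assumptions used only to show the arithmetic.

```python
def desqueezed_ratio(captured_ratio, squeeze=2.0):
    """On-screen aspect ratio once the projector undoes the anamorphic squeeze."""
    return captured_ratio * squeeze

# A frame captured at 4:3 (1.33:1) through a 2x anamorphic lens projects
# at roughly 2.66:1; reserving part of the frame width for a soundtrack
# narrows the final ratio.
print(round(desqueezed_ratio(4 / 3), 2))   # 2.67
```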