
From Wikipedia, the free encyclopedia

In filmmaking, video production, animation, and related fields, a frame is one of the many still images which compose the complete moving picture. The term is derived from the historical development of film stock, in which the sequentially recorded single images look like a framed picture when examined individually.

The term may also be used more generally as a noun or verb to refer to the edges of the image as seen in a camera viewfinder or projected on a screen. Thus, the camera operator can be said to keep a car in frame by panning with it as it speeds past.

YouTube Encyclopedic

  • The History of Frame Rate for Film
  • Film School: Framing Techniques
  • Frame By Frame: Alfred Hitchcock
Transcription

This FilmmakerIQ lesson is proudly sponsored by RØDE Microphones, premium microphones and audio accessories for studio, live, and location recording. Welcome to FilmmakerIQ.com. I'm John Hess, and today we'll dive into the history of frame rates.

Let's establish a basic truth about film: nothing in the movies is real. The sets are fake, the actors are pretending and reciting lines written for them, and even the very essence of moving pictures is a lie. Nothing is actually moving; it's all an optical illusion. Take for instance this spinning wheel of circles. It looks like it's moving clockwise, but if we compare each frame of this animation, we see that nothing is moving at all. We're just turning off circles sequentially, but when we flash these illustrations one after the other, our brains create a sense of movement. This is called the phi phenomenon, first described by Max Wertheimer in Gestalt psychology in 1912.

The human brain can perceive about 10 to 12 individual frames per second. Faster than that, and our brains blend the images together into motion. So we've found our first frame rate: anything higher than 12 frames per second. Simple, right? Well, not so fast. With film we need to stop the projection as we load up each frame; otherwise we'd have a blurry mess. But playing back 12 frames per second with 12 intermittent periods of black as the film advances creates an intolerable amount of flicker. How fast do you need to flash the images on screen to make the flicker disappear? According to Thomas Edison, the magic number is 46 times per second. At 46 flashes per second, persistence of vision kicks in and we won't notice the screen going dark between frames. But 46 frames per second meant you had to run through a lot of film, and that stuff isn't cheap. Film projectionists came up with a unique solution: flash the same frame on the screen more than once.
Using double- or triple-bladed shutters, you could raise the projected flash rate without running more film. Playing back a 16 frames per second film using a triple-bladed shutter, we would flash each frame three times for a total of 48 flashes per second, just above Edison's recommendation. And that's our first commonly used frame rate for silent film: 16 frames per second, or thereabouts.

The inconsistency of silent film frame rates has driven film historians and preservationists nuts. Early 20th-century cameras and projectors were hand cranked, and cinematographers would undercrank or overcrank the camera for effect. D.W. Griffith was notorious for undercranking his shots, shooting as low as 12 frames per second. Even Edison ignored his own recommendation. Exhibitors also played fast and loose with the frame rate, sometimes playing films back faster so they could squeeze in one extra showing at the end of the day. In reality, silent film frame rates could range anywhere from 14 to 26 frames per second, but that was okay, as it didn't really ruin the effect of motion pictures. That is, until sound came into the picture.

The introduction of sound was one of the most drastic technological and artistic changes in all of motion picture history. Because sound was recorded as an optical track that ran alongside the film strip, recording and playback had to be kept at a very strict and even frame rate, and that frame rate would be established internationally in 1929 as 24 frames per second. Why 24? They found that the audio track just didn't have enough fidelity on a 16 frames per second system. Keeping 48 projected flashes per second as the goal, they stepped up to the next factor: a 24 frame per second projection using a double-bladed shutter to keep the desired 48 projected flashes per second. Why 24 and not 23 or 25? That comes down to basic math. 24 is a number that can easily be divided by 2, 3, 4, 6, and 8, so an editor knows right off the bat that half a second is 12 frames.
A third of a second is 8 frames, a quarter is 6 frames, and so on. Why not a higher number like 30 or 32, which have the same factors? Like I said earlier, this stuff ain't cheap. 24 frames was just the lowest easily divisible number that would work for sound. Ironically, the need for a consistent 24 frames per second created headaches in the sound department: the first sound cameras, with their whirring electric motors, were very noisy, forcing camera operators to shoot from a soundproof booth through a window. Technology and design did eventually catch up, but the 24 frames per second rate is still very much with us today, almost culturally ingrained into what we have come to expect from the cinematic experience.

Television had to deal with the same flicker issues that plagued motion picture film, but flashing the same frame on screen twice was not technologically feasible, and engineers were more concerned about bandwidth, something they were trying to conserve for over-the-air television broadcasts. The solution was developed independently by German Telefunken engineer Fritz Schröter in 1930 and in the US by RCA engineer Randall C. Ballard in 1932. To conserve bandwidth and avoid flickering, each frame would be interlaced, that is, broken down into two alternating fields, an upper and a lower field. Each field would be drawn on the screen one after the other in a comb-like pattern. To eliminate intermodulation, a beating distortion caused by hum generated in the electrical current, the refresh rate was set to that of the AC power: in the United States, 60 hertz, so that each field is drawn in a 60th of a second, resulting in a full 30 frames per second.

But the story gets more complicated with the introduction of color. In 1948, the FCC put a moratorium on new television broadcast licenses as it tried to figure out what to do with the newly available UHF spectrum.
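The editor's arithmetic behind 24 frames per second, described earlier in the transcript, can be checked in a few lines (a quick illustrative sketch; the variable names are my own):

```python
# 24 fps divides evenly by 2, 3, 4, 6, and 8, so common fractions
# of a second correspond to whole frame counts.
fps = 24
divisors = (2, 3, 4, 6, 8)
assert all(fps % d == 0 for d in divisors)
print({f"1/{d} s": fps // d for d in divisors})
# half a second = 12 frames, a third = 8, a quarter = 6, and so on
```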
The idea was to introduce a new color system utilizing this higher-frequency bandwidth and let the older VHF channels, which the older TV sets could access, die off. While the FCC was deliberating, TV sales went through the roof, exploding from 1 million sets to just over 10 million in a matter of a few years, and the idea of letting older VHF TV stations die off became impractical. So now the race was on to create a color standard that was compatible with older black-and-white sets.

The NTSC, the board that created the first US TV standards, reconvened, with RCA leading the way using a system first outlined by Georges Valensi in 1938. By breaking the image down into luminance and chrominance, broadcasters could embed a color signal as a subcarrier in the television signal. New color TVs could pick up and interpret this color subcarrier, which would simply be ignored by the older black-and-white TV sets. So far so good, but there was a small problem: the bandwidth used by the color subcarrier could potentially interfere with the audio signal, causing intermodulation beating. The solution was to reduce the frame rate by 0.1%, phasing the color and audio signals so that they would never fully match up. In December 1953, the FCC adopted the RCA system for color broadcast, and we went from 60 fields per second down to 59.94 fields per second, for an effective 29.97 full frames per second. In a mathematically ingenious way of creating a signal for both color and black-and-white television sets, we got these oddball frame rates that are still a big part of modern broadcasting standards.

But that's only if you live in a country that uses the NTSC standard. In 1963, German television manufacturer Telefunken released PAL to the European Broadcasting Union, with regular broadcasts in PAL starting in 1967. PAL was a format designed to solve the color problems that plagued NTSC and to work with the 50 hertz AC power used in Europe and elsewhere in the world.
PAL, along with the similar format SECAM, runs at 50i for an effective 25 frames per second.

So how do we fit cinematic 24 frames per second into a 60i video stream, say for watching movies on video? Let's walk through the process. First, the 24 frames per second film is slowed down by 0.1%, giving us 23.976 frames per second. Doing the math, we need to make 4 frames of 23.976 fit into 5 frames of 29.97. We do this by splitting the frames into fields using a 3:2 pulldown. The first frame is captured onto three fields, the upper, lower, and then upper field: that's one and a half frames. The next frame is captured on the following two fields, lower and then upper. The next frame fills the lower, then the following upper and lower, with the last frame filling the final upper and lower. So we have 3 fields, 2 fields, 3 fields, 2 fields: that's your 3:2 cadence. Unfortunately this process isn't perfect, with the resulting video stream exhibiting recurring telecine judder, which is especially noticeable on long, slow camera movements. Reverse telecine, or reverse 3:2 pulldown, works backwards, reconstructing a true 23.976 or 24p video stream from the 3:2 pulldown 60i footage. Most modern digital cameras can avoid the telecine process altogether and record 23.976 or straight 24 frame rates natively, but some workflows that run video through HDMI cables rated for 60i may still use the 3:2 pulldown.

For telecining film onto PAL or SECAM's 25 frames per second, the process is much simpler. Using a 2:2 pulldown, the 24 frame per second footage is sped up by 4% and each frame is transferred onto two fields, an upper and a lower. The increased speed raises the pitch of the audio by a noticeable 0.679 semitones, a little more than a quarter step musically, but this can be adjusted down using a pitch shifter.

24 frames per second has been the standard for narrative film for nearly a century now.
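The 3:2 pulldown cadence described above can be sketched as a short simulation (an illustrative sketch; `pulldown_3_2` is a hypothetical helper name, not a standard API):

```python
def pulldown_3_2(film_frames):
    """Spread film frames across interlaced fields in a 3:2 cadence:
    the first frame fills 3 fields, the next 2, then 3, then 2, ..."""
    cadence = [3, 2] * (len(film_frames) // 2 + 1)
    fields = []
    for frame, n in zip(film_frames, cadence):
        fields.extend([frame] * n)
    return fields

# 4 film frames (A, B, C, D) become 10 fields = 5 interlaced frames,
# which is how 23.976 fps film fits into 29.97 fps (60i) video.
fields = pulldown_3_2(list("ABCD"))
print(fields)            # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
print(len(fields) // 2)  # 5 video frames
```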
But enterprising filmmakers have tried to push the temporal resolution, or frame rate, higher, trying to reduce motion blur and create a smoother, more realistic look. One notable experiment in high frame rate is Showscan, a 70mm format developed by visual effects wizard Douglas Trumbull, who is famous for developing many of the visual effects for Stanley Kubrick's 2001: A Space Odyssey. Running at 60 frames per second, Showscan created a stronger biometric response in test audiences, but the process never found use in narrative film, being used mainly in motion simulator rides. More recently, Trumbull has worked on a digital Showscan, shooting at 120 frames per second and adjusting the playback anywhere from 24 to 120 frames per second depending on the needs of the shot.

But audiences just haven't warmed to high frame rates in narrative film. The most recent experiment was Peter Jackson's "The Hobbit," presented in 48 frames per second. Variety reviewed the film and complained that the human actors "seemed overlit and amplified in a way that many compared to modern sports broadcasts or daytime television." One projectionist complained that "it looked like a made-for-TV movie." But filmmakers at the technological bleeding edge, people like Peter Jackson and James Cameron, still push for higher frame rates.

Will the future of narrative filmmaking leave 24p behind? The technology is already here: the new 4K standards are capable of up to 120 frames per second. While these high frame rates may be great for recreating the immediacy of sports broadcasts, for really good 3D, or for video games, to this filmmaker there's just something cinematic about the cadence of 24 frames per second. For all its drawbacks in clarity and motion blur, it's just how we grew up watching movies. Maybe the next generation will grow up with high frame rates and see 60p as the new cinematic look. Or maybe not.
Frame rate is the engine behind the cinematic lie, the magic trick that allows us to enter a world not quite real but real enough: a simple defining number shaped by psychology, economics, and clever engineering, all in service to the act of telling stories. So use it. Use that engine and go make something great. I'm John Hess, and I'll see you at FilmmakerIQ.com.

Overview

When the moving picture is displayed, each frame is flashed on a screen for a short time (nowadays, usually 1/24, 1/25 or 1/30 of a second) and then immediately replaced by the next one. Persistence of vision blends the frames together, producing the illusion of a moving image.

The frame is also sometimes used as a unit of time, so that a momentary event might be said to last six frames, the actual duration of which depends on the frame rate of the system, which varies according to the video or film standard in use. In North America and Japan, 30 frames per second (fps) is the broadcast standard, with 24 frames/s now common in production for high-definition video shot to look like film. In much of the rest of the world, 25 frames/s is standard.

In systems historically based on NTSC standards, for reasons originally related to the color subcarrier of the NTSC TV system, the exact frame rate is actually (3579545 / 227.5) / 525 = 29.97002616 fps.[a] This leads to many synchronization problems which are unknown outside the NTSC world, and also brings about hacks such as drop-frame timecode.
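The arithmetic behind the exact NTSC rate can be reproduced directly (a quick sketch of the formula quoted above; the variable names are my own):

```python
# Exact NTSC frame rate from the quoted formula:
# (3,579,545 / 227.5) / 525 ≈ 29.97002616 fps.
subcarrier_hz = 3_579_545              # rounded color subcarrier frequency
line_rate_hz = subcarrier_hz / 227.5   # 227.5 subcarrier cycles per scan line
frame_rate = line_rate_hz / 525        # 525 lines per frame
print(round(frame_rate, 8))            # 29.97002616
# Close to the 30 * 1000/1001 figure often quoted for NTSC.
print(round(30 * 1000 / 1001, 8))      # 29.97002997
```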

In film projection, 24 fps is the norm, except in some special venue systems, such as IMAX, Showscan and Iwerks 70, where 30, 48 or even 60 frame/s have been used. Silent films and 8 mm amateur movies used 16 or 18 frame/s.

Physical film frames

In a strip of movie film, individual frames are separated by frame lines. Normally, 24 frames are needed for one second of film. In ordinary filming, the frames are photographed automatically, one after the other, in a movie camera. In special effects or animation filming, the frames are often shot one at a time.

The size of a film frame varies, depending on the still film format or the motion picture film format. In the smallest 8 mm amateur format for motion picture film, it is only about 4.8 by 3.5 mm, while an IMAX frame is as large as 69.6 by 48.5 mm. The larger the frame size is in relation to the size of the projection screen, the sharper the image will appear.

The size of the film frame of motion picture film also depends on the location of the holes, the size of the holes, the shape of the holes, and the location and type of sound stripe.

The most common film format, 35 mm, has a frame size of 36 by 24 mm when used in a still 35 mm camera where the film moves horizontally, but the frame size varies when used for motion pictures, where the film moves vertically (with the exception of VistaVision and Technirama, where the film moves horizontally). Using a 4-perf pulldown, there are exactly 16 frames in one foot of 35 mm film, leading to film frames sometimes being counted in terms of "feet and frames". The maximum frame size is 18 by 24 mm (silent/full aperture), but this is significantly reduced by the application of sound track(s). A system called KeyKode is often used to identify specific physical film frames in a production.
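The 16-frames-per-foot relationship makes feet-and-frames counts easy to compute (a small sketch; `feet_and_frames` is a hypothetical helper of my own, and the `feet+frames` notation shown is one common convention):

```python
FRAMES_PER_FOOT = 16  # 35 mm film with 4-perf pulldown

def feet_and_frames(frame_count):
    """Convert a frame count to 'feet+frames' notation
    as used when counting 35 mm film."""
    feet, frames = divmod(frame_count, FRAMES_PER_FOOT)
    return f"{feet}+{frames:02d}"

print(feet_and_frames(24))    # 1+08: one second at 24 fps is 1.5 feet
print(feet_and_frames(1000))  # 62+08
```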

Video frames

Historically, video frames were represented as analog waveforms in which varying voltages represented the intensity of light in an analog raster scan across the screen. Analog blanking intervals separated video frames in the same way that frame lines did in film. For historical reasons, most systems used an interlaced scan system in which the frame typically consisted of two video fields sampled over two slightly different periods of time. This meant that a single video frame was usually not a good still picture of the scene, unless the scene being shot was completely still.

With the dominance of digital technology, modern video systems now represent the video frame as a rectangular raster of pixels, either in an RGB color space or a color space such as YCbCr, and the analog waveform is typically found nowhere other than in legacy I/O[clarification needed] devices.

Standards for the digital video frame raster include Rec. 601 for standard-definition television and Rec. 709 for high-definition television.

Video frames are typically identified using SMPTE time code.

Line and resolution

The frame is composed of picture elements, just like a chessboard. Each horizontal set of picture elements is known as a line. The picture elements in a line are transmitted as sine signals, where a pair of dots, one dark and one light, can be represented by a single sine. The product of the number of lines and the maximum number of sine signals per line is known as the total resolution of the frame. The higher the resolution, the more faithful the displayed image is to the original. But higher resolution introduces technical problems and extra cost, so a compromise must be reached in system design between satisfactory image quality and affordable price.

Viewing distance

The key parameter in determining the lowest resolution still satisfactory to viewers is the viewing distance, i.e. the distance between the eyes and the monitor. The total resolution is inversely proportional to the square of the distance. If d is the distance, r is the required minimum resolution and k is the proportionality constant which depends on the size of the monitor, then

r = k / d²

Since the number of lines is approximately proportional to the resolution per line, the total resolution is proportional to the square of the number of lines, and the above relation can also be written as

n = √k / d

where n is the number of lines. That means that the required resolution is proportional to the height of the monitor and inversely proportional to the viewing distance.

Moving picture

In a moving picture (TV) system, the number of frames scanned per second is known as the frame rate. The higher the frame rate, the better the sense of motion. But again, increasing the frame rate introduces technical difficulties, so the frame rate is fixed at 25 (System B/G) or 29.97 (System M). To increase the sense of motion, it is customary to scan the very same frame in two consecutive phases: in each phase only half of the lines are scanned, only the odd-numbered lines in the first phase and only the even-numbered lines in the second. Each scan is known as a field, so the field rate is twice the frame rate.
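The field-rate relationship above is simple to tabulate (an illustrative sketch):

```python
# Each interlaced frame is scanned as two fields (odd lines, then
# even lines), so the field rate is twice the frame rate.
frame_rates = {"System B/G": 25.0, "System M": 29.97}
field_rates = {name: 2 * fps for name, fps in frame_rates.items()}
print(field_rates)  # {'System B/G': 50.0, 'System M': 59.94}
```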

Example (System B)

In system B the number of lines is 625 and the frame rate is 25. The maximum video bandwidth is 5 MHz.[1] The maximum number of sine signals the system is theoretically capable of transmitting per second equals the bandwidth:

5 MHz = 5 000 000 sine signals per second

The system is able to transmit 5 000 000 sine signals in a second. Since the frame rate is 25, the maximum number of sine signals per frame is 200 000. Dividing this number by the number of lines gives the maximum number of sine signals in a line which is 320. (Actually about 19% of each line is devoted to auxiliary services. So the number of maximum useful sine signals is about 260.)
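The System B arithmetic above can be reproduced in a few lines (a sketch; the variable names are my own):

```python
bandwidth_hz = 5_000_000  # maximum video bandwidth, System B
frame_rate = 25
lines = 625

sines_per_frame = bandwidth_hz // frame_rate          # 200,000 per frame
sines_per_line = sines_per_frame // lines             # 320 per line
useful_per_line = round(sines_per_line * (1 - 0.19))  # ~19% auxiliary overhead
print(sines_per_frame, sines_per_line, useful_per_line)  # 200000 320 259
```

The last figure matches the "about 260" useful sine signals per line stated in the text.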

Still frame

A badly chosen still can give a misleading impression: one frame from a video may imply that the content concerns the letter W, while a better-chosen preview frame from the same video correctly implies an interview.

A still frame is a single static image taken from a film or video, which are kinetic (moving) media. Still frames are also called freeze frames, video prompts, previews or, misleadingly, thumbnails, keyframes, poster frames,[2][3] or screen shots/grabs/captures/dumps. Freeze frames are widely used on video platforms and in video galleries to show viewers a preview or a teaser. Many video platforms display a frame from the midpoint of the video by default; some offer the option to choose a different frame individually.[4][5]

Video and film artists sometimes use still frames within the video/film to achieve special effects, like freeze-frame shots or still motion.[6]

Investigations

In criminal investigations it has become common practice to publish still frames from surveillance videos in order to identify suspects and to find more witnesses.[7] Footage of the John F. Kennedy assassination has often been analyzed frame by frame for various interpretations.[8] In medical diagnostics, it is very useful to examine still frames from magnetic resonance imaging videos.[9]

Fourth wall usage

Some humor in animation is based on the fourth wall aspect of the film frame itself, with some animation showing characters leaving what is assumed to be the edge of the film, or showing the film malfunctioning; the latter is often used in films as well. This hearkens back to early cartoons in which characters were aware that they were in a cartoon, specifically that they could look at the credits and be aware of something that isn't part of the story as presented. These jokes include:

  • Split frames – The fourth wall is broken by showing two frames at once, the lower half of the previous frame and the upper part of the next frame, usually with the frame line visible; jokes include a character crossing the frame line itself.
  • Film break – A famous form of the joke, where the film either snaps or is deliberately broken, with the fourth wall often coming into play during the period when, by rights, there should be nothing on screen.
  • Gate hair – A famous form of the joke, where the animator intentionally places fake "gate hairs" within the frame, which one of the animated characters plucks and removes from the frame.
  • Editorial marks – Where those marks which an editor would normally employ on a "work print" to indicate the intended presence of a fade or a dissolve or a "wipe" to the SFX department are animated, and the film follows suit, or doesn't, depending upon the intended effect.
  • Cue marks – Where those marks, usually circular for non-Technicolor titles and "serrated" for Technicolor titles to indicate a reel changeover are animated for a humorous effect. This could also be employed for the famous "false ending" effect, employed even today in popular songs. For Inglourious Basterds, the cue marks for the reel changes of the Nation's Pride pseudo-documentary employed exceptionally large scribed circles with a large "X" scribed within it—marks which would never be utilized in actual editorial practice (motor and changeover cue marks are supposed to be clearly visible to the projectionist, but not obvious to the audience).
  • Exiting the frame – This joke, an extension of the split frames joke, has characters depart from the sides of the frame, sometimes finding themselves falling out of the cartoon entirely.

Notes

  1. ^ In actual practice, the master oscillator is 14.31818 MHz, which is divided by 4 to give the 3.579545 MHz color "burst" frequency, which is further divided by 455 to give the 31468.5275 Hz "equalizing pulse" frequency; this is further divided by 2 to give the 15734.2637 Hz horizontal line rate. The "equalizing pulse" frequency is divided by 525 to give the 59.9401 Hz "vertical drive" frequency, and this is further divided by 2 to give the 29.9700 Hz vertical frame rate. "Equalizing pulses" perform two essential functions: 1) their use during the vertical retrace interval allows the vertical sync to be more effectively separated from the horizontal sync, as these, along with the video itself, are an example of "in band" signaling, and 2) by alternately including or excluding one "equalizing pulse", the required half-line offset necessary for interlaced video may be accommodated.
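The divider chain described in this note can be checked numerically (an illustrative sketch; the variable names are my own):

```python
# NTSC divider chain starting from the 14.31818 MHz master oscillator.
master = 14_318_180              # Hz, master oscillator
burst = master / 4               # 3,579,545 Hz color burst
eq_pulse = master / 455          # ≈ 31,468.5275 Hz equalizing pulses
line_rate = eq_pulse / 2         # ≈ 15,734.2637 Hz horizontal line rate
vertical = eq_pulse / 525        # ≈ 59.9401 Hz vertical drive
frame = vertical / 2             # ≈ 29.9700 Hz interlaced frame rate
print(burst, round(eq_pulse, 4), round(vertical, 4), round(frame, 4))
# 3579545.0 31468.5275 59.9401 29.97
```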

References

  1. ^ Reference Data for Radio Engineers, ITT Howard W.Sams Co., New York, 1977, section 30
  2. ^ Microsoft: Add a poster frame to your video, retrieved 29 June 2014
  3. ^ Indezine: Poster Frames for Videos in PowerPoint 2010 for Windows, retrieved 29 June 2014
  4. ^ Vimeo: How do I change the thumbnail of my video?, retrieved 29 June 2014
  5. ^ MyVideo: Editing my video, retrieved 29 June 2014
  6. ^ Willie Witte: SCREENGRAB, retrieved 29 June 2014
  7. ^ Wistv: Assaults, shooting in Five Points under investigation, retrieved 29 June 2014
  8. ^ "The Other Shooter: The Saddest and Most Expensive 26 Seconds of Amateur Film Ever Made | Motherboard". motherboard.vice.com. Archived from the original on 30 November 2012. Retrieved 11 January 2022.
  9. ^ Lister Hill National Center for Biomedical Communications: A classic diagnosis with a new ‘spin’, retrieved 29 June 2014

This page was last edited on 4 December 2023, at 04:45
Basis of this page is in Wikipedia. Text is available under the CC BY-SA 3.0 Unported License. Non-text media are available under their specified licenses.