Virtual cinematography

From Wikipedia, the free encyclopedia

Virtual cinematography is the set of cinematographic techniques performed in a computer-graphics environment. It covers a wide range of subjects, such as photographing real objects, often with a stereo or multi-camera setup, in order to recreate them as three-dimensional objects, as well as algorithms for the automated creation of real and simulated camera angles.

Virtual cinematography allows, among other things, physically impossible camera movements: for example, the so-called bullet-time scenes in The Matrix films, the flow-motion camera moves in David Fincher's films, the virtual camera runs and crowd simulations seen in The Lord of the Rings, and the convincingly real yet entirely nonexistent airport terminal in the television series Pan Am, which aired in 2011–2012.

History

Matrix trilogy

Virtual cinematography came into prominence following the release of The Matrix trilogy, especially the last two films, The Matrix Reloaded and The Matrix Revolutions. The directors, Andy and Larry Wachowski, tasked visual effects supervisor John Gaeta (who coined the phrase) with developing techniques that would allow virtual "filming" of realistic computer-generated imagery. Gaeta, along with George Borshukov, Kim Libreri and their crew at ESC Entertainment, succeeded in creating photorealistic CGI versions of performers, sets, and action. Their work was based on the findings of Paul Debevec et al. on the acquisition and subsequent simulation of the reflectance field of the human face, first acquired using the simplest of light stages in 2000.[1] Famous scenes that would have been impossible or exceedingly time-consuming to produce with traditional cinematography include the "burly brawl" in The Matrix Reloaded, in which Neo fights up to 100 Agent Smiths, and the start of the final showdown in The Matrix Revolutions, in which Agent Smith's cheekbone is punched in by Neo[2] while the digital look-alike remains unhurt.

The Wachowskis' films

Although virtual cinematography in the Matrix trilogy was often used to dazzle the audience, director of photography Bill Pope also deployed the tool in a much more subtle manner. Even these subtler scenes reach a high level of realism, so the audience usually does not notice that it is watching a shot created entirely by visual effects artists using 3D computer-graphics tools.[3]

Other films

Another film series of the same era that relies heavily on virtual cinematography, with trademark virtual camera runs that could not be achieved with conventional cinematography, is the film adaptation of The Lord of the Rings. Other studios and graphics houses at or near the ability to create digital look-alikes in the early 2000s include Sony Pictures Imageworks (Spider-Man 2 and Spider-Man 3, 2004 and 2007), Square Pictures (The Animatrix: Final Flight of the Osiris, a prequel to The Matrix Reloaded, 2003) and Image Metrics (Digital Emily, 2009), followed in the 2010s by Disney (the antagonist CLU in Tron: Legacy, 2010) and Activision (Digital Ira, 2013).

Virtual cinematography has evolved greatly since then and is now used prolifically across a spectrum of digital media formats. Subset technology components of virtual cinematography include computational photography, machine vision, sensor-based volumetric video and image-based rendering.

Methods

Once the 3D geometry, textures, reflectance field and motion capture have been acquired, and an adequate capture and simulation of the BSDF over all needed surfaces has been made, the virtual content can be assembled into a scene within a 3D engine. The scene can then be creatively composed, relit and re-photographed from other angles by a virtual camera, as if the action were happening for the first time.
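
At its core, "re-photographing from another angle" reduces to projecting scene points through a pinhole model for whichever virtual camera pose is chosen. The following is a minimal, self-contained sketch of that idea in plain Python; the function and helper names are illustrative and do not come from any particular 3D engine:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def project(point, eye, target, focal=1.0):
    """Project a world-space point through a pinhole camera at `eye` aimed at `target`."""
    forward = normalize(sub(target, eye))          # camera viewing direction
    right = normalize(cross(forward, (0.0, 1.0, 0.0)))
    up = cross(right, forward)                     # orthonormal camera basis
    rel = sub(point, eye)                          # point in camera-relative coordinates
    x, y, z = dot(rel, right), dot(rel, up), dot(rel, forward)
    return (focal * x / z, focal * y / z)          # perspective divide onto the image plane

# The same fixed scene point, "re-photographed" from two different virtual camera poses:
p = (0.0, 0.5, 0.0)
print(project(p, eye=(0.0, 0.0, -5.0), target=(0.0, 0.0, 0.0)))
print(project(p, eye=(3.0, 0.0, -4.0), target=(0.0, 0.0, 0.0)))
```

Because the scene data is independent of the camera, any number of such virtual camera poses can be tried after the fact, which is exactly what conventional photography cannot do.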

Geometry can be acquired with a 3D scanner or from multiple photographs using a machine-vision technique called photogrammetry. The team at ESC Entertainment used an Arius3D scanner in the making of the Matrix sequels to capture details as fine as 100 µm, such as fine wrinkles and skin pores.[1] Textures can be captured easily from photographs. The reflectance field is captured into BSDFs over the surface of the XYZRGB object using a light stage. Dense (i.e. markerless) motion capture, combined with a multi-camera setup (similar to the bullet-time rig) and a photogrammetric capture technique called optical flow, was used to create the digital look-alikes for the Matrix films.[4]
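
A toy illustration of the photogrammetric idea: if the same point is observed along known rays from two camera positions, its 3D location can be recovered by intersecting the rays, or, when the rays are noisy and skew, by taking the midpoint of their closest approach. The sketch below uses the midpoint method; all names are illustrative, and this is not the Arius3D or ESC pipeline:

```python
import math

def sub(a, b):   return tuple(x - y for x, y in zip(a, b))
def add(a, b):   return tuple(x + y for x, y in zip(a, b))
def scale(v, s): return tuple(x * s for x in v)
def dot(a, b):   return sum(x * y for x, y in zip(a, b))
def normalize(v): return scale(v, 1.0 / math.sqrt(dot(v, v)))

def triangulate(c1, d1, c2, d2):
    """Midpoint triangulation between rays c1 + t*d1 and c2 + s*d2."""
    r = sub(c1, c2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, r), dot(d2, r)
    denom = a * c - b * b              # zero only if the rays are parallel
    t = (b * e - c * d) / denom        # parameter of closest point on ray 1
    s = (a * e - b * d) / denom        # parameter of closest point on ray 2
    p1 = add(c1, scale(d1, t))
    p2 = add(c2, scale(d2, s))
    return scale(add(p1, p2), 0.5)     # midpoint of the two closest points

# Two camera centers observing the same scene point along known rays:
point = (1.0, 2.0, 3.0)
cam1, cam2 = (0.0, 0.0, 0.0), (5.0, 0.0, 0.0)
ray1 = normalize(sub(point, cam1))
ray2 = normalize(sub(point, cam2))
print(triangulate(cam1, ray1, cam2, ray2))  # recovers ≈ (1.0, 2.0, 3.0)
```

Production photogrammetry repeats this for millions of matched features across many photographs and additionally estimates the camera poses themselves, but the geometric core is the same.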

Modification, re-direction and enhancement of the scene are possible as well. The rendered result can appear highly realistic, or rather "photorealistic". Virtual cinematography is the creation process; virtual effects are the stylistic modifications applied within this format; virtual cinema is the result. Its main applications are in the movie, video game, leisure and disinformation industries.

The art of "photographing" computer-generated imagery with a virtual camera is still virtual cinematography in the sense of taking a 2D photo of a three-dimensional model, whereas virtual cinematography in the fuller sense is a process of capturing four-dimensional (XYZT) events into higher-dimensional functions, such as a bidirectional texture function (7D) or a collection of BSDFs over the target.

The advent of virtual worlds has given a new push to this concept since they allow the creation of real-time animation by using camera moves and avatar control techniques that are not possible by using traditional film-making methods.

Retroactively obtaining camera movement data from the captured footage is known as match moving or camera tracking. It is a form of motion estimation.
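
A minimal sketch of the motion-estimation idea underlying match moving: exhaustive block matching, which recovers the integer displacement between two frames by minimizing a normalized sum of absolute differences. Real camera trackers estimate a full 3D camera path from sparse tracked features; this toy version, with illustrative names and synthetic data, only recovers a 2D shift:

```python
def estimate_shift(prev, curr, max_disp=3):
    """Exhaustive search for the (dy, dx) that best maps frame `prev` onto `curr`."""
    h, w = len(prev), len(prev[0])
    best = None
    for dy in range(-max_disp, max_disp + 1):
        for dx in range(-max_disp, max_disp + 1):
            sad, count = 0, 0
            for y in range(h):
                for x in range(w):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:      # only compare overlapping pixels
                        sad += abs(prev[y][x] - curr[yy][xx])
                        count += 1
            score = sad / count        # normalize so small overlaps are not favored
            if best is None or score < best[0]:
                best = (score, dy, dx)
    return best[1], best[2]

# Synthetic frames: a distinct 3x3 pattern shifted down by 1 and right by 2.
prev = [[0] * 8 for _ in range(8)]
curr = [[0] * 8 for _ in range(8)]
v = 1
for y in range(2, 5):
    for x in range(2, 5):
        prev[y][x] = v
        curr[y + 1][x + 2] = v
        v += 1
print(estimate_shift(prev, curr))  # → (1, 2)
```

Match moving generalizes this: many such per-feature motions, combined with a camera model, are solved simultaneously to recover the real camera's trajectory so that CGI can be rendered from matching virtual camera moves.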

Software

See also

References

  1. ^ a b Debevec, Paul; Tim Hawkins; Chris Tchou; Haarm-Pieter Duiker; Westley Sarokin; Mark Sagar (2000). "Acquiring the reflectance field of a human face". ACM. doi:10.1145/344779.344855. Retrieved 2013-07-21.
  2. ^ Borshukov, George (2004). "Making of The Superpunch" (PDF). Presented at Imagina '04.
  3. ^ kaptainkristian. "The Wachowsky sisters films - Invisible Details".
  4. ^ Debevec, Paul; J. P. Lewis (2005). "Realistic human face rendering for "The Matrix Reloaded"". ACM. doi:10.1145/1198555.1198593. Retrieved 2013-08-10.

Further reading

This page was last edited on 29 October 2018, at 14:25
Basis of this page is in Wikipedia. Text is available under the CC BY-SA 3.0 Unported License. Non-text media are available under their specified licenses. Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc. WIKI 2 is an independent company and has no affiliation with Wikimedia Foundation.