
Multiple sub-Nyquist sampling encoding


MUSE (Multiple sub-Nyquist Sampling Encoding),[1] commercially known as Hi-Vision (a contraction of HIgh-definition teleVISION),[1] was a Japanese analog high-definition television system, with design efforts going back to 1979.[2]

It used dot interlacing and digital video compression to deliver 1125-line, 60 field-per-second (1125i60)[2] signals to the home. The system was standardized as ITU-R recommendation BO.786[3] and specified by SMPTE 260M,[4] using the colorimetry matrix specified by SMPTE 240M.[5] As with other analog systems, not all lines carry visible information: MUSE has 1035 active interlaced lines, so the system is sometimes also described as 1035i.[6] It employed two-dimensional filtering, dot interlacing, motion-vector compensation and line-sequential color encoding with time compression to "fold" an original 20 MHz bandwidth source signal into just 8.1 MHz.

Japan began broadcasting wideband analog HDTV signals in December 1988,[7] initially with an aspect ratio of 2:1. The Sony HDVS high-definition video system was used to create content for the MUSE system.[2] By the time of its commercial launch in 1991, digital HDTV was already under development in the United States. Hi-Vision was mainly broadcast by NHK through their BShi satellite TV channel.

On May 20, 1994, Panasonic released the first MUSE LaserDisc player.[8] A number of players were also available from other brands such as Pioneer and Sony.

Hi-Vision continued broadcasting in analog until 2007.


MUSE was developed by NHK Science & Technology Research Laboratories in the 1980s as a compression system for Hi-Vision HDTV signals.

  • Japanese broadcast engineers immediately rejected conventional vestigial sideband broadcasting.
  • It was decided early on that MUSE would be a satellite broadcast format, since Japan could be served economically by satellite broadcasting.
Modulation research
  • Japanese broadcast engineers had been studying various HDTV broadcast types for some time.[9] It was initially thought that SHF, EHF or optical fiber would have to be used to transmit HDTV because of the signal's high bandwidth, and that HLO-PAL would be used for terrestrial broadcast.[10][11] HLO-PAL is a conventionally constructed composite signal (based on Y for luminance and C for chroma, like NTSC and PAL) that uses phase-alternating-by-line encoding with a half-line-offset carrier for the wideband/narrowband chroma components. Only the very lowest part of the wideband chroma component overlapped the high-frequency luminance; the narrowband chroma was completely separated from the luminance. PAF, or phase alternating by field (like the first NTSC color system trial), was also experimented with and gave much better decoding results, but NHK abandoned all composite encoding systems. Satellite transmission requires frequency modulation (FM) because of transponder power limitations, and FM incurs triangular noise: if a subcarrier-based composite signal is frequency modulated, the demodulated chroma signal carries more noise than the luminance. Because of this, the engineers looked[12] at other options and decided[10] to use component emission for satellite broadcast. At one point it seemed that FCFE (Frame Conversion Fineness Enhanced), an I/P-conversion compression system,[13] would be chosen, but MUSE was ultimately picked.[14]
  • Separate transmission of the Y and C components was explored. The MUSE format as transmitted uses separated component signalling. The improvement in picture quality was so great that the original test systems were recalled.
  • One more power saving tweak was made: lack of visual response to low frequency noise allows significant reduction in transponder power if the higher video frequencies are emphasised prior to modulation at the transmitter and de-emphasized at the receiver.

Technical specifications

MUSE's "1125 lines" are an analog measurement that includes non-video scan lines occurring while a CRT's electron beam returns to the top of the screen to begin scanning the next field. Only 1035 lines carry picture information. Digital signals count only the lines (rows of pixels) that have actual detail, so NTSC's 525 lines become 486i (rounded to 480 for MPEG compatibility), PAL's 625 lines become 576i, and MUSE would be 1035i. To convert the bandwidth of Hi-Vision MUSE into "conventional" lines of horizontal resolution (as used in the NTSC world), multiply by 29.9 lines per MHz of bandwidth (NTSC and PAL/SECAM are 79.9 lines per MHz); this factor of 29.9 lines per MHz holds for all current HD systems, including Blu-ray and HD DVD. For MUSE, during a still picture, this gives 598 lines of luminance resolution per picture height and 209 lines of chroma resolution. The horizontal luminance measurement approximately matches the vertical resolution of a 1080-line interlaced image once the Kell factor and interlace factor are taken into account.
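The lines-per-MHz conversion above can be checked with a little arithmetic. This is our own illustrative sketch: the 20 MHz luminance bandwidth is the pre-compression figure quoted in this article, and a roughly 7 MHz chroma bandwidth is assumed here because it reproduces the 209-line chroma figure.

```python
LINES_PER_MHZ_HD = 29.9  # conversion factor for HD systems (see text above)

def lines_of_resolution(bandwidth_mhz, lines_per_mhz=LINES_PER_MHZ_HD):
    """Convert a video bandwidth to TV lines of horizontal resolution."""
    return round(bandwidth_mhz * lines_per_mhz)

luma_lines = lines_of_resolution(20.0)   # 20 MHz pre-compression luminance
chroma_lines = lines_of_resolution(7.0)  # ~7 MHz still-picture chroma (assumed)
```

With these inputs the still-picture figures in the text fall out directly: 598 luminance lines and 209 chroma lines per picture height.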

Key features of the MUSE system:

  • Scanlines (total/active): 1,125/1,035[5]
  • Pixels per line (fully interpolated): 1122 (still image)/748 (moving)
  • Reference clock periods: 1920 per active line[5]
  • Interlaced ratio: 2:1[5]
  • Aspect ratio: 16:9[5]
  • Refresh rate: 59.94 or 60 frames per second[5]
  • Sampling frequency for broadcast: 16.2 MHz
  • Motion-vector compensation: horizontal ±16 samples per frame (at the 32.4 MHz clock); vertical ±3 lines per field
  • Audio: "DANCE" discrete 2- or 4-channel digital audio system: 48 kHz/16-bit for 2-channel stereo (2 front channels), or 32 kHz/12-bit for 4-channel surround (3 front channels + 1 rear channel)
  • Audio compression: DPCM with quasi-instantaneous companding
  • Required bandwidth: 27 MHz[1]


The MUSE luminance signal encodes Y, specified as the following mix of the original RGB color channels:[3]

  Y = 0.212 R + 0.701 G + 0.087 B

The chrominance signal encodes the B − Y and R − Y color-difference signals. By using these three signals (Y, B − Y and R − Y), a MUSE receiver can retrieve the original RGB color components:[3]

  R = Y + (R − Y)
  B = Y + (B − Y)
  G = (Y − 0.212 R − 0.087 B) / 0.701

The system used a colorimetry matrix specified by SMPTE 240M[5][15][16] (with coefficients corresponding to the SMPTE RP 145 primaries, also known as SMPTE-C, in use at the time the standard was created).[17] The chromaticity of the primary colors and white point are:[16][5]

MUSE colorimetry (SMPTE 240M / SMPTE "C")

  Primary       CIE 1931 x   CIE 1931 y
  Red           0.630        0.340
  Green         0.310        0.595
  Blue          0.155        0.070
  White (D65)   0.3127       0.3290

The luma (Y) function is specified as:[5]

  Y = 0.212 R + 0.701 G + 0.087 B

The blue color difference (B − Y) is amplitude-scaled to P_B according to:[5]

  P_B = (B − Y) / 1.826

The red color difference (R − Y) is amplitude-scaled to P_R according to:[5]

  P_R = (R − Y) / 1.576
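The SMPTE 240M transform described above is easy to sketch in code. This is our own illustrative implementation (the function names are not from any standard library); the scale factors are 2·(1 − coefficient), as in the standard.

```python
def rgb_to_ypbpr(r, g, b):
    """SMPTE 240M: luma plus amplitude-scaled color-difference signals."""
    y = 0.212 * r + 0.701 * g + 0.087 * b
    pb = (b - y) / 1.826  # 1.826 = 2 * (1 - 0.087)
    pr = (r - y) / 1.576  # 1.576 = 2 * (1 - 0.212)
    return y, pb, pr

def ypbpr_to_rgb(y, pb, pr):
    """Receiver side: recover B and R, then solve the luma equation for G."""
    b = y + 1.826 * pb
    r = y + 1.576 * pr
    g = (y - 0.212 * r - 0.087 * b) / 0.701
    return r, g, b
```

Note that reference white (R = G = B = 1) yields Y = 1 with zero color difference, since the three luma coefficients sum to one.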

Signal and Transmission

MUSE is a 1125-line system (1035 visible) and is not pulse- and sync-compatible with the digital 1080-line system used by modern HDTV. Originally it was an 1125-line, interlaced, 60 Hz system with a 5/3 (1.66:1) aspect ratio and an optimal viewing distance of roughly 3.3H.

For terrestrial MUSE transmission a bandwidth-limited FM system was devised; the satellite transmission system used uncompressed FM.

The pre-compression bandwidth is 20 MHz for luminance (Y) and 7.425 MHz for chrominance.

The Japanese initially explored the idea of frequency modulation of a conventionally constructed composite signal. This would create a signal similar in structure to the composite NTSC video signal, with the luminance (Y) at the lower frequencies and the chrominance (C) above it. Approximately 3 kW of power would have been required to obtain a 40 dB signal-to-noise ratio for a composite FM signal in the 22 GHz band, which was incompatible with satellite broadcast techniques and bandwidth.

To overcome this limitation, separate transmission of Y and C was adopted. This reduces the effective frequency range and lowers the required power: approximately 570 W (360 W for Y and 210 W for C) would be needed to obtain a 40 dB signal-to-noise ratio for a separate FM signal in the 22 GHz satellite band. This was feasible.

There is one more power saving that comes from the character of the human eye. The lack of visual response to low-frequency noise allows a significant reduction in transponder power if the higher video frequencies are emphasized prior to modulation at the transmitter and then de-emphasized at the receiver. This method was adopted, with crossover frequencies for the emphasis/de-emphasis at 5.2 MHz for Y and 1.6 MHz for C. With this in place, the power requirement drops to about 260 W (190 W for Y and 69 W for C).
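The power figures quoted above can be tied together with a little arithmetic; the decibel comparison below is our own illustration, not a figure from any standard.

```python
from math import log10

composite_w = 3000       # rejected composite-FM estimate, 22 GHz band (W)
separate_w = 360 + 210   # separate Y/C transmission, no emphasis (W)
emphasized_w = 190 + 69  # with pre-emphasis/de-emphasis applied (W)

def db_saving(before_w, after_w):
    """Power reduction expressed in decibels."""
    return 10 * log10(before_w / after_w)

# Going from the composite estimate to the emphasized component scheme
# saves a bit more than 10 dB of transponder power.
total_saving = db_saving(composite_w, emphasized_w)
```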

Sampling systems and ratios

Subsampling in a video system is usually expressed as a three-part ratio: the number of brightness (luma) samples, followed by the number of samples of each of the two color (chroma) components, for each complete sample area. Traditionally the value for brightness is always 4, with the rest of the values scaled accordingly.

A sampling of 4:4:4 indicates that all three components are fully sampled. A sampling of 4:2:0, for example, indicates that the two chroma components are sampled at half the rate of luma both horizontally and vertically, which halves the bandwidth of an uncompressed video signal.

MUSE implements a similar system as a means of reducing bandwidth, but instead of a static sampling ratio, the actual ratio varies with the amount of motion on the screen. In practice, MUSE sampling varies from approximately 4:2:1 to 4:0.5:0.25, depending on the amount of movement. Thus the red-green chroma component has between one-half and one-eighth the sampling resolution of the luma component, and the blue-yellow chroma has half the resolution of the red-green.
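Using the loose three-term convention described above (luma sample count followed by the two chroma sample counts, relative to a base of 4), the fraction of full 4:4:4 data that each ratio carries works out as follows. This is our own illustrative arithmetic:

```python
def data_fraction(y, c1, c2, base=4):
    """Fraction of fully sampled (4:4:4) data carried by a y:c1:c2 ratio."""
    return (y + c1 + c2) / (3 * base)

# MUSE's extremes from the text: low motion vs. heavy motion
low_motion = data_fraction(4, 2, 1)        # ~58% of full-rate data
heavy_motion = data_fraction(4, 0.5, 0.25) # ~40% of full-rate data
```

So even at its least aggressive, MUSE's adaptive sampling sends well under two-thirds of the data a fully sampled signal would require.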

Audio subsystem

MUSE had a discrete 2- or 4-channel digital audio system called "DANCE", which stood for Digital Audio Near-instantaneous Compression and Expansion.

It used differential audio transmission (differential pulse-code modulation) that was not psychoacoustics-based like MPEG-1 Layer II, at a fixed transmission rate of 1350 kbit/s. Like the PAL NICAM stereo system, it used near-instantaneous companding (as opposed to the syllabic companding the dbx system uses) and non-linear 13-bit digital encoding at a 32 kHz sample rate.

It could also operate in a 48 kHz 16-bit mode. The DANCE system was well documented in numerous NHK technical papers and in an NHK-published book issued in the US called Hi-Vision Technology.[18]
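Near-instantaneous companding, as used by DANCE and NICAM, can be sketched as follows: each short block of samples shares one scale factor, chosen so that the loudest sample in the block just fits the reduced word length; quiet blocks therefore keep full precision. This toy model is our own illustration only; the block size, word lengths, and function names are not taken from the DANCE specification.

```python
def compand_block(samples, in_bits=13, out_bits=10):
    """Requantize a block to out_bits, discarding only as many low-order
    bits as the block's peak level forces (one scale factor per block)."""
    drop = in_bits - out_bits
    peak = max((abs(s) for s in samples), default=0)
    keep = 0  # low-order bits we may keep because the block is quiet
    while keep < drop and peak < 2 ** (in_bits - 2 - keep):
        keep += 1
    shift = drop - keep
    return shift, [s >> shift for s in samples]

def expand_block(shift, coded):
    """Receiver side: undo the per-block scaling."""
    return [c << shift for c in coded]
```

Quiet blocks round-trip exactly, while loud blocks lose only a few low-order bits, which is largely masked by the louder signal.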

The DANCE audio codec was superseded by Dolby AC-3 (a.k.a. Dolby Digital), DTS Coherent Acoustics (a.k.a. DTS Zeta 6x20 or ARTEC), MPEG-1 Layer III (a.k.a. MP3), MPEG-2 Layer I, MPEG-4 AAC and many other audio coders. The methods of this codec are described in the IEEE paper:[19]

Real world performance issues

MUSE had a four-field dot-interlacing cycle, meaning it took four fields to complete a single MUSE frame; thus only stationary images were transmitted at full resolution. Because MUSE lowers the horizontal and vertical resolution of material that varies greatly from frame to frame, moving images were blurred. MUSE used motion compensation, so whole-camera pans maintained full resolution, but individually moving elements could be reduced to as little as a quarter of the full frame resolution. Because the mix between motion and non-motion was encoded on a pixel-by-pixel basis, the loss was less visible than one might expect. Later, NHK developed backward-compatible methods of MUSE encoding/decoding that greatly increased resolution in moving areas of the image and also increased the chroma resolution during motion. This so-called MUSE-III system was used for broadcasts starting in 1995, and a very few of the last Hi-Vision MUSE LaserDiscs used it (A River Runs Through It is one such disc). During early demonstrations of the MUSE system, complaints were common about the decoder's large size, which led to the creation of a miniaturized decoder.[1]
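The four-field cycle can be illustrated with a toy model in which each pixel belongs to one of four sample phases and each field refreshes one phase, so a static picture is complete only after all four fields have arrived. The 2×2 phase pattern here is our own simplification, not MUSE's actual sampling lattice:

```python
def phase(x, y):
    """Toy 2x2 phase pattern: which of four fields carries pixel (x, y)."""
    return (x % 2) + 2 * (y % 2)

def fields_until_complete(width, height):
    """Count fields needed before every pixel of a static image is sent."""
    seen = set()
    for field in range(1, 5):
        seen.update((x, y)
                    for y in range(height) for x in range(width)
                    if phase(x, y) == field - 1)
        if len(seen) == width * height:
            return field
    return None
```

In this model any image larger than one pixel needs the full four fields, which is why MUSE reaches its stated resolution only on stationary pictures.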

Shadowing and multipath reception continued to plague this analog frequency-modulated transmission mode.

Japan has since switched to a digital HDTV system based on ISDB, but the original MUSE-based BS satellite channel 9 (NHK BS Hi-Vision) was broadcast until September 30, 2007.

Cultural and geopolitical impacts

Internal reasons inside Japan that led to the creation of Hi-Vision
  • (1940s): The NTSC standard (as a 525 line monochrome system) was imposed by the US occupation forces.
  • (1950s-1960s): Unlike Canada (which could have switched to PAL), Japan was stuck with the US TV transmission standard regardless of circumstances.
  • (1960s-1970s): By the late 1960s many parts of the modern Japanese electronics industry had gotten their start by fixing the transmission and storage problems inherent with NTSC's design.
  • (1970s-1980s): By the 1980s there was spare engineering talent available in Japan that could design a better television system.

MUSE, as the US public came to know it, was initially covered in the magazine Popular Science in the mid-1980s. The US television networks did not provide much coverage of MUSE until the late 1980s, as there were few public demonstrations of the system outside Japan.

Because Japan had its own domestic frequency allocation tables (that were more open to the deployment of MUSE) it became possible for this television system to be transmitted by Ku Band satellite technology by the end of the 1980s.

The US FCC in the late 1980s began to issue directives that would allow MUSE to be tested in the US, provided it could fit into a 6 MHz System-M channel.

The Europeans (in the form of the European Broadcasting Union, EBU) were impressed with MUSE but could never adopt it, because it is a 60 Hz system rather than the 50 Hz standard used in Europe and most of the world outside the Americas and Japan.

The EBU development and deployment of B-MAC, D-MAC and much later on HD-MAC were made possible by Hi-Vision's technical success. In many ways MAC transmission systems are better than MUSE because of the total separation of colour from brightness in the time domain within the MAC signal structure.

Like Hi-Vision, HD-MAC could not be transmitted in 8 MHz channels without substantial modification and a severe loss of quality and frame rate. A 6 MHz version of Hi-Vision was experimented with in the US,[7] but it too had severe quality problems, so the FCC never fully sanctioned its use as a domestic terrestrial television transmission standard.

The US standardization effort that had led to the creation of NTSC in the 1950s was reactivated in the early 1990s, as the ATSC working group, because of Hi-Vision's success. Many aspects of the DVB standard are based on work done by the ATSC working group, though most of the impact is in support for 60 Hz (as well as 24 Hz for film transmission), uniform sampling rates and interoperable screen sizes.

Device support for Hi-Vision

Hi-Vision LaserDiscs

On May 20, 1994, Panasonic released the first MUSE LaserDisc player.[8] A number of MUSE LaserDisc players were available in Japan: the Pioneer HLD-X0, HLD-X9, HLD-1000, HLD-V500 and HLD-V700, and the Sony HIL-1000, HIL-C1 and HIL-C2EX, the last two of which had OEM versions made by Panasonic, the LX-HD10 and LX-HD20. These players also supported standard NTSC LaserDiscs. Hi-Vision LaserDiscs are extremely rare and expensive.[7]

The HDL-5800 Video Disc Recorder recorded both high-definition still images and continuous video onto an optical disc; it was part of the early analog wideband Sony HDVS high-definition video system, which supported the MUSE system. It could record HD still images and video onto either the WHD-3AL0 disc in CLV mode (up to 10 minutes of video or 18,000 still frames per side) or the WHD-33A0 disc in CAV mode (up to 3 minutes of video or 5,400 still frames per side).

The HDL-2000 was a full band high definition video disc player.[7]

Video cassettes

W-VHS allowed home recording of Hi-Vision programmes.


  1. ^ a b c d "DBNSTJ : Realization of High-Definition Television by MUSE System".
  2. ^ a b c Cianci, Philip J. (January 10, 2014). High Definition Television: The Creation, Development and Implementation of HDTV Technology. McFarland. ISBN 9780786487974 – via Google Books.
  3. ^ a b c "MUSE system for HDTV broadcasting-satellite services" (PDF). International Telecommunication Union. 1992. ITU-R BO.786.
  4. ^ "ST 240:1999 - SMPTE Standard - For Television — 1125-Line High-Definition Production Systems — Signal Parameters". St 240:1999: 1–7. November 30, 1999. doi:10.5594/SMPTE.ST240.1999. ISBN 978-1-61482-389-6 – via IEEE Xplore.
  5. ^ a b c d e f g h i j k ANSI/SMPTE 240M-1995 - Signal Parameters 1125-Line High-Definition Production Systems (PDF). SMPTE. 1995.
  6. ^ Poynton, Charles (January 3, 2003). Digital Video and HD: Algorithms and Interfaces. Elsevier. ISBN 9780080504308 – via Google Books.
  7. ^ a b c d "MUSE LaserDisc". Retrieved 2022-10-19.
  8. ^ a b "MUSE HI-DEF LaserDisc Players". LaserDisc UK Web Site. Archived from the original on 30 April 2016. Retrieved 10 October 2021.
  9. ^ Jun-ichi, Ishida; Ninomiya, Yuichi (December 19, 1982). "3. Signal and Transmission Equipment for High-Definition TV". The Journal of the Institute of Television Engineers of Japan. 36 (10): 882–888. doi:10.3169/itej1978.36.10_882 – via CiNii.
  10. ^ a b Fujio, Takashi (December 19, 1980). "High-Definition Television System for Future : Desirable Standard, Signal Form and Broadcasting System". ITE Technical Report. 4 (28): 19–24. doi:10.11485/tvtr.4.28_19 – via CiNii.
  11. ^ Fujio, Takashi (December 19, 1981). "High Definitional Television". The Journal of the Institute of Television Engineers of Japan. 35 (12): 1016–1023. doi:10.3169/itej1978.35.1016 – via CiNii.
  12. ^ Komoto, Taro; Ishida, Junichi; Hata, Masaji; Yasunaga, Keiichi (December 19, 1979). "YC Separate Transmission of high Definition Television Signal by BSE". ITE Technical Report. 3 (26): 61–66. doi:10.11485/tvtr.3.26_61 – via CiNii.
  13. ^ FUJIO, Takashi (December 19, 1984). "High-Definition Television System". ITE Technical Report. 8 (1): 33–39. doi:10.11485/tvtr.8.1_33 – via CiNii.
  14. ^ FUJIO, Takashi (August 19, 2006). "Rowing a Boat to the HDTV New World". The Journal of the Institute of Electronics, Information and Communication Engineers. 89 (8): 728–734 – via CiNii.
  15. ^ "SMPTE-240M Y'PbPr".
  16. ^ a b "Detailed Colorspace Descriptions".
  17. ^ Charles A. Poynton, Digital Video and HDTV: Algorithms and Interfaces, Morgan–Kaufmann, 2003. online
  18. ^ NHK (1993). High Definition Television - Hi Vision Technology. ISBN 0-442-00798-1.
  19. ^ Naganawa, K.; Hori, Y.; Yanase, S.; Itoh, N.; Asano, Y. (August 19, 1991). "A single-chip audio signal processor for HDTV receiver". IEEE Transactions on Consumer Electronics. 37 (3): 677–683. doi:10.1109/30.85585. S2CID 62603128.
