15th January 2011, post #63 — minister (Lives for gear)
Wow, here is the resurrection of a thread with a lot of misinformation in it.

The origins of 44.1 and 48 kHz had nothing to do with marketing, or with an executive wanting to fit the whole of Beethoven's 9th on a disc. Those are the old wives' tales of the audio industry.

I don't know if anyone noticed cinealta's posts, but they pretty much give the answers. These things were actually decided by engineers. As he points out, 29.97 came from introducing color into a black-and-white signal infrastructure. Keeping the frame rate at exactly 30 fps would have made the new color subcarrier beat visibly against the rest of the signal, so they pulled the rate down by a factor of 1000/1001 to 29.97 fps, and color could be added without making the B&W picture beat.
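The pull-down arithmetic behind this is short enough to show. This is a sketch using the standard published NTSC numbers (525 lines, the 1000/1001 factor, and the 455/2 subcarrier-to-line-rate ratio), not figures taken from the post itself:

```python
# NTSC color "pull-down" arithmetic (standard published values):
# the color frame rate is the monochrome 30 fps scaled by 1000/1001.
mono_fps = 30
color_fps = mono_fps * 1000 / 1001      # ~29.97 fps
line_rate = 525 * color_fps             # ~15,734.27 lines/s
subcarrier = line_rate * 455 / 2        # ~3.579545 MHz color subcarrier

print(round(color_fps, 3))              # 29.97
print(round(subcarrier))                # 3579545
```

The 455/2 ratio interleaves the chroma energy between the luma spectrum lines, which is what lets color ride on the old B&W infrastructure at all.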

The origin of 44.1 kHz for the Compact Disc lies in the early days of PCM, when hard drives and the like had neither the roughly 1 Mbps per audio channel of bandwidth nor the capacity to store that much information. So engineers turned to a "pseudo-video" solution: recording the binary audio data as a black-and-white waveform on video tape. Such a system is constrained by the field rate and field structure (as you know, in video a frame is comprised of fields). Recording to video was how the first Compact Discs were pre-mastered before they went to the CD manufacturing plant, and thus there were two video standards to adhere to: 525 lines at 60 Hz and 625 lines at 50 Hz. Remember, this is B&W, so we are talking 30 fps, NOT 29.97 NTSC color.

In 60 Hz video there are 35 blank lines, leaving 490 active lines per frame, or 245 per field. With 3 samples stored per line, the sampling rate is simply: 60 × 245 × 3 = 44,100 Hz = 44.1 kHz.

With 50 Hz video there are 37 blank lines, giving 588 active lines per frame, or 294 per field. The math is thus: 50 × 294 × 3 = 44,100 Hz = 44.1 kHz.
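The two derivations above can be sketched in a few lines. The helper function and its name are my own framing; the numbers (field rates, line counts, blank lines, 3 samples per active line) are straight from the post:

```python
# fields/s x active lines per field x samples per line
# gives the same 44.1 kHz rate in both TV standards.
def pcm_rate(field_rate, lines_per_frame, blank_lines, samples_per_line=3):
    active_per_field = (lines_per_frame - blank_lines) // 2
    return field_rate * active_per_field * samples_per_line

ntsc_mono = pcm_rate(60, 525, 35)   # 60 x 245 x 3
pal = pcm_rate(50, 625, 37)         # 50 x 294 x 3
print(ntsc_mono, pal)               # 44100 44100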

Early mastering engineers in the U.S. had to make sure their machines were locked to a true 60 Hz signal and not to an NTSC blackburst signal — locking to the 59.94 Hz color field rate pulls 44.1 kHz down by 1000/1001 to 44.056 kHz.


As to 48 kHz: 60 kHz was actually considered, to avoid any "leap frames", but remember we are talking very early PCM days (for the TV sampling rates, the late '70s), so 60 kHz was difficult to achieve and seen as wastefully high by the early digital engineers. Soundstream and 3M were using 50 kHz, and it was proposed to take 5005 samples over 6 fields of NTSC; but that still required "leap frames" for 525/60 monochrome B&W TV.

The number of digital audio samples per video frame is an important number. If it is not an integer, then some frames must carry a different number of samples than the rest. If you create a "leap frame" (think "leap years"), you have to give it a digital flag.

The BBC/EBU was using 32 kHz at the time, but this was not acceptable, so the EBU favored 48 kHz: it has a simple 3:2 relationship with 32 kHz, and it causes leap frames only in NTSC, which was not the video standard used in Europe (48,000 / 25 = 1,920 samples per frame, an integer). Also, by this time Decca was producing software that operated at 48k. And let's not forget that the film rate is 24 fps, which is easy to work with at 48k — exactly 2,000 samples per frame.
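The "leap frame" test described above is just a divisibility check: a sampling rate is convenient for a video standard when samples-per-frame comes out as an integer. A sketch using exact fractions (the frame rates are the standard published values; the loop and labels are my own framing):

```python
from fractions import Fraction

# Frame rates as exact ratios.
ntsc = Fraction(30000, 1001)   # 29.97... fps NTSC color
pal = Fraction(25)             # 625/50 video
film = Fraction(24)            # film

for sr in (32000, 44100, 48000, 50000):
    for name, fps in (("NTSC", ntsc), ("PAL", pal), ("film", film)):
        spf = Fraction(sr) / fps   # samples per frame
        tag = "integer" if spf.denominator == 1 else f"leap frames ({spf})"
        print(sr, name, tag)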

This all happened around 1981, when the 48k standard was adopted.