I can't verify that everything he writes is entirely accurate, but I thought it would be interesting to have it dissected and discussed here. That said, the basis for his argument about the CD format's problems is the familiar one, and it doesn't seem the CD was launched as a new, more serious format from a hi-fi perspective, but rather as just another format in a different package, with simpler handling!?
Here is the text by Britbrian:
As I recall, Philips & Sony never intended the CD's quality to be better than vinyl, just good enough to replace the cassette, which was a pretty low bar.
Since lossy and lossless compression algorithms for audio and video weren't available yet, the CD format was constrained to uncompressed data and had to squeeze some 74 minutes onto a 12 cm disc.
There were only enough bits for the 44.1 kHz sample rate / 16-bit, 96 dB audio, plus bits for robust error correction and servo data. I don't recall where the 74-minute limit came from.
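As a quick sanity check on those numbers, here is a back-of-the-envelope sketch in Python (my own illustration, not part of the original post):

```python
# Raw PCM bit budget for 74 minutes of CD audio.
# Assumes the CD-DA parameters quoted above: stereo, 16-bit, 44.1 kHz.
SAMPLE_RATE = 44_100   # samples per second, per channel
BITS_PER_SAMPLE = 16
CHANNELS = 2
MINUTES = 74

bits = SAMPLE_RATE * BITS_PER_SAMPLE * CHANNELS * MINUTES * 60
print(f"Raw audio payload: {bits / 8 / 1e6:.0f} MB")
# -> about 783 MB, before the error-correction and servo overhead
#    mentioned above is added on the disc itself.
```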
The sample rate was considered good enough and it probably still is for most ears like mine.
Since I worked in the design of digital sound processing equipment way back, I thought some background on the distortion effects might be of interest.
Two main problems arise with inadequate sample rates:
1) aliasing, and 2) phase distortion in the output reconstruction low-pass filters.
1) Aliasing
It's like watching an old western movie of a wagon accelerating to, say, 20 mph. The wheels appear to speed up to only 5 mph, then slow down to 0 when it's really doing 10 mph, then appear to speed up to 5 mph when it's really 15 mph, then back to 0 when it's really 20 mph. This visual aliasing was cyclical because the frame rate of the camera was woefully too low, so we could see several aliasing cycles.
With CD audio, this aliasing gets worse and worse for any high-frequency harmonics still above 22 kHz that get through the master recording's low-pass filter. If the low-pass filter rolls off at 11 kHz, it can attenuate gradually, at say 24 dB/octave. If it rolls off at 15 kHz, it has to be much sharper. Even so, there may still be tiny harmonic levels above 22 kHz that get undersampled, just as in the movie analogy, and reflect back in aliased form.
So a 23 kHz component would reconstruct as 21 kHz, probably still inaudible to everyone;
a 25 kHz component would reconstruct as 19 kHz;
a 28 kHz component would reconstruct as 16 kHz, which is audible to good ears.
Even if they are all individually inaudible, critical ears could hear a fatiguing or shrill distortion, because the reflected harmonics bear no relation to the actual content at the same frequencies.
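To put numbers on that folding, here is a minimal sketch of the standard reflection rule (a component between half the sample rate and the sample rate reflects to fs minus f); the function is my own, matching his 23/25/28 kHz examples:

```python
# Fold frequencies above Nyquist back into the audible band,
# as in the 23 -> 21, 25 -> 19 and 28 -> 16 kHz examples above.
FS = 44.1  # CD sample rate in kHz

def alias(f_khz: float) -> float:
    """Return the apparent (aliased) frequency of a tone at f_khz."""
    f = f_khz % FS                        # images repeat every fs
    return f if f <= FS / 2 else FS - f   # reflect about Nyquist

for f in (23, 25, 28):
    print(f"{f} kHz aliases to {alias(f):.1f} kHz")
# -> 21.1 kHz, 19.1 kHz and 16.1 kHz respectively
```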
The requirement for such sharp filtering in the master recording could itself introduce phase distortion, and certainly a loss of information, which is audible to more ordinary ears.
2) Phase distortion in output reconstruction filter
The earliest CD players used individual digital-to-analog converter chips with their own simple filters. These first filters introduced a phase distortion, probably most noticeable as poorer stereo imaging.
Then, for a period, the designs were cost-reduced to make one DAC do the job of two, but sharing the low-pass filter introduced a horrible interchannel phase error that's much worse than the already-present phase distortion.
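To get a feel for how bad that was: time-multiplexing one converter means one channel is converted roughly half a sample period after the other, and a fixed time offset becomes a phase error that grows with frequency. A rough sketch, assuming a half-sample interchannel delay (my assumption about how those single-DAC designs worked):

```python
# Interchannel phase error from time-multiplexing one DAC between
# two channels, assuming the second channel is converted half a
# sample period after the first.
FS = 44_100          # sample rate in Hz
DELAY = 0.5 / FS     # half a sample period between L and R

for f in (1_000, 10_000, 20_000):
    phase_deg = 360 * f * DELAY
    print(f"{f:>6} Hz: {phase_deg:5.1f} degrees between channels")
# -> ~4 degrees at 1 kHz, ~41 at 10 kHz, ~82 at 20 kHz:
#    easily enough to smear the stereo image at the top of the band.
```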
Once they introduced two-times oversampling, the 22 kHz reflection point moved to 44 kHz, and similarly four-times oversampling moves the 22 kHz reflection point to 88 kHz.
Now the alias-image component of an intended 20 kHz harmonic comes back at 68 kHz when 2x oversampled.
So a simple resistor-capacitor filter has a much easier time reconstructing the intended frequencies, and the alias frequencies can be attenuated far more easily, with less phase distortion.
So oversampling reduces the phase distortion associated with simple reconstruction filters.
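Putting his image-frequency numbers together: the nearest image of an in-band tone sits just below the oversampled rate, at N times fs minus the tone frequency. A small sketch (my own illustration):

```python
# Where the nearest alias image of an in-band 20 kHz tone lands
# for different oversampling factors: at (N * fs) - f.
FS = 44.1     # base sample rate in kHz
F_TONE = 20   # in-band tone of interest, in kHz

for n in (1, 2, 4):
    image = n * FS - F_TONE
    print(f"{n}x oversampling: first image at {image:.1f} kHz")
# -> 1x: 24.1 kHz (needs a brutal brick-wall filter)
#    2x: 68.2 kHz, 4x: 156.4 kHz (a gentle RC filter will do)
```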
So a higher sample rate would still have been desirable, but the majority couldn't hear the difference.
18 or 20 bits would also have expanded the dynamic range to 108 dB or 120 dB.
If only 2x lossless compression had been available at the outset of the CD, we could have had 18-bit, 60 kHz sampling and 96 minutes of playback for about the same number of bits. Cool.
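His closing arithmetic roughly checks out; here is a quick comparison (my own sketch, with the 2x lossless ratio taken as his assumption):

```python
# Compare the raw bit budget of standard CD audio with the
# hypothetical 18-bit / 60 kHz / 96-minute format above,
# stereo in both cases, with 2x lossless compression applied.
def budget_bits(bits, rate_hz, minutes, compression=1.0):
    return bits * rate_hz * 2 * minutes * 60 / compression

cd = budget_bits(16, 44_100, 74)
proposed = budget_bits(18, 60_000, 96, compression=2.0)
print(f"CD-DA:    {cd / 1e9:.2f} Gbit")
print(f"Proposed: {proposed / 1e9:.2f} Gbit")
# -> about 6.27 Gbit vs 6.22 Gbit: essentially the same disc.
```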
