Technology changes all the time, sometimes for the better. One of the background tasks that tends to get forgotten is to revisit and re-assess old assumptions, and to update them when necessary. The mains electricity wiring in a house is one example - the insulation in cables does not last forever, and rubber, PVC and plastics all degrade over time. Recently, I have been blowing the dust off some of my archives of audio samples, and it reminded me of one of the first articles that I wrote for Sound On Sound magazine... It is still available on the Interweb:
It set me thinking: how has time changed that 'rule of thumb' which says that when you look at a waveform, you can only see the 'top' 30dB of whatever is in it? I was curious to see whether advances in technology meant that it needed a revision...
Into the past...
May 1986 is over 35 years ago, and a lot of the things that you now probably use every day didn't exist in anything like their current form: the World Wide Web, the Internet, HTML, cheap domestic microwave ovens, cheap laptop computers, LCD video monitors, mobile phones, MP3, DAT, DVDs, DAWs... and high streets full of little more than charity shops and coffee shops.
Digital audio was possible, in a limited way, on a hobbyist computer. If you were in the know, then researchers in places like George Lucas's Sprocket Systems were working on prototype DAW-like technology; if you had lots of money, then New England Digital's Synclavier was shipping with direct-to-disc recording of digital audio, or you could sample at 8-bit resolution on a Fairlight CMI Series II whilst you saved up for the recently-released Series III with 16 bits! Most ordinary hi-tech musicians were limited to just using computers for simple audio file editing, or for another relatively new innovation: MIDI.
So, my samples from this time were mostly 8-bit, sampled at 8, 16, or maybe even the insanely high rate of 32kHz! They were mostly kept on 3.5 inch floppy disks (from Sony, encased in hard plastic and so not actually 'floppy' at all...) or on a hard drive that held a few hundred Megabytes in a case about twice the size of a modern drive holding a few Terabytes. To look at the files, you would use a CRT (Cathode Ray Tube) monitor - VGA resolution (640x480) monochrome LCDs didn't appear until 1988, and colour LCDs didn't become affordable until the 1990s. You might like to create a graphic image sized 640 x 480 pixels on your current computer to see just how small it really is - on my 27 inch 5K (5120 x 2880 pixels) monitor it covers about the same area as a credit card.
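If you want to try that for yourself, here's a quick sketch of one way to do it (assuming Python with the Pillow imaging library, which is my assumption - any graphics program will do just as well):

# Create a blank VGA-sized (640 x 480 pixel) image - open it at 100% zoom
# to see just how small that resolution looks on a modern monitor.
from PIL import Image

vga = Image.new("RGB", (640, 480), color="white")
vga.save("vga_size_test.png")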
So here's an 8-bit sine wave, displayed more or less 1:1, so there are 256 pixels from the highest to the lowest peak (except it isn't - no matter what I do, my browsers won't show the graphic at its actual size. Strange...). Anyway, just imagine that the following graphic image is 256 pixels in height:
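(In case you want to recreate something like that graphic yourself, here's a rough sketch - again Python with Pillow, which is my assumption and not the tool used for the original image - that draws one cycle of a sine wave quantised to 8 bits, with the 256 quantisation levels mapped 1:1 onto 256 vertical pixels.)

import math
from PIL import Image

WIDTH, HEIGHT = 512, 256                      # 256 pixels: one per 8-bit level
img = Image.new("L", (WIDTH, HEIGHT), 255)    # greyscale, white background

for x in range(WIDTH):
    # quantise one cycle of a sine wave to an integer in 0..255 (8-bit resolution)
    level = round(127.5 + 127.5 * math.sin(2 * math.pi * x / WIDTH))
    img.putpixel((x, HEIGHT - 1 - level), 0)  # plot it as a black pixel

img.save("sine_8bit_1to1.png")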
To 2020...
If we now do this with 16 bits and a bigger screen, then we can fast forward to 2020. We can take the normalised output as our maximum output level (let's call it 0dB), and then compare it with a higher frequency (4x freq) sine wave attenuated by 30dB (i.e. at -30dB). Because these are going to show relative levels, I'm not going to align the bits to pixels here, especially because I can't show 16 bits on a sensibly-sized screen - it would have to be 65,536 pixels high, which is more than 20x the vertical resolution of my current screen's 2880 pixels.
So, from left to right, we have the sine wave at 0dB, a 4x frequency sine wave at -30dB, and the result of mixing them together. The waveform on the right looks distorted and is obviously not a pure sine wave, and if you listen to it, then it is very easy to hear the 4x frequency sine wave, because it is only 30dB down and your ears are good over a much bigger dynamic range than that.
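For anyone who wants to reproduce these at home, here's a minimal sketch of how it could be done (assuming Python with numpy and matplotlib - the original screenshots were not made this way):

import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 1, 2000, endpoint=False)
attenuation_db = -30.0                        # change to -40.0 for the next example
gain = 10 ** (attenuation_db / 20)            # -30dB is an amplitude ratio of ~0.0316

full_scale = np.sin(2 * np.pi * 2 * t)        # the 0dB reference sine wave
quiet = gain * np.sin(2 * np.pi * 8 * t)      # the 4x frequency sine wave, 30dB down
mixed = full_scale + quiet                    # the 'impure' waveform on the right

fig, axes = plt.subplots(1, 3, figsize=(12, 3), sharey=True)
for ax, y, title in zip(axes, (full_scale, quiet, mixed),
                        ("0dB sine", "4x sine at -30dB", "mixed")):
    ax.plot(t, y)
    ax.set_title(title)
plt.tight_layout()
plt.show()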
But the middle screenshot is particularly interesting. A signal 30dB down is just about visible on modern screens, but you can imagine that if this was an oscilloscope with a slightly fuzzy line of light as the display, then it might only just be possible to see the sine wave. But many 2020 synthesizers feature waveforms shown on small OLED displays that are not even 256 pixels high, and so the vertical resolution is worse than a VGA monitor, and worse than the 8-bit example shown above.
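To put some rough numbers on this (my own back-of-the-envelope arithmetic, not anything from the original article): the amplitude ratio for a given attenuation is 10^(dB/20), so you can estimate how many vertical pixels an attenuated sine wave occupies on different displays - both for the -30dB case above and for the -40dB case that comes next:

for display_px in (2880, 480, 256, 64):       # 5K monitor, VGA, the 8-bit example, a small OLED
    for db in (-30, -40):
        ratio = 10 ** (db / 20)               # amplitude ratio for this attenuation
        print(f"{display_px:4d}px display, {db}dB signal: "
              f"{display_px * ratio:5.1f}px peak-to-peak")

On a 64-pixel-high OLED, a -40dB signal works out at well under one pixel peak-to-peak, which is why it simply disappears.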
So let's try the same process at -40dB.
As before, from left to right we have the sine wave at 0dB, then the 4x frequency sine wave at -40dB, and then the result of mixing them together. The waveform in the middle screenshot is now much smaller, and the sine wave on the right looks like... a sine wave. If you listen to it, then you can still hear the 4x sine wave, but it is not obvious from looking at the screenshot that the sine wave is impure at all.
The 40dB Rule...
If you find my writing helpful, informative or entertaining, then please consider visiting this link:
Synthesizerwriter's Store (New 'Modular thinking' designs now available!)