Analysis of an SDR Cassette Tape

Apr 2009

Many years ago, probably for Christmas 1983, I was given a cassette tape of David Bowie's Let's Dance (EMI Canada 4XO-17093). The front of the J-card proudly proclaims "Super Dynamic Range" while the reverse has some text trying to explain what SDR means to the consumer. What caught my attention was the mention that one can hear "computerized tones" at the start of the cassette containing "15 frequencies from 32 hZ [sic] to 18 khZ [sic]".

Inside of David Bowie Let's Dance cassette

Front of David Bowie Let's Dance cassette

Sure enough, when I played it I heard faint tones at the start of the tape, and at the end too. This was really cool technical stuff, though I did (and still do) frown at it slightly: despite the low recording level, it's a technological intrusion into the soundscape, and all sounds on a recording should be the artist's responsibility. Whatever. Unfortunately I had no way to measure the tones to find out what the frequencies were, or anything else! So for years I've been wondering just what these tones were all about.

Digging around the net in 2009 for technical info about SDR I found that it was apparently developed by Capitol in Canada and then renamed XDR (eXtended Dynamic Range) when Capitol in the US started to use it. It's actually an industrial quality control process that tries to improve the final product by monitoring the quality of replication at different stages. What makes this possible are the tones at the beginning and end of the tape; they're a type of in-band signalling using standard tones that the quality control process can test to see how they've been affected by each stage of replication. These tones were named "The Soundburst Test" by Capitol.

Presumably the Marketing Dept used technobabble to spin this potentially irritating sound as a positive thing for the consumer. Interestingly, something that could have been a source of irritation instead found fans: some modern CD-only releases and fan-produced tapes or CDs have included such tones without them having anything to do with quality control! The tones became a part of the culture; however, it's probable that they only have meaning to people who grew up in the '80s, making me wonder what modern youth (who may never have experienced analogue audio) think of projects that use such tones.

I couldn't find the sort of detailed technical information I wanted, like exact frequencies and the relative levels they were recorded at. Also, this tape was recorded with Dolby B, so does quality control measure the levels with or without applying Dolby? However, with modern computer technology I've made a digital transfer and can now take whatever measurements this 25-year-old tape allows!

SDR Tone Bursts

Here's the audio waveform (Audacity was used for all tests) for the first half-minute of the cassette with Dolby B enabled, starting with no signal until I pressed play at about the one-second mark. Immediately there is a group of test tones (detailed below) followed by eight seconds of silence until the first song begins.

Tone burst and start of song 1

The tape wasn't running for the first second or so; that part of the transfer therefore serves as an extremely quiet reference level of the computer's own noise. The tape has no leader, and on play it immediately begins with a low-frequency signal that has a 7 Hz fundamental and a bunch of odd harmonics; it's not clear whether this is an attempt at recording a square wave, a triangle wave, or something else. This signal lasts for almost three seconds until the tones start. Below is a zoomed-in image. The large spike visible in the image is impulse noise due to a tape or recording defect; it's on both transfers I did.

Strangely, the first two sets of tones, 50 and 100 Hz, have been mixed with the end of the 7 Hz signal; in this image they look like noise, and the tones aren't visible unless one zooms in and sees the higher-frequency signals riding on the lower. This suggests that SDR quality control wasn't quite as good as SDR marketing implied, or perhaps the overlap didn't affect the quality control measurements.
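A short-time Fourier transform makes the overlap easy to see. Here's a minimal sketch in Python using NumPy and SciPy; the filename and the time window are my own assumptions (the transfer could be saved under any name, and the burst timing is only approximate), but a half-second analysis window gives roughly 2 Hz resolution, enough to separate the 50 and 100 Hz bursts from the 7 Hz signal and its harmonics.

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import spectrogram

    # Hypothetical filename for the Dolby B transfer of Side 1.
    rate, data = wavfile.read("lets-dance-side1.wav")
    if data.ndim > 1:
        data = data.mean(axis=1)  # mix to mono for analysis
    data = data.astype(np.float64)

    # A 0.5 s window gives ~2 Hz resolution: enough to separate a 7 Hz
    # fundamental and its odd harmonics from the 50 and 100 Hz bursts.
    f, t, Sxx = spectrogram(data, fs=rate, nperseg=rate // 2,
                            noverlap=rate // 4)

    # Report the strongest component below 200 Hz around where the first
    # two tone slots fall (roughly 3.5-4.5 s into this transfer).
    low = f < 200
    for i, ts in enumerate(t):
        if 3.5 <= ts <= 4.5:
            peak = f[low][np.argmax(Sxx[low, i])]
            print(f"t={ts:5.2f} s  strongest low component: {peak:6.1f} Hz")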

The tones were found to be approximately 50, 100, 250, 400, 640, 1010, 1610, 4000, 6350, 8100, 10100, 12600, 15200, and 18300 Hz, followed by a burst of 15200 Hz that lasts for 800 ms. Each tone lasts for 127 ms followed by 23 ms of silence, a total of 150 ms per slot. These numbers aren't exact, as measurement is affected by player speed variations; in fact, my two transfers gave different results, but the numbers should be within a few percent of whatever the spec says. Interestingly, the tones begin with 50 Hz rather than the advertised 32 Hz (and that's fourteen frequencies, one short of the advertised fifteen). I wonder why that is? The player used has no problem recording or playing below 30 Hz, so it's not that the player can't play it; it's just not there on the tape. Hmm, maybe 127 ms is too short an interval to record a useful number of cycles at that frequency: at 32 Hz, a 127 ms burst would hold only about four cycles.

Tone burst
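For anyone wanting a reference signal to compare against, here's a minimal sketch that synthesizes the burst sequence as I measured it. The frequencies and timing come from my measurements above, and the output filename is my own choice; none of this is from any official Capitol specification.

    import numpy as np
    from scipy.io import wavfile

    RATE = 44100
    # Frequencies as measured from the tape (approximate, in Hz).
    TONES = [50, 100, 250, 400, 640, 1010, 1610, 4000,
             6350, 8100, 10100, 12600, 15200, 18300]
    TONE_MS, GAP_MS = 127, 23  # each slot: 127 ms tone + 23 ms silence

    def burst(freq, ms):
        """A sine burst of the given frequency and duration."""
        t = np.arange(int(RATE * ms / 1000)) / RATE
        return np.sin(2 * np.pi * freq * t)

    parts = []
    for freq in TONES:
        parts.append(burst(freq, TONE_MS))
        parts.append(np.zeros(int(RATE * GAP_MS / 1000)))
    parts.append(burst(15200, 800))  # the long closing burst

    signal = 0.5 * np.concatenate(parts)  # leave some headroom
    wavfile.write("sdr-burst-replica.wav", RATE,
                  (signal * 32767).astype(np.int16))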

Below is an image of the set of tones from the end of the tape (Side 2); there are no tones at the end of Side 1 or the start of Side 2. This time there's no 7 Hz signal to interfere with the bursts.

Tone burst at end of Side 2

I made two transfers of the tape: one with Dolby B enabled, the other without. Studying the envelope of the SDR tones suggests that the intended levels are the ones heard with Dolby B decoding enabled.
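One way to quantify that impression is to measure the RMS level of each tone slot in both transfers and compare the envelopes. Here's a minimal sketch; the filenames and the burst start offset are assumptions, since they depend on how the transfers were saved.

    import numpy as np
    from scipy.io import wavfile

    TONE_S, SLOT_S = 0.127, 0.150  # 127 ms tone in a 150 ms slot

    def slot_levels_db(path, start_s, n_slots=14):
        """RMS level in dB of each tone slot, starting at start_s."""
        rate, data = wavfile.read(path)
        if data.ndim > 1:
            data = data.mean(axis=1)
        data = data.astype(np.float64)
        levels = []
        for i in range(n_slots):
            a = int((start_s + i * SLOT_S) * rate)
            b = a + int(TONE_S * rate)
            rms = np.sqrt(np.mean(data[a:b] ** 2))
            levels.append(20 * np.log10(rms + 1e-12))
        return np.array(levels)

    # Hypothetical filenames; start_s is wherever the first burst begins.
    dolby = slot_levels_db("side2-toneburst-dolby.wav", start_s=0.0)
    plain = slot_levels_db("side2-toneburst-nodolby.wav", start_s=0.0)
    print("Slot-to-slot spread: %.1f dB with Dolby B, %.1f dB without"
          % (np.ptp(dolby), np.ptp(plain)))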

For further research one can download the Side 2 Toneburst (Dolby B) audio file containing the clean toneburst from the end of Side 2.

Mysterious 10 kHz Tone

I noticed something intriguing while checking what the noise spectrum looked like at various points of silence: a high-frequency tone at about 10100 Hz that suddenly began a few seconds after the SDR tone bursts but before the first song started. It sits about 40 dB below the average music level and is nearly always masked by the music, but it's there continuously until the music ends.

Here's an Audacity spectrogram of the same stretch of audio shown in the waveform image above, from the start of the tape. This is from the non-Dolby transfer, which makes the tone more visible since the highs aren't being reduced. The SDR tone bursts are obvious, followed by a few seconds of silence until the mystery tone begins at about 11.5 seconds. The image also shows two other tones at higher frequencies starting with the music at 15 seconds, but they're at a much lower level and don't seem to run constantly throughout the tape.

Tone burst and start of song 1
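To pin down exactly when the tone starts and stops, a single-frequency detector works better than staring at a spectrogram. Here's a minimal sketch using the Goertzel algorithm to track the level at 10100 Hz in 100 ms blocks; the filename is hypothetical, and the target frequency is just my measurement.

    import numpy as np
    from scipy.io import wavfile

    def goertzel_power(block, freq, rate):
        """Power at `freq` in `block` via the Goertzel recurrence."""
        k = round(freq * len(block) / rate)
        coeff = 2 * np.cos(2 * np.pi * k / len(block))
        s_prev = s_prev2 = 0.0
        for x in block:
            s = x + coeff * s_prev - s_prev2
            s_prev2, s_prev = s_prev, s
        return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

    # Hypothetical filename for the non-Dolby transfer of Side 1.
    rate, data = wavfile.read("lets-dance-side1-nodolby.wav")
    if data.ndim > 1:
        data = data.mean(axis=1)
    data = data.astype(np.float64)[:30 * rate]  # first 30 seconds

    BLOCK = rate // 10  # 100 ms blocks -> 10 Hz bin spacing
    for i in range(0, len(data) - BLOCK, BLOCK):
        p = goertzel_power(data[i:i + BLOCK], 10100, rate)
        print(f"{i / rate:5.1f} s  {10 * np.log10(p + 1e-12):7.1f} dB")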

As far as I know there's nothing special about 10100 Hz; I don't know of any standard electronics that operate at or near this frequency. So what's the source? It's only present when music is, or will be, playing; it's absent during the SDR tone burst and after the music ends, so it's not likely to be something wrong with the player. There doesn't seem to be any information modulated onto the signal. Is it a feature of the SDR technology? Is it some control signal marking music content? Is it some primitive anti-copying measure? Could it even be a defect with a production master tape? It'd be interesting to compare this with a CD release to see if the tone is present (no, I don't have the CD).
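My claim that nothing seems to be modulated onto the signal can be checked by band-passing the tone and examining its analytic envelope (for AM) and instantaneous frequency (for FM). Here's a sketch of that test over the stretch between the tone's start at about 11.5 seconds and the start of the music at 15 seconds, where the tone is exposed; the filename and filter settings are my own assumptions.

    import numpy as np
    from scipy.io import wavfile
    from scipy.signal import butter, sosfiltfilt, hilbert

    # Hypothetical filename for the non-Dolby transfer of Side 1.
    rate, data = wavfile.read("lets-dance-side1-nodolby.wav")
    if data.ndim > 1:
        data = data.mean(axis=1)
    # Look at 12-14 s: the tone is running but the music hasn't started.
    seg = data.astype(np.float64)[int(12 * rate):int(14 * rate)]

    # Narrow band-pass around the mystery tone.
    sos = butter(4, [10000, 10200], btype="bandpass", fs=rate, output="sos")
    tone = sosfiltfilt(sos, seg)

    analytic = hilbert(tone)
    envelope = np.abs(analytic)                      # AM would show up here
    phase = np.unwrap(np.angle(analytic))
    inst_freq = np.diff(phase) * rate / (2 * np.pi)  # FM would show up here

    print("envelope variation: %.1f%%"
          % (100 * np.std(envelope) / np.mean(envelope)))
    print("frequency: %.1f +/- %.1f Hz"
          % (np.mean(inst_freq), np.std(inst_freq)))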

Clearly there are many unanswered questions.