From a technical point of view, how come these digital tapes don't suffer interference? Surely, if recorded on tape, there will always be this risk?
There may well be signal loss in the playback system, and dropouts as the tape wears, but that won't affect what you see:
When playing an analogue tape, you are playing the signal back straight off the tape; that signal is very complex, so any frequency drop-off has a direct effect on the quality of the picture and sound. When playing a digital tape, the signal on the tape is simply a sequence of coded binary digits. Since there are only two states to distinguish between (as opposed to a continuous range with an analogue system), the raw signal on the tape is very much simpler (just a high or low level, instead of an infinitely variable one), and thus far more robust. Even if the tape is wearing, as long as the decoder can still tell the difference between the two states, the data will be retrieved and decoded correctly, and the wear won't affect what you see on the screen at all.
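If you want to see that in code, here's a very rough Python sketch (made-up numbers and levels, nothing to do with any real tape format) of the difference between the two:

```python
import random

random.seed(1)

def add_wear(level, amount=0.3):
    """Simulate tape wear by nudging a recorded level up or down a bit."""
    return level + random.uniform(-amount, amount)

# Analogue: the level on the tape IS the picture/sound, so any wear
# shows up directly in the output.
analogue_samples = [0.20, 0.75, 0.40, 0.90]
worn_analogue = [round(add_wear(s), 2) for s in analogue_samples]
print(worn_analogue)          # every value has drifted -> visible degradation

# Digital: the level on the tape only *represents* a 0 or a 1.
bits = [0, 1, 0, 1, 1, 0]
recorded = [0.9 if b else 0.1 for b in bits]                   # highs and lows on the tape
worn_digital = [add_wear(level) for level in recorded]
decoded = [1 if level > 0.5 else 0 for level in worn_digital]  # threshold at the split point
print(decoded == bits)        # True: the wear never crossed the split point
```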
And even if a bit should be flipped (i.e. a 1 is misread as a 0 or vice versa), there will almost certainly be an error detection and correction system built in.
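Real systems use quite sophisticated codes for this, but a toy example shows the principle. The sketch below uses a made-up "store each bit three times" scheme (rather than anything a real tape format actually uses) to repair a single misread bit by majority vote:

```python
from collections import Counter

def encode(bits):
    """Store every bit three times (a toy repetition code)."""
    return [b for b in bits for _ in range(3)]

def decode(stream):
    """Take a majority vote over each group of three stored copies."""
    out = []
    for i in range(0, len(stream), 3):
        triple = stream[i:i + 3]
        out.append(Counter(triple).most_common(1)[0][0])
    return out

data = [1, 0, 1, 1, 0]
stored = encode(data)
stored[4] ^= 1                    # one copy gets misread off the worn tape
print(decode(stored) == data)     # True: the flipped bit was voted out
```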
You're going to need a very, very (very, very) worn tape before it affects the quality of the output.
To make this easier to grasp, I've knocked up a few diagrams in Paint (and this is the basis of why digital signals are more robust than analogue ones). The real process is a bit more in-depth than this, but this is basically what's happening:
Diagram A is a representation of what one of your existing analogue tapes would look like if you looked at it on a CRO (a very crude representation, I know). That is the signal - the video and audio are directly contained within it. To get good-quality playback, that signal must look exactly like that each time you play it. If you played that tape a few hundred times and it wore, losses in the playback system would result in the signal coming off the tape looking different - distorted. That directly translates into a drop in quality.
Diagram B is a representation of a typical digital signal. With digital, everything is encoded as 0s and 1s, so the signal has only two states. The signal is merely used to store data, which then has to be decoded: all of the high states are read as 1s and all of the low states as 0s, and that is how the digital signal is encoded and recovered.
Now, suppose you play that tape a few thousand times and the signal gets distorted. Diagram C shows what it might look like. As you can see, it is very badly distorted. That, however, won't affect the output. Even though it was recorded with two states, on playback the decoder simply treats everything above a certain level as a 1 and everything below that level as a 0. If you take the line as the high/low split point, you can see that the highs and lows are still on the correct sides of it. In other words, the signal might be badly distorted, but the decoder can still recognise a 0 and a 1, so it makes no difference at all to the quality of the output. The signal would have to be distorted so badly that it crossed the line before any data would be read back incorrectly. That's why digital is generally a lot more robust.
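To put numbers on diagrams B and C, here's another small Python sketch (again with made-up levels, and a split point of 0.5 picked purely for illustration) showing that a badly distorted signal decodes to exactly the same bits, right up until a level actually crosses the line:

```python
bits = [1, 0, 1, 1, 0, 0, 1]   # what was recorded (diagram B: crisp highs and lows)

# Diagram C: the same signal after heavy wear - the levels are all over the
# place, but none of them has crossed the 0.5 split point.
distorted = [0.62, 0.31, 0.85, 0.70, 0.12, 0.38, 0.91]
decoded = [1 if level > 0.5 else 0 for level in distorted]
print(decoded == bits)        # True: ugly signal, identical output

# Only when a level actually wanders across the line does a bit get misread -
# and that's when the error correction mentioned earlier earns its keep.
distorted[1] = 0.55           # a "low" that has drifted over the split point
decoded = [1 if level > 0.5 else 0 for level in distorted]
print(decoded == bits)        # False: one bit now reads back wrong
```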