Bit depth * Sample rate = bit rate?

billy b

New member
Seriously, I'm not looking for n00b answers here or copy-and-pastes.

I understand the concepts of sample rate and bit depth (sample rate is how many times per second the audio is sampled; bit depth is how many bits are used to store each sample).

Knowing that, couldn't I assume that bit depth (in bits) and sample rate (in hertz), when multiplied, give me the bit rate in bit/s (or, divided by 1000, in kbps)?
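In code terms, here's roughly what I mean (a quick Python sketch; the 16-bit / 44.1 kHz numbers are just example values I picked, not from any particular file):

```python
def bit_rate_bps(bit_depth_bits, sample_rate_hz):
    # Bit rate implied by the formula above: bit depth * sample rate.
    return bit_depth_bits * sample_rate_hz

# Example values, purely for illustration: 16 bits per sample at 44.1 kHz.
rate = bit_rate_bps(16, 44_100)
print(rate)         # 705600 bit/s
print(rate / 1000)  # 705.6 kbps
```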

However, when I try to apply this, it doesn't seem to work out, unless my audio files really are that low-quality.

For example, if I take a 128 kbps mp3 file, I can try to work backwards to a sample rate * bit depth combination.

That would be:

x bits * 44100 Hz = 128000 bit/s (here I assumed 44.1 kHz as the sample rate, since it's at the lower end of the standard rates)

The units cancel, and I'm left with 44100x = 128000.

However, this comes out to a bit depth of ~2.902. WAYYY too low. That can't possibly be correct.
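Here's that back-calculation spelled out as a quick Python sketch, using exactly the numbers above:

```python
# Back-solve the formula bit_depth * sample_rate = bit_rate for bit depth,
# given a 128 kbps stream at an assumed 44.1 kHz sample rate.
bit_rate_bps = 128_000    # 128 kbps
sample_rate_hz = 44_100   # 44.1 kHz

implied_bit_depth = bit_rate_bps / sample_rate_hz
print(implied_bit_depth)  # ~2.902 bits per sample
```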

Someone explain, please?

I know it's a tough one, but if you understand it, all I'm really looking for is an explanation. Don't hold back on the nerd-ese either; I can follow it.

Also, I doubt I'll get more than one answer, so if you do it right, easy 10 pts =)
Answer #1: I don't see where you're going with this. I haven't cancelled any unnecessary units:

If we break down the units:

bits * Hz = bit/s

which works out to:

bits * 1/s = bit/s

and that multiplies out to bit/s = bit/s.

Because Hz means cycles per second, if we take just the unit we give it a coefficient of one (so 50 Hz is really 50 * 1 Hz, which is 50 * 1/s); therefore Hz is equal to 1/s.
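If you want to sanity-check that dimensional argument in code, here's a small sketch using the third-party pint units library (assuming it's installed; the 16-bit figure is just an example value, not something from your post):

```python
import pint  # units library: pip install pint

ureg = pint.UnitRegistry()

bit_depth = 16 * ureg.bit          # example bit depth, for illustration only
sample_rate = 44_100 * ureg.hertz  # 44.1 kHz, as in your example

# Hz reduces to 1/s, so bit * Hz converts cleanly to bit/s.
bit_rate = (bit_depth * sample_rate).to("bit / second")
print(bit_rate)  # 705600.0 bit / second
```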
 

