# What is Digital Audio Workstation: Analog Audio to Digital Conversion and Pulse Code Modulation

Analog signals, such as the voltages at the input of an audio interface, are continuous: they can take on infinitely many values within the input range of the converter, for example 0.324 volts, 0.345 volts, 0.232 volts, or even negative values such as -0.656 volts. Digital signals, by contrast, are discrete, built from only two possible symbols: 0 and 1.

These voltages carry the information of the audio content in the analog domain. Remember that a microphone is a transducer that converts sound pressure vibrations into voltage levels: high sound pressure results in a higher voltage at the microphone output, while low sound pressure induces a smaller voltage.

So you can imagine that a singer with dynamics (passages ranging from quiet to loud) can induce an effectively infinite range of voltage levels at the microphone output. These voltage levels make up the singer's audio waveform, which must be converted to digital if it is to be recorded in your DAW.

### Representing these analog levels in the digital domain

To represent these analog levels, you need a sufficient sampling rate and bit depth to convert the waveform accurately. By the Nyquist criterion, the sampling rate must be at least twice the highest frequency to be converted; for music, that upper limit is taken as 22,050 Hz, just above the roughly 20 kHz limit of human hearing. This is why the most common sampling rate for music is 44.1 kHz:

Sampling rate required for accurate reproduction = 22,050 Hz × 2 = 44,100 Hz (44.1 kHz). This means 44,100 analog voltage samples are taken per second.
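The arithmetic above can be sketched in a couple of lines of Python (the 22,050 Hz figure is the one used in the text):

```python
# Nyquist criterion: sample at no less than twice the highest
# frequency you want to capture (values from the example above).
highest_frequency_hz = 22_050
required_rate_hz = 2 * highest_frequency_hz

print(required_rate_hz)  # 44100 samples taken per second
```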

The number of bits also affects the resulting representation. Suppose you sample a sine wave voltage signal using 3 bits at a low sampling rate; this is the output:

*(Figure: a sine wave quantized with 3 bits)*

Note that the sine wave is jagged and no longer looks like a clean sine wave. It is not a faithful replica of the analog sine wave because it has only been sampled using 3 bits. The maximum number of possibilities with 3 bits is 2^3 = eight possible representations of the analog signal voltages. That is not enough, considering that the analog signal is continuous.

However, if you use 24 bits together with a reasonable sampling rate such as 44,100 Hz, the conversion becomes very accurate: there are 2^24 = 16,777,216 possible values with which a voltage level can be represented, and 44,100 samples are taken per second. The resulting digitized waveform is now very smooth, looking much like the original analog sine waveform:

*(Figure: the digitized sine wave, now smooth)*

You can no longer see those jagged corners on the sine wave. The dots are the individual samples; with sufficient samples taken per second at a reasonable bit depth, the digitized representation becomes accurate.
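The effect of bit depth can be reproduced numerically. Here is a minimal sketch (not from the original article; `quantize` is a hypothetical helper) that quantizes one cycle of a sine wave at 3 bits and again at 16 bits and reports the worst-case error in each case:

```python
import math

def quantize(x, bits):
    """Round x (assumed in -1.0..1.0) to the nearest of 2**bits evenly spaced levels."""
    step = 2.0 / (2 ** bits - 1)   # spacing between adjacent quantization levels
    return round(x / step) * step

# One cycle of a sine wave, 64 samples.
samples = [math.sin(2 * math.pi * n / 64) for n in range(64)]

for bits in (3, 16):
    worst = max(abs(s - quantize(s, bits)) for s in samples)
    print(f"{bits:2d} bits: worst quantization error {worst:.6f}")
```

The 3-bit version can be off by up to half of its coarse step (about 0.14 on a ±1.0 signal), while the 16-bit version's error is thousands of times smaller.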

### Always record at 24-bit for better resolution

Bit depth tells you how much resolution you have in your analog-to-digital conversion. If you record at only 16 bits, you have just 2^16 = 65,536 levels. Supposing the analog voltage range to be coded runs from -20,000 mV to +20,000 mV, the resolution would be:

Resolution = [+20,000 mV − (−20,000 mV)] / 65,536 ≈ 0.61 mV per step

If you are recording at 24 bits, the resolution becomes:

Resolution = [+20,000 mV − (−20,000 mV)] / 16,777,216 ≈ 0.0024 mV per step
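Both figures follow from the same formula; a quick check in Python (the ±20,000 mV range is the one used in the text):

```python
# Converter resolution = full-scale voltage span / number of levels.
span_mv = 20_000 - (-20_000)   # -20,000 mV .. +20,000 mV => 40,000 mV total

for bits in (16, 24):
    levels = 2 ** bits
    print(f"{bits} bits: {span_mv / levels:.4f} mV per step")
```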

With this very high resolution, a recording made at 24 bits can represent the analog audio signal far more accurately. In the 16-bit example above, voltages cannot be resolved more finely than the 0.61 mV step size: each sample is rounded to the nearest step, and the resulting difference between the analog input and the digitized output is known as "quantization error." A larger step size means larger quantization error, which affects the resulting recording quality.
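To see quantization error concretely, here is a small sketch using the 16-bit step size from the example above (the input voltage is an arbitrary made-up value):

```python
# Round an analog voltage to the nearest 16-bit step and measure the error.
step_mv = 40_000 / 2 ** 16             # ~0.61 mV per step, as computed above
analog_mv = 123.456                    # arbitrary analog input voltage

digitized_mv = round(analog_mv / step_mv) * step_mv
error_mv = abs(analog_mv - digitized_mv)

print(f"digitized: {digitized_mv:.4f} mV, error: {error_mv:.4f} mV")
# The error can never exceed half a step (about 0.305 mV at 16 bits).
```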

To make this clear: the resolution determines the granularity, or step size, of the digitized signal. A higher resolution (smaller steps) produces a more smoothly rendered digitized version of the analog signal, as in the screenshot:

*(Figure: how step size affects the quantized waveform)*

If the resolution is made finer, those "steps" become smaller and the waveform looks smoother. The result also sounds less obviously "digitized," much closer to the analog original. In the 24-bit example above, with a 0.0024 mV resolution, even very small changes in the analog input voltage can be represented, resulting in a more accurate reproduction of the analog input. This is why 24-bit recording is the standard in music production. You should be recording at 24 bits.

### The digital audio workstation comes into play

Once the music has become digital audio, your computer can start processing it. The computer runs software for processing this digital audio information, and the combination is called a "digital audio workstation," or simply a DAW. For me, a complete DAW system is not the software alone but the combination of the audio hardware (computer, studio monitors, etc.) and the software (Reaper, Adobe Audition, Cubase, Logic, Sonar, etc.). On the computer screen, the captured analog audio is visualized as a waveform rendered from the digital audio information by your software. As an engineer, you then interact with and edit this digital information in three steps that make up the core processes of music production:

a.) Basic tracking and editing: recording instruments into your computer, removing noise, and editing parts digitally.

b.) Digital audio mixing: combining the recorded tracks to achieve the best sound by adjusting EQ, compression, panning, and effects on the different digital audio tracks, then rendering a "mixdown," a single waveform that sums all the adjusted tracks in the multi-track session. It is called "digital mixing" because you use your DAW to mix the tracks in the digital domain, as opposed to using an analog mixing console in the analog domain.

c.) Digital mastering: optimizing the mixdown to sound its best on different audio monitors and players. You are still using your DAW for this step, so it is still in the digital domain.
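As a toy illustration of step (b), a digital mixdown is essentially a per-sample, gain-weighted sum of the tracks. The track names, sample values, and fader gains below are all made up for the sketch:

```python
# Hypothetical multi-track session: each track is a list of sample values.
tracks = {
    "vocals": [0.10, 0.20, 0.30, 0.20],
    "guitar": [0.05, 0.00, -0.05, 0.10],
}
gains = {"vocals": 0.8, "guitar": 0.5}   # fader setting for each track

# The mixdown is one waveform: the gain-weighted sum at every sample index.
n_samples = len(tracks["vocals"])
mixdown = [
    sum(gains[name] * samples[i] for name, samples in tracks.items())
    for i in range(n_samples)
]
print(mixdown)
```

A real DAW does the same summing at 44,100 samples per second across many tracks, plus the EQ, compression, and effects applied to each track before the sum.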

You will learn the details of each of these processes as you progress with your reading.

### The ending

After mastering, the digital audio information is stored on an audio CD (which is still a digital medium) or another digital storage device such as a DVD. When this digital audio is played on a CD player, the player's digital-to-analog converter (DAC) converts it back to analog form!

The signal then flows through the electrical wires as an analog electrical signal and ends up at your studio monitors/speakers. A monitor is also a transducer, one that converts electrical signals into acoustic sound pressure vibrations, creating the disturbance in the air that you hear as music. This is how it ends.

Content last updated on October 10, 2012