Recording

Vocals:

When overdubbing vocals, my book suggests reducing the already-recorded vocals by 50% and panning them off to one side, so they give guidance but don’t cause you to lose the pitch of what you’re singing. Perhaps a similar approach would work for drums?
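That tip is easy to sketch in code: scale the take by 0.5 and place it in the stereo field. A minimal numpy sketch (the function name, the equal-power pan law, and the test tone standing in for a vocal take are all my own assumptions, not from the book):

```python
import numpy as np

def make_guide_track(vocal, gain=0.5, pan=-1.0):
    """Turn a mono take into a reduced-level stereo guide track.

    gain=0.5 drops the level by 50%; pan runs from -1.0 (hard left)
    to +1.0 (hard right) using an equal-power pan law.
    """
    # Map pan to an angle: 0 = hard left, pi/2 = hard right.
    angle = (pan + 1.0) * np.pi / 4.0
    left = vocal * gain * np.cos(angle)
    right = vocal * gain * np.sin(angle)
    return np.stack([left, right], axis=-1)

# One second of a 440 Hz tone at 44.1 kHz standing in for the vocal take.
sr = 44100
t = np.arange(sr) / sr
vocal = np.sin(2 * np.pi * 440 * t)

guide = make_guide_track(vocal, gain=0.5, pan=-1.0)
```

With `pan=-1.0` the right channel is silent and the left channel peaks at half the original level, which is exactly the "reduced and off to one side" guide the book describes.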

Vocal recording: using a Reflection Filter for singing vocals in a small studio? More detailed information is in Sound on Sound, but here is a summary:

The single most important area to treat with sound-absorption material is directly behind and to the sides of the performer. This can simply mean hanging a duvet (or similar) behind the vocalist: it really does make a massive difference to any recording session. Secondly, a reflection filter absorbs some of the sound that would otherwise reach the rear-facing sides of the mic, and also catches and absorbs some of the direct sound from the vocalist.

General Recording Tips:

The Guerilla Recording book summarises its approach as follows:

For each track:
•    get the source right – guitars in tune, vocals clear, drums in time
•    get the dynamics right with limiting/compression/expansion/gating
•    get the frequency response correct for the instrument using EQ
•    get the panning right
•    use effects where they enhance the sound rather than to compensate for poor tracks
•    make sounds fuller using layering
•    make the volume as loud as possible without distortion at each point through the chain
•    mix it down with careful attention to the separation of each instrument using the above
•    master the mix using EQ and compression in the opposite sequence
•    avoid overdoing anything, like boosting frequencies, loudness of parts, full quantizing
•    don’t get hung up on getting it perfect, but check against a pro recording of similar-sounding music and hear how they do it
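The "as loud as possible without distortion" step in the checklist is essentially peak normalization with a little safety headroom. A minimal sketch, assuming full scale is 1.0 and a made-up 1 dB headroom default (function name and parameter are my own, not from the book):

```python
import numpy as np

def normalize_peak(signal, headroom_db=1.0):
    """Scale a signal so its loudest sample sits headroom_db below
    0 dBFS (full scale = 1.0), i.e. as loud as possible without clipping."""
    peak = np.max(np.abs(signal))
    if peak == 0:
        return signal  # silence: nothing to scale
    # Convert headroom in dB to a linear target level, e.g. -1 dB -> ~0.891.
    target = 10 ** (-headroom_db / 20.0)
    return signal * (target / peak)

quiet_take = np.array([0.1, -0.25, 0.2])
loud_take = normalize_peak(quiet_take)
```

Applying this at each point through the chain keeps levels healthy without ever pushing a sample past full scale.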

How do you get the recording synced with the bars in your DAW?

I know you can record to a click track, but sometimes the music might slow down or speed up. And I know you can change the tempo throughout a track, but how do you match that to the tempo of the player? A lot of the techniques for playing around with the sound seem to rely on the music being synced to bars, so you can use partial quantize, time stretching, etc. All my tracks that have audio recordings, as opposed to MIDI recordings, are completely independent of the bars. A bit of a rambling question, but it’s difficult to phrase.

I’m not sure I even understand why I need to sync to bars, but when I see figures showing screenshots of DAW windows, they always have the beats synced to the tempo of the track, which may vary. I *think* this makes it easier for other tools to do their stuff, but I may be micturating up the wall. See http://www.cakewalk.com/Documentation/default.aspx?Doc=SONAR%20X2&Lang=EN&Req=AudioSnap.06.html for a slightly more eloquent description of the problem from the Cakewalk point of view. Or http://support.apple.com/kb/PH13214 for Logic Pro.
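One way to think about the sync problem: if you can mark where each beat falls in the recorded audio (tapping along in the DAW, or via beat detection), the varying tempo map is just the reciprocal of the beat intervals. A sketch of that arithmetic, assuming you already have beat timestamps in seconds (the function and example values are mine, not from either linked page):

```python
def tempo_map(beat_times):
    """Convert beat timestamps (in seconds) into a per-beat BPM list.

    Each entry is the tempo implied by the gap to the next beat:
    BPM = 60 / interval. A DAW's tempo track is built from exactly
    this kind of map, so the bar lines follow the player.
    """
    return [60.0 / (b - a) for a, b in zip(beat_times, beat_times[1:])]

# Beats tapped 0.5 s apart (120 BPM), then slowing to 0.6 s (100 BPM).
beats = [0.0, 0.5, 1.0, 1.6, 2.2]
bpms = tempo_map(beats)
```

Tools like AudioSnap and Flex Time automate the "find the beats" half of this; once the map exists, quantize and time-stretch operations have bars to work against.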