The No.1 Website for Pro Audio
Aligning Phase
Post #1, 7th June 2006, by Dog_Chao_Chao

Hi.
Whenever I track with more than one mic, I check phase first. Sometimes, even with one polarity inverted, I can't get perfect phase alignment between the two mics. But I don't mind that much, because I know that later I can shift one track to match the phases.
I also know the 3:1 rule can minimize this, but most of the time it isn't very practical.
The problem now is that sometimes, after recording, I can't relate the two waveforms. Sometimes they are so different from each other that I can't shift and align them, or even tell where each cycle starts and ends.
Or, if I match the two waveforms at the beginning, they cancel each other in the middle, at the end, or somewhere along the track.
Is this normal? And if it is, what would you do? Should I leave phase alone if it sounds OK? Is there any way around this problem? Am I doing something wrong?
Another issue I'm not sure about is room mics. For example, consider one close mic and another ten feet away for room sound. If the waveforms of the two mics are perfectly aligned (by shifting them in the editor), aren't we destroying the time difference between the mics?
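For a rough sense of scale on that last question: the time offset between a close mic and a room mic follows directly from the speed of sound, about 1125 ft/s in air at room temperature. A minimal sketch of the arithmetic (the 10-foot spacing is the figure from the post above):

```python
# Extra arrival time at a mic placed farther from the source,
# assuming sound travels at ~1125 feet per second (air, ~20 C).
SPEED_OF_SOUND_FT_PER_S = 1125.0

def mic_delay_ms(distance_ft):
    """Delay, in milliseconds, for a mic `distance_ft` farther from the source."""
    return distance_ft / SPEED_OF_SOUND_FT_PER_S * 1000.0

# A room mic 10 feet behind the close mic hears the source ~8.9 ms later.
print(round(mic_delay_ms(10.0), 1))  # -> 8.9
```

That 8.9 ms head start is exactly the time difference the question asks about; sliding the room mic track earlier by that amount removes it.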
Post #2, 7th June 2006, by SK1

Someone here told me, when I was asking about reamping, to just put a sharp transient at the beginning or end of the track ... like a stick click, or clap your hands ... it makes it very easy to line up the tracks.
It would work for aligning phase too.

But if it sounds good, why bother? As long as it's not canceling too much, I say a little phase can add a lot of character.

Sounds good, is good. (Sorry, had to say it ... heh)

Post #3, 7th June 2006, by Dog_Chao_Chao

You're right. But I am curious about this. Why are the waveforms so different? Why can't I align them?

Sometimes, although it sounds OK, after a phase correction it sounds even better and reveals something that was hidden, like low frequencies or a better image.
Post #4, 7th June 2006, by paterno

Quote:
Originally Posted by Dog_Chao_Chao
Hi.
Whenever I track with more than one mic, I check phase first. Sometimes, even with one polarity inverted, I can't get perfect phase alignment between the two mics. But I don't mind that much, because I know that later I can shift one track to match the phases.
You will never get 'perfect phase alignment'. There will always be some area of the frequency spectrum that does not line up 'perfectly'. Shifting the track will align the phase in some areas and 'unalign' it in others, because the phase offset a given time delay produces varies with frequency. Check this out on the Little Labs website: www.littlelabs.com/ibp.html Scroll down, download the manual, and read through it. It gives a nice explanation of what 'phase' is.
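The frequency dependence described here can be put in numbers: a fixed time delay Δt shifts the phase of a component at frequency f by 360 · f · Δt degrees, so a single nudge rotates every frequency by a different amount. A minimal sketch:

```python
# Phase offset (degrees, modulo 360) that a fixed time delay imposes
# on a sine component at a given frequency.
def phase_offset_deg(freq_hz, delay_s):
    """Phase shift in degrees for frequency `freq_hz` delayed by `delay_s` seconds."""
    return (360.0 * freq_hz * delay_s) % 360.0

# One and the same 1 ms shift rotates 100 Hz by 36 degrees,
# but rotates 1 kHz by a full cycle (i.e. back to 0).
print(round(phase_offset_deg(100.0, 0.001), 3))   # -> 36.0
print(round(phase_offset_deg(1000.0, 0.001), 3))  # -> 0.0
```

This is why one time shift can bring one part of the spectrum into phase while pushing another part out.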

Quote:
Originally Posted by Dog_Chao_Chao
The problem now is that sometimes, after recording, I can't relate the two waveforms. Sometimes they are so different from each other that I can't shift and align them, or even tell where each cycle starts and ends.
Or, if I match the two waveforms at the beginning, they cancel each other in the middle, at the end, or somewhere along the track.

Is this normal? And if it is, what would you do? Should I leave phase alone if it sounds OK? Is there any way around this problem? Am I doing something wrong?
Another issue I'm not sure about is room mics. For example, consider one close mic and another ten feet away for room sound. If the waveforms of the two mics are perfectly aligned (by shifting them in the editor), aren't we destroying the time difference between the mics?
Yes, leave the track delay alone if it sounds OK. If it doesn't sound OK, think about moving the mics well before fixing it later in the computer; I think you will be much more satisfied with the results. It's not a 'problem' per se, it's the physics of the situation. For room mics, isn't the whole point to have a bit of delay? Don't be afraid to move those around as well for the best sound in relation to the close mic.

Adeus,
john
Post #5, 7th June 2006, by Schmacko

Have you tried TriTone Digital's free Phasetone plug-in? Its utility is both creative and corrective. Me likey!
Post #6, 7th June 2006, by Dog_Chao_Chao

Quote:
Originally Posted by Schmacko
Have you tried TriTone Digital's free Phasetone plug-in? Its utility is both creative and corrective. Me likey!
Does it work with Pro Tools LE?
Post #7, 7th June 2006, by Dog_Chao_Chao

Quote:
Originally Posted by paterno
You will never get 'perfect phase alignment'. There will always be some area of the frequency spectrum that does not line up 'perfectly'. Shifting the track will align the phase in some areas and 'unalign' it in others, because the phase offset a given time delay produces varies with frequency. Check this out on the Little Labs website: www.littlelabs.com/ibp.html Scroll down, download the manual, and read through it. It gives a nice explanation of what 'phase' is.



yes, leave the track delay alone if it sounds OK. If it doesn't sound OK, you should think about moving the mics way before fixing it later in the computer. I think you will be much more satisfied with the results. It's not a 'problem' per se, it's the physics of the situation. For room mics, isn't the whole point to have a bit of delay? Don't be afraid to move those around either for the best sound in relation to the close mic.

Adeus,
john

Thanks for the reply. I was curious whether I was doing something wrong.
As for moving the room mics around, that's a bit of complex work when the drummer is hitting the drums in the same room, when I am alone in the studio, etc.
Anyway, thanks.

About that 3:1 rule, tell me:

If you are tracking an electric guitar cabinet and the close mic is almost touching the grille of the amp, at what distance would you place the room mic?
Post #8, 7th June 2006, by tonymite

Quote:
Originally Posted by Schmacko
Have you tried TriTone Digital's free Phasetone plug-in? Its utility is both creative and corrective. Me likey!

Yes
Post #9, 7th June 2006, by paterno

Quote:
Originally Posted by Dog_Chao_Chao
Thanks for the reply. I was curious whether I was doing something wrong.
As for moving the room mics around, that's a bit of complex work when the drummer is hitting the drums in the same room, when I am alone in the studio, etc.
Anyway, thanks.
Maybe you could have him play a bit (16 or 32 bars), listen, make adjustments, and try it again until you are happy.

Quote:
Originally Posted by Dog_Chao_Chao
About that 3:1 rule, tell me:

If you are tracking an electric guitar cabinet and the close mic is almost touching the grille of the amp, at what distance would you place the room mic?
Personally, I'd put the second mic right next to the first mic and make sure the capsules are the same distance from the grille. If you need a room mic on the guitar, it really depends on how loud they are playing, the type of song, and the type of room you are in. I guess my point is that you should experiment and see what gives you the best results. It may be time consuming, but that is part of the learning, and part of the fun...

Cheers,
john
Post #10, 7th June 2006, by Dog_Chao_Chao

Quote:
Originally Posted by paterno
I guess my point is that you should experiment and see what gives you the best results. It may be time consuming, but that is part of the learning, and part of the fun...

Cheers,
john
This is what I do most of the time; that's why I like to come here and ask people about things I have doubts about. By the way, I have already placed that room mic all over the space. I was only asking because I can't see where the 3:1 rule applies in this case: if the distance from the source is almost zero, how can you figure out where to put the next mic? Also, I don't like to place two close mics next to each other.
Thanks
Post #11, 7th June 2006, by Schmacko

Quote:
Originally Posted by Dog_Chao_Chao
Does it work with protoolsLe?
Yup, you just have to install the Pluggo runtime environment, which will wrap the plugs as RTAS, VST, and AU.
Post #12, 7th June 2006, by paterno

Quote:
Originally Posted by Dog_Chao_Chao
I was only asking because I can't see where the 3:1 rule applies in this case: if the distance from the source is almost zero, how can you figure out where to put the next mic? Also, I don't like to place two close mics next to each other.
Thanks
Well, then, what happens when you put the room mic less than three feet from the speaker/close mic? Does it sound phasey? What happens if you put the mic two feet from the source and point it out into the room instead of at the source? You should be able to draw some conclusions by 'bending' the 3:1 rule a little.

Why don't you like to place two close mics next to each other? Especially if they are different mics, it can be a very good trick to use...

John
Post #13, 7th June 2006, by Jason Poulin

Adjusting the time on one of the tracks is not phase adjustment.


You need an IBP for this... there's a lot more to it than just shifting one track. You'll keep beating yourself over the head this way!


Jason
Post #14, 7th June 2006, by SK1

Quote:
Originally Posted by paterno
You will never get 'perfect phase alignment'. There will always be some area of the frequency spectrum that does not line up 'perfectly'. Shifting the track will align the phase in some areas and 'unalign' it in others, because the phase offset a given time delay produces varies with frequency. Check this out on the Little Labs website: www.littlelabs.com/ibp.html Scroll down, download the manual, and read through it. It gives a nice explanation of what 'phase' is.

Adeus,
john
That is great .... I've been wondering about this as well.

Post #15, 7th June 2006, by SK1

Quote:
Originally Posted by Dog_Chao_Chao
Thanks for the reply. I was curious whether I was doing something wrong.
As for moving the room mics around, that's a bit of complex work when the drummer is hitting the drums in the same room, when I am alone in the studio, etc.
Anyway, thanks.
I've had luck finding the sweet spot for the second/room mic on electric guitar with a good set of closed headphones with better isolation, like the Beyerdynamic 150s or such ... much easier for me than going back and forth to the control room.

Good thread for me too ... Thanks !

Good Luck
Post #16, 8th June 2006

Quote:
Originally Posted by Dog_Chao_Chao
... I was only asking because I can't see where the 3:1 rule applies in this case: if the distance from the source is almost zero, how can you figure out where to put the next mic? Also, I don't like to place two close mics next to each other.
Thanks
That's exactly right. I love that angle of this. What good is a rule that says if one mic is at 1", everything from 3" on out is good to go?!
3:1 is useless for close-mic/far-mic situations. A ratio can't tell you which frequencies will be in or out of phase (that is a factor of the time difference). 3:1 is about hiding phase effects through attenuation. As soon as you mix the two mics back to similar levels (i.e. make up for the attenuation), the cancellation comes back; it would be akin to the depth control on a phase-shifter box.
The good news is that as the far mic gets farther into the room, the reflections tend to average out the direct phase paths and fill in the deeper peaks and holes.
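The attenuation the 3:1 rule leans on is just the inverse-square law: tripling the distance drops a point source's level by 20 · log10(3), roughly 9.5 dB, which is what keeps the comb filtering quiet until you push the far mic back up in the mix. A quick check:

```python
import math

# Level drop (dB) of a point source when the mic distance changes,
# per the inverse-square law: 20 * log10(d_far / d_near).
def level_drop_db(d_near, d_far):
    """Attenuation in dB between a mic at `d_near` and one at `d_far`."""
    return 20.0 * math.log10(d_far / d_near)

# The 3:1 rule: a mic at 3x the distance picks up the source ~9.5 dB lower.
print(round(level_drop_db(1.0, 3.0), 1))  # -> 9.5
```

Note this says nothing about which frequencies cancel, only how much quieter the interfering copy is, which is exactly the point the post makes.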

Post #17, 8th June 2006

Very complex subject. With drums, time alignment is very complex: the distances between heads, and even different parts of a head if it's out of tune, will have different phase relationships between any two mics. Time alignment can create a mess. Sometimes delaying room mics will actually null out some phase cancellation (longer delays = less cancellation) and can make the room sound bigger. Also, DI + amp signals won't align perfectly because of the non-linear phase response of speakers. You will always have to use your ears to judge what sounds best; you can't always trust waveforms. The Little Labs box rules. It's like an EQ that does stuff other EQs can't fix.
Post #18, 8th June 2006, by Dog_Chao_Chao

Quote:
Originally Posted by Schmacko
Yup, you just have to install the Pluggo runtime environment, which will wrap the plugs as RTAS, VST, and AU.
Didn't work, don't know why. I'll have to try it again. Everything went OK, but the plug-in didn't show up in the list...
Post #19, 8th June 2006, by Dog_Chao_Chao

Quote:
Originally Posted by paterno
Well, then, what happens when you put the room mic less than three feet from the speaker/close mic? Does it sound phasey? What happens if you put the mic two feet from the source and point it out into the room instead of at the source? You should be able to draw some conclusions by 'bending' the 3:1 rule a little.

Why don't you like to place two close mics next to each other? Especially if they are different mics, it can be a very good trick to use...

John
Thanks for your reply. To tell the truth, I'm using fewer room mics these days. It's simpler to mix that way, and my room mics don't sound that good. Although with some sources, room mics are a must for me.

About the two close mics next to each other, I found two problems with that:

1. I don't like the way they blend. I used to do this with a 57 (or a 421) and a 4047, and I ended up choosing one. It's a bit of wasted time and disk space!
2. If they are very close to each other, each interferes with the other's sound, and if they're not in the same position (say one at 2 o'clock and the other at 8 o'clock) I get some phase issues. But maybe I should try it again; I can't remember the last time I did.
Post #20, 8th June 2006, by Dog_Chao_Chao

Quote:
Originally Posted by SK1
I've had luck finding the sweet spot for the second/room mic on electric guitar with a good set of closed headphones with better isolation, like the Beyerdynamic 150s or such ... much easier for me than going back and forth to the control room.

Good thread for me too ... Thanks !

Good Luck
Thanks. Yes, I try to place the room mic while wearing headphones, but I always hear too much sound from the source, especially with drums and loud guitar amps.
Yesterday I was tracking drums, and the drummer told me he was hearing too much of the drums from the room and not enough from the headphones. So I tried it myself and I was impressed: it sounded horrible! I don't know how the hell the guy could play like that. Maybe I should get better closed headphones, although I thought mine were OK (AT and Sony).
Post #21, 8th June 2006, by softwareguy

Hmmmmmmm,

Every time one of these threads comes along, things seem to get pretty scrambled. The fact is that both time-aligning in a DAW and phase-aligning via the IBP are legitimate ways to accomplish essentially the same task. Because they accomplish it differently, you will get better or worse results with one or the other depending on the circumstance, so they are both worth trying, and both can get you there.

I just want to make a few points:

1. There is only one sound wave traveling from your source, not high and low frequencies traveling at different speeds. Look in your DAW. One very complex wave. One diaphragm in your mic creating one electrical signal.

2. When aligning two microphones there are two general issues. The first is a time element that must be dealt with caused by the distance between the two mics and the time it takes to travel between them. You hit a snare and the wave that is created reaches the closer mic first. This distance causes misalignment of the initial transients, and can also cause misalignment of the phase of the sound wave, resulting in phase cancellations and a bad sound. Secondly, there are also other elements that for convenience I will call "outside influences," such as the differences between the two microphones, room influence, the action of air friction reducing the high end of the signal, etc., that are not time based.

3. If there were no outside influences, time alignment would always result in PERFECT phase alignment. It would just be the same signal, captured a moment later in time. In this case, also, there would be no point in our bothering to put up the second microphone at all, because it would not be capturing new information.

4. The job of the engineer is to use mic placement, time alignment via the DAW, and/or phase alignment via an IBP to get the best possible clarity for the source signal, while bringing along the outside influences (e.g. room sound) that made putting up a second mic desirable.

5. Alignment of 2 mics is therefore a set of compromises. You will not get perfect phase alignment with ANY tool, because the outside influences are causing the two signals to be different, and therefore not perfectly alignable. In the end, you must use your ears. If it sounds good, you have accomplished your task.

6. The IBP manual, if you read it carefully, does not say that time alignment is frequency dependent. In fact it says that phase alignment is frequency dependent when accomplished via all pass filters such as those used by the IBP, and that higher frequencies require more phase shift to sound natural. Hence the use of 2 all pass filters in the IBP.

7. The IBP Manual also says, and this is where it gets really interesting, that two sources can be phase aligned but NOT time aligned. The job of the IBP is to accomplish phase alignment by bringing the peak of one wave together with the closest peak of the second wave. As mics get further apart, these two peaks will not be the SAME peak. You will not be time aligning, but only phase aligning. The transients will not be aligned, only the peaks and troughs of the two waves. As a result, phase aligning will sound different than time aligning, and the further apart the microphones are, the more this will be true. So if you have a distant room mic up on your drum kit, it is an artistic decision whether you want to time align (and have cleaner transients) or phase align (and have a more conventional distant room sound without phase cancellation of the low end of the signal).
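The cancellations described in point 2 can be sketched numerically: summing a signal with an equal-level copy delayed by Δt produces comb-filter nulls at every frequency where the delay equals an odd number of half-cycles, i.e. f = (2k + 1) / (2 · Δt). A small sketch, assuming an idealized equal-level sum:

```python
# Comb-filter notch frequencies from summing a signal with a copy of itself
# delayed by `delay_s` seconds at equal level: nulls at f = (2k + 1) / (2 * delay_s).
def notch_freqs_hz(delay_s, count=3):
    """First `count` cancellation frequencies for an equal-level delayed sum."""
    return [(2 * k + 1) / (2.0 * delay_s) for k in range(count)]

# A 1 ms inter-mic delay puts the first nulls at 500, 1500, and 2500 Hz.
print(notch_freqs_hz(0.001))  # -> [500.0, 1500.0, 2500.0]
```

Shifting the delay moves all the notches at once, which is why nudging a track can trade one set of cancellations for another rather than remove them.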

In the end, if it sounds right, it is right.

Post #22, 8th June 2006

Quote:
Originally Posted by softwareguy
Hmmmmmmm,

Every time one of these threads comes along, things seem to get pretty scrambled. I just want to make a few points:
Yes, and although I agree with your philosophy and most of your post is correct, you seem to be doing a bit of scrambling here yourself.

Quote:
Originally Posted by softwareguy
1. There is only one sound wave traveling from your source, not high and low frequencies traveling at different speeds. Look in your DAW. One very complex wave. One diaphragm in your mic creating one electrical signal.
Yes, all frequencies are travelling at the same speed.


Quote:
Originally Posted by softwareguy
3. If there were no outside influences, time alignment would always result in PERFECT phase alignment.
This is where you go wrong. Even without the 'outside' influences you mention, you won't get perfect phase alignment via time alignment, except for a sound wave with a single frequency. The phase relationships are frequency dependent, so with a complex waveform, every sample you nudge in the DAW changes the phase relationship by a different amount for every frequency.
Post #23, 8th June 2006, by feyshay


When you nudge the time a bit, you're nudging by milliseconds. Is that really fine enough to avoid phase problems? I haven't gone through the calculations, but I would think not. I tried to fix phase when using external digital effects in parallel (e.g. parallel compression). The only fix was in Cubase SX3, when someone told me to ping the external effect. When you do this, you'll see the compensation print out as, say, 56.73 ms. When I adjusted by whole milliseconds I could never fix the phase issues; the control has to be finer.
Am I correct or incorrect with this statement?
BTW, I still haven't figured out how to use TriTone's phase program effectively. I understand the millisecond-delay part, but what is the purpose of the vertical sliders (frequency and mix amount)? I guess I should read the manual.
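On the resolution question: sample-level nudging is far finer than whole-millisecond steps. At 44.1 kHz there are 44.1 samples in a millisecond, so one sample is about 0.023 ms, while a full millisecond is already 360 degrees of phase at 1 kHz. A quick check:

```python
# How fine is a one-sample nudge compared to a one-millisecond nudge?
def samples_per_ms(sample_rate_hz):
    """Number of samples in one millisecond at the given sample rate."""
    return sample_rate_hz / 1000.0

def ms_per_sample(sample_rate_hz):
    """Duration of a single sample in milliseconds."""
    return 1000.0 / sample_rate_hz

print(samples_per_ms(44100))           # -> 44.1
print(round(ms_per_sample(44100), 4))  # -> 0.0227
```

So a DAW that nudges by samples gives roughly 44 times finer control than millisecond steps at 44.1 kHz, which matches the observation that whole-millisecond adjustment was too coarse.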
Post #24, 8th June 2006

Quote:
Originally Posted by feyshay
When you nudge the time a bit, you're nudging by milliseconds. Is that really fine enough to avoid phase problems? I haven't gone through the calculations, but I would think not. I tried to fix phase when using external digital effects in parallel (e.g. parallel compression). The only fix was in Cubase SX3, when someone told me to ping the external effect. When you do this, you'll see the compensation print out as, say, 56.73 ms. When I adjusted by whole milliseconds I could never fix the phase issues; the control has to be finer.
Am I correct or incorrect with this statement? ...
I can zoom in to the sample level (Sonar). I'm not sure it will drag by that increment, but it does go very fine, sub-millisecond.
Just as an aside, consider the intentional phase effects of ORTF. The strongest, tightest version may not be the one of choice here anyway. A little slop might be just the ticket. heh
Post #25, 8th June 2006, by softwareguy


Quote:
Even without the 'outside' influences you mention, you won't get perfect phase alignment via time alignment, except for a sound wave with a single frequency. The phase relationships are frequency dependent, so with a complex waveform, every sample you nudge in the DAW changes the phase relationship by a different amount for every frequency.
I'm not sure we're that far apart on this.

If two identical, complex sound waves, separated in time, are nudged into time alignment, there will be different parts of the waves that will reinforce and cancel each other as the two waves come closer and closer to absolute time alignment. This is audible as the canceling and strengthening of different frequencies. Ultimately, however, if the two waves are identical, they are identical. If you line them up, they line up and are absolutely in phase. There is nothing in principle that forbids this. It is the DIFFERENCES between the two waves (which is why we bothered to record both of them in the first place) that make absolute time alignment impossible in practical recording situations. It also makes absolute phase alignment impossible with the IBP. Any two waves that are not identical will have parts that reinforce one another and parts that cancel each other, however you try to align them. Any track alignment requires a set of compromises, and requires that you use your ears to decide what method of alignment is best for that situation (if any), and exactly where that alignment sounds the best.
Post #26, 8th June 2006, by Doublehelix


Here goes (again):

First off, I am not an expert here, so there are some things I over-simplify due to my lack of understanding. I am an engineer just like most of us around here, and the following is compiled from many years of reading, working with sound, and some basic working knowledge of the physics of sound.

OK... on with the explanation...

Each frequency vibrates at a different rate (hence the term "frequency"). If you imagine a perfect sine wave at a specific frequency, it alternates above and below the center line (positive and negative) in a regular, repeating fashion. For two mics to receive this sine wave "in perfect phase", both need to be at distances from the source such that each mic "hears" the wave at the exact same point in its cycle relative to the center line.

This does not mean that they have to be at the exact same *distance* from the source; they just need to be receiving the signal at the same point in its cycle.

This is obviously almost impossible to do.

You now know why coincident miking techniques were developed to reduce phasing issues. Not perfect, but closer than anything else we can do by eye.

If you were recording only a single perfect sine wave, then yes, time-aligning and phase-aligning are the *exact* same thing.

Unfortunately, real-life sounds are not perfect sine waves of a single frequency, but are very complex series of sounds made up of many, many frequencies (fundamentals and harmonics of every nature and sort).

With something like a drum kit, this is only compounded many fold because not only are drums producing complex waveforms of many frequencies, but they are coming from multiple sources all at the same time.

Since all of these frequencies cycle at different rates, each one arrives at the various microphones at a different point in its cycle. Trying to find a position where each microphone receives every frequency at the same point in its cycle is *IMPOSSIBLE*. Period.

Our goal as engineers is to find the best compromise possible when using multiple mics on a single source, or, in the case of drums, multiple sources. It is always a bit of a trade-off, but in the end, we need to use our ears.

And don't even get me started on reflections coming in from walls, floors and ceilings! This is such a complex subject, that whole books could be written about it!

Time-aligning in a DAW does not phase-align!!! It just time-aligns a loud transient, not the individual frequencies.

You can time-align, and still get a smeary, phasy sound.

I am not saying it is bad to time-align; in fact, I do it quite often. But you need to use your ears, not your eyes, to get it sounding its best.

This is where the IBP comes in.

Does it *perfectly* align phase? Heck no! That would be a nearly impossible task, at least with today's technology, and if it were possible to get close, the cost of the device would be beyond imagination! (and I can imagine a *lot*!!!)

If you go to the Little Labs web page on the IBP, you will see a pretty good explanation of what is going on. Also, be sure to scroll down to the bottom of the page and read the PDF manual for the IBP for a more detailed explanation.

As I said, this is such a highly complex subject and there is no perfect answer.

For me, the solution is to use a combination of my ears and the IBP, with a bit of time-aligning thrown in for good measure.

When I time-align drums, I usually start with the OHs, since they are the furthest away other than the ambience mics, which I leave as-is... they are ambience, after all!
Post #27, 8th June 2006


Sure, but the waves won't be identical. Consider a simple case where there is just a time delay between two mics on one source, and ignore every other influence besides the time delay. There will still be phase differences due to the time delay alone; they are not just the same identical wave captured a moment later in time, as you suggest. For an ideal case like this, the phase relationships can be easily calculated from the distance between the source and each mic, but there is more going on than just a time delay. A simple delay or DAW nudge can align them in time, but not in phase. For that you need an all-pass filter, to effectively delay each frequency by a different amount.
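For the curious, the kind of all-pass filter mentioned here is easy to sketch digitally. The snippet below is the textbook first-order all-pass difference equation y[n] = a*x[n] + x[n-1] - a*y[n-1], not the IBP's actual circuit; the coefficient 0.5 is an arbitrary illustration. Its magnitude response is exactly 1 at every frequency, while its phase shift varies with frequency:

```python
import math

# First-order digital all-pass: y[n] = a*x[n] + x[n-1] - a*y[n-1].
# Passes every frequency at unity gain; only the phase is shifted,
# by a frequency-dependent amount set by the coefficient `a`.
def allpass(samples, a=0.5):
    y, x_prev, y_prev = [], 0.0, 0.0
    for x in samples:
        out = a * x + x_prev - a * y_prev
        y.append(out)
        x_prev, y_prev = x, out
    return y

# Feed in one second of a 100 Hz sine at an 8 kHz sample rate: after the
# brief transient, the output has the same amplitude as the input, just
# shifted in phase.
sr, f = 8000, 100.0
x = [math.sin(2 * math.pi * f * n / sr) for n in range(sr)]
y = allpass(x)
print(round(max(y[1000:]), 2))  # -> 1.0 (amplitude preserved)
```

Cascading sections like this, with adjustable coefficients, is the general idea behind phase-adjustment boxes: dial in phase rotation without changing level.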
Post #28, 8th June 2006, by softwareguy


OK, help me out here.

If you

1. ignore every influence except time delay (room, friction, eq, mic, etc.)

2. acknowledge that there is a single wave form that is being generated by the source

3. acknowledge that the whole sound travels at a consistent velocity that is not frequency dependent

I'm not sure where your time-generated, frequency-dependent phase shift is going to come from.

Am I missing something?
Post #29, 8th June 2006


Yes, I believe you are missing something, and this is the crux of a lot of the general confusion. Take a simple waveform that is a combination of two sine waves, 100 Hz and 50 Hz. Now imagine positioning the mics so that the mic farther from the source picks up a signal delayed by 0.005 seconds (half the period of the 100 Hz wave). The 100 Hz portion will be out of phase by 180 degrees, but the 50 Hz portion will be out of phase by only 90 degrees. If you time-align these so that either frequency is in phase, the other is out of phase by 90 degrees.
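Following up on that worked example: when two equal unit-level copies are summed with a phase offset of φ degrees, the resulting amplitude is 2·cos(φ/2). So in the 5 ms case above, the 100 Hz parts cancel outright while the 50 Hz parts, 90 degrees apart, still lose about 3 dB relative to a fully coherent sum. A quick check:

```python
import math

# Amplitude of two equal unit-level tones summed with a phase offset:
# |1 + e^(j*phi)| = 2 * cos(phi / 2).
def summed_amplitude(phase_deg):
    """Resultant amplitude of two unit sines offset by `phase_deg` degrees."""
    return 2.0 * math.cos(math.radians(phase_deg) / 2.0)

# 180 degrees apart: total cancellation. 90 degrees apart: ~1.41x
# instead of the coherent 2x (about 3 dB down).
print(round(summed_amplitude(180.0), 2))  # -> 0.0
print(round(summed_amplitude(90.0), 2))   # -> 1.41
```

This is why aligning one frequency band in a two-mic sum can still leave another band audibly attenuated.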
Post #30, 8th June 2006


Quote:
Originally Posted by softwareguy
OK, help me out here.

If you

1. ignore every influence except time delay (room, friction, eq, mic, etc.)

2. acknowledge that there is a single wave form that is being generated by the source

3. acknowledge that the whole sound travels at a consistent velocity that is not frequency dependent

I'm not sure where your time-generated, frequency-dependent phase shift is going to come from.

Am I missing something?
I'm also wondering, and here is an idea (might be stupid, but...): a complex waveform (think of a piano chord) is made from many waves, each with its own frequency, so they have different wavelengths. Five feet away, the high component of the sound might be in its tenth cycle while the low one has not yet finished its first. So there you go: a different phase relation only five feet away...