Too much capacitance shouldn't hurt the circuit, so just aim for a value that's bigger than you'll ever need.
Oversizing also helps with longevity; most capacitors (electrolytics especially) lose capacitance as they age.
Zahc - What's the voltage/amperage/maximum wattage of the circuit you're supplying? Also, are you using a half-wave rectifier, a full-wave one, or is this an industrial setting where you're using a three-phase one?
This requires some thought, but it gives you the necessary math. The key question is how much voltage drop/ripple is acceptable.
C = capacitor value, in farads
f = frequency of the DC peaks after the rectifier: 60 Hz for half-wave on US power, 120 Hz for full-wave
Xc = maximum allowed reactance of your capacitor, in ohms
C = .159/(f × Xc)   (.159 ≈ 1/(2π))
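The formula above is just the capacitive reactance equation Xc = 1/(2πfC) solved for C. A minimal sketch of it as a helper function (the function name and signature are my own, not from any library):

```python
import math

def smoothing_cap_farads(freq_hz: float, xc_max_ohms: float) -> float:
    """Capacitor value needed so its reactance at the ripple
    frequency stays at or below xc_max_ohms."""
    # Xc = 1/(2*pi*f*C)  =>  C = 1/(2*pi*f*Xc)
    return 1.0 / (2 * math.pi * freq_hz * xc_max_ohms)

# Full-wave rectified US mains ripples at 120 Hz
print(smoothing_cap_farads(120, 0.288))  # ~0.0046 F
```

Using 1/(2π) directly instead of the rounded .159 avoids a little rounding error.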
Let's say your rough load is 100 watts at 12 volts*. That's 8.33 A, and R = V/I gives 12/8.33 = 1.44 ohms.
Let's also say we can tolerate the voltage sagging to 10 V. Treating the load and the capacitor's reactance as a divider, the 1.44 Ω load takes 10/12 of the drop, leaving 2/12 for the capacitor, so Xc = 1.44 × (2/10) = 0.288 Ω.
Full-wave rectifier: C = .159/(120 × .288) = .0046 farads, about 4.6 millifarads or 4.6k µF. Anything at or above that works, like one of
these for $4.52
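Putting the whole worked example together, under the same assumptions (100 W at 12 V, 10 V minimum, full-wave rectification of 60 Hz mains):

```python
import math

# Hypothetical load from the example: 100 W at a nominal 12 V,
# allowed to sag to 10 V between ripple peaks.
power_w, v_nominal, v_min = 100.0, 12.0, 10.0

current_a = power_w / v_nominal               # 8.33 A
r_load = v_nominal / current_a                # 1.44 ohms
# Divider: the load takes 10/12 of the drop, the cap 2/12,
# so Xc may be at most (2/10) of the load resistance.
xc_max = r_load * (v_nominal - v_min) / v_min # 0.288 ohms

ripple_hz = 120                               # full-wave on 60 Hz mains
c_farads = 1.0 / (2 * math.pi * ripple_hz * xc_max)
print(f"{c_farads * 1e6:.0f} uF")             # ~4605 uF; round up to 4700 uF
```

Rounding up to the next standard value (4700 µF) is fine, since extra capacitance only smooths the output further.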
More load = bigger cap, smoother output required = bigger cap, etc...
*We're powering a big laptop computer or something.