From a thermo point of view, the extra condenser can be used in two different ways:
1. Maintain the same compressor work input, but subcool the working fluid below the saturated-liquid point
2. Reduce compressor workload by decreasing the peak pressure & temperature of the superheated fluid
Nick is talking about #2. #2 reduces the peak enthalpy of the fluid, so there is less heat available for rejection. This should reduce the overall efficiency of the system.
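To put rough numbers on the two options, here's a minimal sketch of the enthalpy bookkeeping around a simple vapor-compression cycle. All enthalpy values are made-up round numbers for illustration, not property-table data.

```python
# Illustrative enthalpies (kJ/kg) at the four state points of a simple
# vapor-compression cycle. Made-up round numbers, NOT real property data.
h1 = 400.0   # compressor inlet (saturated vapor leaving the evaporator)
h2 = 440.0   # compressor outlet (hot, high-pressure superheated vapor)
h3 = 250.0   # condenser outlet (saturated liquid)
h4 = h3      # expansion valve is isenthalpic, so h4 == h3

w_comp   = h2 - h1   # work the compressor puts into the fluid
q_reject = h2 - h3   # heat the condenser must dump
q_cool   = h1 - h4   # refrigerating effect in the evaporator

# Option 1: same compressor work, but the extra condenser subcools
# the liquid by an assumed 15 kJ/kg below saturation.
h3_sub = h3 - 15.0
q_cool_sub = h1 - h3_sub   # refrigerating effect grows by the subcooling

# Option 2: lower peak pressure/temperature, so the compressor-exit
# enthalpy h2 drops by an assumed 15 kJ/kg.
h2_low = h2 - 15.0
q_reject_low = h2_low - h3   # less heat available for rejection

print(f"baseline:   w = {w_comp} kJ/kg, q_reject = {q_reject} kJ/kg")
print(f"option 1:   q_cool goes {q_cool} -> {q_cool_sub} kJ/kg")
print(f"option 2:   q_reject goes {q_reject} -> {q_reject_low} kJ/kg")
```

The point of the sketch is just which lever each option pulls: option 1 lowers the liquid enthalpy and grows the refrigerating effect, while option 2 lowers the peak enthalpy and shrinks the heat the condenser has to dump.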
Let me try to explain in plain English...
Hot, high-pressure fluid = high-energy fluid.
The reason the fluid is hot is that the compressor has done work on it by pressurizing it. The hot fluid is now hotter than ambient, so it can dump that heat. The design of the condenser is slave to the design of the compressor: the condenser should already be sized correctly for any normal ambient temperature (not normal = 130F outside at 10,000ft altitude). If the compressor heats the fluid less, then there is less heat to reject to the atmosphere. In that case, the condenser doesn't need to be bigger; it could actually be smaller.
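The "smaller condenser" point falls out of the first law plus a rough sizing relation like Q = U·A·ΔT. Here's a sketch with assumed numbers (the loads, the overall heat-transfer coefficient U, and the mean ΔT are all illustrative, not figures for any real unit):

```python
# Rough condenser sizing from Q = U * A * dT.
# U and dT below are assumed illustrative values, not real design data.

def condenser_area(q_reject_kw, u_kw_per_m2k=0.05, delta_t_k=15.0):
    """Heat-transfer area needed to reject q_reject at a given U and mean dT."""
    return q_reject_kw / (u_kw_per_m2k * delta_t_k)

q_evap = 10.0        # cooling load picked up in the evaporator, kW (assumed)
w_comp_high = 4.0    # baseline compressor work, kW (assumed)
w_comp_low = 3.0     # compressor work after reducing peak pressure, kW (assumed)

# First law around the loop: everything that goes in must come back
# out at the condenser.
q_reject_high = q_evap + w_comp_high   # 14 kW to dump
q_reject_low = q_evap + w_comp_low     # 13 kW to dump

print(f"baseline condenser area:      {condenser_area(q_reject_high):.1f} m^2")
print(f"reduced-work condenser area:  {condenser_area(q_reject_low):.1f} m^2")
```

Less compressor work in means less heat out at the condenser, so the required area shrinks rather than grows.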
The engineering-for-efficiency goal is the reverse: heat the fluid to the maximum that materials and cost allow, then build a condenser big enough to dump that high-temperature, high-pressure energy to the atmosphere (or some colder reservoir). That achieves maximum efficiency.
Another way to really improve efficiency: have the condenser dump its heat into cold underground pipes, or into a swimming pool (two birds with one stone: heat the pool, cool the house). That way the temperature difference is larger, not smaller.
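The colder-reservoir effect shows up even in the ideal (Carnot) limit, COP = T_cold / (T_hot − T_cold) with temperatures in kelvin. The sink temperatures and the 10 K "approach" below are assumptions for illustration, since a real condenser has to run somewhat hotter than whatever it dumps into:

```python
# Ideal (Carnot) cooling COP: T_cold / (T_hot - T_cold), in kelvin.
# Sink temperatures and the 10 K approach are illustrative assumptions.

def carnot_cop(indoor_c, condensing_c):
    """Ideal cooling COP for a given indoor and condensing temperature (deg C)."""
    t_cold = indoor_c + 273.15
    t_hot = condensing_c + 273.15
    return t_cold / (t_hot - t_cold)

indoor = 20.0    # space being cooled, deg C (assumed)
approach = 10.0  # assumed condensing-temperature margin above the sink, K

for name, sink_c in [("40C summer air ", 40.0),
                     ("27C pool water ", 27.0),
                     ("13C ground loop", 13.0)]:
    print(f"{name}: ideal COP {carnot_cop(indoor, sink_c + approach):.1f}")
```

These are theoretical ceilings, and real systems only achieve a fraction of them, but the trend matches the point above: the colder the reservoir the condenser dumps into, the better the cycle can do.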