You don't actually 'need' a large source of water for a nuclear plant. It's just substantially cheaper on the cooling systems. Build the cooling systems heavier and you don't need to heat/evaporate water in order to dispose of the heat. This can work even in the desert, and as a bonus they'd be a couple percent more efficient.
I was debating this issue on a forum called 'spacebattles'. Somebody posted a link to a French nuclear reactor under construction: $11B with cost over-runs (it's something of a prototype).
Some financial notes:
The nuclear reactor is rated for 4 times the power of the Ivanpah solar plant at 5 times the cost.
The solar plant has an estimated capacity factor of 31 percent, which means you multiply the nameplate capacity by 365*24*31% to figure out the amount of energy it'll produce in an average year. Nuclear plants average roughly 90% - so on a kWh-per-year comparison, you need about 3 watts of solar to match 1 watt of nuclear.
The solar plant should produce about 1 TWh per year (per the wiki article). The nuclear plant? About 13 TWh. 13 times the energy at 5 times the cost? Pretty good deal.
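That comparison is quick to reproduce. A minimal sketch; the nameplate figures (392 MW for Ivanpah, 1650 MW for the French EPR) are my assumptions from public summaries, not numbers from this post:

```python
HOURS_PER_YEAR = 365 * 24  # 8760

def annual_twh(nameplate_mw, capacity_factor):
    """Average output times hours in a year, converted MWh -> TWh."""
    return nameplate_mw * capacity_factor * HOURS_PER_YEAR / 1e6

solar = annual_twh(392, 0.31)     # ~1.06 TWh/year
nuclear = annual_twh(1650, 0.90)  # ~13.0 TWh/year

# Capacity-factor ratio: watts of solar nameplate per watt of nuclear
ratio = 0.90 / 0.31               # ~2.9, i.e. "roughly 3"
```

The 13:1 energy ratio falls out of the 4:1 power rating times the roughly 3:1 capacity-factor gap.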
Ivanpah doesn't even include any power storage, so it can't keep producing power for a significant period after the sun goes down, gets occluded, etc...
I'll note that this doesn't necessarily mean that Ivanpah won't be useful - power usage in the south does trend higher when the sun is out due to AC systems and such. So if you look at it as a sort of peaker, it's not necessarily bad. In any case now that it's built you might as well use it.
On the fancier, more advanced fronts: Conventional nuclear plants are about 30% efficient at turning heat into electricity. This means that a 1GWe nuclear plant is actually a ~3GWt one. Unless you turn it into a co-generation plant where you use the heat for something (which would mean siting industry/facilities closer to the plant than current safety concerns allow), you need to dispose of ~2GWt, and that's the expensive part, water-wise.
The efficiency of turning thermal energy into electrical energy is limited by the Carnot cycle. Conventional water-cooled designs simply can't get that hot, due to the pressure necessary to keep the water liquid. A PWR/BWR runs at around 315C, which translates to 49% efficiency - but that's the theoretical max; 70% of it is more real-world, so 34% efficient. Go to molten salt or a very-high-temperature reactor and you get temperatures from 700 to 1000C.
700C gives you a max of 69%, 48% real world.
1000C would be 76% and 53%. At this point, lowering the temperature of your heat sink (assuming 27C here) would have more effect.
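The figures above are straightforward to check. A minimal sketch, assuming a 27C heat sink and the 70%-of-ideal rule of thumb used in this post:

```python
def carnot(t_hot_c, t_cold_c=27.0):
    """Ideal Carnot efficiency given hot/cold reservoir temps in Celsius."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

for t in (315, 700, 1000):
    ideal = carnot(t)
    real = 0.7 * ideal  # rough real-world derating
    # Reproduces the 49/34, 69/48, and 76/53 percent pairs above
    print(f"{t}C: ideal {ideal:.0%}, ~real-world {real:.0%}")
```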
What's the big deal? Going from 30% to 50% efficiency means that you no longer need a ~3GWt plant to produce 1GWe - a 2GWt plant does it. 1GWt to dispose of is half what you needed to get rid of before, the plant can be smaller, etc...
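Same arithmetic for the plant sizing; note the 30% figures here are rounded (1/0.30 is really ~3.3GWt with ~2.3GWt of waste heat):

```python
def thermal_gwt(electric_gwe, efficiency):
    """Thermal power needed to deliver a given electric output."""
    return electric_gwe / efficiency

for eff in (0.30, 0.50):
    gwt = thermal_gwt(1.0, eff)
    waste = gwt - 1.0  # everything that isn't electricity must be dumped
    print(f"{eff:.0%} efficient: {gwt:.2f} GWt reactor, {waste:.2f} GWt waste heat")
```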
Another positive of getting away from water as a coolant is that you can switch to materials that don't need to be pressurized to stay in the desired state. That means you don't need an incredibly strong pressure vessel that's at risk of exploding when something goes wrong. Hell, on the high end the reactor core is essentially pre-melted. The higher design temperatures also mean that the reactor vessel is a very good radiative body - passive cooling to prevent breach is easy.
Another crazy design I've heard of is building the reactor into a barge and operating it semi-submerged as normal. Only works around coasts though.