Please explain. Is it a problem of electricity costs, where running a system long enough to mine these things has you paying more in electric bills than the value of the coins you mine? I have little understanding of the whole cryptocurrency thing.
With any cryptocurrency, it's not financially or energy efficient to mine on the CPU. You suck up 250 watts or more to produce minimal hashes.
Several years ago it made sense to mine BTC on GPUs, and special apps were written to do so with the video cards of that era. Nowadays, the problems you must solve to be rewarded with BTC are so difficult, and the competing hardware so specialized, that both CPU and GPU mining for BTC will still consume 500+ watts of juice and not produce enough BTC to even cover the electric costs.
But the ASIC miners from about a year or so back are USB based. A USB 2.0 port supplies at most 500 milliamps (about 2.5 watts) per device. They provide ~300 megahash/sec each, and can be clustered to provide gigahashes if you have enough USB connectivity.
As a stand-alone rig, they don't make much sense. BTC nowadays requires dozens of gigahashes to be lucrative. Lee N. Field has a Butterfly Labs solution that produces (if I recall correctly) 6 Ghash/sec, and it's barely worth the power to leave on. That's because you have to power the computer that then powers the Butterfly Labs device that connects to it via USB.
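To put numbers on "barely worth the power," here's a quick Python sketch of the break-even math. The wattage and electric rate are just assumptions for illustration, not anyone's actual bill:

```python
# Back-of-the-envelope mining economics. All inputs are assumptions;
# plug in your own wattage, electric rate, and expected coin revenue.
def daily_power_cost(watts, usd_per_kwh=0.12):
    """Dollars of electricity to run a rig 24 hours a day."""
    return watts / 1000.0 * 24 * usd_per_kwh

def is_profitable(watts, daily_coin_value_usd, usd_per_kwh=0.12):
    """True only if the coins mined per day are worth more than the power."""
    return daily_coin_value_usd > daily_power_cost(watts, usd_per_kwh)

# A 500 W CPU/GPU rig at an assumed $0.12/kWh burns about $1.44/day,
# so it loses money unless it mines more than that in coins daily.
print(daily_power_cost(500))
```

The same function shows why the USB sticks are a different story: a few watts of draw costs pennies per day.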
However, I'm primarily focused on mining LTC... not BTC.
LTC can still be mined with today's modern GPUs. My current GPUs (an Nvidia 8600 GT and a pair of Radeon 3870s) are too old to be financially viable for the power they suck. The Nvidia system will draw 400 watts running a GPU miner app and a CPU miner app combined, and provide about 20-25 khash total. In comparison, a good stand-alone LTC mining rig will provide at least 250 khash. There are miners in my pool that appear to generate over 2,000 khash from a single box by using Radeon Crossfire technology and multiple high-end GPUs, and some people building farms in their homes are generating 20,000 khash in my pool. They win the lion's share of the proceeds from successful mining on the LTC blockchain.
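A handy way to compare these rigs is hashes per watt. Here's a tiny sketch using the numbers above; note the wattage of the stand-alone rig is my assumption, since only its hashrate is given:

```python
# kH/s per watt for the setups described above. Hashrates are from the
# post; the stand-alone rig's wattage is an assumed figure.
rigs = {
    "old Nvidia + CPU combo": (25.0, 400.0),   # ~20-25 kH/s at ~400 W
    "decent stand-alone rig": (250.0, 600.0),  # 600 W is an assumption
}
for name, (khash, watts) in rigs.items():
    print(f"{name}: {khash / watts:.3f} kH/s per watt")
```

Even with a generous wattage guess for the newer rig, the old hardware is several times less efficient per watt, which is what kills its economics.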
I'm upgrading my GPU from the Nvidia, which is producing about 8 khash (the CPU is producing another 12-15 khash), to an AMD Radeon R9 270X. Benchmarks from other users indicate it should produce nearly 500 khash.
The Nvidia 8600 GT I have supports Nvidia's CUDA platform, which LTC developers have used to write mining apps optimized for Nvidia cards. The preferred framework, though, is OpenCL (not OpenGL). CUDA is Nvidia's proprietary GPU-compute platform, an Nvidia-only alternative to the cross-vendor OpenCL, and the 8600 is probably the earliest card Nvidia made that can run it. My Radeon 3XXX cards, despite being newer and faster for games, do not support OpenCL; the earliest generation of Radeon cards to do so is the 4XXX series. So I cannot effectively run the LTC mining app on the more powerful Radeon cards I currently have, and I'm running on the Nvidia card using CUDA.
The Radeon R9 270X that I ordered does support OpenCL, and is about 50x more powerful than my 8600GT when executing similar code.
I was considering upgrading my CPU from the 3-core to a more advanced 6-core chip, but the khash increase would only be from ~12 to ~40. Not enough reward for the $150+ investment. Instead, I will sit on the money and consider eventually replacing the mobo/CPU with a board like the one that recently failed, one with multiple PCIe slots, and fill it up with additional Radeon R9 cards and Crossfire them to exceed the megahash threshold.
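The "not enough reward" call can be framed as incremental khash per dollar. A quick sketch; the R9 270X price is an assumed figure, since I only quoted its benchmark hashrate:

```python
# Incremental kH/s per dollar for the two upgrade options discussed.
# The khash figures are from the post; the GPU price is an assumption.
def khash_per_dollar(khash_gain, cost_usd):
    return khash_gain / cost_usd

cpu_upgrade = khash_per_dollar(40 - 12, 150)   # 3-core -> 6-core CPU
gpu_upgrade = khash_per_dollar(500 - 8, 200)   # 8600 GT -> R9 270X, ~$200 assumed
# The GPU upgrade buys roughly an order of magnitude more hashing per dollar.
```

Under any plausible price for the card, the GPU upgrade dominates, which is why the CPU money stays in my pocket.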
So, while that is cranking away (in what appears to be an economically sound model at $25+ per LTC, and a loss I can tolerate if that turns out not to be true), I have the CPU at idle. People are selling their USB ASIC Bitcoin miners to raise money for the next wave of 480 Ghash interfaces, at $5000 each. The little USB ASIC sticks are selling for about $50 each and provide ~300 Mhash.
The computer is already on. Running 6 of these miners adds roughly 15 watts of power consumption, and they would provide almost 2 Ghash of processing on BTC. The results will be "dust" in the land of BTC... hundredths of a coin per interval... but how many of you think about the power consumption of a nightlight? And those hundredths of a coin each interval will stack up slowly over time.
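For anyone curious how small the "dust" is, here's the standard expected-value estimate: over the long run your share of block rewards equals your share of total network hashrate. The network hashrate and block reward below are illustrative assumptions, not measured values:

```python
# Expected BTC per day from a small rig, using the expected-value model:
# long-run reward share == your fraction of total network hashrate.
# Network hashrate and block reward are illustrative assumptions.
def expected_btc_per_day(my_hash_per_sec, network_hash_per_sec,
                         block_reward=25.0, blocks_per_day=144):
    return (my_hash_per_sec / network_hash_per_sec
            * block_reward * blocks_per_day)

# Six ~330 MH/s USB sticks (~2 GH/s) against an assumed 5 PH/s network:
dust = expected_btc_per_day(2e9, 5e15)
# On the order of a thousandth of a coin per day: dust, but it stacks.
```

Pool payouts smooth this out into the steady trickle of hundredths-of-a-coin intervals described above.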
Given the point of this machine is to mine LTC... it doesn't hurt to multitask.