An FPGA open-source miner has just been released, running at 80 MH/s but at a cost of $585. The efficiency analysis below is quoted from a post in the thread.
At 80 MH/s, I will need at least 3 of these to achieve a single 5830's hash rate.
That is $595 x 3 = $1785 at full price, vs. $190 for the 5830.
Given the 5830 is consuming $11 a month in electricity, and assuming this board will consume zero electricity, it will take more than 145 months, or about 12 years, to recover the investment, always comparing to a 5830.
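If you want to rerun that comparison with your own numbers, here is a minimal sketch of the payback arithmetic. The figures are simply the ones quoted above (board price, GPU price, monthly electricity cost) and can be swapped out:

```python
# Rough payback estimate: how long until the FPGA boards' purchase premium
# is recovered through electricity savings, assuming the boards draw ~zero power.
# Figures are the ones quoted in the post above; adjust to taste.

fpga_board_price = 595.0      # USD per FPGA board (full price)
boards_needed = 3             # boards required to match one 5830's hash rate
gpu_price = 190.0             # USD for a Radeon 5830
gpu_power_cost = 11.0         # USD per month in electricity for the 5830

extra_cost = fpga_board_price * boards_needed - gpu_price
months_to_break_even = extra_cost / gpu_power_cost

print(f"Extra up-front cost: ${extra_cost:.0f}")
print(f"Months to break even: {months_to_break_even:.0f} (~{months_to_break_even / 12:.1f} years)")
```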
BUT:
In this thread, someone mentioned he is getting 210 Mhash/s after some optimization, but that he will cease publicly posting his development.
Apologies, but no more development information will be posted. I've been offered a 25% share by someone who owns 2 FPGA clusters. If you haven't seen that type of hardware before, think 156 FPGAs per machine.
From those posts, what we can understand is that the factors shaping FPGA mining right now are high procurement cost, low running cost, and ease of scalability. What this means is that, with the total hash rate of the network increasing (30 Ghash/day as of the last difficulty adjustment), the question becomes: when will difficulty render GPUs inefficient relative to their running cost?
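To make that question concrete, here is a hedged sketch of the break-even condition: a GPU stops being worth running once its expected daily reward, which shrinks as the total network hash rate grows, falls below its daily electricity bill. Every input figure here (GPU hash rate, BTC price, power draw, electricity rate) is an illustrative assumption, not a value from the thread:

```python
# When does a GPU miner's expected daily revenue fall below its daily electricity cost?
# All numbers below are illustrative placeholders, not figures from the thread.

MINER_HASHRATE = 300e6        # hashes/sec for one GPU (assumed ~300 MH/s)
BLOCK_REWARD = 50.0           # BTC per block (2011-era subsidy)
BLOCKS_PER_DAY = 144          # roughly one block every 10 minutes
BTC_PRICE = 10.0              # USD per BTC (assumption)
GPU_POWER_W = 175.0           # watts drawn while mining (assumption)
ELECTRICITY_RATE = 0.10       # USD per kWh (assumption)

def daily_revenue(network_hashrate: float) -> float:
    """Expected USD/day for this miner at a given total network hash rate."""
    share = MINER_HASHRATE / network_hashrate
    return share * BLOCKS_PER_DAY * BLOCK_REWARD * BTC_PRICE

def daily_power_cost() -> float:
    """USD/day spent on electricity to keep the GPU running."""
    return GPU_POWER_W / 1000.0 * 24.0 * ELECTRICITY_RATE

# Break-even network hash rate: solve daily_revenue(N) == daily_power_cost() for N.
break_even = MINER_HASHRATE * BLOCKS_PER_DAY * BLOCK_REWARD * BTC_PRICE / daily_power_cost()
print(f"GPU mining stops covering power above ~{break_even / 1e12:.1f} TH/s network hash rate")
```

The point of the sketch is only that the threshold scales directly with BTC price and inversely with electricity cost; plug in your own numbers.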
Remember to take into account that FPGAs are usually run in clusters, and even though it would not be beneficial to buy one outright, those who already have access to FPGAs are the first movers and likely the eventual dominant force in the mining market.
Of course, in the end, ASIC is where it's at. Anyone? =D
Edit: read more stuff, added info.