Crypto-radiator update – the quest for 100% efficiency and beyond

So, a few little updates on the crypto-radiator (see previous post HERE)… I’ve had quite a few bits and bobs going on with cryptocurrencies in general recently, and partially inspired by Monero’s recent run to over $100, I had some ideas for how to improve the efficiency of the system.

So far it’s been mining on the CPU only, which outputs around 60 hashes/sec (H/s) – clearly not gonna be profitable on its own, but the more I’ve thought about this, the more it’s highlighted the importance of the “primary reward” as a factor.

By that I mean: if you’re mining crypto purely for profit then you have quite a tricky task, and you end up in quite a precarious position, since if your system stops being profitable (which can happen at any moment) you’re shit-outta-luck and potentially thousands of $$ in the hole.

However if your primary reward is *heat*, and you’re only using crypto mining as a way of recouping some of that cost, the situation looks a lot more favourable… since heating is something you’d have to pay for anyway, and if you can get it significantly cheaper then you’re winning, even if the return fluctuates a bit.

We started off with an electric oil radiator which was costing us an estimated £20/mo to run – so that’s been our baseline all along. The crypto-rad already looked more efficient out of the box: even though it ran 24/7, rather than only part of the time like the thermostat-controlled oil radiator, it still used less power. Add the mining on top of that and you start to recover some of the cost.

But what if we could improve on that? My initial estimate (based on somewhat shaky maths) put the recovered cost at around 40% (at the Monero/XMR prices of the time). The main reason the maths was shaky is that the crypto-rad wasn’t the only machine in my mining pool – the other machines didn’t run 24/7, so it was still the dominant miner, but it does mean the mining figures weren’t 100% accurate.

So now I’ve set the crypto-rad up with its own account in the pool – it’s the only machine mining on that account, and I’m monitoring the readings much more closely. I want to know *for sure* whether it’s viable or not.

 

The quest for more efficiency:

On my desktop PC I have the same mining setup: on the CPU alone it mines at around 60H/s, but on the GPU it manages around 130H/s.

That suggests to me that GPU mining could be significantly more efficient than CPU mining alone… no surprise there really, but do the numbers actually work?

So I bought an Nvidia GTX 750 Ti 2GB to test. It cost £90, so there’s some initial outlay to recover, but if it improves the efficiency enough it may be worth it.

Obviously the first thing to do is get some more accurate measurements with it running as it is.

 

Baseline readings:

So I ran a test for approx 74 hrs where I measured the electricity usage as well as the mining output, in an effort to calculate the real-world efficiency of the system:

0.007107490393 XMR (£0.58) earned in 74hrs.

6.77kWh used in 74hrs.

Cheap rate electricity is 1am-7am – i.e. 25% of the time, so given the machine is running 24/7:

75% of 6.77kWh = 5.08kWh @ normal rate (£0.14/kWh) = £0.71

25% of 6.77kWh = 1.69kWh @ cheap rate (£0.07/kWh) = £0.12

Total cost of running for the 74 hrs: £0.83
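
For anyone who wants to check the maths, here’s the split-tariff cost sum as a quick Python sketch (the rates and the 75:25 split are just the ones from my tariff):

```python
# Split-tariff electricity cost for a machine running 24/7.
NORMAL_RATE = 0.14     # £/kWh, the other 18hrs of the day
CHEAP_RATE = 0.07      # £/kWh, 1am-7am = 25% of the day
CHEAP_FRACTION = 0.25

def running_cost(kwh):
    """Cost in £ of `kwh` of electricity, split 75:25 across the two rates."""
    return kwh * (1 - CHEAP_FRACTION) * NORMAL_RATE + kwh * CHEAP_FRACTION * CHEAP_RATE

print(f"£{running_cost(6.77):.2f}")   # -> £0.83 for the 74hr test
```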

so extrapolating for a month:

COST
£0.83 / 74hrs * 24 = £0.27/day, * 30 = £8.08/mo

EARNED:
£0.58 / 74hrs * 24 = £0.19/day, * 30 = £5.64/mo

Net cost of running per month (30 days) with CPU mining only: £8.08 − £5.64 = £2.44/mo (not bad at all vs £20/mo for the original oil rad!)

So – roughly calculating the efficiency:

£0.58/£0.83 * 100 = 69.9% efficient

69.9% efficient!!! That’s pretty damn good actually.
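
And the extrapolation and efficiency sums as a snippet, for anyone following along at home:

```python
# Extrapolate the 74hr baseline test to a 30-day month, and express the
# "efficiency" as mining earnings divided by electricity cost.
HOURS, COST, EARNED = 74, 0.83, 0.58   # test duration, £ cost, £ mined

print(f"£{COST / HOURS * 24 * 30:.2f}/mo cost")      # -> £8.08/mo
print(f"£{EARNED / HOURS * 24 * 30:.2f}/mo earned")  # -> £5.64/mo
print(f"{EARNED / COST:.1%} efficient")              # -> 69.9%
```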

NOTE: It’s worth bearing in mind that my initial estimate of 40% in my previous post was based on the XMR price at the time – around £40/XMR. That figure wasn’t *wrong* as such: I think 40% was a little optimistic, but Monero prices have more than doubled since then and the mining difficulty will also have increased, which pretty much brings us to the current figure of 69.9%.

 

Adding a GPU:

I’m not claiming Nvidia’s GTX 750 Ti is the be-all and end-all of cards, but it’s better than the one in my desktop and it’s relatively cheap. If the experiment failed I could still put it in my desktop and get a few more FPS in my video games (back to this “primary reward” thing again!).

Arguably higher-end cards could be more efficient, but they’re also a lot more expensive, not to mention scarce. Personally, at this point I couldn’t really afford to drop £300+ on a card just to get a few more FPS in Elite Dangerous if the experiment flops, whereas £90 I could live with.

So I put a fresh Lubuntu install onto a 32GB SanDisk USB stick, so that it’s easy to back up and replicate if it works, and there’s no risk to the existing radiator SSD if it doesn’t. Getting the Nvidia driver, CUDA etc. all playing nicely together is not a trivial task, so this turned out to be a good move, since I had to blat the system a couple of times till I found a build which worked (if anyone wants details on how to build it I can post links).

I’m using xmrMiner, which has quite a good set of build instructions and seems to offer a decent hashrate. I can’t yet verify whether it’s better than Claymore or any of the others, but there may be some efficiency gains still to be had by trying different miners.

So once the Nvidia drivers, CUDA etc. were all set up and xmrMiner installed, plus a bit of time playing with launch parameters for the miner, I found a pretty decent setup producing a steady 252H/s (launch=48×10 in case anyone’s interested – it will vary from card to card though, so you’ll need to experiment a bit).
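
If you want to automate the experimenting, something like this rough sketch would do it. Big caveat: the binary path, the POOL_URL/WALLET placeholders and the “NNN.N H/s” output format are assumptions from my setup – adjust all of them for your own build.

```python
# Rough sketch: sweep xmrMiner --launch values and compare the hashrates.
import re
import subprocess

CONFIGS = ["32x10", "40x10", "48x10", "56x10"]   # threads x blocks to try

def measure(launch, seconds=120):
    """Run the miner briefly with one --launch value, return the best H/s seen."""
    try:
        result = subprocess.run(
            ["./xmrMiner", f"--launch={launch}", "-o", "POOL_URL", "-u", "WALLET"],
            capture_output=True, text=True, timeout=seconds)
        out = result.stdout
    except subprocess.TimeoutExpired as e:   # the miner runs until killed
        out = e.stdout if isinstance(e.stdout, str) else (e.stdout or b"").decode(errors="replace")
    rates = [float(r) for r in re.findall(r"([\d.]+)\s*H/s", out)]
    return max(rates, default=0.0)

for cfg in CONFIGS:
    print(cfg, measure(cfg), "H/s")
```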

I then set the “coolbits” setting and was able to overclock the card, getting it up to ~270H/s, though I had some trouble making the settings persist. That’s something I may come back to – apparently overclocked but under-volted seems to be a good way to go. For now though I just wanted to get an overall idea of whether the GPU was a good option.
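
For reference, something like this ought to do the overclock from a script – nvidia-settings changes don’t survive a reboot, so the idea is to re-run it at startup. The Coolbits value and the performance-level index are card/driver-specific, so treat these numbers as assumptions to check on your own system:

```python
# Apply a GPU clock offset via nvidia-settings. Coolbits must be enabled in
# xorg.conf first (e.g. `sudo nvidia-xconfig --cool-bits=28`, then restart X).
# nvidia-settings changes don't persist across reboots, so run this at startup.
import subprocess

GPU = 0
PERF_LEVEL = 3         # highest performance level; may be 2 on some cards
CLOCK_OFFSET_MHZ = 75  # find your own card's stable value

subprocess.run(
    ["nvidia-settings",
     "-a", f"[gpu:{GPU}]/GPUGraphicsClockOffset[{PERF_LEVEL}]={CLOCK_OFFSET_MHZ}"],
    check=True)
```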

 

Turn everything up to MAX ftw!

So this is where we crank everything up and see what it does, right?

Well actually – NO.

I already had a web interface set up which clocked the CPU up and down to vary the heat output, and I figured I’d want to tie the GPU into that, over- and under-clocking it in line with the CPU.

So I was taking some measurements of power usage vs performance, and I ended up with a “key metric” – hashes per second per watt (H/s/W), which I’ll just call hsw from here on in.

The crypto-radiator was running minergate-cli on the CPU and xmrMiner on the GPU, and as you’d expect both affected the power usage, as did how far each was clocked up or down.

There seems to be a base cost of just having the machine running (~60W), and then varying the CPU and GPU speeds affects the hsw rating on top of that: the CPU accounted for up to about an extra 40W, and the GPU up to an additional 60W.

Before, running on the CPU alone, the crypto-rad was managing around 0.6hsw at best, but with the GPU in the mix I was seeing readings of up to 2.5hsw. That’s quite a significant, and non-linear, improvement – exactly what I was hoping for, though quite a bit more of an improvement than I expected!
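
To see why the improvement is non-linear, here’s the back-of-envelope model (the watt figures are the rough ones above, so don’t take the exact outputs too literally – the shape is the point):

```python
# Why the hsw improvement is non-linear: the ~60W base cost of having the
# machine on at all gets amortised over whatever hashrate sits on top of it.
BASE_W = 60        # machine on, no mining
CPU = (40, 60)     # (extra watts, H/s) for CPU mining flat out
GPU = (60, 252)    # (extra watts, H/s) for the GTX 750 Ti

def hsw(*parts):
    """Hashes/sec per watt for a set of miners running on top of the base load."""
    watts = BASE_W + sum(w for w, _ in parts)
    return sum(h for _, h in parts) / watts

print(round(hsw(CPU), 2))        # 0.6  - CPU alone
print(round(hsw(GPU), 2))        # 2.1  - GPU alone
print(round(hsw(CPU, GPU), 2))   # 1.95 - both flat out: worse than GPU alone!
```

That last line is the interesting one: running the CPU flat out actually drags the hashes-per-watt down, which is exactly why the best settings turn out to have the CPU clocked right down.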

I played with the settings a while longer, and when I tried to make the GPU o/c settings persist I managed to break everything and had to reinstall. Fortunately I’d made a backup of the USB drive, so the rebuild was minimal – but still. No overclocked card for the moment. Fair enough.

Even so, I found that the best combo of CPU + GPU speed was having the CPU clocked most of the way down, to around 2100MHz (vs the 3200MHz max), which gave me an output rating of 2.51hsw.
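
In case it’s useful, here’s a minimal sketch of how you can cap the CPU clock from a script, assuming the standard Linux cpufreq sysfs interface (needs root, and the paths/frequencies are for my machine):

```python
# Cap the CPU clock via the cpufreq sysfs interface (frequencies are in kHz).
import glob

def set_max_freq(khz):
    """Cap the maximum frequency of every core."""
    for path in glob.glob("/sys/devices/system/cpu/cpu*/cpufreq/scaling_max_freq"):
        with open(path, "w") as f:
            f.write(str(khz))

set_max_freq(2100000)   # ~2100MHz - the sweet spot, vs the 3200MHz max
```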

This seemed to be as good as it was going to get for the moment so I set another test going to see where we’re at efficiency-wise.

 

The results?

This test ran for 32 hours – I’d have liked to go longer but this was just to get an idea and I was impatient to see the results 😉

0.004599848663 XMR earned in 32hrs (£0.38)

3.87kWh used in 32hrs

Assuming the same 75:25 split between normal and cheap rate electricity, at £0.14/£0.07 per kWh respectively:

2.9kWh @ normal rate (£0.41)

0.97kWh @ cheap rate (£0.07)

cost: £0.41 + £0.07 = £0.48

so extrapolating for a month:

COST
£0.48/32 * 24 = £0.36/day * 30 = £10.80 / month cost

EARNED
£0.38/32 * 24 = £0.285/day * 30 = £8.55 / month earned

£8.55 / £10.80 = 79.2% efficient!
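
(Same sums as the baseline test, just with the new readings plugged in – as a quick sanity check:)

```python
# Same extrapolation as the baseline test, with the GPU-assisted figures.
HOURS, COST, EARNED = 32, 0.48, 0.38   # test duration, £ cost, £ mined

print(f"£{COST / HOURS * 24 * 30:.2f}/mo cost")      # -> £10.80/mo
print(f"£{EARNED / HOURS * 24 * 30:.2f}/mo earned")  # -> £8.55/mo
print(f"{EARNED / COST:.1%} efficient")              # -> 79.2%
```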

not bad at all 🙂

Conclusion:

OK, so not quite 100% efficient – but a noticeable improvement.

Well, this was only a short test so the numbers may not be 100% accurate, and this (32hr) test was run with the machine set at 2.51hsw – but there was one setting which was apparently even more efficient: having the CPU clocked all the way down to 800MHz with the GPU at normal speed, which claimed about 2.8hsw. I didn’t run the test on that setting initially because I was more interested in the sustained H/s output than in ultimate efficiency, though clearly that may not have been the best move.

I’ve now set another test running at 2.8hsw and will report back – that setup uses less power (95W vs 120W) and produces only marginally less H/s, so we could still see quite an improvement.

79.2% current efficiency / 2.51 hsw * 2.8 hsw = 88.35% efficiency (theoretical).

If we can get to 90% that would already be a HUGE improvement and would mean virtually free heating for us.

WHICH WOULD BE AWESOME!

I’ll post an update once the 2.8hsw test has run for a few days and we see how the numbers shape up.

EDIT!

Just realised that if I kill the CPU miner completely, the power usage drops to ~83W but the hashing output stays at 252H/s, which puts the efficiency at 3hsw (the highest reading yet).

79.2% efficiency / 2.51 * 3 = 94.6% efficiency. Have amended the test to see if we can confirm the theory 🙂

Further optimisations?

I think there are a number of directions you could take this further. The base (electricity) cost of just running the PC (65W at idle) bothers me, and that’s something which could be minimised by adding more GPUs – even a second GPU would already make that parasitic cost proportionally smaller. It may also be possible to find a motherboard/CPU combo which simply uses less power, since the CPU mining part is the smallest contributor; that way you’d basically have a minimal board acting as life-support for the GPU. I’ve seen mentions of boards which only use 25W at idle, which would be quite a significant saving – that alone could push the efficiency over 100%.
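
To put rough numbers on that, using the readings from the edit above (my watt figures don’t all line up perfectly between tests, so this is very much a sketch):

```python
# Effect of the parasitic base power on hsw, assuming the GPU's mining draw
# (~18W: the 83W GPU-only reading minus the 65W idle) and hashrate stay fixed.
GPU_MINING_W = 83 - 65   # extra watts when the GPU is mining
HASHRATE = 252           # H/s, GPU only

for base_w in (65, 25):  # current board vs a hypothetical 25W-idle board
    print(f"{base_w}W board: {HASHRATE / (base_w + GPU_MINING_W):.1f} hsw")
# -> 65W board: 3.0 hsw
# -> 25W board: 5.9 hsw
```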

Another idea is to run the system off a solar rig. 95W seems like it might be quite manageable on a small panel + inverter (e.g. 250W), though the initial outlay would be quite high, since you’d need the panel plus a couple of batteries, a charge controller, an inverter etc. On the other hand your electricity would then actually be *free*, which would be pretty awesome once you get past the 6-month/1-year threshold and start to break even.

Currency fluctuations:

Also worth bearing in mind is that these calculations are based on *today’s* XMR exchange rate. Most cryptos have experienced a slight pullback after the recent epic bull run, which illustrates that the exchange rate plays as big a role in the efficiency as anything else… a few days ago XMR was over £100, but today it’s £82 – so even if I change nothing else, as soon as the markets recover we could be over the 100% efficiency mark.
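
Since earnings scale linearly with the exchange rate (ignoring difficulty changes), the break-even price is easy to work out:

```python
# Break-even XMR price: the price at which mining earnings equal the
# electricity cost, assuming earnings scale linearly with the exchange rate.
PRICE_NOW = 82.0    # £/XMR today

for efficiency in (0.792, 0.946):   # current setup, and the 3hsw setup
    print(f"{efficiency:.1%} now -> break-even at £{PRICE_NOW / efficiency:.0f}/XMR")
# -> 79.2% now -> break-even at £104/XMR
# -> 94.6% now -> break-even at £87/XMR
```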

This also illustrates why mining for profit is so risky (imho).

 
