So just a little update re the crypto-radiator – I’ve since got the TEMPer USB thermal probe working on the machine and have set it to log temperatures.
There is, however, a slight issue: the USB ports are on the back of the machine, which is also where the warmed air comes out. The heat from that exhaust affects the reading from the TEMPer device somewhat, so the temperature reading can’t really be relied upon to be entirely accurate.
It would be possible to run a USB extension with the TEMPer probe on the end of it and get a much more accurate reading, but the crypto-radiator lives in quite a “hostile environment” (for computers anyway): where the radiator lives, there also lives a bunny who likes to chew wires. Plus, the whole idea was to keep the thing as self-contained and simple as possible, like the oil radiator it replaced.
Arguably the thermostat on the oil radiator would also be affected by the heat the radiator produces so I guess this is not so different really.
For the moment I have the TEMPer probe logging temperatures to a CSV file every 5 minutes so that I can get an idea of the range of temperatures it encounters. I think what will probably work best is to set an upper and lower threshold (or a target temperature with a +/- margin), and then when the air temperature around the crypto-radiator falls outside that range, step the CPU speed up or down. Having it try too hard to track a given temperature seems like something of a fool’s errand given the feedback loop from the warm air, so having it react only to temperature “extremes” seems like a more useful approach.
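The “target temp with a +/- margin” idea above can be sketched in a few lines of Python. To be clear, this is just an illustrative sketch: the target, the margin, the speed steps, and the `next_step()` function are all made-up placeholders, not the actual script running on the crypto-radiator.

```python
TARGET_C = 22.0  # assumed target air temperature (placeholder value)
MARGIN_C = 2.0   # dead band either side of the target
STEPS = ["min", "low", "mid", "max"]  # assumed CPU-speed steps

def next_step(temp_c, current):
    """Return the next speed step given a temperature reading.

    Step up when the air is too cold, down when too warm, and do
    nothing inside the dead band -- so the controller only reacts
    to "extremes" rather than chasing every small fluctuation.
    """
    i = STEPS.index(current)
    if temp_c < TARGET_C - MARGIN_C and i < len(STEPS) - 1:
        return STEPS[i + 1]  # too cold: step up
    if temp_c > TARGET_C + MARGIN_C and i > 0:
        return STEPS[i - 1]  # too warm: step down
    return current           # inside the band: leave it alone

print(next_step(19.0, "min"))  # cold  -> "low"
print(next_step(25.0, "mid"))  # warm  -> "low"
print(next_step(22.5, "low"))  # in band -> "low"
```

The dead band is what makes this robust against the feedback loop from the warm exhaust air: small wobbles around the target never trigger a speed change, only sustained drifts outside the margin do.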
I’ve been watching the amount of Monero getting mined by my little “pool” (i.e. the crypto-rad and my main desktop). When turned down and in “idling” mode, the desktop’s 6-core AMD CPU tends to mine at about 40–60 H/s, but the GPU is more in the 80–100 H/s range.
Looking at the XMR outputs from both machines, my thinking is that there’s potential to increase both the efficiency and the heat output of the crypto-radiator by adding GPU mining (potentially at the expense of more noise). So far, though, I haven’t found a Linux-based, MinerGate-compatible GPU mining tool that can be controlled from the terminal (as far as I’m aware, the MinerGate CLI client only works with the CPU).
I suppose I could get it to launch/kill the MinerGate GUI miner as needed, since the machine does have a desktop installed, but if there’s a terminal-only version I’d prefer that for neatness’ sake. Launching and killing the GUI miner seems like a bit of a hack… but we’ll see.
It’s warm at the moment, so it’s spending most of its time on “min”, but this is an ongoing project, so we’ll see how things shape up when it gets colder (e.g. in the summer, knowing the UK, haha).
Code-wise – for getting the TEMPer probe to work, I had the best luck with the Python code which can be found here:
There is also a C program, but I couldn’t get it to work – probably my fault, but I’ll include the link for anyone who might find it helpful: