On Tuesday I was in Glasgow to speak at the 1st International Workshop on Low Carbon Computing. This was a fascinating workshop, bringing together a range of subject areas within computing to address various problems relating to low carbon computing -- we focused on breadth rather than depth. I was encouraged to see the range of helpful ideas being pursued, and wanted to highlight a few here.
Smart Meters and Dynamic Power Usage
Smart meters are one of those things I hear of all the time, but about which I never hear any real information. I'm pretty sure I have one, but I don't know if that is a good thing or a bad thing. However, the use case discussed here related to data centres rather than domestic use.
A recurring concern in several talks was green electricity. Renewables provide lots of energy, but not always when we need it. Energy storage technology is insufficient to smooth this out, so we typically need to fall back on fossil fuels, and may have periods of excess where we do not use all the renewable power we produce. The solution (in principle) is to dynamically scale power usage in non-urgent tasks. Massive machine learning tasks, for example, do not need to run right now, so could be delayed until there is an excess of green energy, rather than driving up demand for fossil fuels. In theory, this should align environmental concerns with naive economics -- energy suppliers could increase prices when electricity is in short supply, incentivising clients to consume power at other times. This would save the client money on their energy bill, and benefit the provider, who could better utilise "free" renewables.
In practice, there are a number of challenges preventing rapid deployment. Ageing national power infrastructure does not always provide the necessary monitoring data to make these decisions -- and certainly does not do so in a standard way, so each nation must solve this for itself. New infrastructure is needed for providers to signal to clients (via smart meters) that they should use less power -- a trivial engineering problem, but one that costs money. Finally, end systems must be (re)designed to dynamically scale power usage. Depending on the task, this will probably involve at least redesigning the OS's scheduling metrics, and adding a new kernel API to specify the power preferences of an application. However, some promising results from the workshop suggest that dynamic scheduling is possible, so there is good motivation to solve these problems.
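To make the idea concrete, a carbon-aware batch scheduler might look something like the sketch below. Everything here is hypothetical -- the `carbon intensity` signal, the threshold, and the `Job` structure are illustrative assumptions; a real deployment would read live grid data (e.g. via a smart meter or a grid operator's API) and integrate with the OS or cluster scheduler:

```python
# Hypothetical sketch: defer non-urgent jobs until the grid is "green",
# but never let a job miss its deadline. Numbers are illustrative.

GREEN_THRESHOLD = 100  # gCO2/kWh -- made-up cutoff, not a standard value

class Job:
    def __init__(self, name, deadline, deferrable=True):
        self.name = name
        self.deadline = deadline      # latest tick by which the job must start
        self.deferrable = deferrable  # urgent jobs (e.g. serving traffic) run immediately

def schedule(jobs, intensity_by_tick):
    """Return (tick, job_name) pairs: deferrable jobs wait for a green
    period; anything hitting its deadline runs regardless."""
    pending = sorted(jobs, key=lambda j: j.deadline)
    runs = []
    for tick, intensity in enumerate(intensity_by_tick):
        still_pending = []
        for job in pending:
            green = intensity <= GREEN_THRESHOLD
            must_run = tick >= job.deadline
            if not job.deferrable or green or must_run:
                runs.append((tick, job.name))
            else:
                still_pending.append(job)
        pending = still_pending
    return runs

# Example: intensity drops at tick 2 (say, a windy afternoon), so the
# ML training job waits two ticks while the web server runs at once.
jobs = [Job("ml-training", deadline=5), Job("web-server", deadline=0, deferrable=False)]
print(schedule(jobs, [250, 230, 80, 90, 260, 240]))
# → [(0, 'web-server'), (2, 'ml-training')]
```

The interesting design question is the deadline fallback: without it, a long fossil-heavy spell would starve deferred jobs entirely, which is exactly the volatility concern raised below for domestic use.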
It is worth noting that there would be additional welfare concerns if this approach were used in domestic cases. Dynamic pricing can be volatile, and when demand is driven by widespread environmental factors (such as cold weather), the end result may simply be power becoming more expensive when it is needed, rather than creating opportunities for savings. Most homes do not have a queue of batch processing jobs that can be delayed indefinitely...
Green Accounting
Another interesting angle, again focused on data centres, is to measure their environmental impact. If this can be done, it could be built into other methods to incentivise greener approaches (e.g. tax relief, good PR). While it is relatively easy for a data centre to do this at the whole-facility level -- with its energy provider verifying claims around energy use -- it is trickier for the client of a cloud provider to make the same measurements. They rely on data from the provider, and need some way to prove that this data is reliable. However, if the provider releases all energy usage data transparently, there could be privacy concerns.
In their talk "Emission Impossible", Jessica Man et al. outlined a framework for verifying a data centre's claims using a web of trust combined with a zero-knowledge proof. I found this solution particularly interesting, as it is the first application I have encountered where a zero-knowledge proof would actually be helpful, rather than just intellectually interesting.
Frugal Computing
A final theme was our obsession with doing more computing. While LLMs are everyone's favourite example this year, we keep increasing the amount of computing work humanity collectively performs in a way that outstrips the benefits we gain. This is an example of the Jevons paradox: as the efficiency of processing (processing power per unit of energy) has improved, it has become economically favourable to do more, so the improved efficiency has resulted in an increase in energy usage, rather than a reduction.
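A toy calculation makes the shape of the paradox clear. The numbers below are entirely made up to illustrate the mechanism, not drawn from any measurement:

```python
# Jevons paradox, illustrated with invented numbers: hardware gets 4x
# more efficient, but cheaper computation drives demand up 6x, so total
# energy use rises despite the efficiency gain.

efficiency_before = 1.0   # computation per joule (arbitrary scale)
efficiency_after = 4.0    # a 4x efficiency improvement

work_before = 100.0       # total computation demanded (arbitrary units)
work_after = 600.0        # demand grew faster than efficiency

energy_before = work_before / efficiency_before
energy_after = work_after / efficiency_after

print(energy_before, energy_after)
# → 100.0 150.0  (energy use went UP by 50%)
```

The paradox only bites when demand growth outpaces the efficiency gain; had demand merely tripled here, energy use would have fallen.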
Frugal computing -- the wild idea that we could do less computing work -- is again a solution in principle. In practice, it is very hard to incentivise this. A few talks identified specific areas where this problem surfaced, but other than individuals following a personal preference, a solution is lacking. Perhaps there is more of an opportunity in the domain of computer systems, where APIs could be designed to discourage "wasteful" work, but this may be only a drop in the ocean against the ever-growing demands of applications.