Computers are starting to use staggering amounts of electricity. There is a trade-off between the utility of the tasks they perform and the climate damage caused by generating the electricity they need. Bitcoin mining alone is estimated to consume around 2% of America’s electricity, which seems an especially egregious waste of energy.
Radically reducing computers’ electricity requirements as they become more powerful should be seen as an urgent task.
Aren’t modern computers using way less energy per unit of work than they used to? We just keep doing more compute, faster than the per-unit energy use decreases?
Yes and yes
So the whole chip is a complicated lens that somehow performs multiplication using ‘analogue computation’.
Imo, analog computation is the way forward for this whole AI thing. It seems like a waste to perform calculations bit by bit when neural nets are generally okay with “fuzzy math” anyway.
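To make the “fuzzy math” point concrete, here’s a toy numpy sketch, not anything from real analog hardware: made-up weights and a made-up ~3% multiplicative error on each multiply, standing in for whatever imprecision an analog multiply-accumulate might have. It compares the exact layer output against the noisy one and checks how often the argmax “decision” still agrees.

```python
# Toy sketch (no real analog hardware here): compare an exact matrix-vector
# product against one where every multiply is off by a few percent, and see
# how often the "classification" (argmax) comes out the same anyway.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(10, 256))       # made-up weights: 256 inputs -> 10 outputs
xs = rng.normal(size=(1000, 256))    # made-up input activations

def noisy_matvec(W, x, sigma=0.03):
    """Each product gets ~3% Gaussian error, crudely mimicking analog compute."""
    products = W * x * rng.normal(1.0, sigma, size=W.shape)
    return products.sum(axis=1)

exact = np.array([W @ x for x in xs])
fuzzy = np.array([noisy_matvec(W, x) for x in xs])

agreement = np.mean(exact.argmax(axis=1) == fuzzy.argmax(axis=1))
print(f"argmax agreement between exact and noisy compute: {agreement:.1%}")
```

The noise numbers are invented; the point is just that summing many slightly-wrong products tends to wash out individual errors, which is why “fuzzy” hardware can be tolerable for this kind of workload.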
I don’t want fuzzy math anywhere near autonomous armed machines. You want ED-209? Because that’s how you get ED-209.
Human brains are the epitome of fuzzy-math machines.
I mean, I personally agree, but the military has already made it clear they don’t mind. ED-209 is basically an inevitability at this point.
Idk, maybe. But I think you may have issues with tolerances and reproducibility. With analog hardware and neural nets you’re going to have edge cases where some devices give vastly different outcomes. For some applications that’s fine, but not for others.
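A rough sketch of that reproducibility worry, with invented numbers rather than measurements from any real chip: five hypothetical “devices” carry the same weights, but each cell has its own fixed ~2% gain error. Most inputs classify the same everywhere; inputs whose top two scores were close to begin with can flip between devices.

```python
# Toy model of device-to-device variation: same weights, per-device fixed
# ~2% gain error per cell. Check how often all devices agree on the argmax,
# and how the decision margin differs for agreeing vs. disagreeing inputs.
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(size=(10, 256))        # the "true" weights
xs = rng.normal(size=(2000, 256))     # made-up inputs

devices = [W * rng.normal(1.0, 0.02, size=W.shape) for _ in range(5)]

preds = np.stack([(xs @ Wd.T).argmax(axis=1) for Wd in devices])  # (5, 2000)
consistent = (preds == preds[0]).all(axis=0)
print(f"inputs where all 5 devices agree: {consistent.mean():.1%}")

# Disagreements cluster where the decision was marginal to begin with.
scores = np.sort(xs @ W.T, axis=1)
margin = scores[:, -1] - scores[:, -2]          # top-1 minus top-2 score
print(f"median margin, agreeing inputs:    {np.median(margin[consistent]):.2f}")
print(f"median margin, disagreeing inputs: {np.median(margin[~consistent]):.2f}")
```

Which is basically the tolerance question: the average case is fine, but the edge cases are exactly the inputs where you least want two units of the same product to disagree.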
Digital is also analog.
Automobile analogy: there is no replacement for displacement… until there is?