Let me clarify what I mean:
If you put 80W of electrical energy into an ideal electrical heater, it will put out 80W of heat energy.
But what if you put 80W of electrical energy into a CPU and let it calculate things, creating new information?
Will it also output exactly 80W of heat?

Or is some energy transformed into “information”, so the CPU will radiate less heat?

My instinct is that if information isn’t energy, then you could theoretically create it (thereby reducing entropy) without expending energy, and that’s a no-no.

But if it is energy, then a CPU running a random number generator (creating no information) at max load would get hotter than one doing actual calculations. Which also sounds wrong.

(I’m neither a physicist nor a computer scientist, in case that wasn’t obvious)

  • Successful_Try543@feddit.org · 25 days ago

    Information is entropy.

    More precisely, information has lower entropy than chaos.

    If we could run a theoretical computer that is 100% efficient, it could compute forever with the same electrons.

    Yet it would still need extra energy to move the electrons around.

    Do not summon Maxwell’s Demon (the German article spells this out more explicitly than the English one):

    The minimum energy E necessary to process n bits of information is E = n·k·T·ln(2), where k is the Boltzmann constant and T the temperature.
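    A quick sanity check of that bound (my own sketch, not from the comment above; I’m assuming room temperature, T = 300 K, and a made-up erasure rate of 1e18 bits per second):

        # Landauer bound: E = n * k * T * ln(2), the minimum energy to
        # irreversibly process (erase) n bits at temperature T.
        import math

        k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)
        T = 300.0           # assumed room temperature in kelvin

        def landauer_energy(n_bits: float, temperature: float = T) -> float:
            """Minimum energy in joules to erase n_bits bits."""
            return n_bits * k_B * temperature * math.log(2)

        print(f"Per bit at {T:.0f} K: {landauer_energy(1):.2e} J")  # ~2.87e-21 J
        rate = 1e18  # hypothetical bit erasures per second
        print(f"Power at {rate:.0e} bits/s: {landauer_energy(rate):.2e} W")  # ~2.9e-3 W

    Even at that absurd erasure rate the Landauer cost is a few milliwatts, so for practical purposes all 80 W going into the CPU still comes back out as heat.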