Each time a variable is accessed, its value may go slightly off:
> real x = 10;
> print x;
10.0
> char y = 'y';
> print y;
y
> string test = "test";
> print test;
test
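One way to picture this read-mutates-value behavior is a small Python simulation. This is my own sketch: the `Drifting` class and the mutation rate are illustrative stand-ins, not part of any real Entropy implementation.

```python
import random

class Drifting:
    """Simulates an Entropy value: every read perturbs the stored data."""
    def __init__(self, value, mutation_rate=0.01):
        self._value = float(value)
        self._rate = mutation_rate

    @property
    def value(self):
        # Each access nudges the stored value by a small random amount,
        # so repeated reads drift further and further from the original.
        self._value += random.uniform(-self._rate, self._rate)
        return self._value

x = Drifting(10)
readings = [x.value for _ in range(1000)]
# Early readings sit near 10.0; later ones wander away from it.
```

The key point the sketch captures is that the corruption is cumulative: a variable printed once is nearly right, while a variable printed in a tight loop degrades with every pass.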
Constants are affected too. The hard-coded values in the code change each time the program loops past them:
while count > 0 # should be 0
[
if count < 99 # should be 99
[
print count;
print " bottles of beer on the wall.\n";
]
print count;
print " bottles of beer on the wall, ";
print count;
print " bottles of beer.\nTake one down, pass it around, ";
let count = count - 1; # should be 1
# Meanwhile, the count variable is constantly shifting a bit too!
]
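The decaying loop above can be approximated with a short Python sketch. The `drift` helper and its rate are my own illustration of the idea, not Entropy's actual mechanics; note that even the literals 0 and 1 pass through it.

```python
import random

def drift(v, rate=0.05):
    # Every access to a value, variable or constant, returns it slightly off.
    return v + random.uniform(-rate, rate)

count = 99.0
lines = []
while drift(count) > drift(0.0):          # even the hard-coded 0 drifts
    lines.append(f"{drift(count):.2f} bottles of beer on the wall.")
    count = drift(count) - drift(1.0)     # the hard-coded 1 drifts too
# The song still ends near 99 verses, but every number printed is slightly off.
```

Because the noise is small relative to the decrement, the loop still terminates after roughly 99 iterations; what decays first is the precision of the output, not the program's overall shape.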
Any output from an Entropy program is approximate, and the more the data is accessed, the less precise and more random it becomes.
An Entropy programmer needs to abandon the pursuit of precision—often working against years of habit—and accept inevitable flaws in their program.
The programmer has, at best, a short window to get their idea across to the user before the program corrodes; the program must be designed around this constraint.
The 99 Bottles of Beer program in Entropy. Note the places (starting around 22) where the interpreter suddenly decides the text is in German.
The Alvin Lucier program in Entropy. With a higher mutation rate, decay progresses much faster. Run in a Windows command terminal to get this weird geometric character set.
Most esolangs have arcane rules and strange-looking code but can be controlled by a sufficiently patient programmer. In theory, any C++ algorithm can be written in brainfuck as a carefully curated mass of punctuation.
Entropy is the opposite. Its behavior always escapes full control, and mastering the language means adapting to its demands rather than conquering them. Its code, meanwhile, looks entirely ordinary.
In fact, the text of the code is not central to the concept of the language, and it is perfectly valid to rewrite it using other syntax. The code running on this page uses Entropy.JS, a JavaScript implementation by Andrew Hoyer with none of the original Entropy's lexicon.
Entropy was inspired in part by Joseph Weizenbaum's Computing Power and Human Reason (1976), including the essay "Science and the Compulsive Programmer" about the unhealthy cycles of thought coders become trapped in. Programmers are in complete control of the machine, and yet the machine constantly reprimands them by refusing to do what they had intended.
Weizenbaum is perhaps best known for his chatbot Eliza. He was surprised and a bit disturbed by how easily users of this simple program attributed human characteristics to it. I was curious what personality Eliza would have if written in Entropy. I ended up with Drunk Eliza: