This seems true.
In my experience, the things I've seen kill programs could all be considered forms of entropy:
- New (e.g. hardware / software / code / people / focus)
- Money (e.g. actual or perceived infusion of it / actual or perceived lack of it / focus changed)
- Loss (e.g. someone or something left / was injured / died / was destroyed / was deleted / was corrupted)
And I think that if you have a system that contains risk due to entropy, then even a planned event resulting in success is entropic, e.g.:
- I plan a sunset for X software.
- There is risk of an asteroid or sudden epidemic that would thwart that plan.
- The “dice are rolled”, and the sunset happens because the asteroid and epidemic didn’t happen.
- Therefore, the planned sunset occurred with less than 100% probability. The outcome is still entropic.
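The reasoning above can be sketched numerically. The probabilities here are purely illustrative assumptions, not claims about real asteroid or epidemic risk; the point is only that independent disruption risks make even a "successful" planned event a less-than-certain outcome:

```python
# Minimal sketch: a planned sunset only happens if none of the disruption
# risks materialize, so its probability is strictly less than 1.
# All probabilities below are illustrative assumptions.

p_asteroid = 0.0001   # assumed chance an asteroid strike thwarts the plan
p_epidemic = 0.01     # assumed chance a sudden epidemic thwarts the plan

# Assuming the two risks are independent, the plan survives only if
# neither event occurs.
p_planned_sunset = (1 - p_asteroid) * (1 - p_epidemic)

print(f"P(planned sunset happens) = {p_planned_sunset:.6f}")  # strictly < 1
```

Any nonzero risk, however small, keeps the survival probability below 1, which is the sense in which the successful outcome is still "entropic".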