Removing entropy from transistors is expensive: digital computers use just two states separated by large voltage margins precisely to suppress noise. In AI, that entropy is much less of a problem, since we do not need bit-exact, repeatable results. So why not use more of the transistor's linear, or even nonlinear, operating range for computation?
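
To see why AI workloads can tolerate this kind of analog imprecision, here is an illustrative numerical sketch (a toy model, not a description of real hardware): every multiply in a dot product, the core operation of neural networks, carries a few percent of random gain error, yet the accumulated result stays close to the exact value because the independent errors largely average out.

```python
import random

random.seed(0)

def noisy_dot(w, x, rel_noise=0.02):
    """Dot product where each multiply carries ~2% multiplicative
    Gaussian noise, loosely mimicking a transistor operated in its
    analog (linear) range instead of as a binary switch."""
    total = 0.0
    for wi, xi in zip(w, x):
        total += wi * xi * (1 + random.gauss(0, rel_noise))
    return total

# A 256-element "neuron": random weights and inputs.
w = [random.uniform(-1, 1) for _ in range(256)]
x = [random.uniform(-1, 1) for _ in range(256)]

exact = sum(wi * xi for wi, xi in zip(w, x))
noisy = noisy_dot(w, x)
print(f"exact={exact:.4f}  noisy={noisy:.4f}  abs_err={abs(noisy - exact):.4f}")
```

Each per-term error is independent, so the total error grows only like the square root of the number of terms, while the signal grows with the number of terms; this is the statistical slack that analog computation could exploit.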