First, Shannon derived a formula for the minimum number of bits per second needed to represent a source's information, a quantity he called its entropy rate, H. This number quantifies the uncertainty involved in determining which message the source will generate. The lower the entropy rate, the less the uncertainty, and thus the more the message can be compressed.
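The text does not spell the formula out, but for the simplest case of a memoryless source that emits symbol $i$ with probability $p_i$, the entropy per symbol is given by Shannon's standard formula (the rate in bits per second follows by multiplying by the number of symbols emitted per second):

$$H = -\sum_i p_i \log_2 p_i \quad \text{bits per symbol}$$

A source that always sends the same symbol has $H = 0$ and needs no bits at all, while a source choosing uniformly among $N$ symbols has the maximum entropy $\log_2 N$.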
The same idea carries over to human communication: the lower the entropy rate, which here means the recipient's uncertainty about which message will be sent, the more the message can be compressed. In human terms, that certainty comes from the abundance, or lack, of context: shared signals that provide clues about what the message is likely to be. Given enough shared context, messages can be compressed into very short utterances. Air traffic controllers and the military are good examples.
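A minimal sketch of this effect, using a toy vocabulary and made-up probabilities purely for illustration: when the receiver has no context, every phrase is equally likely and each message carries many bits; when shared context narrows the expected phrases to a few likely ones, the entropy, and hence the minimum message length, drops sharply.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# No shared context: any of 16 possible phrases is equally likely.
no_context = [1 / 16] * 16
print(f"Without context: {entropy(no_context):.2f} bits per message")   # 4.00

# Shared context (e.g., a routine phase of a flight): the receiver
# expects a handful of phrases with high probability.
with_context = [0.7, 0.15, 0.1, 0.05]
print(f"With context:    {entropy(with_context):.2f} bits per message")  # ~1.32
```

Under these assumed numbers, shared context cuts the uncertainty from 4 bits to roughly 1.3 bits per message, which is why highly constrained exchanges can get away with such terse wording.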