Aims to convey three principal developments in the evolution of information theory, including Shannon's interpretation of Boltzmann entropy as a measure of the information yielded by an elementary statistical experiment, and the basic coding theorems on storing messages and transmitting them through noisy communication channels in an optimal manner.
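The interpretation of entropy as information can be illustrated concretely. A minimal sketch (the function name `shannon_entropy` is my own) computing the Shannon entropy H(p) = -Σ pᵢ log₂ pᵢ of a discrete distribution, measured in bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum(p_i * log2(p_i)), in bits.

    Terms with p_i = 0 are skipped, following the convention
    0 * log2(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip yields exactly 1 bit of information.
print(shannon_entropy([0.5, 0.5]))

# A biased coin yields less information per flip,
# so its outcomes are more compressible.
print(shannon_entropy([0.9, 0.1]))
```

The biased coin's lower entropy is exactly what the coding theorems exploit: on average, fewer bits suffice to store or transmit its outcomes.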