Introduction to information theory: entropy rate of a sequence of random variables; correlated variables, conditional entropy, and mutual information (Sections 1.3 and 1.4)
Reference book: "Information, Physics, and Computation" by Mézard and Montanari
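As a minimal illustration of conditional entropy and mutual information for correlated variables, the sketch below computes H(Y|X) = H(X,Y) - H(X) and I(X;Y) = H(Y) - H(Y|X) for a small joint distribution. The distribution p_xy is a hypothetical example chosen only to make the quantities concrete; it is not taken from the reference book.

```python
import math

# Hypothetical joint distribution p(x, y) of two correlated binary variables
# (an illustrative assumption, not an example from Mezard and Montanari).
p_xy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def H(dist):
    """Shannon entropy in bits of a distribution given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(x) and p(y), obtained by summing out the other variable.
p_x, p_y = {}, {}
for (x, y), p in p_xy.items():
    p_x[x] = p_x.get(x, 0.0) + p
    p_y[y] = p_y.get(y, 0.0) + p

# Chain rule: H(Y|X) = H(X,Y) - H(X); mutual information: I(X;Y) = H(Y) - H(Y|X).
H_joint = H(p_xy)
H_cond = H_joint - H(p_x)
I_xy = H(p_y) - H_cond

print(f"H(X,Y) = {H_joint:.4f} bits")
print(f"H(Y|X) = {H_cond:.4f} bits")
print(f"I(X;Y) = {I_xy:.4f} bits")
```

Because the two variables are correlated (they agree with probability 0.8), knowing X reduces the uncertainty about Y, so H(Y|X) is strictly below H(Y) = 1 bit and the mutual information is positive.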