Today we discussed Shannon’s model of communication in two stages: source coding followed by channel coding. We introduced the model of a discrete memoryless channel and saw some examples, such as the binary symmetric channel, the binary erasure channel, the noisy typewriter channel, and a continuous example: the binary-input additive white Gaussian noise channel. We stated Shannon’s noisy coding theorem for a discrete memoryless channel (we will later prove it for the BSC). Together with the source coding theorem, this led to the joint source-channel coding theorem. We mentioned the “separation theorem” on the optimality of doing source and channel coding separately. The discussion of the capacity of the noisy typewriter channel led us into a brief discussion of zero-error capacity and Lovász’s theorem on the Shannon capacity of the 5-cycle.
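As a small illustration of the capacity formulas that came up for these example channels, here is a sketch in Python (the function names are my own, not from lecture). It computes the standard capacities: $1 - H(p)$ for the BSC, $1 - \alpha$ for the BEC, and $\log_2(n/2)$ for a noisy typewriter on $n$ letters, where every other letter gives a zero-error code.

```python
import math

def binary_entropy(p):
    """H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def capacity_bsc(p):
    """Capacity of the binary symmetric channel with crossover prob p."""
    return 1.0 - binary_entropy(p)

def capacity_bec(alpha):
    """Capacity of the binary erasure channel with erasure prob alpha."""
    return 1.0 - alpha

def capacity_noisy_typewriter(n=26):
    """Noisy typewriter on n letters: each input is received as itself or
    its cyclic neighbor with probability 1/2 each, so the capacity is
    log2(n/2), achieved with zero error by using every other letter."""
    return math.log2(n / 2)

print(capacity_bsc(0.11))            # close to 1/2
print(capacity_bec(0.25))            # 0.75
print(capacity_noisy_typewriter())   # log2(13), about 3.7
```

Note that for the typewriter the capacity is attained by a zero-error code of one-letter blocks; for the 5-cycle (a "typewriter" on 5 letters) longer blocks help, and Lovász showed the zero-error (Shannon) capacity is exactly $\sqrt{5}$.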
January 24, 2010