Reply
Thu 30 Nov, 2017 01:39 am

Alice wants to send Bob the result X of a fair coin flip over a binary symmetric channel that flips each bit with probability 0 < p < 1/2. To avoid errors in transmission, she encodes heads as a sequence of 2k+1 zeroes and tails as a sequence of 2k+1 ones.

(a) For any fixed k, give an upper bound on the probability of error of decoding.

(b) Show that for any fixed k, the above decoding scheme (of choosing the majority) minimizes the probability of error of decoding.

The channel flips each bit with probability p < 1/2, so each bit has better-than-even odds of arriving intact. As we add bits to the encoded stream, the chance of the new, longer codeword containing NO flipped bit goes down. With a per-bit success chance of 1 − p over 2k+1 independent tries, that relationship between length and line noise can be expressed as

*noError = (1 − p)^(2k+1)*
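A quick sketch of that relationship (function name is my own, assuming p is the per-bit flip probability): as k grows, the chance of an error-free transmission shrinks.

```python
from math import prod

def p_no_transmission_error(p, k):
    """Probability that none of the 2k+1 transmitted bits is flipped.
    Each bit survives independently with chance (1 - p), so the
    chances multiply: (1 - p)^(2k+1)."""
    return (1 - p) ** (2 * k + 1)

# With p = 0.1, longer codewords are less likely to arrive untouched:
for k in range(4):
    print(k, round(p_no_transmission_error(0.1, k), 4))
```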

@AngleWyrm-paused,

Quote:As we add bits to the encoding stream, the chance of the new length NOT containing an error goes down.

Yes, but the chance of decoding the message correctly goes up, and the question asks about decoding errors, not transmission errors. Majority decoding fails only when more than k of the 2k+1 bits are flipped, and that event becomes less likely as k grows.
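To make the distinction concrete, here is a hedged sketch (function name is my own): the exact probability that majority decoding fails is the binomial tail where more than k of the 2k+1 bits flip, and it decreases with k even though the chance of a perfectly clean transmission also decreases.

```python
from math import comb

def p_decoding_error(p, k):
    """Probability that majority decoding is wrong: more than k of the
    2k+1 bits are flipped. Exact binomial tail sum."""
    n = 2 * k + 1
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i)
               for i in range(k + 1, n + 1))

# With p = 0.1: k = 0 gives error 0.1; k = 1 gives 0.028; and so on,
# shrinking toward 0 as k grows.
for k in range(4):
    print(k, round(p_decoding_error(0.1, k), 5))
```

So the two curves move in opposite directions: transmission errors become more likely as the codeword lengthens, while decoding errors become less likely, which is the whole point of the repetition code.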