# Information theory

Thu 30 Nov, 2017 01:39 am
Alice wants to send Bob the result X of a fair coin flip over a binary symmetric channel that flips each bit independently with probability 0 < p < 1/2. To protect against transmission errors, she encodes heads as a sequence of 2k+1 zeroes and tails as a sequence of 2k+1 ones.
(a) For any fixed k, give an upper bound on the probability of a decoding error.
(b) Show that for any fixed k, the above decoding scheme (choosing the majority bit) minimizes the probability of a decoding error.
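For part (a), one standard route: a decoding error requires at least k+1 of the n = 2k+1 bits to flip. Since p < 1/2, each term p^i (1-p)^(n-i) with i ≥ k+1 is at most (p(1-p))^(n/2), and the binomial coefficients sum to at most 2^n, giving P(error) ≤ (4p(1-p))^((2k+1)/2). A minimal sketch checking the exact error probability against that bound (function names and the values of p and k are my own choices, not from the question):

```python
from math import comb

def majority_error(p, k):
    """Exact P(decoding error): more than k of the 2k+1 bits flip."""
    n = 2 * k + 1
    return sum(comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k + 1, n + 1))

def chernoff_bound(p, k):
    """Upper bound (4p(1-p))^((2k+1)/2), valid for 0 < p < 1/2."""
    return (4 * p * (1 - p)) ** ((2 * k + 1) / 2)

p, k = 0.1, 3
print(majority_error(p, k), chernoff_bound(p, k))
```

For part (b), the usual argument is that with a uniform prior on X, minimizing error probability is maximum-likelihood decoding, and since p < 1/2 the likelihood of a received word is maximized by the codeword agreeing with it in the most positions, i.e. the majority bit.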

AngleWyrm-paused

Sun 3 Dec, 2017 01:48 pm
The error probability is bounded above by 1/2 (one chance in two per bit). As we add bits to the encoding stream, the chance of the new length NOT containing an error goes down. That relationship between length and line noise can be expressed as

noError = chancePerTry^numberOfTries, where chancePerTry = 1 - p
markr

Sat 9 Dec, 2017 03:42 am
@AngleWyrm-paused,
Quote:
As we add bits to the encoding stream, the chance of the new length NOT containing an error goes down.

Yes, but the chance of decoding the message properly goes up, and the question was asked about decoding errors, not transmission errors.
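This point can be checked numerically: as k grows, the chance that the codeword arrives with no bit errors does fall, but the chance of a majority-decoding error falls as well. A small sketch (p = 0.1 is an arbitrary choice of flip probability):

```python
from math import comb

p = 0.1  # arbitrary channel flip probability, 0 < p < 1/2
for k in range(4):
    n = 2 * k + 1
    no_bit_errors = (1 - p) ** n  # whole codeword arrives clean
    decode_error = sum(comb(n, i) * p**i * (1 - p)**(n - i)
                       for i in range(k + 1, n + 1))  # majority is wrong
    print(f"n={n}: P(no bit errors)={no_bit_errors:.4f}, "
          f"P(decode error)={decode_error:.4f}")
```

Both columns decrease with n: transmission errors become more likely somewhere in the codeword, yet decoding still succeeds more often.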

