
Information theory

 
 
Reply Thu 30 Nov, 2017 01:39 am
Alice wants to send Bob the result X of a fair coin flip over a binary symmetric channel that flips each bit with probability 0 < p < 1/2. To avoid errors in transmission, she encodes heads as a sequence of 2k+1 zeroes and tails as a sequence of 2k+1 ones.
(a) For any fixed k, give an upper bound on the probability of error of decoding.
(b) Show that for any fixed k, the above decoding scheme (of choosing the majority) minimizes the probability of error of decoding.
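Not part of the original question, but for a concrete feel: the exact majority-decoding error probability is a binomial tail, and a Hoeffding-style bound (one standard choice of upper bound for part (a), among several) sits above it. A minimal sketch, assuming p = 0.1 as an illustrative flip probability:

```python
from math import comb, exp

def p_decode_error(p, k):
    """Majority decoding fails iff at least k+1 of the 2k+1 bits are flipped."""
    n = 2 * k + 1
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1, n + 1))

def hoeffding_bound(p, k):
    """Hoeffding: P(#flips >= n/2) <= exp(-2n(1/2 - p)^2), valid since error needs > n/2 flips."""
    n = 2 * k + 1
    return exp(-2 * n * (0.5 - p) ** 2)

for k in (0, 1, 2, 5):
    print(k, p_decode_error(0.1, k), hoeffding_bound(0.1, k))
```

For p = 0.1 the exact error probability drops from 0.1 at k = 0 to 0.028 at k = 1, while the bound decays exponentially in n = 2k+1.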

 
AngleWyrm-paused
 
Reply Sun 3 Dec, 2017 01:48 pm
With a single bit (k = 0), the probability of error is just p, which is bounded above by 1/2. As we add bits to the encoding stream, the chance of the new length NOT containing an error goes down. That relationship between length and line noise can be expressed as

noError = chancePerTry^numberOfTries

where chancePerTry = 1 - p is the per-bit success probability and numberOfTries = 2k+1 is the block length.
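Reading chancePerTry as 1 - p (an assumption; the post doesn't define it), that decay is easy to check numerically. A minimal sketch with an illustrative p = 0.1:

```python
p = 0.1                              # per-bit flip probability (illustrative value)
no_error = lambda n: (1 - p) ** n    # probability the whole n-bit block arrives unflipped

for n in (1, 3, 11):
    print(n, no_error(n))            # (1-p)^n shrinks as the block length n grows
```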
markr
 
Reply Sat 9 Dec, 2017 03:42 am
@AngleWyrm-paused,
Quote:
As we add bits to the encoding stream, the chance of the new length NOT containing an error goes down.

Yes, but the chance of decoding the message properly goes up, and the question was asked about decoding errors, not transmission errors.
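A quick simulation (a sketch, not from the thread; the function name is made up) makes the distinction concrete: blocks that arrive with zero flipped bits get rarer as k grows, while blocks that majority-decode correctly get more common:

```python
import random

def simulate(p, k, trials=100_000, seed=0):
    """Fraction of blocks received with zero flips vs. fraction majority-decoded correctly."""
    rng = random.Random(seed)
    n = 2 * k + 1
    intact = decoded = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(n))
        intact += (flips == 0)       # no transmission error at all
        decoded += (flips <= k)      # at most k flips: majority still matches the sent bit
    return intact / trials, decoded / trials

for k in (0, 2, 5):
    print(k, simulate(0.1, k))
```

At p = 0.1, the intact fraction falls toward (0.9)^11 ≈ 0.31 by k = 5, while the correctly decoded fraction climbs well above 0.99.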
 
 

 