Information theory

 
 
Reply Thu 30 Nov, 2017 01:39 am
Alice wants to send Bob the result X of a fair coin flip over a binary symmetric channel that flips each bit with probability 0 < p < 1/2. To avoid errors in transmission, she encodes heads as a sequence of 2k+1 zeroes and tails as a sequence of 2k+1 ones.
(a) For any fixed k, give an upper bound on the probability of error of decoding.
(b) Show that for any fixed k, the above decoding scheme (of choosing the majority) minimizes the probability of error of decoding.
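For part (a), here is a quick numeric sketch (my own, not from the thread) of the exact majority-decoding error probability, together with one simple upper bound: since each bad pattern has at least k+1 flips, every term p^i (1-p)^(n-i) is at most p^(k+1) (1-p)^k, and the binomial coefficients sum to at most 2^(2k+1), giving P(error) ≤ 2p·(4p(1-p))^k. The value p = 0.1 is assumed purely for illustration.

```python
from math import comb

def decode_error_prob(k: int, p: float) -> float:
    """P(majority decode is wrong) = P(more than k of the 2k+1 bits flip)."""
    n = 2 * k + 1
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1, n + 1))

def upper_bound(k: int, p: float) -> float:
    """Crude bound: 2^(2k+1) * p^(k+1) * (1-p)^k = 2p * (4p(1-p))^k."""
    return 2 * p * (4 * p * (1 - p))**k

p = 0.1  # assumed crossover probability, for illustration only
for k in range(5):
    exact = decode_error_prob(k, p)
    bound = upper_bound(k, p)
    assert exact <= bound  # the bound holds for every k
    print(f"k={k}: P(error)={exact:.3e} <= bound {bound:.3e}")
```

Since 4p(1-p) < 1 whenever p ≠ 1/2, the bound (and hence the error probability) goes to zero geometrically as k grows.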
Type: Question • Score: 1 • Views: 794 • Replies: 2

 
AngleWyrm-paused
 
Reply Sun 3 Dec, 2017 01:48 pm
The system's upper bound is a 1-in-2 chance of generating an error (since p < 1/2). As we add bits to the encoding stream, the chance of the new length NOT containing an error goes down. That relationship between length and line noise can be expressed as

noError = chancePerTry^numberOfTries, i.e. P(no error in n bits) = (1 − p)^n
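That formula in code (a small sketch of my own, with p = 0.1 assumed for illustration): the chance that an n-bit block arrives with no flipped bit at all is (1 − p)^n, which indeed shrinks as n grows.

```python
p = 0.1  # assumed per-bit flip probability, for illustration
for n in (1, 3, 5, 7):
    no_error = (1 - p)**n  # chancePerTry^numberOfTries with chancePerTry = 1 - p
    print(f"n={n}: P(no error) = {no_error:.4f}")
```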
markr
 
Reply Sat 9 Dec, 2017 03:42 am
@AngleWyrm-paused,
Quote:
As we add bits to the encoding stream, the chance of the new length NOT containing an error goes down.

Yes, but the chance of decoding the message properly goes up, and the question asked about decoding errors, not transmission errors.
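Both effects side by side (my own numeric illustration, p = 0.1 assumed): as the repetition length n = 2k+1 grows, the chance that the block arrives with no flipped bit falls, while the chance that majority decoding still recovers the sent bit rises.

```python
from math import comb

p = 0.1  # assumed crossover probability, for illustration
for k in range(4):
    n = 2 * k + 1
    no_flip = (1 - p)**n  # probability of zero transmission errors
    # majority decoding succeeds iff at most k of the n bits flip
    decode_ok = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))
    print(f"n={n}: P(no bit flipped)={no_flip:.4f}, P(decoded correctly)={decode_ok:.4f}")
```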
 
 


 