Information Theory & Coding
Introduction:

• Information theory is a branch of probability theory that can be applied to the study of communication systems.
• Information theory deals with the mathematical modelling and analysis of a communication system rather than with physical sources and physical channels.
• The performance of the system depends upon the available signal power, the channel noise and the bandwidth.
• Based on these parameters it is possible to establish the conditions for error-free transmission.
• These conditions are referred to as Shannon's theorem.
• Coding means assigning binary values to the information. The main purpose of coding is to improve the efficiency of the communication system.
Advantages:
1. Error probability is reduced, since errors can be detected and corrected.
2. Excellent channel coding techniques are available that minimize the chances of errors.
3. It is possible to utilize the full capacity of the channel with an appropriate coding technique.
4. Coding techniques are available for both forward error correction (FEC) and automatic repeat request (ARQ).
Limitations:
1. All channel coding techniques add redundant bits for error detection and correction. This increases the transmitted data rate and therefore requires a wider bandwidth.
2. Coding and decoding involve computation, which introduces delays in transmission and reception.
Discrete messages and Information Content:
• An information source generates an output which is random in nature.
• The source can be characterized in terms of its statistical properties, but its exact output cannot be predicted.
• If the source output were known exactly in advance, there would be no need to transmit it.
• An information source can be analog or discrete.
Discrete Information Source:
• A source is said to be discrete if it emits any one symbol from a finite set of symbols.
• To prepare the mathematical model of a discrete information source, consider that the source has an alphabet of L letters: {x1, x2, x3, ..., xL}.
• The source emits any one letter from this alphabet at a time.
• Let us further consider that each letter in the alphabet {x1, x2, x3, ..., xL} has a probability of occurrence pk.
• The probability of occurrence of the random output xk is represented as,
pk = P(X = xk),  1 ≤ k ≤ L
Here pk is the probability of occurrence of the random output X = xk, and the summation of the probabilities of all letters is equal to unity. A small sketch of this model is given below.
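As a minimal Python sketch of this model (the alphabet and the probability values below are purely illustrative, not taken from the text):

import random

# A discrete source: an alphabet {x1, ..., xL} with a probability pk
# assigned to each letter (illustrative values).
source = {"x1": 0.5, "x2": 0.25, "x3": 0.125, "x4": 0.125}

# The probabilities of all letters must sum to unity.
assert abs(sum(source.values()) - 1.0) < 1e-12

# The source emits any one letter from the alphabet according to these probabilities.
symbol = random.choices(list(source.keys()), weights=list(source.values()), k=1)[0]
print("emitted symbol:", symbol)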
Discrete Memoryless Source:
• A source whose individual output letters are statistically independent is called a discrete memoryless source.
• In such a source, the current letter is statistically independent of all past and future outputs.
Stationary Source:
• If the source output is statistically dependent on past outputs (i.e., the source has memory) but its statistical properties do not change with time, the source is called a stationary source.
Analog Source:
• An analog source has an output waveform x(t) that is a sample function of a random process.
• The output of an analog source can be converted into a discrete-time source with the help of the sampling theorem.
Uncertainty:
• Consider a source which emits discrete symbols randomly from a fixed alphabet, i.e.
X = {x0, x1, x2, ..., xK-1}
• The various symbols in X have probabilities p0, p1, p2, ..., pK-1, which can be written as
P(X = xk) = pk,  k = 0, 1, 2, ..., K-1
• This set of probabilities satisfies the condition 0 ≤ pk ≤ 1, with p0 + p1 + ... + pK-1 = 1.
• The idea of information is related to uncertainty or surprise.
• Consider the emission of symbol X=xk from the source.
• If the probability of xk is pk = 0, then such a symbol is impossible.
• Similarly when probability pk = 1, then such symbol is sure.
• In both the cases there is no ‘surprise’ and hence no information is produced when symbol xk is emitted.
• The lower the probability pk, the greater the surprise or uncertainty.
• Before the event X=xk occurs, there is an amount of uncertainty.
• When the symbol X=xk occurs, there is an amount of surprise.
• After the occurrence of the symbol X=xk, there is the gain in amount of information.
Definition of Information:
• Let us consider a communication system which transmits messages m1, m2, m3, ... with probabilities of occurrence p1, p2, p3, .... The amount of information carried by the message mk with probability pk is given as,
Amount of Information: Ik = log2 (1 / pk)
The unit of information is the bit (also called a binit).
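A short Python sketch of this defining formula (the probability values are only illustrative):

from math import log2

def information_content(p):
    # Amount of information Ik = log2(1 / pk), in bits, for a message of probability p.
    return log2(1 / p)

# A rare message carries more information than a common one.
print(information_content(0.5))    # 1.0 bit
print(information_content(0.125))  # 3.0 bits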
Properties of Information:
• If there is more uncertainty about the message, the information carried is also more.
• If the receiver already knows the message being transmitted, the amount of information carried is zero.
• If I1 is the information carried by message m1 and I2 is the information carried by m2, then the amount of information carried due to m1 and m2 together is I1 + I2.
• If there are M = 2^N equally likely messages, then the amount of information carried by each message will be N bits (a quick numerical check follows this list).
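A quick numerical check of the last two properties, using illustrative probabilities:

from math import log2

def info(p):
    return log2(1 / p)

# Additivity: for independent messages m1 and m2 the joint probability is
# p1 * p2, so the information carried together equals I1 + I2.
p1, p2 = 0.25, 0.5
assert abs(info(p1 * p2) - (info(p1) + info(p2))) < 1e-12

# M = 2**N equally likely messages: each has probability 1/M and carries N bits.
N = 3
M = 2 ** N
print(info(1 / M))  # 3.0 bits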
Entropy (Average Information):
• Consider that we have M different messages. Let these messages be m1, m2, m3, ..., mM with probabilities of occurrence p1, p2, p3, ..., pM.
• Suppose that a sequence of L messages is transmitted. If L is very large, then
p1L messages of m1 are transmitted,
p2L messages of m2 are transmitted,
...
pML messages of mM are transmitted.
Hence the information due to message m1 will be
I1 =log2 (1/p1)
Since there are p1L messages of m1, the total information due to all messages of m1 will be,
I1(total) = p1L log2 (1/p1)
Similarly, the total information due to all messages of m2 will be
I2(total) = p2L log2 (1/p2)
Thus the total information carried due to the sequence of L messages will be,
I(total) = I1(total) + I2(total) +....+ IM(total)
I(total) = p1L log2 (1/p1)+ p2L log2 (1/p2)+....+ pML log2 (1/pM)
• The average information per message will be,
Average Information = Total Information / Number of messages
= I(total) / L
Average information is represented by the entropy, denoted by H. Thus,
H = I(total) / L
H = p1log2 (1/p1)+ p2log2 (1/p2)+....+ pMlog2 (1/pM)
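A minimal Python sketch of this entropy formula, with illustrative message probabilities:

from math import log2

def entropy(probs):
    # H = sum of pk * log2(1 / pk), in bits per message; terms with pk = 0 contribute nothing.
    return sum(p * log2(1 / p) for p in probs if p > 0)

# Illustrative source with four messages of unequal probability.
print(entropy([0.5, 0.25, 0.125, 0.125]))  # 1.75 bits per message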
Properties of Entropy:
• Entropy is zero if the outcome is certain, i.e. H = 0 when some pk = 1 and all other probabilities are zero; a symbol with pk = 0 contributes nothing to H.
• When pk = 1/M for all the M symbols, the symbols are equally likely; for such a source the entropy is H = log2 M (verified numerically below).
• Upper bound on entropy is given as,
Hmax = log2 M
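A small numerical check of these two properties (M = 8 is an illustrative choice):

from math import log2

def entropy(probs):
    return sum(p * log2(1 / p) for p in probs if p > 0)

M = 8
equally_likely = [1 / M] * M
print(entropy(equally_likely), log2(M))   # both 3.0, so H = log2(M)

# Any unequal distribution over the same M symbols stays below the upper bound.
skewed = [0.5] + [0.5 / (M - 1)] * (M - 1)
print(entropy(skewed) < log2(M))          # True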
Information Rate:
• The information rate is represented by R and is given as,
Information rate R = r H
where R is the information rate in bits of information per second,
H is the entropy (average information) in bits per message, and
r is the rate at which messages are generated, in messages per second.
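As a small worked example (the message rate and the probabilities are illustrative):

from math import log2

def entropy(probs):
    return sum(p * log2(1 / p) for p in probs if p > 0)

r = 2000                                  # messages generated per second (assumed)
H = entropy([0.5, 0.25, 0.125, 0.125])    # 1.75 bits per message
R = r * H
print(R)  # 3500.0 bits of information per second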
Extension of Discrete Memoryless Source:
• Consider a discrete memoryless source having the alphabet X = {x1, x2, x3, ..., xM}.
• This source emits individual symbols from an alphabet of M symbols. The symbols can be considered in blocks (groups) of two or more successive symbols.
• Each such block can be regarded as being produced by an extended source.
• This extended source has the alphabet {X^n} with M^n distinct symbols.
• Here n is the number of successive symbols in one group or block.
• The symbols generated by the discrete memoryless source are statistically independent.
• Hence the probability of a block of symbols is obtained as the product of the individual symbol probabilities.
• Hence the entropy of X^n and the entropy H(X) are related as,
H(X^n) = n H(X)
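A short Python check of this relation for the second extension (n = 2), with illustrative symbol probabilities:

from math import log2
from itertools import product

def entropy(probs):
    return sum(p * log2(1 / p) for p in probs if p > 0)

source = [0.5, 0.3, 0.2]   # illustrative memoryless source with M = 3 symbols
n = 2                      # order of the extension

# The n-th extension has M**n block symbols; because the source is memoryless,
# each block probability is the product of the individual symbol probabilities.
blocks = [p * q for p, q in product(source, repeat=n)]

print(entropy(blocks))       # entropy of the extended source, H(X^n)
print(n * entropy(source))   # equals n * H(X)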