What is the entropy of a communication system that consists of six messages with probabilities 1/8, 1/8, 1/8, 1/8, 1/4, and 1/4, respectively?

This question was previously asked in
ESE Electronics Prelims 2021 Official Paper
  1. 1 bit/message
  2. 2.5 bits/message
  3. 3 bits/message
  4. 4.5 bits/message

Answer (Detailed Solution Below)

Option 2 : 2.5 bits/message

Detailed Solution


Concept:

The entropy of a probability distribution is the average amount of information obtained per draw from that distribution.

It is calculated as:

\(H=\sum_{i=1}^{n}p_{i}\log_{2}\left(\frac{1}{p_{i}}\right)\ \text{bits/symbol}\)

where pi is the probability of occurrence of the i-th symbol.

Calculation:

The entropy will be:

With four messages of probability 1/8 and two of probability 1/4, the entropy is:

\(H=4\times\frac{1}{8}\log_2(8)+2\times\frac{1}{4}\log_2(4)\)

\(H=\frac{4\times3}{8}+\frac{2\times2}{4}=1.5+1\)

H = 2.5 bits/message
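The calculation above can be checked with a short Python sketch (the function name `entropy` is my own label, not from the question):

```python
from math import log2

def entropy(probs):
    """Shannon entropy H = sum of p_i * log2(1/p_i), in bits per message."""
    return sum(p * log2(1 / p) for p in probs)

# The six message probabilities from the question
probs = [1/8, 1/8, 1/8, 1/8, 1/4, 1/4]
print(entropy(probs))  # 2.5
```

Since every probability here is a power of 1/2, each log2 term is an exact integer and the result is exactly 2.5.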

