comp.dsp newsgroup: Information Theory - Entropy: h(A U B)
#1 - 11-18-2009, 10:33 PM
Anand P. Paralkar
Information Theory - Entropy: h(A U B)

Hi,

I am taking an introductory course in Information Theory, where I have come
across:

h(A) = -log(P(A)), where h(A) represents the quantity of information
associated with an event A that has probability P(A).

Further, we use conditional probability to define the quantity of
conditional information, mutual information and so on:

h(A, B) (sometimes denoted as h(A intersection B)) = h(A) + h(B|A)

where h(B|A) is the quantity of conditional information in B given A

and i(A, B) = h(A) - h(A|B) = h(B) - h(B|A)

where i(A, B) is the mutual information between A and B.

All of this can be interpreted in the source-channel-receiver model, where:

h(A) : Quantity of information in the source,
h(B) : Information received from the channel,
i(A, B) : Useful information,
h(B|A) : Information added/subtracted by the channel (channel noise).
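
To keep the definitions straight, here is a tiny numeric sketch in Python (the
probabilities below are made up purely for illustration, not taken from any
text):

import math

# Made-up probabilities, for illustration only
p_a  = 0.5    # P(A)
p_b  = 0.4    # P(B)
p_ab = 0.3    # P(A, B), the joint probability

h_a  = -math.log2(p_a)      # h(A)   = -log P(A)     = 1.000 bit
h_b  = -math.log2(p_b)      # h(B)   = -log P(B)     ~ 1.322 bits
h_ab = -math.log2(p_ab)     # h(A,B) = -log P(A,B)   ~ 1.737 bits

h_b_given_a = h_ab - h_a    # h(B|A) = h(A,B) - h(A) ~ 0.737 bits
h_a_given_b = h_ab - h_b    # h(A|B) = h(A,B) - h(B) ~ 0.415 bits
i_ab = h_a - h_a_given_b    # i(A,B) = h(A) - h(A|B) ~ 0.585 bits
                            #        = h(B) - h(B|A), the same value

print(h_a, h_b, h_ab, h_b_given_a, h_a_given_b, i_ab)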

I would like to know if the quantity h(A U B) is defined (and how). And if
it is defined, what does it signify in the source-channel-receiver model?

Thanks,
Anand


#2 - 11-19-2009, 10:21 PM
Dilip Warrier
Re: Information Theory - Entropy: h(A U B)

On Nov 18, 5:33 pm, "Anand P. Paralkar" <[email protected]>
wrote:
> I would like to know if the quantity h(A U B) is defined (and how). And if
> it is defined, what does it signify in the source-channel-receiver model?


Anand, yes.

The entropy for any event X is defined as:
h(X) = -log p(X).
So, in particular for the event A U B, you have h(A U B) = -log p(A U B).

That said, I haven't seen much use of this quantity in information
theory.
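
Still, just to put a number on it: with a made-up value P(A U B) = 0.6, you
get h(A U B) = -log2(0.6), about 0.737 bits:

import math

p_a_or_b = 0.6                  # P(A U B), made up for illustration
print(-math.log2(p_a_or_b))     # h(A U B) in bits, about 0.737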

Dilip.
#3 - 11-20-2009, 03:20 PM
Clay
Re: Information Theory - Entropy: h(A U B)

On Nov 19, 5:21 pm, Dilip Warrier <[email protected]> wrote:
> The entropy for any event X is defined as:
> h(X) = -log p(X).
> So, in particular for the event A U B, you have h(A U B) = -log p(A U B).


I think you would want to start with the correct definition of entropy:

H(p_1, p_2, p_3, ..., p_n) = -sum(over i) p_i log(p_i)

You can use any reasonable radix for the log function; base 2 is the most
common.
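
For instance, a quick Python sketch of that definition (the example
distributions are arbitrary):

import math

def entropy(probs, base=2):
    # Shannon entropy H = -sum_i p_i * log(p_i); zero-probability terms contribute 0
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))          # 1.0 bit (fair coin)
print(entropy([0.5, 0.25, 0.25]))   # 1.5 bits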

Clay


#4 - 11-20-2009, 04:04 PM
maury
Re: Information Theory - Entropy: h(A U B)

On Nov 18, 4:33 pm, "Anand P. Paralkar" <[email protected]>
wrote:
> I would like to know if the quantity h(A U B) is defined (and how). And if
> it is defined, what does it signify in the source-channel-receiver model?


Anand,
H(X,Y) was derived by using the identity P(X,Y) = P(X) P(Y|X) in the
basic entropy equation.

Now, use the identity P(X U Y) = P(X) + P(Y) - P(X,Y) to find H(X U Y).
You now have the log of a sum instead of the log of a product.
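
A quick numeric check of that route (the probabilities are made up for
illustration):

import math

p_x, p_y, p_xy = 0.5, 0.4, 0.3          # made-up P(X), P(Y), P(X,Y)

h_xy     = -math.log2(p_xy)             # H(X,Y)   = -log P(X,Y)   ~ 1.737 bits
p_x_or_y = p_x + p_y - p_xy             # P(X U Y) by inclusion-exclusion = 0.6
h_x_or_y = -math.log2(p_x_or_y)         # H(X U Y) = -log P(X U Y) ~ 0.737 bits

# Since P(X U Y) >= P(X,Y), the union never carries more information
# than the joint event: H(X U Y) <= H(X,Y).
print(h_xy, h_x_or_y)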

Happy deriving

Maurice Givens