 # FPGA Central

## World's 1st FPGA Portal

comp.dsp newsgroup

### Anand P. Paralkar: Information Theory - Entropy: h(A U B)

Hi,

I am taking an introductory course on Information Theory, where I have come
across:

h(A) = -log(P(A)), where h(A) represents the quantity of information
associated with the event A, with a probability P(A).

Further, we use conditional probability to define the quantity of
conditional information, mutual information and so on:

h(A, B) (sometimes denoted as h(A intersection B)) = h(A) + h(B|A)

where h(B|A) = quantity of conditional information B given A

and i(A, B) = h(A) - h(A|B) = h(B) - h(B|A)

where i(A, B) is the mutual information between A and B.
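These identities are easy to check numerically. A minimal sketch in Python, using made-up probabilities for two events A and B (the numbers are purely illustrative):

```python
import math

def h(p):
    """Information content (in bits) of an event with probability p."""
    return -math.log2(p)

# Made-up probabilities for two events A and B (purely illustrative).
p_A  = 0.5    # P(A)
p_B  = 0.4    # P(B)
p_AB = 0.3    # P(A, B), i.e. P(A intersection B)

h_A = h(p_A)
h_B = h(p_B)
h_AB = h(p_AB)
h_B_given_A = h(p_AB / p_A)   # since P(B|A) = P(A, B) / P(A)
h_A_given_B = h(p_AB / p_B)   # since P(A|B) = P(A, B) / P(B)

# Chain rule: h(A, B) = h(A) + h(B|A)
assert math.isclose(h_AB, h_A + h_B_given_A)

# Mutual information is symmetric: h(A) - h(A|B) = h(B) - h(B|A)
i_AB = h_A - h_A_given_B
assert math.isclose(i_AB, h_B - h_B_given_A)
```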

All of this can be interpreted in the source-channel-receiver model, where:

h(A) : Quantity of information in the source,
h(B) : Received information from the channel
i(A, B): Useful information
h(B|A): Information added/subtracted by channel. (Channel noise.)

I would like to know if the quantity h(A U B) is defined (and how). And if
it is defined, what does it signify in the source-channel-receiver model?

Thanks,
Anand

### Dilip Warrier: Re: Information Theory - Entropy: h(A U B)

On Nov 18, 5:33 pm, "Anand P. Paralkar" <[email protected]> wrote:
> I would like to know if the quantity h(A U B) is defined (and how). And if
> it is defined, what does it signify in the source-channel-receiver model?

Anand, yes.

The entropy for any event X is defined as:

h(X) = -log p(X)

So, in particular, for the event A U B, you have h(A U B) = -log p(A U B).

That said, I haven't seen much use of this quantity in information
theory.
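For a concrete check (a sketch with made-up probabilities), P(A U B) follows from inclusion-exclusion, and h(A U B) is just its negative log:

```python
import math

# Made-up event probabilities (purely illustrative).
p_A, p_B, p_AB = 0.5, 0.4, 0.3

# Inclusion-exclusion: P(A U B) = P(A) + P(B) - P(A intersection B)
p_union = p_A + p_B - p_AB          # 0.6

h_union = -math.log2(p_union)
print(round(h_union, 3))            # 0.737

# A U B is at least as probable as either event alone, so it carries
# at most as much information as either one.
assert h_union <= -math.log2(p_A)
assert h_union <= -math.log2(p_B)
```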

Dilip.
### Clay: Re: Information Theory - Entropy: h(A U B)

On Nov 19, 5:21 pm, Dilip Warrier <[email protected]> wrote:
> The entropy for any event X is defined as:
> h(X) = -log p(X).
> So, in particular for the event A U B, you have h(A U B) = -log p(A U B).

I think you would want to start with the correct definition of entropy:

H(p1,p2,p3,...,p_n) = - sum(over i) p_i log(p_i)

You can use any reasonable base for the logarithm; base 2 is the most common.

Clay
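Clay's distribution-level definition can be sketched directly in Python (base 2, so entropy is measured in bits):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum_i p_i log(p_i); zero-probability terms contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))       # fair coin: 1.0 bit
print(entropy([0.25] * 4))       # uniform over 4 outcomes: 2.0 bits
```

Note the distinction: this H is the average of the per-event quantities h(X) = -log p(X) that the rest of the thread is discussing, weighted by the event probabilities.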

### maury: Re: Information Theory - Entropy: h(A U B)

On Nov 18, 4:33 pm, "Anand P. Paralkar" <[email protected]> wrote:
> I would like to know if the quantity h(A U B) is defined (and how). And if
> it is defined, what does it signify in the source-channel-receiver model?

Anand,
H(X,Y) was derived by substituting the identity P(X,Y) = P(X) P(Y|X) into the
basic entropy equation.

Now, use the identity P(X U Y) = P(X) + P(Y) - P(X,Y) to find H(X U Y). You
now have the log of a sum instead of the log of a product.

Happy deriving,

Maurice Givens
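The contrast Maurice points out can be seen numerically (a sketch with made-up probabilities): the joint probability is a product, so its log splits into a sum of information terms, while the union probability is itself a sum, so its log does not split:

```python
import math

# Made-up probabilities, purely for illustration.
p_X, p_Y, p_XY = 0.5, 0.4, 0.3   # P(X), P(Y), P(X, Y)

# P(X,Y) = P(X) P(Y|X) is a product, so the log splits:
# h(X, Y) = h(X) + h(Y|X)
h_XY = -math.log2(p_XY)
assert math.isclose(h_XY, -math.log2(p_X) + -math.log2(p_XY / p_X))

# P(X U Y) = P(X) + P(Y) - P(X,Y) is a sum, so the log does not split
# into per-event terms; h(X U Y) has to be evaluated directly.
h_union = -math.log2(p_X + p_Y - p_XY)
print(round(h_union, 3))   # 0.737
```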


Copyright 2008 FPGA Central. All rights reserved.