I am taking an introductory course on Information Theory, where I have
come across:

h(A) = -log(P(A)), where h(A) represents the quantity of information
associated with the event A, with a probability P(A).

Further, we use conditional probability to define the quantity of
conditional information, mutual information and so on:

h(A, B) (sometimes denoted as h(A intersection B)) = h(A) + h(B|A)

where h(B|A) = quantity of conditional information B given A

and i(A, B) = h(A) - h(A|B) = h(B) - h(B|A)

where i(A, B) is the mutual information between A and B.

All of this can be interpreted for source-channel-receiver model where:

h(A) : Quantity of information in the source,
h(B) : Received information from the channel
i(A, B): Useful information
h(B|A): Information added/subtracted by the channel (channel noise).

I would like to know if the quantity h(A U B) is defined (and how). And if
it is defined, what does it signify in the source-channel-receiver model?
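The identities above can be checked numerically; here is a minimal sketch
in Python, using made-up probabilities for the two events (the numbers are
purely illustrative):

```python
import math

# Toy joint probabilities for two events A and B (made-up numbers).
p_a = 0.5    # P(A)
p_b = 0.4    # P(B)
p_ab = 0.3   # P(A and B)

def h(p):
    """Quantity of information in bits: h = -log2(p)."""
    return -math.log2(p)

h_a = h(p_a)
h_b = h(p_b)
h_joint = h(p_ab)              # h(A, B) = -log2 P(A and B)
h_b_given_a = h(p_ab / p_a)    # h(B|A) = -log2 P(B|A)
h_a_given_b = h(p_ab / p_b)    # h(A|B) = -log2 P(A|B)

# Chain rule: h(A, B) = h(A) + h(B|A)
assert abs(h_joint - (h_a + h_b_given_a)) < 1e-9

# Mutual information: i(A, B) = h(A) - h(A|B) = h(B) - h(B|A)
i_ab = h_a - h_a_given_b
assert abs(i_ab - (h_b - h_b_given_a)) < 1e-9
print(f"h(A,B) = {h_joint:.4f} bits, i(A,B) = {i_ab:.4f} bits")
```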

On Nov 18, 5:33 pm, "Anand P. Paralkar" <[email protected]>
wrote:
> [quoted text snipped]

Anand, yes.

The entropy for any event X is defined as:
h(X) = -log p(X).
So, in particular for the event A U B, you have h(A U B) = -log p(A U B).

That said, I haven't seen much use of this quantity in information
theory.
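Numerically this is just the same formula applied to the union event; a
minimal sketch, assuming a made-up value for P(A U B):

```python
import math

# Hypothetical probability of the union event (made-up number).
p_union = 0.6  # P(A U B)

# Same definition applies: h(A U B) = -log2 P(A U B)
h_union = -math.log2(p_union)
print(f"h(A U B) = {h_union:.4f} bits")
```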

On Nov 19, 5:21 pm, Dilip Warrier <[email protected]> wrote:
> On Nov 18, 5:33 pm, "Anand P. Paralkar" <[email protected]>
> wrote:
> > [quoted text snipped]
>
> Anand, yes.
>
> The entropy for any event X is defined as:
> h(X) = -log p(X).
> So, in particular for the event A U B, you have h(A U B) = -log p(A U B).
>

I think you would want to start with the correct definition of entropy.

H(p_1, p_2, ..., p_n) = - sum(over i) p_i log(p_i)

You can use any reasonable radix for the log function; base 2 is the
most common.
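That definition translates directly into code; a minimal sketch:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum_i p_i * log(p_i); 0*log(0) taken as 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries one bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0
# A biased coin carries less.
print(entropy([0.9, 0.1]))
```

Note the distinction: h(A) = -log P(A) is the information content of a
single event, while H is the average of that quantity over a whole
distribution.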

On Nov 18, 4:33 pm, "Anand P. Paralkar" <[email protected]>
wrote:
> [quoted text snipped]

Anand,
H(X,Y) was derived by substituting the identity P(X,Y) = P(X) P(Y|X)
into the basic entropy equation.

Now, use the identity P(X U Y) = P(X) + P(Y) - P(X,Y) to find H(X U Y).
Note that you now have the log of a sum instead of the log of a product,
so the union does not decompose into a chain rule the way the joint does.
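The inclusion-exclusion step can be sketched numerically; a minimal
example with made-up probabilities for X and Y:

```python
import math

# Made-up probabilities for two events X and Y.
p_x, p_y, p_xy = 0.5, 0.4, 0.3

# Inclusion-exclusion: P(X U Y) = P(X) + P(Y) - P(X, Y)
p_union = p_x + p_y - p_xy

h_union = -math.log2(p_union)  # log of a sum: no chain-rule split
h_joint = -math.log2(p_xy)     # h(X, Y) = -log2 P(X, Y)

# Since P(X U Y) >= P(X, Y), the union event never carries more
# information than the joint event: h(X U Y) <= h(X, Y).
assert h_union <= h_joint
print(f"h(X U Y) = {h_union:.4f} bits, h(X, Y) = {h_joint:.4f} bits")
```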