Function of Random Variables


S Didde
02-09-2009, 06:33 PM
I am trying to determine the pdf of a function which is a convolution of
two random variables. How do I determine the pdf of the function?

Let me illustrate with a simple case cited in many text books:

Z=X+Y (sum of two random variables)
The final pdf,
fz(z)=fx(x)*fy(y) which is convolution of the individual pdf's, if they
are independent.

I am looking for the pdf fz(z) when,
Z=X*Y (convolution of two random variables)

From the data I have it appears that it is also a convolution of the
individual pdf's when they are independent.
But I am unable to prove it from first principles.
Any help would be highly appreciated.
Thanks,
Stephen
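
For reference, the standard textbook result for the product Z=X*Y of two
independent random variables is not an ordinary convolution but its
multiplicative analogue:

fz(z) = integral of fx(x) fy(z/x) / |x| dx

A minimal numerical sketch of that formula in Python, assuming NumPy is
available; the U(0,1) factors are an arbitrary choice, for which the
integral works out to fz(z) = -ln(z) on (0,1):

import numpy as np

rng = np.random.default_rng(0)
z = rng.random(1_000_000) * rng.random(1_000_000)  # samples of Z = X * Y

# Empirical density of Z versus the predicted fz(z) = -ln(z).
edges = np.linspace(0.05, 1.0, 40)                 # avoid z ~ 0, where -ln(z) blows up
hist, _ = np.histogram(z, bins=edges, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.abs(hist + np.log(centers)).max())        # near zero, up to binning noise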

Rune Allnor
02-09-2009, 06:44 PM
On 9 Feb, 19:33, "S Didde" <[email protected]> wrote:
> I am trying to determine the pdf of a function which is a convolution of
> two random variables. How do I determine the pdf of the function?
>
> Let me illustrate with a simple case cited in many text books:
>
> Z=X+Y (sum of two random variables)
> The final pdf,
> fz(z)=fx(x)*fy(y) which is convolution of the individual pdf's, if they
> are independent.
>
> I am looking for the pdf fz(z) when,
> Z=X*Y (convolution of two random variables)
>
> From the data I have it appears that it is also a convolution of the
> individual pdf's when they are independent.
> But I am unable to prove it from first principles.

A quick look in Papoulis' "Probability, Random Variables
and Stochastic Processes" (1992) only uncovers discussions
of convolving PDFs, not data.

An off-the-top-of-my-head guess would be that the convolution
of data would invoke the Central Limit Theorem, such that
the end PDF of z would asymptotically become Gaussian.

Rune
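
A rough numerical illustration of that guess, assuming NumPy; the uniform
starting density is an arbitrary choice. Repeatedly convolving a density
with itself (i.e. summing iid variables) drives the shape toward a
Gaussian, per the Central Limit Theorem:

import numpy as np

dx = 0.01
f = np.ones(100)                # uniform density on [0, 1), total area 1
g = f.copy()
for _ in range(7):              # after the loop: pdf of a sum of 8 iid uniforms
    g = np.convolve(g, f) * dx  # discrete approximation of the convolution integral
print(g.sum() * dx)             # ~1.0, so g is still a valid density
# plotting g shows the familiar bell shape (an Irwin-Hall density)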

02-09-2009, 06:52 PM
On Feb 9, 12:33 pm, "S Didde" <[email protected]> wrote:
> I am trying to determine the pdf of a function which is a convolution of
> two random variables. How do I determine the pdf of the function?
>
> Let me illustrate with a simple case cited in many text books:
>
> Z=X+Y (sum of two random variables)
> The final pdf,
> fz(z)=fx(x)*fy(y) which is convolution of the individual pdf's, if they
> are independent.
>
> I am looking for the pdf fz(z) when,
> Z=X*Y (convolution of two random variables)
>
> From the data I have it appears that it is also a convolution of the
> individual pdf's when they are independent.
> But I am unable to prove it from first principles.
> Any help would be highly appreciated.
> Thanks,
> Stephen

Perhaps if you defined what you mean by the convolution
of two random variables, someone might be able to help
you.

Gordon Sande
02-09-2009, 07:09 PM
On 2009-02-09 14:33:42 -0400, "S Didde" <[email protected]> said:

> I am trying to determine the pdf of a function which is a convolution of
> two random variables. How do I determine the pdf of the function?
>
> Let me illustrate with a simple case cited in many text books:
>
> Z=X+Y (sum of two random variables)
> The final pdf,
> fz(z)=fx(x)*fy(y) which is convolution of the individual pdf's, if they
> are independent.
>
> I am looking for the pdf fz(z) when,
> Z=X*Y (convolution of two random variables)

What is your definition of convolution of random variables?

In X+Y you took a single number distributed as X and likewise for Y
and combined the two numbers. For a convolution you need two functions
to combine to yield a new function. Where did the functions come from and
how are they related to the distribution of the points X?

Methinks this is just runaway confusion!

> From the data I have it appears that it is also a convolution of the
> individual pdf's when they are independent.
> But I am unable to prove it from first principles.
> Any help would be highly appreciated.
> Thanks,
> Stephen

S Didde
02-09-2009, 07:37 PM
>I am trying to determine the pdf of a function which is a convolution of
>two random variables. How do I determine the pdf of the function?
>
>Let me illustrate with a simple case cited in many text books:
>
>Z=X+Y (sum of two random variables)
>The final pdf,
>fz(z)=fx(x)*fy(y) which is convolution of the individual pdf's, if they
>are independent.
>
>I am looking for the pdf fz(z) when,
>Z=X*Y (convolution of two random variables)
>
>From the data I have it appears that it is also a convolution of the
>individual pdf's when they are independent.
>But I am unable to prove it from first principles.
>Any help would be highly appreciated.
>Thanks,
>Stephen
>
>
>
Thanks to everyone who responded!
I realized soon after posting this question that I needed to clarify this
further.
The real issue is to find statistical information of a signal going
through an LTI (linear time-invariant) channel.
It is essential for my work to find out the pdf of the signal at the
output.
To get down to the details of the problem: If I know the pdf (or pmf) of
the amplitude distribution of the input signal, specifically binary bits,
and if I can model the impulse response of the channel as another random
variable (probabilistic distribution of amplitudes at the sampling
instants) then I can treat this as a convolution of two random variables.
Hope this makes a bit more sense?
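
A quick empirical way to look at exactly this, assuming NumPy; the 5-tap
channel below is invented for illustration, not taken from the thread.
Push random +/-1 bits through an FIR channel and tabulate the output
amplitudes at the sampling instants:

import numpy as np

rng = np.random.default_rng(1)
bits = rng.choice([-1.0, 1.0], size=200_000)  # equiprobable binary input
h = np.array([0.1, 0.2, 1.0, 0.3, 0.1])       # hypothetical channel taps
y = np.convolve(bits, h, mode="valid")        # amplitudes at the output

# Empirical pmf of the output amplitude (rounding merges float duplicates).
vals, counts = np.unique(np.round(y, 6), return_counts=True)
for v, p in zip(vals, counts / y.size):
    print(f"{v:+.2f}  {p:.4f}")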

robert bristow-johnson
02-09-2009, 09:43 PM
On Feb 9, 2:37 pm, "S Didde" <[email protected]> wrote:
>
> The real issue is to find statistical information of a signal going
> through an LTI (linear time-invariant) channel.
> It is essential for my work to find out the pdf of the signal at the
> output.
> To get down to the details of the problem: If I know the pdf (or pmf) of
> the amplitude distribution of the input signal, specifically binary bits,
> and if I can model the impulse response of the channel as another random
> variable (probabilistic distribution of amplitudes at the sampling
> instants) then I can treat this as a convolution of two random variables.
> Hope this makes a bit more sense?

one nitpick regarding terminology: i think you are considering the
convolution of two random *processes*. (if you like to impress
academics and other eggheads, you might call them "stochastic
processes".) a random variable comes out as a number. you can do to
it what you do to numbers (like add or multiply them). if you do some
well defined operation to a random variable (or two RVs, like add or
multiply), you can come up with an expression for the pdf of the
resulting RV.

you might need to put a little bit more into defining the random
process ("RP") representing the impulse response of the channel. that
RP cannot be white noise, nor may it even be a finite power signal.
it has to be a finite energy signal.

now what Rune said is likely true. if you have a decent model of the
impulse response, restricted to finite energy, and if the RV that is
h(tau)*x(t-tau)

("*" means multiplication here, t is completely random, tau is
specified, but variable.)

gives you a RV that has finite variance (for any tau), then adding it up
for all possible taus will tend toward a normal or gaussian RV. if you
know the autocorrelation (or power spectrum) of x(t) and the
autocorrelation (or energy spectrum) of the h(t) processes, then, i
think that the power spectrum of the convolution result will be the
product of the two spectrums. but, i'll bet the pdf tends to be
gaussian.

r b-j
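
A numerical nudge in the direction of the CLT argument above, assuming
NumPy and an invented finite-energy impulse response. With enough taps,
the filtered binary stream's marginal distribution looks very nearly
Gaussian, with variance equal to the energy of h (the bits have unit
variance):

import numpy as np

rng = np.random.default_rng(2)
bits = rng.choice([-1.0, 1.0], size=500_000)
h = rng.standard_normal(64)               # arbitrary finite-energy taps
y = np.convolve(bits, h, mode="valid")

print(y.mean(), y.var(), (h ** 2).sum())  # mean ~0, variance ~ energy of h
# excess kurtosis, ~0 for a Gaussian:
print(((y - y.mean()) ** 4).mean() / y.var() ** 2 - 3.0)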

S Didde
02-10-2009, 09:33 PM
>On Feb 9, 2:37 pm, "S Didde" <[email protected]> wrote:
>>
>> The real issue is to find statistical information of a signal going
>> through an LTI (linear time-invariant) channel.
>> It is essential for my work to find out the pdf of the signal at the
>> output.
>> To get down to the details of the problem: If I know the pdf (or pmf) of
>> the amplitude distribution of the input signal, specifically binary bits,
>> and if I can model the impulse response of the channel as another random
>> variable (probabilistic distribution of amplitudes at the sampling
>> instants) then I can treat this as a convolution of two random variables.
>> Hope this makes a bit more sense?
>
>one nitpick regarding terminology: i think you are considering the
>convolution of two random *processes*. (if you like to impress
>academics and other eggheads, you might call them "stochastic
>processes".) a random variable comes out as a number. you can do to
>it what you do to numbers (like add or multiply them). if you do some
>well defined operation to a random variable (or two RVs, like add or
>multiply), you can come up with an expression for the pdf of the
>resulting RV.
>
>you might need to put a little bit more into defining the random
>process ("RP") representing the impulse response of the channel. that
>RP cannot be white noise, nor may it even be a finite power signal.
>it has to be a finite energy signal.
>
>now what Rune said is likely true. if you have a decent model of the
>impulse response, restricted to finite energy, and if the RV that is
>
> h(tau)*x(t-tau)
>
>("*" means multiplication here, t is completely random, tau is
>specified, but variable.)
>
>gives you a RV that has finite variance (for any tau), then adding it up
>for all possible taus will tend toward a normal or gaussian RV. if you
>know the autocorrelation (or power spectrum) of x(t) and the
>autocorrelation (or energy spectrum) of the h(t) processes, then, i
>think that the power spectrum of the convolution result will be the
>product of the two spectrums. but, i'll bet the pdf tends to be
>gaussian.
>
>r b-j
>
Thanks r b-j and Rune for the helpful suggestions.
Maybe I should define the problem a bit more: what I am interested in is
finding out how the amplitude probabilities span out as the bits go through
a finite impulse response channel. Ultimately I'd like to compute the
probability of inter-symbol-interference (ISI). Of course, there's the
issue of random noise as a stochastic process interfering with the bit
stream, but I am ignoring that as a second order effect. Signals start out
without any ISI at the input to the channel and due to non-linear phase and
amplitude response in the channel they begin to overlap. I would like to
compute the probability of the amplitude distribution at the output of the
channel. Not sure if this is enough information, but I would surely
appreciate it if anyone has any ideas.
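
Since each output sample is sum_k h[k]*b[n-k], and the bits are
independent, the exact amplitude pmf at the channel output is the
ordinary convolution of the pmfs of the per-tap terms h[k]*b. A minimal
sketch of that computation, assuming equiprobable +/-1 bits and a known
FIR channel (the taps are again invented for illustration):

from collections import defaultdict

h = [0.1, 0.2, 1.0, 0.3, 0.1]    # hypothetical channel taps

pmf = {0.0: 1.0}                 # pmf of an empty sum
for tap in h:
    nxt = defaultdict(float)
    for amp, p in pmf.items():   # convolve with the pmf of tap * b
        nxt[round(amp + tap, 9)] += 0.5 * p
        nxt[round(amp - tap, 9)] += 0.5 * p
    pmf = dict(nxt)

for amp in sorted(pmf):          # exact output amplitude pmf
    print(f"{amp:+.2f}  {pmf[amp]:.5f}")

The probability of ISI driving a sample past a decision threshold is then
just a sum over the relevant entries of this pmf.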