>On Feb 9, 2:37 pm, "S Didde" <[email protected]> wrote:
>>
>> The real issue is to find statistical information of a signal going
>> through an LTI (linear time-invariant) channel.
>> It is essential for my work to find out the pdf of the signal at the
>> output.
>> To get down to the details of the problem: if I know the pdf (or pmf)
>> of the amplitude distribution of the input signal, specifically binary
>> bits, and if I can model the impulse response of the channel as another
>> random variable (a probabilistic distribution of amplitudes at the
>> sampling instants), then I can treat this as a convolution of two
>> random variables. Hope this makes a bit more sense?
>
>one nitpick regarding terminology: i think you are considering the
>convolution of two random *processes*. (if you like to impress
>academics and other eggheads, you might call them "stochastic
>processes".) a random variable comes out as a number. you can do to
>it what you do to numbers (like add or multiply them). if you do some
>well defined operation to a random variable (or two RVs, like add or
>multiply), you can come up with an expression for the pdf of the
>resulting RV.
>
>you might need to put a little bit more into defining the random
>process ("RP") representing the impulse response of the channel. that
>RP cannot be white noise; it cannot even be a finite-power signal. it
>has to be a finite-energy signal.
>
>now what Rune said is likely true. if you have a decent model of the
>impulse response, restricted to finite energy, and if the product
>
> h(tau)*x(t-tau)
>
>("*" means multiplication here, t is completely random, tau is
>specified, but variable)
>
>is a RV with finite variance (for any tau), then adding it up over all
>possible taus tends toward a normal or gaussian RV (that's the central
>limit theorem at work). if you know the autocorrelation (or power
>spectrum) of x(t) and the autocorrelation (or energy spectrum) of the
>h(t) process, then i think the power spectrum of the convolution
>result will be the product of the two spectra. but i'll bet the pdf
>tends to be gaussian.
>
>r b-j
>
Thanks r b-j and Rune for the helpful suggestions.
Maybe I should define the problem a bit more: what I am interested in is
how the amplitude probabilities span out as the bits go through a finite
impulse response channel. Ultimately I'd like to compute the probability
of inter-symbol interference (ISI). Of course, there's the issue of
random noise as a stochastic process interfering with the bit stream,
but I am ignoring that as a second-order effect. Signals start out
without any ISI at the input to the channel, and due to non-linear phase
and amplitude response in the channel they begin to overlap. I would
like to compute the amplitude distribution at the output of the channel.
Not sure if this is enough information, but I would surely appreciate it
if anyone has any ideas.
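For what it's worth, for i.i.d. binary symbols through a known FIR channel
the output amplitude pmf can be computed exactly by convolving the per-tap
pmfs, which is the "convolution of two random variables" idea above applied
tap by tap. Here is a minimal Python sketch; the function name, the 3-tap
channel, and the bipolar +/-1 levels are made-up illustrations, not anything
from the thread. With many taps the resulting pmf starts to look gaussian,
consistent with r b-j's central-limit-theorem remark.

```python
from collections import defaultdict

def output_amplitude_pmf(taps, levels=(-1.0, +1.0), probs=(0.5, 0.5)):
    """pmf of y = sum_k taps[k] * b[k], with b[k] i.i.d. over `levels`.

    Built by convolving the per-tap pmfs one tap at a time: each tap
    contributes an independent RV taking value taps[k]*lvl with the
    corresponding probability, and the pmf of a sum of independent RVs
    is the convolution of their pmfs.
    """
    pmf = {0.0: 1.0}  # pmf of the empty sum
    for h in taps:
        nxt = defaultdict(float)
        for amp, p in pmf.items():
            for lvl, q in zip(levels, probs):
                # round to kill float noise so equal amplitudes merge
                nxt[round(amp + h * lvl, 12)] += p * q
        pmf = dict(nxt)
    return pmf

# a toy 3-tap channel: main cursor plus two ISI taps (illustrative values)
taps = [0.2, 1.0, 0.3]
pmf = output_amplitude_pmf(taps)
# with main tap +1.0 the ISI taps contribute at worst -0.5, so a
# transmitted +1 never changes sign here: the eye is still open.
for amp in sorted(pmf):
    print(f"{amp:+.2f}  p = {pmf[amp]:.4f}")
```

From a pmf like this, an ISI "error" probability is just the total mass on
the amplitudes that land on the wrong side of the decision threshold, so
the eye closure can be read off directly once the taps are known.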