Rigorous definition of the Spectral Density of a random signal?


Carlos Moreno
10-25-2003, 07:47 AM
Hi,

I know this question is not 100% relevant to this newsgroup,
as I'm talking about continuous-time signals, and not DSP...
But I couldn't find a more appropriate place to discuss this.

Anyway, there was a big discussion in one of my classes about
some characteristics of white noise, and the disagreement
seemed to be driven by a discrepancy between our concepts
of what the Power-spectral density of a random signal is.

So, I'm looking for a *rigorous* definition of what the
values of the PSD really represent. (Notice that I don't
want the PSD defined as the Fourier transform of the
auto-correlation of the signal -- what I need is a
definition of what the value of the PSD at a specific
frequency means/represents.)

When I say a "rigorous definition", I mean one such that the
discussion will not be settled with an informal "it
represents the spectral content at the given frequency",
or the typical notion that "white noise has a flat spectral
density, which means that it has equal content at all
frequencies".

No, I'm looking for the *actual* meaning (in a rigorous
mathematical sense) of the value of the PSD at a certain
frequency.

For instance, if we are talking about the definition of
what the values of a probability density function mean,
these would be examples of what I want, and what I don't
want:

What I would NOT want: "The value of the PDF tells you
if it is very likely for the variable to take a value
of x, relative to other values"

(this definition -- besides being incorrect -- is purely
intuitive, and thus may lead to ambiguity and multiple
(mis)interpretations)


What I would want: "The value of the PDF at x is the
limit, as dx approaches zero, of the probability that
the variable takes values in (x, x+dx), divided by dx"
Or: "The pdf function is such that the probability
that the variable takes values within a region R is
given by the integral of the pdf over the region R"

These are both valid examples of what I would call an
actual/rigorous definition of what the meaning of the
value of the PDF is. (I don't know if that is *the*
right way of defining what a PDF is, but the statements
are true and unambiguous, and they describe what a value
of the PDF really tells us)
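To make that limit concrete, here is a quick numerical sketch (my own illustration, plain Python; the exponential distribution, the point x0, and the sample sizes are arbitrary choices): estimate P(x0 < X < x0 + dx)/dx by sampling, and watch it approach the known density as dx shrinks.

```python
import math
import random

random.seed(1)

# PDF of the exponential distribution with rate 1: f(x) = exp(-x).
# Estimate P(x0 < X < x0 + dx) / dx by sampling; as dx shrinks, the
# ratio approaches f(x0) -- the limit definition above, made concrete.
x0 = 0.5
samples = [random.expovariate(1.0) for _ in range(200_000)]

est = {}
for dx in (0.5, 0.1, 0.02):
    p = sum(x0 < s < x0 + dx for s in samples) / len(samples)
    est[dx] = p / dx
    print(dx, est[dx])   # tends toward exp(-0.5) ~ 0.607 as dx -> 0
```

The same samples also verify the second (integral) form: the fraction of samples falling in any region approximates the integral of the pdf over that region.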

So, can someone help me with finding a rigorous definition
of what the value at a given frequency of the PSD of a
random signal means?

Thanks!

Carlos
--

Carlos Moreno
10-27-2003, 07:51 PM
Mike Rosing wrote:
> Carlos Moreno wrote:
>
> >> So, can someone help me with finding a rigorous definition
>> of what the value at a given frequency of the PSD of a
>> random signal means?
>
> howdy Carlos,
>
> What's a rigorous definition of PSD to begin with? What you
> apply the definition to should not matter. If interpretation
> of the application seems to change with the function being
> measured, then it would seem that the original interpretation
> is wrong. Not the definition.

I'm not claiming or suggesting that the definition is wrong.
I just need/want to know what is the definition.

I think I understand it, but since some of my fellow students
were drawing conclusions different from mine, I am sure that
either I or they have a wrong idea of what the PSD of a random
signal is.

I'm trying to find a strict definition of what the value of
the PSD at a given frequency represents. For instance, if you
give me the PSD of a random signal, and I see that the value
at f = 60Hz is 2.5, what I want to know is: what exactly
does that 2.5 represent?

Is it the power of the 60Hz component? (obviously not -- if
it is a random signal, we can not get a measure of power from
the PSD).

Is it the probability that the signal has energy at that
frequency? (obviously not -- the PSD values are not
restricted to the interval 0 to 1)

Is it the average of the total energy of the 60Hz component
over different realizations of the random signal?

Is it the average of energy density at 60Hz over different
realizations of the random signal?

Is it the average of the power of the 60Hz component over
different realizations of the random signal?

(I'm not sure if one of the above is the correct definition;
but that's what I'm trying to know -- that's what I meant
when I said that I'm looking for a "rigorous" definition of
what the PSD values represent)


Cheers,

Carlos
--

Rune Allnor
10-27-2003, 09:48 PM
Carlos Moreno <[email protected]> wrote in message news:<[email protected]>...
> Mike Rosing wrote:
> > Carlos Moreno wrote:
> >
> >> So, can someone help me with finding a rigorous definition
> >> of what the value at a given frequency of the PSD of a
> >> random signal means?
> >
> > howdy Carlos,
> >
> > What's a rigorous definition of PSD to begin with? What you
> > apply the definition to should not matter. If interpretation
> > of the application seems to change with the function being
> > measured, then it would seem that the original interpretation
> > is wrong. Not the definition.
>
> I'm not claiming or suggesting that the definition is wrong.
> I just need/want to know what is the definition.
>
> I think I understand it, but since some of my fellow students
> were drawing conclusions different from mine, I am sure that
> either I or they have a wrong idea of what the PSD of a random
> signal is.

In my humble opinion, the terms "rigorous" and "statistics" are almost
contradictions in terms. While the formalities of maths also apply
when used in statistics, I think the interpretations may not be as
strict as in other mathematical disciplines.

Actually, from the statement of the question I think the interpretation,
not the definition, may be the problem.

> I'm trying to find a strict definition of what the value of
> the PSD at a given frequency represents. For instance, if you
> give me the PSD of a random signal, and I see that the value
> at f = 60Hz is 2.5, what I want to know is: what exactly
> does that 2.5 represent?
>
> Is it the power of the 60Hz component? (obviously not -- if
> it is a random signal, we can not get a measure of power from
> the PSD).

If you mean "the power at 60 Hz in one particular realization",
I agree.

> Is it the probability that the signal has energy at that
> frequency? (obviously not -- the PSD values are not
> restricted to the interval 0 to 1)

Agreed.

> Is it the average of the total energy of the 60Hz component
> over different realizations of the random signal?

Nope. It's the "Power Spectral Density" we are talking about. If there
were a nonzero *energy* component in random noise, electrical energy
would be available for free: just mount a receiver and drain energy out
of the blue.

> Is it the average of energy density at 60Hz over different
> realizations of the random signal?

Nope, due to the energy/density issue noted in the previous point.

> Is it the average of the power of the 60Hz component over
> different realizations of the random signal?

Almost: It's the average power density of all realizations.

> (I'm not sure if one of the above is the correct definition;
> but that's what I'm trying to know -- that's what I meant
> when I said that I'm looking for a "rigorous" definition of
> what the PSD values represent)

That's the kind of question you learn the most from. Test as many
interpretations as you can, discuss them with others, and make up your
own mind about what works for you.

Rune

Bernhard Holzmayer
10-28-2003, 10:38 AM
Rune Allnor wrote:

> Carlos Moreno <[email protected]> wrote in message
> news:<[email protected]>...
>> ...
>> (I'm not sure if one of the above is the correct definition;
>> but that's what I'm trying to know -- that's what I meant
>> when I said that I'm looking for a "rigorous" definition of
>> what the PSD values represent)
>
> That's the kind of question you learn the most from. Test as many
> interpretations as you can, discuss them with others, and make up
> your own mind about what works for you.
>
> Rune

Hi Rune,
that's a sort of brilliant philosophical answer.
It reminds me of a piece of German literature: Lessing's "Nathan
the Wise", or the famous words of Zarathustra...

SCNR,
Bernhard

Rune Allnor
10-28-2003, 02:49 PM
Bernhard Holzmayer <[email protected]> wrote in message news:<[email protected]>...
> Rune Allnor wrote:
>
> > Carlos Moreno <[email protected]> wrote in message
> > news:<[email protected]>...
> >> ...
> >> (I'm not sure if one of the above is the correct definition;
> >> but that's what I'm trying to know -- that's what I meant
> >> when I said that I'm looking for a "rigorous" definition of
> >> what the PSD values represent)
> >
> > That's the kind of question you learn the most from. Test as many
> > interpretations as you can, discuss them with others, and make up
> > your own mind about what works for you.
> >
> > Rune
>
> Hi Rune,
> that's a sort of brilliant philosophical answer.
> It reminds me of a piece of German literature: Lessing's "Nathan
> the Wise", or the famous words of Zarathustra...

I don't know any of those (except for that intro music by one of the
Strausses in "2001: A Space Odyssey")... but thanks for the warning.
We don't want any "Oracle of Delphi"-type answers, do we... ;)

> SCNR,

???

Rune

Bernhard Holzmayer
10-28-2003, 03:58 PM
Rune Allnor wrote:

> Bernhard Holzmayer <[email protected]> wrote in
> message news:<[email protected]>...
>> Rune Allnor wrote:
>>
>> > Carlos Moreno <[email protected]> wrote in
>> > message news:<[email protected]>...
>> >> ...
>> >> (I'm not sure if one of the above is the correct definition;
>> >> but that's what I'm trying to know -- that's what I meant
>> >> when I said that I'm looking for a "rigorous" definition of
>> >> what the PSD values represent)
>> >
>> > That's the kind of question you learn the most from. Test as
>> > many interpretations as you can, discuss them with others, and
>> > make up your own mind about what works for you.
>> >
>> > Rune
>>
>> Hi Rune,
>> that's a sort of brilliant philosophical answer.
>> It reminds me of a piece of German literature: Lessing's "Nathan
>> the Wise", or the famous words of Zarathustra...
>
> I don't know any of those (except for that intro music by one of
> the Strausses in "2001: A Space Odyssey")... but thanks for the
> warning. We don't want any "Oracle of Delphi"-type answers, do
> we... ;)

You're right. However, sometimes, it's close together...
>
>> SCNR,
>
> ???
= _S_orry, _C_ould _N_ot _R_esist
>
> Rune

Bernhard

Piergiorgio Sartor
10-28-2003, 08:32 PM
Carlos Moreno wrote:

> Is it the power of the 60Hz component? (obviously not -- if
> it is a random signal, we can not get a measure of power from
> the PSD).

Why not?

I mean, that's _statistical_ power, not power as
in physics... Sometimes they're not the same...

bye,

--

piergiorgio

Tom Loredo
10-28-2003, 10:19 PM
How about going to the library and looking at any one of many good books
on the statistics of time series? Vol. 1 of Priestley is a good one for
this question.

-Tom

--

To respond by email, replace "somewhere" with "astro" in the
return address.

Dilip V. Sarwate
10-29-2003, 02:15 AM
Carlos Moreno <[email protected]> wrote in message news:<[email protected]>...

> So, I'm looking for a *rigorous* definition of what the
> values of the PSD really represent. (notice that I don't
> want the PSD defined as the Fourier transform of the
> auto-correlation of the signal -- what I need is a
> definition of what the value of the PSD at a specific
> frequency means/represents)
>
>..... material snipped ....
>
> What I would want: "The value of the PDF at x is the
> limit, as dx approaches zero, of the probability that
> the variable takes values in (x, x+dx), divided by dx"
> Or: "The pdf function is such that the probability
> that the variable takes values within a region R is
> given by the integral of the pdf over the region R"
>
> These are both valid examples of what I would call an
> actual/rigorous definition of what the meaning of the
> value of the PDF is.



Would you accept the following?

Consider an ideal narrowband filter with center frequency w,
passband gain 1, and bandwidth dw whose input happens to be the
random process in question. Then, the value of the PSD of the
input process at frequency w is the limit, as dw approaches
zero, of the ratio of the average power of the filter output signal
to the filter bandwidth dw.

or

The PSD is a function S(w) such that the power of the output
signal from a filter (with transfer function H(w)) whose input is
the process is given by the integral of S(w)|H(w)|^2 over the real
line. If H(w) is an ideal bandpass filter with unit passband gain,
then the power is the integral of S(w) over the passband.
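The first definition can be sketched numerically (my own illustration in plain Python; the Gaussian source, block length, band, and trial count are arbitrary assumptions, and the ideal narrowband filter is implemented by keeping a few DFT bins): filter unit-variance white noise to a narrow band, time-average the output power, divide by the bandwidth, and the ratio settles near the two-sided PSD value of 1.

```python
import cmath
import random

random.seed(2)

def band_power(x, k_lo, k_hi):
    """Average power that an ideal filter keeping DFT bins
    [k_lo, k_hi) (and their conjugate mirrors) would pass (Parseval)."""
    N = len(x)
    p = 0.0
    for k in range(k_lo, k_hi):
        X = sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N))
        p += 2.0 * abs(X) ** 2 / N ** 2   # factor 2: conjugate bin N-k
    return p

N, k_lo, k_hi, trials = 128, 20, 28, 200
df = (k_hi - k_lo) / N            # filter bandwidth, cycles/sample
est = 0.0
for _ in range(trials):
    x = [random.gauss(0.0, 1.0) for _ in range(N)]
    est += band_power(x, k_lo, k_hi) / (2.0 * df)  # 2*df: both sidebands
est /= trials
print(est)   # settles near 1.0, the two-sided PSD of this noise
```

Shrinking the band and averaging over more realizations tightens the estimate, which is exactly the limit the definition describes.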

Carlos Moreno
10-29-2003, 02:40 AM
Tom Loredo wrote:
> How about going to the library and looking at any one of many good books
> on the statistics of time series? Vol. 1 of Priestley is a good one for
> this question.

Well, one thing that prompted me to bring the subject up for a
"live" discussion with people who really know the subject
is that a couple of books (ironically, with a common co-
author) give interpretations that, the way I see it, are
contradictory. They do not give a definition of what the
PSD is, but just spit out a bunch of blah-blah about what I
would call "the consequences, from an intuitive point of
view, of what PSD is" :-\ You see how one can get pretty
frustrated :-(

I guess I'll look up Priestley's book, to see if I can get
a better perspective on the subject.

Thanks!

Carlos
--

Carlos Moreno
10-29-2003, 04:59 AM
Rune Allnor wrote:

> In my humble opinion, the terms "rigorous" and "statistics" are almost
> contradictions in terms. While the formalities of maths also apply
> when used in statistics, I think the interpretations may not be as
> strict as in other mathematical disciplines.

That does not mean that probability/statistics is not a rigorous
subject -- it simply means that a lot of people do not understand
it and therefore misinterpret it.

I can imagine Kolmogorov right now desperately rolling in his grave
if he heard your comment!! :-)

>>Is it the average of the total energy of the 60Hz component
>>over different realizations of the random signal?
>
> Nope. It's the "Power Spectral Density" we are talking about. If there
> were a nonzero *energy* component in random noise, electrical energy
> would be available for free: just mount a receiver and drain energy out
> of the blue.

???

Electrical energy *is* available for free -- though not in whatever
amount we may want. But if you mount a receiver, you *can*
indeed drain energy from it. You're not "creating" energy out
of the blue -- you're transforming energy from whatever the physical
source of the noise is into electrical energy that you then extract;
there's definitely no magic in that.

>>Is it the average of energy density at 60Hz over different
>>realizations of the random signal?
>
> Nope, due to the energy/density issue noted in the previous point.

Actually, I (doubly) disagree on this point -- there is a non-zero
energy component *in a frequency band of non-zero length*. Notice
that the term I used, "energy density", refers to (informally
speaking) the amount of energy per unit of frequency (i.e.,
density in the above definition would be defined as the limit, as
delta-omega approaches 0, of the energy contained in the band
(omega, omega + delta-omega), divided by delta-omega).

Notice that with this definition, the energy of a particular
frequency component is zero.

>>Is it the average of the power of the 60Hz component over
>>different realizations of the random signal?
>
> Almost: It's the average power density of all realizations.

Ok, here we go... This can not be right. I know the name
says "power density", but this simply can not refer to power.

First of all, power is a function of time (power is an
instantaneous measure of the signal that takes different
values over time).

But even assuming that you're talking about power as the
average power over the entire realization (i.e., the average
over time), I still disagree!

The Fourier transform of a signal represents *energy* density
(unless we're talking about periodic signals, but that's a
different thing, and it's obviously not the case with random
signals in general).

My interpretation of PSD was always the average of the
Fourier transforms of all possible realizations of the random
signal -- if that's the case, then the values of PSD represent
the average energy density (over all possible realizations).

The notion of "power density" has counter-examples, the way I
see it: if the PSD of white noise, which is a constant,
represents power per unit of frequency, then the power would
be infinite; that's certainly not the case for Gaussian white
noise; or white noise where, say, the value at each time is
an independent random variable uniformly distributed between
-1 and 1 (or whatever values). The power of such signals is
definitely not infinite; however, they're both white noise,
and they do have a PSD that is a constant for all frequencies.

If we are talking about energy density, then the above leads
to no contradiction: we integrate with respect to frequency,
and of course we obtain infinity: the signal has infinite
energy (the signal taken as a whole, from t = -infinity to
+infinity)
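For what it's worth, the discrete-time version of this can be checked numerically (a sketch of mine in plain Python; the uniform source, block length, bin choices, and trial count are arbitrary). Sampled white noise drawn uniformly from (-1,1) has variance 1/3; its averaged periodogram is flat at about 1/3, and since the band of normalized frequencies is finite ([-1/2, 1/2)), integrating that flat density gives the finite average power 1/3 -- the infinite-power paradox only appears when the flat density is extended over an infinite band.

```python
import cmath
import random

random.seed(3)

N, trials = 128, 400
bins = (0, 17, 51)              # a few arbitrary DFT bins
acc = {k: 0.0 for k in bins}
for _ in range(trials):
    x = [random.uniform(-1.0, 1.0) for _ in range(N)]
    for k in bins:
        X = sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N))
        acc[k] += abs(X) ** 2 / N   # periodogram ordinate at bin k
for k in bins:
    print(k, acc[k] / trials)       # each hovers around 1/3 = Var(U(-1,1))
# Flat density over the finite band [-1/2, 1/2) integrates to the
# finite average power 1/3.
```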

Does it make sense what I'm saying?

Carlos
--

Rune Allnor
10-29-2003, 02:16 PM
Carlos Moreno <[email protected]> wrote in message news:<[email protected]>...
> Rune Allnor wrote:
>
> > In my humble opinion, the terms "rigorous" and "statistics" are almost
> > contradictions in terms. While the formalities of maths also apply
> > when used in statistics, I think the interpretations may not be as
> > strict as in other mathematical disciplines.
>
> That does not mean that probability/statistics is not a rigorous
> subject -- it simply means that a lot of people do not understand
> it and therefore misinterpret it.
>
> I can imagine Kolmogorov right now desperately rolling in his grave
> if he heard your comment!! :-)

My point is only that if you use statistics and rigorous mathematics
to describe the probability distribution of a fair die, you get

P(x=n) = 1/6, n=1,2,3,4,5,6 [1]

i.e. the die will *on average* show one eye every 6th time it is rolled.
A flawed ("strict") interpretation is to say that "it will show one eye
exactly once in the next six throws". I've heard that kind of statement
made.
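The difference between the two readings is easy to simulate (a sketch; the trial counts are arbitrary choices of mine): the long-run relative frequency does tend to 1/6, but "exactly one ace per block of six throws" happens only about 40% of the time.

```python
import random

random.seed(4)

throws = [random.randint(1, 6) for _ in range(60_000)]

# The rigorous statement: the long-run relative frequency of any one
# face tends to 1/6.
freq = throws.count(1) / len(throws)
print(freq)            # ~ 0.1667

# The flawed "strict" reading: exactly one ace in each block of six.
blocks = [throws[i:i + 6] for i in range(0, len(throws), 6)]
exactly_one = sum(b.count(1) == 1 for b in blocks) / len(blocks)
print(exactly_one)     # ~ 0.40 = 6*(1/6)*(5/6)^5, nowhere near 1
```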

> >>Is it the average of the total energy of the 60Hz component
> >>over different realizations of the random signal?
> >
> > Nope. It's the "Power Spectral Density" we are talking about. If there
> > were a nonzero *energy* component in random noise, electrical energy
> > would be available for free: just mount a receiver and drain energy out
> > of the blue.
>
> ???
>
> Electrical energy *is* available for free -- though not in whatever
> amount we may want. But if you mount a receiver, you *can*
> indeed drain energy from it. You're not "creating" energy out
> of the blue -- you're transforming energy from whatever the physical
> source of the noise is into electrical energy that you then extract;
> there's definitely no magic in that.

Could you provide examples? Admittedly, a solar panel or a hydroelectric
generator are examples of receivers that actually generate
energy. I was thinking of radio receivers, where the measured signal
is used to modulate a flow of energy internal to the receiver, energy
that was provided by the receiver's power supply.

OK, you got me wondering if I confuse "power" and "energy"... my point
is that you can't charge a battery by just connecting it to a receiving
antenna and draining energy/power from the static noise the antenna
receives from the surroundings.

> >>Is it the average of energy density at 60Hz over different
> >>realizations of the random signal?
> >
> > Nope, due to the energy/density issue noted in the previous point.
>
> Actually, I (doubly) disagree on this point -- there is a non-zero
> energy component *in a frequency band of non-zero length*. Notice
> that the term I used, "energy density", refers to (informally
> speaking) the amount of energy per unit of frequency (i.e.,
> density in the above definition would be defined as the limit, as
> delta-omega approaches 0, of the energy contained in the band
> (omega, omega + delta-omega), divided by delta-omega).
>
> Notice that with this definition, the energy of a particular
> frequency component is zero.
>
> >>Is it the average of the power of the 60Hz component over
> >>different realizations of the random signal?
> >
> > Almost: It's the average power density of all realizations.
>
> Ok, here we go... This can not be right. I know the name
> says "power density", but this simply can not refer to power.
>
> First of all, power is a function of time (power is an
> instantaneous measure of the signal that takes different
> values over time).
>
> But even assuming that you're talking about power as the
> average power over the entire realization (i.e., the average
> over time), I still disagree!
>
> The Fourier transform of a signal represents *energy* density
> (unless we're talking about periodic signals, but that's a
> different thing, and it's obviously not the case with random
> signals in general).

The difference is in the domain of the function. Most of the
discussion about "random signals" in DSP is based on processes
that are at least "stationary", and most are also assumed to
be "ergodic".

Stationary signals are, by definition, assumed to have infinite
support, i.e.

E{x(t)} = m_x, -infinity < t < infinity [2]

and are power signals. A pulsed random signal where the pulse
exists between T_start and T_stop would in general be described as

           { 0,        t < T_start
E{x(t)} =  { m_x(t),   T_start <= t < T_stop        [3]
           { 0,        T_stop <= t

Since the contributions to the Fourier integral come from a limited
time domain [T_start, T_stop), this is an energy signal.

> My interpretation of PSD was always the average of the
> Fourier transforms of all possible realizations of the random
> signal -- if that's the case, then the values of PSD represent
> the average energy density (over all possible realizations).
>
> The notion of "power density" has counter-examples, the way I
> see it: if the PSD of white noise, which is a constant,
> represents power per unit of frequency, then the power would
> be infinite; that's certainly not the case for Gaussian white
> noise; or white noise where, say, the value at each time is
> an independent random variable uniformly distributed between
> -1 and 1 (or whatever values). The power of such signals is
> definitely not infinite; however, they're both white noise,
> and they do have a PSD that is a constant for all frequencies.
>
> If we are talking about energy density, then the above leads
> to no contradiction: we integrate with respect to frequency,
> and of course we obtain infinity: the signal has infinite
> energy (the signal taken as a whole, from t = -infinity to
> +infinity)

The Fourier transform applies to square-integrable functions, that is,
functions where the integral over the domain of definition satisfies

integral |x(t)|^2 dt < infinity [4]

where the integration limits must be specified in each case. If the
integral goes between plus and minus infinity and you want to integrate
a sine or cosine, you have to modify the integral to

lim (1/T) integral_{-inf}^{inf} |x(t)|^2 dt < infinity [5]
T -> infinity

or the maths doesn't work. That's what is known as a "Power Signal".
As I commented above, if the contributions come from a limited
time domain, you can integrate between finite limits and the signal
is an energy signal.

As for the example of white noise/constant spectrum, my very crude
interpretation is that the Fourier integral yields infinite energy
but finite power, since white noise is stationary and therefore must
be regarded as a power signal.

> Does it make sense what I'm saying?

Yes, sort of. I would be a bit happier if we could agree on exactly
what a "random signal" is. The fundamental assumption that forms the
basis of my discussion, is that the signal is an infinite-domain,
stationary power signal that has to be Fourier transformed under the
constraint [5] above.

Rune

Gordon Sande
10-29-2003, 03:11 PM
>Could you provide examples? Admittedly, a solar panel or a hydroelectric
>generator are examples of receivers that actually generate
>energy. I was thinking of radio receivers, where the measured signal
>is used to modulate a flow of energy internal to the receiver, energy
>that was provided by the receiver's power supply.

>OK, you got me wondering if I confuse "power" and "energy"... my point
>is that you can't charge a battery by just connecting it to a receiving
>antenna and draining energy/power from the static noise the antenna
>receives from the surroundings.

Actually, one of the abuses in the early days of radio was by folks who
set up "transformers" to sap power from the transmitters. So yes, you can
charge a battery from an antenna, and whether you want to think of loud
music as noise is left to you as an exercise.

Almost any good book on stochastic processes will give you good, rigorous
definitions. You will have to take the trouble to understand them and
stop being confused by those around you who "simplify" things.

Finding bad drivers in old VWs does not mean that there are no good
drivers in Porsches. But you have to be able to tell the difference.

Carlos Moreno
10-29-2003, 03:15 PM
Dilip V. Sarwate wrote:

> Would you accept the following?
>
> Consider an ideal narrowbandband filter with center frequency w,
> passband gain 1 and bandwidth dw whose input happens to be the
> random process in question. Then, the value of the PSD of the
> input process at frequency w is the limit as dw approaches to
> zero of the ratio of the average power of the filter output signal
> to the filter bandwidth dw.

Well, there is a tiny detail that is not clear. When you say
"average power", do you mean the average over time of the output
power in one realization? Or do you mean the average over all
possible realizations of the "mean power"? (where mean power
is the average over time of the power)

(I think this distinction has to do with ergodicity, but I'd
prefer to avoid that issue, unless it is necessary)

Either way, there is something that I can not understand: how
can the PSD represent density of *power*? (I know the name
includes the terms "power" and "density" :-))

The spectrum of a deterministic signal represents the density
of *energy* (modulo a square root somewhere in there) of each
particular frequency component (i.e., the total energy, from
t = -infinity to +infinity). I figure the PSD should be the
probabilistic equivalent of the spectrum, right? (something
like the average of the spectra of all possible realizations
of the random signal).

If the PSD represents density of power, one contradiction that
I find is: suppose that you define a signal as the following:
the value of x(t) at each time t is an independent random
variable uniformly distributed in the interval (-1,1) (when I
say independent, I mean independent of x at *any other time*).

The PSD of this signal is necessarily a constant value for all
frequencies.

If the PSD represents *power* density, then the above signal
would be found to be infinity (which is not -- the average
power of it is 1/3).

Can you help sort this out?

> The PSD is a function S(w) such that the power of the output
> signal from a filter (with transfer function H(w)) whose input is
> the process is given by the integral of S(w)|H(w)|^2 over the real
> line. If H(w) is an ideal bandpass filter with unit passband gain,
> then the power is the integral of S(w) over the passband.

Yes, but this does not define what PSD represents -- this simply
states a characteristic of it. Of course the output power is
given by S(w) |H(w)|^2, because the output *energy* follows that
relationship (what I'm saying is that this property could hold
for more than one definition of what PSD is)

Cheers,

Carlos
--

Carlos Moreno
10-29-2003, 04:59 PM
Carlos Moreno wrote:

> If the PSD represents density of power, one contradiction that
> I find is: suppose that you define a signal as the following:
> the value of x(t) at each time t is an independent random
> variable uniformly distributed in the interval (-1,1) (when I
> say independent, I mean independent of x at *any other time*).
>
> The PSD of this signal is necessarily a constant value for all
> frequencies.
>
> If the PSD represents *power* density, then the above signal
> would be found to be infinity (which is not -- the average
> power of it is 1/3).

Oops. Seems like my brain was too slow compared to my fingers :-)

The last paragraph above should say: "then the power of the
above signal would be found to be infinity"

Carlos
--

Rune Allnor
10-29-2003, 06:27 PM
[email protected] (Gordon Sande) wrote in message news:<[email protected]>...
> >Could you provide examples? Admittedly, a solar panel or a hydroelectric
> >generator are examples of receivers that actually generate
> >energy. I was thinking of radio receivers, where the measured signal
> >is used to modulate a flow of energy internal to the receiver, energy
> >that was provided by the receiver's power supply.
>
> >OK, you got me wondering if I confuse "power" and "energy"... my point
> >is that you can't charge a battery by just connecting it to a receiving
> >antenna and draining energy/power from the static noise the antenna
> >receives from the surroundings.
>
> Actually, one of the abuses in the early days of radio was by folks who
> set up "transformers" to sap power from the transmitters. So yes, you can
> charge a battery from an antenna, and whether you want to think of loud
> music as noise is left to you as an exercise.

So what you're saying is that the transmitted EM wave would transmit
energy as in a long-distance transformer? The efficiency would be
ludicrously small but non-zero... It makes sense... I'll have to check
my sources(!), the first signal analysis book I ever used in the first
class I ever took on the subject. I thought I understood the difference
but apparently I didn't.

> Almost any good book on stochastic processes will give you good, rigorous
> definitions. You will have to take the trouble to understand them and
> stop being confused by those around you who "simplify" things.
>
> Finding bad drivers in old VWs does not mean that there are no good
> drivers in Porsches. But you have to be able to tell the difference.

Getting challenged at comp.dsp is good training in that respect... ;)

Rune

Rune Allnor
10-29-2003, 06:30 PM
[email protected] (Rune Allnor) wrote in message news:<[email protected]>...
> If the
> integral goes between plus and minus infinity and you want to integrate
> a sine or cosine, you have to modify the integral to
>
> lim (1/T) integral_{-inf}^{inf} |x(t)|^2 dt < infinity [5]
> T -> infinity
>
> or the maths doesn't work. That's what is known as a "Power Signal".

I think the correct Fourier integral for power signals should be

lim (1/T) integral_{-T/2}^{T/2} |x(t)|^2 dt < infinity [5]
T -> infinity

(the integration limits are changed).
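A quick numerical check of this convention (a sketch; plain Python, with arbitrary durations and step counts of my own choosing): under the (1/T) integral over [-T/2, T/2], a unit sinusoid has finite average power 1/2, even though its plain energy integral diverges as T grows.

```python
import math

def avg_power(x, T, steps=100_000):
    # (1/T) * integral over [-T/2, T/2] of |x(t)|^2 dt, midpoint rule
    dt = T / steps
    return sum(x(-T / 2 + (i + 0.5) * dt) ** 2
               for i in range(steps)) * dt / T

powers = [avg_power(lambda t: math.sin(2 * math.pi * t), T)
          for T in (10.0, 100.0, 1000.0)]
print(powers)   # each ~ 0.5 = A^2/2, independent of T
```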

Rune

Rune Allnor
10-29-2003, 08:57 PM
Carlos Moreno <[email protected]> wrote in message news:<[email protected]>...
> If the PSD represents density of power, one contradiction that
> I find is: suppose that you define a signal as the following:
> the value of x(t) at each time t is an independent random
> variable uniformly distributed in the interval (-1,1) (when I
> say independent, I mean independent of x at *any other time*).
>
> The PSD of this signal is necessarily a constant value for all
> frequencies.

No, it's not. The infinite sequence comprising only unit elements,

....., 1, 1, 1, 1, 1, ......

is a valid (though highly improbable) realization of this process.
What does its spectrum look like?

Rune
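[Rune's point is easy to see numerically (an illustrative Python sketch, not part of the original post; the 8-point length is arbitrary): the all-ones realization has a spectrum that is anything but flat.]

```python
import cmath

def dft(x):
    # Brute-force N-point discrete Fourier transform
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

# The all-ones sequence: a legal, if wildly atypical, draw from the
# i.i.d. uniform(-1, 1) process.
X = dft([1.0] * 8)

# All of the energy sits in bin 0 (DC); every other bin vanishes.
print([round(abs(v), 6) for v in X])
```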

Tom Loredo
10-29-2003, 09:14 PM
Carlos Moreno wrote:
> They do not give a definition of what the
> PSD is, but just spit a bunch of blah-blah about what I
> would call "the consequences from an intuitive point of
> view of what PSD is" :-\ You see how one can get pretty
> frustrated :-(

Yes. Look at Priestley; it focuses on the statistical properties
and is very rigorous and technical. It's one of the top few
references on statistical properties of periodograms and
related statistics.

Part of the issue is context. For example, if you believe your
data contains an unknown single sinusoid and Gaussian noise, then
there is a simple interpretation of the PSD as the log of the
(marginal) posterior probability density for the frequency of
the sinusoid (this is from the Bayesian viewpoint; there is an
analogous frequentist result in terms of least squares estimation
of the frequency). If instead you believe the signal has a smooth
spectrum, things get more complicated. For example, Whittle showed
that, under certain conditions, you can use the PSD of the data to
easily construct an approximate likelihood function for the PSD of the
underlying signal (look for "Whittle likelihood" to learn more about
this). So just from these two examples perhaps you
can see that there is likely to be more than one "correct" answer
to your question unless you get more specific about the actual
problem you are trying to solve. Priestley discusses much of this
in great detail (though not any of the Bayesian stuff; for that
look at bayes.wustl.edu for Jaynes's article on Bayesian spectrum
and chirp analysis, and at Bretthorst's book).

-Tom

--

To respond by email, replace "somewhere" with "astro" in the
return address.

Carlos Moreno
10-29-2003, 10:59 PM
Rune Allnor wrote:

>>If the PSD represents density of power, one contradiction that
>>I find is: suppose that you define a signal as the following:
>>the value of x(t) at each time t is an independent random
>>variable uniformly distributed in the interval (-1,1) (when I
>>say independent, I mean independent of x at *any other time*).
>>
>>The PSD of this signal is necessarily a constant value for all
>>frequencies.
>
>
> No, it's not.

Sorry, yes it is!

> The infinite sequence comprising only unit elements,
>
> ....., 1, 1, 1, 1, 1,......
>
> is a valid (though almost improbable) realization of this process.

Except that I was talking about the PSD of the random signal that
I defined. You are talking about the spectrum of one particular
realization of that random signal -- those are two entirely different
things.

Carlos
--

Tom Loredo
10-30-2003, 12:13 AM
Carlos, I think you are abusing terminology and thus missing Rune's point.

> > If the PSD represents density of power, one contradiction that
> > I find is: suppose that you define a signal as the following:
> > the value of x(t) at each time t is an independent random
> > variable uniformly distributed in the interval (-1,1) (when I
> > say independent, I mean independent of x at *any other time*).

This does not define a signal; it defines a *distribution* of signals.

Carlos Moreno wrote:
>
> Except that I was talking about the PSD of the random signal that
> I defined. You are talking about the spectrum of one particular
> realization of that random signal -- those are two entirely different
> things.

Indeed they are---and only a particular realization of the process
you defined has a PSD (and Rune did correctly identify one possible
realization with a nonconstant PSD). The process as a whole produces
a distribution of PSD functions.

-Tom


Carlos Moreno
10-30-2003, 02:43 AM
Tom Loredo wrote:
> Carlos, I think you are abusing terminology and thus missing Rune's point.
>
>>>If the PSD represents density of power, one contradiction that
>>>I find is: suppose that you define a signal as the following:
>>>the value of x(t) at each time t is an independent random
>>>variable uniformly distributed in the interval (-1,1) (when I
>>>say independent, I mean independent of x at *any other time*).
>
> This does not define a signal; it defines a *distribution* of signals.

Actually, I disagree. Although I see that I wasn't rigorous
enough with my phrasing.

The above (my previous post) defines a *random signal*, or a
*random process* (maybe the latter is a more appropriate
term). But I got used to talking about "random signals",
understanding that when we say "the (random) signal" we
are talking about an experiment whose outcomes are the
realizations of that random signal.

The PSD that I'm talking about is the PSD of random signals,
and not the Spectrum of a particular realization of a
random signal. That's why I contested Rune's point; the
spectrum of any specific realization of the (random) signal
I defined is not relevant.

White noise has a PSD that is constant for all values of
frequency (from -infinity to infinity).

The random signal (or random process) that I described is
white noise; it has a PSD that is constant for all values
of frequency. Yet its average power is not infinite (as
we would conclude if the PSD represented average density
of power).

So, I'm hoping that I'm clarifying the notation and what
I meant.

What troubles me is that there are a couple of posts in
this thread that claim that PSD is the average (over all
possible realizations of the random signal) of *power*
density, and no-one has contested that notion -- except
me, claiming that it represents density of *energy*.
The funny detail is that no-one has explicitly contested
my claim either!! Are you guys under the impression
that I'm just trolling? Or that maybe I'm cheating in
some homework. I can truthfully tell you that neither
of these options is true. As I said in my first post,
something that our teacher said in class triggered a
disagreement that I think goes back to different
understandings of what PSD is: they claim that white
noise has infinite variance, and I disagree (their
argument is that the power is obtained by integrating
the PSD with respect to frequency, and since the PSD
of white noise is constant, then the integral gives
infinite power).

Cheers,

Carlos
--

Bernhard Holzmayer
10-30-2003, 06:59 AM
Rune Allnor wrote:

> ...
> Could you provide examples? Admittedly, a solar panel or a
> hydroelectric generator are examples on types of recievers that
> actually generate energy. I was thinking of radio recievers, where
> the measured signal is used to modulate a flow of energy internal
> to the reciever, energy that was provided by the reciever's power
> supply.

In Berlin (was it in the early '30s?), people took lamp bulbs and
connected two wires to them; one of the wires was grounded.
By raising the other wire into the air, they could switch on the
lamp.

Surely, this was only possible because of the short distance to the
transmitting station.

This became common usage in an increasing number of garden houses,
garages, ... until eventually a law was passed which forbade the
"abuse" of radio energy.

Even today, the "Deutsche Welle" short-wave broadcasting station
causes lamps (fluorescent tubes) to shine without being switched on,
as soon as the antenna is directed toward Africa.
This happens in a village some 3 km south of the station.
(I've been told by somebody who lived there.)

Bernhard

Rune Allnor
10-30-2003, 09:41 AM
Carlos Moreno <[email protected]> wrote in message news:<[email protected]>...
> Rune Allnor wrote:
>
> >>If the PSD represents density of power, one contradiction that
> >>I find is: suppose that you define a signal as the following:
> >>the value of x(t) at each time t is an independent random
> >>variable uniformly distributed in the interval (-1,1) (when I
> >>say independent, I mean independent of x at *any other time*).
> >>
> >>The PSD of this signal is necessarily a constant value for all
> >>frequencies.
> >
> >
> > No, it's not.
>
> Sorry, yes it is!
>
> > The infinite sequence comprising only unit elements,
> >
> > ....., 1, 1, 1, 1, 1,......
> >
> > is a valid (though almost improbable) realization of this process.
>
> Except that I was talking about the PSD of the random signal that
> I defined. You are talking about the spectrum of one particular
> realization of that random signal -- those are two entirely different
> things.

Could you please define what a "realization of a random signal" is?
I can't make any sense of that. If you said "realization of a random
process" it would make sense. There is a vast difference between
"signal" and "process". The PSD is a property of the process, the
squared magnitude of the frequency spectrum of the signal is an
estimate of that spectrum. See below.

I did ask in a different post that you define more clearly what you mean
by "random signal". I can't find a post of yours where you do that.
I will make an attempt to clarify the basis of my argumentation:

If you study one sequence of numbers or one signal (one particular
realization), then the standard techniques like DFTs apply, and you
should use them.
However, once we start talking about "random signals" and "random
processes" it is implied that we regard the observed signal (the
"realization") as one glimpse into a world where some governing
rule, of which we know little (the "process"), influences the
behaviour of the observed data. The objective of the study is to
use the observed data to infer something about these governing rules,
not to describe the data themselves in any particular detail.

The statistician admits (by using the very term "random") that he can
not (and will not) attempt a complete, deterministic description of the
system. He will, however, make the best use he can of what data he has,
in such a way that the inherent uncertainty of the analysis is clear,
and also such that he can use any new data that may become available
to him.

So we *assume* some property of the data, i.e. that they are a
"realization" of some "random process". We can then impose some more
assumptions on this model ("stationary", "ergodic", "zero mean",
"Gaussian", ...) and try to characterize it or estimate some parameters.
You may want to look into the difference between "true" parameters and
"estimators" for those parameters. As in the example cited above, the
estimated PSD of the (constant) data series of one realization is as far
from the "true" PSD of the random process as possible, but the statistical
arguments still hold.

Of course, attempting to estimate the true PSD on the basis of that
realization is doomed to fail, but that's just the inherent peril of
statistical signal processing or statistics in general.

I'll just remind you about a point I made in an earlier post about
interpreting the maths of the statistics in a somewhat different manner
than "only maths".

Rune

Rune Allnor
10-30-2003, 12:50 PM
Carlos Moreno <[email protected]> wrote in message news:<[email protected]>...
> Tom Loredo wrote:
> > Carlos, I think you are abusing terminology and thus missing Rune's point.
> >
> >>>If the PSD represents density of power, one contradiction that
> >>>I find is: suppose that you define a signal as the following:
> >>>the value of x(t) at each time t is an independent random
> >>>variable uniformly distributed in the interval (-1,1) (when I
> >>>say independent, I mean independent of x at *any other time*).
> >
> > This does not define a signal; it defines a *distribution* of signals.
>
> Actually, I disagree. Although I see that I wasn't rigorous
> enough with my phrasing.
>
> The above (my previous post) defines a *random signal*, or a
> *random process* (maybe the latter is a more appropriate
> term). But I got used to talk about "random signals",
> understanding that when we say "the (random) signal" we
> are talking about an experiment with outcomes being the
> realizations of that random signal.

Then it may be useful to check out the terminology. The "random
process" usually denotes this distribution of "realizations",
and the term "signal" usually refers to one particular outcome
or observation in one particular realization or experiment.

> The PSD that I'm talking about is the PSD of random signals,
> and not the Spectrum of a particular realization of a
> random signal. That's why I contested Rune's point; the
> spectrum of any specific realization of the (random) signal
> I defined is not relevant.

The spectral densities of realizations are most certainly relevant,
they are what you observe, they are what you have to work with.
Cf. the connection between the probability density function of
your dice and the outcome of actually rolling them.

> White noise has a PSD that is constant for all values of
> frequency (from -infinity to infinity).

The generating process does. One particular realization need not have.

> The random signal (or random process) that I described is
> white noise; it has a PSD that is constant for all values
> of frequency. Yet its average power is not infinite (as
> we would conclude if the PSD represented average density
> of power).
>
> So, I'm hoping that I'm clarifying the notation and what
> I meant.

You have clarified your notation, but not necessarily what you meant.
Humpty Dumpty may have been a charming guy, but his linguistics are
not an example to follow.

> What troubles me is that there are a couple of posts in
> this thread that claim that PSD is the average (over all
> possible realizations of the random signal) of *power*
> density, and no-one has contested that notion -- except
> me, claiming that it represents density of *energy*.
> The funny detail is that no-one has explicitly contested
> my claim either!!

I *think* I wrote a post a couple of days ago, where I commented
on the difference between Energy Signals and Power Signals.
There need not be any contradictions between energy and power,
it's a matter of using the correct tool in each case.

Coming to think of it, I actually asked in that very same post if
you could make an effort to check whether the signals you are concerned
about were best classified as Energy Signals or Power Signals...

> Are you guys under the impression
> that I'm just trolling?

The thought has occurred to me...

> Or that maybe I'm cheating in
> some homework. I can truthfully tell you that neither
> of these options is true. As I said in my first post,
> something that our teacher said in class triggered a
> disagreement that I think goes back to different
> understandings of what PSD is: they claim that white
> noise has infinite variance, and I disagree (their
> argument is that the power is obtained by integrating
> the PSD with respect to frequency, and since the PSD
> of white noise is constant, then the integral gives
> infinite power).

I don't know what your teacher said, so I can't comment on any
remarks you heard in class. But it seems more and more clear to me
that this is an issue of getting the different definitions right,
understanding those differences, and to know that different analysis
tools exist for different types of signals. You might find it useful
to go to your teacher and ask him or her about the details.

Rune

Jerry Avins
10-30-2003, 03:52 PM
Carlos Moreno wrote:

...

> White noise has a PSD that is constant for all values of
> frequency (from -infinity to infinity).
>
> The random signal (or random process) that I described is
> white noise; it has a PSD that is constant for all values
> of frequency. Yet its average power is not infinite (as
> we would conclude if the PSD represented average density
> of power).

...

The average power that that model implies is indeed infinite. The
argument is a demonstration that ideal white noise -- constant energy
per unit bandwidth for all frequencies -- can't exist. Like the rigid
bodies of mechanics, white noise is no more than a useful abstraction.

Just as the notion of a rigid body provides an easy attack on statically
determinate systems, the notion of pure white noise is usefully applied
to bandlimited systems. Properties of the noise outside the system's
bandwidth are irrelevant. We deal regularly with abstractions that don't
represent something we can build or touch. Impulses and other
singularities are examples.

Jerry
--
Engineering is the art of making what you want from things you can get.

Carlos Moreno
10-31-2003, 03:17 AM
Jerry Avins wrote:

>> White noise has a PSD that is constant for all values of
>> frequency (from -infinity to infinity).
>>
>> The random signal (or random process) that I described is
>> white noise; it has a PSD that is constant for all values
>> of frequency. Yet its average power is not infinite (as
>> we would conclude if the PSD represented average density
>> of power).
>
> The average power that that model implies is indeed infinite.

But it can not be! The power of something that has values
bounded in absolute value can not be infinite!

> The argument is a demonstration that ideal white noise -- constant energy
> per unit bandwidth for all frequencies -- can't exist.

If it can exist in nature or not, is irrelevant (I mean, any
argument could be taken to philosophical grounds, claiming
that numbers do not exist in nature and that they're only
a model, an abstraction that only exist in our minds....
With that sort of argument one could dismiss basically
*any* mathematical result, valid or not).

My point is that, the white noise as I defined it, as a
mathematical abstraction, has properties that can be
rigorously defined according to the laws of mathematics.

The way I see it, the power of the white noise that I
described *can not* be infinite. In fact, calculating
the variance is a trivial exercise: E{x^2} = integral
from -1 to 1 of (1/2) x^2 dx. That gives you the average
power (assuming a normalized 1-ohm resistor, etc. etc.),
which is definitely not infinite.
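[Carlos's integral is easy to verify (an illustrative Python sketch, not part of the original post); it compares the analytic value E{x^2} = 1/3 with a Monte Carlo estimate:]

```python
import random

# Analytic value of Carlos's integral:
#   E{x^2} = integral_{-1}^{1} (1/2) x^2 dx = (1/2) * (2/3) = 1/3
analytic = 1.0 / 3.0

# Monte Carlo estimate of the average power of x ~ uniform(-1, 1)
random.seed(0)
n = 200_000
sim = sum(random.uniform(-1.0, 1.0) ** 2 for _ in range(n)) / n

print(analytic, sim)
```

[Both numbers agree to a few parts in a thousand: the average power is 1/3 watt into the normalized 1-ohm resistor, and certainly not infinite.]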

Yes, white noise is a mathematical abstraction; the
disagreement is on the mathematical properties of this
abstraction.

You mention also the delta. As an example, suppose someone
tells you that the value of the delta at t = 0 is 1.
You will undoubtedly tell them that they're wrong!! It
doesn't matter that a delta can not exist in practice;
the mathematical properties that define a delta are
specific and rigorous.

Same thing (IMO) about white noise. We talk about
*Gaussian* white noise with zero mean and variance
sigma^2. Well, if the variance is sigma^2, then the
average power is not infinite!

Carlos
--

Randy Yates
10-31-2003, 06:34 AM
Carlos Moreno wrote:
> [...]
> So, can someone help me with finding a rigorous definition
> of what the value at a given frequency of the PSD of a
> random signal means?

Carlos,

I think what you mean to ask is how we derive the units
of the PSD. If so, that's not too hard.

For this discussion, let's assume that the units of our
signal are volts, which I will denote by "[V]" ([anything]
denotes the units of "anything"). Let's also say the
voltage signal is identically distributed, and that the
pdf of the voltage (the ensemble pdf) at any time t
is simply f(x). Let's also denote the voltage signal
X(t).

First of all we need to convince ourselves that the pdf
of a random variable representing voltage has units of
[1/V]. Reason like this: The integral of the pdf gives
a probability, which is unitless. Since the integral
of a pdf is of the form \int f(x) dx, and the "dx" has
units of [V], then f(x) must be [1/V] to make the result
unitless.

OK, now what are the units of the autocorrelation function
Rxx(tau) of X(t)? Every lag gives the same units, so for
simplicity let's examine the units of Rxx(0) = E[X^2]. This
is

\int x^2 f(x) dx,

which has units of [V^2] * [1/V] * [V], or [V^2]. Of course you
know the units [V^2] are directly proportional to power.

Finally, what are the units of the PSD?

\int Rxx(tau) e^{j*omega*tau} dtau

has units of [V^2] * [s] ("s" = seconds). And that is
directly proportional to watts/Hz (since [Hz] = [1/s]).
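[The watts/Hz reading implies that summing the PSD back over frequency recovers the total power. In discrete time this is just Parseval's theorem; a Python sketch (illustrative, brute-force DFT, not part of the original post) checks that the mean of the periodogram equals the signal's mean-square value:]

```python
import cmath
import random

def periodogram(x):
    # |DFT|^2 / N -- a discrete PSD estimate (brute force, O(N^2))
    N = len(x)
    X = [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
         for k in range(N)]
    return [abs(v) ** 2 / N for v in X]

random.seed(1)
x = [random.uniform(-1.0, 1.0) for _ in range(64)]

power_time = sum(v * v for v in x) / len(x)   # mean-square value, in [V^2]
power_freq = sum(periodogram(x)) / len(x)     # PSD estimate summed over frequency

print(power_time, power_freq)   # identical up to floating-point rounding
```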
--
% Randy Yates % "...the answer lies within your soul
%% Fuquay-Varina, NC % 'cause no one knows which side
%%% 919-577-9882 % the coin will fall."
%%%% <[email protected]> % 'Big Wheels', *Out of the Blue*, ELO
http://home.earthlink.net/~yatescr

Jerry Avins
10-31-2003, 06:57 AM
Carlos Moreno wrote:

...

>
> My point is that, the white noise as I defined it, as a
> mathematical abstraction, has properties that can be
> rigorously defined according to the laws of mathematics.
>
> The way I see it, the power of the white noise that I
> described *can not* be infinite. In fact, calculating
> the variance is a trivial exercise: E{x^2} = integral
> from -1 to 1 of (1/2) x^2 dx. That gives you the average
> power (assuming a normalized 1-ohm resistor, etc. etc.),
> which is definitely not infinite.
>
> Yes, white noise is a mathematical abstraction; the
> disagreement is on the mathematical properties of this
> abstraction.

Why does it surprise you that the calculated properties of impossible
abstractions are themselves not possible? White noise as you defined it
is characterized in volts per root Hz (or watts per Hz). It is a useful
abstraction where the actual bandwidth is limited, even though it
implies infinite power for infinite bandwidth. That doesn't bother me
because neither infinity can be realized.

> You mention also the delta. As an example, if someone
> tells you that the value of the delta at t = 0 is 1??
> You will undoubtedly tell them that they're wrong!!

You would do well to doubt. I might, however, ask: 1 what?

> It doesn't matter that a delta can not exist in practice;
> the mathematical properties that define a delta are
> specific and rigurous.

Well, it behooves us to be precise. What do you mean by "value" above?
Certainly not width nor height. Some would say "area", others "strength".

> Same thing (IMO) about white noise. We talk about
> *Gaussian* white noise with zero mean and variance
> sigma^2. Well, if the variance is sigma^2, then the
> average power is not infinite!

Are you sure that sigma^2 represents power here? There is no limit to
the magnitude of true Gaussian noise, even though the frequency of a
particular amplitude's being exceeded decreases rapidly with amplitude.
If there is just one infinite-amplitude event in a year, the average
power is rather large, to say the least.

It doesn't do to say that impossible conditions are OK because they are
merely mathematical constructs, and then to reject the conclusions
mathematically constructed from them.

Jerry
--
Engineering is the art of making what you want from things you can get.

Dilip V. Sarwate
10-31-2003, 03:02 PM
Carlos Moreno <[email protected]> wrote in message news:<[email protected]>...

> Same thing (IMO) about white noise. We talk about
> *Gaussian* white noise with zero mean and variance
> sigma^2. Well, if the variance is sigma^2, then the
> average power is not infinite!


Carlos:

**You** may talk about Gaussian white noise with zero
mean and variance sigma^2, but in the context of
continuous-time signals, this does not make sense.
In most systems, the thermal noise at the output of
a filter can be modeled as a Gaussian random process
whose PSD (that word you hate!) is proportional to
|H(w)|^2. We can derive this result using the standard
general theory of second-order (i.e. finite power)
random processes by **pretending** that the filter input
is a Gaussian random process with PSD = constant for all w.
This hypothetical process is called Gaussian white noise,
and generalized Fourier theory (which allows the notion
of Fourier transforms of **power** signals) allows us to
pretend that the input process has autocorrelation that
is a delta function. But, it does not make sense to talk
about the variance of white noise: it is a meaningless
concept. For continuous-time signals, the only meaning to
be ascribed to white noise -- Gaussian or not -- is
that it is a hypothetical process defined by the property
that when it is passed through a filter with transfer function
H(w), it produces a random process with PSD that is proportional
to |H(w)|^2. We cannot talk of its properties except in terms
of what we can observe, and any observation necessarily
implies some filtering. For example, we cannot talk of the
properties of the random *variable* X(5), say, and claim that
it has zero mean or is Gaussian with some specified variance
(possibly infinite), because we cannot sample the process
at t = 5 instantaneously, even though we often pretend in DSP
circles that we can and do get an instantaneous sample. This
pretence works because the fact that an actual sampler switch
will stay closed for a small nonzero period of time does not matter
very much in typical DSP applications: the signal being sampled has
typically been filtered anyway. But, to apply this notion of
instantaneous sampling to a white noise process just leads to
the same sort of dilemmas that you are stuck with.

For **discrete-time** random processes, the concept
of Gaussian white noise is a sequence of independent
zero-mean Gaussian random variables with fixed finite
variance sigma^2. The power here is indeed finite (and
equal to sigma^2), but bear in mind if we think of this
discrete-time process as having been obtained from a
continuous-time process via sampling, then the continuous-time
white noise has, implicitly or explicitly, been filtered
before sampling and thus has finite variance.
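[Dilip's discrete-time picture is easy to simulate (an illustrative Python sketch, not part of the original post; the filter taps are made up): i.i.d. Gaussian noise passed through an FIR filter h comes out with power sigma^2 * sum(h^2), which is finite, the discrete analogue of integrating a flat PSD against |H(w)|^2.]

```python
import random

random.seed(2)
sigma = 1.0
h = [0.5, 0.3, 0.2]   # hypothetical FIR filter taps, purely illustrative

# Discrete-time Gaussian white noise with finite variance sigma^2
x = [random.gauss(0.0, sigma) for _ in range(200_000)]

# FIR filtering (keep only the fully-overlapped part of the convolution)
y = [sum(h[j] * x[n - j] for j in range(len(h)))
     for n in range(len(h) - 1, len(x))]

# Output power = sigma^2 * sum(h^2): finite, as the theory predicts
predicted = sigma ** 2 * sum(c * c for c in h)
measured = sum(v * v for v in y) / len(y)

print(predicted, measured)
```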

Now, you can say that you are very mathematical and rigorous
and don't care a whit about practical notions, and have given
an explicit construction of a continuous-time random process
with finite variance such that X(t) and X(t') are independent
if t and t' are two different real numbers. Now, why don't
you figure out what a typical sample function (or realization)
of this process is, whether such a realization can actually be
exhibited (remember that instantaneous changes in voltage will
require that infinite currents instantaneously change the
charges in various capacitors in the circuit!), and whether
second-order random process theory can be applied to this
process that you have described? Since you claim that you are
not trolling or seeking help on homework, maybe you should be asking
your instructor these questions too?

--Dilip Sarwate

Carlos Moreno
10-31-2003, 03:49 PM
Jerry Avins wrote:

> Why does it surprise you that the calculated properties of impossible
> abstractions are themselves not possible?

Because mathematical abstractions follow mathematical rules
and axioms, not the possibility or impossibility of such an
abstraction existing in nature.

The great majority of mathematical constructs are abstractions
that are not possible in nature (I would almost dare to say *all*
mathematical abstractions).

The simplest and most ubiquitous abstraction: the notion of
"continuous" values. It doesn't seem to exist in nature; yet
we do use differential equations and differential calculus to
obtain *real* results (that do approximate the way physical
systems behave).

(and let's not get started with complex numbers!! Just
because complex numbers do not exist in nature, would you
dismiss as invalid all of the Fourier analysis theory or
anything that someone derives based on the properties of
complex numbers?)

What I'm trying to say (which applies to the white noise
discussion) is: if we set up a mathematical abstraction
with certain properties (properties that do not contradict
other more fundamental mathematical axioms), then anything
that we derive from it will have to be consistent with that
and other mathematical axioms!

White noise should not be an exception! So white noise can
not exist in nature? So what? Complex numbers don't either,
and the delta function doesn't either, and yet they don't
lead to contradiction *in mathematical terms* when deriving
properties of them.

See, that's my point: what I'm seeing when thinking of PSD
as density of power is a contradiction *in mathematical
terms*, not a contradiction with physical properties or
terms.

>> Same thing (IMO) about white noise. We talk about
>> *Gaussian* white noise with zero mean and variance
>> sigma^2. Well, if the variance is sigma^2, then the
>> average power is not infinite!
>
> Are you sure that sigma^2 represents power here?

It represents "mean power" or "average" power, of course
I'm sure! At least it has to!

> There is no limit to
> the magnitude of true Gaussian noise

I know. And it doesn't matter: the average is still
finite! The expected value of x^2, where x is a random
variable with gaussian distribution, is the variance of
it.

> If there is just one infinite-amplitude event in a year, the average
> power is rather large, to say the least.

Actually, there is never an event with infinite-amplitude.
At any time, the value of a Gaussian variable is an *actual
number*. There is no such thing as a function, or a random
variable, taking "infinite-value".

> It doesn't do to say that impossible conditions are OK because that are
> merely mathematical constructs

Hmmm, I know that we may be shifting to the "philosophical
grounds" here, but I have to disagree with that. As I said,
virtually all mathematical constructs are indeed impossible
to achieve in nature. Many of them are things that we use
on a daily basis in DSP, and in general in signal analysis
and the like (e.g., the delta, complex numbers, differential
equations and differential calculus in general, integrals
from 0 to infinity, or from -infinity to infinity).

BTW, notice that I'm not saying that impossible conditions
are ok: I'm saying that mathematical constructs representing
impossible conditions may be ok -- as long as they're ok from
the point of view of the mathematical rules that we use to
deal with them.

Applying those mathematical constructs to represent physical
things, that's a different thing, and I do agree with what
you mention about dealing with "band limited" versions of
white noise, etc.

Carlos
--

Piergiorgio Sartor
10-31-2003, 04:05 PM
Carlos Moreno wrote:

[...]

Maybe I'm wrong, but I think the sigma^2 refers
to the Gaussian variable, while the PSD refers
to the Gaussian process, i.e. a sequence of
Gaussian variables.

So the process has infinite "power", while the
variable itself has statistical power = sigma^2.

Does that fit for you?

bye,

--
Piergiorgio Sartor

Jerry Avins
10-31-2003, 06:27 PM
Carlos Moreno wrote:

> Jerry Avins wrote:
>
>> Why does it surprise you that the calculated properties of impossible
>> abstractions are themselves not possible?
>
>
> Because mathematical abstractions follow mathematical rules
> and axioms, not the possibility or impossibility of such
> abstraction to exist in nature.

This is silly. I can prove that if the moon is made of green cheese,
then you are your own grandmother. In a proof, any false premise can
lead to any conclusion, true or false. Just so, a non-realizable
abstraction can lead to a non-realizable property. That's OK. What's not
OK is to claim that it really is realizable, or that a non-realizable
conclusion invalidates the argument. In fact, all it shows is that the
abstraction was extended outside its useful domain. Just as functions
have regions of convergence, abstractions have regions of applicability.

Be happy!

>
...

Jerry
--
Engineering is the art of making what you want from things you can get.

Carlos Moreno
10-31-2003, 08:57 PM
Piergiorgio Sartor wrote:
> Carlos Moreno wrote:
>
> [...]
>
> Maybe I'm wrong, but I think the sigma^2 refers
> to the Gaussian variable, while the PDS refers
> to the Gaussian variable, while the PSD refers
> Gaussian variables.
>
> So the process has infinite "power", while the
> variable itself has statistical power = sigma^2.
>
> Does that fit for you?

I'm not sure... The first impression is to say "no
it doesn't", but I'm not quite sure that I'm not
missing some subtlety.

I know I may be repeating myself, but let me apply your
concepts to the random process that I described
before (x(t) is independent of the value of x at
any time other than t, and has a uniform distribution
in [-1,1]).

Any possible realization of such process must have
an average power less than one. In fact, for any
possible realization of such process, the power at
any given time (as an "instantaneous" measure) must
be less than 1 (since the magnitude of the signal
can not be greater than one).

Also, the variance of the signal at any given time
(i.e., the variance with respect to all possible
realizations of the process) is also less than one.

So, how could a concept that leads to "infinite
power" fit in here? Notice that this example
emphasizes a bit more the impossibility of having
infinite power, but it is indeed equivalent to the
example of Gaussian white noise.

Carlos
--

Carlos Moreno
10-31-2003, 09:17 PM
Dilip V. Sarwate wrote:

> [...]
> --Dilip Sarwate

I guess I can only thank you for a detailed and careful
explanation of the terms causing my confusion... (in
fact, I'll have to re-read it carefully :-))

I was in the process of writing a long and detailed
message in response to yours, but to be honest, *I*
myself read the message and it *really* started to
sound like I'm trolling... :-(

My conclusion is that I will definitely have to figure
out a way to reconcile ideas that in my mind are
contradictory.

At some point during this discussion I thought about
the possibility that this is indeed one of those
"mathematical paradoxes" (something like the notion
that there are exactly as many points in the
interval (0,1) as in the interval (0,infinity),
given that I can define a bijective function
that maps (0,1) onto (0,infinity)).

This paradox is "explained" by the fact that the set
of real numbers is not countable. (I think -- maybe
mathematicians out there will scream at me, telling
me that I have no clue what I'm saying :-))

See, the thing is that I'm not able to see if this
PSD thing with white noise is one of those paradoxes,
and if so, where would the origin of that paradox be.

Oh well, I guess I bothered you guys enough, so I'll
stop. Thanks to all that participated!!

Cheers,

Carlos
--

Jerry Avins
11-01-2003, 12:54 AM
Carlos Moreno wrote:

...

> I know I may be repeating myself, but let me apply your
> concepts to the random process that I described
> before (x(t) is independent of the value of x at
> any time other than t, and has a uniform distribution
> in [-1,1]).

Then it isn't bandlimited, so you can't draw legitimate conclusions from
the samples.

...

Jerry
--
Engineering is the art of making what you want from things you can get.

Piergiorgio Sartor
11-01-2003, 01:56 PM
Carlos Moreno wrote:

> I know I may be repeating myself, but let me apply your
> concepts to the random process that I described
> before (x(t) is independent of the value of x at
> any time other than t, and has a uniform distribution
> in [-1,1]).

OK...

> Any possible realization of such process must have
> an average power less than one. In fact, for any
> possible realization of such process, the power at
> any given time (as an "instantaneous" measure) must
> be less than 1 (since the magnitude of the signal
> can not be greater than one).

I do not get this.
A possible realization is:

...., 0, 0, 1, 0, 0...

which has quite a lot of spectral energy, I would say.

> Also, the variance of the signal at any given time
> (i.e., the variance with respect to all possible
> realizations of the process) is also less than one.

That signal does not have any variance; the
random variable has variance.

> So, how could a concept that leads to "infinite
> power" fit in here? Notice that this example
> emphasizes a bit more the impossibility of having
> infinite power, but it is indeed equivalent to the
> example of Gaussian white noise.

Statistical power, but I do not see the point.

In my view the process you described has infinite
power, why not?

bye,

--

piergiorgio

Carlos Moreno
11-01-2003, 06:28 PM
I know I said I would leave you guys alone and let the
thread drop... But now you're answering my other post,
and it feels kind of rude to leave you talking alone...

Piergiorgio Sartor wrote:
> Carlos Moreno wrote:
>
>> I know I may be repeating myself, but let me apply your
>> concepts to the random process that I described
>> before (x(t) is independent of the value of x at
>> any time other than t, and has a uniform distribution
>> in [-1,1]).
>
>> Any possible realization of such process must have
>> an average power less than one. In fact, for any
>> possible realization of such process, the power at
>> any given time (as an "instantaneous" measure) must
>> be less than 1 (since the magnitude of the signal
>> can not be greater than one).
>
> I do not get this.
> A possible realization is:
>
> ..., 0, 0, 1, 0, 0...

???

How can that be a possible realization? The process
I described is continuous-time. But regardless, I
don't understand what you mean by "a lot of spectral
energy".

I'm talking about signals representing voltage, and
thus, the power I'm referring to is electric power,
which, as an instantaneous measure, at time t, is
equal to x(t)^2 (divided by the value of the
resistor to which it is applied, but let's assume
a normalized 1 ohm resistor)

If we talk about the average power of one particular
realization (let's call it the "time average power",
to avoid confusion with any other parameter), then
that would be given by:

T
/
1 |
TAP = lim --- | x(t)^2 dt
T->oo 2T |
/
-T

That integral is less than or equal to an integral
with the same limits whose integrand is >= x(t)^2
for all t.

So, such a function could be f(t) = 1. If you plug
f(t) = 1 (replacing x(t)^2) into the above formula,
you obtain that the average power (time average
power) is 1. Thus, the average power of the signal
(the particular realization of the process I described)
is less than or equal to one.

If all possible realizations have a time average
power less than or equal to one, then the ensemble
average power (i.e., the average over all possible
realizations of the time average power) must also
be less than or equal to one.

I'm using basic definitions and basic properties of
integrals, and I reach a conclusion that contradicts
the result that white noise has infinite power, as
obtained by treating the constant-valued PSD as a
density of power.
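As a sanity check on the bound just derived, here is a minimal numerical sketch (assuming NumPy, with independent uniform samples standing in for one realization of the continuous-time process):

```python
import numpy as np

# Numerical sketch (assumption: N independent Uniform[-1, 1] samples
# standing in for one realization of the continuous-time process).
# The time-average power (1/2T) * integral of x(t)^2 dt is estimated
# by the sample mean of x^2; for Uniform[-1, 1], E[x^2] = 1/3, which
# is comfortably below the bound of 1 derived above.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=1_000_000)
tap = np.mean(x ** 2)       # time-average power of this realization
print(tap)                  # close to 1/3, and certainly <= 1
```

With a million samples the estimate lands very close to 1/3, in line with the ensemble-average argument.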

I feel so frustrated that I haven't been able to
communicate what's in my mind!! And that is a
fact, because I am seeing a contradiction that
no-one else sees, and no-one has been able to make
me understand why there isn't a contradiction, or
why such contradiction is to be expected (no, I'm
still not buying the "since white noise is impossible
to achieve in nature..." -- again, the way I see it,
the contradiction arises in purely abstract mathematical
terms, using the rules and axioms of mathematics, which
apply to mathematical constructs).

> In my view the process you described has infinite
> power, why not?

Well, the way I see it, because of the above...
Maybe I'm still using the wrong terminology? Or
maybe I'm confusing some terms or properties??

I'm really hoping that someone will be able and
willing to make me understand what's happening!!
(so that I can leave you guys alone AND sleep
peacefully at the same time :-)) -- I'm serious!!
I'm having nightmares about this!!! :-(((

Cheers,

Carlos
--

Randy Yates
11-01-2003, 06:37 PM
Carlos Moreno wrote:
> [...]
> I feel so frustrated that I haven't been able to
> communicate what's in my mind!!

Carlos,

Although I haven't read all of the dozens of
articles in this thread, would the following
describe the "question in your head"? ...:

Let X(t) be a zero-mean, IID, continuous-time process
with variance of sigma^2. Now we know that since this
function is IID, it has a white PSD and therefore its
autocorrelation function at lag 0, R_XX(0), should be
b*delta(tau), where delta(tau) is the usual Dirac delta
function and b is some constant.

However, we also know that the autocorrelation function R_XX(tau)
is defined to be E[X(t)*X(t+tau)], and therefore that R_XX(0)
is E[X^2] (where we have dropped the dependence on t since the
process is IID). Therefore in this sense R_XX(0) = E[X^2] = sigma^2,
which is not infinite.
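The two computations can be checked side by side for a sampled stand-in of the process; a minimal sketch (assuming NumPy, sigma^2 = 1, and a unit sample rate):

```python
import numpy as np

# Numerical sketch (assumptions: a discrete-time stand-in for the IID
# process, sigma^2 = 1, unit sample rate).  Estimate R_XX(0) = E[X^2]
# directly, and also integrate a periodogram PSD estimate over the
# full band; for a *sampled* process both give sigma^2.  The
# divergence only appears in the continuous-time idealization, where
# the flat PSD extends over an infinite bandwidth.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, size=2 ** 16)    # sigma^2 = 1

r0 = np.mean(x ** 2)                      # R_XX(0) = E[X^2]

X = np.fft.fft(x)
periodogram = np.abs(X) ** 2 / len(x)     # PSD estimate (unit fs)
area = np.sum(periodogram) / len(x)       # integral over one full band

print(r0, area)                           # both close to 1
```

By Parseval's theorem the two estimates agree to machine precision; the sampled model never produces an infinite R_XX(0).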

Is this the contradiction you speak of?

PS: I'm a little put off that you haven't responded to my post on
the units of PSD. Did you see it?
--
% Randy Yates % "...the answer lies within your soul
%% Fuquay-Varina, NC % 'cause no one knows which side
%%% 919-577-9882 % the coin will fall."
%%%% <[email protected]> % 'Big Wheels', *Out of the Blue*, ELO
http://home.earthlink.net/~yatescr

Carlos Moreno
11-01-2003, 08:18 PM
Hi Randy,

Randy Yates wrote:
>
> Carlos,
>
> Although I haven't read all of the dozens of
> articles in this thread, would the following
> describe the "question in your head"? ...:
>
> Let X(t) be a zero-mean, IID, continuous-time process
> with variance of sigma^2. Now we know that since this
> function is IID, it has a white PSD and therefore its
> autocorrelation function at lag 0, R_XX(0), should be
> b*delta(tau), where delta(tau) is the usual Dirac delta
> function and b is some constant.
>
> However, we also know that the autocorrelation function R_XX(tau)
> is defined to be E[X(t)*X(t+tau)], and therefore that R_XX(0)
> is E[X^2] (where we have dropped the dependence on t since the
> process is IID). Therefore in this sense R_XX(0) = E[X^2] = sigma^2,
> which is not infinite.
>
> Is this the contradiction you speak of?

Maybe. As a matter of fact, this may be at the heart of
what I perceive as a contradiction. What surprises me is
that you show the two "contradictory" derivations, but
don't mention why the contradiction, or if it is indeed
a contradiction (what I mean is that it might be an
*apparent* contradiction).

In fact, now that you "equate" the results of those two
different approaches, I guess one problem (maybe part of
the same problem?) is that I'm not sure I understand the
justification of defining the PSD and Rxx as a Fourier
transform pair. I mean, I remember from my Signals and
Systems course, understanding the intuitive interpretation
of it: the more high-frequency contents, the less correlated
close samples would be -- in particular, for white noise,
with strong frequency contents going to infinity, samples
arbitrarily close are still uncorrelated. That made (and
still makes) perfect sense to me.

Fine. So, why the contradiction? (or the "apparent"
contradiction of results?).

In fact, how would Rxx(0) be related to the average
electrical power of the signal? (the average "watts"
that a voltage signal would dissipate on a 1-ohm resistor).
I seem to see clearly how the E{x^2} is related to the
average power of a voltage signal, but not so sure that
I understand the link with Rxx.

> PS: I'm a little put off that you haven't responded to my post on
> the units of PSD. Did you see it?

Yes!! And I apologize for letting it go unnoticed!!
The thing is that I read it together with the other
messages that finally made me decide that I was starting
to sound like a troll, and thus decided to withdraw...
My sincere apologies!

Ironically, that was *the* message that best seemed to
address my initial point (I mean, it was the one message
that did not give me that feeling of "this guy didn't
understand what I was asking" -- not that I'm saying
that the others didn't understand; but almost all the
other messages (the initial ones, at least) gave me, to
some extent, the impression that they were going off
on a tangent).

When you showed me that the units of the PSD are indeed
watts per hertz, that seemed like a very precise attempt
at convincing me of what the PSD is... (still, after
reading that message, I could not seem to reconcile
ideas that seemed contradictory, and then again, that
contributed to my decision that "maybe I should go
re-study these concepts before I continue to bother
these guys" :-))

So, I'm hoping that you won't be too mad at me and will
be willing to elaborate a bit on this contradiction of
Rxx(0) being infinite while E{x^2} being finite.

Thanks!!

Carlos
--

Randy Yates
11-02-2003, 03:11 AM
Hi Carlos,

Carlos Moreno wrote:
>
> Hi Randy,
>
> Randy Yates wrote:
>
>>
>> Carlos,
>>
>> Although I haven't read all of the dozens of
>> articles in this thread, would the following
>> describe the "question in your head"? ...:
>>
>> Let X(t) be a zero-mean, IID, continuous-time process
>> with variance of sigma^2. Now we know that since this
>> function is IID, it has a white PSD and therefore its
>> autocorrelation function at lag 0, R_XX(0), should be
>> b*delta(tau), where delta(tau) is the usual Dirac delta
>> function and b is some constant.
>>
>> However, we also know that the autocorrelation function R_XX(tau)
>> is defined to be E[X(t)*X(t+tau)], and therefore that R_XX(0)
>> is E[X^2] (where we have dropped the dependence on t since the
>> process is IID). Therefore in this sense R_XX(0) = E[X^2] = sigma^2,
>> which is not infinite.
>>
>> Is this the contradiction you speak of?
>
>
> Maybe. As a matter of fact, this may be at the heart of
> what I perceive as a contradiction. What surprises me is
> that you show the two "contradictory" derivations, but
> don't mention why the contradiction, or if it is indeed
> a contradiction (what I mean is that it might be an
> *apparent* contradiction).

It is indeed a contradiction to me, and I cannot resolve
the contradiction. It is one I have had for years.

> In fact, now that you "equate" the results of those two
> different approaches, I guess one problem (maybe part of
> the same problem?) is that I'm not sure I understand the
> justification of defining the PSD and Rxx as a Fourier
> transform pair.

The justification of the definition? Are you asking how
Wiener and Khinchine proved the theorem that the PSD
is the Fourier transform of the autocorrelation function?
That I cannot say - it is an interesting question, though.

> I mean, I remember from my Signals and
> Systems course, understanding the intuitive interpretation
> of it: the more high-frequency contents, the less correlated
> close samples would be -- in particular, for white noise,
> with strong frequency contents going to infinity, samples
> arbitrarily close are still uncorrelated. That made (and
> still makes) perfect sense to me.
>
> Fine. So, why the contradiction? (or the "apparent"
> contradiction of results?).

Because Rxx(tau) != F^(-1)[Sxx(w)], when they should be equal.

> In fact, how would Rxx(0) be related to the average
> electrical power of the signal? (the average "watts"
> that a voltage signal would dissipate on a 1-ohm resistor).
> I seem to see clearly how the E{x^2} is related to the
> average power of a voltage signal, but not so sure that
> I understand the link with Rxx.

Rxx(0) = E[X^2(t)], by definition. Remember, Rxx(tau) is
defined to be E[X(t)*X(t+tau)], so when tau = 0, this
yields E[X^2(t)].

>> PS: I'm a little put off that you haven't responded to my post on
>> the units of PSD. Did you see it?
>
>
> Yes!! And I apologize for letting it go unnoticed!!
> The thing is that I read it together with the other
> messages that finally made me decide that I was starting
> to sound like a troll, and thus decided to withdraw...
> My sincere apologies!

Acknowledged. Thanks, Carlos.

> Ironically, that was *the* message that best seemed to
> address my initial point (I mean, it was the one message
> that did not give me that feeling of "this guy didn't
> understand what I was asking" -- not that I'm saying
> that the others didn't understand; but almost all the
> other messages (the initial ones, at least) gave me, to
> some extent, the impression that they were going off
> on a tangent).
>
> When you showed me that the units of the PSD are indeed
> watts per hertz, that seemed like a very precise attempt
> at convincing me of what the PSD is... (still, after
> reading that message, I could not seem to reconcile
> ideas that seemed contradictory, and then again, that
> contributed to my decision that "maybe I should go
> re-study these concepts before I continue to bother
> these guys" :-))
>
> So, I'm hoping that you won't be too mad at me and will
> be willing to elaborate a bit on this contradiction of
> Rxx(0) being infinite while E{x^2} being finite.

Carlos, I am indeed not mad at you. Thanks for asking
these questions - it helps remind me and others of definitions
and concepts that tend to fade out unless you revisit them
often.

Regarding the contradiction, I hope my responses to you previously
in this post have clarified.

PS: Go to sci.math, where I posed this very question (in fact,
I had cut and pasted it here in my last message to you). As of
a few minutes ago, I hadn't yet gotten any responses.
--
% Randy Yates % "...the answer lies within your soul
%% Fuquay-Varina, NC % 'cause no one knows which side
%%% 919-577-9882 % the coin will fall."
%%%% <[email protected]> % 'Big Wheels', *Out of the Blue*, ELO
http://home.earthlink.net/~yatescr

Jerry Avins
11-02-2003, 05:41 PM
Carlos Moreno wrote:

...
>
> I'm really hoping that someone will be able and
> willing to make me understand what's happening!!
> (so that I can leave you guys alone AND sleep
> peacefully at the same time :-)) -- I'm serious!!
> I'm having nightmares about this!!! :-(((
>
> Cheers,
>
> Carlos
> --

By applying correct geometric reasoning to a flawed figure, I can prove
that two lines which intersect are parallel. So what?

Consider a line of LEDs a foot apart. Each draws 10 milliwatts. It's not
absurd to say that if the line extends from -infinity to infinity, then
the power represented by them in the aggregate is infinite. Absurdity
lies in the belief that a line of such extent must follow rules that
apply to things that can be.

Jerry
--
Engineering is the art of making what you want from things you can get.

Carlos Moreno
11-02-2003, 06:54 PM
Randy Yates wrote:

> It is indeed a contradiction to me, and I cannot resolve
> the contradiction. It is one I have had for years.

Hmmm... This thread is becoming interesting!! :-)

>> In fact, now that you "equate" the results of those two
>> different approaches, I guess one problem (maybe part of
>> the same problem?) is that I'm not sure I understand the
>> justification of defining the PSD and Rxx as a Fourier
>> transform pair.
>
> The justification of the definition? Are you asking how
> Wiener and Khinchine proved the theorem that the PSD
> is the Fourier transform of the autocorrelation function?

Well, I wasn't looking for something as rigorous as a proof
of the theorem. Just an interpretation of the definition
that helps me understand it better (as opposed to an
interpretation *of the results* or an interpretation of
the consequences of such definition -- that, I think I
understand relatively well).

>> Fine. So, why the contradiction? (or the "apparent"
>> contradiction of results?).
>
> Because Rxx(tau) != F^(-1)[Sxx(w)], and they should be.

Ok.

>> In fact, how would Rxx(0) be related to the average
>> electrical power of the signal? (the average "watts"
>> that a voltage signal would dissipate on a 1-ohm resistor).
>> I seem to see clearly how the E{x^2} is related to the
>> average power of a voltage signal, but not so sure that
>> I understand the link with Rxx.
>
>
> Rxx(0) = E[X^2(t)], by definition.

Huh??

I thought Rxx(T) was defined as the time-domain correlation:

                oo
                 /
                 |
     Rxx(T) =    |  X(t) X(t+T) dt
                 |
                 /
               -oo

so that Rxx(0) is the integral of X^2(t) dt.

Notice that in that case, this represents the total energy
of the signal... Hmmm, though something confuses me again:
this would be the energy of *one particular* realization.
(so, I think this brings me back to my original doubt:
what the hell does PSD and Rxx represent for a random
process?)

> Carlos, I am indeed not mad with you.

I never actually thought you'd be mad, of course! :-)

> PS: Go to sci.math, where I posed this very question

Still unanswered. We'll wait. Maybe in that reference I
was pointed to (Priestley, vol. 1) such a proof/justification
is given?

Thanks,

Carlos
--

Carlos Moreno
11-02-2003, 08:15 PM
Jerry Avins wrote:

> By applying correct geometric reasoning to a flawed figure, I can prove
> that two lines which intersect are parallel. So what?
>
> Consider a line of LEDs a foot apart. Each draws 10 milliwatts. It's not
> absurd to say that if the line extends from -infinity to infinity, then
> the power represented by them in the aggregate is infinite. Absurdity
> lies in the belief that a line of such extent must follow rules that
> apply to things that can be.

Jerry,

You seem to keep missing my point. What you're saying is more
or less equivalent to deriving an incorrect property of sinusoidal
signals and claiming that that's because complex numbers do not
exist in nature, and thus it is absurd to pretend that we'll get
a meaningful result when applied to something that exists in
nature, etc. etc.

Complex numbers *are* an absurdity, the way you define absurdity.

It is a mathematical abstraction that does not correspond to anything
that exists in our universe (at least nothing that we humans are aware
of).

Complex numbers simply follow *the mathematical rules* established
by their purely mathematical definition. Following **those**
mathematical definitions, rules, and axioms, you derive other
results.

Provided the definition that i^2 is equal to -1, Euler derived,
following that definition and following the usual axioms of
mathematics/algebra/calculus, that e^(x+iy) = e^x (cos y + i sin y).

You could claim that anything that you derive from there is plain
wrong, since the imaginary unit does not exist in nature.

Same thing with the delta -- you claim that none of what I've said
makes sense because white noise can not exist in nature!

Then why do you accept that results derived by applying the
properties and the definition of Dirac's delta are indeed correct?
Dirac's delta most certainly cannot exist in nature; however,
the results that you obtain using that mathematical abstraction
are verifiable mathematically (and, from the practical point
of view, they also have relevance, since they *very closely*
approximate the way certain physical systems behave). You convert
a differential equation to its integral form, apply the
definition of the delta (a definition that has no place in the real
world), which implies that the integral from -epsilon to epsilon
is 1 (for every epsilon > 0), and you obtain results that do not
contradict any other result obtained using mathematical constructs.

I insist: it does not make sense to use the argument "white noise
can not exist in nature" to justify a contradiction that arises
in mathematical terms -- two different ways of analytically
calculating something that should be equivalent, yield different
results; your argument would be valid if I claimed that in an
actual experiment, I expect to obtain whatever this or whatever
that, and I obtain something different from what the formulas tell
me... Then, your argument *might* be valid (emphasis on the
*might*; the conclusions you draw from properties of complex
numbers, Dirac's delta, etc., *are indeed* verifiable in practice,
with an accuracy given by the precision of the model, the
implementation, and the precision of the measurements).


Carlos
--

Piergiorgio Sartor
11-02-2003, 08:55 PM
Carlos Moreno wrote:
[...]

My feeling is that you're mixing things that
seem to be the same, but they're not.

The PSD of a stochastic process is the "power"
or "energy" the process can have in terms of
probability, not in terms of physical energy.

In case of white noise, it means that there
is no limitation to the possible sequence.
If it is not white, then the process is limited in
its "possible" realizations, and this is reflected
in the PSD as well.

The energy or power you calculate for a possible
realization is something else; it does not relate
to the PSD in any physical sense.
It just happens that they can be interchanged, but they
should not be.

bye,

--

piergiorgio

Carlos Moreno
11-02-2003, 10:53 PM
Piergiorgio Sartor wrote:
> Carlos Moreno wrote:
> [...]
>
> My feeling is that you're mixing things that
> seem to be the same, but they're not.
>
> The PSD of a stochastic process is the "power"
> or "energy" the process can have in terms of
> probability, not in terms of physical energy.

Hmm... Interesting (i.e., interesting how for
so many years I've misunderstood all these
concepts :-) I mean, :-( ).

I would ask you to define "power the process can
have in terms of probability"... But I guess
this falls again in the "I-better-go-do-my-
homework" category (homework not in the sense
that this question is part of some homework I
got; but in the sense that I have my own work
to do in investigating these things)

> In case of white noise, it means that there
> is no limitation to the possible sequence.

Can you elaborate?? No limitation to the possible
sequence meaning what exactly? (what kind of
limitation -- or lack thereof -- are we talking
about?)

> The energy or power you calculate for a possible
> realization is something else; it does not relate
> to the PSD in any physical sense.

I'm now confused: why, then, are the units of the PSD
so conveniently watts per Hz?

Thanks,

Carlos
--

Jerry Avins
11-02-2003, 11:09 PM
Carlos Moreno wrote:

> Jerry Avins wrote:
>
>> By applying correct geometric reasoning to a flawed figure, I can prove
>> that two lines which intersect are parallel. So what?
>>
>> Consider a line of LEDs a foot apart. Each draws 10 milliwatts. It's not
>> absurd to say that if the line extends from -infinity to infinity, then
>> the power represented by them in the aggregate is infinite. Absurdity
>> lies in the belief that a line of such extent must follow rules that
>> apply to things that can be.
>
>
> Jerry,
>
> You seem to keep missing my point. ...

It's worse than that: We're missing each other's points. Mine are

1. When integrating from -infinity to +infinity that which exists only
locally in time, the result is of no practical interest. Maybe that's too
strong. Put another way, integrating over all frequencies a function that
represents power density but is valid only within a limited band can yield
the surprising conclusion that troubles you.

2. The assumption that successive samples of a signal are statistically
independent is only valid when the bandwidth of the signal is less than
half the sample rate. Your assumed properties of the signal should assure
you that the samples you hypothesize, and the calculations you perform on
them, are worthless.

Jerry
--
Engineering is the art of making what you want from things you can get.

Randy Yates
11-03-2003, 01:12 AM
Carlos Moreno wrote:
> Randy Yates wrote:
>>
>> Rxx(0) = E[X^2(t)], by definition.
>
>
> Huh??
>
> I thought Rxx(T) was defined as the time-domain correlation:
>
>                 oo
>                  /
>                  |
>      Rxx(T) =    |  X(t) X(t+T) dt
>                  |
>                  /
>                -oo
>
> so that Rxx(0) is the integral of X^2(t) dt.
>
> Notice that in that case, this represents the total energy
> of the signal... Hmmm, though something confuses me again:
> this would be the energy of *one particular* realization.
> (so, I think this brings me back to my original doubt:
> what the hell does PSD and Rxx represent for a random
> process?)

Yup, there's that definition too. But the autocorrelation function
which the Wiener-Khinchine theorem utilizes is the probabilistic
one.

>> PS: Go to sci.math, where I posed this very question
>
>
> Still unanswered. We'll wait. Maybe in that reference I
> was pointed to (Priestly, vol. 1) such proof/justification
> is given?

Maybe. If you find it there, please let me know.
--
% Randy Yates % "...the answer lies within your soul
%% Fuquay-Varina, NC % 'cause no one knows which side
%%% 919-577-9882 % the coin will fall."
%%%% <[email protected]> % 'Big Wheels', *Out of the Blue*, ELO
http://home.earthlink.net/~yatescr

Piergiorgio Sartor
11-03-2003, 08:57 AM
Carlos Moreno wrote:

>> The PSD of a stochastic process is the "power"
>> or "energy" the process can have in terms of
>> probability, not in terms of physical energy.
>
> Hmm... Interesting (i.e., interesting how for
> so many years I've misunderstood all these
> concepts :-) I mean, :-( ).

Well, I'm not sure I'm really on the right way...

> I would ask you to define "power the process can
> have in terms of probability"... But I guess

Maybe see below.

> this falls again in the "I-better-go-do-my-
> homework" category (homework not in the sense
> that this question is part of some homework I
> got; but in the sense that I have my own work
> to do in investigating these things)

Well, I think that posing doubts is a good
way to proceed.

>> In case of white noise, it means that there
>> is no limitation to the possible sequence.
>
> Can you elaborate?? No limitation to the possible
> sequence meaning what exactly? (what kind of
> limitation -- or lack thereof -- are we talking
> about?)

Let's say you have white noise (discrete time, it's easier).
This means that if at time T1, x[T1] = 1, then at
time T2, x[T2] can be anything.

If the process is not white, for example if it is low-pass
filtered, then if x[T1] = 1, x[T2] cannot be anything:
there is not enough "power" to change it to just any of
the possible values.
It will be limited to values close to 1; the
more it is low-passed, the more it is limited.
The wider the spectrum, the higher the probability
of having "anything" as the sequence.

As the "power" of the stochastic process increases,
the probability of having something different is
higher. It is something like the probability being
boosted by the statistical power.

That's my interpretation.
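That interpretation can be illustrated numerically; a minimal sketch (assuming NumPy, with an 8-tap moving average as an example low-pass filter):

```python
import numpy as np

# Numerical sketch (assumptions: NumPy, and an 8-tap moving average
# as an example low-pass filter).  For white noise the next sample is
# unconstrained by the current one; after low-pass filtering,
# adjacent samples become strongly correlated.
rng = np.random.default_rng(2)
n = 200_000
white = rng.normal(size=n)
lp = np.convolve(white, np.ones(8) / 8, mode="valid")  # crude low-pass

def lag1_corr(x):
    # correlation between each sample and the next
    return np.corrcoef(x[:-1], x[1:])[0, 1]

print(lag1_corr(white))   # ~ 0: x[T2] can be "anything"
print(lag1_corr(lp))      # ~ 0.875: x[T2] is tied to x[T1]
```

The 0.875 figure is just the lag-1 correlation of a length-L moving average, (L-1)/L with L = 8.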

>> The energy or power you calculate for a possible
>> realization is something else; it does not relate
>> to the PSD in any physical sense.
>
> I'm now confused: why, then, are the units of the PSD
> so conveniently watts per Hz?

Why not?

It's "power" at the end, but it must be interpreted in
a slightly different way.

I think these are very often related to each other, but they
must not be swapped without good sense.

bye,

--
Piergiorgio Sartor

Dilip V. Sarwate
11-03-2003, 09:13 PM
Randy Yates <[email protected]> wrote in message news:<[email protected]>...


> It is indeed a contradiction to me, and I cannot resolve
> the contradiction. It is one I have had for years.


Randy:

We have crossed swords on this point before, but what the
heck, once more into the breach...

Let us first talk of energy and power for a deterministic signal.
A signal x(t) delivers (say, during the interval [0, T], into a 1
ohm resistor) an amount of energy that can be expressed as

En(T) = *integral* of x^2(t) from 0 to T

The *average power* during this interval is (1/T).En(T) while the
*instantaneous power* P(t) at any time t during this interval is the
ratio of the energy delivered during a small interval of duration dt
to the length of the interval, i.e. P(t) = [En(t + dt) - En(t)]/dt,
or more properly, the limiting value of this ratio as dt becomes
very small. We can extend all these notions to the whole real line
appropriately, and for many signals, we will have that En(T) increases
without bound as T approaches infinity while (1/T).En(T) approaches a
constant value (called the average power of the signal), but let's
just stick to a finite length interval for a while.
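These definitions can be tried on a concrete example signal; a minimal sketch (assuming NumPy, with x(t) = cos(t) as the example):

```python
import numpy as np

# Illustrative example (assumed signal): x(t) = cos(t) on [0, T].
# En(T) = integral of x^2 grows without bound, while the average
# power (1/T).En(T) settles at a constant, 1/2 -- the "average power"
# described above for signals of this kind.
for T in (10.0, 100.0, 1000.0):
    t = np.linspace(0.0, T, int(T) * 1000)
    avg_power = np.mean(np.cos(t) ** 2)   # ~ (1/T).En(T)
    energy = avg_power * T                # ~ En(T)
    print(T, energy, avg_power)           # En(T) grows; En(T)/T -> 0.5
```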

Now, underlying all this mathematical malarkey is the notion that
x^2(t) is an *integrable* function of t. Next, let us consider the
random process that you and Carlos are thinking about, viz. a random
process for which each X(t) is uniformly distributed on [-1, 1] and
for which each X(t) is independent of all other X(t'). If we think
about the *entire set* of realizations or sample functions of this
process, then we see that these realizations are nothing more and
nothing less than the set of *all* functions x(t) such that |x(t)|
is at most 1 over the interval under consideration. We can define
the energy delivered for all these signals as above. Or can we?
Are *all* (or almost all) bounded functions also integrable functions?
No. In fact, almost all of these sample functions of the random process
are *not integrable*. Thus it is not clear how we can define the
energy delivered for the sample functions of the process that you
are considering.

But, you say, a typical sample function is, after all, a signal
that we can produce in the lab, and therefore we can measure its
energy and figure out how to define the appropriate formula for
energy. Well, as I pointed out in a different subthread, a typical
sample function is discontinuous almost everywhere, and requires
infinite currents to flow to charge and discharge capacitors
instantaneously. Can we get a pretty good approximation by creating
a sampled version? Well, you have then "filtered" the process in some
sense when you sample it, and the discrepancy between infinite power
and finite power disappears.
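That last step, where sampling or filtering makes the discrepancy disappear, can be illustrated numerically; a minimal sketch (assuming NumPy, with arbitrary example values for the sample rate, PSD level, and bandwidths):

```python
import numpy as np

# Numerical sketch (assumed parameters: fs, N0, and the bandwidths B
# are arbitrary example values).  Start from a sampled "white"
# process whose two-sided PSD level is N0 (so its variance is
# N0 * fs), then band-limit it to |f| <= B with an ideal FFT-domain
# filter.  The surviving power is N0 * 2B: finite for every finite
# bandwidth, and unbounded only in the limit B -> infinity.
rng = np.random.default_rng(3)
fs, N0 = 1000.0, 0.01                 # sample rate (Hz), PSD level
n = 2 ** 18
x = rng.normal(0.0, np.sqrt(N0 * fs), size=n)

freqs = np.fft.fftfreq(n, d=1.0 / fs)
X = np.fft.fft(x)

measured = {}
for B in (50.0, 100.0, 200.0):
    Xb = np.where(np.abs(freqs) <= B, X, 0.0)
    xb = np.fft.ifft(Xb).real
    measured[B] = np.mean(xb ** 2)    # tracks N0 * 2B
    print(B, measured[B], N0 * 2 * B)
```

Doubling B doubles the measured power, which is how the "infinite power" of ideal white noise should be read: as a limit, not a value.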

In summary, I suggest that we all stop trying to look for Wiener's
or Khinchine's original definitions and/or proofs in the hopes of
resolving the apparent contradictions. The answer is not there but
rather in basic calculus which we have forgotten in our rush to
judgment... The Wiener-Khinchine formulation does not apply to the
so-called white noise process with finite variance.

--Dilip Sarwate