
The Secret Life of Modern RF Signals - Part 4

By Ash Mahjoubi-Amine, Spirent, on September 1, 2009

It’s time to answer a question from a couple of weeks back: “if fading is so random, how can we control and quantify what it’s doing?” In other words, how do we express it, record it and eventually repeat it?

One key concept is the idea of Level Crossing Rates (LCR). At a high level, all this means is that a) we choose a threshold level below which we call a fade a “deep fade,” and b) we count the rate at which the signal dips below this level.


For a Rayleigh-faded signal, the LCR is:

Equation 1

$$N_R = \sqrt{2\pi}\, f_d\, \rho\, e^{-\rho^{2}}$$

Where:

$N_R$ is the level crossing rate in crossings per second, $f_d$ is the maximum Doppler frequency in Hz, and $\rho = R/R_{\mathrm{rms}}$ is the chosen threshold level $R$ normalized to the RMS envelope level.

Table 1 - Normalized level crossing rate (computed from Equation 1 at f_d = 1 Hz; rates scale linearly with f_d)

Fade depth below mean (dB)   rho = R/Rrms   N_R (per second)
10                           0.316          0.717
15                           0.178          0.432
20                           0.100          0.248
25                           0.056          0.141
30                           0.032          0.079

For example, Table 1 shows that for a Doppler frequency of 1 Hz, a fade at least 25 dB below the mean power level will occur 0.141 times per second on average (roughly once every 7 seconds).
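To make Equation 1 concrete, here is a small Python sketch (mine, not from the original post; the function name is arbitrary) that reproduces the 25 dB entry from Table 1:

```python
import math

def rayleigh_lcr(fade_depth_db, doppler_hz):
    """Level crossing rate of a Rayleigh-faded signal (Equation 1).

    fade_depth_db: threshold depth below the mean power level, in dB
    doppler_hz:    maximum Doppler frequency f_d, in Hz
    """
    rho = 10 ** (-fade_depth_db / 20)  # amplitude ratio relative to the RMS level
    return math.sqrt(2 * math.pi) * doppler_hz * rho * math.exp(-rho ** 2)

# Spot check against Table 1: 25 dB fades at f_d = 1 Hz
print(rayleigh_lcr(25, 1))  # ~0.141 crossings per second
```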

Suppose a test requires us to observe 25 “deep” fades, where we’ve defined “deep” to mean “25 dB lower than the average.” Over a very long period of time, it will take an average of 178 seconds to observe 25 deep fades. But this is another way of saying, “if we observe fading for 178 seconds, we have roughly 50% confidence that there will be at least 25 ‘deep’ fades.” We need better than 50% confidence. As a rule of thumb in this kind of testing, we usually define an acceptable level of confidence as 95% or greater.
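Spelling out the arithmetic behind that figure (using the unrounded rate from Equation 1, $N_R \approx 0.1405$ per second):

$$t_{\mathrm{mean}} = \frac{k}{N_R} \approx \frac{25}{0.1405\ \mathrm{s^{-1}}} \approx 178\ \mathrm{seconds}$$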

We now know the mean rate of occurrence, and we can intuit that the arrival of the next deep fade is independent of the arrivals of previous deep fades. This may ring a bell if you remember statistics from college… it’s similar to simple queuing theory or traffic modeling, so the Poisson distribution is a pretty good model. The probability mass function is:

Equation 2

$$P(k) = \frac{(N_R t)^{k}\, e^{-N_R t}}{k!}$$

where t is the time in seconds, k is the number of events we need to capture, and $N_R t$ is the expected number of deep fades in time t. This gives us probabilities of discrete numbers of deep-fade arrivals, but we’re looking for the probability of at least a specific number of arrivals. In other words, we want to solve this for t:

Equation 3

$$P(X \ge k) = 1 - \sum_{i=0}^{k-1} \frac{(N_R t)^{i}\, e^{-N_R t}}{i!}$$
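As a quick numerical check on Equations 2 and 3 (a sketch of my own; SciPy’s cumulative distribution function does the summation for us), here is the confidence at the 178-second mark:

```python
from scipy.stats import poisson

N_R = 0.141  # deep-fade rate from Table 1 (25 dB threshold, f_d = 1 Hz)
k = 25       # number of deep fades we need to observe
t = 178.0    # observation time in seconds

lam = N_R * t  # Poisson mean: expected number of deep fades in t seconds
confidence = 1 - poisson.cdf(k - 1, lam)  # Equation 3: P(at least k arrivals)
print(confidence)  # ~0.53, i.e. roughly the 50% figure quoted above
```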

At this point we see an equation with no closed-form solution for t, and we become grateful for spreadsheet software, especially when it has a cumulative Poisson distribution function built in. If you elect to try the spreadsheet’s smart trial-and-error process (the “solver” routine), you should know that you’ll need to build up the summation shown above, because a built-in Poisson routine usually expects integer values of $N_R t$.

Equation 4

$$P(X \le k-1) = e^{-N_R t}\left(1 + N_R t + \frac{(N_R t)^{2}}{2!} + \cdots + \frac{(N_R t)^{k-1}}{(k-1)!}\right)$$
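If you’d rather mirror what the spreadsheet is doing, here is a sketch (my own, not the author’s workbook) that builds the Equation 4 summation term by term; each term comes from the previous one multiplied by $N_R t / i$, so no factorials are needed:

```python
import math

def prob_fewer_than(k, lam):
    """Equation 4: P(X <= k-1) for a Poisson process with mean lam."""
    term = math.exp(-lam)  # the i = 0 term
    total = term
    for i in range(1, k):
        term *= lam / i    # turn the (i-1)th term into the ith term
        total += term
    return total

# Same check as before: confidence of seeing 25+ deep fades in 178 s
print(1 - prob_fewer_than(25, 0.141 * 178))  # ~0.53, matching the built-in CDF
```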

You can still solve for t in the spreadsheet using its built-in functions. See Figure 1.

Figure 1 - Setup for solving fading test times

 

By adjusting the value for t (cell C6) you can find the point at which the Poisson probability crosses the desired confidence level. As a sanity check, verify that the 50% confidence level (for 25 events) occurs at about 178 seconds.

Having done that, you can plug in values for t to determine the point at which the confidence level reaches about 95% (see Figure 2). You should see that this occurs at about 240 seconds.

Figure 2 - Solving for 95%

 
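If you don’t have a spreadsheet handy, the same trial-and-error search is easy to script (a sketch under the same assumptions as above; a plain one-second scan stands in for the solver routine):

```python
from scipy.stats import poisson

N_R, k = 0.141, 25  # deep-fade rate (per second) and required event count

def confidence(t):
    """P(at least k deep fades in t seconds), per Equation 3."""
    return 1 - poisson.cdf(k - 1, N_R * t)

for target in (0.50, 0.95):
    t = 0.0
    while confidence(t) < target:
        t += 1.0  # one-second steps are fine for this sanity check
    print(target, t)  # ~175 s for the 50% crossing, ~240 s for 95%
```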

This is a seriously non-rigorous discussion, but it may help you get a handle on the statistical processes behind fading as it’s used in testing. In the last installment of this blog, I mentioned that a lot of testing requires minimum time periods when fading is used. I also mentioned that repeatedly reproducing a set of fading data is not sufficient to meet test-time requirements. If you followed this week’s hint of a random-process discussion, it should now be clear why fading tests require large data sets created in real time without repetition. Note also that we’ve really only scratched the surface of the statistical models. To make a very long story short, the Poisson distribution is itself a simplification, a limiting case of the binomial distribution. If we wanted to (we don’t), we could dig in and blog for years about this one topic.

Anyway, that’s enough for this week.

 
