AWGN, WGN, Autocorrelation and PSD Explained using Matlab - YouTube
Channel: ECE with AK Hassan
In this video, we are going to understand and visualize, at a certain level of abstraction, what the Gaussian distribution is, what Gaussian noise is, what white Gaussian noise (WGN) is, and what additive white Gaussian noise (AWGN) is. We will be using a MATLAB simulation; the code of this simulation is given in the description of this video. Using this simulation we will look at a stem (time-series) plot of the random process, and then at the probability density function of white Gaussian noise. Next, we will move on to the autocorrelation function of white noise, later we will look into the power spectral density of white noise, and lastly we will see how additive white Gaussian noise affects a given signal.
Note that noise is present in most systems; however, let us restrict ourselves to a particular system, a communication system. Here we have a broadcast station, and this broadcast station is transmitting a signal wirelessly to a user. The user device, a cell phone, has an antenna, so the EM wave that was transmitted from the broadcast station is received at the receive antenna of the user device. Within this receive antenna there is a random movement of electrons, which are agitated and dissipate unwanted energy. This is a random process, and the emerging signal is often modelled as Gaussian noise.
This Gaussian noise has the probability density function given by the expression referred to here, with the plot shown alongside it. This mu is the mean, or average, value, while sigma is the standard deviation; its square is the variance, and the variance identifies the dispersion in the data. Note that the Gaussian distribution is also referred to as the normal distribution or the bell-shaped distribution. We call it bell-shaped because the opening of the bell is controlled by the variance sigma squared, such that a higher variance gives a wider opening than a lower variance.
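For reference, the expression referred to on screen is the standard Gaussian PDF:

f_X(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left( -\frac{(x-\mu)^2}{2\sigma^2} \right)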
Now, for the random noise we have considered previously, the mean value is often set to zero: x follows a normal distribution with a certain mu, and that mu is equal to 0 in the present case, while the variance is sigma squared. As mu is equal to zero, here we have a zero-mean random variable; this is the zero-mean probability density function of white Gaussian noise. Now, if we want to find the probability of having a value between, say, minus 0.2 and minus 0.1, all we need to do is find the area under the curve in that region, which is the probability of lying between minus 0.2 and minus 0.1. Do note that integrating the PDF curve from minus infinity to infinity yields a value of 1.
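As a quick check of this area-under-the-curve idea, such a probability can be computed with normcdf from the Statistics and Machine Learning Toolbox; the mu and sigma values below are illustrative, chosen to match the zero-mean, sigma = 0.1 noise used later in the video:

mu = 0; sigma = 0.1;                                        % illustrative values
p = normcdf(-0.1, mu, sigma) - normcdf(-0.2, mu, sigma);    % P(-0.2 < X < -0.1)
total = normcdf(Inf, mu, sigma) - normcdf(-Inf, mu, sigma); % whole PDF integrates to 1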
Now let us relate this random variable to our random process, shown here as a stem (time-series) plot; let us hold our horses on the 'white' aspect for the time being. Each value here is an outcome of the random variable: this is the first outcome, and similarly we have the second outcome, the third outcome, and so on. This random process can be expressed as p(t); note that this is a time-domain plot.
For such a random process we cannot take the Fourier transform directly, because in that scenario we would have to take the Fourier transform of each realization and then take the average to find the power spectral density. An alternative means is to find the autocorrelation function. Let us revisit the random process that we referred to as p(t): we can find its autocorrelation function, denoted R, which is a function of tau. Again, we take the random process p(t) and multiply it by a shifted version of itself, where p(t + tau) is the version shifted in the time domain, and the amount of shift is controlled by the variable tau. So if the first plot is p(t), the second plot is p(t + tau), and we shift it.
At each increment of this shift we multiply and take the expectation, and in this way we find the autocorrelation function; again, that is the correlation of this process with a time-shifted version of itself, which is plotted here. An important property of white noise is that the autocorrelation function is non-zero only when the lag, or time shift, is zero. That is, when you set tau equal to zero you have p(t) multiplied by p(t), and taking the expectation gives the variance of the distribution. Hence, for white noise the autocorrelation function is equal to sigma squared times delta of tau, and for all other lags the value is zero.
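In symbols, the autocorrelation function and its white-noise special case are:

R_p(\tau) = \mathbb{E}\big[ p(t)\, p(t+\tau) \big], \qquad R_p(\tau) = \sigma^2\, \delta(\tau) \ \text{for white noise.}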
You will also come across the terminology i.i.d., which refers to random variables that are independent and identically distributed: this outcome is independent of that outcome, yet both of them follow an identical distribution, which is Gaussian in our case. Now, using this autocorrelation function, we can reach the power spectral density by simply taking its Fourier transform. That is, we have a random process p(t), we take its autocorrelation function to reach R(tau), and finally we take the Fourier transform of R(tau) to get the power spectral density, which is often denoted S_p(f).
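This is the Wiener-Khinchin relation; for white noise the delta-shaped autocorrelation transforms into a flat spectrum:

S_p(f) = \int_{-\infty}^{\infty} R_p(\tau)\, e^{-j 2\pi f \tau}\, d\tau = \sigma^2 \ \text{for all } f.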
For the autocorrelation function we noted that, for white noise, it is non-zero only at a time delay of 0 and is 0 elsewhere. Now for the PSD: in the context of white noise, the power is distributed equally among all frequencies. This is an important consideration; the power level for all frequencies is basically 10 log10 of the variance, which gives the power on a dB scale, here 10 log10(0.01) = -20 dB. Note that in our experiment the variance is set to 0.01, which also appears here as 0.01 times delta of tau in the autocorrelation plot, and in the code it appears at this point.
Now, in the context of additive white Gaussian noise, consider that we have a signal which is simply a sine wave, represented by this bold line. If we add white Gaussian noise to it, that is additive white Gaussian noise, we get fluctuations around the signal, and the level of spread is based on the distribution we considered previously, the Gaussian distribution, with the spread controlled by the variance sigma squared. Now let us move to the MATLAB code. We have included a preamble here, and in this preamble we run the experiment for 10 seconds: the time t in line 5 starts from 0 and terminates at the time span, which is 10 seconds, with an increment of the sampling time ts, equal to 1 over fs; the sampling frequency is set to 10,000 hertz in line 6. The variable capital L identifies the total number of samples to be taken, and it depends on t.
Next we include some statistical parameters: the mean value mu, the variance, which we have set to 0.01, the standard deviation, which is the square root of the variance, and the variance on a dB scale, that is 10 log10 of the variance. In line 14 we generate the random process: capital X equals the square root of the variance times randn (normally distributed random numbers) with L rows and one column, plus mu, the mean, which we have anyhow set to zero.
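A minimal sketch of the preamble and noise generation described here (variable names and exact line layout are assumptions; the actual script is linked in the video description):

fs       = 10000;              % sampling frequency, 10 kHz
ts       = 1/fs;               % sampling interval
tspan    = 10;                 % run the experiment for 10 seconds
t        = 0:ts:tspan-ts;      % time vector
L        = length(t);          % total number of samples (100,000)

mu       = 0;                  % mean of the noise
variance = 0.01;               % variance (sigma^2)
sigma    = sqrt(variance);     % standard deviation
var_dB   = 10*log10(variance); % variance on a dB scale (-20 dB)

X = sigma*randn(L,1) + mu;     % white Gaussian noise, L rows by 1 column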
[608]
random variable in figure one by using a stem聽
plot that is the plot in discrete time events聽聽
[614]
and we include some aesthetics in terms of title聽
and labels moreover we limit the x's specifically聽聽
[623]
the x-axis why we have limited because the聽
total number of samples that we consider are聽聽
[630]
hundred thousand this is the length of l聽
previously mentioned but that would be too much聽聽
[637]
data to analyze so we are just going to visualize聽
zero to 25 samples initially and until keyboard聽聽
[645]
the simulation would yield this plot which聽
is a time series plot of the random process聽聽
[651]
of white caution noise so let us understand聽
further this white gaussian noise聽聽
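A sketch of the figure 1 stem plot, assuming the X generated in the setup sketch above (the exact styling commands are in the linked script):

figure(1);
stem(X(1:25));                                   % first 25 of the 100,000 samples
title('White Gaussian noise, time series');
xlabel('Sample index'); ylabel('Amplitude');
xlim([0 25]);                                    % limit the x-axis as in the video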
Let us understand this white Gaussian noise of 100,000 samples further by means of an audio signal: what would white Gaussian noise sound like? To play the sound in MATLAB, lines 26 to 28, specifically line 27, use the command sound(X), and that sound is this one.
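Playing the noise as audio might look like the following, assuming the X and fs from the setup sketch above; sound interprets X as an audio waveform at the sampling rate fs:

sound(X, fs);   % 10 seconds of Gaussian 'hiss' at 10 kHz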
Now, for the plot of the probability density function of the Gaussian random variable, we have used figure 2 and specifically the ksdensity command applied to the random process X, and the result is plotted here in figure 2 with some labeling.
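A sketch of the figure 2 PDF estimate; ksdensity (Statistics and Machine Learning Toolbox) plots a smoothed kernel density estimate when called without output arguments:

figure(2);
ksdensity(X);                                   % estimated PDF of the noise
title('PDF of white Gaussian noise');
xlabel('Amplitude'); ylabel('Probability density');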
For the autocorrelation function we have used xcorr and included the 'biased' option as an argument; this function gives us the autocorrelation at different values of lag, which are plotted in figure number 3. So, up to the keyboard command in line 50, let us run the experiment. We now have two different plots: the first plot is the PDF of white Gaussian noise, whereas the second plot is the autocorrelation function.
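A sketch of the figure 3 autocorrelation, assuming xcorr from the Signal Processing Toolbox with the 'biased' normalization mentioned in the video:

[r, lags] = xcorr(X, 'biased');                 % biased estimate of E{X(n)X(n+lag)}
figure(3);
plot(lags, r);                                  % peak of about sigma^2 = 0.01 at lag 0
title('Autocorrelation of white Gaussian noise');
xlabel('Lag'); ylabel('R(\tau)');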
Previously, in figure 1, we noted that this is our random process; if we take its autocorrelation function, which is this one, and then take the Fourier transform of that function, we should reach the power spectral density. However, applying the Fourier transform directly to this autocorrelation function in the present scenario does not give us the power spectral density. The reasons are that our simulation deals with pseudorandom numbers, and moreover the lags are finite in duration. So an alternative route to finding the power spectral density is adopted from a reference, changed a little bit.
In figure 4 we reshape the random process X into a matrix of 1000 rows by 100 columns; that is, we partition X into 100 random processes, each with 1000 realizations, and we refer to this matrix as X1. We take the fast Fourier transform of X1, then normalize it in line 60, and compute the mean power from this FFT. We also normalize the x-axis, and from line 63 until line 69 we apply some display settings for the PSD plot. So, up to line 69, let us run the experiment to plot the PSD, which is shown here in dB per hertz.
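A sketch of the figure 4 PSD estimate described here: partition the noise into 100 realizations of 1000 samples, FFT each column, and average the power. The normalization below is one reasonable choice, not necessarily the one used in the linked script:

X1 = reshape(X, 1000, 100);                     % 1000 rows by 100 columns
Xf = fft(X1);                                   % FFT of each realization (column-wise)
P  = mean(abs(Xf).^2, 2) / 1000;                % mean power per frequency bin
f  = (0:999) * (fs/1000);                       % frequency axis, 0 to fs
figure(4);
plot(f, 10*log10(P));                           % roughly flat near 10*log10(0.01) = -20 dB
xlabel('Frequency (Hz)'); ylabel('PSD (dB/Hz)');
title('PSD of white Gaussian noise');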
Now for the additive white Gaussian noise: in figure 5 we generate a new signal, and this signal is simply a sine wave. We time-scale that sine wave by a factor of 2 and also multiply it by 2 as a constant coefficient so that this new signal has a certain strength, and onto this new signal we add the white Gaussian noise that we created previously. From line 76 until line 82 we apply display settings and labels.
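A sketch of the figure 5 comparison; the exact sine-wave definition ("time-scale by 2, amplitude 2") is interpreted here as 2*sin(2*t), which is an assumption:

s     = 2*sin(2*t(:));                          % clean sine wave, column vector like X (assumed form)
noisy = s + X;                                  % add the white Gaussian noise
figure(5);
plot(t, s, 'r', t, noisy, 'g');                 % original (dark red) vs. signal + AWGN (green)
xlabel('Time (s)'); ylabel('Amplitude');
legend('Original signal', 'Signal with AWGN');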
Let us run the experiment: the dark red identifies the original signal, whereas the signal with AWGN is indicated by the green waveform. If we increase the noise power, that is, set the variance to, say, 0.1, evaluate that selection, and perform the AWGN analysis again, then in the new plot you can observe that the noise plays a much bigger role than in the previous situation, where the variance was only 0.01.