12 Overview of Statistical Signal Processing Charles W. Therrien
Discrete Random Signals • Linear Transformations • Representation of Signals as Random Vectors
• Fundamentals of Estimation
13 Signal Detection and Classification Alfred Hero
Introduction • Signal Detection • Signal Classification • The Linear Multivariate Gaussian Model
• Temporal Signals in Gaussian Noise • Spatio-Temporal Signals • Signal Classification
14 Spectrum Estimation and Modeling Petar M. Djurić and Steven M. Kay
Introduction • Important Notions and Definitions • The Problem of Power Spectrum Estimation
• Nonparametric Spectrum Estimation • Parametric Spectrum Estimation • Recent Developments
15 Estimation Theory and Algorithms: From Gauss to Wiener to Kalman Jerry M. Mendel
Introduction • Least-Squares Estimation • Properties of Estimators • Best Linear Unbiased Estimation • Maximum-Likelihood Estimation • Mean-Squared Estimation of Random Parameters •
Maximum A Posteriori Estimation of Random Parameters • The Basic State-Variable Model • State
Estimation for the Basic State-Variable Model • Digital Wiener Filtering • Linear Prediction in DSP,
and Kalman Filtering • Iterated Least Squares • Extended Kalman Filter
16 Validation, Testing, and Noise Modeling Jitendra K. Tugnait
Introduction • Gaussianity, Linearity, and Stationarity Tests • Order Selection, Model Validation,
and Confidence Intervals • Noise Modeling • Concluding Remarks
17 Cyclostationary Signal Analysis Georgios B. Giannakis
Introduction • Definitions, Properties, Representations • Estimation, Time-Frequency Links, Testing • CS Signals and CS-Inducing Operations • Application Areas • Concluding Remarks
STATISTICAL SIGNAL PROCESSING deals with random signals, their acquisition, their properties, their transformation by system operators, and their characterization in the time and frequency domains. The goal is to extract pertinent information about the underlying mechanisms that generate them or transform them. The area is grounded in the theories of signals and
systems, random variables and stochastic processes, detection and estimation, and mathematical
statistics. Random signals are temporal or spatial and can be derived from man-made (e.g., binary
communication signals) or natural (e.g., thermal noise in a sensory array) sources. They can be
continuous or discrete in their amplitude or index, but no exact expression describes their evolution.
Signals are often described statistically when the engineer has incomplete knowledge about their
description or origin. In these cases, statistical descriptors are used to characterize one’s degree of
knowledge (or ignorance) about the randomness. Especially interesting are those signals (e.g., stationary and ergodic) that can be described using deterministic quantities computable from finite data
records. Applications of statistical signal processing algorithms to random signals are omnipresent
in science and engineering in such areas as speech, seismic, imaging, sonar, radar, sensor arrays,
communications, controls, manufacturing, atmospheric sciences, econometrics, and medicine, just
to name a few. This chapter deals with the fundamentals of statistical signal processing, including
some interesting topics that deviate from traditional assumptions. The focus is on discrete-index
random signals (i.e., time series) with possibly continuous-valued amplitudes. The reason is twofold:
measurements are often made in discrete fashion (e.g., monthly temperature data); and continuously
recorded signals (e.g., speech data) are often sampled for parsimonious representation and efficient
processing by computers.
The first chapter of the section, written by Charles Therrien, reviews definitions, characterization,
and estimation problems involving random signals. The important notions outlined are stationarity, independence, ergodicity, and Gaussianity. The basic operations involve correlations, spectral
densities, and linear time-invariant transformations. Stationarity reflects invariance of a signal’s
statistical description with index shifts. Absence (or presence) of relationships among samples of a
signal at different points is conveyed by the notion of (in)dependence, which provides information
about the signal’s dynamical behavior and memory as it evolves in time or space. Ergodicity allows
computation of statistical descriptors from finite data records. In increasing order of computational
complexity, descriptors include the mean (or average) value of the signal, the autocorrelation, and
higher-than-second-order correlations, which reflect relations among two or more signal samples.
Complete statistical characterization of random signals is provided by probability density and distribution functions. Gaussianity describes probabilistically a particular distribution of signal values
which is characterized completely by its first- and second-order statistics. It is often encountered in
practice because, thanks to the central limit theorem, averaging a sufficient number of random signal
values (an operation often performed by, e.g., narrowband filtering) yields outputs which are (at least
approximately) distributed according to the Gaussian probability law. Frequency-domain statistical
descriptors inherit all the merits of deterministic Fourier transforms and can be computed efficiently
using the fast Fourier transform. The standard tool here is the power spectral density which describes
how average power (or signal variance) is distributed across frequencies; but polyspectral densities
are also important for capturing distributions of higher-order signal moments across frequencies.
Random input signals passing through linear systems yield random outputs. Input-output auto- and cross-correlations and spectra characterize not only the random signals themselves but also the
transformation induced by the underlying system.
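To make these estimators concrete, the following sketch (Python with NumPy; the AR(1) data, record length, and all names are illustrative choices, not taken from the chapter) computes the mean and autocorrelation as time averages over a single finite record, in the spirit of ergodicity, and forms a periodogram-style power spectral density estimate via the FFT.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stationary record: first-order autoregressive noise,
# x[n] = a*x[n-1] + w[n] with white Gaussian w (illustrative data).
N, a = 4096, 0.8
w = rng.standard_normal(N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = a * x[n - 1] + w[n]

# Ergodicity in practice: time averages over one finite record
# stand in for ensemble averages.
mean_est = x.mean()

# Biased sample autocorrelation r[k] = (1/N) * sum_n x[n] x[n+k].
def sample_autocorr(x, maxlag):
    N = len(x)
    return np.array([np.dot(x[:N - k], x[k:]) / N
                     for k in range(maxlag + 1)])

r = sample_autocorr(x - mean_est, maxlag=50)

# Periodogram: normalized squared magnitude of the data's FFT,
# an estimate of how average power spreads across frequency.
X = np.fft.rfft(x - mean_est)
periodogram = np.abs(X) ** 2 / N
freqs = np.fft.rfftfreq(N)  # in cycles/sample
```

The raw periodogram is only asymptotically unbiased and has high variance; the smoothing and averaging refinements discussed in the spectrum estimation chapter address exactly this.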
Many random signals as well as systems with random inputs and outputs possess finite degrees
of freedom and can thus be modeled using finite parameters. Depending on a priori knowledge,
one estimates parameters from a given data record, treating them either as random or deterministic.
Various approaches become available by adopting different figures of merit (estimation criteria).
Those outlined in this chapter include the maximum likelihood, minimum variance, and least-squares criteria for deterministic parameters. Random parameters are estimated using the maximum
a posteriori and Bayes criteria. Unbiasedness, consistency, and efficiency are important properties
of estimators which, together with performance bounds and computational complexity, guide the
engineer to select the proper criterion and estimation algorithm.
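As a minimal numerical illustration of these criteria, assume the familiar linear model y = Hθ + v with zero-mean white Gaussian noise; the setup below (regressors, noise level, and parameter values are hypothetical, not the chapter's example) computes the least-squares estimate, which coincides with maximum likelihood under this noise model, and checks unbiasedness empirically.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical linear observation model y = H @ theta + v:
# estimate a line's offset and slope from noisy samples.
n = 100
t = np.linspace(0.0, 1.0, n)
H = np.column_stack([np.ones(n), t])      # regressor matrix
theta_true = np.array([2.0, -1.5])
y = H @ theta_true + 0.3 * rng.standard_normal(n)

# Least-squares estimate: minimizes ||y - H @ theta||^2. For white
# Gaussian noise it is also the maximum-likelihood estimate.
theta_ls, *_ = np.linalg.lstsq(H, y, rcond=None)

# Empirical unbiasedness check: averaged over many independent noise
# realizations, the estimates should approach theta_true.
trials = np.array([
    np.linalg.lstsq(H, H @ theta_true + 0.3 * rng.standard_normal(n),
                    rcond=None)[0]
    for _ in range(500)
])
print(theta_ls, trials.mean(axis=0))
```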
While estimation algorithms seek values in the continuum of a parameter set, the need arises often
in signal processing to classify parameters or waveforms as one or another of prespecified classes.
Decision making with two classes arises frequently in practice, including as a special case the
simpler problem of detecting the presence or absence of an information-bearing signal observed
in noise. Such signal detection and classification problems along with the associated theory and
practice of hypothesis testing are the subject of the second chapter, written by Alfred Hero. The
resulting strategies are designed to minimize the average number of decision errors. Additional
performance measures include receiver operating characteristics, signal-to-noise ratios, probabilities
of detection (or correct classification), false alarm (or misclassification) rates, and likelihood ratios.
Both temporal and spatio-temporal signals are considered, focusing on linear single- and multivariate Gaussian models. Trade-offs include complexity versus optimality, off-line versus real-time
processing, and separate versus simultaneous detection and estimation for signal models containing
unknown parameters.
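The simplest such problem can be sketched directly: detecting a known waveform in white Gaussian noise, where the likelihood-ratio test reduces to thresholding a matched-filter statistic. In the Python below, the signal shape, noise level, target false-alarm rate, and Monte Carlo sizes are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Binary hypothesis test: H0: y = v  versus  H1: y = s + v,
# with v white Gaussian. The log-likelihood ratio is monotone in the
# matched-filter statistic T(y) = s @ y, so thresholding T is optimal.
N, sigma = 64, 1.0
s = np.cos(2 * np.pi * 0.1 * np.arange(N))   # known signal shape

def T(y):
    return s @ y   # matched filter (the sufficient statistic here)

# Monte Carlo: pick the threshold from the empirical H0 quantile for
# a 5% false-alarm rate, then measure the detection probability.
M = 20000
t0 = np.array([T(sigma * rng.standard_normal(N)) for _ in range(M)])
t1 = np.array([T(s + sigma * rng.standard_normal(N)) for _ in range(M)])
thresh = np.quantile(t0, 0.95)      # P(T > thresh | H0) ~ 0.05
p_fa = (t0 > thresh).mean()
p_d = (t1 > thresh).mean()
print(f"false alarm ~ {p_fa:.3f}, detection ~ {p_d:.3f}")
```

Sweeping the threshold and plotting detection probability against false-alarm rate traces out the receiver operating characteristic mentioned above.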
Parametric and nonparametric methods are described in the third chapter, written by Petar Djurić
and Steven Kay, for the basic problem of spectral estimation. Estimates of the power spectral density
have been used over the last century and continue to be of interest in numerous applications involving
retrieval of hidden periodicities, signal modeling, and time-series analysis problems. Starting with the
periodogram (normalized square magnitude of the data Fourier transform), its modifications with
smoothing windows, and moving on to the more recent minimum variance and multiple window
approaches, the nonparametric methods described here constitute the first step used to characterize
the spectral content of stationary stochastic signals. Factors dictating the designer’s choice include
computational complexity, bias-variance, and resolution trade-offs. For data adequately described
by a parametric model, such as the auto-regressive (AR), moving-average (MA), or ARMA model,
spectral analysis reduces to estimating the model parameters. Such a data reduction step achieved by
modeling offers parsimony and increases resolution and accuracy, provided that the model and its
order (number of parameters) fit the available time series well. Processes containing harmonic tones
(frequencies) have line spectra, and the task of estimating frequencies appears in diverse applications
in science and engineering. The methods presented here include both the traditional periodogram
as well as modern subspace approaches such as MUSIC and its derivatives.
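The contrast between the two families can be sketched in a few lines of Python; the two-tone test signal and the AR model order below are illustrative choices, not the chapter's examples. The parametric estimate solves the Yule-Walker equations for the AR coefficients and evaluates the implied spectrum.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two sinusoids in white noise (illustrative data).
N = 512
n = np.arange(N)
x = (np.sin(2 * np.pi * 0.12 * n) + 0.5 * np.sin(2 * np.pi * 0.19 * n)
     + 0.5 * rng.standard_normal(N))

# Nonparametric: the periodogram.
f = np.fft.rfftfreq(N)
P_per = np.abs(np.fft.rfft(x)) ** 2 / N

# Parametric: fit AR(p) by the Yule-Walker equations R a = r[1:p+1],
# where R is the Toeplitz matrix of autocorrelation lags.
def ar_yule_walker(x, p):
    N = len(x)
    r = np.array([np.dot(x[:N - k], x[k:]) / N for k in range(p + 1)])
    R = r[np.abs(np.subtract.outer(np.arange(p), np.arange(p)))]
    a = np.linalg.solve(R, r[1:])      # AR coefficients
    sigma2 = r[0] - a @ r[1:]          # driving-noise variance
    return a, sigma2

a, sigma2 = ar_yule_walker(x - x.mean(), p=8)

# AR spectrum: sigma2 / |1 - sum_k a_k exp(-j 2 pi f k)|^2.
e = np.exp(-2j * np.pi * np.outer(f, np.arange(1, len(a) + 1)))
P_ar = sigma2 / np.abs(1 - e @ a) ** 2
```

The AR spectrum is smooth and, when the order suits the data, sharper near the tone frequencies; a poorly chosen order illustrates the model-fit caveat above.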
Estimation from discrete-time observations is the theme of the next chapter, written by Jerry
Mendel. The unifying viewpoint treats both parameter and waveform (or signal) estimation from
the perspective of minimizing the averaged square error between observations and input-output
or state variable signal models. Starting from the traditional linear least-squares formulation, the
exposition includes weighted and recursive forms, their properties, and optimality conditions for
estimating deterministic parameters as well as their minimum mean-square error and maximum
a posteriori counterparts for estimating random parameters. Waveform estimation, on the other
hand, includes not only input-output signals but also state space vectors in linear and nonlinear
state variable models. Prediction, smoothing, and the celebrated Kalman filtering problems are
outlined in this framework and relationships are highlighted with the Wiener filtering formulation.
Nonlinear least-squares and iterative minimization schemes are discussed for problems where the
desired parameters are nonlinearly related to the data. Nonlinear equations can often be linearized,
and the extended Kalman filter is described briefly for estimating nonlinear state variable models.
Minimizing the mean-square error criterion leads to the basic orthogonality principle which appears
in both parameter and waveform estimation problems. Generally speaking, the mean-square error
criterion possesses rather universal optimality when the underlying models are linear and the random
data involved are Gaussian distributed.
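A minimal sketch of that predict/correct structure, for a scalar state-variable model with illustrative constants a, q, and r (none of this is the chapter's notation or example):

```python
import numpy as np

rng = np.random.default_rng(4)

# Scalar state-variable model:
#   state:       x[k+1] = a * x[k] + w[k],  w ~ N(0, q)
#   measurement: z[k]   = x[k] + v[k],      v ~ N(0, r)
a, q, r = 0.95, 0.1, 1.0
steps = 200
x = np.zeros(steps)
z = np.zeros(steps)
for k in range(1, steps):
    x[k] = a * x[k - 1] + np.sqrt(q) * rng.standard_normal()
    z[k] = x[k] + np.sqrt(r) * rng.standard_normal()

# Kalman filter: alternate a time update (predict) with a
# measurement update (correct) driven by the innovation.
x_hat, P = 0.0, 1.0        # initial state estimate and variance
estimates = []
for k in range(steps):
    x_pred = a * x_hat            # predict the state
    P_pred = a * P * a + q        # and its error variance
    K = P_pred / (P_pred + r)     # Kalman gain: prior vs. data weight
    x_hat = x_pred + K * (z[k] - x_pred)   # correct with innovation
    P = (1 - K) * P_pred
    estimates.append(x_hat)
```

At the optimum the innovation z[k] - x_pred is uncorrelated with past data, which is the orthogonality principle noted above in filter form.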