1 Introduction
1.1 Introduction
1.2 Topical Outline
1.3 Possible Approaches
1.4 Organization
2 Classic Detection Theory
2.1 Introduction
2.2 Simple Binary Hypothesis Tests
2.2.1 Decision Criteria
2.2.2 Performance: Receiver Operating Characteristic
2.3 M Hypotheses
2.4 Performance Bounds and Approximations
2.5 Monte Carlo Simulation
2.5.1 Monte Carlo Simulation Techniques
2.5.2 Importance Sampling
2.5.2.1 Simulation of P_F
2.5.2.2 Simulation of P_M
2.5.2.3 Independent Observations
2.5.2.4 Simulation of the ROC
2.5.2.5 Examples
2.5.2.6 Iterative Importance Sampling
2.5.3 Summary
2.6 Summary
2.7 Problems
3 General Gaussian Detection
3.1 Detection of Gaussian Random Vectors
3.1.1 Real Gaussian Random Vectors
3.1.2 Circular Complex Gaussian Random Vectors
3.1.3 General Gaussian Detection
3.1.3.1 Real Gaussian Vectors
3.1.3.2 Circular Complex Gaussian Vectors
3.2 Equal Covariance Matrices
3.2.1 Independent Components with Equal Variance
3.2.2 Independent Components with Unequal Variances
3.2.3 General Case: Eigendecomposition
3.2.4 Optimum Signal Design
3.2.5 Interference Matrix: Estimator-Subtractor
3.2.6 Low-Rank Models
3.2.7 Summary
3.3 Equal Mean Vectors
3.3.1 Diagonal Covariance Matrix on H0: Equal Variance
3.3.1.1 Independent, Identically Distributed Signal Components
3.3.1.2 Independent Signal Components: Unequal Variances
3.3.1.3 Correlated Signal Components
3.3.1.4 Low-Rank Signal Model
3.3.1.5 Symmetric Hypotheses, Uncorrelated Noise
3.3.2 Non-diagonal Covariance Matrix on H0
3.3.2.1 Signal on H1 Only
3.3.2.2 Signal on Both Hypotheses
3.3.3 Summary
3.4 General Gaussian
3.4.1 Real Gaussian Model
3.4.2 Circular Complex Gaussian Model
3.4.3 Single Quadratic Form
3.4.4 Summary
3.5 M Hypotheses
3.6 Summary
3.7 Problems
4 Classical Parameter Estimation
4.1 Introduction
4.2 Scalar Parameter Estimation
4.2.1 Random Parameters: Bayes Estimation
4.2.2 Nonrandom Parameter Estimation
4.2.3 Bayesian Bounds
4.2.3.1 Lower Bound on the MSE
4.2.3.2 Asymptotic Behavior
4.2.4 Case Study
4.2.5 Exponential Family
4.2.5.1 Nonrandom Parameters
4.2.5.2 Random Parameters
4.2.6 Summary of Scalar Parameter Estimation
4.3 Multiple Parameter Estimation
4.3.1 Estimation Procedures
4.3.1.1 Random Parameter
4.3.1.2 Nonrandom Parameters
4.3.2 Measures of Error
4.3.2.1 Nonrandom Parameters
4.3.2.2 Random Parameters
4.3.3 Bounds on Estimation Error
4.3.3.1 Nonrandom Parameters
4.3.3.2 Random Parameters
4.3.4 Exponential Family
4.3.4.1 Nonrandom Parameters
4.3.4.2 Random Parameters
4.3.5 Nuisance Parameters
4.3.5.1 Nonrandom Parameters
4.3.5.2 Random Parameters
4.3.5.3 Hybrid Parameters
4.3.6 Hybrid Parameters
4.3.6.1 Joint ML and MAP Estimation
4.3.6.2 Nuisance Parameters
4.3.7 Summary of Multiple Parameter Estimation
4.4 Global Bayesian Bounds
4.4.1 Covariance Inequality Bounds
4.4.1.1 Covariance Inequality
4.4.1.2 Bayesian Bounds
4.4.1.3 Scalar Parameters
4.4.1.4 Vector Parameters
4.4.1.5 Combined Bayesian Bounds
4.4.1.6 Functions of the Parameter Vector
4.4.1.7 Summary of Covariance Inequality Bounds
4.4.2 Method of Interval Estimation
4.4.3 Summary of Global Bayesian Bounds
4.5 Composite Hypotheses
4.5.1 Introduction
4.5.2 Random Parameters
4.5.3 Nonrandom Parameters
4.5.4 Simulation
4.5.5 Summary of Composite Hypotheses
4.6 Summary
4.7 Problems
5 General Gaussian Estimation
5.1 Introduction
5.2 Nonrandom Parameters
5.2.1 General Gaussian Estimation Model
5.2.2 Maximum Likelihood Estimation
5.2.3 Cramér-Rao Bound
5.2.4 Fisher Linear Gaussian Model
5.2.4.1 Introduction
5.2.4.2 White Noise
5.2.4.3 Low-Rank Interference
5.2.5 Separable Models for Mean Parameters
5.2.6 Covariance Matrix Parameters
5.2.6.1 White Noise
5.2.6.2 Colored Noise
5.2.6.3 Rank One Signal Matrix Plus White Noise
5.2.6.4 Rank One Signal Matrix Plus Colored Noise
5.2.7 Linear Gaussian Mean and Covariance Matrix Parameters
5.2.7.1 White Noise
5.2.7.2 Colored Noise
5.2.7.3 General Covariance Matrix
5.2.8 Computational Algorithms
5.2.8.1 Introduction
5.2.8.2 Gradient Techniques
5.2.8.3 Alternating Projection Algorithm
5.2.8.4 Expectation Maximization Algorithm
5.2.8.5 Summary
5.2.9 Equivalent Estimation Algorithms
5.2.9.1 Least Squares
5.2.9.2 Minimum Variance Distortionless Response
5.2.9.3 Summary
5.2.10 Sensitivity, Mismatch, and Diagonal Loading
5.2.10.1 Sensitivity and Array Perturbations
5.2.10.2 Diagonal Loading
5.2.11 Summary
5.3 Random Parameters
5.3.1 Model, MAP Estimation, and the BCRB
5.3.2 Bayesian Linear Gaussian Model
5.3.3 Summary
5.4 Sequential Estimation
5.4.1 Sequential Bayes Estimation
5.4.2 Recursive Maximum Likelihood
5.4.3 Summary
5.5 Summary
5.6 Problems
6 Representation of Random Processes
6.1 Introduction
6.2 Orthonormal Expansions: Deterministic Signals
6.3 Random Process Characterization
6.3.1 Random Processes: Conventional Characterizations
6.3.2 Series Representation of Sample Functions of Random Processes
6.3.3 Gaussian Processes
6.4 Homogeneous Integral Equations and Eigenfunctions
6.4.1 Rational Spectra
6.4.2 Bandlimited Spectra
6.4.3 Nonstationary Processes
6.4.4 White Noise Processes
6.4.5 Low-Rank Kernels
6.4.6 The Optimum Linear Filter
6.4.7 Properties of Eigenfunctions and Eigenvalues
6.4.7.1 Monotonic property
6.4.7.2 Asymptotic behavior properties
6.5 Vector Random Processes
6.6 Summary
6.7 Problems
7 Detection of Signals – Estimation of Signal Parameters
7.1 Introduction
7.1.1 Models
7.1.1.1 Detection
7.1.1.2 Estimation
7.1.2 Format
7.2 Detection and Estimation in White Gaussian Noise
7.2.1 Detection of Signals in Additive White Gaussian Noise
7.2.1.1 Simple binary detection
7.2.1.2 General binary detection in white Gaussian noise
7.2.1.3 M-ary detection in white Gaussian noise
7.2.1.4 Sensitivity
7.2.2 Linear Estimation
7.2.3 Nonlinear Estimation
7.2.4 Summary of Known Signals in White Gaussian Noise
7.2.4.1 Detection
7.2.4.2 Estimation
7.3 Detection and Estimation in Nonwhite Gaussian Noise
7.3.1 “Whitening” Approach
7.3.1.1 Structures
7.3.1.2 Construction of Q_n(t, u) and g(t)
7.3.1.3 Summary
7.3.2 A Direct Derivation Using the Karhunen-Loève Expansion
7.3.3 A Direct Derivation with a Sufficient Statistic
7.3.4 Detection Performance
7.3.4.1 Performance: Simple binary detection problem
7.3.4.2 Optimum signal design: Coincident intervals
7.3.4.3 Singularity
7.3.4.4 General binary receivers
7.3.5 Estimation
7.3.6 Solution Techniques for Integral Equations
7.3.6.1 Infinite observation interval: Stationary noise
7.3.6.2 Finite observation interval: Rational spectra
7.3.6.3 Finite observation time: Separable kernels
7.3.7 Sensitivity, Mismatch, and Diagonal Loading
7.3.7.2 Mismatch and diagonal loading
7.3.8 Known Linear Channels
7.3.8.1 Summary
7.4 Signals with Unwanted Parameters: The Composite Hypothesis Problem
7.4.1 Random Phase Angles
7.4.2 Random Amplitude and Phase
7.4.3 Other Target Models
7.4.4 Nonrandom Parameters
7.4.4.1 Summary
7.5 Multiple Channels
7.5.1 Vector Karhunen-Loève
7.5.1.1 Application
7.6 Multiple Parameter Estimation
7.6.1 Known Signal in Additive White Gaussian Noise
7.6.2 Separable Models
7.6.3 Summary
7.7 Summary
7.8 Problems
8 Estimation of Continuous-Time Random Processes
8.1 Optimum Linear Processes
8.2 Realizable Linear Filters: Stationary Processes, Infinite Past: Wiener Filters
8.2.1 Solution of Wiener-Hopf Equation
8.2.2 Errors in Optimum Systems
8.2.3 Unrealizable Filters
8.2.4 Closed-Form Error Expressions
8.3 Gaussian-Markov Processes: Kalman Filter
8.3.1 Differential Equation Representation of Linear Systems and Random Process Generation
8.3.2 Kalman Filter
8.3.3 Realizable Whitening Filter
8.3.4 Generalizations
8.3.5 Implementation Issues
8.4 Bayesian Estimation of Non-Gaussian Models
8.4.1 The Extended Kalman Filter
8.4.1.1 Linear AWGN process and observations
8.4.1.2 Linear AWGN process, nonlinear AWGN observations
8.4.1.3 Nonlinear AWGN process and observations
8.4.1.4 Nonlinear process and observations
8.4.2 Bayesian Cramér-Rao Bounds: Continuous-Time
8.4.3 Summary
8.5 Summary
8.6 Problems
9 Estimation of Discrete-Time Random Processes
9.1 Introduction
9.2 Discrete-Time Wiener Filtering
9.2.1 Model
9.2.2 Random Process Models
9.2.3 Optimum FIR Filters
9.2.4 Unrealizable IIR Wiener Filters
9.2.5 Realizable IIR Wiener Filters
9.2.6 Summary: Discrete-Time Wiener Filter
9.3 Discrete-Time Kalman Filter
9.3.1 Random Process Models
9.3.2 Kalman Filter
9.3.2.1 Derivation
9.3.2.2 Reduced Dimension Implementations
9.3.2.3 Applications
9.3.2.4 Estimation in Non-white Noise
9.3.2.5 Sequential Processing of Estimators
9.3.2.6 Square-root Filters
9.3.2.7 Divergence
9.3.2.8 Sensitivity and Model Mismatch
9.3.2.9 Summary: Kalman Filters
9.3.3 Kalman Predictors
9.3.3.1 Fixed-lead prediction
9.3.3.2 Fixed-point prediction
9.3.3.3 Fixed-Interval Prediction
9.3.3.4 Summary: Kalman Predictors
9.3.4 Kalman Smoothing
9.3.4.1 Fixed Interval Smoothing
9.3.4.2 Fixed Lag Smoothing
9.3.4.3 Summary: Kalman Smoothing
9.3.5 Bayesian Estimation of Nonlinear Models
9.3.5.1 General Nonlinear Model: MMSE and MAP Estimation
9.3.5.2 Extended Kalman Filter
9.3.5.3 Recursive Bayesian Cramér-Rao Bounds
9.3.5.4 Applications
9.3.5.5 Joint State and Parameter Estimation
9.3.5.6 Continuous-Time Processes and Discrete-Time Observations
9.3.5.7 Summary
9.3.6 Summary: Kalman Filters
9.4 Summary
9.5 Problems
10 Detection of Gaussian Signals
10.1 Introduction
10.2 Detection of Continuous-time Gaussian Processes
10.2.1 Sampling
10.2.2 Optimum Continuous-Time Receivers
10.2.3 Performance of Optimum Receivers
10.2.4 State-Variable Realization
10.2.5 Stationary Process, Long Observation Time (SPLOT) Receiver
10.2.6 Low-Rank Kernels
10.2.7 Summary
10.3 Detection of Discrete-Time Gaussian Processes
10.3.1 Second Moment Characterization
10.3.1.1 Known means and covariance matrices
10.3.1.2 Means and covariance matrices with unknown parameters
10.3.2 State Variable Characterization
10.3.3 Summary
10.4 Summary
10.5 Problems
11 Epilogue
11.1 Classical Detection and Estimation Theory
11.1.1 Classical Detection Theory
11.1.2 General Gaussian Detection
11.1.3 Classical Parameter Estimation
11.1.4 General Gaussian Estimation
11.2 Representation of Random Processes
11.3 Detection of Signals and Estimation of Signal Parameters
11.4 Linear Estimation of Random Processes
11.5 Observations
11.5.1 Models and Mismatch
11.5.2 Bayes vis-à-vis Fisher
11.5.3 Bayesian and Fisher Bounds
11.5.4 Eigenspace
11.5.5 Whitening
11.5.6 The Gaussian Model
11.6 Conclusion