Contents
Preface xv
Acknowledgments xvii

Chapter 1 Probability Concepts 1
1.1 Introduction 1
1.2 Sets and Probability 1
1.2.1 Basic Definitions 1
1.2.2 Venn Diagrams and Some Laws 3
1.2.3 Basic Notions of Probability 6
1.2.4 Some Methods of Counting 8
1.2.5 Properties, Conditional Probability, and Bayes' Rule 12
1.3 Random Variables 17
1.3.1 Step and Impulse Functions 17
1.3.2 Discrete Random Variables 18
1.3.3 Continuous Random Variables 20
1.3.4 Mixed Random Variables 22
1.4 Moments 23
1.4.1 Expectations 23
1.4.2 Moment Generating Function and Characteristic Function 26
1.4.3 Upper Bounds on Probabilities and Law of Large Numbers 29
1.5 Two- and Higher-Dimensional Random Variables 31
1.5.1 Conditional Distributions 33
1.5.2 Expectations and Correlations 41
1.5.3 Joint Characteristic Functions 44
1.6 Transformation of Random Variables 48
1.6.1 Functions of One Random Variable 49
1.6.2 Functions of Two Random Variables 52
1.6.3 Two Functions of Two Random Variables 59
1.7 Summary 65
Problems 65
Reference 73
Selected Bibliography 73

Chapter 2 Distributions 75
2.1 Introduction 75
2.2 Discrete Random Variables 75
2.2.1 The Bernoulli, Binomial, and Multinomial Distributions 75
2.2.2 The Geometric and Pascal Distributions 78
2.2.3 The Hypergeometric Distribution 82
2.2.4 The Poisson Distribution 85
2.3 Continuous Random Variables 88
2.3.1 The Uniform Distribution 88
2.3.2 The Normal Distribution 89
2.3.3 The Exponential and Laplace Distributions 96
2.3.4 The Gamma and Beta Distributions 98
2.3.5 The Chi-Square Distribution 101
2.3.6 The Rayleigh, Rice, and Maxwell Distributions 106
2.3.7 The Nakagami m-Distribution 115
2.3.8 The Student's t- and F-Distributions 115
2.3.9 The Cauchy Distribution 120
2.4 Some Special Distributions 121
2.4.1 The Bivariate and Multivariate Gaussian Distributions 121
2.4.2 The Weibull Distribution 129
2.4.3 The Log-Normal Distribution 131
2.4.4 The K-Distribution 132
2.4.5 The Generalized Compound Distribution 135
2.5 Summary 136
Problems 137
Reference 139
Selected Bibliography 139

Chapter 3 Random Processes 141
3.1 Introduction and Definitions 141
3.2 Expectations 145
3.3 Properties of Correlation Functions 153
3.3.1 Autocorrelation Function 153
3.3.2 Cross-Correlation Function 153
3.3.3 Wide-Sense Stationary 154
3.4 Some Random Processes 156
3.4.1 A Single Pulse of Known Shape but Random Amplitude and Arrival Time 156
3.4.2 Multiple Pulses 157
3.4.3 Periodic Random Processes 158
3.4.4 The Gaussian Process 161
3.4.5 The Poisson Process 163
3.4.6 The Bernoulli and Binomial Processes 166
3.4.7 The Random Walk and Wiener Processes 168
3.4.8 The Markov Process 172
3.5 Power Spectral Density 174
3.6 Linear Time-Invariant Systems 178
3.6.1 Stochastic Signals 179
3.6.2 Systems with Multiple Terminals 185
3.7 Ergodicity 186
3.7.1 Ergodicity in the Mean 186
3.7.2 Ergodicity in the Autocorrelation 187
3.7.3 Ergodicity of the First-Order Distribution 188
3.7.4 Ergodicity of Power Spectral Density 188
3.8 Sampling Theorem 189
3.9 Continuity, Differentiation, and Integration 194
3.9.1 Continuity 194
3.9.2 Differentiation 196
3.9.3 Integrals 199
3.10 Hilbert Transform and Analytic Signals 201
3.11 Thermal Noise 205
3.12 Summary 211
Problems 212
Selected Bibliography 221

Chapter 4 Discrete-Time Random Processes 223
4.1 Introduction 223
4.2 Matrix and Linear Algebra 224
4.2.1 Algebraic Matrix Operations 224
4.2.2 Matrices with Special Forms 232
4.2.3 Eigenvalues and Eigenvectors 236
4.3 Definitions 245
4.4 AR, MA, and ARMA Random Processes 253
4.4.1 AR Processes 254
4.4.2 MA Processes 262
4.4.3 ARMA Processes 264
4.5 Markov Chains 266
4.5.1 Discrete-Time Markov Chains 267
4.5.2 Continuous-Time Markov Chains 276
4.6 Summary 284
Problems 284
References 287
Selected Bibliography 288

Chapter 5 Statistical Decision Theory 289
5.1 Introduction 289
5.2 Bayes' Criterion 291
5.2.1 Binary Hypothesis Testing 291
5.2.2 M-ary Hypothesis Testing 303
5.3 Minimax Criterion 313
5.4 Neyman-Pearson Criterion 317
5.5 Composite Hypothesis Testing 326
5.5.1 Θ Random Variable 327
5.5.2 θ Nonrandom and Unknown 329
5.6 Sequential Detection 332
5.7 Summary 337
Problems 338
Selected Bibliography 343

Chapter 6 Parameter Estimation 345
6.1 Introduction 345
6.2 Maximum Likelihood Estimation 346
6.3 Generalized Likelihood Ratio Test 348
6.4 Some Criteria for Good Estimators 353
6.5 Bayes' Estimation 355
6.5.1 Minimum Mean-Square Error Estimate 357
6.5.2 Minimum Mean Absolute Value of Error Estimate 358
6.5.3 Maximum A Posteriori Estimate 359
6.6 Cramer-Rao Inequality 364
6.7 Multiple Parameter Estimation 371
6.7.1 θ Nonrandom 371
6.7.2 θ Random Vector 376
6.8 Best Linear Unbiased Estimator 378
6.8.1 One Parameter Linear Mean-Square Estimation 379
6.8.2 θ Random Vector 381
6.8.3 BLUE in White Gaussian Noise 383
6.9 Least-Square Estimation 388
6.10 Recursive Least-Square Estimator 391
6.11 Summary 393
Problems 394
References 398
Selected Bibliography 398

Chapter 7 Filtering 399
7.1 Introduction 399
7.2 Linear Transformation and Orthogonality Principle 400
7.3 Wiener Filters 409
7.3.1 The Optimum Unrealizable Filter 410
7.3.2 The Optimum Realizable Filter 416
7.4 Discrete Wiener Filters 424
7.4.1 Unrealizable Filter 425
7.4.2 Realizable Filter 426
7.5 Kalman Filter 436
7.5.1 Innovations 437
7.5.2 Prediction and Filtering 440
7.6 Summary 445
Problems 445
References 448
Selected Bibliography 448

Chapter 8 Representation of Signals 449
8.1 Introduction 449
8.2 Orthogonal Functions 449
8.2.1 Generalized Fourier Series 451
8.2.2 Gram-Schmidt Orthogonalization Procedure 455
8.2.3 Geometric Representation 458
8.2.4 Fourier Series 463
8.3 Linear Differential Operators and Integral Equations 466
8.3.1 Green's Function 470
8.3.2 Integral Equations 471
8.3.3 Matrix Analogy 479
8.4 Representation of Random Processes 480
8.4.1 The Gaussian Process 483
8.4.2 Rational Power Spectral Densities 487
8.4.3 The Wiener Process 492
8.4.4 The White Noise Process 493
8.5 Summary 495
Problems 496
References 500
Selected Bibliography 500

Chapter 9 The General Gaussian Problem 503
9.1 Introduction 503
9.2 Binary Detection 503
9.3 Same Covariance 505
9.3.1 Diagonal Covariance Matrix 508
9.3.2 Nondiagonal Covariance Matrix 511
9.4 Same Mean 518
9.4.1 Uncorrelated Signal Components and Equal Variances 519
9.4.2 Uncorrelated Signal Components and Unequal Variances 522
9.5 Same Mean and Symmetric Hypotheses 524
9.5.1 Uncorrelated Signal Components and Equal Variances 526
9.5.2 Uncorrelated Signal Components and Unequal Variances 528
9.6 Summary 529
Problems 530
Reference 532
Selected Bibliography 532

Chapter 10 Detection and Parameter Estimation 533
10.1 Introduction 533
10.2 Binary Detection 534
10.2.1 Simple Binary Detection 534
10.2.2 General Binary Detection 543
10.3 M-ary Detection 556
10.3.1 Correlation Receiver 557
10.3.2 Matched Filter Receiver 567
10.4 Linear Estimation 572
10.4.1 ML Estimation 573
10.4.2 MAP Estimation 575
10.5 Nonlinear Estimation 576
10.5.1 ML Estimation 576
10.5.2 MAP Estimation 579
10.6 General Binary Detection with Unwanted Parameters 580
10.6.1 Signals with Random Phase 583
10.6.2 Signals with Random Phase and Amplitude 595
10.6.3 Signals with Random Parameters 598
10.7 Binary Detection in Colored Noise 606
10.7.1 Karhunen-Loève Expansion Approach 607
10.7.2 Whitening Approach 611
10.7.3 Detection Performance 615
10.8 Summary 617
Problems 618
Reference 626
Selected Bibliography 626

Chapter 11 Adaptive Thresholding CFAR Detection 627
11.1 Introduction 627
11.2 Radar Elementary Concepts 629
11.2.1 Range, Range Resolution, and Unambiguous Range 631
11.2.2 Doppler Shift 633
11.3 Principles of Adaptive CFAR Detection 634
11.3.1 Target Models 640
11.3.2 Review of Some CFAR Detectors 642
11.4 Adaptive Thresholding in Code Acquisition of Direct-Sequence Spread Spectrum Signals 648
11.4.1 Pseudonoise or Direct Sequences 649
11.4.2 Direct-Sequence Spread Spectrum Modulation 652
11.4.3 Frequency-Hopped Spread Spectrum Modulation 655
11.4.4 Synchronization of Spread Spectrum Systems 655
11.4.5 Adaptive Thresholding with False Alarm Constraint 659
11.5 Summary 660
References 661

Chapter 12 Distributed CFAR Detection 665
12.1 Introduction 665
12.2 Distributed CA-CFAR Detection 666
12.3 Further Results 670
12.4 Summary 671
References 672

Appendix 675
About the Author 683