Contents
Preface xv
Acknowledgments xvii
Chapter 1 Probability Concepts 1
1.1 Introduction 1
1.2 Sets and Probability 1
1.2.1 Basic Definitions 1
1.2.2 Venn Diagrams and Some Laws 3
1.2.3 Basic Notions of Probability 6
1.2.4 Some Methods of Counting 8
1.2.5 Properties, Conditional Probability, and Bayes’ Rule 12
1.3 Random Variables 17
1.3.1 Step and Impulse Functions 17
1.3.2 Discrete Random Variables 18
1.3.3 Continuous Random Variables 20
1.3.4 Mixed Random Variables 22
1.4 Moments 23
1.4.1 Expectations 23
1.4.2 Moment Generating Function and Characteristic Function 26
1.4.3 Upper Bounds on Probabilities and Law of Large Numbers 29
1.5 Two- and Higher-Dimensional Random Variables 31
1.5.1 Conditional Distributions 33
1.5.2 Expectations and Correlations 41
1.5.3 Joint Characteristic Functions 44
1.6 Transformation of Random Variables 48
1.6.1 Functions of One Random Variable 49
1.6.2 Functions of Two Random Variables 52
1.6.3 Two Functions of Two Random Variables 59
1.7 Summary 65
Problems 65
Reference 73
Selected Bibliography 73
Chapter 2 Distributions 75
2.1 Introduction 75
2.2 Discrete Random Variables 75
2.2.1 The Bernoulli, Binomial, and Multinomial Distributions 75
2.2.2 The Geometric and Pascal Distributions 78
2.2.3 The Hypergeometric Distribution 82
2.2.4 The Poisson Distribution 85
2.3 Continuous Random Variables 88
2.3.1 The Uniform Distribution 88
2.3.2 The Normal Distribution 89
2.3.3 The Exponential and Laplace Distributions 96
2.3.4 The Gamma and Beta Distributions 98
2.3.5 The Chi-Square Distribution 101
2.3.6 The Rayleigh, Rice, and Maxwell Distributions 106
2.3.7 The Nakagami m-Distribution 115
2.3.8 The Student’s t- and F-Distributions 115
2.3.9 The Cauchy Distribution 120
2.4 Some Special Distributions 121
2.4.1 The Bivariate and Multivariate Gaussian Distributions 121
2.4.2 The Weibull Distribution 129
2.4.3 The Log-Normal Distribution 131
2.4.4 The K-Distribution 132
2.4.5 The Generalized Compound Distribution 135
2.5 Summary 136
Problems 137
Reference 139
Selected Bibliography 139
Chapter 3 Random Processes 141
3.1 Introduction and Definitions 141
3.2 Expectations 145
3.3 Properties of Correlation Functions 153
3.3.1 Autocorrelation Function 153
3.3.2 Cross-Correlation Function 153
3.3.3 Wide-Sense Stationary 154
3.4 Some Random Processes 156
3.4.1 A Single Pulse of Known Shape but Random Amplitude and Arrival Time 156
3.4.2 Multiple Pulses 157
3.4.3 Periodic Random Processes 158
3.4.4 The Gaussian Process 161
3.4.5 The Poisson Process 163
3.4.6 The Bernoulli and Binomial Processes 166
3.4.7 The Random Walk and Wiener Processes 168
3.4.8 The Markov Process 172
3.5 Power Spectral Density 174
3.6 Linear Time-Invariant Systems 178
3.6.1 Stochastic Signals 179
3.6.2 Systems with Multiple Terminals 185
3.7 Ergodicity 186
3.7.1 Ergodicity in the Mean 186
3.7.2 Ergodicity in the Autocorrelation 187
3.7.3 Ergodicity of the First-Order Distribution 188
3.7.4 Ergodicity of Power Spectral Density 188
3.8 Sampling Theorem 189
3.9 Continuity, Differentiation, and Integration 194
3.9.1 Continuity 194
3.9.2 Differentiation 196
3.9.3 Integrals 199
3.10 Hilbert Transform and Analytic Signals 201
3.11 Thermal Noise 205
3.12 Summary 211
Problems 212
Selected Bibliography 221
Chapter 4 Discrete-Time Random Processes 223
4.1 Introduction 223
4.2 Matrix and Linear Algebra 224
4.2.1 Algebraic Matrix Operations 224
4.2.2 Matrices with Special Forms 232
4.2.3 Eigenvalues and Eigenvectors 236
4.3 Definitions 245
4.4 AR, MA, and ARMA Random Processes 253
4.4.1 AR Processes 254
4.4.2 MA Processes 262
4.4.3 ARMA Processes 264
4.5 Markov Chains 266
4.5.1 Discrete-Time Markov Chains 267
4.5.2 Continuous-Time Markov Chains 276
4.6 Summary 284
Problems 284
References 287
Selected Bibliography 288
Chapter 5 Statistical Decision Theory 289
5.1 Introduction 289
5.2 Bayes’ Criterion 291
5.2.1 Binary Hypothesis Testing 291
5.2.2 M-ary Hypothesis Testing 303
5.3 Minimax Criterion 313
5.4 Neyman-Pearson Criterion 317
5.5 Composite Hypothesis Testing 326
5.5.1 Θ Random Variable 327
5.5.2 Θ Nonrandom and Unknown 329
5.6 Sequential Detection 332
5.7 Summary 337
Problems 338
Selected Bibliography 343
Chapter 6 Parameter Estimation 345
6.1 Introduction 345
6.2 Maximum Likelihood Estimation 346
6.3 Generalized Likelihood Ratio Test 348
6.4 Some Criteria for Good Estimators 353
6.5 Bayes’ Estimation 355
6.5.1 Minimum Mean-Square Error Estimate 357
6.5.2 Minimum Mean Absolute Value of Error Estimate 358
6.5.3 Maximum A Posteriori Estimate 359
6.6 Cramer-Rao Inequality 364
6.7 Multiple Parameter Estimation 371
6.7.1 Θ Nonrandom 371
6.7.2 Θ Random Vector 376
6.8 Best Linear Unbiased Estimator 378
6.8.1 One Parameter Linear Mean-Square Estimation 379
6.8.2 Θ Random Vector 381
6.8.3 BLUE in White Gaussian Noise 383
6.9 Least-Square Estimation 388
6.10 Recursive Least-Square Estimator 391
6.11 Summary 393
Problems 394
References 398
Selected Bibliography 398
Chapter 7 Filtering 399
7.1 Introduction 399
7.2 Linear Transformation and Orthogonality Principle 400
7.3 Wiener Filters 409
7.3.1 The Optimum Unrealizable Filter 410
7.3.2 The Optimum Realizable Filter 416
7.4 Discrete Wiener Filters 424
7.4.1 Unrealizable Filter 425
7.4.2 Realizable Filter 426
7.5 Kalman Filter 436
7.5.1 Innovations 437
7.5.2 Prediction and Filtering 440
7.6 Summary 445
Problems 445
References 448
Selected Bibliography 448
Chapter 8 Representation of Signals 449
8.1 Introduction 449
8.2 Orthogonal Functions 449
8.2.1 Generalized Fourier Series 451
8.2.2 Gram-Schmidt Orthogonalization Procedure 455
8.2.3 Geometric Representation 458
8.2.4 Fourier Series 463
8.3 Linear Differential Operators and Integral Equations 466
8.3.1 Green’s Function 470
8.3.2 Integral Equations 471
8.3.3 Matrix Analogy 479
8.4 Representation of Random Processes 480
8.4.1 The Gaussian Process 483
8.4.2 Rational Power Spectral Densities 487
8.4.3 The Wiener Process 492
8.4.4 The White Noise Process 493
8.5 Summary 495
Problems 496
References 500
Selected Bibliography 500
Chapter 9 The General Gaussian Problem 503
9.1 Introduction 503
9.2 Binary Detection 503
9.3 Same Covariance 505
9.3.1 Diagonal Covariance Matrix 508
9.3.2 Nondiagonal Covariance Matrix 511
9.4 Same Mean 518
9.4.1 Uncorrelated Signal Components and Equal Variances 519
9.4.2 Uncorrelated Signal Components and Unequal Variances 522
9.5 Same Mean and Symmetric Hypotheses 524
9.5.1 Uncorrelated Signal Components and Equal Variances 526
9.5.2 Uncorrelated Signal Components and Unequal Variances 528
9.6 Summary 529
Problems 530
Reference 532
Selected Bibliography 532
Chapter 10 Detection and Parameter Estimation 533
10.1 Introduction 533
10.2 Binary Detection 534
10.2.1 Simple Binary Detection 534
10.2.2 General Binary Detection 543
10.3 M-ary Detection 556
10.3.1 Correlation Receiver 557
10.3.2 Matched Filter Receiver 567
10.4 Linear Estimation 572
10.4.1 ML Estimation 573
10.4.2 MAP Estimation 575
10.5 Nonlinear Estimation 576
10.5.1 ML Estimation 576
10.5.2 MAP Estimation 579
10.6 General Binary Detection with Unwanted Parameters 580
10.6.1 Signals with Random Phase 583
10.6.2 Signals with Random Phase and Amplitude 595
10.6.3 Signals with Random Parameters 598
10.7 Binary Detection in Colored Noise 606
10.7.1 Karhunen-Loève Expansion Approach 607
10.7.2 Whitening Approach 611
10.7.3 Detection Performance 615
10.8 Summary 617
Problems 618
Reference 626
Selected Bibliography 626
Chapter 11 Adaptive Thresholding CFAR Detection 627
11.1 Introduction 627
11.2 Radar Elementary Concepts 629
11.2.1 Range, Range Resolution, and Unambiguous Range 631
11.2.2 Doppler Shift 633
11.3 Principles of Adaptive CFAR Detection 634
11.3.1 Target Models 640
11.3.2 Review of Some CFAR Detectors 642
11.4 Adaptive Thresholding in Code Acquisition of Direct-Sequence Spread Spectrum Signals 648
11.4.1 Pseudonoise or Direct Sequences 649
11.4.2 Direct-Sequence Spread Spectrum Modulation 652
11.4.3 Frequency-Hopped Spread Spectrum Modulation 655
11.4.4 Synchronization of Spread Spectrum Systems 655
11.4.5 Adaptive Thresholding with False Alarm Constraint 659
11.5 Summary 660
References 661
Chapter 12 Distributed CFAR Detection 665
12.1 Introduction 665
12.2 Distributed CA-CFAR Detection 666
12.3 Further Results 670
12.4 Summary 671
References 672
Appendix 675
About the Author 683
Index 685