Contents

Preface xv
Acknowledgments xvii

Chapter 1 Probability Concepts 1
1.1 Introduction 1
1.2 Sets and Probability 1
1.2.1 Basic Definitions 1
1.2.2 Venn Diagrams and Some Laws 3
1.2.3 Basic Notions of Probability 6
1.2.4 Some Methods of Counting 8
1.2.5 Properties, Conditional Probability, and Bayes’ Rule 12
1.3 Random Variables 17
1.3.1 Step and Impulse Functions 17
1.3.2 Discrete Random Variables 18
1.3.3 Continuous Random Variables 20
1.3.4 Mixed Random Variables 22
1.4 Moments 23
1.4.1 Expectations 23
1.4.2 Moment Generating Function and Characteristic Function 26
1.4.3 Upper Bounds on Probabilities and Law of Large Numbers 29
1.5 Two- and Higher-Dimensional Random Variables 31
1.5.1 Conditional Distributions 33
1.5.2 Expectations and Correlations 41
1.5.3 Joint Characteristic Functions 44
1.6 Transformation of Random Variables 48
1.6.1 Functions of One Random Variable 49
1.6.2 Functions of Two Random Variables 52
1.6.3 Two Functions of Two Random Variables 59
1.7 Summary 65
Problems 65
Reference 73
Selected Bibliography 73

Chapter 2 Distributions 75
2.1 Introduction 75
2.2 Discrete Random Variables 75
2.2.1 The Bernoulli, Binomial, and Multinomial Distributions 75
2.2.2 The Geometric and Pascal Distributions 78
2.2.3 The Hypergeometric Distribution 82
2.2.4 The Poisson Distribution 85
2.3 Continuous Random Variables 88
2.3.1 The Uniform Distribution 88
2.3.2 The Normal Distribution 89
2.3.3 The Exponential and Laplace Distributions 96
2.3.4 The Gamma and Beta Distributions 98
2.3.5 The Chi-Square Distribution 101
2.3.6 The Rayleigh, Rice, and Maxwell Distributions 106
2.3.7 The Nakagami m-Distribution 115
2.3.8 The Student’s t- and F-Distributions 115
2.3.9 The Cauchy Distribution 120
2.4 Some Special Distributions 121
2.4.1 The Bivariate and Multivariate Gaussian Distributions 121
2.4.2 The Weibull Distribution 129
2.4.3 The Log-Normal Distribution 131
2.4.4 The K-Distribution 132
2.4.5 The Generalized Compound Distribution 135
2.5 Summary 136
Problems 137
Reference 139
Selected Bibliography 139

Chapter 3 Random Processes 141
3.1 Introduction and Definitions 141
3.2 Expectations 145
3.3 Properties of Correlation Functions 153
3.3.1 Autocorrelation Function 153
3.3.2 Cross-Correlation Function 153
3.3.3 Wide-Sense Stationary 154
3.4 Some Random Processes 156
3.4.1 A Single Pulse of Known Shape but Random Amplitude and Arrival Time 156
3.4.2 Multiple Pulses 157
3.4.3 Periodic Random Processes 158
3.4.4 The Gaussian Process 161
3.4.5 The Poisson Process 163
3.4.6 The Bernoulli and Binomial Processes 166
3.4.7 The Random Walk and Wiener Processes 168
3.4.8 The Markov Process 172
3.5 Power Spectral Density 174
3.6 Linear Time-Invariant Systems 178
3.6.1 Stochastic Signals 179
3.6.2 Systems with Multiple Terminals 185
3.7 Ergodicity 186
3.7.1 Ergodicity in the Mean 186
3.7.2 Ergodicity in the Autocorrelation 187
3.7.3 Ergodicity of the First-Order Distribution 188
3.7.4 Ergodicity of Power Spectral Density 188
3.8 Sampling Theorem 189
3.9 Continuity, Differentiation, and Integration 194
3.9.1 Continuity 194
3.9.2 Differentiation 196
3.9.3 Integrals 199
3.10 Hilbert Transform and Analytic Signals 201
3.11 Thermal Noise 205
3.12 Summary 211
Problems 212
Selected Bibliography 221

Chapter 4 Discrete-Time Random Processes 223
4.1 Introduction 223
4.2 Matrix and Linear Algebra 224
4.2.1 Algebraic Matrix Operations 224
4.2.2 Matrices with Special Forms 232
4.2.3 Eigenvalues and Eigenvectors 236
4.3 Definitions 245
4.4 AR, MA, and ARMA Random Processes 253
4.4.1 AR Processes 254
4.4.2 MA Processes 262
4.4.3 ARMA Processes 264
4.5 Markov Chains 266
4.5.1 Discrete-Time Markov Chains 267
4.5.2 Continuous-Time Markov Chains 276
4.6 Summary 284
Problems 284
References 287
Selected Bibliography 288

Chapter 5 Statistical Decision Theory 289
5.1 Introduction 289
5.2 Bayes’ Criterion 291
5.2.1 Binary Hypothesis Testing 291
5.2.2 M-ary Hypothesis Testing 303
5.3 Minimax Criterion 313
5.4 Neyman-Pearson Criterion 317
5.5 Composite Hypothesis Testing 326
5.5.1 Θ Random Variable 327
5.5.2 θ Nonrandom and Unknown 329
5.6 Sequential Detection 332
5.7 Summary 337
Problems 338
Selected Bibliography 343

Chapter 6 Parameter Estimation 345
6.1 Introduction 345
6.2 Maximum Likelihood Estimation 346
6.3 Generalized Likelihood Ratio Test 348
6.4 Some Criteria for Good Estimators 353
6.5 Bayes’ Estimation 355
6.5.1 Minimum Mean-Square Error Estimate 357
6.5.2 Minimum Mean Absolute Value of Error Estimate 358
6.5.3 Maximum A Posteriori Estimate 359
6.6 Cramer-Rao Inequality 364
6.7 Multiple Parameter Estimation 371
6.7.1 θ Nonrandom 371
6.7.2 θ Random Vector 376
6.8 Best Linear Unbiased Estimator 378
6.8.1 One Parameter Linear Mean-Square Estimation 379
6.8.2 θ Random Vector 381
6.8.3 BLUE in White Gaussian Noise 383
6.9 Least-Square Estimation 388
6.10 Recursive Least-Square Estimator 391
6.11 Summary 393
Problems 394
References 398
Selected Bibliography 398

Chapter 7 Filtering 399
7.1 Introduction 399
7.2 Linear Transformation and Orthogonality Principle 400
7.3 Wiener Filters 409
7.3.1 The Optimum Unrealizable Filter 410
7.3.2 The Optimum Realizable Filter 416
7.4 Discrete Wiener Filters 424
7.4.1 Unrealizable Filter 425
7.4.2 Realizable Filter 426
7.5 Kalman Filter 436
7.5.1 Innovations 437
7.5.2 Prediction and Filtering 440
7.6 Summary 445
Problems 445
References 448
Selected Bibliography 448

Chapter 8 Representation of Signals 449
8.1 Introduction 449
8.2 Orthogonal Functions 449
8.2.1 Generalized Fourier Series 451
8.2.2 Gram-Schmidt Orthogonalization Procedure 455
8.2.3 Geometric Representation 458
8.2.4 Fourier Series 463
8.3 Linear Differential Operators and Integral Equations 466
8.3.1 Green’s Function 470
8.3.2 Integral Equations 471
8.3.3 Matrix Analogy 479
8.4 Representation of Random Processes 480
8.4.1 The Gaussian Process 483
8.4.2 Rational Power Spectral Densities 487
8.4.3 The Wiener Process 492
8.4.4 The White Noise Process 493
8.5 Summary 495
Problems 496
References 500
Selected Bibliography 500

Chapter 9 The General Gaussian Problem 503
9.1 Introduction 503
9.2 Binary Detection 503
9.3 Same Covariance 505
9.3.1 Diagonal Covariance Matrix 508
9.3.2 Nondiagonal Covariance Matrix 511
9.4 Same Mean 518
9.4.1 Uncorrelated Signal Components and Equal Variances 519
9.4.2 Uncorrelated Signal Components and Unequal Variances 522
9.5 Same Mean and Symmetric Hypotheses 524
9.5.1 Uncorrelated Signal Components and Equal Variances 526
9.5.2 Uncorrelated Signal Components and Unequal Variances 528
9.6 Summary 529
Problems 530
Reference 532
Selected Bibliography 532

Chapter 10 Detection and Parameter Estimation 533
10.1 Introduction 533
10.2 Binary Detection 534
10.2.1 Simple Binary Detection 534
10.2.2 General Binary Detection 543
10.3 M-ary Detection 556
10.3.1 Correlation Receiver 557
10.3.2 Matched Filter Receiver 567
10.4 Linear Estimation 572
10.4.1 ML Estimation 573
10.4.2 MAP Estimation 575
10.5 Nonlinear Estimation 576
10.5.1 ML Estimation 576
10.5.2 MAP Estimation 579
10.6 General Binary Detection with Unwanted Parameters 580
10.6.1 Signals with Random Phase 583
10.6.2 Signals with Random Phase and Amplitude 595
10.6.3 Signals with Random Parameters 598
10.7 Binary Detection in Colored Noise 606
10.7.1 Karhunen-Loève Expansion Approach 607
10.7.2 Whitening Approach 611
10.7.3 Detection Performance 615
10.8 Summary 617
Problems 618
Reference 626
Selected Bibliography 626

Chapter 11 Adaptive Thresholding CFAR Detection 627
11.1 Introduction 627
11.2 Radar Elementary Concepts 629
11.2.1 Range, Range Resolution, and Unambiguous Range 631
11.2.2 Doppler Shift 633
11.3 Principles of Adaptive CFAR Detection 634
11.3.1 Target Models 640
11.3.2 Review of Some CFAR Detectors 642
11.4 Adaptive Thresholding in Code Acquisition of Direct-Sequence Spread Spectrum Signals 648
11.4.1 Pseudonoise or Direct Sequences 649
11.4.2 Direct-Sequence Spread Spectrum Modulation 652
11.4.3 Frequency-Hopped Spread Spectrum Modulation 655
11.4.4 Synchronization of Spread Spectrum Systems 655
11.4.5 Adaptive Thresholding with False Alarm Constraint 659
11.5 Summary 660
References 661

Chapter 12 Distributed CFAR Detection 665
12.1 Introduction 665
12.2 Distributed CA-CFAR Detection 666
12.3 Further Results 670
12.4 Summary 671
References 672

Appendix 675
About the Author 683
Index 685