A reprint of the textbook Information Theory by Jan C.A. van der Lubbe. The 1997 edition was published by Cambridge University Press; the reprint is from Delft Academic Press. A Solutions section at the end of each chapter gives worked answers to every exercise. The text is the original English.
PUBLISHED BY THE PRESS SYNDICATE OF THE UNIVERSITY OF CAMBRIDGE
The Pitt Building, Trumpington Street, Cambridge CB2 1RP, United Kingdom

CAMBRIDGE UNIVERSITY PRESS
The Edinburgh Building, Cambridge CB2 2RU, United Kingdom
40 West 20th Street, New York, NY 10011-4211, USA
10 Stamford Road, Oakleigh, Melbourne 3166, Australia

Originally published in Dutch as Informatietheorie by VSSD 1988
© English translation Cambridge University Press 1997

This book is in copyright. Subject to statutory exception and to the provisions of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University Press.

First published in English by Cambridge University Press 1997 as Information Theory

Printed in the United Kingdom at the University Press, Cambridge

Typeset in Times

A catalogue record for this book is available from the British Library

ISBN 0 521 46198 7 hardback
ISBN 0 521 46760 8 paperback
Preface

On all levels of society, systems have been introduced that deal with the transmission, storage and processing of information. We live in what is usually called the information society. Information has become a key word in our society. It is not surprising, therefore, that from all sorts of quarters interest has been shown in what information really is, and consequently in acquiring a better knowledge of how information can be dealt with as efficiently as possible.

Information theory is characterized by a quantitative approach to the notion of information. By means of the introduction of measures for information, answers will be sought to such questions as: How can information be transmitted and stored as compactly as possible? What is the maximum quantity of information that can be transmitted through a channel? How can security best be arranged? These are crucial questions that enable us to enhance the performance of our information systems and to grasp their limits.

This book has the purpose of introducing a number of basic notions of information theory and clarifying them by showing their significance in present applications. Matters that will be described are, among others: Shannon's information measure, discrete and continuous information sources and information channels with or without memory, source and channel coding, rate distortion theory, error-correcting codes and the information-theoretical approach to cryptology. Special attention has been paid to multiterminal or network information theory, an area with still many unanswered questions, but which is of great significance because most of our information is transmitted over networks.

All chapters conclude with exercises and worked solutions. That makes the book suitable for self-study.
The content of the book is largely based on the author's current lectures for students in Electrical Engineering, Technical Mathematics and Informatics, Applied Physics and Mechanical Engineering at the Delft University of Technology, as well as on former lecture notes by Profs Ysbrand Boxma, Dick Boekee and Jan Biemond. The exercises have been derived from recent exams.

The author wishes to express his gratitude to the colleagues mentioned above, as well as to the other colleagues who in one way or another contributed to this textbook. I especially wish to thank em. Prof. Ysbrand Boxma, who lectured on information theory at the Delft University of Technology when I was a student and who introduced me to information theory. Under his inspiring guidance I received my M.Sc. in Electrical Engineering and my Ph.D. in the technical sciences. In writing this book his old lecture notes were still very helpful to me. His influence has been a determining factor in my later career.

Delft, December 1996
Jan C.A. van der Lubbe
& s7 H$ }: ^2 \3 i# w! N+ ?/ ~# [' }: M, p5 m: S4 K" Z
Contents

Preface

1 Discrete information
1.1 The origin of information theory
1.2 The concept of probability
1.3 Shannon's information measure
1.4 Conditional, joint and mutual information measures
1.5 Axiomatic foundations
1.6 The communication model
1.7 Exercises
1.8 Solutions

2 The discrete memoryless information source
2.1 The discrete information source
2.2 Source coding
2.3 Coding strategies
2.4 Most probable messages
2.5 Exercises
2.6 Solutions

3 The discrete information source with memory
3.1 Markov processes
3.2 The information of a discrete source with memory
3.3 Coding aspects
3.4 Exercises
3.5 Solutions

4 The discrete communication channel
4.1 Capacity of noiseless channels
4.2 Capacity of noisy channels
4.3 Error probability and equivocation
4.4 Coding theorem for discrete memoryless channels
4.5 Cascading of channels
4.6 Channels with memory
4.7 Exercises
4.8 Solutions

5 The continuous information source
5.1 Probability density functions
5.2 Stochastic signals
5.3 The continuous information measure
5.4 Information measures and sources with memory
5.5 Information power
5.6 Exercises
5.7 Solutions

6 The continuous communication channel
6.1 The capacity of continuous communication channels
6.2 The capacity in the case of additive Gaussian white noise
6.3 Capacity bounds in the case of non-Gaussian white noise
6.4 Channel coding theorem
6.5 The capacity of a Gaussian channel with memory
6.6 Exercises
6.7 Solutions

7 Rate distortion theory
7.1 The discrete rate distortion function
7.2 Properties of the R(D) function
7.3 The binary case
7.4 Source coding and information transmission theorems
7.5 The continuous rate distortion function
7.6 Exercises
7.7 Solutions

8 Network information theory
8.1 Introduction
8.2 Multi-access communication channels
8.3 Broadcast channels
8.4 Two-way channels
8.5 Exercises
8.6 Solutions

9 Error-correcting codes
9.1 Introduction
9.2 Linear block codes
9.3 Syndrome coding
9.4 Hamming codes
9.5 Exercises
9.6 Solutions

10 Cryptology
10.1 Cryptography and cryptanalysis
10.2 The general scheme of cipher systems
10.3 Cipher systems
10.4 Amount of information and security
10.5 The unicity distance
10.6 Exercises
10.7 Solutions

Bibliography
Index
" U l5 O: U4 q9 o" _3 [! t1 r1/ F5 U" P/ [6 B7 K9 a e
Discrete information
/ v& ^6 x% q# D& g" h1.I The origin of information theory
Information theory is the science which deals with the concept of information, its measurement and its applications. In its broadest sense a distinction can be made between the American and the British traditions in information theory.

In general there are three types of information:

- syntactic information, related to the symbols from which messages are built up and to their interrelations;
- semantic information, related to the meaning of messages, their referential aspect;
- pragmatic information, related to the usage and effect of messages.

This being so, syntactic information mainly considers the form of information, whereas semantic and pragmatic information are related to the information content.

Consider the following sentences:

(i) John was brought to the railway station by taxi.
(ii) The taxi brought John to the railway station.
(iii) There is a traffic jam on highway A3, between Nuremberg and Munich in Germany.
(iv) There is a traffic jam on highway A3 in Germany.

The sentences (i) and (ii) are syntactically different. However, semantically and pragmatically they are identical. They have the same meaning and are both equally informative.

The sentences (iii) and (iv) do not only differ with respect to their syntax, but also with respect to their semantics. Sentence (iii) gives more precise information than sentence (iv).
The pragmatic aspect of information mainly depends on the context. The information contained in the sentences (iii) and (iv), for example, is relevant for someone in Germany, but not for someone in the USA.

The semantic and pragmatic aspects of information are studied in the British tradition of information theory. This being so, the British tradition is closely related to philosophy, psychology and biology. The British tradition is influenced mainly by scientists like MacKay, Carnap, Bar-Hillel, Ackoff and Hintikka.
The American tradition deals with the syntactic aspects of information. In this approach there is full abstraction from the meaning aspects of information. Here, the basic questions are the measurement of syntactic information, the fundamental limits on the amount of information which can be transmitted, the fundamental limits on the compression of information which can be achieved, and how to build information processing systems approaching these limits. It remains a rather technical approach to information.

The American tradition in information theory is sometimes referred to as communication theory, mathematical information theory or, in short, information theory. Well-known scientists of the American tradition are Shannon, Renyi, Gallager and Csiszar, among others. However, Claude E. Shannon, who published his article "A mathematical theory of communication" in 1948, is generally considered to be the founder of the American tradition in information theory. There are, nevertheless, a number of forerunners to Shannon who attempted to formalise the efficient use of communication systems.
In 1924 H. Nyquist published an article wherein he raised the matter of how messages (or characters, to use his own words) could be sent over a telegraph channel with maximum possible speed, but without distortion. The term information, however, was not yet used by him as such.
It was R.V.L. Hartley (1928) who first tried to define a measure of information. He went about it in the following manner. Assume that for every symbol of a message one has a choice of s possibilities. By now considering messages of l symbols, one can distinguish s^l messages. Hartley now defined the amount of information as the logarithm of the number of distinguishable messages. In the case of messages of length l one therefore finds

H = log(s^l) = l log(s).

For messages of length 1 one would find

H = log(s).
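As a concrete illustration, here is a small Python sketch (our own addition, not from the book; the function name and the choice of a base-2 logarithm are assumptions for the example) that computes Hartley's measure l log(s):

import math

def hartley_information(s: int, l: int, base: float = 2) -> float:
    # Hartley's measure: the logarithm of the number of
    # distinguishable messages, log(s**l) = l * log(s).
    return l * math.log(s, base)

print(hartley_information(2, 1))    # one binary symbol: 1.0 bit
print(hartley_information(26, 10))  # ten symbols from a 26-letter alphabet: about 47.0 bits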
Download link and password (Baidu Cloud): the Baidu Netdisk details are visible after replying to this thread.
This resource was collected and compiled by Java自学网 (www.javazx.com).