Views: 4098 | Replies: 54

[Complete Edition] Information Theory - Jan C.A. van der Lubbe (with exercise solutions)


Posted on 2022-7-17 20:42:01
This is a reprint of the textbook Information Theory by Jan C.A. van der Lubbe. The 1997 edition was published by Cambridge University Press; the reprint was issued by Delft Academic Press. The Solutions section at the end of each chapter provides answers to every exercise. The text is the original English edition.
PUBLISHED BY THE PRESS SYNDICATE OF THE UNIVERSITY OF CAMBRIDGE
The Pitt Building, Trumpington Street, Cambridge CB2 1RP, United Kingdom

CAMBRIDGE UNIVERSITY PRESS
The Edinburgh Building, Cambridge CB2 2RU, United Kingdom
40 West 20th Street, New York, NY 10011-4211, USA
10 Stamford Road, Oakleigh, Melbourne 3166, Australia

Originally published in Dutch as Informatietheorie by VSSD, 1988
© English translation Cambridge University Press 1997

This book is in copyright. Subject to statutory exception and to the provisions of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University Press.

First published in English by Cambridge University Press 1997 as Information Theory

Printed in the United Kingdom at the University Press, Cambridge

Typeset in Times

A catalogue record for this book is available from the British Library

ISBN 0 521 46198 7 hardback
ISBN 0 521 46760 8 paperback
Preface

On all levels of society, systems have been introduced that deal with the transmission, storage and processing of information. We live in what is usually called the information society. Information has become a key word in our society. It is not surprising, therefore, that from all sorts of quarters interest has been shown in what information really is and, consequently, in acquiring a better knowledge of how information can be dealt with as efficiently as possible.

Information theory is characterized by a quantitative approach to the notion of information. By means of the introduction of measures for information, answers will be sought to such questions as: How to transmit and store information as compactly as possible? What is the maximum quantity of information that can be transmitted through a channel? How can security best be arranged? Et cetera. These are crucial questions that enable us to enhance the performance and to grasp the limits of our information systems.

This book has the purpose of introducing a number of basic notions of information theory and clarifying them by showing their significance in present applications. Matters that will be described are, among others: Shannon's information measure, discrete and continuous information sources and information channels with or without memory, source and channel decoding, rate distortion theory, error-correcting codes and the information-theoretical approach to cryptology. Special attention has been paid to multiterminal or network information theory, an area with still lots of unanswered questions, but which is of great significance because most of our information is transmitted by networks.

All chapters are concluded with questions and worked solutions. That makes the book suitable for self-study.

The content of the book has been largely based on the present lectures by the author for students in Electrical Engineering, Technical Mathematics and Informatics, Applied Physics and Mechanical Engineering at the Delft University of Technology, as well as on former lecture notes by profs. Ysbrand Boxma, Dick Boekee and Jan Biemond. The questions have been derived from recent exams.

The author wishes to express his gratitude to the colleagues mentioned above as well as the other colleagues who in one way or another contributed to this textbook. Especially I wish to thank Prof. Ysbrand Boxma, who lectured on information theory at the Delft University of Technology when I was a student and who introduced me to information theory. Under his inspiring guidance I received my M.Sc. in Electrical Engineering and my Ph.D. in the technical sciences. In writing this book his old lecture notes were still very helpful to me. His influence has been a determining factor in my later career.

Delft, December 1996
Jan C.A. van der Lubbe

Contents

Preface

1 Discrete information
1.1 The origin of information theory
1.2 The concept of probability
1.3 Shannon's information measure
1.4 Conditional, joint and mutual information measures
1.5 Axiomatic foundations
1.6 The communication model
1.7 Exercises
1.8 Solutions

2 The discrete memoryless information source
2.1 The discrete information source
2.2 Source coding
2.3 Coding strategies
2.4 Most probable messages
2.5 Exercises
2.6 Solutions

3 The discrete information source with memory
3.1 Markov processes
3.2 The information of a discrete source with memory
3.3 Coding aspects
3.4 Exercises
3.5 Solutions

4 The discrete communication channel
4.1 Capacity of noiseless channels
4.2 Capacity of noisy channels
4.3 Error probability and equivocation
4.4 Coding theorem for discrete memoryless channels
4.5 Cascading of channels
4.6 Channels with memory
4.7 Exercises
4.8 Solutions

5 The continuous information source
5.1 Probability density functions
5.2 Stochastic signals
5.3 The continuous information measure
5.4 Information measures and sources with memory
5.5 Information power
5.6 Exercises
5.7 Solutions

6 The continuous communication channel
6.1 The capacity of continuous communication channels
6.2 The capacity in the case of additive Gaussian white noise
6.3 Capacity bounds in the case of non-Gaussian white noise
6.4 Channel coding theorem
6.5 The capacity of a Gaussian channel with memory
6.6 Exercises
6.7 Solutions

7 Rate distortion theory
7.1 The discrete rate distortion function
7.2 Properties of the R(D) function
7.3 The binary case
7.4 Source coding and information transmission theorems
7.5 The continuous rate distortion function
7.6 Exercises
7.7 Solutions

8 Network information theory
8.1 Introduction
8.2 Multi-access communication channels
8.3 Broadcast channels
8.4 Two-way channels
8.5 Exercises
8.6 Solutions

9 Error-correcting codes
9.1 Introduction
9.2 Linear block codes
9.3 Syndrome coding
9.4 Hamming codes
9.5 Exercises
9.6 Solutions

10 Cryptology
10.1 Cryptography and cryptanalysis
10.2 The general scheme of cipher systems
10.3 Cipher systems
10.4 Amount of information and security
10.5 The unicity distance
10.6 Exercises
10.7 Solutions

Bibliography
Index
1 Discrete information

1.1 The origin of information theory

Information theory is the science which deals with the concept of information, its measurement and its applications. In its broadest sense a distinction can be made between the American and British traditions in information theory.

In general there are three types of information:
(a) syntactic information, related to the symbols from which messages are built up and to their interrelations;
(b) semantic information, related to the meaning of messages, their referential aspect;
(c) pragmatic information, related to the usage and effect of messages.

This being so, syntactic information mainly considers the form of information, whereas semantic and pragmatic information are related to the information content.

Consider the following sentences:
(i) John was brought to the railway station by taxi.
(ii) The taxi brought John to the railway station.
(iii) There is a traffic jam on highway A3, between Nuremberg and Munich in Germany.
(iv) There is a traffic jam on highway A3 in Germany.

The sentences (i) and (ii) are syntactically different. However, semantically and pragmatically they are identical. They have the same meaning and are both equally informative.

The sentences (iii) and (iv) do not only differ with respect to their syntax but also with respect to their semantics. Sentence (iii) gives more precise information than sentence (iv).

The pragmatic aspect of information mainly depends on the context. The information contained in the sentences (iii) and (iv), for example, is relevant for someone in Germany, but not for someone in the USA.

The semantic and pragmatic aspects of information are studied in the British tradition of information theory. This being so, the British tradition is closely related to philosophy, psychology and biology. The British tradition is influenced mainly by scientists like MacKay, Carnap, Bar-Hillel, Ackoff and Hintikka.

The American tradition deals with the syntactic aspects of information. In this approach there is full abstraction from the meaning aspects of information. There, basic questions are the measurement of syntactic information, the fundamental limits on the amount of information which can be transmitted, the fundamental limits on the compression of information which can be achieved, and how to build information processing systems approaching these limits. A rather technical approach to information remains.

The American tradition in information theory is sometimes referred to as communication theory, mathematical information theory or, in short, as information theory. Well-known scientists of the American tradition are Shannon, Renyi, Gallager and Csiszar, among others.

However, Claude E. Shannon, who published his article "A mathematical theory of communication" in 1948, is generally considered to be the founder of the American tradition in information theory. There are, nevertheless, a number of forerunners to Shannon who attempted to formalise the efficient use of communication systems.

In 1924 H. Nyquist published an article wherein he raised the matter of how messages (or characters, to use his own words) could be sent over a telegraph channel with maximum possible speed, but without distortion. The term information, however, was not yet used by him as such.
It was R. V. L. Hartley (1928) who first tried to define a measure of information. He went about it in the following manner. Assume that for every symbol of a message one has a choice of s possibilities. By now considering messages of l symbols, one can distinguish s^l messages. Hartley now defined the amount of information as the logarithm of the number of distinguishable messages. In the case of messages of length l one therefore finds

H(s^l) = log s^l = l log(s).

For messages of length 1 one would find

H(s) = log(s).
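To make the formula concrete, here is a minimal Java sketch of Hartley's measure (the class and method names are my own, not from the book). Taking the logarithm to base 2 expresses the result in bits, so a message of l symbols drawn from s possibilities carries l * log2(s) bits.

// Minimal sketch of Hartley's 1928 information measure (base-2 logarithm, result in bits).
// Class and method names are illustrative only, not taken from the book.
public class HartleyMeasure {

    // Information in a message of 'length' symbols, each chosen from 'alphabetSize' possibilities:
    // H(s^l) = log2(s^l) = l * log2(s)
    static double information(int alphabetSize, int length) {
        return length * (Math.log(alphabetSize) / Math.log(2.0));
    }

    public static void main(String[] args) {
        // One decimal digit (s = 10, l = 1): about 3.32 bits
        System.out.println(information(10, 1));
        // Eight symbols from a 26-letter alphabet: 8 * log2(26), about 37.6 bits
        System.out.println(information(26, 8));
    }
}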
Resource download link and password (Baidu Cloud):
Guests: to view the hidden content of this post, please reply.
[/hide] Baidu Netdisk information is visible after replying.

This resource was collected and organized by Java自学网 [www.javazx.com].
Posted on 2022-7-17 20:45:18
Strongly supporting the OP!
Posted on 2022-7-27 11:34:51
Impressive work, OP!
Posted on 2022-8-20 11:22:45
Many thanks for sharing!
Posted on 2022-8-28 08:08:25
Finally found it here; that wasn't easy.
Posted on 2022-9-1 18:38:32
Very good, very good!
Posted on 2022-9-12 12:56:31
Time to study this.
Posted on 2022-9-15 20:41:10
Learned a lot, thanks.
Posted on 2022-9-16 06:44:36
The resources keep getting updated; great work.
Posted on 2022-9-21 17:50:38
Thanks, thanks!