3.5 AEP. Let $X_1, X_2, \ldots$ be independent identically distributed random variables drawn according to the probability mass function $p(x)$, $x \in \{1, 2, \ldots, m\}$. Thus $p(x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} p(x_i)$. We know that $-\frac{1}{n} \log p(X_1, X_2, \ldots, X_n) \to H(X)$ in probability. Let $q(x_1, x_2, \ldots, x_n) = \prod_{i=1}^{n} q(x_i)$, where $q$ is another probability mass function on $\{1, 2, \ldots, m\}$.

(a) Evaluate $\lim -\frac{1}{n} \log q(X_1, X_2, \ldots, X_n)$, where $X_1, X_2, \ldots$ are i.i.d. $\sim p(x)$.
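Writing $-\frac{1}{n} \log q(X_1, \ldots, X_n) = -\frac{1}{n} \sum_{i=1}^{n} \log q(X_i)$, the weak law of large numbers gives convergence in probability to $E_p[-\log q(X)] = H(p) + D(p \| q)$. Below is a minimal Monte Carlo sketch of that limit; the particular pmfs `p` and `q` are made-up examples chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative pmfs on {0, 1, 2}; any p, q with full support would do.
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.5, 0.3])

n = 200_000
x = rng.choice(len(p), size=n, p=p)      # X_1, ..., X_n drawn i.i.d. ~ p
empirical = -np.mean(np.log2(q[x]))      # -(1/n) log q(X_1, ..., X_n)

H_p = -np.sum(p * np.log2(p))            # entropy H(p)
D_pq = np.sum(p * np.log2(p / q))        # relative entropy D(p || q)

print(f"empirical      : {empirical:.4f} bits")
print(f"H(p) + D(p||q) : {H_p + D_pq:.4f} bits")
```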
8.1 Preprocessing the output. One is given a communication channel with transition probabilities $p(y|x)$ and channel capacity $C = \max_{p(x)} I(X; Y)$. A helpful statistician preprocesses the output by forming $\tilde{Y} = g(Y)$. He claims that this will strictly improve the capacity.

(a) Show that he is wrong.
(b) Under what conditions does he not strictly decrease the capacity?
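Since $X \to Y \to \tilde{Y}$ forms a Markov chain, the data-processing inequality gives $I(X; g(Y)) \le I(X; Y)$ for every input distribution, so maximizing over $p(x)$ cannot increase capacity. A small numerical sketch follows; the 2-input, 3-output channel and the output-merging map $g$ are made-up examples for illustration only.

```python
import numpy as np

def mutual_information(px, W):
    """I(X;Y) in bits for input pmf px and channel matrix W[x, y] = p(y|x)."""
    pxy = px[:, None] * W                  # joint p(x, y)
    py = pxy.sum(axis=0)                   # output marginal p(y)
    prod = px[:, None] * py[None, :]
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / prod[mask])))

# Made-up 2-input, 3-output channel; rows are p(y|x).
W = np.array([[0.7, 0.2, 0.1],
              [0.1, 0.2, 0.7]])
px = np.array([0.5, 0.5])

# Postprocessing g merges outputs 1 and 2 into a single symbol.
G = np.array([[1, 0],
              [0, 1],
              [0, 1]])                     # G[y, t] = 1 iff g(y) = t
W_g = W @ G                                # channel from X to g(Y)

print(f"I(X;Y)    = {mutual_information(px, W):.4f} bits")
print(f"I(X;g(Y)) = {mutual_information(px, W_g):.4f} bits")  # never larger
```

Equality, and hence no loss in capacity, holds when $g$ is one-to-one on the outputs that occur with positive probability, or more generally when $g(Y)$ is a sufficient statistic for $X$.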
8.3 An additive noise channel. Find the channel capacity of the following discrete memoryless channel: $Y = X + Z$, where $\Pr\{Z = 0\} = \Pr\{Z = a\} = \frac{1}{2}$. The alphabet for $x$ is $\mathcal{X} = \{0, 1\}$. Assume that $Z$ is independent of $X$. Observe that the channel capacity depends on the value of $a$.
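The value of $a$ determines whether the four candidate outputs $\{0, a, 1, 1 + a\}$ collide. Below is a sketch of the three regimes, computing $I(X; Y)$ under the uniform input (which one can check is capacity-achieving for every $a$ here); the code and variable names are illustrative, not part of the problem statement.

```python
from math import log2
from collections import defaultdict

def capacity_uniform(a):
    """I(X;Y) in bits with X uniform on {0,1}, Z uniform on {0,a}, Y = X + Z."""
    joint = defaultdict(float)             # p(x, y)
    for x in (0, 1):
        for z in (0, a):
            joint[(x, x + z)] += 0.25
    py = defaultdict(float)
    for (x, y), p in joint.items():
        py[y] += p
    # I(X;Y) = sum over (x,y) of p(x,y) log[ p(x,y) / (p(x) p(y)) ], p(x) = 1/2
    return sum(p * log2(p / (0.5 * py[y])) for (x, y), p in joint.items())

for a in (0, 1, -1, 0.5, 7):
    print(f"a = {a:>4}: C = {capacity_uniform(a):.3f} bits")
# a = 0 (noiseless) and a not in {0, 1, -1} (four distinct outputs) give 1 bit;
# a = 1 or a = -1 collapses two outputs, behaves like a BEC(1/2): 1/2 bit.
```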
8.5 Channel capacity. Consider the discrete memoryless channel $Y = X + Z \pmod{11}$, where

$$Z = \begin{pmatrix} 1, & 2, & 3 \\ 1/3, & 1/3, & 1/3 \end{pmatrix}$$

and $X \in \{0, 1, \ldots, 10\}$. Assume that $Z$ is independent of $X$.

(a) Find the capacity.
(b) What is the maximizing $p^{*}(x)$?
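For an additive-noise channel on $\mathbb{Z}_{11}$, $H(Y|X) = H(Z)$ regardless of the input, and a uniform input makes $Y$ uniform, so $I(X; Y) \le \log 11 - H(Z)$ with equality at the uniform $p^{*}(x)$; here $H(Z) = \log 3$, giving $C = \log(11/3)$ bits. A numerical sketch of that calculation (illustrative code, not part of the problem):

```python
import numpy as np

m = 11
z_vals, z_probs = [1, 2, 3], [1/3, 1/3, 1/3]

# Channel matrix W[x, y] = p(y | x) for Y = (X + Z) mod 11.
W = np.zeros((m, m))
for z, pz in zip(z_vals, z_probs):
    for x in range(m):
        W[x, (x + z) % m] += pz

px = np.full(m, 1 / m)                     # uniform input
pxy = px[:, None] * W                      # joint p(x, y)
py = pxy.sum(axis=0)                       # output marginal (also uniform)
mask = pxy > 0
I = np.sum(pxy[mask] * np.log2(pxy[mask] / (px[:, None] * py[None, :])[mask]))

print(f"I(X;Y) with uniform input: {I:.4f} bits")
print(f"log2(11/3)               : {np.log2(11 / 3):.4f} bits")
```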
8.12 Time-varying channels. Consider a time-varying discrete memoryless channel. Let $Y_1, Y_2, \ldots, Y_n$ be conditionally independent given $X_1, X_2, \ldots, X_n$, with conditional distribution given by $p(y|x) = \prod_{i=1}^{n} p_i(y_i | x_i)$. Let $X = (X_1, X_2, \ldots, X_n)$, $Y = (Y_1, Y_2, \ldots, Y_n)$. Find $\max_{p(x)} I(X; Y)$.
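Conditional independence of the $Y_i$ gives $I(X; Y) \le \sum_{i=1}^{n} I(X_i; Y_i) \le \sum_{i=1}^{n} C_i$, with equality when the inputs are independent and each $X_i$ achieves the capacity $C_i$ of the $i$-th channel, so $\max_{p(x)} I(X; Y) = \sum_i C_i$. The sketch below sums per-letter capacities computed with a generic Blahut-Arimoto iteration; the time-varying BSC crossover values are made-up for illustration.

```python
import numpy as np

def blahut_arimoto(W, iters=1000):
    """Capacity (in bits) of a DMC with channel matrix W[x, y] = p(y|x)."""
    nx = W.shape[0]
    p = np.full(nx, 1.0 / nx)
    for _ in range(iters):
        py = p @ W                                     # output marginal
        # D[x] = D( W(.|x) || py ), per-input relative entropy in bits
        D = np.sum(W * np.log2(np.where(W > 0, W / py, 1.0)), axis=1)
        p = p * np.exp2(D)                             # multiplicative update
        p /= p.sum()
    py = p @ W
    D = np.sum(W * np.log2(np.where(W > 0, W / py, 1.0)), axis=1)
    return float(p @ D)                                # C = sum_x p(x) D(x)

def bsc(eps):
    """Binary symmetric channel with crossover probability eps."""
    return np.array([[1 - eps, eps], [eps, 1 - eps]])

# Hypothetical n = 3 time-varying channel: a different BSC at each time i.
caps = [blahut_arimoto(bsc(e)) for e in (0.05, 0.11, 0.2)]
print("per-letter capacities C_i:", [f"{c:.4f}" for c in caps])
print(f"max I(X;Y) = sum C_i = {sum(caps):.4f} bits")
```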
9.1 Differential entropy. Evaluate the differential entropy $h(X) = -\int f \ln f$ for the following:

(a) The exponential density, $f(x) = \lambda e^{-\lambda x}$, $x \ge 0$.
(b) The Laplace density, $f(x) = \frac{1}{2} \lambda e^{-\lambda |x|}$.
(c) The sum of $X_1$ and $X_2$, where $X_1$ and $X_2$ are independent normal random variables with means $\mu_i$ and variances $\sigma_i^2$, $i = 1, 2$.
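The standard closed forms (in nats) are $1 - \ln \lambda$ for the exponential, $1 + \ln(2/\lambda)$ for the Laplace, and, since $X_1 + X_2 \sim \mathcal{N}(\mu_1 + \mu_2, \sigma_1^2 + \sigma_2^2)$, $\frac{1}{2} \ln\left(2\pi e (\sigma_1^2 + \sigma_2^2)\right)$ for the sum; differential entropy is translation invariant, so the means drop out. Below is a quadrature sketch checking these; the parameter values are made-up for illustration.

```python
import numpy as np
from scipy.integrate import quad

def h_nats(f, lo, hi):
    """Differential entropy h = -integral of f ln f (in nats) by quadrature."""
    return quad(lambda x: -f(x) * np.log(f(x)) if f(x) > 0 else 0.0, lo, hi)[0]

lam = 1.5                                  # illustrative rate parameter
s1, s2 = 1.0, 2.0                          # illustrative standard deviations

# (a) exponential density; closed form 1 - ln(lam)
f_exp = lambda x: lam * np.exp(-lam * x)
print(h_nats(f_exp, 0, 60), 1 - np.log(lam))

# (b) Laplace density; closed form 1 + ln(2 / lam)
f_lap = lambda x: 0.5 * lam * np.exp(-lam * abs(x))
print(h_nats(f_lap, -60, 60), 1 + np.log(2 / lam))

# (c) X1 + X2 ~ N(mu1 + mu2, s1^2 + s2^2); the mean does not affect h
var = s1**2 + s2**2
f_sum = lambda x: np.exp(-x**2 / (2 * var)) / np.sqrt(2 * np.pi * var)
print(h_nats(f_sum, -60, 60), 0.5 * np.log(2 * np.pi * np.e * var))
```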