Activity

  • Helen Rao posted an update 6 years, 6 months ago

Consider a set $F = \{f_1, f_2, \dots, f_n\}$ containing the input symbols and a set $G = \{g_1, g_2, \dots, g_m\}$ containing the output symbols of a channel, where it may also be that $m = n$. Since we do not know which symbol $f_i$ generated the symbol $g_j$, the channel is characterized by the set of conditional probabilities $P(f_i \mid g_j)$. We can now define the mutual information

$$I(f_i; g_j) = \log \frac{1}{P(f_i)} - \log \frac{1}{P(f_i \mid g_j)} = \log \frac{P(f_i \mid g_j)}{P(f_i)},$$

where $P(f_i)$ is an "a priori" estimate. Thus:

$$I(f_i; g_j) = I(g_j; f_i); \qquad I(f_i; g_j) = \log P(f_i \mid g_j) + I(f_i) \le I(f_i); \qquad P(f_i, g_j) = P(f_i)\,P(g_j) \;\Rightarrow\; I(f_i; g_j) = 0.$$

Now, since $I(F; g_j) = \sum_i P(f_i \mid g_j)\, I(f_i; g_j) = \sum_i P(f_i \mid g_j) \log \frac{P(f_i \mid g_j)}{P(f_i)}$ and $I(f_i; G) = \sum_j P(g_j \mid f_i) \log \frac{P(g_j \mid f_i)}{P(g_j)}$, we will have

$$I(F; G) = \sum_i P(f_i)\, I(f_i; G) = \sum_{i,j} P(f_i, g_j) \log \frac{P(f_i, g_j)}{P(f_i)\,P(g_j)} = I(G; F).$$

Then $I(F; G) \ge 0$, $I(F; G) = 0 \iff P(F, G) = P(F)\,P(G)$, and $I$ is symmetric in $F$ and $G$, i.e. $I(F; G) = I(G; F)$.

Introducing the entropies

$$S(F) = \sum_{i=1}^{n} P(f_i) \log \frac{1}{P(f_i)}, \qquad S(G) = \sum_{j=1}^{m} P(g_j) \log \frac{1}{P(g_j)},$$

$$S(F \mid G) = \sum_{i=1}^{n} \sum_{j=1}^{m} P(f_i, g_j) \log \frac{1}{P(f_i \mid g_j)}, \qquad S(F, G) = \sum_{i,j} P(f_i, g_j) \log \frac{1}{P(f_i, g_j)},$$

we have

$$S(F, G) = S(F) + S(G \mid F) = S(G) + S(F \mid G),$$

$$I(F; G) = S(F) + S(G) - S(F, G) = S(F) - S(F \mid G) = S(G) - S(G \mid F) \ge 0.$$

Thus we can write the mutual information $I(F; G)$ as the difference between a marginal entropy and the corresponding conditional entropy: given the knowledge of $G$, the decrease in the uncertainty about $F$ is precisely $I(F; G)$. For this reason $I(F; G)$ is called the mutual information. We can then define the channel capacity as $C = \max_{P(f)} I(F; G)$.

Turning to coding, a uniquely decodable code with word lengths $l_1, l_2, \dots, l_n$, where $l_i = |e(a_i)|$ for $a_i \in A$ and $K$ is the size of the code alphabet, exists iff the Kraft inequality $\sum_{i=1}^{n} K^{-l_i} \le 1$ holds. Indeed, consider $\left(\sum_{i=1}^{n} K^{-l_i}\right)^{k} = \sum_{j} N_j K^{-j}$, where $N_j$ is the number of sequences of $k$ codewords of total length $j$ and $l = \max\{l_1, l_2, \dots, l_n\}$. For a uniquely decodable code $N_j \le K^{j}$, so the sum is at most $kl - k + 1 \le kl$; since this holds for every $k$, it forces $\sum_{i=1}^{n} K^{-l_i} \le 1$.

Now define, for $0 < G_i \le 1$ with $\sum_i G_i = 1$, $G_i = K^{-l_i} / \sum_j K^{-l_j}$. Applying the Gibbs inequality, with $p_i$ the probability of observing the $i$-th symbol, we obtain $\sum_{i=1}^{n} p_i \log \frac{G_i}{p_i} \le 0$, i.e. $\sum_{i=1}^{n} p_i \log \frac{1}{p_i} \le \sum_{i=1}^{n} p_i \log \frac{1}{G_i}$. Therefore

$$S(F) = \sum_{i=1}^{n} p_i \log \frac{1}{p_i} \le \sum_{i=1}^{n} p_i \left( l_i \log K + \log \sum_j K^{-l_j} \right) \le \log K \sum_{i=1}^{n} p_i l_i,$$

where the last step uses $\sum_j K^{-l_j} \le 1$. Thus $S(F) \le L \log K$, where $L = \sum_{i=1}^{n} p_i l_i$: the entropy provides a lower bound on the average code length.

Finally, we use a standard maximum entropy principle (MEP). Let $s_r$ ($r = 1, 2, \dots, K$) be some characteristics at the macroscopic level, and assume that they are associated with characteristics at the microscopic level by $\sum_i f_i s_i^{(r)} = s_r$, constrained by $f_i \ge 0$ and $\sum_i f_i = 1$. Subject to these constraints we have to maximize the entropy $\sum_i f_i \log \frac{1}{f_i}$, which gives

$$f_i = \exp\!\left( -\lambda_0 - \sum_r \lambda_r s_i^{(r)} \right),$$

where the $\lambda_r$ are the Lagrange multipliers.
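To make the entropy and mutual-information identities above concrete, here is a minimal Python sketch (not part of the original post; the 2x2 joint distribution is made up purely for illustration) that computes $S(F)$, $S(G)$, $S(F,G)$ and $I(F;G)$ and checks that $I(F;G) = S(F)+S(G)-S(F,G) = S(F)-S(F\mid G)$:

```python
import numpy as np

# Hypothetical 2x2 joint distribution P(f_i, g_j) for a noisy binary channel
# (numbers are illustrative only, not from the original text).
P_fg = np.array([[0.40, 0.10],
                 [0.05, 0.45]])

P_f = P_fg.sum(axis=1)          # marginal P(f_i)
P_g = P_fg.sum(axis=0)          # marginal P(g_j)

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

S_F  = entropy(P_f)
S_G  = entropy(P_g)
S_FG = entropy(P_fg.ravel())

# Mutual information directly from the definition
# I(F;G) = sum_{i,j} P(f_i,g_j) log [ P(f_i,g_j) / (P(f_i) P(g_j)) ]
I_FG = sum(P_fg[i, j] * np.log2(P_fg[i, j] / (P_f[i] * P_g[j]))
           for i in range(2) for j in range(2))

# Check the identities I(F;G) = S(F)+S(G)-S(F,G) = S(F)-S(F|G)
S_F_given_G = S_FG - S_G
print(f"I(F;G)           = {I_FG:.4f} bits")
print(f"S(F)+S(G)-S(F,G) = {S_F + S_G - S_FG:.4f} bits")
print(f"S(F)-S(F|G)      = {S_F - S_F_given_G:.4f} bits")
```

All three printed values coincide, which is exactly the statement that the mutual information is the drop in uncertainty about $F$ once $G$ is known.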
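The Kraft inequality and the bound $S(F) \le L \log K$ can also be checked numerically. The alphabet size, code lengths and symbol probabilities below are illustrative assumptions, not values from the post:

```python
import math

# Hypothetical source with 4 symbols, code alphabet size K = 2, and a prefix
# code with word lengths l_i; values are made up for illustration.
K = 2
p = [0.5, 0.25, 0.15, 0.10]    # symbol probabilities p_i
l = [1, 2, 3, 3]               # codeword lengths l_i

# Kraft inequality: sum_i K^{-l_i} <= 1 for a uniquely decodable code
kraft = sum(K ** (-li) for li in l)
print(f"Kraft sum  = {kraft:.3f}  (must be <= 1)")

# Entropy S(F) = sum_i p_i log(1/p_i) and average length L = sum_i p_i l_i
S = sum(pi * math.log(1.0 / pi) for pi in p)   # natural log, as in the text
L = sum(pi * li for pi, li in zip(p, l))

# The Gibbs-inequality bound S(F) <= L log K
print(f"S(F)       = {S:.4f} nats")
print(f"L * log(K) = {L * math.log(K):.4f} nats")
```

With these numbers the Kraft sum is exactly 1 and $S(F) \approx 1.208$ nats against $L\log K \approx 1.213$ nats, so the entropy lower bound on the average code length is nearly tight.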
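Finally, a hedged sketch of the maximum entropy principle above: assuming a single macroscopic constraint $\sum_i f_i s_i^{(1)} = s_1$ with made-up microscopic values $s_i^{(1)}$, the exponential-family solution $f_i \propto \exp(-\lambda_1 s_i^{(1)})$ can be found by solving for the Lagrange multiplier numerically (here by simple bisection):

```python
import numpy as np

# Maximum-entropy sketch: maximize sum_i f_i log(1/f_i) subject to
# sum_i f_i = 1 and one macroscopic constraint sum_i f_i * s_i = s_target.
# The solution has the form f_i = exp(-lambda0 - lambda1 * s_i); lambda0 is
# absorbed into the normalization. The s_i and s_target are illustrative.
s = np.array([1.0, 2.0, 3.0, 4.0])   # microscopic characteristics s_i^(1)
s_target = 2.2                        # prescribed macroscopic value s_1

def f_of_lambda(lam):
    """Exponential-family distribution f_i(lambda), normalized to sum to 1."""
    w = np.exp(-lam * s)
    return w / w.sum()

def mean_of_lambda(lam):
    return float(np.dot(f_of_lambda(lam), s))

# Solve mean_of_lambda(lam) = s_target by bisection: the constrained mean
# decreases monotonically as lambda grows.
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if mean_of_lambda(mid) > s_target:
        lo = mid            # mean too large -> need a larger lambda
    else:
        hi = mid
lam = 0.5 * (lo + hi)

f = f_of_lambda(lam)
entropy = -np.sum(f * np.log(f))
print(f"lambda_1 = {lam:.4f},  f = {np.round(f, 4)}")
print(f"constraint: sum f_i s_i = {np.dot(f, s):.4f} (target {s_target})")
print(f"entropy   = {entropy:.4f} nats")
```

Any other distribution satisfying the same constraint has strictly smaller entropy, which is the content of the MEP argument in the text.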