The Study on Conditional Probability of Error for m-ary Hypothesis Tests

Abstract. In this paper, a new class of lower bounds on the probability of error in multiple hypothesis testing is presented. These new bounds maintain the desirable properties of continuity, differentiability, and symmetry. In the binary case, the proposed class depends on a parameter q which, in the limit $q \to \infty$, provides the minimum attainable probability of error, attained by the MAP detector. It is shown that this class of bounds generalizes some existing bounds for binary hypothesis tests. It is shown via examples that the proposed bounds outperform other existing bounds in terms of tightness and simplicity of calculation.

Key words: MAP; hypothesis testing; probability of error

INTRODUCTION
Lower bounds on the probability of error are of great importance in system design and performance analysis in many applications, such as signal detection, communications, and classification. It is well known that the minimum probability of error is attained by the maximum a-posteriori probability (MAP) criterion; however, its probability of error is often difficult to calculate and usually not tractable. In such cases, lower bounds on the probability of error are useful for performance analysis, feasibility studies, and system design. These bounds can also be useful for the derivation of analytical expressions for the Ziv-Zakai family of bounds for parameter estimation [1]. One of the difficulties in the computation of the Ziv-Zakai bounds is that they involve an expression for the minimum probability of error of a binary hypothesis problem. Analytic expressions for lower bounds on the probability of error may be useful to simplify the calculation of the bound.

PROBLEM STATEMENT
Consider an M-ary hypothesis testing problem, in which the hypotheses are $\theta_1, \dots, \theta_M$, with the corresponding a-priori probabilities $P(\theta_m)$, $m = 1, \dots, M$, and the random observation vector is x. Let $P(\theta_m \mid x)$ denote the conditional probability of $\theta_m$ given x, and let $p(x \mid \theta_m)$ and $p(x, \theta_m)$ denote the conditional and joint probability density functions (pdf) of x and $\theta_m$, $m = 1, \dots, M$, respectively. The probability of error of the decision problem is denoted $P_e$. It is well known that the minimum average probability of error, obtained by the MAP criterion, is given by [9]

$P_e^{\min} = 1 - E\left[\max_m P(\theta_m \mid x)\right]$.  (1)

However, the minimum probability in (1) is often difficult to calculate and usually not tractable. Therefore, computable and tight lower bounds on the probability of error are useful for performance analysis and system design.
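To make (1) concrete, the following sketch estimates the minimum probability of error by Monte Carlo for a hypothetical binary setup with equal priors and unit-variance Gaussian likelihoods; all parameter choices here are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical toy setup (not from the paper): two equiprobable hypotheses,
# x | theta_m ~ N(mu_m, 1).  We estimate (1), Pe_min = 1 - E[max_m P(theta_m|x)],
# by Monte Carlo over draws of (theta, x) from the joint distribution.
rng = np.random.default_rng(0)
priors = np.array([0.5, 0.5])          # a-priori probabilities P(theta_m)
means = np.array([0.0, 2.0])           # conditional means of x under each hypothesis

n = 200_000
labels = rng.choice(2, size=n, p=priors)
x = rng.normal(means[labels], 1.0)

# Posterior P(theta_m | x) via Bayes' rule; the Gaussian normalizing constant
# cancels in the normalization, so an unnormalized likelihood suffices.
lik = np.exp(-0.5 * (x[:, None] - means[None, :]) ** 2)
post = priors * lik
post /= post.sum(axis=1, keepdims=True)

# Eq. (1): minimum probability of error, attained by the MAP detector.
pe_min = 1.0 - post.max(axis=1).mean()

# Cross-check: empirical error rate of the MAP decision rule itself.
pe_map = (post.argmax(axis=1) != labels).mean()
print(f"Pe_min via (1): {pe_min:.4f},  empirical MAP error: {pe_map:.4f}")
```

The two printed numbers agree up to Monte Carlo noise, since the MAP rule achieves the minimum in (1).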
REVIEW OF EXISTING LOWER BOUNDS
In this section, some existing lower bounds on the minimum probability of error are presented. The following bounds have been derived especially for binary hypothesis testing. Most of the binary hypothesis testing bounds are based on divergence measures of the difference between two probability distributions, known as I-divergences or Ali-Silvey distances. In [2], the divergence lower bound and two Bhattacharyya-based lower bounds were proposed. The divergence lower bound is given in (2), expressed in terms of the likelihood ratio function. A simple Bhattacharyya-based lower bound is given in (3); this bound is always tighter than the divergence lower bound [2]. The second Bhattacharyya-based bound is given in (4). Another I-divergence bound, given in (5), is proposed in [3] for any L ≥ 1; for L = 1, this bound can also be obtained by applying Jensen's inequality to the MAP probability of error. The harmonic lower bound, given in (6), was proposed in [5]. The "Gaussian-Sinusoidal" lower bound is given in (7), where the constant equals 1.8063. Although this bound is tight, it is usually not tractable. An arbitrarily tight lower bound, given in (8), holds for any a > 0. By selecting high enough values of a, this lower bound can be made arbitrarily close to $P_e^{\min}$. However, this bound is, in general, difficult to evaluate.
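The explicit expressions for (2)-(8) did not survive extraction in this copy. As a stand-in, the sketch below evaluates two standard Bhattacharyya-type lower bounds of the form reported by Kailath [2], $P_e \geq P_1 P_2 \rho^2$ and the tighter $P_e \geq \frac{1}{2}\left[1 - \sqrt{1 - 4 P_1 P_2 \rho^2}\right]$, where $\rho = \int \sqrt{p(x \mid \theta_1)\, p(x \mid \theta_2)}\, dx$ is the Bhattacharyya coefficient; treat these forms as assumptions standing in for (3)-(4), and the Gaussian toy problem as illustrative.

```python
import numpy as np
from scipy.integrate import quad

# Hypothetical binary example: equal priors, p(x|theta_1) = N(0,1),
# p(x|theta_2) = N(2,1).  We compute the Bhattacharyya coefficient rho and
# two Kailath-style lower bounds built on it (assumed stand-ins for (3)-(4)).
P1, P2 = 0.5, 0.5

def gauss(x, mu):
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)

rho, _ = quad(lambda x: np.sqrt(gauss(x, 0.0) * gauss(x, 2.0)), -np.inf, np.inf)

simple_bound = P1 * P2 * rho ** 2                              # looser form
tight_bound = 0.5 * (1 - np.sqrt(1 - 4 * P1 * P2 * rho ** 2))  # tighter form

# Exact minimum error for comparison: integral of min(P1 p1, P2 p2).
pe_min, _ = quad(lambda x: np.minimum(P1 * gauss(x, 0.0), P2 * gauss(x, 2.0)),
                 -np.inf, np.inf)
print(f"rho={rho:.4f}: {simple_bound:.4f} <= {tight_bound:.4f} <= Pe_min={pe_min:.4f}")
```

Both bounds depend on the distributions only through rho, which is why they are attractive when the exact integral for $P_e^{\min}$ is intractable.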
For multiple hypothesis testing problems, the following lower bounds have been proposed. In [6], Devijver derived the bounds in (9) and (10) using the conditional Bayesian distance. In [6], it is analytically shown that for the binary case the Bayesian-distance lower bound in (9) is always tighter than the Bhattacharyya bound in (4). The bound in (10) is tighter than the bound in (9) [5], [6]. The bounds in (11) and (12) were proposed in [7] in the context of Vajda's quadratic entropy and the quadratic mutual information, respectively. In [6], it is claimed that the bound in (11) can be interpreted as an M-ary extension of the harmonic mean bound, presented in (6).
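Devijver's bounds (9)-(10) are likewise missing their explicit expressions here. For the binary case, a Bayesian-distance lower bound can be reconstructed from first principles: since the two conditional probabilities sum to one, $\min(p, 1-p) = \frac{1}{2}\left(1 - \sqrt{2(p^2 + (1-p)^2) - 1}\right)$ pointwise, and Jensen's inequality then gives $P_e \geq \frac{1}{2}\left(1 - \sqrt{2B - 1}\right)$ with Bayesian distance $B = E\left[\sum_m P^2(\theta_m \mid x)\right]$. The sketch below checks this reconstructed form (an assumption, not necessarily the paper's exact (9)) on the same toy problem.

```python
import numpy as np

# Binary Bayesian-distance lower bound, reconstructed as
#   Pe >= (1 - sqrt(2B - 1)) / 2,  B = E[ sum_m P(theta_m|x)^2 ],
# which follows from the pointwise identity for min(p, 1-p) plus Jensen.
rng = np.random.default_rng(1)

# Same toy Gaussian posteriors as in the earlier sketch.
priors, means = np.array([0.5, 0.5]), np.array([0.0, 2.0])
labels = rng.choice(2, size=200_000, p=priors)
x = rng.normal(means[labels], 1.0)
lik = np.exp(-0.5 * (x[:, None] - means[None, :]) ** 2)
post = priors * lik
post /= post.sum(axis=1, keepdims=True)

B = (post ** 2).sum(axis=1).mean()              # Bayesian distance
bayes_dist_bound = 0.5 * (1 - np.sqrt(2 * B - 1))
pe_min = 1 - post.max(axis=1).mean()            # eq. (1), for reference
print(f"B={B:.4f}  bound={bayes_dist_bound:.4f} <= Pe_min={pe_min:.4f}")
```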
A NEW CLASS OF BOUNDS ON PROBABILITY OF ERROR

Derivation of the new class of bounds
Consider an M-ary hypothesis testing problem with a given detector. Let the error indicator be defined as in (13), where $\theta$ denotes the true hypothesis and $1_A$ is the indicator function of a subset A. Then, according to Hölder's inequality, (14) holds for an arbitrary scalar function. It can be seen that (15) holds as well, and by substituting (15) into (14) one obtains the lower bound on the probability of error in (16). Using (13), the expectation term in the numerator of (16) can be rewritten as in (17). It can be shown that, in order to obtain a valid bound which is independent of the detector, the scalar function should be structured as in (18), in terms of an arbitrary auxiliary function which, with no loss of generality, can be chosen to be nonnegative. By substituting (18) into (17) one obtains (19), and using (18) it can be seen that (20) holds. By substituting (19) and (20) into (16), the bound can be rewritten as (21). Maximization of (21) with respect to (w.r.t.) the auxiliary function results in (22).
Substituting (22) into (21) yields the attained lower bound in (23), valid for all q > 1.

Binary hypothesis testing
In the binary hypothesis problem with the hypotheses $\theta_1$ and $\theta_2$, the lower bound in (23) reduces to the form given in (24).

Asymptotic properties
It can be seen that the bound in (24) becomes tighter with increasing q, and in the limit $q \to \infty$ the bound coincides with the optimal bound for the binary hypothesis test, presented in (1).
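The closed forms of (23)-(24) are also missing from this copy. The sketch below therefore implements a reconstruction of the binary bound, $P_e \geq E\left[\left(\sum_m P^{-(q-1)}(\theta_m \mid x)\right)^{-1/(q-1)}\right]$, which is consistent with every property stated in the text: it arises from a Hölder-based derivation, reduces to the harmonic bound at q = 2 (see the next subsection), tightens monotonically in q, and converges to (1) as $q \to \infty$. Treat this expression as an assumption rather than the paper's verbatim formula.

```python
import numpy as np
from scipy.special import logsumexp

# Reconstructed binary form of the proposed bound (assumed, see lead-in):
#   Pe >= E[ ( sum_m P(theta_m|x)^{-(q-1)} )^{-1/(q-1)} ],   q > 1.
def proposed_bound(post, q):
    s = q - 1.0
    # Evaluate the negative-power sum in log-space for numerical stability.
    log_inner = logsumexp(-s * np.log(post), axis=1)  # log sum_m P_m^{-(q-1)}
    return np.exp(-log_inner / s).mean()

# Toy Gaussian posteriors as in the earlier sketches.
rng = np.random.default_rng(2)
priors, means = np.array([0.5, 0.5]), np.array([0.0, 2.0])
labels = rng.choice(2, size=200_000, p=priors)
x = rng.normal(means[labels], 1.0)
lik = np.exp(-0.5 * (x[:, None] - means[None, :]) ** 2)
post = priors * lik
post /= post.sum(axis=1, keepdims=True)

pe_min = 1 - post.max(axis=1).mean()
for q in (2, 5, 10, 50):
    print(f"q={q:3d}: bound = {proposed_bound(post, q):.4f}  (Pe_min = {pe_min:.4f})")
```

Running this shows the bound increasing with q toward pe_min, and at q = 2 it equals $E[P(\theta_1 \mid x) P(\theta_2 \mid x)]$, the harmonic form discussed next.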
The harmonic lower bound
For q = 2, this bound can be written in the following simple form, which is identical to the "harmonic lower bound" in (6) and to $B_{quad}$ for the binary case in (12). Thus, the binary lower bound of [5] can be interpreted as a special case of our general M-hypotheses bound, presented in (23).

Relation to upper bounds on minimum probability of error
In [5], an upper bound on the probability of error of the MAP detector for binary hypothesis testing is derived using the negative power mean inequalities; according to that paper, the upper bound holds for any q > 1. It can be seen that this upper bound can be obtained by multiplying the proposed lower bound in (23) by a constant factor. This factor controls the tightness between the upper and lower bounds on the probability of error for binary hypothesis testing.
Bounds comparison
Figure 1 depicts the new lower bound for the binary hypothesis problem against the conditional probability, for different values of the parameter q. The new bounds are compared to the bounds presented in (7), (8) (with a = 5), and (11), respectively. It can be seen that the bound in (24) becomes tighter as q grows, and that for q = 10 the new bound is tighter than the other lower bounds almost everywhere.

Fig. 1. The proposed lower bounds for q = 2, 5, 10 and other existing bounds as a function of the conditional probability for binary hypothesis testing.
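The figure itself is not reproduced in this copy. The sketch below regenerates a comparison in its spirit, plotting the integrand of the reconstructed binary bound (24), $\left(p^{-(q-1)} + (1-p)^{-(q-1)}\right)^{-1/(q-1)}$, against the conditional probability $p = P(\theta_1 \mid x)$ for q = 2, 5, 10, together with the MAP curve $\min(p, 1-p)$ from (1); the bound's functional form is the same assumption as above.

```python
import numpy as np
import matplotlib.pyplot as plt

# Fig. 1-style comparison (regenerated, not the original figure): the
# reconstructed bound's integrand vs. the MAP curve min(p, 1-p).
p = np.linspace(0.001, 0.999, 999)
plt.plot(p, np.minimum(p, 1 - p), "k--", label="min(p, 1-p)  (MAP, eq. (1))")
for q in (2, 5, 10):
    s = q - 1.0
    plt.plot(p, (p ** -s + (1 - p) ** -s) ** (-1 / s), label=f"proposed, q = {q}")
plt.xlabel("conditional probability p")
plt.ylabel("conditional probability of error")
plt.legend()
plt.show()
```

The curves lie below the MAP curve everywhere and approach it as q grows, matching the tightness behavior described above.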
CONCLUSION
Practical and useful lower bounds on the probability of error are expected to be computationally simple, tight, and appropriate for general multi-hypothesis problems. In this paper, a new class of lower bounds with the aforementioned desired properties is derived using Hölder's inequality. The bounds in this class are simpler to compute than the optimum probability of error provided by the MAP criterion, and they provide a good prediction of the minimum probability of error in multiple hypothesis testing.

REFERENCES
[1] D. Chazan, M. Zakai, and J. Ziv, "Improved lower bounds on signal parameter estimation," IEEE Trans. Inform. Theory, vol. IT-21, no. 1, pp. 90-93, 1975.
[2] T. Kailath, "The divergence and Bhattacharyya distance measures in signal selection," IEEE Trans. Commun. Technol., vol. COM-15, no. 1, pp. 52-60, 1967.
[3] J. van Tilburg and D. E. Boekee, "Divergence bounds on key equivocation and error probability in cryptanalysis," Advances in Cryptology - CRYPTO '85, vol. 218, pp. 489-513, 1986.
[4] T. S. Han and S. Verdú, "Generalizing the Fano inequality," IEEE Trans. Inform. Theory, vol. 40, no. 4, pp. 1247-1251, 1994.
[5] N. Santhi and A. Vardy, "On an improvement over Rényi's equivocation bound," Proc. 44th Allerton Conference on Communication, Control, and Computing, pp. 1118-1124, 2006.
[6] P. A. Devijver, "On a new class of bounds on Bayes risk in multihypothesis pattern recognition," IEEE Trans. Comput., vol. C-23, pp. 70-80, 1974.
[7] C. E. Shannon, "Certain results in coding theory for noisy channels," Inform. Contr., vol. 1, pp. 6-25, 1957.
[8] R. M. Fano, class notes for Transmission of Information, course 6.574, MIT, Cambridge, MA, 1952.
[9] M. Feder and N. Merhav, "Relations between entropy and error probability," IEEE Trans. Inform. Theory, vol. 40, no. 1, pp. 259-266, 1994.