Abstract
In this communication, we propose two quantities, named the ‘\((R', S')\)-norm directed divergence measure’ and the ‘\((R', S')\)-norm fuzzy directed divergence measure’, establish their validity and discuss their properties. Further, the performance of the proposed divergence measure is compared with some existing measures through a numerical example. Finally, an application of the proposed fuzzy directed divergence measure to strategic decision-making is given, and the outcome is compared with other methods in the literature.
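To make the decision-making application concrete: a divergence-based ranking scores each alternative by its divergence from an ideal alternative and prefers the least divergent one. Since the paper's exact \((R', S')\)-norm formula is not reproduced in this excerpt, the sketch below uses the classical Bhandari–Pal (KL-style) fuzzy divergence as a stand-in, and the alternatives `A1`–`A3` with their membership grades are purely hypothetical.

```python
import math

def fuzzy_div(A, B, eps=1e-12):
    """Bhandari-Pal (KL-style) fuzzy directed divergence D(A, B),
    used here as a stand-in for the paper's (R', S')-norm measure."""
    total = 0.0
    for a, b in zip(A, B):
        # Clamp memberships away from 0 and 1 so the logarithms are defined.
        a = min(max(a, eps), 1 - eps)
        b = min(max(b, eps), 1 - eps)
        total += a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))
    return total

# Hypothetical decision matrix: rows = alternatives, columns = membership
# grades of each alternative under three criteria.
alts = {"A1": [0.7, 0.5, 0.8], "A2": [0.4, 0.9, 0.6], "A3": [0.6, 0.6, 0.7]}

# Positive-ideal alternative: best (largest) membership per criterion.
ideal = [max(col) for col in zip(*alts.values())]

# Rank alternatives by increasing divergence from the ideal.
ranking = sorted(alts, key=lambda k: fuzzy_div(alts[k], ideal))
print(ranking)  # least divergent (most preferred) alternative first
```

The same ranking scheme works with any valid divergence measure in place of `fuzzy_div`; only the scoring function changes.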
Communicated by Marcos Eduardo Valle.
Appendix: Proof of properties
Proof
Consider the sets
and
Using the notations given in the ‘Preliminaries’ section, i.e.,
In set \(X_1\),
In set \(X_2\),
(i) Consider
Dividing X into \(X_1\) and \(X_2\), we get
Using (8.3) and (8.4) and simplifying, we get
Therefore,
Hence, \(I_{R'}^{S'}(P\cup Q, P)+I_{R'}^{S'}(P \cap Q, P)=I_{R'}^{S'} (Q, P)\).
(ii) We have to prove \(I_{R'}^{S'} (P \cup Q, C)+I_{R'}^{S'} (P \cap Q, C)=I_{R'}^{S'} (P, C)+I_{R'}^{S'} (Q, C)\).
For this, consider
Dividing X into \(X_1\) and \(X_2\), we get
Using (8.3) and (8.4) and simplifying, we get
\(=I_{R'}^{S'} (P, C)+I_{R'}^{S'} (Q, C)\), which proves (ii).
(iii) We have to prove that \(I_{R'}^{S'}(\overline{P\cup Q},\overline{P\cap Q})=I_{R'}^{S'}(\bar{P}\cap \bar{Q}, \bar{P}\cup \bar{Q})\).
To prove this, consider
Using definition (3.2), we get
Bifurcating X into \(X_1\) and \(X_2\) and applying (8.3) and (8.4), we get
Now consider,
Dividing X into \(X_1\) and \(X_2\) and simplifying using (8.3) and (8.4), we get
Using definition (3.2), we get
\(=I_{R'}^{S'}(\overline{P \cup Q}, \overline{P \cap Q})\).
Therefore, (iii) holds.
(iv) Consider
Using notations in definition (3.2), we get
\(=I_{R'}^{S'}(\bar{P}, P)\).
(v) Consider
Using notations in definition (3.2) and simplifying, we get
\(=I_{R'}^{S'} (P, Q)\).
(vi) Consider
Using notations in definition (3.2) and simplifying, we get
\(=I_{R'}^{S'}(\bar{P}, Q)\).
(vii) It follows directly from (ii) and (iii).
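Property (i) can also be checked numerically: at each element of \(X\), one of the two terms on the left compares \(P\) with itself and vanishes, so the sum collapses to the right-hand side. The sketch below verifies this with the classical Bhandari–Pal fuzzy divergence as a stand-in (the \((R',S')\)-norm formula itself is not reproduced in this excerpt); the membership values of `P` and `Q` are arbitrary illustrative choices.

```python
import math

def d(a, b, eps=1e-12):
    """Pointwise Bhandari-Pal divergence term; satisfies d(a, a) == 0,
    which is the only structural fact the identity in (i) relies on."""
    a = min(max(a, eps), 1 - eps)
    b = min(max(b, eps), 1 - eps)
    return a * math.log(a / b) + (1 - a) * math.log((1 - a) / (1 - b))

def D(P, Q):
    """Fuzzy directed divergence: sum of pointwise terms over X."""
    return sum(d(p, q) for p, q in zip(P, Q))

# Membership grades of two fuzzy sets on a 4-element universe X.
P = [0.2, 0.7, 0.5, 0.9]
Q = [0.6, 0.3, 0.5, 0.4]

# Standard max/min membership convention for union and intersection,
# matching the split of X into X1 (p >= q) and X2 (p < q) in the proof.
P_union_Q = [max(p, q) for p, q in zip(P, Q)]
P_inter_Q = [min(p, q) for p, q in zip(P, Q)]

# Property (i): D(P∪Q, P) + D(P∩Q, P) == D(Q, P).
lhs = D(P_union_Q, P) + D(P_inter_Q, P)
rhs = D(Q, P)
print(abs(lhs - rhs) < 1e-9)  # True
```

The check passes for any membership values, since at each point exactly one of \(\max(p,q)\) and \(\min(p,q)\) equals \(p\) and contributes zero.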
Cite this article
Joshi, R., Kumar, S. An \((R',S')\)-norm fuzzy relative information measure and its applications in strategic decision-making. Comp. Appl. Math. 37, 4518–4543 (2018). https://doi.org/10.1007/s40314-018-0582-x
Keywords
- \((R', S')\)-Norm fuzzy information measure
- \((R', S')\)-Norm fuzzy directed divergence
- Fuzzy TOPSIS
- Fuzzy MOORA
- Monotonicity