Mervat Mahdy, Dina S. Eltelbany, Hoda Mohammed
Department of Statistics, Mathematics and Insurance, College of Commerce, Benha University, Egypt.
HN-Entropy: A New Measure of Information and Its Properties
Abstract
Entropy measures the amount of uncertainty and dispersion of an unknown or random quantity. The concept was first introduced by Shannon (1948) and is important in many areas: in information theory, entropy measures the amount of information carried by each received message; in physics, it is the basic concept that quantifies the disorder of a thermodynamic system. In this paper, we introduce an alternative measure of entropy, called HN-entropy. Unlike Shannon entropy, the proposed measure is of order α and β, which makes it more flexible. We then introduce the cumulative residual HN-entropy, the cumulative HN-entropy, and their weighted versions. Finally, a comparison between Shannon entropy and HN-entropy, together with numerical results, is presented.
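The HN-entropy itself is defined later in the paper; the baseline measure it generalizes is the Shannon (1948) entropy mentioned above, which can be sketched as follows. This is a minimal illustrative Python implementation (the function name and the input validation are our own choices, not from the paper):

```python
import math

def shannon_entropy(probs, base=math.e):
    """Shannon (1948) entropy H = -sum_i p_i * log(p_i).

    `probs` is a discrete probability distribution; terms with
    p_i = 0 are skipped, following the convention 0 * log 0 = 0.
    """
    if any(p < 0 for p in probs) or abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probs must form a probability distribution")
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A uniform distribution carries maximal uncertainty: four equally
# likely outcomes give exactly 2 bits of entropy.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25], base=2))  # → 2.0
```

A degenerate distribution (one outcome with probability 1) gives entropy 0, the minimum: there is no uncertainty to measure.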