𝑯𝑩-Entropy: A New Measure of Information and Its Properties

Mervat Mahdy, Dina S. Eltelbany, Hoda Mohammed
Department of Statistics, Mathematics and Insurance, College of Commerce, Benha University, Egypt.
Abstract
Entropy measures the amount of uncertainty and dispersion in an unknown or random quantity. The concept was first introduced by Shannon (1948) and is important in many areas: in information theory, entropy measures the amount of information carried by each received message; in physics, it is the basic concept quantifying the disorder of a thermodynamic system. In this paper, we introduce an alternative measure of entropy, called 𝑯𝑩-entropy. Unlike Shannon entropy, this proposed measure of order α and β is more flexible. We then introduce the cumulative residual 𝑯𝑩-entropy, the cumulative 𝑯𝑩-entropy, and their weighted versions. Finally, we present a comparison between Shannon entropy and 𝑯𝑩-entropy together with numerical results.
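As background for the comparison the abstract mentions, the classical Shannon (1948) entropy of a discrete distribution is H(p) = -Σᵢ pᵢ log pᵢ. The sketch below computes only this baseline Shannon measure; the proposed 𝑯𝑩-entropy is not defined in this excerpt, so no attempt is made to implement it, and the function name `shannon_entropy` is our own illustrative choice.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon (1948) entropy of a discrete distribution.

    probs: probabilities summing to 1; base=2 gives bits, base=math.e gives nats.
    Terms with p = 0 contribute 0 by the convention 0*log(0) = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Uniform distribution over 4 outcomes: maximal uncertainty, H = log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Degenerate distribution concentrated on one outcome: no uncertainty, H = 0.
print(shannon_entropy([1.0]))  # 0.0
```

The uniform case attains the maximum entropy for a fixed number of outcomes, while the degenerate case attains the minimum; flexible generalizations of order α and β, such as the measure proposed here, interpolate between behaviors that a single-parameter-free measure like Shannon's cannot capture.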