๐‘ฏ๐‘ต- Entropy: A New Measureof Information And Its Properties

Authors

Mervat Mahdy, Dina S. Eltelbany, Hoda Mohammed
Department of Statistics, Mathematics and Insurance, College of Commerce, Benha University, Egypt.

Abstract

Entropy measures the amount of uncertainty and dispersion of an unknown or random quantity. The concept was first introduced by Shannon (1948) and is important in many areas. In information theory, entropy measures the amount of information in each message received; in physics, it is the basic concept that measures the disorder of a thermodynamical system; and so on. In this paper, we introduce an alternative measure of entropy, called HN-entropy. Unlike Shannon entropy, this proposed measure of order α and β is more flexible. We then introduce the cumulative residual HN-entropy, the cumulative HN-entropy, and their weighted versions. Finally, a comparison between Shannon entropy and HN-entropy, together with numerical results, is presented.
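Since the abstract takes Shannon (1948) entropy as the baseline against which the proposed measure is compared, a minimal sketch of that baseline may help. The following computes the classical Shannon entropy H = -Σ p log p of a discrete distribution; the function name and the choice of logarithm base are illustrative, not part of the paper.

```python
import math

def shannon_entropy(probs, base=math.e):
    """Shannon (1948) entropy of a discrete distribution.

    H = -sum(p * log p), with the convention 0 * log 0 = 0,
    which is why zero-probability terms are skipped below.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin attains the maximum entropy for two outcomes: log(2),
# i.e. exactly 1 bit when the logarithm is taken in base 2.
print(shannon_entropy([0.5, 0.5], base=2))

# A degenerate (certain) outcome carries no uncertainty at all.
print(shannon_entropy([1.0]))
```

A two-parameter generalization such as the paper's HN-entropy of order α and β would reduce to a quantity like this in an appropriate limit; the exact definition is given in the body of the paper.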