ABSTRACT:
In information theory, entropy is regarded as a measure of the uncertainty in a random variable. The term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message and is generally measured in bits, nats, or bans. Shannon entropy thus captures the average unpredictability of a random variable, which is equivalent to its information content. In this paper, some new (h,φ)-entropies are constructed using a utility function. In addition, in the last part of the paper, ten new functions are computed from classical entropies, which can further be adapted in the field of entropies. This suggests scope for further research, which can be pursued by applying these new functions to moment entropies.
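For reference, the Shannon entropy of a discrete distribution P = (p_1, ..., p_n) and the general (h,φ)-entropy family are commonly written as below; the (h,φ) form shown is the standard Salicrú-type formulation and is given here only as a plausible reading of the construction the paper generalizes, not necessarily the exact definition used in it:

H(P) = -\sum_{i=1}^{n} p_i \log_b p_i, \qquad b = 2 \text{ (bits)},\ e \text{ (nats)},\ 10 \text{ (bans)}

H^{h}_{\varphi}(P) = h\!\left( \sum_{i=1}^{n} \varphi(p_i) \right)

where, in the usual formulation, h is increasing and φ concave (or h decreasing and φ convex), so that H^{h}_{\varphi} recovers Shannon entropy for h(x) = x, φ(t) = -t log t, and other classical entropies for suitable choices of h and φ.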
Cite this article:
Anita Pruthi. Computation of Generalized (h,φ)-Entropies & Moment Entropies for Markov Chains. Research J. Science and Tech 5(1): Jan.-Mar. 2013; 173-178.
Cite (Electronic):
Anita Pruthi. Computation of Generalized (h,φ)-Entropies & Moment Entropies for Markov Chains. Research J. Science and Tech 5(1): Jan.-Mar. 2013; 173-178. Available on: https://rjstonline.com/AbstractView.aspx?PID=2013-5-1-27