Author(s): Anita Pruthi


DOI: Not Available

Address: Anita Pruthi,
PG Department of Mathematics, DAV College, Abohar
*Corresponding Author

Published In: Volume 5, Issue 1, Year 2013

Entropy, in information theory, is a measure of the uncertainty in a random variable. In this context, the term usually refers to Shannon entropy, which quantifies the average information content of a message. Entropy is generally measured in bits, nats, or bans. Shannon entropy is the average unpredictability of a random variable, which is equivalent to its information content. In this paper, some new (h,φ)-entropies are constructed using a utility function. In addition, in the last part of this paper, ten new functions are computed using classical entropies, which can be further adapted in the field of entropies. This suggests scope for further research, which can be pursued by applying these new functions to moment entropies.
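The Shannon entropy described above can be illustrated with a minimal sketch (the function name and example distributions are illustrative, not taken from the paper); choosing base 2 for the logarithm yields bits, while the natural logarithm would yield nats:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(X) = -sum(p * log(p)); base 2 gives bits."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally unpredictable for two outcomes: 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, hence lower entropy.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

As the example shows, entropy is largest for the uniform distribution and shrinks as the outcome becomes more predictable, which is the sense in which it measures average unpredictability.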

Cite this article:
Anita Pruthi. Computation of Generalized (h,φ)-Entropies & Moment Entropies for Markov Chains. Research J. Science and Tech 5(1): Jan.-Mar. 2013; page 173-178.
