Characterization of New Two Parametric Generalized Useful Information Measure

  • Bhat, Ashiq Hussain (Post Graduate Department of Statistics, University of Kashmir);
  • Baig, M. A. K. (Post Graduate Department of Statistics, University of Kashmir)
  • Received: 2016.07.03
  • Accepted: 2016.11.28
  • Published: 2016.12.30

Abstract

In this paper we define a new two-parametric generalized useful average codeword length $L_{\alpha}^{\beta}(P;U)$ and discuss its relationship with the new two-parametric generalized useful information measure $H_{\alpha}^{\beta}(P;U)$. The lower and upper bounds of $L_{\alpha}^{\beta}(P;U)$ in terms of $H_{\alpha}^{\beta}(P;U)$ are derived for a discrete noiseless channel. The measures defined in this communication are not only new; some well-known measures that already exist in the literature of useful information theory arise as particular cases of our proposed measures. The noiseless coding theorems for the discrete channel proved in this paper are verified by applying Huffman and Shannon-Fano coding schemes to empirical data. We also study the monotonic behavior of $H_{\alpha}^{\beta}(P;U)$ with respect to the parameters ${\alpha}$ and ${\beta}$, and the important properties of $H_{\alpha}^{\beta}(P;U)$ are studied as well.

Keywords

1. INTRODUCTION / LITERATURE REVIEW

The growth of telecommunication in the early twentieth century led several researchers to study the information content of signals; the seminal work of Shannon (1948), building on papers by Nyquist (1924, 1928) and Hartley (1928), rationalized these early efforts into a coherent mathematical theory of communication and initiated the area of research now known as information theory. The central paradigm of classical information theory is the engineering problem of the transmission of information over a noisy channel. The most fundamental results of this theory are Shannon's source coding theorem, which establishes that, on average, the number of bits needed to represent the result of an uncertain event is given by its entropy, and Shannon's noisy-channel coding theorem, which states that reliable communication is possible over noisy channels provided the rate of communication is below a certain threshold, called the channel capacity. Information theory is a broad and deep mathematical theory with equally broad and deep applications, amongst which is the vital field of coding theory. It is a relatively new branch of probability and statistics with extensive potential application to communication systems. The term information theory does not possess a unique definition. Broadly speaking, information theory deals with the study of problems concerning any system; this includes information processing, information storage, and decision making. In a narrow sense, information theory studies all theoretical problems connected with the transmission of information over communication channels, including the study of uncertainty (information) measures and various practical and economical methods of coding information for transmission.

 

It is a well-known fact that information measures are important for practical applications of information processing. For measuring information, a general approach is provided in a statistical framework based on the information entropy introduced by Shannon (1948) as a measure of information. The Shannon entropy satisfies some desirable axiomatic requirements and can also be assigned operational significance in important practical problems, for instance in coding and telecommunication. In coding theory, we usually come across the problem of efficiently coding messages to be sent over a noiseless channel, where our concern is to maximize the number of messages that can be sent through the channel in a given time. We therefore seek the minimum value of a mean codeword length subject to a given constraint on the codeword lengths. As the codeword lengths are integers, the minimum value lies between two bounds, so a noiseless coding theorem seeks these bounds, which are in terms of some measure of entropy for a given mean and a given constraint. Shannon (1948) found the lower bound for the arithmetic mean codeword length by using his own entropy. Campbell (1965) defined his own exponentiated mean and, by applying Kraft's (1949) inequality, found lower bounds for his mean in terms of Renyi's (1961) measure of entropy. Longo (1976) developed a lower bound for a useful mean codeword length in terms of the weighted entropy introduced by Belis and Guiasu (1968). Guiasu and Picard (1971) proved a noiseless coding theorem by obtaining lower bounds for another useful mean codeword length. Gurdial and Pessoa (1977) extended the theorem by finding lower bounds for useful mean codeword lengths of order α; various authors such as Jain and Tuteja (1989), Taneja et al. (1985), Hooda and Bhaker (1997), and Khan et al. (2005) have also studied generalized coding theorems by considering different generalized 'useful' information measures under the condition of unique decipherability. A small worked illustration of the length-minimization problem is sketched below.
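The minimization just described is solved exactly by binary Huffman coding, one of the schemes the paper later uses for verification. The following Python sketch is our own illustration with hypothetical probabilities, not code from the paper; it computes Huffman codeword lengths and the resulting mean codeword length.

```python
import heapq

def huffman_lengths(probs):
    """Binary Huffman codeword lengths for a list of probabilities (illustrative)."""
    # Each heap item: (probability, tie-breaker, indices of symbols in this subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, t, s2 = heapq.heappop(heap)
        for s in s1 + s2:          # every merge adds one bit to all symbols below it
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

p = [0.4, 0.3, 0.2, 0.1]           # hypothetical source probabilities
l = huffman_lengths(p)
print(l, sum(pi * li for pi, li in zip(p, l)))   # lengths and mean codeword length
```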

In this paper we define a new two-parametric generalized useful average codeword length \(L_α^β(P;U)\) and discuss its relationship with the new two-parametric generalized useful information measure \(H_α^β(P;U)\). The lower and upper bounds of \(L_α^β(P;U)\) in terms of \(H_α^β(P;U)\) are derived for a discrete noiseless channel in Section 3. The measures defined in this communication are not only new but also generalizations of certain well-known measures in the literature of useful information theory. In Section 4, the noiseless coding theorems for discrete channels proved in this paper are verified by considering Huffman and Shannon-Fano coding schemes using empirical data. In Section 5, we study the monotonic behavior of \(H_α^β(P;U)\) with respect to the parameters \(α\) and \(β\). Several other properties of \(H_α^β(P;U)\) are studied in Section 6.

 

2. BASIC CONCEPTS 

Let \(X\) be a finite discrete random variable or finite source taking values \(x_1, x_2, ..., x_n\) with respective probabilities \(P=(p_1, p_2, ..., p_n)\), \(p_i \ge 0\ \forall\ i=1,2,...,n\) and \(\sum_{i=1}^n p_i=1\). Shannon (1948) gave the following measure of information and called it entropy:

\(H(P)=-\sum_{i=1}^n p_i \log\,p_i\)       (1.1)

The measure (1.1) serves as a suitable measure of entropy. Let \(p_1, p_2, p_3, ..., p_n\) be the probabilities of n codewords to be transmitted and let their lengths \(l_1, l_2, ..., l_n\) satisfy Kraft's (1949) inequality,

\(\sum_{i=1}^n D^{-l_i} \le1\)                        (1.2)

For uniquely decipherable codes, Shannon (1948) showed that for all codes satisfying (1.2), the minimum value of the mean codeword length,

\(L(P)=\sum_{i=1}^n p_il_i \)                       (1.3)

lies between \(H(P)\) and \(H(P)+1\), where \(D\) is the size of the code alphabet.
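As a quick illustration of these bounds, the following Python sketch (our own, with a hypothetical distribution, not taken from the paper) builds Shannon code lengths \(l_i=\lceil -\log_D p_i\rceil\), checks Kraft's inequality (1.2), and verifies that the mean length (1.3) lies between \(H(P)\) and \(H(P)+1\).

```python
import math

D = 2                                    # size of the code alphabet (assumed binary)
p = [0.4, 0.3, 0.2, 0.1]                 # hypothetical source probabilities

# Shannon entropy (1.1), taken to base D to match the code alphabet
H = -sum(pi * math.log(pi, D) for pi in p)

# Shannon code lengths l_i = ceil(-log_D p_i), which are uniquely decipherable
l = [math.ceil(-math.log(pi, D)) for pi in p]

# Kraft inequality (1.2): sum_i D^(-l_i) <= 1
kraft = sum(D ** (-li) for li in l)

# Mean codeword length (1.3)
L = sum(pi * li for pi, li in zip(p, l))

print(f"H(P) = {H:.4f}, L(P) = {L:.4f}, Kraft sum = {kraft:.4f}")
assert kraft <= 1 and H <= L < H + 1     # noiseless coding bounds
```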

Shannon’s entropy (1.1) is indeed a measure of uncertainty and is treated as the information supplied by a probabilistic experiment. This formula gives the measure of information as a function of the probabilities of the various events only, without considering the effectiveness or importance of those events. Belis and Guiasu (1968) remarked that a source is not completely specified by the probability distribution \(P\) over the source alphabet \(X\) in the absence of a quality character. They enriched the usual description of the information source (i.e., a finite source alphabet and a finite probability distribution) by introducing an additional parameter measuring the utility attached to each event according to its importance or utility in view of the experimenter.

Let \(U=(u_1, u_2, ..., u_n)\) be a set of positive real numbers, where \(u_i\) is the utility or importance of outcome \(x_i\). The utility, in general, is independent of \(p_i\), the probability of occurrence of the source symbol \(x_i\). The information source is thus given by

\(S=\begin{bmatrix} x_1 & x_2 & \cdots & x_n \\ p_1 & p_2 & \cdots & p_n \\ u_1 & u_2 & \cdots & u_n \end{bmatrix},\quad u_i>0,\ p_i\ge0,\ \sum_{i=1}^n p_i=1 \)       (1.4)

We call (1.4) a Utility Information Scheme. Belis and Guiasu (1968) introduced the following quantitative-qualitative measure of information for this scheme:

\(H(P,U)=-\sum_{i=1}^n u_ip_i \log \,p_i\)                   (1.5)

and called it 'useful' entropy. The measure (1.5) can be taken as a satisfactory measure of the average quantity of 'valuable' or 'useful' information provided by the information source (1.4). Guiasu and Picard (1971) considered the problem of encoding the letters output by the source (1.4) by means of a single-letter prefix code whose codewords \(c_1, c_2, ..., c_n\) have lengths \(l_1, l_2, ..., l_n\) respectively and satisfy Kraft's inequality (1.2). They introduced the following quantity

\(L(P;U)=\frac{\sum_{i=1}^n u_ip_il_i }{\sum_{i=1}^n u_ip_i}\)    (1.6)

and called it the 'useful' mean length of the code. Further, they derived a lower bound for (1.6). Longo (1976) interpreted (1.6) as the average transmission cost of the letters \(x_i\) with probabilities \(p_i\) and utilities \(u_i\), gave some practical interpretations of this length, and derived bounds for the cost function (1.6) in terms of (1.5).
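For concreteness, the short sketch below (our own illustration; the probabilities, utilities, and code lengths are hypothetical) computes the 'useful' entropy (1.5) and the 'useful' mean codeword length (1.6) for a small source.

```python
import math

D = 2
p = [0.4, 0.3, 0.2, 0.1]                      # hypothetical source probabilities
u = [1.0, 2.0, 0.5, 3.0]                      # hypothetical utilities u_i > 0
l = [math.ceil(-math.log(pi, D)) for pi in p] # code lengths satisfying Kraft's inequality (1.2)

# Belis-Guiasu 'useful' entropy (1.5)
H_PU = -sum(ui * pi * math.log(pi, D) for ui, pi in zip(u, p))

# Guiasu-Picard 'useful' mean codeword length (1.6)
L_PU = sum(ui * pi * li for ui, pi, li in zip(u, p, l)) / sum(ui * pi for ui, pi in zip(u, p))

print(f"H(P,U) = {H_PU:.4f}, L(P;U) = {L_PU:.4f}")
```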

 

3. NOISELESS CODING THEOREMS FOR ‘USEFUL’ CODES 

We define a new two-parametric generalized useful information measure for the incomplete power distribution as:

\(H_α^β(P;U)=\frac{β}{1-α}\,\log_D [\frac{\sum_{i=1}^n u_ip_i^{αβ}}{\sum_{i=1}^n u_ip_i^β}],\)       (2.1)

where \(0<α<1,\ 0<β\le1,\ p_i\ge0\ \forall\ i=1,2,...,n,\ \sum_{i=1}^n p_i\le1 \).
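A minimal numerical sketch of (2.1) follows; the helper name H_alpha_beta and the probabilities, utilities, and parameter values are our own hypothetical choices, not part of the paper.

```python
import math

def H_alpha_beta(p, u, alpha, beta, D=2):
    """Two-parametric generalized 'useful' information measure (2.1)."""
    num = sum(ui * pi ** (alpha * beta) for ui, pi in zip(u, p))
    den = sum(ui * pi ** beta for ui, pi in zip(u, p))
    return (beta / (1.0 - alpha)) * math.log(num / den, D)

p = [0.4, 0.3, 0.2, 0.1]        # may also be an incomplete distribution with sum <= 1
u = [1.0, 2.0, 0.5, 3.0]        # hypothetical utilities
print(H_alpha_beta(p, u, alpha=0.7, beta=0.9))
```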

 

Remarks for (2.1)

Ⅰ. When \(β=1\), (2.1) reduces to the 'useful' information measure studied by Taneja, Hooda, and Tuteja (1985), i.e.,

\(H_α(P;U)=\frac{1}{1-α}\,\log_D\left[\frac{\sum_{i=1}^n u_ip_i^α}{\sum_{i=1}^n u_ip_i}\right]\)       (2.2)

Ⅱ. When \(β=1\) and \(u_i=1\ \forall\ i=1,2,...,n\), i.e., when the utility aspect is ignored, so that \(\sum_{i=1}^n u_ip_i=1\), (2.1) reduces to Renyi's (1961) entropy, i.e.,

\(H_α(P)=\frac{1}{1-α}\,\log_D\left[\sum_{i=1}^n p_i^α\right]\)                                       (2.3)
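The reductions in Remarks I and II can be checked numerically; the self-contained sketch below reuses the hypothetical H_alpha_beta helper and values from the previous sketch.

```python
import math

def H_alpha_beta(p, u, alpha, beta, D=2):
    # generalized 'useful' measure (2.1), as sketched above
    num = sum(ui * pi ** (alpha * beta) for ui, pi in zip(u, p))
    den = sum(ui * pi ** beta for ui, pi in zip(u, p))
    return (beta / (1.0 - alpha)) * math.log(num / den, D)

p, u, alpha = [0.4, 0.3, 0.2, 0.1], [1.0, 2.0, 0.5, 3.0], 0.7   # hypothetical values

# Remark I: beta = 1 recovers the Taneja-Hooda-Tuteja measure (2.2)
H_22 = (1 / (1 - alpha)) * math.log(
    sum(ui * pi ** alpha for ui, pi in zip(u, p)) / sum(ui * pi for ui, pi in zip(u, p)), 2)
assert abs(H_alpha_beta(p, u, alpha, beta=1.0) - H_22) < 1e-12

# Remark II: beta = 1 and u_i = 1 (utility ignored, sum p_i = 1) recovers Renyi's entropy (2.3)
H_renyi = (1 / (1 - alpha)) * math.log(sum(pi ** alpha for pi in p), 2)
assert abs(H_alpha_beta(p, [1.0] * len(p), alpha, beta=1.0) - H_renyi) < 1e-12
```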

 

Ⅲ. When \(β=1\), \(u_i=1\ \forall\ i=1,2,...,n\), i.e., when the utility aspect is ignored, \(\sum_{i=1}^n p_i=1\), \(α→1\), and \(p_i=\frac{1}{n}\ \forall\ i=1,2,...,n\), the measure (2.1) reduces to the useful information measure for the incomplete distribution due to Bhaker and Hooda (1993), i.e.,

References

  1. Belis, M., & Guiasu, S. (1968). A quantitative-qualitative measure of information in cybernetic systems. IEEE Transactions on Information Theory, 14, 593-594. https://doi.org/10.1109/TIT.1968.1054185
  2. Bhaker, U. S., & Hooda, D. S. (1993). Mean value characterization of 'useful' information measure. Tamkang Journal of Mathematics, 24, 283-294.
  3. Bhat, A. H., & Baig, M. A. K. (2016). Some coding theorems on generalized Renyi's entropy of order $\alpha$ and type $\beta$. International Journal of Applied Mathematics and Information Sciences Letters, 5, 1-5. https://doi.org/10.18576/isl/050101
  4. Campbell, L. L. (1965). A coding theorem and Renyi's entropy. Information and Control, 8, 423-429. https://doi.org/10.1016/S0019-9958(65)90332-3
  5. Guiasu, S., & Picard, C. F. (1971). Borne inférieure de la longueur de certains codes. Comptes Rendus de l'Académie des Sciences, Paris, t. 273, 248-251.
  6. Gurdial, & Pessoa, F. (1977). On useful information of order $\alpha$. Journal of Combinatorics, Information and System Sciences, 2, 158-162.
  7. Hartley, R. V. L. (1928). Transmission of information. Bell System Technical Journal, 7, 535-563. https://doi.org/10.1002/j.1538-7305.1928.tb01236.x
  8. Hooda, D. S., & Bhaker, U. S. (1997). A generalized 'useful' information measure and coding theorems. Soochow Journal of Mathematics, 23, 53-62.
  9. Jain, P., & Tuteja, R. K. (1989). On coding theorem connected with 'useful' entropy of order $\beta$. International Journal of Mathematics and Mathematical Sciences, 12, 193-198. https://doi.org/10.1155/S0161171289000232
  10. Khan, A. B., Bhat, B. A., & Pirzada, S. (2005). Some results on a generalized 'useful' information measure. Journal of Inequalities in Pure and Applied Mathematics, 6, 117.
  11. Kraft, L. G. (1949). A device for quantizing, grouping, and coding amplitude-modulated pulses (M.S. thesis). Department of Electrical Engineering, MIT, Cambridge.
  12. Longo, G. (1976). A noiseless coding theorem for sources having utilities. SIAM Journal on Applied Mathematics, 30, 739-748. https://doi.org/10.1137/0130067
  13. Mitter, J., & Mathur, Y. D. (1972). Comparison of entropies of power distributions. ZAMM, 52, 239-240.
  14. Nyquist, H. (1924). Certain factors affecting telegraph speed. Bell System Technical Journal, 3, 324-346. https://doi.org/10.1002/j.1538-7305.1924.tb01361.x
  15. Nyquist, H. (1928). Certain topics in telegraph transmission theory. Journal of the American Institute of Electrical Engineers, 47, 617. https://doi.org/10.1109/T-AIEE.1928.5055024
  16. Renyi, A. (1961). On measures of entropy and information. Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, 1, 547-561.
  17. Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379-423, 623-656.
  18. Sharma, B. D., Mohan, M., & Mitter, J. (1978). On measures of 'useful' information. Information and Control, 39, 323-336.
  19. Taneja, H. C., Hooda, D. S., & Tuteja, R. K. (1985). Coding theorems on a generalized 'useful' information. Soochow Journal of Mathematics, 11, 123-131.