
Economic Affairs
Year: 2014, Volume: 59, Issue: 4
First page: 505, Last page: 514
Print ISSN: 0424-2513, Online ISSN: 0976-4666
Article DOI: 10.5958/0976-4666.2014.00019.9

An economic analysis of input structure in context to information inaccuracy, improvement and predictions

 Archana, Dwivedi Sudhakar*, Khan A. B., Sharma Manish Kr., Singh Harminder

Division of Agricultural Economics and Statistics, Sher-e-Kashmir University of Agricultural Sciences and Technology of Jammu (SKUAST-J), Main Campus, Chatha, Jammu (J&K)-180009, India

*Corresponding author: dwivedi.sudhakar@gmail.com

Published online on 5 January, 2015.

Abstract

During the last six decades, information theory has attracted researchers worldwide, and its literature has been growing by leaps and bounds; some of its terminology has even become part of everyday language. Every probability distribution has some uncertainty associated with it, and the concept of ‘entropy’ is introduced here to provide a quantitative measure of this uncertainty. Different approaches to the measurement of entropy and its development are considered, viz: (1) an axiomatic approach, (2) measures of entropy via measures of inaccuracy and directed divergence, and (3) information measures and the coding theorem. Hypothetical data for the agricultural, fisheries and forestry sectors were framed for each of nine years, with all inputs bought by the fisheries and forestry sectors supplied by other firms of the same sector. It was worked out that the smaller the distance of the probability distribution P from Q, the greater the uncertainty and the greater the entropy; the directed divergence D(P‖Q) itself is always non-negative and vanishes if and only if P = Q.
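For reference, the standard definitions behind this discussion are the Shannon entropy and the Kullback–Leibler directed divergence (a textbook formulation, supplied here for clarity; the paper's own notation may differ):

\[
H(P) = -\sum_{i=1}^{n} p_i \log p_i, \qquad
D(P\|Q) = \sum_{i=1}^{n} p_i \log \frac{p_i}{q_i}.
\]

When Q is the uniform distribution U = (1/n, …, 1/n), the two are linked by H(P) = log n − D(P‖U), which is why a smaller divergence from the uniform distribution corresponds to greater uncertainty.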

Starting from the Shannon entropy, the D.F. Kerridge inaccuracy is calculated in the same way as the measures based on the Kullback–Leibler measure of relative information. As the probabilities become more nearly equal, that is, as the probability distribution P comes closer to the uniform distribution Q, D(P‖Q) becomes smaller and smaller and H(P) grows larger and larger, until H(P) approaches log n as P approaches Q.
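The Kerridge inaccuracy admits a standard decomposition into entropy plus directed divergence (again a textbook identity, stated for reference rather than reproduced from the paper):

\[
K(P;Q) = -\sum_{i=1}^{n} p_i \log q_i = H(P) + D(P\|Q),
\]

so the inaccuracy of asserting Q when P is the true distribution reduces to the Shannon entropy H(P) exactly when D(P‖Q) vanishes, i.e. when P = Q, consistent with the limiting behaviour described above.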


Keywords

Entropy, probability, inaccuracy, predictions, coding.


  