Nat (unit)
The natural unit of information (symbol: nat),[1] sometimes also nit or nepit, is a unit of information or information entropy, based on natural logarithms and powers of e rather than the powers of 2 and base-2 logarithms that define the shannon. This unit is also known by its unit symbol, the nat. One nat is the information content of an event when the probability of that event occurring is 1/e.

One nat is equal to 1/(ln 2) shannons ≈ 1.44 Sh or, equivalently, 1/(ln 10) hartleys ≈ 0.434 Hart.[1]
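
To make these conversions concrete, here is a minimal sketch in Python (the function names are illustrative, not from any standard library):

    import math

    def information_content_nats(p: float) -> float:
        """Self-information -ln(p) of an event with probability p, in nats."""
        return -math.log(p)

    def nats_to_shannons(nats: float) -> float:
        return nats / math.log(2)    # 1 nat = 1/(ln 2) Sh, about 1.4427 Sh

    def nats_to_hartleys(nats: float) -> float:
        return nats / math.log(10)   # 1 nat = 1/(ln 10) Hart, about 0.4343 Hart

    # An event with probability 1/e carries exactly one nat:
    print(information_content_nats(1 / math.e))   # 1.0
    print(nats_to_shannons(1.0))                  # about 1.4427
    print(nats_to_hartleys(1.0))                  # about 0.4343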

History

Boulton and Wallace used the term nit in conjunction with minimum message length,[2] which was subsequently changed by the minimum description length community to nat to avoid confusion with the nit used as a unit of luminance.[3]

Alan Turing used the natural ban.[4]

Entropy

Shannon entropy (information entropy), being the expected value of the information of an event, is inherently a quantity of the same type and with a unit of information. The International System of Units, by assigning the same unit (joule per kelvin) both to heat capacity and to thermodynamic entropy, implicitly treats information entropy as a quantity of dimension one, with 1 nat = 1.[a] Systems of natural units that normalize the Boltzmann constant to 1 are effectively measuring thermodynamic entropy with the nat as unit.

When the Shannon entropy is written using a natural logarithm,
H = -\sum_i p_i \ln p_i
it is implicitly giving a number measured in nats.
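
For instance, a minimal sketch in Python (the function name is illustrative) that evaluates this formula directly, so the result is in nats; dividing by ln 2 converts it to shannons:

    import math

    def entropy_nats(probs):
        """Shannon entropy H = -sum(p * ln p) in nats; terms with p = 0 contribute 0."""
        return -sum(p * math.log(p) for p in probs if p > 0)

    # A fair coin: H = ln 2 nats, i.e. exactly 1 shannon (1 bit).
    fair_coin = [0.5, 0.5]
    h = entropy_nats(fair_coin)
    print(h)                 # about 0.6931 (= ln 2)
    print(h / math.log(2))   # 1.0 shannon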

Notes

[a] This implicitly also makes the nat the coherent unit of information in the SI.

References

[1] IEC 80000-13:2008. International Electrotechnical Commission. www.iso.org/iso/catalogue_detail?csnumber=31898. Retrieved 21 July 2013.
[2] Boulton, D. M.; Wallace, C. S. (1970). "A program for numerical classification". Computer Journal, 13(1), 63–69. doi:10.1093/comjnl/13.1.63.
[3] Comley, J. W.; Dowe, D. L. (2005). "Minimum Message Length, MDL and Generalised Bayesian Networks with Asymmetric Languages", sec. 11.4.1, p. 271. In Grünwald, P.; Myung, I. J.; Pitt, M. A. (eds.), Advances in Minimum Description Length: Theory and Applications. Cambridge, MA: MIT Press. ISBN 0-262-07262-9.
[4] Hodges, Andrew (1983). Alan Turing: The Enigma. New York: Simon & Schuster. ISBN 0-671-49207-1.

Further reading

  • Reza, Fazlollah M. (1994). An Introduction to Information Theory. New York: Dover. ISBN 0-486-68210-2.

