Category:Information theory


The processing of information using applied mathematics and electrical engineering

Subcategories

  • Audio coding - the translation of auditory information into digital code
  • Channel coding - code used to protect information over a channel by correcting errors resulting from noise or other interference
  • Codes - rules for converting one piece of information into another
  • Communication channels - a physical or logical connection between two points that allows for the exchange of an information signal
  • Decoding - translating a coded message back into its original language or form
  • Encoding - the process by which information from a source is changed into symbols to be communicated
  • Error compensation - the encoding or transmission of extra information or code to compensate for possible errors
  • Information entropy - the level of uncertainty associated with a random variable (often refers to the "Shannon entropy"; see the sketch after this list)
  • Mutual information - occasionally called transinformation, the quantity that measures the mutual dependence of two random variables
  • Rate-distortion - the trade-off between the number of bits per data sample to be stored or transmitted and the amount of distortion in that sample
  • Speech coding - the application of data compression to digital audio signals in order to encode speech
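
The entropy entry above can be made concrete with a short calculation. The following Python sketch is illustrative only and is not drawn from any page in this category; it computes the Shannon entropy H(X) = -Σ p(x) log2 p(x), in bits, for a discrete probability distribution.

    import math

    def shannon_entropy(probs):
        # Shannon entropy in bits: H(X) = -sum over x of p(x) * log2(p(x)).
        # Outcomes with p(x) = 0 contribute nothing, so they are skipped.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin carries one bit of uncertainty; a biased coin carries less.
    print(shannon_entropy([0.5, 0.5]))   # 1.0
    print(shannon_entropy([0.9, 0.1]))   # about 0.47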

Pages in category "Information theory"

The following 76 pages are in this category, out of 76 total.