Category:Information theory

From ETHW
Latest revision as of 16:16, 22 July 2014

The processing of information using applied mathematics and electrical engineering

Subcategories

  • Audio coding - the translation of auditory information into digital code
  • Channel coding - code used to protect information over a channel by correcting errors resulting from noise or other interference
  • Codes - rules for converting one piece of information into another
  • Communication channels - a physical or logical connection between two points that allows for the exchange of an information signal
  • Decoding - translating a coded message back into its original language or form
  • Encoding - the process by which information from a source is changed into symbols to be communicated
  • Error compensation - the encoding or transmission of extra information or code to compensate for possible errors
  • Information entropy - the level of uncertainty associated with a random variable (often refers to the "Shannon entropy")
  • Mutual information - occasionally called transinformation, the quantity that measures the mutual dependence of two random variables
  • Rate distortion theory - the branch of information theory which explains lossy data compression and which determines the minimal amount of entropy that should be communicated over a channel
  • Speech coding - the use of the data compression of digital audio signals to encode speech
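The information-entropy and mutual-information entries above have standard formulas: Shannon entropy is H(X) = -Σ p(x) log₂ p(x), and mutual information is I(X;Y) = Σ p(x,y) log₂ [p(x,y) / (p(x)p(y))]. The following minimal Python sketch computes both; the example distributions are illustrative assumptions, not taken from this page:

```python
import math
from collections import Counter

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits.

    `probs` is an iterable of probabilities; zero-probability
    outcomes contribute nothing and are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = sum over (x, y) of p(x,y) * log2(p(x,y) / (p(x)p(y))).

    `joint` maps (x, y) pairs to joint probabilities; the marginals
    p(x) and p(y) are derived by summing the joint distribution.
    """
    px, py = Counter(), Counter()
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# A fair coin carries exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))                      # 1.0

# Two perfectly correlated fair coins: I(X;Y) = H(X) = 1 bit.
print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))   # 1.0
```

As the second example shows, mutual information equals the full entropy of one variable when the other determines it completely, and drops to 0 when the two variables are independent.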

Pages in category "Information theory"

The following 76 pages are in this category, out of 76 total.