Category:Information theory

From ETHW

The processing of information using applied mathematics and electrical engineering

Subcategories

  • Audio coding - the translation of auditory information into digital code
  • Channel coding - the use of codes to protect information sent over a channel by detecting and correcting errors caused by noise or other interference (a toy example follows this list)
  • Codes - rules for converting one piece of information into another
  • Communication channels - a physical or logical connection between two points that allows for the exchange of an information signal
  • Decoding - translating a coded message back into its original language or form
  • Encoding - the process by which information from a source is changed into symbols to be communicated
  • Error compensation - the encoding or transmission of extra information or code to compensate for possible errors
  • Information entropy - the level of uncertainty associated with a random variable (often referred to as "Shannon entropy")
  • Mutual information - occasionally called transinformation, the quantity that measures the mutual dependence of two random variables (a short worked sketch of entropy and mutual information follows this list)
  • Rate distortion theory - the branch of information theory that addresses lossy data compression, determining the minimum information rate that must be communicated over a channel so that the source can be reconstructed without exceeding a given distortion
  • Speech coding - the application of digital audio data compression to the encoding of speech
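
The entropy and mutual information entries above can be illustrated with a short calculation. The Python sketch below computes the Shannon entropy H(X) and the mutual information I(X;Y) for a small, invented joint distribution of two binary variables; the probabilities and function names are chosen for this illustration and are not taken from any page in this category.

  # Shannon entropy and mutual information for discrete random variables,
  # using an illustrative joint distribution p(x, y) invented for this sketch.
  from math import log2

  # Joint probabilities p(x, y) for two binary variables X and Y.
  joint = {
      (0, 0): 0.4, (0, 1): 0.1,
      (1, 0): 0.1, (1, 1): 0.4,
  }

  def marginal(joint, axis):
      """Marginal distribution of X (axis=0) or Y (axis=1)."""
      probs = {}
      for (x, y), p in joint.items():
          key = x if axis == 0 else y
          probs[key] = probs.get(key, 0.0) + p
      return probs

  def entropy(probs):
      """Shannon entropy H = -sum p * log2(p), in bits."""
      return -sum(p * log2(p) for p in probs.values() if p > 0)

  def mutual_information(joint):
      """I(X;Y) = sum over x,y of p(x,y) * log2(p(x,y) / (p(x) * p(y)))."""
      px = marginal(joint, 0)
      py = marginal(joint, 1)
      return sum(p * log2(p / (px[x] * py[y]))
                 for (x, y), p in joint.items() if p > 0)

  print(entropy(marginal(joint, 0)))  # H(X) = 1.0 bit
  print(mutual_information(joint))    # I(X;Y) is about 0.278 bits

For this particular distribution, X and Y each carry one bit of entropy, and observing one of them removes about 0.28 bits of uncertainty about the other.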

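The channel coding entry can likewise be illustrated with the simplest error-correcting scheme, a three-fold repetition code. The Python sketch below is a toy model rather than a description of any page in this category: each bit is transmitted three times over a noisy channel and decoded by majority vote, which corrects any single bit flip per codeword.

  # Toy channel coding example: 3-fold repetition code with majority-vote decoding.
  import random

  def encode(bits):
      """Repeat each bit three times."""
      return [b for b in bits for _ in range(3)]

  def noisy_channel(bits, flip_prob=0.05):
      """Flip each transmitted bit independently with probability flip_prob."""
      return [b ^ 1 if random.random() < flip_prob else b for b in bits]

  def decode(received):
      """Majority vote over each block of three received bits."""
      return [1 if sum(received[i:i + 3]) >= 2 else 0
              for i in range(0, len(received), 3)]

  message = [1, 0, 1, 1, 0, 0, 1, 0]
  received = noisy_channel(encode(message))
  print(decode(received) == message)  # usually True: single flips per block are corrected
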
Pages in category "Information theory"

The following 76 pages are in this category, out of 76 total.