enow.com Web Search

Search results

  1. Ralph Hartley - Wikipedia

    en.wikipedia.org/wiki/Ralph_Hartley

    Ralph Vinton Lyon Hartley (November 30, 1888 – May 1, 1970) was an American electronics researcher. He invented the Hartley oscillator and the Hartley transform, and contributed to the foundations of information theory.

  2. Information theory - Wikipedia

    en.wikipedia.org/wiki/Information_theory

    Ralph Hartley's 1928 paper, Transmission of Information, uses the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log S^n = n log S, where S was the number of possible symbols, and n the number of symbols in a transmission ...
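
    As a rough illustration of the formula in this result, a minimal Python sketch (the alphabet size and message length below are made-up example values, not figures from the article):

        import math

        def hartley_information(num_symbols: int, message_length: int) -> float:
            """H = log(S^n) = n * log(S); a base-10 log gives the answer in hartleys."""
            return message_length * math.log10(num_symbols)

        # Example: a 4-symbol message over a 10-symbol alphabet carries
        # H = 4 * log10(10) = 4 hartleys.
        print(hartley_information(num_symbols=10, message_length=4))  # 4.0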

  3. Shannon–Hartley theorem - Wikipedia

    en.wikipedia.org/wiki/Shannon–Hartley_theorem

    In information theory, the Shannon–Hartley theorem gives the maximum rate at which information can be transmitted over a communications channel of a specified bandwidth in the presence of noise. It is an application of the noisy-channel coding theorem to the archetypal case of a continuous-time analog communications channel subject to ...
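
    For reference, the capacity bound the theorem states is C = B * log2(1 + S/N). A minimal sketch (the bandwidth and signal-to-noise ratio below are illustrative assumptions, not values from the article):

        import math

        def shannon_hartley_capacity(bandwidth_hz: float, snr_linear: float) -> float:
            """C = B * log2(1 + S/N), in bits per second; SNR is a linear ratio, not dB."""
            return bandwidth_hz * math.log2(1 + snr_linear)

        # Example: a 3 kHz channel at 30 dB SNR (S/N = 1000) supports
        # roughly 3000 * log2(1001), about 29.9 kbit/s.
        print(shannon_hartley_capacity(3000, 1000))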

  4. History of information theory - Wikipedia

    en.wikipedia.org/wiki/History_of_information_theory

    Hartley's 1928 paper, called simply "Transmission of Information", went further by using the word information (in a technical sense), and making it explicitly clear that information in this context was a measurable quantity, reflecting only the receiver's ability to distinguish that one sequence of symbols had been intended by the sender rather ...

  5. Channel capacity - Wikipedia

    en.wikipedia.org/wiki/Channel_capacity

    Information-theoretic analysis of communication systems that incorporate feedback is more complicated and challenging than that of systems without feedback. Possibly, this was the reason C.E. Shannon chose feedback as the subject of the first Shannon Lecture, delivered at the 1973 IEEE International Symposium on Information Theory in Ashkelon, Israel.

  6. Data communication - Wikipedia

    en.wikipedia.org/wiki/Data_communication

    The fundamental theoretical work in data transmission and information theory by Harry Nyquist, Ralph Hartley, Claude Shannon, and others during the early 20th century was done with these applications in mind.

  7. Hartley (unit) - Wikipedia

    en.wikipedia.org/wiki/Hartley_(unit)

    The hartley (symbol Hart), also called a ban, or a dit (short for "decimal digit"),[1][2][3] is a logarithmic unit that measures information or entropy, based on base 10 logarithms and powers of 10. One hartley is the information content of an event if the probability of that event occurring is 1/10.[4]
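
    A small conversion sketch that follows directly from the base-10 definition above (the bit and nat equivalents differ only by the base of the logarithm):

        import math

        # 1 hartley = log2(10) bits, about 3.322 bits,
        # and ln(10) nats, about 2.303 nats.
        HARTLEY_IN_BITS = math.log2(10)
        HARTLEY_IN_NATS = math.log(10)

        print(HARTLEY_IN_BITS, HARTLEY_IN_NATS)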

  8. Hartley function - Wikipedia

    en.wikipedia.org/wiki/Hartley_function

    The Hartley function is a measure of uncertainty, introduced by Ralph Hartley in 1928. If a sample is picked uniformly at random from a finite set A, the information revealed once the outcome is known is given by the Hartley function.
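
    For reference, the Hartley function of a finite set A is H0(A) = log_b |A|, where the base b fixes the unit (base 10 gives hartleys, base 2 gives bits). A minimal sketch with an illustrative set size:

        import math

        def hartley_function(set_size: int, base: float = 10.0) -> float:
            """H0(A) = log_b |A| for a uniform random pick from a finite set A."""
            return math.log(set_size, base)

        # Example: a uniform pick from a 16-element set reveals
        # log2(16) = 4 bits, i.e. log10(16), about 1.2 hartleys.
        print(hartley_function(16, base=2))    # ~4.0
        print(hartley_function(16, base=10))   # ~1.204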