Zipf's law (/zɪf/) is an empirical law stating that when a list of measured values is sorted in decreasing order, the value of the n-th entry is often approximately inversely proportional to n. The best-known instance of Zipf's law applies to the frequency table of words in a text or corpus of natural language: the most frequent word occurs roughly twice as often as the second most frequent word, three times as often as the third, and so on.
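As a minimal sketch of that rank-frequency relation: if frequency is proportional to 1/rank, then rank times frequency should be roughly constant across the top ranks. The toy corpus below is constructed to follow the law exactly and is purely illustrative.

```python
from collections import Counter

def zipf_table(text, top=5):
    """Rank words by frequency and report rank * frequency products.

    Under Zipf's law, frequency ~ C / rank, so rank * frequency
    should stay roughly constant across the top ranks.
    """
    counts = Counter(text.lower().split())
    ranked = counts.most_common(top)
    return [(rank, word, freq, rank * freq)
            for rank, (word, freq) in enumerate(ranked, start=1)]

# Toy corpus whose counts follow 1/rank exactly: "a" x6, "b" x3, "c" x2
corpus = " ".join(["a"] * 6 + ["b"] * 3 + ["c"] * 2)
for rank, word, freq, product in zipf_table(corpus, top=3):
    print(rank, word, freq, product)  # product is 6 for every rank
```

On real text the products drift rather than staying exactly constant; the law is an approximation, not an identity.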
George Kingsley Zipf (/zɪf/ ZIFF;[1] January 7, 1902 – September 25, 1950) was an American linguist and philologist who studied statistical occurrences in different languages.[2] Zipf earned his bachelor's, master's, and doctoral degrees from Harvard University, although he also studied at the University of Bonn and the University ...
The brevity law appears to be universal and has also been observed acoustically when word size is measured in terms of word duration.[5] Evidence from 2016 suggests that it also holds in the acoustic communication of other primates.[6] [Figure: log per-million word count as a function of word length (number of characters) in the Brown Corpus, illustrating Zipf's brevity law.]
While Zipf's law works well in many cases, it tends not to fit the largest cities in many countries; one type of deviation is known as the King effect. A 2002 study found that Zipf's law was rejected in 53 of 73 countries, far more than would be expected by chance.[10]
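For city sizes, Zipf's law takes the form of the rank-size rule: the k-th largest city has roughly 1/k the population of the largest. A simple sketch of one way the King effect can be flagged is to compare the observed largest city with the size the rule would predict from the second city; the populations below are hypothetical, made-up numbers, not real data, and the ratio threshold is an illustrative choice.

```python
def zipf_predicted_sizes(largest, n):
    """Rank-size rule: predicted population of the k-th city is largest / k."""
    return [largest / k for k in range(1, n + 1)]

def king_effect_ratio(populations):
    """Ratio of the observed largest city to the rank-size prediction
    implied by the second city (2 * second). Values well above 1
    suggest an oversized 'primate' capital, i.e. a King effect."""
    ranked = sorted(populations, reverse=True)
    return ranked[0] / (2 * ranked[1])

# Hypothetical populations in millions (illustrative only)
balanced = [8.0, 4.0, 2.7, 2.0, 1.6]   # close to 8/k for k = 1..5
primate = [13.0, 4.0, 2.7, 2.0, 1.6]   # oversized top city
print(king_effect_ratio(balanced))  # 1.0: consistent with rank-size
print(king_effect_ratio(primate))   # 1.625: King-effect deviation
```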
Zipf's law states that, given some corpus of natural language utterances, the frequency of any word is inversely proportional to its rank in the frequency table. The law is named after George Kingsley Zipf, an early twentieth-century American linguist. Zipf popularized Zipf's law and sought to explain it, though he did not claim to have ...
It has been found that natural cities exhibit a striking Zipf's law.[9] Furthermore, the clustering method allows a direct assessment of Gibrat's law. The growth of agglomerations is found to be inconsistent with Gibrat's law: the mean and standard deviation of the growth rates of cities follow a power law with city size.[10]
However, the long tails characterizing distributions such as the Gutenberg–Richter law or Zipf's law for word occurrences, and those highlighted by Anderson and Shirky, are of very different, if not opposite, natures: Anderson and Shirky refer to frequency-rank relations, whereas the Gutenberg–Richter law and Zipf's law are probability ...
Benford's law is the observation that in many real-life datasets, the leading digit is likely to be small. For the unrelated adage, see Benford's law of controversy. [Figure: the distribution of first digits according to Benford's law; each bar represents a digit, and the height of the bar is the percentage of ...]
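Benford's law gives the probability of each leading digit d as P(d) = log10(1 + 1/d), so a leading 1 is expected about 30% of the time and a leading 9 under 5% of the time. A minimal sketch of the distribution:

```python
import math

def benford_pmf():
    """First-digit probabilities under Benford's law: P(d) = log10(1 + 1/d)."""
    return {d: math.log10(1 + 1 / d) for d in range(1, 10)}

pmf = benford_pmf()
print(round(pmf[1], 3))  # 0.301: about 30% of leading digits are 1
print(round(pmf[9], 3))  # 0.046: a leading 9 is rare
```

The nine probabilities telescope to log10(10/1) = 1, so they form a proper distribution over the digits 1 through 9.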