What is a Bit?
A bit (binary digit) is a computing term for the basic unit of information; the word is a contraction of "binary digit". A bit is a single digit in a binary number, a unit of measurement for the amount of information, and the smallest such unit. One bit is the information needed to halve the number of equally likely alternatives in a situation where a choice must be made; that is, the amount of information (in bits) carried by a signal equals the base-2 logarithm of the number of possible alternatives. Ralph Hartley argued in 1928 that logarithmic units were the most appropriate measure of the amount of information. [1]
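To make the logarithmic measure above concrete, here is a minimal sketch in Python; the helper name `bits_needed` is illustrative only and not part of any standard library beyond Python's `math` module.

```python
import math

def bits_needed(alternatives: int) -> float:
    """Amount of information, in bits, required to single out one of
    `alternatives` equally likely possibilities: log2(N)."""
    return math.log2(alternatives)

print(bits_needed(2))    # 1.0 bit  -> one yes/no question halves 2 options
print(bits_needed(8))    # 3.0 bits -> three halvings single out 1 of 8
print(bits_needed(256))  # 8.0 bits -> one byte distinguishes 256 values
```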
- Encoding data in discrete bits was used in the punched cards invented by Basile Bouchon and Jean-Baptiste Falcon (1732), developed by Joseph Marie Jacquard (1804), and later adopted by Semen Korsakov, Charles Babbage, Hermann Hollerith, and early computer manufacturers such as IBM. A variant of the same idea was perforated paper tape. In all of these systems, the medium (card or tape) conceptually carries an array of hole positions; each position can either be punched or not, and therefore carries one bit of information. Bit-encoded text was also used in Morse code (1844) and in early digital communication machines such as teletypes and stock ticker machines (1870).
- Ralph Hartley suggested the use of a logarithmic measure of information in 1928. Claude E. Shannon first used the word bit in his groundbreaking 1948 paper "A Mathematical Theory of Communication". He attributed its origin to John W. Tukey, who had written a Bell Labs memorandum on January 9, 1947 in which he shortened "binary information digit" to simply "bit". Interestingly, Vannevar Bush had written in 1936 of "bits of information" that could be stored on the punched cards used in the mechanical computers of that time. The first programmable computer, built by Konrad Zuse, used binary notation for numbers.
- The sampling frequency of a CD is 44.1 kHz. This specification is based on the Nyquist sampling theorem, which states that to convert an analog signal into a discrete-time signal, the sampling frequency must be at least twice the highest frequency of the original signal. The hearing limit of the human ear is about 20 kHz, so when Philips introduced the CD in 1982 it set the sampling rate to 44.1 kHz. Sampling is only the first step in converting an analog signal to a digital one; by itself it is still a coarse representation (a small numeric check of the Nyquist condition appears after this list).
- Certain bitwise processor instructions (such as bit set) operate at the level of individual bits, rather than on data interpreted as an aggregate of bits (a sketch of the usual shift-and-mask idiom follows this list).
- In the 1980s, when bitmapped computer displays became popular, some computers provided specialized bit-block transfer ("bitblt" or "blit") instructions to set or copy the bits corresponding to a given rectangular area of the screen [2] (a simplified sketch appears after this list).
- In most computers and programming languages, an individual bit within a group of bits such as a byte or word is referred to by a number, counted from 0, corresponding to its position within the byte or word. Depending on the context, however, bit 0 may denote either the most significant or the least significant bit (both conventions are illustrated in the last sketch after this list).
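As referenced in the CD-audio item above, here is a minimal numeric check of the Nyquist condition: the sampling rate must be at least twice the highest frequency to be captured. The 20 kHz hearing limit and the 44.1 kHz CD rate come from the text; the helper function itself is only an assumed illustration.

```python
def nyquist_ok(sample_rate_hz: float, max_signal_hz: float) -> bool:
    """True if the sampling rate is at least twice the highest
    frequency present in the analog signal (Nyquist criterion)."""
    return sample_rate_hz >= 2 * max_signal_hz

# CD audio: 44.1 kHz sampling vs. the ~20 kHz limit of human hearing.
print(nyquist_ok(44_100, 20_000))  # True  (44.1 kHz >= 40 kHz)
print(nyquist_ok(32_000, 20_000))  # False (32 kHz  <  40 kHz)
```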
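The bit-level instructions mentioned above (set, clear, test) have direct counterparts in the bitwise operators of most programming languages. The sketch below shows the common shift-and-mask idiom rather than any specific processor's instruction set.

```python
def set_bit(word: int, n: int) -> int:
    """Set bit n of word to 1 (analogous to a 'bit set' instruction)."""
    return word | (1 << n)

def clear_bit(word: int, n: int) -> int:
    """Clear bit n of word to 0."""
    return word & ~(1 << n)

def test_bit(word: int, n: int) -> bool:
    """Return True if bit n of word is 1."""
    return (word >> n) & 1 == 1

x = 0b0100
x = set_bit(x, 0)              # 0b0101
x = clear_bit(x, 2)            # 0b0001
print(bin(x), test_bit(x, 0))  # 0b1 True
```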
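The "bitblt" item above describes copying the bits of a rectangular screen region. A hardware bitblt operates on packed bits in video memory; the sketch below only mimics the idea on a bitmap stored as a list of rows of 0/1 values, so the data layout and function name are assumptions made for illustration.

```python
def blit(dst, src, dst_x, dst_y, src_x, src_y, width, height):
    """Copy a width x height rectangle of bits from bitmap `src`
    into bitmap `dst`. Bitmaps are lists of rows of 0/1 integers."""
    for row in range(height):
        for col in range(width):
            dst[dst_y + row][dst_x + col] = src[src_y + row][src_x + col]

screen = [[0] * 8 for _ in range(4)]  # blank 8x4 "screen"
glyph = [[1, 1],
         [1, 0]]                      # tiny 2x2 pattern
blit(screen, glyph, dst_x=3, dst_y=1, src_x=0, src_y=0, width=2, height=2)
for row in screen:
    print(row)
```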
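Finally, the two numbering conventions in the last item can be made concrete. With LSB-0 numbering, bit 0 is the least significant bit; with MSB-0 numbering, bit 0 is the most significant bit of the word. The `width` parameter below is an assumption needed to make MSB-0 numbering well defined for an 8-bit byte.

```python
def bit_lsb0(word: int, i: int) -> int:
    """Bit i under LSB-0 numbering: bit 0 is the least significant bit."""
    return (word >> i) & 1

def bit_msb0(word: int, i: int, width: int = 8) -> int:
    """Bit i under MSB-0 numbering: bit 0 is the most significant bit
    of a `width`-bit word."""
    return (word >> (width - 1 - i)) & 1

b = 0b1000_0001
print(bit_lsb0(b, 0))  # 1 -> rightmost bit of the byte
print(bit_msb0(b, 0))  # 1 -> leftmost bit of the 8-bit byte
```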