What Is Granular Computing?
Granular computing, a new concept and computing paradigm of information processing, covers all theories, methods, technologies, and tools related to granularity, and is mainly used for the intelligent processing of uncertain, incomplete, and fuzzy mass information.
Granular computing
- Chinese name: Granularity calculation
- Foreign name: Granular computing
- Mainstream application technologies: Granular computing, cloud computing
- Granularity is a tool for describing fuzzy, uncertain objects.
- In 1979, Zadeh introduced the term "information granule" in his paper "Fuzzy sets and information granularity", which attracted wide attention from researchers. In 1997, T. Y. Lin formally proposed the concept of "granular computing", which soon became one of the hot topics in artificial intelligence research. At present, there are three main theories and methods of granular computing: the theory of computing with words, rough set theory, and quotient space theory.
- Granulation means viewing objects at different sizes: a "coarse-grained" large object is decomposed into several "fine-grained" small objects, or several small objects are merged into one coarse-grained large object for study. The Chinese scholar Academician Zhang Jian once pointed out: "A recognized feature of human intelligence is that people can observe and analyze the same problem at very different granularities. People can not only solve problems in worlds of different granularities, but can also move quickly from one granular world to another without any difficulty." Granular computing is a method that mimics this human way of thinking about problems.
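The coarse- versus fine-graining described above can be sketched in code. In this minimal illustration (the function name and grouping rules are my own, not from any granular-computing library), the same universe of days is partitioned into granules under two different equivalence relations: one granule per day (fine) and one granule per week (coarse).

```python
from collections import defaultdict

def granulate(universe, key):
    """Partition `universe` into granules of elements equivalent under `key`."""
    granules = defaultdict(list)
    for x in universe:
        granules[key(x)].append(x)
    return dict(granules)

days = list(range(1, 29))                             # days 1..28 of a month

fine = granulate(days, key=lambda d: d)               # finest view: one granule per day
coarse = granulate(days, key=lambda d: (d - 1) // 7)  # coarser view: granules are weeks

print(len(fine))    # 28 granules
print(len(coarse))  # 4 granules
print(coarse[0])    # [1, 2, 3, 4, 5, 6, 7]
```

Switching between the two views is just a change of `key` function, which loosely mirrors moving "from one granular world to another".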
- Granular computing is a direction in the newly emerging field of artificial intelligence research. It is a broad umbrella that covers all research on theories, methodologies, techniques, and tools concerning granularity. Roughly speaking, granular computing is a superset of the theory of fuzzy information granularity, while rough set theory and interval computing are subsets of granular mathematics.
- As a methodology, granular computing aims to establish a user-centered view of the external world, simplifying our knowledge of the physical and virtual worlds, and, on this basis, to use "appropriate granules" as the objects of processing when solving problems, so as to improve the efficiency of problem solving while still guaranteeing a satisfactory solution. The appropriate granularity is usually determined by the problem and its environment, which is of great significance for designing data-processing frameworks based on granular computing. Take time as an example. Suppose Mr. Zhang asks a friend, "When did you return to China?" The time granularity chosen to answer this question is actually determined by how long the friend has been back. If it is no more than one day, he might say "yesterday at noon"; if more than ten days, he can say "last week"; if the friend has been back for several years, then the year alone will be a satisfactory answer for Mr. Zhang. Note that these answers use different granularities: part of a day, a week, and a year. If, instead of choosing the right granularity, every answer were forced into a uniform computer timestamp format (year, month, day, hour), it would clearly be unreasonable and would make people feel very uncomfortable. [1]
- Granular computing and cloud computing will become the two mainstream application technologies of computing in the future. [2]