What Is a Compiler Generator?
- A compiler generator automatically constructs a lexical analyzer that performs lexical analysis for a given set of token regular expressions; for a given grammar, it automatically constructs a parser that performs syntax analysis; it can also add the necessary semantic analysis and provide a user-oriented interface to the semantic routines.
- Most current compiler generators are single-purpose tools: a standalone lexical-analyzer generator or a standalone parser generator, while semantic-analyzer generators can only perform abstract semantic analysis. A compiler generator that integrates all stages of compilation still requires further research. [1]
- An automatic compiler generator can produce an appropriately optimized compiler for a new processor architecture, and it therefore forms an important part of reconfigurable software. A new hardware architecture has its own assembly instruction set and machine resources, so designers should be given a configuration interface through which they can conveniently describe and specify these machine-specific resources. The generator then produces a compiler for such a machine, achieving retargetability. [2]
- On the front end, the compiler generator must convert the user's description (a formal grammar) of the new language's lexical, syntactic, and semantic information into the internal form processed by the analyzer; on the back end, it must translate the user's description of the mapping from the new machine's physical characteristics to its logical characteristics into the data and algorithms used by the generator.
- The front end of a compiler performs lexical, syntax, and semantic analysis and generates intermediate code. It depends mainly on the source language and is largely independent of the target machine architecture, so this part can be fully reused as a component. The back end includes code optimization, code selection, code generation, register allocation, and more; it is the machine-dependent part of the compiler. In general, the back end is independent of the source language and relies only on the intermediate representation and the target machine. [2]
- Layering
- The layering of the automatic generator is shown in the figure below. (Figure: layered structure of the automatic generator)
- Structure
- The general structure of the compiler generator is shown in the figure below. (Figure: general structure of the compiler generator)
- Such an automatically generated program separates the representation of the problem's solution from the operations on it. The generator converts information expressed in a metalanguage into data structures that the driver accesses directly. In most cases these data structures are stored in the form of a program and then compiled and linked with the driver. Although this is less efficient than emitting the tables directly, it allows the user to modify the generated content and thereby adjust the driver's input. [1] A sketch of this arrangement follows.
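As a minimal sketch of this arrangement: the generator writes its tables out as ordinary source code, and the driver imports the generated module. The file name `tables.py`, the table name `ACTION`, and the table contents are hypothetical, chosen only for illustration.

```python
# Generator side: turn a metalanguage description into a data structure,
# then emit that data structure as a (user-editable) Python module.
def emit_tables(path="tables.py"):
    action = {("S", "a"): "shift", ("S", "b"): "reduce"}  # toy content
    with open(path, "w") as f:
        f.write("# Generated file -- edit to adjust the driver's input.\n")
        f.write(f"ACTION = {action!r}\n")

emit_tables()

# Driver side: the driver is compiled/linked (here: imported) together
# with the generated tables and consults them at run time.
import tables
print(tables.ACTION[("S", "a")])  # shift
```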
- Automatic generation of a lexical analyzer
- The structure of the lexical analyzer generator is shown in the figure below. Generating a lexical analyzer resembles the compilation process itself and uses the following method: the regular expression for each token class is converted into an NFA, the NFA is then converted into a DFA, and the DFA is finally minimized. A sketch of the NFA-to-DFA step follows the figure. (Figure: structure of the lexical analyzer generator)
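The NFA-to-DFA step is the subset construction. Below is a minimal sketch, assuming a hand-encoded NFA for the classic pattern (a|b)*abb with no epsilon moves; the state numbering and dictionary encoding are illustrative, not the output of any particular generator.

```python
def move(states, symbol, delta):
    """Set of NFA states reachable from `states` on `symbol`."""
    return frozenset(t for s in states for t in delta.get((s, symbol), ()))

def subset_construction(start, symbols, delta):
    """Build DFA transitions whose states are sets of NFA states."""
    dfa_start = frozenset({start})
    dfa_delta, worklist, seen = {}, [dfa_start], {dfa_start}
    while worklist:
        S = worklist.pop()
        for a in symbols:
            T = move(S, a, delta)
            if T:
                dfa_delta[(S, a)] = T
                if T not in seen:
                    seen.add(T)
                    worklist.append(T)
    return dfa_start, dfa_delta

# NFA for (a|b)*abb: state 0 loops on a and b, then a chain a, b, b leads
# to the accepting state 3 (state numbers are an illustrative assumption).
delta = {(0, "a"): {0, 1}, (0, "b"): {0}, (1, "b"): {2}, (2, "b"): {3}}
start, dfa = subset_construction(0, "ab", delta)
print(len({s for (s, _) in dfa} | set(dfa.values())))  # 4 DFA states
```

A real generator would first build the NFA from the regular expression (e.g., by Thompson's construction) and afterwards minimize the resulting DFA.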
- Automatic generation of a parser
- The structure of the parser generator is shown in the figure below. The context-free grammar is expressed in BNF or EBNF, and the generated table can be either an LL parsing table or an LR parsing table; a table-driven sketch follows the figure. (Figure: structure of the parser generator)
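To make the table-driven idea concrete, here is a minimal sketch of an LL(1) driver. The tiny grammar (sums of `n` tokens), the table contents, and the token names are illustrative assumptions; a real generator would compute the table from FIRST and FOLLOW sets.

```python
# Grammar:  E -> n E'      E' -> + n E' | epsilon
TABLE = {
    ("E",  "n"): ["n", "E'"],
    ("E'", "+"): ["+", "n", "E'"],
    ("E'", "$"): [],                     # epsilon production at end of input
}

def parse(tokens):
    stack, tokens, i = ["$", "E"], tokens + ["$"], 0
    while stack:
        top = stack.pop()
        if top in ("n", "+", "$"):       # terminal: must match the input
            if top != tokens[i]:
                return False
            i += 1
        else:                            # nonterminal: expand via the table
            rhs = TABLE.get((top, tokens[i]))
            if rhs is None:
                return False
            stack.extend(reversed(rhs))  # push the right-hand side
    return i == len(tokens)

print(parse(["n", "+", "n"]))  # True
print(parse(["n", "+"]))       # False
```

The same driver works for any LL(1) grammar; only TABLE changes, which is exactly the separation of representation and operation described above.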
- Semantic analyzer generator
- The structure of the semantic analyzer generator is shown in the figure below. (Figure: structure of the semantic analyzer generator)
- Attribute grammars originated from syntax-directed translation technology; Knuth gave a standardized description of attribute grammars in 1968.
- An attribute grammar attaches attributes (such as data structures, values, or machine code) and semantic functions (which define how the attributes are computed) to a BNF grammar. LL parsing suits L-attributed grammars (which use inherited attributes), while LR parsing suits S-attributed grammars (which use synthesized attributes); a small S-attributed example follows.
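As a minimal sketch of an S-attributed grammar, the productions below each carry a semantic function that computes a synthesized `val` attribute from the attributes of the children, as an LR parser would on each reduction. The grammar (sums of integer literals) and the evaluator are illustrative assumptions.

```python
# Productions and their semantic functions:
#   E -> E + T   { E.val = E1.val + T.val }
#   E -> T       { E.val = T.val }
#   T -> int     { T.val = int(lexeme) }

def evaluate(tokens):
    """Evaluate `int (+ int)*` left to right, mimicking LR reductions."""
    val = int(tokens[0])                 # reduce T -> int, then E -> T
    for i in range(1, len(tokens), 2):
        assert tokens[i] == "+"
        val += int(tokens[i + 1])        # reduce E -> E + T
    return val

print(evaluate(["2", "+", "3", "+", "4"]))  # 9
```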
- Translation tools for metalanguages that include attribute grammars appeared from the late 1970s to the early 1980s; GAG (Kastens et al., 1982), HLP (Rai et al., 1978), VATS (Berg et al., 1984), and MUG-2 (Ganapathi et al., 1982) are all based on attribute-grammar descriptions.
- Back-end technology
- The back end accepts an intermediate representation and generates equivalent object code. The general process is:
- 1) Intermediate code optimization
- The ultimate goal of optimization is to let users exploit the features of a language to describe their algorithms without worrying about implementation efficiency; that is, programmers should not be required to hand-craft delicate, efficient code, but should instead focus on the readability and maintainability of the program.
- Optimization should focus on the code segments that are executed most frequently. Optimization attempts both to reduce code execution time and to reduce the amount of generated code, usually as a compromise between the two. The most basic principle is that every transformation of the intermediate code must preserve its semantics. If a program will not be run many times, or the time and space efficiency of the generated code does not matter much, there is no need to optimize. There are many optimization strategies; most systems simply select a few optimization algorithms according to their specific requirements.
- Flow graphs describe all possible execution paths. Control flow analysis makes data flow analysis simpler and has two steps: first partition the code into basic blocks, then build the flow graph. Data flow analysis provides information about where data items are defined and used, which serves both optimization and debugging: it can expose paths on which a data item is used before it is defined, and it can detect redundant or unreachable code. Data flow analysis also uncovers information such as loop invariants and already-computed expressions. A basic-block partitioning sketch follows.
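Here is a minimal sketch of the first step of control flow analysis: partitioning a list of three-address instructions into basic blocks by finding leaders. The instruction encoding (dictionaries with `op`, an optional `label`, and a branch `target`) is an illustrative assumption.

```python
def leaders(code):
    """Indices where basic blocks begin: the first instruction, every
    branch target, and every instruction that follows a branch."""
    lead = {0}
    labels = {ins["label"]: i for i, ins in enumerate(code) if "label" in ins}
    for i, ins in enumerate(code):
        if ins["op"] in ("goto", "if_goto"):
            lead.add(labels[ins["target"]])  # branch target starts a block
            if i + 1 < len(code):
                lead.add(i + 1)              # fall-through starts a block
    return sorted(lead)

def basic_blocks(code):
    ls = leaders(code) + [len(code)]
    return [code[ls[i]:ls[i + 1]] for i in range(len(ls) - 1)]

code = [
    {"op": "assign", "label": "L0"},    # i = 0
    {"op": "if_goto", "target": "L2"},  # if i >= n goto L2
    {"op": "assign"},                   # loop body
    {"op": "goto", "target": "L0"},     # back edge
    {"op": "assign", "label": "L2"},    # loop exit
]
print([len(b) for b in basic_blocks(code)])  # block sizes: [2, 2, 1]
```

The flow graph is then built by adding an edge from each block to the targets of its final branch, plus an edge to the textually following block when execution can fall through.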
- 2) Code generation
- Code generation has two parts: register allocation and code selection. The simplest code generation may be code selection alone, that is, converting the intermediate representation into assembly code. Modern compilation systems pursue the automatic generation of code generators. Register allocation and code selection are a chicken-and-egg problem: each is easier once the other has been decided. A good register allocation algorithm can greatly improve the execution speed of the generated code, so it can also be classified as an optimization pass. A code-selection sketch follows.
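Below is a minimal sketch of code selection alone: three-address intermediate code is mapped onto instructions of a hypothetical two-operand machine. The IR tuples, the mnemonics (MOV, ADD, MUL), and the naive fresh-register-per-value policy are all illustrative assumptions; no real register allocation is performed.

```python
OPCODES = {"+": "ADD", "*": "MUL"}

def select(ir):
    """ir: list of (dest, src1, op, src2) tuples; returns assembly lines."""
    reg_of, asm = {}, []
    for dest, src1, op, src2 in ir:
        r = f"R{len(reg_of)}"            # naive: a fresh register per value
        reg_of[dest] = r
        asm.append(f"MOV {r}, {reg_of.get(src1, src1)}")
        asm.append(f"{OPCODES[op]} {r}, {reg_of.get(src2, src2)}")
    return asm

# t1 = a + b ; t2 = t1 * c
for line in select([("t1", "a", "+", "b"), ("t2", "t1", "*", "c")]):
    print(line)
```

A real back end would interleave this with register allocation, which is precisely the chicken-and-egg interaction noted above.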
- 3) Target code optimization
- The optimization strategies for the generated code are: the intermediate-representation optimizations described earlier, register optimization, and peephole optimization. Peephole optimization examines a small window of generated code and replaces it with a faster or shorter sequence; the short instruction sequence under examination is called the peephole. Its typical steps are: eliminating unnecessary memory accesses, eliminating unreachable code, eliminating jumps to jumps, algebraic simplification, and recognizing special instructions or patterns. A sketch follows. [1]
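As a minimal sketch, the peephole optimizer below slides a two-instruction window over assembly-like text and rewrites one wasteful pattern: a store immediately followed by a load of the same register and location makes the load redundant. The rule and the instruction syntax are illustrative assumptions.

```python
def peephole(lines):
    out, i = [], 0
    while i < len(lines):
        a = lines[i]
        b = lines[i + 1] if i + 1 < len(lines) else ""
        # Pattern: "STORE R, x" then "LOAD R, x" -- the load is redundant.
        if (a.startswith("STORE") and b.startswith("LOAD")
                and a.split()[1:] == b.split()[1:]):
            out.append(a)
            i += 2                       # keep the store, drop the load
        else:
            out.append(a)
            i += 1
    return out

print(peephole(["STORE R1, x", "LOAD R1, x", "ADD R1, R2"]))
# ['STORE R1, x', 'ADD R1, R2']
```

Real peephole optimizers repeat such passes until no rule applies, since one replacement may enable another.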