What Is Automated Image Processing?
A real-time image processing system is a system that acquires images, analyzes them to extract data, and uses that data to control some behavior. All processing must be completed within a predefined time, usually determined by the image acquisition rate.
- The traditional von Neumann computer is a serial machine. Most of its work consists of moving data between memory and the ALU, and this data transfer rate limits the machine's speed; this is commonly known as the von Neumann bottleneck. Such an architecture has difficulty performing real-time, high-speed image processing.
With the advent of the PCI bus, this problem was alleviated to some extent. In one design, the data transmission time was compressed to 13 ms per frame, leaving 27 ms of the roughly 40 ms frame period for the processing algorithm. However, as processing requirements grow and algorithms become more and more complex, this kind of processing is only quasi-real-time. Multiprocessor systems and computers with parallel processing architectures have therefore become an inevitable trend of development.
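As a quick sanity check on that timing budget, here is a minimal sketch in Python; the 25 fps frame rate is an assumption inferred from the 13 ms + 27 ms split.

```python
# Frame-time budget for quasi-real-time processing over a shared bus.
# Assumes a 25 fps video source, so each frame must be handled in 40 ms.
FRAME_RATE_HZ = 25
TRANSFER_MS = 13                               # time spent moving the frame across the bus

frame_period_ms = 1000 / FRAME_RATE_HZ         # 40 ms per frame
processing_budget_ms = frame_period_ms - TRANSFER_MS

print(f"Frame period:      {frame_period_ms:.0f} ms")
print(f"Transfer time:     {TRANSFER_MS} ms")
print(f"Processing budget: {processing_budget_ms:.0f} ms")  # 27 ms left for the algorithm
```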
- Parallel processing means exploiting concurrent events in the computing process for efficient information processing. Concurrency covers three notions: parallelism, simultaneity, and pipelining. Parallel events occur on multiple resources within the same time interval, simultaneous events occur at the same instant, and pipelined events occur in overlapping time periods. Such concurrency can be exploited at several levels of a computer system. The highest level runs multiple jobs or programs concurrently through multiprogramming, time sharing, and multiprocessing; this level requires parallel algorithms that allocate limited software and hardware resources effectively to large computing problems. The second level exploits concurrency among processes and tasks within a single program, which involves decomposing a program into multiple tasks. The third level exploits concurrency among multiple instructions to achieve fast operation.
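As a rough software illustration of the second (task) level described above, the sketch below decomposes the processing of one frame into several concurrent tasks; the horizontal-strip decomposition and the trivial inversion operation are assumptions chosen for illustration, not part of the source.

```python
from multiprocessing import Pool
import numpy as np

def process_strip(strip: np.ndarray) -> np.ndarray:
    """Task applied to one horizontal strip of the frame (here: simple inversion)."""
    return 255 - strip

def process_frame_in_parallel(frame: np.ndarray, n_tasks: int = 4) -> np.ndarray:
    """Decompose one program (processing a frame) into several concurrent tasks."""
    strips = np.array_split(frame, n_tasks, axis=0)   # task decomposition
    with Pool(processes=n_tasks) as pool:
        results = pool.map(process_strip, strips)     # tasks run concurrently
    return np.vstack(results)                         # reassemble the frame

if __name__ == "__main__":
    frame = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
    out = process_frame_in_parallel(frame)
    assert out.shape == frame.shape
```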
- Parallel processing is not only applied in computers; its ideas are also reflected throughout hardware circuit design. Digital image processing systems have likewise undergone major changes in hardware structure, evolving from a basic serial structure to a parallel processing structure, and from a single processor to multiprocessor systems or high-speed systems built around array processors. With the development of parallel architecture theory, parallel algorithms and languages, VLSI technology, and CAD tools, parallel processor arrays are now widely used in communications, biomedicine, industrial inspection, military applications, and other fields, and have become an effective way to provide intensive computing capability.
- The continuous development of image processing technology and compression coding technology, together with the rapid progress of VLSI technology, make it possible to implement real-time image processing in hardware circuits. With the continued rollout of the Integrated Services Digital Network (ISDN) and the rise of multimedia technology, real-time image processing has been widely applied in many fields. In recent years, various high-performance dedicated chips, digital signal processors (DSPs), and very-large-scale programmable logic devices (FPGAs/ASICs) have emerged, giving the design of modern real-time image processing systems unprecedented flexibility. Signal processing systems that once filled a chassis or even a cabinet can now be integrated on a single chip. Programmable logic devices allow hardware designers to implement large-scale digital logic control circuits through software design and programming, according to the needs of the application system. [1]
- Because all processing in a real-time image processing system must be completed within a predefined time (usually the image acquisition interval), execution of the entire image processing algorithm must finish within a bounded time. This excludes some classes of operations from real-time processing: operations based on iterative or recursive algorithms, for example, can be used only if satisfactory convergence is achieved within a predetermined number of iterations.
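A minimal sketch of this constraint: an iterative operation is admissible only if its iteration count is capped in advance, so its worst-case run time is bounded. The iterative-threshold routine below is an assumed example, not an algorithm from the source.

```python
import numpy as np

def iterative_threshold(image: np.ndarray, max_iters: int = 8, eps: float = 0.5) -> float:
    """Iteratively estimate a global threshold, with a hard cap on iterations
    so the worst-case execution time is bounded and fits the frame period."""
    t = float(image.mean())
    for _ in range(max_iters):              # fixed upper bound -> bounded latency
        lower = image[image <= t]
        upper = image[image > t]
        if lower.size == 0 or upper.size == 0:
            break                           # degenerate split: stop early
        new_t = 0.5 * (float(lower.mean()) + float(upper.mean()))
        if abs(new_t - t) < eps:            # converged before hitting the cap
            break
        t = new_t
    return t                                # best estimate available at the deadline
```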
- There are many examples of real-time image processing systems. In machine vision, image processing algorithms are typically used for path planning and process control, where timing is critical. Autonomous vehicle control requires vision or other sensors so the vehicle can navigate and avoid obstacles in a dynamic environment. In a video transmission system, successive frames must be transmitted and displayed in the correct order and with minimal jitter to avoid quality loss in the final video image. In image acquisition and display, the timing constraints are on the order of tens of nanoseconds, so these operations must be performed in hardware.
- The FPGA implements the logic required by the entire system by building separate hardware for each function. Because that hardware operates in parallel, it offers high processing speed on the one hand and the flexibility of software-style reprogramming on the other. Low-level image processing stages, which operate on large numbers of pixels, can take full advantage of the inherent parallelism of image processing operations: by replacing sequential loop operations with parallel hardware, many image processing algorithms can be accelerated.
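To illustrate the inherent pixel-level parallelism in software terms (rather than FPGA logic), the sketch below writes the same point operation first as a sequential per-pixel loop and then as a single data-parallel expression; the thresholding operation itself is only an assumed example.

```python
import numpy as np

def threshold_sequential(img: np.ndarray, t: int = 128) -> np.ndarray:
    """Sequential loop: one pixel per iteration, as a serial processor would do it."""
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = 255 if img[y, x] > t else 0
    return out

def threshold_parallel(img: np.ndarray, t: int = 128) -> np.ndarray:
    """Data-parallel form: every pixel is independent, so the whole operation
    can in principle be evaluated at once, as parallel hardware would."""
    return np.where(img > t, 255, 0).astype(img.dtype)
```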
- When image data flows as a stream from the camera to the display, passing serially through one function module after another, it maps naturally onto a hardware implementation. If every operation can be expressed as a stream operation, the entire algorithm can be implemented as an efficient pipeline. In summary, FPGAs have unique advantages for real-time image processing.
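A minimal software sketch of this stream-and-pipeline idea, with chained generators standing in for hardware pipeline stages; the stage functions and frame size are assumptions made for illustration.

```python
import numpy as np

def capture(n_frames: int):
    """Stage 1: source of the frame stream (stand-in for the camera)."""
    for _ in range(n_frames):
        yield np.random.randint(0, 256, (480, 640), dtype=np.uint8)

def enhance(frames):
    """Stage 2: a per-frame operation applied as frames flow through."""
    for f in frames:
        yield np.clip(f.astype(np.int16) + 20, 0, 255).astype(np.uint8)

def binarize(frames, t: int = 128):
    """Stage 3: another stream operation; stages overlap like a hardware pipeline."""
    for f in frames:
        yield (f > t).astype(np.uint8) * 255

# Chain the stages: each frame passes through every stage in turn,
# and the full sequence is never materialized at once.
for out_frame in binarize(enhance(capture(3))):
    pass  # here the frame would go to the display module
```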
- This system consists of a real-time image acquisition module, a frame buffer module (before processing), a real-time image processing module, a frame buffer module (after processing), a real-time image display module, and a debugging interface module. The processing flow is as follows: after the FPGA is powered on and configured, the FPGA communication module configures the CMOS camera; the image capture module acquires the real-time digital image data stream and buffers it into the frame buffer module (before processing); the FPGA real-time image processing module reads and processes the image data stream from the frame buffer module (before processing) and stores the result in the frame buffer module (after processing); the FPGA real-time image display module reads the image data from the frame buffer module (after processing), outputs it to the video D/A conversion module as an analog signal, and generates the corresponding horizontal and vertical synchronization signals according to the VGA protocol, so that the processed image is displayed in real time.
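A behavioural sketch of that module chain, in Python rather than HDL and purely to show the dataflow; the function names, frame size, and placeholder processing operation are assumptions, and the frame buffers are modelled as plain arrays.

```python
import numpy as np

HEIGHT, WIDTH = 480, 640

# Frame buffer before processing and frame buffer after processing.
pre_buffer = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)
post_buffer = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)

def capture_module() -> np.ndarray:
    """Real-time image acquisition module: produces one frame from the sensor."""
    return np.random.randint(0, 256, (HEIGHT, WIDTH), dtype=np.uint8)

def processing_module(frame: np.ndarray) -> np.ndarray:
    """Real-time image processing module (placeholder operation)."""
    return 255 - frame

def display_module(frame: np.ndarray) -> None:
    """Real-time display module: would drive the video DAC and VGA sync in hardware."""
    _ = frame

# One pass of the processing flow described in the text.
pre_buffer[:] = capture_module()                  # acquisition -> frame buffer (before)
post_buffer[:] = processing_module(pre_buffer)    # processing  -> frame buffer (after)
display_module(post_buffer)                       # frame buffer (after) -> display
```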
- The main structure of the system is shown in the figure.
- The system uses two frame buffer modules, with the real-time image processing interface isolating the processing function from the front-end acquisition module and the terminal display module. Each frame buffer module uses a single frame buffer instead of the two buffers commonly operated in ping-pong fashion to improve efficiency and ensure image integrity. Because two consecutive frames of a real-time image differ only slightly, a single frame buffer can be read and written simultaneously, and since the image is refreshed continuously at high speed, the human eye can hardly detect an incomplete picture. The advantages of this design are a simpler structure, no handshake signals between the read and write ends of the frame buffer module, and a processing rate in the image processing module that does not depend on the rates of the acquisition and display ends. This reduces the coupling between the image processing module and the other modules and makes the system more versatile. [4]
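A minimal sketch of this single-buffer scheme: the acquisition side writes and the display side reads the same buffer at independent rates, with no handshake, relying on the small frame-to-frame difference to hide any tearing; the row-at-a-time access pattern is an assumption made for illustration.

```python
import numpy as np

HEIGHT, WIDTH = 480, 640
frame_buffer = np.zeros((HEIGHT, WIDTH), dtype=np.uint8)   # one shared buffer, no ping-pong

def write_row(row_index: int, row: np.ndarray) -> None:
    """Acquisition side: writes the newest data into the shared buffer at its own rate."""
    frame_buffer[row_index % HEIGHT, :] = row

def read_row(row_index: int) -> np.ndarray:
    """Display side: reads at its own rate with no handshake to the writer.
    A read may mix rows from two consecutive frames, which is tolerable
    because consecutive frames differ very little."""
    return frame_buffer[row_index % HEIGHT, :].copy()
```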