What is distributed concurrency control?

Distributed concurrency control is a strategy that spreads the responsibility for concurrency control across the computers in a network. Concurrency control ensures that every computer works with the same version of the same files. Once computers are interconnected, many users can access any authorized file or folder at the same time, so keeping every file identical for all users becomes a real concern. Without concurrency control, files could easily drift out of sync from one computer to another as users change and manipulate data in real time, and everyone would quickly lose the ability to trust the network's files once changes were made. Concurrency control keeps the files consistent throughout the network and avoids this problem.

That consistency checking, however, imposes overhead on the network. Without distributed concurrency control, the maintenance could easily become a full-time job for a single computer, leaving it unusable for anything else. With distributed concurrency control, every computer on the network helps share the workload, and end users can still use their terminals for other network tasks.

Strong strict two-phase locking is one of the most common forms of distributed concurrency control. Under strong strict two-phase locking, as soon as a network file is accessed, it is locked for both read and write operations until the access ends. This means only one user on the network can change the file at a time, making it impossible for copies of the file to fall out of sync across the network. Once the end user saves changes to the file or closes it entirely, the locks are released, allowing another user on the system to open the file.
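The sketch below illustrates that behavior in Python, assuming a simple in-process lock manager; the FileLockManager and Transaction names are illustrative inventions for this example, not part of any real system or library. The key property of strong strict two-phase locking is visible in the structure: locks only accumulate during the transaction and are all released together at commit, never one by one in the middle.

import threading

class FileLockManager:
    """Hypothetical lock manager: one lock per file, shared by all users."""

    def __init__(self):
        self._locks = {}                       # file path -> threading.Lock
        self._table_guard = threading.Lock()   # protects the lock table itself

    def lock_for(self, path):
        with self._table_guard:
            return self._locks.setdefault(path, threading.Lock())

class Transaction:
    """One user's session: every file touched is locked until commit."""

    def __init__(self, manager):
        self._manager = manager
        self._held = []   # locks acquired so far (the "growing phase")

    def access(self, path):
        # Lock the file for both reads and writes; blocks if another
        # user currently holds it.
        lock = self._manager.lock_for(path)
        lock.acquire()
        self._held.append(lock)

    def commit(self):
        # Strong strict 2PL: all locks drop only when the transaction
        # ends (save or close), never earlier.
        for lock in reversed(self._held):
            lock.release()
        self._held.clear()

# Usage: only one transaction at a time can touch "report.txt".
manager = FileLockManager()
tx = Transaction(manager)
tx.access("report.txt")   # locked for everyone else from this point on
# ... read and modify the file ...
tx.commit()               # locks released; another user may now open it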

One of the biggest disadvantages of strong strict two-phase locking is the additional overhead it places on network resources. Every file currently accessed by any user must be marked by the network as "locked," and that information must be held in memory until the lock is released. With hundreds of end users accessing hundreds of files at the same time, this bookkeeping can easily consume a significant portion of the network's memory. Such heavy memory use can slow down networks with inefficient or outdated hardware.
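A back-of-envelope sketch of that bookkeeping cost, with the caveat that the LockEntry fields and the user and file counts are illustrative assumptions about a typical design rather than figures from any particular system:

from dataclasses import dataclass

@dataclass
class LockEntry:
    file_path: str   # which file is locked
    holder: str      # which user or transaction holds the lock
    mode: str        # always "read+write" under strong strict 2PL

# Assume 300 users, each holding locks on 200 files at once: the lock
# table grows to 60,000 live entries, all of which must stay resident
# in memory until the corresponding transactions finish.
table = [LockEntry(f"/share/doc{i}.txt", f"user{i % 300}", "read+write")
         for i in range(300 * 200)]

print(f"{len(table):,} live lock entries")   # -> 60,000 live lock entries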
