What Is a Recurrent Neural Network?

A recursive neural network is an artificial neural network (ANN) with a tree-like hierarchical structure, in which the network nodes recursively process input information according to their connection order. It is one of the deep learning algorithms [1].

Recursive neural networks are oriented toward structured, hierarchical inputs rather than plain sequences. The core of a recursive neural network consists of hierarchically arranged nodes: higher-level nodes are parent nodes, lower-level nodes are child nodes, and the terminal child nodes are usually output nodes. Recursive neural networks can be used, for example, to analyze sentences in natural language processing.
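To make the parent/child composition concrete, here is a minimal, illustrative sketch in NumPy (the dimension, weight names, and tree layout are assumptions for the example, not part of the article): the same weights are applied recursively at every parent node to combine its children's vectors into the parent's vector.

import numpy as np

np.random.seed(0)
DIM = 4                                  # size of every node vector (assumed)
W = np.random.randn(DIM, 2 * DIM) * 0.1  # shared weights for [left; right] children
b = np.zeros(DIM)

def encode(node):
    """Recursively compute a vector for a tree node.

    A leaf is a plain vector; an internal (parent) node is a pair
    (left_child, right_child), mirroring the parent/child hierarchy
    described above.
    """
    if isinstance(node, np.ndarray):          # leaf: already a vector
        return node
    left, right = node                        # parent: combine its two children
    children = np.concatenate([encode(left), encode(right)])
    return np.tanh(W @ children + b)          # shared recursive composition

# Example: a tiny binary tree with three leaves, ((a, b), c)
a, b_leaf, c = (np.random.randn(DIM) for _ in range(3))
root_vector = encode(((a, b_leaf), c))
print(root_vector.shape)                      # (4,)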

Long Short-Term Memory in Recurrent Neural Networks

Long Short-Term Memory (LSTM) is a recurrent neural network (RNN) architecture first published in 1997. Thanks to its distinctive design, LSTM is well suited to processing and predicting time series with long-range dependencies.
LSTMs usually perform better than plain recurrent neural networks and hidden Markov models (HMMs), for example in unsegmented continuous handwriting recognition. In 2009, an artificial neural network model built with LSTM won the ICDAR handwriting recognition competition. LSTM is also commonly used for automatic speech recognition; in 2013, LSTM-based networks set a record of 17.7% phoneme error rate on the TIMIT natural speech database. As a non-linear model, LSTM can serve as a complex non-linear unit for constructing larger deep neural networks.
An LSTM network is composed of LSTM blocks. In the literature and other materials, an LSTM block is sometimes described as an intelligent network unit, because it can remember a value for an arbitrary length of time; gates inside the block determine whether the input is significant enough to be remembered and whether it can be output.
A block contains four S-shaped function units. The leftmost unit can, depending on the situation, become the block's input, while the other three act as gates that decide whether information passes through. The second from the left is the input gate: when its value is close to zero, the incoming value is blocked and does not advance to the next layer. The third from the left is the forget gate: when its value is close to zero, the value remembered in the block is forgotten. The fourth, rightmost unit is the output gate: it decides whether the value held in the block's memory can be output.
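The gating behavior described above can be summarized in a few lines of code. The following is a minimal sketch of one LSTM step in NumPy, with illustrative weight names that are assumptions for the example (not the article's notation); the three sigmoid gates correspond to the input, forget, and output gates just described.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, params):
    """One time step: returns the new hidden state h and cell memory c."""
    Wi, Wf, Wo, Wg, bi, bf, bo, bg = params
    z = np.concatenate([x, h_prev])            # current input plus previous hidden state
    i = sigmoid(Wi @ z + bi)                   # input gate: near zero blocks the input
    f = sigmoid(Wf @ z + bf)                   # forget gate: near zero erases the memory
    o = sigmoid(Wo @ z + bo)                   # output gate: decides what leaves the block
    g = np.tanh(Wg @ z + bg)                   # candidate value to store
    c = f * c_prev + i * g                     # memory kept for an arbitrary duration
    h = o * np.tanh(c)                         # gated output of the block
    return h, c

# Example with hidden size 3 and input size 2 (random, untrained weights)
rng = np.random.default_rng(0)
H, X = 3, 2
params = [rng.standard_normal((H, X + H)) * 0.1 for _ in range(4)] + [np.zeros(H)] * 4
h, c = lstm_step(rng.standard_normal(X), np.zeros(H), np.zeros(H), params)
print(h, c)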
There are many LSTM variants, one of which is the GRU (Gated Recurrent Unit). According to tests carried out at Google, the forget gate contributes most to LSTM's learning ability, followed by the input gate, with the output gate contributing least.
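For comparison, a GRU step can be sketched as follows (again with assumed, illustrative names): the update gate roughly merges the roles of the LSTM's input and forget gates, and there is no separate output gate or cell memory.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, Wz, Wr, Wh):
    z_in = np.concatenate([x, h_prev])
    z = sigmoid(Wz @ z_in)                     # update gate: keep vs. overwrite memory
    r = sigmoid(Wr @ z_in)                     # reset gate
    h_tilde = np.tanh(Wh @ np.concatenate([x, r * h_prev]))  # candidate state
    return (1 - z) * h_prev + z * h_tilde      # new hidden state

# Example with hidden size 3 and input size 2 (random, untrained weights)
H, X = 3, 2
rng = np.random.default_rng(1)
Wz, Wr, Wh = (rng.standard_normal((H, X + H)) * 0.1 for _ in range(3))
h = gru_step(rng.standard_normal(X), np.zeros(H), Wz, Wr, Wh)
print(h)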

Structurally Recursive Neural Networks

Structurally recursive neural networks are networks constructed in a structurally recursive way, such as the recursive autoencoder, which is used to analyze sentences in neural-network approaches to natural language processing.
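As an illustration of this recursive construction, here is a minimal sketch of a recursive-autoencoder-style step in NumPy (the dimension, weight names, and greedy merging order are assumptions for the example, not a specific published model): two child vectors are encoded into a parent vector, and the decoder's reconstruction error can serve as a training signal.

import numpy as np

np.random.seed(1)
DIM = 4
W_enc = np.random.randn(DIM, 2 * DIM) * 0.1       # encoder: children -> parent
W_dec = np.random.randn(2 * DIM, DIM) * 0.1       # decoder: parent -> children

def encode_pair(left, right):
    """Combine two child vectors into a parent vector and score the reconstruction."""
    parent = np.tanh(W_enc @ np.concatenate([left, right]))
    recon = W_dec @ parent                          # attempted reconstruction of the children
    loss = np.sum((recon - np.concatenate([left, right])) ** 2)
    return parent, loss

# Greedily merge adjacent word vectors of a toy "sentence" into one vector
words = [np.random.randn(DIM) for _ in range(4)]
total_loss = 0.0
vec = words[0]
for w in words[1:]:
    vec, loss = encode_pair(vec, w)
    total_loss += loss
print(vec.shape, round(total_loss, 3))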
