Bidirectional Language Modeling: A Systematic Literature Review

In transfer learning, two major activities, i.e., pretraining and fine-tuning, are carried out to perform downstream tasks. The advent of the transformer architecture and bidirectional language models, e.g., bidirectional encoder representations from transformers (BERT), enables transfer learning for natural language tasks. Moreover, BERT addresses the limitations of unidirectional language models by removing the dependency on the recurrent neural network (RNN). …
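
As a concrete illustration of the two-stage workflow mentioned above (pretraining followed by fine-tuning), the sketch below loads an already pretrained BERT encoder and performs a single fine-tuning step on a toy classification batch. It assumes the Hugging Face transformers library and PyTorch are available; the checkpoint name, labels, and example sentences are illustrative placeholders, not part of the review itself.

```python
# Minimal sketch: fine-tuning a pretrained bidirectional encoder (BERT)
# on a downstream classification task. Assumes `transformers` and `torch`.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Pretraining step is already done upstream; we simply load the checkpoint.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # e.g., binary sentiment classification
)

# Fine-tuning step: one gradient update on a toy batch (placeholder data).
texts = ["the film was great", "the film was terrible"]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
outputs = model(**batch, labels=labels)  # loss is computed internally
outputs.loss.backward()
optimizer.step()
```

In practice, fine-tuning iterates this update over a task-specific dataset for a few epochs, reusing the representations learned during pretraining rather than training the encoder from scratch.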
