Abstract: Transformer-based models, such as Bidirectional Encoder Representations from Transformers (BERT), cannot process long sequences because their self-attention operation scales quadratically ...
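The quadratic scaling mentioned above comes from the attention score matrix itself: every token attends to every other token, so for a sequence of length n the model materializes an n × n matrix. The following is a minimal sketch (not taken from the cited abstract; the identity Q/K/V projections are a simplifying assumption) that makes the quadratic cost concrete:

```python
# Minimal sketch of why full self-attention is quadratic in sequence length:
# the score matrix Q @ K.T has shape (n, n), so compute and memory grow as n^2.
import numpy as np

def self_attention(x: np.ndarray) -> np.ndarray:
    """Single-head self-attention over x of shape (n, d).
    Hypothetical simplification: Q, K, V use identity projections."""
    n, d = x.shape
    q, k, v = x, x, x
    scores = q @ k.T / np.sqrt(d)                     # (n, n) matrix -> O(n^2)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ v                                # (n, d) output

# Doubling the sequence length quadruples the number of attention scores:
for n in (512, 1024, 2048):
    _ = self_attention(np.random.rand(n, 64))
    print(f"{n} tokens -> {n * n} attention scores")
```

This is why models like BERT cap input length (typically 512 tokens) and why long-document variants replace full attention with sparser patterns.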
Abstract: The classification of scholarly publications is a critical task for knowledge organization, yet traditional methods often fall short in addressing the complexities of interdisciplinary ...