research
∙
04/09/2022
FoundationLayerNorm: Scaling BERT and GPT to 1,000 Layers
Mainstream BERT/GPT models contain only 10 to 20 layers, and there i...
research
∙
11/09/2021
FPM: A Collection of Large-scale Foundation Pre-trained Language Models
Recent work in language modeling has shown that training large-scale Tra...
research
∙
06/24/2020
Movie Box Office Prediction via Joint Actor Representations and Social Media Sentiment
In recent years, driven by Asian film industries such as those of China and In...