CFP | Special Issue on Large-scale Pre-training: Data, Models, and Fine-tuning

Machine Intelligence Research (ISSN 2731-538X) seeks original manuscripts for a special issue on "Large-scale Pre-training: Data, Models, and Fine-tuning".


Recent years have witnessed rapid development of and surging interest in large-scale pre-trained models, driven by the availability of massive data and ever-growing model capacity. Large-scale pre-trained models have achieved milestone performance on a broad range of practical problems, not only in computer science areas such as natural language processing, computer vision, and recommender systems, but also in other research areas such as biology, meteorology, and art. Unlike early non-neural and small models, which rely heavily on hand-crafted features, statistical methods, and accurate human annotations, neural models can automatically learn low-level distributed representations and high-level latent semantic information from data. Because deep neural models with huge numbers of parameters tend to overfit and generalize poorly when trained on limited data, massive efforts have been devoted to exploring how to pre-train large-scale models on large-scale data. Since large-scale human annotation is labor-intensive and time-consuming, fully supervised large-scale pre-training is impractical. To address this issue, the AI community has recently devoted efforts to self-supervised learning algorithms and theories, large-scale pre-training paradigms tailored to different data formats, large-scale model architecture designs, and fine-tuning pre-trained models for downstream applications.



This special issue seeks original contributions advancing the theory, architectures, and algorithm design of large-scale pre-trained models, as well as their downstream applications. The special issue will provide a timely collection of recent advances for researchers and practitioners working in the broad fields of deep learning, natural language processing, computer vision, and machine intelligence. Topics of interest include (but are not limited to):

- Language Pre-training

- Visual Pre-training

- Multi-modal Pre-training

- Multi-lingual Pre-training

- Large-scale Pre-training Theories

- Large-scale Pre-training Algorithms and Architectures

- Efficient Large-scale Pre-training

- Fine-tuning Pre-trained Models

- Pre-training Applications

- Surveys of Large-scale Pre-training


Submission Deadline: June 30, 2022

Submission Entrance:

https://mc03.manuscriptcentral.com/mir

When submitting your manuscript, please select “Special Issue on Large-scale Pre-training: Data, Models, and Fine-tuning” under “Step 6 Details & Comments: Special Issue and Special Section”.


Guest Editors: 

Prof. Ji-Rong Wen, Renmin University of China, China (jrwen@ruc.edu.cn) 

Prof. Zi Huang, The University of Queensland, Australia (huang@itee.uq.edu.au) 

Prof. Hanwang Zhang, Nanyang Technological University, Singapore (hanwangzhang@ntu.edu.sg) 



About Machine Intelligence Research

Machine Intelligence Research (MIR, ISSN 2731-538X, formerly titled International Journal of Automation and Computing) is published by Springer and sponsored by the Institute of Automation, Chinese Academy of Sciences. MIR publishes papers on original theoretical and experimental research and emerging technologies in intelligent science, and strives to bridge the gap between theoretical research and practical applications. The topics of MIR include AI Basic Theory, Machine Learning, Pattern Recognition & Computer Vision, Natural Language Processing, Knowledge Management & Data Mining, Brain-inspired Intelligence, Intelligent Robotics, Distributed AI & Multi-Agent Systems, and Innovative AI Applications.


Machine Intelligence Research 

Editor-in-Chief: Tieniu Tan 

Tel: (86-10)82544499 

E-mail: mir@ia.ac.cn 

http://www.mi-research.net 

http://link.springer.com/journal/11633 

Submission Entrance: https://mc03.manuscriptcentral.com/mir

Release Date: 2022-05-06