Issue 2/2023
Special Issue on Large-scale Pre-training: Data, Models, and Fine-tuning
Contents (11 articles)
Editorial for Special Issue on Large-scale Pre-training: Data, Models, and Fine-tuning
Ji-Rong Wen, Zi Huang, Hanwang Zhang
Pre-training in Medical Data: A Survey
Yixuan Qiu, Feng Lin, Weitong Chen, Miao Xu
Red Alarm for Pre-trained Models: Universal Vulnerability to Neuron-level Backdoor Attacks
Zhengyan Zhang, Guangxuan Xiao, Yongwei Li, Tian Lv, Fanchao Qi, Zhiyuan Liu, Yasheng Wang, Xin Jiang, Maosong Sun
A Study of Using Synthetic Data for Effective Association Knowledge Learning
Yuchi Liu, Zhongdao Wang, Xiangxin Zhou, Liang Zheng
EVA2.0: Investigating Open-domain Chinese Dialogue Systems with Large-scale Pre-training
Yuxian Gu, Jiaxin Wen, Hao Sun, Yi Song, Pei Ke, Chujie Zheng, Zheng Zhang, Jianzhu Yao, Lei Liu, Xiaoyan Zhu, Minlie Huang
Multimodal Pretraining from Monolingual to Multilingual
Liang Zhang, Ludan Ruan, Anwen Hu, Qin Jin
Offline Pre-trained Multi-agent Decision Transformer
Linghui Meng, Muning Wen, Chenyang Le, Xiyun Li, Dengpeng Xing, Weinan Zhang, Ying Wen, Haifeng Zhang, Jun Wang, Yaodong Yang, Bo Xu
Compositional Prompting Video-language Models to Understand Procedure in Instructional Videos
Guyue Hu, Bin He, Hanwang Zhang
Mitigating Spurious Correlations for Self-supervised Recommendation
Xin-Yu Lin, Yi-Yan Xu, Wen-Jie Wang, Yang Zhang, Fu-Li Feng
DynamicRetriever: A Pre-trained Model-based IR System Without an Explicit Index
Yu-Jia Zhou, Jing Yao, Zhi-Cheng Dou, Ledell Wu, Ji-Rong Wen
Vision Enhanced Generative Pre-trained Language Model for Multimodal Sentence Summarization
Liqiang Jing, Yiren Li, Junhao Xu, Yongcan Yu, Pei Shen, Xuemeng Song