
Pre-Training a Neural Language Model Improves the Sample Efficiency of an Emergency Room Classification Model

Abstract: To build a French national electronic injury surveillance system based on emergency room visits, we aim to develop a coding system that classifies the causes of these visits from free-text clinical notes. Supervised learning techniques have shown good results in this area, but they require large expert-annotated datasets, which are time-consuming and costly to obtain. We hypothesize that a Transformer-based natural language processing model incorporating a generative self-supervised pre-training step can significantly reduce the number of annotated samples required for supervised fine-tuning. In this preliminary study, we test our hypothesis on the simplified problem of predicting, from free-text clinical notes, whether a visit is the consequence of a traumatic event. Using fully retrained GPT-2 models (without OpenAI's pre-trained weights), we assess the gain of applying a self-supervised pre-training phase on unlabeled notes before the supervised learning task. Results show that the amount of labeled data required to reach a given level of performance (AUC > 0.95) was reduced by a factor of 10 when pre-training was applied; with 16 times more labeled data, the fully supervised model improved AUC by less than 1%. To conclude, a multipurpose neural language model such as GPT-2 can be adapted into a powerful classifier of free-text notes with only a small number of labeled samples.
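The two-phase procedure the abstract describes (generative self-supervised pre-training on unlabeled notes, then supervised fine-tuning on a small labeled set) can be sketched as below. This is a minimal illustration, not the authors' code: it assumes the Hugging Face Transformers library, the stock GPT-2 tokenizer, invented placeholder notes, and arbitrary small model sizes; consistent with the paper, the GPT-2 weights are randomly initialized rather than loaded from OpenAI.

import torch
from transformers import (GPT2Config, GPT2ForSequenceClassification,
                          GPT2LMHeadModel, GPT2TokenizerFast)

# Placeholder corpora -- invented examples, not data from the study.
unlabeled_notes = ["fell from a ladder, pain in the right wrist",
                   "fever and cough for three days"]
labeled_notes = [("fell from a ladder, pain in the right wrist", 1),  # trauma
                 ("fever and cough for three days", 0)]               # non-trauma

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")  # assumed tokenizer choice
tokenizer.pad_token = tokenizer.eos_token

# Fresh, randomly initialized GPT-2: no OpenAI pre-trained weights are loaded.
config = GPT2Config(vocab_size=tokenizer.vocab_size,
                    n_layer=4, n_head=4, n_embd=256,  # illustrative small sizes
                    pad_token_id=tokenizer.eos_token_id)
lm = GPT2LMHeadModel(config)

# Phase 1: generative self-supervised pre-training (next-token prediction).
optimizer = torch.optim.AdamW(lm.parameters(), lr=5e-5)
for note in unlabeled_notes:
    batch = tokenizer(note, return_tensors="pt")
    loss = lm(**batch, labels=batch["input_ids"]).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Phase 2: supervised fine-tuning for trauma vs. non-trauma classification,
# transferring the pre-trained transformer body under a classification head.
clf = GPT2ForSequenceClassification(config)  # num_labels defaults to 2
clf.transformer.load_state_dict(lm.transformer.state_dict())
optimizer = torch.optim.AdamW(clf.parameters(), lr=5e-5)
for note, label in labeled_notes:
    batch = tokenizer(note, return_tensors="pt")
    out = clf(**batch, labels=torch.tensor([label]))
    out.loss.backward()
    optimizer.step()
    optimizer.zero_grad()

At full scale, the phase-1 loop would run over the large unlabeled corpus (for example via transformers.Trainer with DataCollatorForLanguageModeling(tokenizer, mlm=False)), and AUC would be measured on a held-out labeled set.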
https://hal.inria.fr/hal-02611917
Contributor: Marta Avalos
Submitted on: Tuesday, May 19, 2020
Last modified on: Wednesday, April 14, 2021

File

18444-79384-1-PB.pdf (publisher file allowed on an open archive)

Identifiers

  • HAL Id: hal-02611917, version 1

Citation

Binbin Xu, Cédric Gil-Jardiné, Frantz Thiessard, Éric Tellier, Marta Avalos, et al. Pre-Training a Neural Language Model Improves the Sample Efficiency of an Emergency Room Classification Model. In Roman Barták and Eric Bell (eds.), Proceedings of the 33rd International Florida Artificial Intelligence Research Society Conference, The AAAI Press, 2020. ISBN 978-1-57735-821-3. ⟨hal-02611917⟩
