
Accepted for/Published in: JMIR Medical Informatics

Date Submitted: Jul 28, 2020
Date Accepted: Dec 22, 2020

The final, peer-reviewed published version of this preprint can be found here:

Adapting Bidirectional Encoder Representations from Transformers (BERT) to Assess Clinical Semantic Textual Similarity: Algorithm Development and Validation Study

JMIR Med Inform 2021;9(2):e22795

DOI: 10.2196/22795

PMID: 33533728

PMCID: 7889424

Warning: This is an author submission that is not peer-reviewed or edited. Preprints - unless they show as "accepted" - should not be relied on to guide clinical practice or health-related behavior and should not be reported in news media as established information.

Extending BERT for Clinical Semantic Textual Similarity

ABSTRACT

Background:

Natural language understanding enables the automatic extraction of relevant information from the clinical text data that hospitals acquire every day. In 2018, the language model BERT was introduced, producing new state-of-the-art results on several downstream tasks. The National NLP Clinical Challenges (n2c2) were initiated to tackle such downstream tasks on clinical text data, where domain-adapted methods might further improve language models like BERT.

Objective:

To optimally leverage BERT for the task of semantic textual similarity on clinical text data.

Methods:

We used BERT as an initial baseline and analysed its results, which served as the starting point for three approaches: (1) adding additional handcrafted sentence similarity features to the classifier token of BERT and combining the results with further features in multiple regression estimators, (2) incorporating a built-in ensembling method, M-Heads, into BERT by duplicating the regression head and applying an adapted training strategy that encourages the heads to focus on different input patterns of the medical sentences, and (3) developing a graph-based similarity approach for medications, which allows similarities to be extrapolated across entities known from the training set. The approaches were evaluated with the Pearson correlation coefficient between the predicted scores and the ground truth on the official training and test datasets.
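The abstract does not specify the exact architecture or training strategy; the following is a minimal PyTorch sketch of the general idea behind approaches (1) and (2), namely concatenating handcrafted features to BERT's pooled classifier-token representation and feeding the result to several duplicated regression heads. The class name, dimensions, and number of heads are illustrative assumptions, not the authors' values.

    # Minimal sketch (assumed structure, not the authors' implementation):
    # BERT [CLS] embedding + handcrafted similarity features -> M regression heads.
    import torch
    import torch.nn as nn

    class MHeadSimilarityRegressor(nn.Module):
        """Combines a pooled sentence-pair embedding with handcrafted features
        and scores it with several parallel ("M-Head") regression heads."""

        def __init__(self, cls_dim: int = 768, feat_dim: int = 8, num_heads: int = 5):
            super().__init__()
            in_dim = cls_dim + feat_dim
            # One small regression head per M-Head; each maps the combined
            # representation to a single similarity score.
            self.heads = nn.ModuleList(
                [nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, 1))
                 for _ in range(num_heads)]
            )

        def forward(self, cls_embedding: torch.Tensor, handcrafted: torch.Tensor) -> torch.Tensor:
            x = torch.cat([cls_embedding, handcrafted], dim=-1)
            # Each head predicts its own score; at inference the scores are averaged.
            scores = torch.stack([head(x).squeeze(-1) for head in self.heads], dim=0)
            return scores.mean(dim=0)

    # Toy usage with random tensors standing in for real BERT output and features.
    model = MHeadSimilarityRegressor()
    cls_vec = torch.randn(4, 768)   # pooled [CLS] embeddings for 4 sentence pairs
    features = torch.randn(4, 8)    # handcrafted similarity features
    print(model(cls_vec, features).shape)  # torch.Size([4])

The adapted training strategy that steers individual heads toward different input patterns is described in the full paper and is not reproduced in this sketch.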

Results:

We improved the performance of BERT on the test dataset from a Pearson correlation coefficient of 0.859 to 0.883 using a combination of the M-Heads and the graph-based similarity approach. We also show the differences between the training and test datasets and how they influence the results.
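For illustration only, the evaluation metric reported above can be computed as follows; the scores below are toy values, not the paper's data.

    # Toy illustration of the evaluation metric: Pearson correlation between
    # predicted similarity scores and the ground-truth labels (values are made up).
    from scipy.stats import pearsonr

    predicted = [3.8, 1.2, 4.5, 0.7, 2.9]
    ground_truth = [4.0, 1.0, 5.0, 0.5, 3.0]

    r, p_value = pearsonr(predicted, ground_truth)
    print(f"Pearson correlation coefficient: {r:.3f}")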

Conclusions:

We found that a graph-based similarity approach has the potential to extrapolate domain-specific knowledge to unseen sentences. Regarding the evaluation, we observed that results on the test dataset can easily be misleading, especially when the distribution of the data samples differs between the training and test datasets.


Citation

Please cite as:

Adapting Bidirectional Encoder Representations from Transformers (BERT) to Assess Clinical Semantic Textual Similarity: Algorithm Development and Validation Study

JMIR Med Inform 2021;9(2):e22795

DOI: 10.2196/22795

PMID: 33533728

PMCID: 7889424


© The authors. All rights reserved. This is a privileged document currently under peer-review/community review (or an accepted/rejected manuscript). Authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC-BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.