Accepted for/Published in: Journal of Medical Internet Research
Date Submitted: Feb 17, 2020
Date Accepted: Apr 27, 2020
Exploring the privacy-preserving properties of word embeddings: Algorithmic Validation
ABSTRACT
Background:
Word embeddings are dense numeric vectors used to represent language in neural networks. Until recently, there had been no publicly released embeddings trained on clinical data. Our work is the first to study the privacy implications of releasing word embeddings.
Objective:
We demonstrate that traditional word embeddings created on clinical corpora that have been de-identified by removing personal health information (PHI) can nonetheless be exploited to reveal sensitive patient information.
Methods:
We demonstrate this on two corpora: 400,000 doctor-written consultation notes, and a selected subset of English Wikipedia. We experiment with the three most common word embedding methods to explore the privacy-preserving properties of each.
Results:
We show that if publicly released embeddings are trained on a corpus that has been anonymized by PHI removal, it is possible to:
• reconstruct full names from the original corpus, and
• associate sensitive information with specific patients in the corpus from which the embeddings were created.
We demonstrate that the distance between the word vector representation of a patient’s name and a diagnostic billing code is informative and differs significantly from the distance between that name and a code not billed for that patient.
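The distance comparison at the core of this attack can be sketched as follows. This is an illustrative toy example, not the authors' code: the embedding table, the patient name token, and the billing-code tokens are all hypothetical stand-ins for vectors that would be learned from a clinical corpus.

```python
import numpy as np

def cosine_distance(u, v):
    """Cosine distance = 1 - cosine similarity between two vectors."""
    return 1.0 - float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy embedding table standing in for vectors trained on a de-identified
# clinical corpus; all tokens and values are hypothetical.
embeddings = {
    "john_doe": np.array([0.9, 0.1, 0.2]),   # patient name that survived PHI removal
    "icd_e11":  np.array([0.8, 0.2, 0.1]),   # code billed for this patient
    "icd_c50":  np.array([-0.3, 0.9, 0.4]),  # code never billed for this patient
}

d_billed = cosine_distance(embeddings["john_doe"], embeddings["icd_e11"])
d_unbilled = cosine_distance(embeddings["john_doe"], embeddings["icd_c50"])

# A systematically smaller distance to codes billed for the patient is
# the statistical signal the attack exploits.
print(d_billed < d_unbilled)
```

Because names and the billing codes of their co-occurring visits appear in similar contexts during training, the billed-code distance tends to be smaller, which is what makes the comparison informative.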
Conclusions:
Special care must be taken when sharing word embeddings created from clinical texts, as current approaches may compromise patient privacy. If PHI removal is used for anonymization before traditional word embeddings are trained, it is possible to attribute sensitive information to patients who have not been fully de-identified by the (necessarily imperfect) removal algorithms. A promising alternative (i.e., anonymization by PHI replacement) may avoid these flaws. Our observations are critical, as an increasing number of researchers are pushing for publicly released health data.
Copyright
© The authors. All rights reserved. This is a privileged document currently under peer-review/community review (or an accepted/rejected manuscript). Authors have provided JMIR Publications with an exclusive license to publish this preprint on its website for review and ahead-of-print citation purposes only. While the final peer-reviewed paper may be licensed under a CC BY license on publication, at this stage the authors and publisher expressly prohibit redistribution of this draft paper other than for review purposes.