Density Matrix Methods in Quantum Natural Language Processing

Title: Density Matrix Methods in Quantum Natural Language Processing
Authors: Bruhn, Saskia
Abstract: Though vectors are the most commonly used structure to encode the meaning of words computationally, they fail to represent uncertainty about the underlying meaning. Ambiguous words are best described by probability distributions over their various possible meanings; putting them in context should disambiguate their meaning. Similarly, lexical entailment relationships can be characterized using probability distributions: a word higher up in the hierarchical order is then modeled as a probability distribution over the meanings of the words it subsumes. The DisCoCat model, which is inspired by the mathematical structure of quantum theory, proposes density matrices as word embeddings able to capture this structure. In quantum mechanics, density matrices describe systems whose states are only known with uncertainty. First experiments have demonstrated their ability to capture word similarity, word ambiguity, and lexical entailment structures. An adaptation of the Word2Vec model, called Word2DM, can learn such density matrix word embeddings. To enforce that the learned matrices possess the properties of density matrices, the model learns intermediary matrices and derives the density matrices from them; this strategy causes the parameter updates to be sub-optimal. This thesis proposes a hybrid quantum-classical algorithm for learning density matrix word embeddings that resolves this issue. Since density matrices naturally describe quantum systems, no intermediary matrices are needed, and the shortcomings of the classical Word2DM model can, in theory, be circumvented. The parameters of a variational quantum circuit are optimized such that the qubits' state corresponds to the word's meaning; the density matrix description of that state is then extracted and used as the word embedding. For each word in the vocabulary, a separate set of parameters corresponding to its density matrix embedding is learned.
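The intermediary-matrix workaround described above can be sketched in a few lines: an unconstrained matrix B is learned, and a valid density matrix is derived from it, since B·B† is Hermitian and positive semi-definite by construction and only needs trace normalization. The function name and the exact normalization are illustrative assumptions, not taken from the thesis.

```python
import numpy as np

def density_from_intermediary(b: np.ndarray) -> np.ndarray:
    """Derive a valid density matrix from an unconstrained matrix B.

    rho = B B^dagger / tr(B B^dagger) is Hermitian, positive
    semi-definite, and has unit trace by construction.
    """
    rho = b @ b.conj().T
    return rho / np.trace(rho)

# hypothetical 2x2 intermediary matrix for one word
b = np.array([[1.0, 0.5],
              [0.0, 1.0]])
rho = density_from_intermediary(b)
```

Because gradients flow through B rather than through the density matrix itself, parameter updates in this scheme can be sub-optimal, which is the shortcoming the hybrid quantum-classical approach aims to avoid.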
A first implementation was executed on a quantum simulator in the course of this thesis. The objective function used decreases the distance between co-occurring words and increases the distance between words that do not occur together, so training success can be measured by evaluating the similarity of the learned word embeddings. The model was trained on text corpora with small vocabulary sizes, and the learned embeddings showed the expected similarities between the words in the text. Implementation issues on real quantum hardware, such as extracting complete state representations and calculating gradients for this model, are also discussed.
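An objective of this kind can be illustrated with a standard similarity measure on density matrices, the trace inner product tr(ρσ), which equals 1 for identical pure states and 0 for orthogonal ones. The contrastive loss below is a minimal sketch under that assumption; the thesis's actual distance measure and loss may differ.

```python
import numpy as np

def trace_similarity(rho: np.ndarray, sigma: np.ndarray) -> float:
    """Similarity of two density matrices via the trace inner product tr(rho sigma)."""
    return float(np.real(np.trace(rho @ sigma)))

def contrastive_loss(rho_word: np.ndarray,
                     rho_context: np.ndarray,
                     rho_negative: np.ndarray) -> float:
    """Pull a co-occurring (word, context) pair together and push a
    non-co-occurring (word, negative) pair apart."""
    pull = 1.0 - trace_similarity(rho_word, rho_context)
    push = trace_similarity(rho_word, rho_negative)
    return pull + push

# pure-state density matrices |0><0| and |1><1| as a toy example
rho0 = np.array([[1.0, 0.0], [0.0, 0.0]])
rho1 = np.array([[0.0, 0.0], [0.0, 1.0]])
loss = contrastive_loss(rho0, rho0, rho1)  # identical positive pair, orthogonal negative
```

In the hybrid setting, this loss would be minimized over the variational circuit parameters of each word rather than over matrix entries directly.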
Subject Keywords: Quantum Natural Language Processing; Quantum Word Embeddings; Word Embeddings; Density Matrix Word Embeddings; Quantum Neural Networks
Issue Date: 2-May-2022
License name: Attribution 3.0 Germany
Appears in Collections:FB08 - Hochschulschriften

Files in This Item:
File: Bruhn_Masterthesis_Density-Matrix-Methods-QNLP_2022.pdf
Description: Density Matrix Methods in Quantum Natural Language Processing
Size: 1,25 MB
Format: Adobe PDF

This item is licensed under a Creative Commons License.