One of the most demanding—yet fascinating—challenges that researchers, scientists and academics are facing is understanding the mathematics of language.
In this context, the discipline that deals with defining and applying models and tools to the treatment of natural language is called computational linguistics (CL). By nature, CL is interdisciplinary, as it combines methods and techniques of linguistics with those of computer science, artificial intelligence (AI) and statistics.
Closely related to CL is natural language processing (NLP), a subfield of linguistics, computer science and AI that deals with the interactions between computers and human (natural) languages, and more specifically with how to instruct computers to process and analyze large amounts of natural language data.
NLP has a wide range of potential applications in areas like machine translation, automatic summarization, speech recognition, question answering, information extraction and smart user interfaces. Quantum computers, which use qubits instead of the bits used by traditional computers, promise superior computing power for certain classes of problems and open the possibility of significant developments and innovations in the field of computational linguistics. In this way, we can begin to talk about quantum natural language processing (QNLP).
Quantinuum, an integrated quantum computing company, recently collaborated with the University College London (UCL) and the British Broadcasting Corporation (BBC) to examine the business potential of QNLP and quantum-inspired natural language processing.
The goal of computational linguistics is to comprehend language from a computational standpoint. Theoretically, a computer that could speak with human-like proficiency might have meaningful conversations, learn new languages, translate between them, and gain information through reading texts. Additionally, understanding language through computers may also shed light on how the human brain works.
While NLP is effective, QNLP aims to outperform it by translating language into coded circuits that can be processed by quantum computers.
Current NLP language models built with deep neural networks and transformer models consume a lot of energy, raising environmental concerns. If, within a decade, quantum computers scale from a few hundred qubits to millions of qubits as projected, they could enable broader QNLP applications that are more effective and faster, can handle very large datasets, use less power, and have a smaller environmental impact.
Engineers can greatly enhance AI by incorporating QNLP. Since building AI models requires large amounts of data, quantum computing could significantly accelerate the training process, potentially cutting training time to a few hours, or even minutes.
Quantinuum is accelerating quantum computing and its application across chemistry, cybersecurity, finance and optimization. With their combined knowledge and experience, as well as a body of QNLP research, Stephen Clark, head of AI at Quantinuum, and Bob Coecke, chief scientist at Quantinuum, give the company a distinct tactical edge in both current and future QNLP applications.
Content discovery and archive retrieval
To investigate the commercial potential of QNLP and quantum-inspired natural language processing, Quantinuum entered a collaboration with UCL and the BBC.
The consortium will build on a lengthy investigation into quantum mechanics and linguistics, led by Coecke, Clark, and Mehrnoosh Sadrzadeh, a computer science professor at UCL. The consortium is funded by the Royal Academy of Engineering for a Senior Research Fellowship at UCL.
To make a sentence understandable to a computer, QNLP transforms it into a logical format (syntax tree). By distinguishing between verbs, nouns, prepositions and adjectives using mathematical linguistics, the software arranges the syntax tree into parts of speech. The sentence’s components are then categorized based on how the words are related to one another.
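As a rough illustration of this tagging-and-grouping step, a sentence can be labeled by part of speech and folded into a nested subject-verb-object tree. The mini-lexicon, type labels and tree shape below are hypothetical simplifications for illustration, not Quantinuum's actual software:

```python
# Toy sketch: tag each word of a sentence with a part of speech, then
# group the tagged words into a simple nested tree. The lexicon and
# tree shape are illustrative assumptions only.

POS = {  # hypothetical mini-lexicon
    "John": "noun", "Mary": "noun", "flower": "noun",
    "gave": "verb", "a": "determiner",
}

def tag(sentence):
    """Label each word with its part of speech."""
    return [(word, POS.get(word, "unknown")) for word in sentence.split()]

def to_tree(tagged):
    """Group a subject-verb-object sentence into a nested tree:
    (subject, (verb, object-phrase))."""
    subject, verb, obj = tagged[0], tagged[1], tagged[2:]
    return (subject, (verb, tuple(obj)))

tagged = tag("John gave Mary a flower")
tree = to_tree(tagged)
```

Real QNLP pipelines use a proper parser and richer grammatical types (for example, pregroup types), but the principle is the same: words are labeled by type, then composed according to the grammar.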
Like the sentence diagrams you learned in elementary school, the sentence is transformed into a string diagram. This not only simplifies NLP design on quantum hardware but can also be seen as an enriched tensor network—a mathematical structure with many applications in quantum physics. Figure 1 shows an example for the sentence, “John gave Mary a flower.”
A sentence represented as a string diagram can then be turned, or mapped, into a concrete quantum circuit or tensor network. For instance, after a proper qubit assignment, the sentence, “John walks in the park,” produces the circuit in Figure 2.
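The qubit assignment can be sketched as a bookkeeping step: each atomic grammatical type is allotted some number of qubits, and a word's wire count is the sum over its type. The types and per-type counts below are illustrative assumptions, not the assignment behind Figure 2:

```python
# Toy sketch of the qubit-assignment step. Each atomic type ("n" for
# noun, "s" for sentence) gets a fixed qubit budget; a word's qubit
# count is the sum over its simplified, pregroup-style type. All
# numbers here are assumptions for illustration.

QUBITS_PER_TYPE = {"n": 1, "s": 1}  # assumed: one qubit per atomic type

WORD_TYPES = {  # simplified types for "John walks in the park"
    "John":  ["n"],
    "walks": ["n", "s"],       # intransitive verb
    "in":    ["s", "s", "n"],  # preposition modifying the sentence
    "the":   ["n", "n"],       # determiner
    "park":  ["n"],
}

def qubits_for(word):
    """Total qubits (wires) assigned to one word's type."""
    return sum(QUBITS_PER_TYPE[t] for t in WORD_TYPES[word])

total = sum(qubits_for(w) for w in "John walks in the park".split())
```

Under this assumed assignment the sentence uses nine wires in total; the actual count depends on how many qubits each atomic type is given, which is a design choice of the pipeline.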
Quantum circuits that have been constructed and encoded are ready to be optimized for machine learning applications like text classification.
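A minimal sketch of what "optimizing" such a circuit for classification means, with a single rotation angle standing in for a full parameterized circuit (this toy setup is an assumption for illustration, not Quantinuum's training procedure):

```python
import math
import random

# Toy sketch: train one "circuit" parameter so the measured probability
# matches a binary text label. A single qubit rotation stands in for a
# full parameterized quantum circuit.

random.seed(0)

def predict(theta):
    """Probability of measuring label 1 for a qubit rotated by theta:
    p = sin^2(theta / 2)."""
    return math.sin(theta / 2) ** 2

def loss(theta, label):
    """Squared error between predicted probability and target label."""
    return (predict(theta) - label) ** 2

def train(label, steps=200, lr=0.5):
    """Gradient descent on the angle, with a finite-difference gradient."""
    theta = random.uniform(0, math.pi)
    for _ in range(steps):
        eps = 1e-4
        grad = (loss(theta + eps, label) - loss(theta - eps, label)) / (2 * eps)
        theta -= lr * grad
    return theta

theta = train(label=1)
```

In practice, QNLP experiments optimize many circuit parameters in a hybrid quantum-classical loop, estimating gradients from measurement statistics rather than from a closed-form probability.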
“Quantum mechanics, applied to the language of composition, can explain what happens when you put systems together, how they differently start interacting. If you draw two lines, it means you got two things, and then they can start interacting,” Coecke said.
To enable tasks like content discovery and archival retrieval, the BBC is looking for innovative ways to express content in formats that computers can understand. The prior work the corporation and Sadrzadeh did on improving personalized recommendations using multimodal data serves as a foundation for this.
In the U.K. and beyond, the BBC archives span a century of world news and cultural life. With nearly 15 million objects, including audio, film and text documentation, as well as toys, games, goods, artifacts and historical equipment, it is one of the largest broadcast archives in the world.
Language processing will benefit enormously from fault-tolerant quantum processors, especially once they become available at scale, according to Quantinuum. “We are still doing things on classical computers, but if we can get quantum computers, then it’s going to be a completely different ballgame,” Coecke said.
As Coecke further explains, graphical languages such as the ZX calculus were initially met with a lot of skepticism in the quantum computing community. They said, “This is never going to lead to anything.” As we know today, all big companies are investing in the quantum industry and using the technology. Additionally, Peter Shor, professor of applied mathematics at MIT, recently submitted a research paper to arXiv explaining how to build graphical quantum Clifford-encoder compilers from the ZX calculus.
“In academia, people think what they are doing is the thing that everybody should be doing,” Coecke said. “Companies, on the other hand, do what is useful. It is very interesting that what we studied years ago, although this topic was at a very basic level, is today used by companies like Google and IBM.”
If a company like the BBC had used standard AI and machine learning methods, it would not have worked, because you really need to understand what is going on, according to Coecke. They would have needed the BBC archives to train a system to deal with the BBC archive, because that is how such systems normally work.
“Our new theory, from the start, was much more meaning aware,” he said. “It basically understands the words, and how they interact, rather than just training AI and machine learning. We came up with a new theory about language in circuits.”
He concluded, “In the circuits, all languages become the same. That’s a big statement.”