
IonQ leverages quantum computing to advance AI

IonQ has published new research applying quantum computing to AI and machine learning, marking progress in hybrid quantum-classical approaches that enhance large language models (LLMs) and generative AI.
In two new research papers, IonQ researchers detailed how quantum computing can support advanced materials development by generating synthetic images of rare anomalies and enhancing large language models by adding a quantum layer for fine-tuning. These efforts reflect IonQ’s continued focus on practical, near-term commercial quantum applications in AI to drive value in data-scarce settings and for complex tasks.
In a recent paper, IonQ introduced a hybrid quantum-classical architecture for LLM fine-tuning, in which a pre-trained LLM is supplemented with a small set of training data to customise its behaviour through quantum machine learning. To compare performance against classical methods, IonQ researchers took a widely used open-source LLM for word prediction and incorporated a parameterised quantum circuit as a new layer. With this quantum fine-tuning step, the hybrid model was repurposed to classify sentence sentiment.
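To give a flavour of the idea, the following is a minimal illustrative sketch, not IonQ's published architecture: a tiny two-qubit parameterised circuit, simulated classically with NumPy, acts as a trainable layer. Classical features from a frozen language model are angle-encoded into qubit rotations, trainable rotation parameters follow an entangling gate, and a Pauli-Z expectation value serves as the sentiment logit. All function names and the circuit layout here are hypothetical.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 as control, qubit 1 as target
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def quantum_layer(features, params):
    """Hypothetical 2-qubit parameterised circuit used as a trainable
    layer: angle-encode two classical features, entangle, apply
    trainable rotations, and return <Z> on qubit 0 as a logit."""
    state = np.zeros(4)
    state[0] = 1.0                                             # |00>
    state = np.kron(ry(features[0]), ry(features[1])) @ state  # encoding
    state = CNOT @ state                                       # entangling gate
    state = np.kron(ry(params[0]), ry(params[1])) @ state      # trainable layer
    Z0 = np.kron(np.diag([1.0, -1.0]), np.eye(2))              # Z on qubit 0
    return float(state @ Z0 @ state)  # state is real, so <psi|Z0|psi>

def sentiment_prob(features, params):
    """Map the expectation value in [-1, 1] to a probability in [0, 1]."""
    return 0.5 * (1.0 + quantum_layer(features, params))
```

In a real hybrid pipeline the `params` would be optimised by gradient descent alongside (or instead of) the classical weights, and the circuit would run on quantum hardware rather than a simulator; the point of the sketch is only that the quantum layer is differentiable-in-principle and slots in like any other network layer.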
The resulting hybrid quantum approach surpassed classical-only methods with a similar number of parameters in classification accuracy by a meaningful margin. The researchers observed a trend of increasing classification accuracy as the number of qubits grew. They also projected significant energy savings for inference with the hybrid quantum algorithm, relative to all-classical models, as the problem size scales beyond 46 qubits. This paves the way for quantum-enhanced fine-tuning of broader classes of foundational AI models, including models for natural language processing, image processing, and property prediction in chemistry, biology, and materials science.
“This work highlights how quantum computing can be strategically integrated into classical AI workflows, taking advantage of increased expressivity to enhance traditional AI LLMs in rare-data regimes,” said Masako Yamada, Director of Applications Development at IonQ. “LLMs have demonstrated versatility far beyond pure ‘language’ applications, and we believe hybrid quantum-classical models are well positioned to unlock the next wave of AI capabilities.”
In a separate research publication, IonQ collaborated with a top-tier automotive manufacturer to apply quantum-enhanced generative adversarial networks (GANs) to materials science. Researchers trained GANs to sample the output distribution of a quantum circuit, generating synthetic images of steel microstructures that augment conventional imaging techniques, where data is often sparse and, therefore, model trainability is poor.
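For intuition only, and again as a hypothetical sketch rather than the paper's model: the quantum side of such a QGAN can be thought of as a sampler. A small parameterised circuit defines a probability distribution over measurement bitstrings, and those samples replace the classical noise prior fed into the GAN generator. A NumPy state-vector simulation of a two-qubit example:

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT with qubit 0 as control, qubit 1 as target
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def quantum_prior(params, n_samples, seed=0):
    """Hypothetical quantum noise prior for a QGAN: simulate a small
    parameterised 2-qubit circuit, then sample bitstrings from its
    measurement distribution.  Returns an (n_samples, 2) array of
    0/1 values that a generator network would consume as latent input."""
    state = np.zeros(4)
    state[0] = 1.0                                  # |00>
    state = np.kron(ry(params[0]), ry(params[1])) @ state
    state = CNOT @ state                            # entangle the qubits
    probs = state ** 2                              # Born rule (real amplitudes)
    probs = probs / probs.sum()                     # guard against float drift
    rng = np.random.default_rng(seed)
    idx = rng.choice(4, size=n_samples, p=probs)
    # Unpack each basis-state index into per-qubit bits (qubit 0 is MSB)
    return np.stack([(idx >> 1) & 1, idx & 1], axis=1).astype(float)
```

During QGAN training, the circuit `params` would be updated together with the generator so that the learned quantum distribution best supports generating realistic microstructure images; here the sampler is fixed and merely demonstrates the mechanism.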
The microstructure images produced with IonQ’s hybrid QGAN method achieved a higher quality score than those produced by baseline classical generative models in up to 70% of cases. Industrial AI models often rely on proprietary data sets, which can suffer from data scarcity, imbalance, or high acquisition cost. The ability to supplement image data is therefore vital to developing AI models whose objective is to optimise manufacturing process parameters so that the resulting material properties meet stringent requirements.
“This work is a compelling example of how the combination of IonQ’s quantum computers and classical machine learning can produce impressive results for materials science and manufacturing,” said Ariel Braunstein, SVP of Product at IonQ. “Using classical computing to augment experimental data with synthetic generation can be expensive and limited in value. This work shows that a quantum hybrid approach can yield higher quality images with less data than classical methods and could lead to new applications across industries such as materials science, medical imaging, and financial forecasting.”
