Wednesday 2:50 PM–3:25 PM in Main Room

Qumodes and Quantum Generalized Linear Models

Colleen M. Farrelly, Ike Uchenna Chukwu

Audience level:
Experienced

Description

Generalized linear models are ubiquitous in statistics and machine learning; however, defining a link function that fits an outcome of interest remains a major challenge in some real-world applications. We propose a novel solution: non-Gaussian transformation gates within a qumode-based quantum computing framework. Results suggest this solution is effective and easily extended.

Abstract

Generalized linear models are the simplest instance of link-based statistical models, which are built on the geometry of an outcome's underlying probability distribution (typically from the exponential family). Machine learning algorithms provide alternative ways to minimize a model's sum of squared errors (the error between predicted and actual values on a test set). However, some deep results relating the exponential family to affine connections in differential geometry suggest possible alternatives to link functions:

1. Algorithms that continuously deform the outcome distribution from known results
2. Algorithms that superpose all possible distributions and collapse to the one that fits a dataset
3. Leveraging the fact that some quantum computing gates, such as the non-Gaussian transformation gate, essentially perform (1) natively and in a computationally efficient way

This project provides a proof of concept for leveraging specific hardware gates to solve the affine connection problem. The algorithm first applies a dimensionality reduction technique well known in the manifold learning literature, t-distributed stochastic neighbor embedding (t-SNE), and then uses Strawberry Fields simulations of qumode-based quantum circuits (similar to Xanadu's proposed quantum computer) to create a generalized linear model in which a non-Gaussian transformation gate replaces the link function. Results of the quantum generalized linear model are then compared to random forest, Bayesian adaptive regression tree, boosted regression, homotopy-based LASSO, Tweedie regression, and differential-geometry-based least angle regression models on classical systems. Results suggest improved performance, and this quantum-computing-based regression model can be extended to many other, more complicated statistical models, such as generalized estimating equations, hierarchical regression models, and even homotopy-continuation problems.
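The link-deformation idea in (1) can be illustrated classically. The sketch below is an illustration, not the authors' code: it uses a Box-Cox-style power family in plain NumPy rather than a qumode circuit, omits the t-SNE preprocessing step, and all variable names are hypothetical. It treats the inverse link as a continuous family indexed by a parameter p and learns p alongside the regression weights:

```python
# Hypothetical classical analogue of the talk's idea: instead of fixing a
# GLM link function, parameterize a continuous family of inverse links
# g_inv(eta; p) and select p by fit quality -- mirroring how a
# non-Gaussian gate continuously deforms the outcome distribution.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([0.8, -0.5, 0.3])
y = np.exp(X @ true_w) + rng.normal(scale=0.05, size=200)  # log-link data

def predict(w, p, X):
    eta = X @ w
    # Box-Cox-style inverse link: p -> 0 recovers exp (the log link),
    # p = 1 gives a shifted identity link.
    if abs(p) < 1e-8:
        return np.exp(eta)
    return np.maximum(1.0 + p * eta, 1e-8) ** (1.0 / p)

def loss(w, p):
    return np.mean((predict(w, p, X) - y) ** 2)

# Grid search over the link parameter p, with normalized
# finite-difference gradient descent on the weights w.
eps, best = 1e-5, None
for p in np.linspace(-0.5, 1.0, 16):
    w = np.zeros(3)
    for _ in range(500):
        grad = np.array([(loss(w + eps * e, p) - loss(w - eps * e, p)) / (2 * eps)
                         for e in np.eye(3)])
        w -= 0.05 * grad / (1.0 + np.linalg.norm(grad))  # bounded step
    if best is None or loss(w, p) < best[0]:
        best = (loss(w, p), p, w)

print("best link parameter p:", round(best[1], 2), "MSE:", round(best[0], 4))
```

Here the learned p plays the role that the non-Gaussian gate's parameters play in the qumode circuit: a quantum version would tune the gate parameters by the same kind of loss minimization instead of searching a classical link family.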
