What makes an optimizer “quantum aware”?

Read on as we explore this interesting question in a bit more detail.

[THREAD]
1) First off, what is an “optimizer”? In ML libraries like @TensorFlow or @PyTorch, these classes iteratively update the values of your model parameters according to some recipe. Many also update hyperparameters like the learning rate. 📈
2) Commonly encountered optimizers include SGD, BFGS, Adam, etc. Many of these can be used directly for QML.

For instance, though Adam was born from deep learning, it is used in many QML papers, since it often works nicely out of the box.
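To make "iteratively update the values of your model parameters" concrete, here's a toy sketch in pure Python of the kind of update step these optimizer classes automate. (Illustrative only: the cost function is a made-up quadratic, and real optimizers like Adam additionally track running moment estimates and adapt the step size per parameter.)

```python
# Toy gradient-descent "step": nudge each parameter against the
# gradient of the cost, and repeat until the cost stops improving.

def cost(params):
    # a simple quadratic bowl with its minimum at (1.0, -2.0)
    return (params[0] - 1.0) ** 2 + (params[1] + 2.0) ** 2

def grad(params):
    # analytic gradient of the cost above
    return [2 * (params[0] - 1.0), 2 * (params[1] + 2.0)]

def sgd_step(params, lr=0.1):
    g = grad(params)
    return [p - lr * gi for p, gi in zip(params, g)]

params = [0.0, 0.0]
for _ in range(100):
    params = sgd_step(params)

print(params)  # converges toward [1.0, -2.0]
```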
3) But there’s no free lunch 🚫🥪

No optimizer is guaranteed to work well across all types of problems.

When you’re tackling tougher domain-specific problems, it can make sense to use an optimizer that “knows” a little more about the structure of your problem.
4) Here’s where “quantum-aware” optimizers enter the story.

These optimizers have taken advanced courses in quantum computing and are eager to put their new-found skills into practice. ⚛️💻

This can potentially lead to optimizations which are quicker, easier, or cheaper.
5) Some quantum-aware optimizer examples:

Rotosolve/Rotoselect
Quantum Natural Gradient
iCANS/Rosalin
6) Rotosolve/Rotoselect are clever because they avoid gradient computations entirely.

Since the cost of most QML models is sinusoidal in each parameter, you can solve analytically for the optimum (at least with respect to a single parameter at a time)

pennylane.ai/qml/demos/tuto…
arxiv.org/abs/1905.09692
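A minimal pure-Python sketch of the Rotosolve idea, assuming (as the paper shows for circuits of single-qubit rotations) that the cost is exactly sinusoidal in the chosen parameter, C(θ) = a·sin(θ + b) + c. The `cost` function below is a hypothetical stand-in for a quantum expectation value:

```python
import math

# Rotosolve's gradient-free single-parameter update (arXiv:1905.09692):
# three cost evaluations pin down a, b, c, so we can jump straight to
# the minimum in theta -- no gradients needed.

def rotosolve_step(cost, theta):
    m0 = cost(theta)                 # C(theta)
    mp = cost(theta + math.pi / 2)   # C(theta + pi/2)
    mm = cost(theta - math.pi / 2)   # C(theta - pi/2)
    # closed-form argmin of a*sin(theta + b) + c
    return theta - math.pi / 2 - math.atan2(2 * m0 - mp - mm, mp - mm)

def cost(theta):
    # hypothetical sinusoidal cost: a=0.7, b=0.3, c=0.1
    return 0.7 * math.sin(theta + 0.3) + 0.1

theta_opt = rotosolve_step(cost, theta=1.2)
print(cost(theta_opt))  # ~ -0.6, i.e. c - a, the global minimum
```

In the real setting each evaluation is a run of the quantum circuit, and Rotosolve sweeps this update across the parameters one at a time.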
7) Quantum Natural Gradient is more geometric in nature.

It exploits the intrinsic geometry of quantum information to determine more “natural” parameter updates than conventional gradient descent.

pennylane.ai/qml/demos/tuto…
arxiv.org/abs/1909.02108
arxiv.org/abs/1912.08660
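The flavor of the update is easy to sketch: plain gradient descent does θ ← θ − η∇C, while natural gradient rescales by the inverse of a metric tensor g, θ ← θ − η g⁻¹∇C. In QNG, g is the Fubini-Study metric computed from the circuit; in the pure-Python toy below a fixed 2×2 matrix stands in for it (an assumption for illustration, not a real quantum metric):

```python
# Sketch of a natural-gradient step: rescale the gradient by the
# inverse metric before updating,
#   theta <- theta - lr * g^{-1} grad.
# In QNG (arXiv:1909.02108) g would be the Fubini-Study metric.

def solve_2x2(g, b):
    # solve g x = b for a 2x2 matrix via Cramer's rule
    det = g[0][0] * g[1][1] - g[0][1] * g[1][0]
    x0 = (b[0] * g[1][1] - b[1] * g[0][1]) / det
    x1 = (g[0][0] * b[1] - g[1][0] * b[0]) / det
    return [x0, x1]

def natural_gradient_step(params, grad, metric, lr=0.1):
    nat_grad = solve_2x2(metric, grad)
    return [p - lr * ng for p, ng in zip(params, nat_grad)]

# Hypothetical metric: the first parameter direction is "stiffer",
# so the natural update takes a smaller step along it than plain SGD.
metric = [[4.0, 0.0], [0.0, 1.0]]
params = natural_gradient_step([0.0, 0.0], [1.0, 1.0], metric)
print(params)  # ~ [-0.025, -0.1]
```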
8) NatGrad methods have a rich theory, but they are not widely used in classical ML because other optimizers are cheaper per step. The quantum version also carries extra costs, but these might be worth paying for the performance gains.

arxiv.org/abs/2004.14666
arxiv.org/abs/2005.05172
9) The iCANS/Rosalin optimizer family takes a different approach. Its goal is to minimize the bottleneck of evaluating quantum circuits, by adaptively tuning hyperparameters like the number of measurement shots to take.

pennylane.ai/qml/demos/tuto…
arxiv.org/abs/1909.09083
arxiv.org/abs/2004.06252
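One ingredient of the Rosalin approach is easy to sketch in pure Python: when estimating ⟨H⟩ = Σᵢ cᵢ⟨hᵢ⟩, spend a fixed shot budget across the Hamiltonian terms in proportion to |cᵢ|, instead of measuring every term with the same number of shots. (Simplified: the real iCANS/Rosalin optimizers also adapt the budget per optimization step based on gradient estimates and their variance. The coefficients below are hypothetical.)

```python
# Shot-frugal allocation in the spirit of Rosalin (arXiv:2004.06252):
# terms with larger coefficients dominate the variance of <H>, so they
# deserve more of the measurement budget.

def allocate_shots(coeffs, total_shots):
    weights = [abs(c) for c in coeffs]
    norm = sum(weights)
    # proportional allocation, rounded down; leftover shots go to
    # the largest-weight term
    shots = [int(total_shots * w / norm) for w in weights]
    leftover = total_shots - sum(shots)
    shots[weights.index(max(weights))] += leftover
    return shots

coeffs = [0.5, -1.5, 0.25]  # hypothetical Hamiltonian coefficients
shots = allocate_shots(coeffs, 100)
print(shots)  # [22, 67, 11] -- the dominant term gets the most shots
```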
10) One interesting thing about the iCANS/Rosalin family is that these optimizers can even be combined with other optimizers like SGD, Adam, QNG, etc.
11) So we have optimizers that work well for deep learning problems, and optimizers that work well for QML problems.

Could the next frontier be optimizers that are specifically designed to work well for hybrid quantum-classical models? 🤔
12) What other optimizers could qualify as “quantum aware”?

What other optimization strategies could we come up with that cleverly leverage the structures and constraints of QM?