Migrate QGAN tutorial to SamplerQNN #555
Conversation
@ElePT for your attention as an expert in the tutorials :)
Pull Request Test Coverage Report for Build 4264056896
💛 - Coveralls
Hello! Thank you for the hard work refactoring this tutorial. Overall it looks really good. I left some minor comments on the section numbering, a few code suggestions (one regarding primitive syntax), and a few suggestions for the wording that I hope make sense.
"## Tutorial\n", | ||
"\n", | ||
"### Data and Representation\n", | ||
"## 2. Data and Representation\n", | ||
"\n", | ||
"First, we need to load our training data $X$.\n", |
I cannot add a suggestion directly on the line below, but would it make sense to say "In this tutorial, the training data is given by a 2D multivariate normal distribution"?
Totally makes sense, replaced, thanks!
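For readers following along, here is a minimal sketch of how such 2D multivariate normal training data could be generated. The mean, covariance, sample count, and seed below are illustrative assumptions, not necessarily the values used in the tutorial.

```python
import numpy as np

mean = [0.0, 0.0]                # assumed mean of the 2D distribution
cov = [[1.0, 0.0], [0.0, 1.0]]   # assumed (diagonal) covariance matrix
num_samples = 1000               # assumed number of training samples

rng = np.random.default_rng(seed=123456)
training_data = rng.multivariate_normal(mean, cov, size=num_samples)
print(training_data.shape)  # (1000, 2): num_samples rows, 2 features each
```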
@woodsp-ibm would be nice to have your review as usual ;)
@@ -79,8 +87,8 @@
 "import torch\n",
 "from qiskit.utils import algorithm_globals\n",
 "\n",
-"torch.manual_seed(42)\n",
-"algorithm_globals.random_seed = 42"
+"algorithm_globals.random_seed = 123456\n",
Above it states this:
We first begin by fixing seeds in the random number generators, then we will...
I assume people by now would realize that this is just for reproducibility of the outcome in this tutorial, and for no other reason. Would it harm to state that?
No harm at all, added.
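For context, a minimal sketch of the seed-fixing cell being discussed, based on the imports and seed value visible in the diff above. Whether `torch.manual_seed` is set to the same value in the new tutorial is an assumption here; the seeds only serve reproducibility of the tutorial's outcome.

```python
import torch
from qiskit.utils import algorithm_globals

# Fix seeds purely so the tutorial's results are reproducible.
algorithm_globals.random_seed = 123456      # value taken from the diff above
_ = torch.manual_seed(123456)               # assumed same value; assigned to _ to suppress notebook output
```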
"\n", | ||
"The qGAN \\[1\\] is a hybrid quantum-classical algorithm used for generative modeling tasks. The algorithm uses the interplay of a quantum generator $G_{\\theta}$, i.e., an ansatz, and a classical discriminator $D_{\\phi}$, a neural network, to learn the underlying probability distribution given training data.\n", | ||
"\n", | ||
"The generator and discriminator are trained in alternating optimization steps, where the generator aims at generating samples that will be classified by the discriminator as training data samples (i.e, samples extracted from the real training distribution), and the discriminator tries to differentiate between original training data samples and data samples from the generator (in other words, telling apart the real and generated distributions). The final goal is for the quantum generator to learn a representation for the training data's underlying probability distribution.\n", | ||
"The generator and discriminator are trained in alternating optimization steps, where the generator aims at generating probabilities that will be classified by the discriminator as training data values (i.e, probabilities from the real training distribution), and the discriminator tries to differentiate between original distribution and probabilities from the generator (in other words, telling apart the real and generated distributions). The final goal is for the quantum generator to learn a representation for the target probability distribution.\n", |
This is a really minor comment on some unchanged text above:
of a quantum generator $G_{\theta}$, i.e., an ansatz, and a classical discriminator $D_{\phi}$, a neural network
The generator has an "i.e." (i.e., an ansatz), whereas the discriminator just says "a neural network".
I was thinking about the wording below, as it also explains the ansatz a little (though the notebook goes into that later):
of a quantum generator $G_{\theta}$, an ansatz (parametrized quantum circuit), and a classical discriminator $D_{\phi}$, a neural network
Added "(parametrized quantum circuit)"
@@ -237,7 +182,17 @@
 }
 },
 "source": [
-"We move to PyTorch modeling and start from converting data arrays into tensors and create a data loader from our training data."
+"## 3. Definitions of the Neural Networks\n",
+"In this section we define two neural networks as described above:\n",
Above it described the generator as an ansatz (I am thinking of the very minor comment I made about it saying "quantum generator, i.e., an ansatz") - do people think of that as a neural network?
Slightly rephrased the sentences.
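Since this section defines the two networks described above, here is a hedged sketch of the classical discriminator side as a small PyTorch module. The layer widths and activations are illustrative assumptions, not the tutorial's exact architecture; in the alternating training described earlier, this module and the quantum generator would each get their own optimizer and be updated in turn.

```python
import torch
from torch import nn


class Discriminator(nn.Module):
    """Classical discriminator: maps a sample to a single 'real vs. generated' score."""

    def __init__(self, input_size: int):
        super().__init__()
        # Layer widths and activations are assumptions chosen for illustration only.
        self.net = nn.Sequential(
            nn.Linear(input_size, 20),
            nn.LeakyReLU(0.2),
            nn.Linear(20, 1),
            nn.Sigmoid(),  # probability that the input comes from the real training data
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


discriminator = Discriminator(input_size=2)  # 2 features for the 2D training data
scores = discriminator(torch.rand(5, 2))     # batch of 5 illustrative samples
```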
The issue seems to be ...
Ah, shame on me, I don't see typos! I thought I typed it correctly, but the word is not in the dictionary.
* updated tutorial
* new training technique
* fix spelling
* Update docs/tutorials/04_torch_qgan.ipynb (Co-authored-by: ElePT <[email protected]>)
* code review
* Update docs/tutorials/04_torch_qgan.ipynb (Co-authored-by: Steve Wood <[email protected]>)
* fix spell
* revert dictionary
* update dictionary

Co-authored-by: ElePT <[email protected]>
Co-authored-by: Steve Wood <[email protected]>
Summary
The tutorial is updated to make use of SamplerQNN. The content of the tutorial has been heavily revised.