Quantum ML Scribe

Quick notes on quantum machine learning.


These are quick notes from a really nice lecture by Maria Schuld.

Building Blocks

We consider a measurement matrix $M$, a diagonal matrix of measurement values, together with a probability assignment $p$ over those values.

$$
\begin{aligned}
M &= \begin{pmatrix} m_1 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & m_n \end{pmatrix} \\
p &= \{ p_1, \cdots, p_n \}
\end{aligned}
$$

The expectation of such a random variable can be written as a quadratic form: for a vector $q$ with $q_i^2 = p_i$,

$$
\langle M \rangle = \sum_i p_i m_i = q^T M q
$$
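A quick numerical check of this identity (the outcome values and probabilities below are made up for illustration):

```python
import numpy as np

# Hypothetical example: three measurement outcomes with probabilities.
m = np.array([1.0, -1.0, 2.0])   # measurement values (diagonal of M)
p = np.array([0.5, 0.3, 0.2])    # probability of each outcome
M = np.diag(m)
q = np.sqrt(p)                   # vector with q_i^2 = p_i

expectation_sum = np.sum(p * m)  # <M> = sum_i p_i m_i
expectation_quad = q @ M @ q     # <M> = q^T M q

assert np.isclose(expectation_sum, expectation_quad)
print(expectation_sum)  # 0.6
```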

Quantum theory revolves around computing expectations of measurements, and these ideas from classical linear algebra extend to the general form

$$
\begin{aligned}
\langle M \rangle &= \left\langle \psi \big| M \big| \psi \right\rangle \\
\psi &= \begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \vdots \\ \alpha_n \end{pmatrix} \in \mathbb{C}^n \\
|\alpha_i|^2 &= p_i
\end{aligned}
$$

In the most general case $M$ need not be diagonal: it is a Hermitian matrix whose eigenvalues are the measurement values.
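A minimal sketch of this general case, using a random Hermitian $M$ and a random normalized state (all values here are arbitrary): diagonalizing $M$ recovers the classical picture, with eigenvalues as measurement values and squared overlaps as probabilities.

```python
import numpy as np

rng = np.random.default_rng(0)

# A random Hermitian M (non-diagonal in general).
A = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
M = (A + A.conj().T) / 2

# A normalized complex state vector psi.
psi = rng.normal(size=3) + 1j * rng.normal(size=3)
psi = psi / np.linalg.norm(psi)

# Quantum expectation <psi|M|psi> (real, since M is Hermitian).
expectation = (psi.conj() @ M @ psi).real

# Classical view: eigenvalues m_i are the measurement values,
# and |<v_i|psi>|^2 are their probabilities.
m, V = np.linalg.eigh(M)
probs = np.abs(V.conj().T @ psi) ** 2

assert np.isclose(probs.sum(), 1.0)
assert np.isclose(expectation, np.sum(probs * m))
```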

Remarks