Adaptive ansatz based on low-rank state preparation

Bibliographic details
Defense year: 2023
Main author: ARAÚJO, Ismael Cesar da Silva
Advisor: Not informed by the institution
Defense committee: Not informed by the institution
Document type: Dissertação (master's thesis)
Access type: Open access
Language: eng
Defense institution: Universidade Federal de Pernambuco (UFPE), Brasil, Programa de Pós-Graduação em Ciência da Computação
Graduate program: Not informed by the institution
Department: Not informed by the institution
Country: Not informed by the institution
Keywords in Portuguese:
Access link: https://repositorio.ufpe.br/handle/123456789/53791
Abstract: Quantum state preparation algorithms define a sequence of unitary operations that load a specific target state on a quantum computer. These algorithms are used in applications such as quantum machine learning. However, some state preparation algorithms have circuit complexity that grows exponentially with the number of qubits in the system. That is the case for amplitude encoding algorithms, an encoding scheme that loads normalized data into the probability amplitudes of the state. To circumvent this overhead in circuit complexity, some works exploit specific properties of quantum states, such as sparsity or symmetry, to optimize the circuit, while other works simplify the quantum circuit to load an approximate quantum state. That is the case of Quantum Generative Adversarial Networks, which use a specific circuit architecture comprised of alternating blocks of single-qubit rotations and two-qubit entangling controlled gates. However, when these networks are trained to load random distributions, we observe that their performance, measured by relative entropy, deteriorates as the number of qubits increases. In this work, we propose different architectures for quantum generative models based on the state preparation algorithm known as Low-Rank. Through experiments loading the log-normal distribution, we show error reductions in quantum state initialization.
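The amplitude encoding mentioned in the abstract maps a normalized data vector of length 2^n onto the probability amplitudes of an n-qubit state. A minimal sketch of the classical normalization step in plain Python (the function name and error handling are illustrative assumptions, not taken from the thesis):

```python
import math

def amplitude_encode(data):
    """Normalize a real vector of length 2**n so its entries can
    serve as the probability amplitudes of an n-qubit state."""
    n = math.log2(len(data))
    if not n.is_integer():
        raise ValueError("vector length must be a power of two")
    norm = math.sqrt(sum(x * x for x in data))
    if norm == 0:
        raise ValueError("cannot encode the zero vector")
    return [x / norm for x in data]

# A 4-element vector maps to a 2-qubit state.
amps = amplitude_encode([1.0, 2.0, 2.0, 4.0])
print(amps)                      # [0.2, 0.4, 0.4, 0.8]
# Measurement probabilities sum to 1, up to floating-point rounding.
print(sum(a * a for a in amps))
```

The exponential cost discussed in the abstract is not in this normalization but in the quantum circuit that realizes these amplitudes, which in general requires a number of gates exponential in the number of qubits.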
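The alternating-block architecture described for Quantum Generative Adversarial Networks can be sketched abstractly as a gate list. The specific choice of RY rotations and a linear chain of CZ entanglers below is a common convention and an assumption here, since the abstract does not fix the gate set:

```python
def layered_ansatz(n_qubits, n_layers):
    """Abstract gate sequence for an alternating-block ansatz:
    each layer is one parameterized single-qubit rotation per qubit
    followed by a chain of two-qubit entangling controlled gates."""
    ops = []
    for layer in range(n_layers):
        # Block of single-qubit rotations, one trainable angle each.
        for q in range(n_qubits):
            ops.append(("RY", q, f"theta_{layer}_{q}"))
        # Block of two-qubit entangling gates along a linear chain.
        for q in range(n_qubits - 1):
            ops.append(("CZ", q, q + 1))
    return ops

circuit = layered_ansatz(n_qubits=3, n_layers=2)
print(len(circuit))  # 2 layers * (3 rotations + 2 entanglers) = 10 gates
```

Note that the gate count grows only linearly with the number of qubits and layers, which is what makes this ansatz attractive for approximate state loading compared with exact amplitude encoding.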
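The relative entropy used to quantify the degradation is presumably the Kullback-Leibler divergence between the target distribution and the distribution produced by the trained circuit; a small sketch under that assumption (the epsilon clipping is an implementation detail added here, not from the thesis):

```python
import math

def relative_entropy(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between two discrete
    probability distributions over the same basis states."""
    return sum(pi * math.log(pi / max(qi, eps))
               for pi, qi in zip(p, q) if pi > 0)

# A perfect preparation has zero relative entropy to the target;
# a mismatched one is strictly positive.
target = [0.1, 0.2, 0.3, 0.4]
print(relative_entropy(target, target))                     # 0.0
print(relative_entropy(target, [0.4, 0.3, 0.2, 0.1]) > 0)   # True
```

In the setting of the abstract, `p` would be the discretized target (e.g. a log-normal distribution over the computational basis states) and `q` the measurement statistics of the generative circuit.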