Differentially private selection using smooth sensitivity

Bibliographic details
Year of defense: 2024
Main author: Chaves, Iago Castro
Advisor: Not informed by the institution
Defense committee: Not informed by the institution
Document type: Thesis
Access type: Open access
Language: eng
Defending institution: Not informed by the institution
Graduate program: Not informed by the institution
Department: Not informed by the institution
Country: Not informed by the institution
Keywords in Portuguese:
Access link: http://repositorio.ufc.br/handle/riufc/78262
Abstract: Differentially private selection mechanisms offer strong privacy guarantees for queries whose canonical outcome is the top-scoring element r within a finite set R, according to a dataset-dependent utility function. Although selection queries are pervasive throughout data science, few mechanisms exist to ensure their privacy, and the vast majority achieve differential privacy (DP) through global sensitivity, which can corrupt the query result with excessive noise and impair downstream inferences. We propose the Smooth Noisy Max (SNM) algorithm to alleviate this issue. In particular, SNM leverages the notion of smooth sensitivity to provably provide smaller (upper bounds on) expected errors than methods based on global sensitivity, under mild conditions. Empirical results show that our algorithm is more accurate than state-of-the-art differentially private selection methods in three applications: percentile selection, greedy decision trees, and random forests.
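
For context, the sketch below illustrates the kind of global-sensitivity baseline the abstract contrasts SNM with: the exponential mechanism, which samples a candidate with probability proportional to exp(epsilon * u(r) / (2 * Delta)), where Delta is the global sensitivity of the utility u. The function name, the candidate grid, and the toy percentile-style utility are illustrative assumptions for this sketch; it is not the thesis's SNM algorithm, which instead calibrates its randomness to the smooth sensitivity of the utility.

    import numpy as np

    def exponential_mechanism(scores, epsilon, global_sensitivity=1.0, rng=None):
        # Standard exponential mechanism: sample index i with probability
        # proportional to exp(epsilon * scores[i] / (2 * global_sensitivity)).
        # This is the canonical global-sensitivity baseline for private selection.
        rng = np.random.default_rng() if rng is None else rng
        scores = np.asarray(scores, dtype=float)
        logits = epsilon * scores / (2.0 * global_sensitivity)
        logits -= logits.max()            # numerical stability before exponentiating
        probs = np.exp(logits)
        probs /= probs.sum()
        return int(rng.choice(len(scores), p=probs))

    # Toy percentile-style selection (hypothetical example): pick the candidate
    # whose rank in the data is closest to the median. This utility has global
    # sensitivity 1, since adding or removing one record shifts each rank by at most 1.
    data = np.array([3.1, 4.7, 5.2, 6.0, 7.4, 8.1, 9.9])
    candidates = np.linspace(0.0, 12.0, 25)
    ranks = np.searchsorted(np.sort(data), candidates)
    utility = -np.abs(ranks - len(data) / 2.0)
    chosen = exponential_mechanism(utility, epsilon=1.0, global_sensitivity=1.0)
    print("selected candidate:", candidates[chosen])

Roughly speaking, a smooth-sensitivity approach such as SNM replaces the fixed global-sensitivity scale above with a data-dependent smooth upper bound on local sensitivity, which is what allows the smaller expected-error bounds described in the abstract.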