Genomic Relational Privacy in the AI Era: A Scoping Review
Presenter: Ying Huang, MD · Session: Regulatory Science and Policy · Time: April 20, 2026, 9:00 AM to 12:00 PM
Authors
Ying Huang¹, Emily Boja², Jaime Guidry Auvil¹; ¹National Cancer Institute, Bethesda, MD; ²National Cancer Inst. - Bethesda Campus, Rockville, MD
Abstract
Objective: The concept of privacy is ever evolving and is commonly defined as an individual's "right to be let alone." It has been shown that this definition is foundationally insufficient to address the challenges and opportunities presented by the rapid growth of genomic data, where one person's decision to consent to sharing their genomic information can directly affect the privacy of their biological and social relations. This scoping review maps the evolution of privacy theory and systematically examines the legal, ethical, and technological conflicts and governance models at the intersection of genomics and artificial intelligence (AI).

Methods: A scoping review was conducted in 2025 following the Joanna Briggs Institute (JBI) methodology. Systematic searches were performed in three major databases (through August 2025): Westlaw, PubMed, and Web of Science. Sources included foundational privacy scholarship, modern relational and critical data theories, U.S. case law, federal statutes (HIPAA, GINA), international regulations (GDPR), and technical papers on AI and privacy-preserving machine learning (PPML).

Results: Of 1,412 initially retrieved records, 56 studies were included after three phases of screening. The review identified a fundamental and growing conflict between individualistic legal frameworks (based on personal consent) and the interdependent nature of genomic data. Key findings are: (1) The "relational turn" in privacy theory provides a more accurate analytical lens by focusing on context, power imbalances, and interdependence rather than solely on individual control. (2) Escalating real-world conflicts, evident in law enforcement's use of familial DNA searching, direct-to-consumer (DTC) data breaches, and clinical "duty to warn" dilemmas, demonstrate the failure of current legal models. (3) AI acts as a powerful risk amplifier, using its inferential power to deduce familial relationships and perpetuate bias, driven by the economic incentives of surveillance capitalism.
(4) A suite of PPML tools (e.g., federated learning, differential privacy) offers technical mitigation, but these tools are not substitutes for robust ethical governance and risk being co-opted.

Conclusion: A modernized governance framework for genomic data is critically and urgently needed. This framework must expand upon, not replace, individual rights, and must be capable of governing complex, competing interests. We conclude that a durable solution requires a multi-pronged approach based on relational privacy, contextual integrity, and the imposition of legally enforceable fiduciary duties on the powerful entities that steward our collective genomic data.
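To make the PPML point concrete, the sketch below illustrates one of the named tools, differential privacy, via the classic Laplace mechanism applied to a counting query (e.g., how many cohort members carry a given variant). This is a minimal illustration, not a method from any study in the review; the cohort data, function names, and epsilon value are hypothetical.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two independent Exp(1) draws, rescaled,
    # follows a Laplace(0, scale) distribution.
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

def dp_count(values, predicate, epsilon: float) -> float:
    """Epsilon-differentially-private count.

    A counting query changes by at most 1 when one individual's record
    is added or removed (sensitivity 1), so adding Laplace noise with
    scale 1/epsilon satisfies epsilon-differential privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical cohort: 1 = carries a given variant, 0 = does not.
cohort = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
noisy = dp_count(cohort, lambda v: v == 1, epsilon=1.0)
```

Smaller epsilon means more noise and stronger privacy; the released `noisy` value protects any single participant, but, as the review notes, it does not by itself address relational harms, since an aggregate statistic can still leak information about relatives who share variants.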
Disclosure
Y. Huang, None.
Control: 4431 · Presentation Id: 1565 · Meeting 21436