Explainability by Design: Shaping the Future of AI Accountability and Digital Agency | Privacy Engineering & Technology Education Discussion (PETed) Recording
In this PETed, Shoshana Rosenberg advocates for the establishment and recognition of digital agency and understanding as a fundamental human right, serving as a reinforcement and guiding principle for the evolving legal frameworks working to keep pace with technological innovation. This right bridges the gap between mere transparency requirements and the requisite layers of explainability, ensuring AI systems provide meaningful and accessible explanations of their decision-making processes.
Differential Privacy in Practice: Unlocking Insights from Data while Protecting Individual Privacy | Privacy Engineering & Technology Education Discussion (PETed) Recording
In this talk Gerome Miklau will explain how we have successfully used differential privacy to unlock insights from highly sensitive data. This will include examples of practical deployments of the technology at major enterprises: the challenges, solutions, and lessons learned.
Design Process Standard Deep Dive | Privacy Engineering & Technology Education Discussion (PETed) Recording
IOPD President R Jason Cronk welcomes you to ask all your burning questions about the Design Process Standard published last year. The standard is the culmination of several factors; it details the components necessary in a design process to incorporate privacy considerations and reduce privacy risks to individuals.
Data Access and Deletion in Large-Scale Structured and Unstructured Datasets | Privacy Engineering & Technology Education Discussion (PETed) Recording
Privacy technologists struggle to efficiently handle large-scale requests for data access and deletion due to challenges in mapping data across systems, avoiding production disruptions, and redacting unrelated information from unstructured data. Meeting these demands requires scalable solutions that ensure regulatory compliance and protect data rights without compromising system performance.
From Permission Usage to Compliance Analysis | Privacy Engineering & Technology Education Discussion (PETed) Recording
We have been analyzing Android apps for regulatory requirements for eight years. We have analyzed Android apps for COPPA, CCPA, and Health Compliance (HIPAA, HBNR, and FTC Act). In this talk, Primal Wijesekera presents the lessons learned after analyzing thousands of apps, the technical challenges we face while analyzing Android apps, patterns of non-compliance issues we uncovered, and the likely root causes of non-compliance.
Deceptive Design – Dark Patterns Beyond the Interface | Privacy Engineering & Technology Education Discussion (PETed) Recording
This talk is a clarion call for a multi-faceted approach to combating dark patterns, combining legal reform, education, and ethical design principles to safeguard digital citizenship in an increasingly manipulative digital landscape.
Deploying Decentralized Privacy-Preserving Contact Tracing | Privacy Engineering & Technology Education Discussion (PETed) Recording
Digital contact tracing systems promised to help combat the COVID-19 pandemic, but in doing so introduced privacy risks. Privacy-friendly contact tracing systems enable notification of exposed people without these harms. In this webinar Wouter Lueks will talk about his experience designing a large-scale privacy-friendly digital contact tracing system that later led to the system adopted by Google and Apple, and about deploying such a privacy-friendly system in the wild.
Assurance Cases | Privacy Engineering & Technology Education Discussion (PETed) Recording
Assurance cases are gaining traction as a means of certification in aerospace and other safety- and security-critical industries. However, these assurance cases can become overwhelming and complicated even for moderately complex systems. There is therefore a compelling need for new automation that can aid in creating and assessing assurance cases.
Vector Databases: AI Uses, Privacy Risks, and Mitigations | Privacy Engineering & Technology Education Discussion (PETed) Recording
Developers are rushing to adopt new AI tools and techniques, and private data for AI systems is shifting out of models and into vector databases. These new databases are immature from a security and privacy perspective, and the attacks against them are numerous and growing by the day. Understanding, controlling, monitoring, and protecting the data in these databases should be a top priority of security and privacy teams.
What Can Go Wrong With Your AI? | Privacy Engineering & Technology Education Discussion (PETed) Recording
Do AI systems require a holistic risk-based approach? How can you identify and assess all the different risks? During this session we discuss the problem of risk identification and assessment in AI systems. We will look into the ideas in Europe of introducing a Fundamental Rights Impact Assessment and we will explore some of the available tools with Isabel Barberá.