BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Institute of Operational Privacy Design - ECPv6.15.18//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://instituteofprivacydesign.org
X-WR-CALDESC:Events for Institute of Operational Privacy Design
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20230312T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20231105T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20240310T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20241103T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20250309T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20251102T060000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
TZNAME:EDT
DTSTART:20260308T070000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
TZNAME:EST
DTSTART:20261101T060000
END:STANDARD
END:VTIMEZONE
BEGIN:VTIMEZONE
TZID:America/Halifax
BEGIN:DAYLIGHT
TZOFFSETFROM:-0400
TZOFFSETTO:-0300
TZNAME:ADT
DTSTART:20220313T060000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0300
TZOFFSETTO:-0400
TZNAME:AST
DTSTART:20221106T050000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0400
TZOFFSETTO:-0300
TZNAME:ADT
DTSTART:20230312T060000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0300
TZOFFSETTO:-0400
TZNAME:AST
DTSTART:20231105T050000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0400
TZOFFSETTO:-0300
TZNAME:ADT
DTSTART:20240310T060000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0300
TZOFFSETTO:-0400
TZNAME:AST
DTSTART:20241103T050000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:-0400
TZOFFSETTO:-0300
TZNAME:ADT
DTSTART:20250309T060000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:-0300
TZOFFSETTO:-0400
TZNAME:AST
DTSTART:20251102T050000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20250319T130000
DTEND;TZID=America/New_York:20250319T140000
DTSTAMP:20260405T152756Z
CREATED:20250205T231218Z
LAST-MODIFIED:20250303T184649Z
UID:8611-1742389200-1742392800@instituteofprivacydesign.org
SUMMARY:Design Assurance Standard v1.0 Launch Webinar
DESCRIPTION:The Design Process Standard (Process Standard) was adopted in January 2023 with this Design Assurance Standard (Assurance Standard) following two years later. While the Process Standard details the components necessary in a design process to incorporate privacy considerations and reduce privacy risks\, this Assurance Standard uses an assurance case to confirm an organization’s claim that a specific product\, service\, or business process has been designed\, developed\, or deployed with privacy aforethought. \n  \nDate & Time:\nMarch 19\, 2025 @ 1:00 PM – 2:00 PM EDT (UTC−04:00) \n  \nWebinar Resources:\n\n\nIntroducing the Design Assurance Standard v1.0 \nAssurance Cases | Privacy Engineering & Technology Education Discussion (PETed) Recording\nIntroducing the Design Process Standard (v 1.0)\nDesign Process Standard Launch Webinar Recording\n\n\n  \nRegister Here \nYour email address will be used to send you a calendar invitation to the event and subsequent recording before being deleted.
URL:https://instituteofprivacydesign.org/event/design-assurance-standard-v-1-0-launch-webinar/
LOCATION:https://iopd.whereby.com/webinar
CATEGORIES:IOPD Events,Webinar
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2025/02/DASlaunch-mar19.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20250227T130000
DTEND;TZID=America/New_York:20250227T140000
DTSTAMP:20260405T152756Z
CREATED:20241101T182054Z
LAST-MODIFIED:20250124T202427Z
UID:7628-1740661200-1740664800@instituteofprivacydesign.org
SUMMARY:The Case for Researching Applied Privacy Enhancing Technologies | Q1 Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a recorded 10-minute introduction followed by a 40-minute informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy-related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. \nThe participants will be asked to bring questions related to the topic. Come back every quarter for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nFebruary 27th\, 2025 @ 1:00 PM – 2:00 PM ET \n  \nTopic:\nThe Case for Researching Applied Privacy Enhancing Technologies \n  \nProblem Statement:\nHow to safely expand access to administrative taxpayer data? \n  \nSynopsis:\nResearch on privacy enhancing approaches for sharing data has grown significantly over the past two decades. This increased interest has led to extensive theoretical and methodological research\, but the number of practical applications of privacy enhancing technologies has lagged far behind. This talk will focus on the Safe Data Technologies Project\, providing an overview of the project and the approach we have taken to conducting privacy research with the specific aim of putting theory into practice and incorporating user input. \nNational Bureau of Economic Research Working Paper that gives this overview: https://www.nber.org/papers/w32909 \nProject landing webpage: https://www.urban.org/projects/safe-data-technologies  \n  \nPre-Discussion Resources:\n\n\nGroup webpage: https://www.urban.org/research-methods/data-governance-and-privacy\nEducational materials on personal website: https://clairemckaybowen.com/education/\n\n\n  \n\nSpeaker:\n\nClaire Bowen \nClaire McKay Bowen (she/her) is a senior fellow and leads the Data Governance and Privacy practice area at the Urban Institute. Her research primarily focuses on developing technical and policy solutions to safely expand access to confidential data that advances evidence-based policy-making. She also has an interest in improving science communication and integrating data equity into the data privacy process. In 2024\, she became an American Statistical Association Fellow “for her significant contributions in the field of statistical data privacy\, leadership activities in support of the profession\, and commitment to mentoring the next generation of statisticians and data scientists.” Further\, she is a member of the Census Scientific Advisory Committee and several other data governance and data privacy committees as well as an adjunct professor at Stonehill College. \n  \nModerator:\nKimberly Lancaster \nTrusted Privacy Advisor who Guides Data Protection\, Drives Operational Excellence\, and Leads with Integrity by aligning with InfoSec\, Security\, GRC\, Compliance\, and Data Governance. Board Member\, Speaker\, and Author. \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only event. Join as an Ambassador before the 1st of each month to get invited to this month’s event! Please reach out to a current member to be invited as a guest. If you are already a member\, subscribe to our PETed Mailing List for announcements and monthly invitations!
URL:https://instituteofprivacydesign.org/event/q1-2025-privacy-engineering-technology-education-discussion-peted/
LOCATION:https://iopd.whereby.com/peted-discussion
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/11/1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20250221T110000
DTEND;TZID=America/New_York:20250221T120000
DTSTAMP:20260405T152756Z
CREATED:20250115T214440Z
LAST-MODIFIED:20250512T180032Z
UID:8555-1740135600-1740139200@instituteofprivacydesign.org
SUMMARY:Artificial Intelligence (AI) Governance | PETed Sip & Chat
DESCRIPTION:Join our monthly PETed Sip & Chat hosted by Janelle Hsia! 3rd Friday of every month at 11:00 AM – 12:00 PM ET throughout 2025.\nDespite all the advantages and benefits of AI\, most of us are at the beginning of our AI governance journey. We need to learn how to handle and protect data during its use. Let’s talk about some of the key aspects of AI governance: \n\nEthical principles: Defining and adhering to principles like fairness\, transparency\, and responsibility.\nRegulations and laws: Creating and enforcing laws that govern AI development and deployment.\nAccountability and oversight: Ensuring there’s a system in place for overseeing AI actions and holding parties accountable for outcomes.\nData privacy and security: Protecting sensitive data and ensuring privacy through robust security measures.\nBias and fairness: Identifying and mitigating biases in AI systems to promote fairness and equality.\nTransparency: Making AI systems’ operations understandable and accessible to users and stakeholders.
URL:https://instituteofprivacydesign.org/event/february-peted-sip-chat/
LOCATION:https://us02web.zoom.us/j/87976844288?pwd=WMjmQhAbTS5W3Hbnfp50blJ5rS0vxt.1&from=addon
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2025/01/FebpetedSipnChat.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20250212T140000
DTEND;TZID=America/New_York:20250212T150000
DTSTAMP:20260405T152756Z
CREATED:20250127T210915Z
LAST-MODIFIED:20250127T210915Z
UID:8568-1739368800-1739372400@instituteofprivacydesign.org
SUMMARY:Q1 2025 All Hands Meeting
DESCRIPTION:ALL HANDS MEETING 📢 Ambassadors\, Advisors\, Volunteers\, and Committee Members are all invited to join us Wednesday\, February 12th from 2:00 – 3:00 PM EST (11 AM PST / 8 PM CET). We will welcome our new members\, give updates on current projects\, and report on the Design Assurance Standard V. 1.0 slated for publication in March! \nPlease reach out to admin22@instituteofprivacydesign.org if you have yet to receive an invitation. Guest invites are first come\, first served. If you have a friend or colleague interested\, send us their email and we’ll add them to the guest list!
URL:https://instituteofprivacydesign.org/event/q1-2025-all-hands-meeting/
LOCATION:https://iopd.whereby.com/all-hands
CATEGORIES:All Hands Meeting
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2025/01/Feb12allhands.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20250117T110000
DTEND;TZID=America/New_York:20250117T120000
DTSTAMP:20260405T152756Z
CREATED:20241122T193351Z
LAST-MODIFIED:20250512T180200Z
UID:8031-1737111600-1737115200@instituteofprivacydesign.org
SUMMARY:Data Subject Requests (DSRs) | PETed Sip & Chat
DESCRIPTION:Our first monthly PETed Sip & Chat hosted by Janelle Hsia! Join us the 3rd Friday of every month at 11:00 AM ET throughout 2025. \n  \nDo you have challenging Data Subject Requests (DSRs) such as:\n\nScalability: Managing a high volume of DSRs can be overwhelming\, especially for smaller organizations.\nData Fragmentation: Dispersed data across multiple systems and repositories makes it difficult to locate and retrieve relevant information.\nAutomation: Manual processing of DSRs is time-consuming and prone to errors\, while automation solutions can be costly and require significant integration efforts.\nThird-Party Data: Handling requests involving third-party data requires additional coordination with external parties.\n\nIs your DSR process complex\, consuming a lot of internal resources? What are some best practices?\n\nEstablish Clear Processes: Define and document DSR handling procedures to ensure consistency and compliance.\nPrioritize Transparency: In your privacy notice\, provide clear information to data subjects about their rights and the processing of their personal data.\nContinuously Improve: Refine DSR management processes based on feedback\, lessons learned\, and changing regulatory requirements.\n\nBring some of your DSR scenarios for us to discuss!\n\nAn individual requests access to all personal data processed by an organization from multiple systems and different formats.\nReceiving requests from third parties like Privacy Hawk or Privacy Bee.\nA request involves sensitive information requiring specialized handling and redaction.\nDeletion requests in systems that do not have a technical solution for physical deletion.\nA data subject request that requires clarification on the scope of the request.\nIn the middle of a request\, the data subject changes their email address.
URL:https://instituteofprivacydesign.org/event/january-sip-chat/
LOCATION:https://us02web.zoom.us/j/87976844288?pwd=WMjmQhAbTS5W3Hbnfp50blJ5rS0vxt.1&from=addon
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/11/Jansipnchat.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20241106T150000
DTEND;TZID=America/New_York:20241106T160000
DTSTAMP:20260405T152756Z
CREATED:20241017T194124Z
LAST-MODIFIED:20241017T194124Z
UID:7548-1730905200-1730908800@instituteofprivacydesign.org
SUMMARY:Q4 All Hands Meeting 2024
DESCRIPTION:ALL HANDS MEETING 📢 Ambassadors\, Advisors\, Volunteers\, and Committee Members are all invited to join us Wednesday\, November 6th from 3:00 – 4:00 PM ET. We will meet to welcome our new members\, give updates on current projects\, and report on the Design Assurance Standard V. 1.0 slated for publication in 2025! \nPlease reach out to admin22@instituteofprivacydesign.org if you have yet to receive an invitation. Guest invites are first come\, first served. If you have a friend or colleague interested\, send us their email and we’ll add them to the guest list!
URL:https://instituteofprivacydesign.org/event/q4-all-hands-meeting-2024/
LOCATION:https://iopd.whereby.com/all-hands
CATEGORIES:All Hands Meeting,IOPD Events
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/10/allhands-nov624.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20241024T120000
DTEND;TZID=America/New_York:20241024T130000
DTSTAMP:20260405T152756Z
CREATED:20240924T222553Z
LAST-MODIFIED:20241002T205225Z
UID:7368-1729771200-1729774800@instituteofprivacydesign.org
SUMMARY:Integrating Privacy Early | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a recorded 10-minute introduction followed by a 40-minute informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy-related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. \nThe participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nOctober 24th\, 2024 @ 12:00 PM EDT / 6:00 PM CEST \n  \nTopic:\nIntegrating Privacy Early \n  \nSynopsis:\nEnsuring that privacy is incorporated from the design stage of products (Privacy by Design) is a challenge\, especially when teams are pressured to deliver products quickly. This requires privacy engineers to work closely with developers and product teams from the outset. \n  \nLessons Learned:\n\nPrivacy is more than just the requirements posed by regulation. The very fabric of the internet is changing due to browsers promoting privacy. This affects aspects of engineering that need to be accounted for when designing web applications.\nThese changes have secondary effects on existing business processes\, such as Marketing\, Personalization and Measurement. Thought needs to be given to how these tools work in the context of browser privacy controls\, in order to determine if they’ll meet business objectives.\n\n  \nPre-Discussion Resources:\n\nhttps://webkit.org/tracking-prevention/\nhttps://learn.microsoft.com/en-us/microsoft-edge/web-platform/tracking-prevention \nhttps://cunderwood.dev/2022/11/11/enter-third-party-cloaking-mitigation/ \nhttps://cunderwood.dev/2023/06/08/link-tracking-protection-in-ios17/\nhttps://cunderwood.dev/2023/06/12/vendors-impacted-by-link-tracking-protection/\n\n  \nSpeaker:\nCory Underwood \nI’m a Senior Privacy Engineer for a consultancy known as Further. In my day-to-day work\, I assist clients with compliance audits and explain how the various regulatory and tech changes may impact existing business operations (particularly with marketing activities). I have a deep background in Security\, Privacy and Web Development and maintain a blog\, https://cunderwood.dev\, where I discuss privacy and security changes and how businesses may need to react. Cory helps businesses avoid the scares and frights of not being in alignment with Privacy by Design. \n  \nModerator:\nKimberly Lancaster \nTrusted Privacy Advisor who Guides Data Protection\, Drives Operational Excellence\, and Leads with Integrity by aligning with InfoSec\, Security\, GRC\, Compliance\, and Data Governance. Board Member\, Speaker\, and Author. \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only event. Join as an Ambassador before the 1st of each month to get invited to this month’s event! Please reach out to a current member to be invited as a guest. If you are already a member\, subscribe to our PETed Mailing List for announcements and monthly invitations!
URL:https://instituteofprivacydesign.org/event/integrating-privacy-early-privacy-engineering-technology-education-discussion-peted/
LOCATION:https://iopd.whereby.com/peted-discussion
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/09/1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20241003T130000
DTEND;TZID=America/New_York:20241003T140000
DTSTAMP:20260405T152756Z
CREATED:20240923T185519Z
LAST-MODIFIED:20240923T185551Z
UID:7361-1727960400-1727964000@instituteofprivacydesign.org
SUMMARY:Semi-Annual Privacy Wiki Review
DESCRIPTION:UPCOMING: New Semi-Annual Privacy Wiki Review October 3rd\, 2024 @ 1 PM ET at https://iopd.whereby.com/privacy-wiki! Our Chief Editors will introduce themselves\, welcome new volunteers\, announce changes\, and answer questions. \nCalendar invites will go out over the privacywiki@lists.privacy.wiki discussion list. Reach out to admin22@instituteofprivacydesign.org if you did not receive one and need an invitation. You can manage your subscription to the privacywiki@lists.privacy.wiki list anytime at http://lists.privacy.wiki/listinfo.cgi/privacywiki-privacy.wiki
URL:https://instituteofprivacydesign.org/event/semi-annual-privacy-wiki-review/
LOCATION:https://iopd.whereby.com/privacy-wiki
CATEGORIES:IOPD Events,PrivacyWiki
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2023/07/Privacy-Wiki-Blue-3.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20241001T100000
DTEND;TZID=America/New_York:20241001T103000
DTSTAMP:20260405T152756Z
CREATED:20240910T202432Z
LAST-MODIFIED:20240910T214224Z
UID:7232-1727776800-1727778600@instituteofprivacydesign.org
SUMMARY:Design Assurance Standard Public Launch Webinar
DESCRIPTION:The Design Process Standard (Process Standard) was adopted in January 2023 with this Design Assurance Standard (Assurance Standard) following two years later. While the Process Standard details the components necessary in a design process to incorporate privacy considerations and reduce privacy risks\, this Assurance Standard uses an assurance case to confirm an organization’s claim that a specific product\, service\, or business process has been designed\, developed\, or deployed with privacy aforethought. \nIn other words\, the Assurance Standard doesn’t apply to an organization but to a specific object of evaluation. The intent of this certifiable standard is for organizations to demonstrate that they have achieved reasonable assurance around “privacy by design and default” claims. \n  \nRegistration:\nhttps://instituteofprivacydesign.org/design-assurance-standard-public-launch-webinar-registration/   \n  \nDate & Time:\nOctober 1st\, 2024 @ 10:00 – 10:30 AM EDT \n  \nSpeaker(s):\nR. Jason Cronk \nWith over two decades of experience in privacy and trust consultancy\, Jason Cronk is a seasoned privacy engineer\, developer\, author of the IAPP textbook “Strategic Privacy by Design\,” Privacy Engineering Section Leader at the IAPP\, and founder and president of the Institute of Operational Privacy Design. His knowledge and involvement reach across the spectrum as an active member of the academic\, engineering\, legal and professional privacy communities and a pioneering voice in the development of privacy by design.
URL:https://instituteofprivacydesign.org/event/design-assurance-standard-public-launch-webinar/
LOCATION:https://iopd.whereby.com/webinar
CATEGORIES:IOPD Events
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/09/2024-10-01-Design-Assurance-Standard-Public-Release.pptx.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20240926T120000
DTEND;TZID=America/New_York:20240926T130000
DTSTAMP:20260405T152756Z
CREATED:20240823T184444Z
LAST-MODIFIED:20240823T205629Z
UID:7103-1727352000-1727355600@instituteofprivacydesign.org
SUMMARY:Protecting Children in the Digital Age – Privacy Risks & Rewards | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a recorded 10-minute introduction followed by a 40-minute informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy-related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. \nThe participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nSeptember 26th\, 2024 @ 12:00 PM EDT / 6:00 PM CEST \n  \nTopic:\nProtecting Children in the Digital Age – Privacy Risks & Rewards \n  \nSynopsis:\nOver the past few years there has been a tsunami of new privacy legislation driven by the EU\, UK and US to address user data privacy and online harms and risks to users\, including children. Industry faces the challenge of aligning numerous regulations at federal and state level in the US and with the EU GDPR\, the Digital Services Act and the UK’s Online Safety Act\, to name just a few. There is a spotlight on age assurance and age verification and increasing requirements for parental consent and controls. Privacy is no longer seen just as a cost centre but as an investment in brand trust and integrity. \n  \nPre-Discussion Resources:\n\n\nComplying with COPPA: Frequently Asked Questions | Federal Trade Commission\nChildren and the UK GDPR | ICO\nThe Children’s code: what is it? | ICO\nThe Children’s Online Privacy Protection Act (COPPA)\nGDPRkids™ \nUK’s Children’s Code \nHistory of COPPA & GDPR Violations\n\n\n  \n\nSpeaker:\n\nClaire Quinn\, CIPP/E – Chief Privacy Officer \nClaire Quinn is a subject matter expert (SME) specializing in COPPA\, GDPR\, the Children’s Code\, and child privacy and safety in the digital world. She has worked closely with major well-known child-directed entertainment brands\, third-party service providers\, moderation companies\, CEOP\, and US law enforcement. Claire also works directly with regulators\, including the FTC and EU data protection authorities. She is an accredited member of the International Association of Privacy Professionals (IAPP) and has more than 25 years’ experience in media and online services. \nHer expertise is often called upon by industry and she regularly participates in webinars\, panels and events. Claire was on the FTC COPPA Rule review workshop panel in DC and has participated at the IAPP Europe Data Protection Congress in Brussels. Claire contributes to numerous articles and white papers and has written expert witness reports. \nHer previous experience includes launching B2C websites for United News & Media and serving as Head of Lycos UK\, where she helped to launch Europe’s biggest chatroom. In addition\, Claire was Chief of Safety at MMO WeeWorld.com\, a US virtual world with mobile apps for tweens and teens. \n  \nModerator:\nAndrei Dumitru \nAfter working more than ten years in the IT department of an international organization\, I have focused in recent years on privacy and data protection. I think privacy is a fundamental right that cannot exist today without computer security\, so I am always searching for a balance between the two and for ways privacy can be embedded into technology. I am particularly interested in privacy by design and dark patterns\, and always curious to see how privacy is integrated (or not) in current products and services. \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only event. Join as an Ambassador before the 1st of each month to get invited to this month’s event! Please reach out to a current member to be invited as a guest. If you are already a member\, subscribe to our PETed Mailing List for announcements and monthly invitations!
URL:https://instituteofprivacydesign.org/event/protecting-children-in-the-digital-age-privacy-risks-rewards-privacy-engineering-technology-education-discussion-peted/
LOCATION:https://iopd.whereby.com/peted-discussion
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/08/1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20240822T120000
DTEND;TZID=America/New_York:20240822T130000
DTSTAMP:20260405T152756Z
CREATED:20240605T205101Z
LAST-MODIFIED:20240729T181638Z
UID:6748-1724328000-1724331600@instituteofprivacydesign.org
SUMMARY:Explainability by Design: Shaping the Future of AI Accountability and Digital Agency | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a recorded 10-minute introduction followed by a 40-minute informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. \nThe participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nAugust 22\, 2024 @ 12:00 PM EDT / 6:00 PM CEST \n  \nTopic:\nExplainability by Design: Shaping the Future of AI Accountability and Digital Agency \n  \nSynopsis:\nShoshana advocates for the establishment and recognition of digital agency and understanding as a fundamental human right\, serving as a reinforcement and guiding principle for the evolving legal frameworks working to keep pace with technological innovation. This right bridges the gaps between requirements around transparency and around mere transparency and the requisite layers of explainability\, ensuring AI systems provide meaningful and accessible explanations of their decision-making processes. \nShe also advocates for the criticality of a time bound global mandate for new technology to be explainable by design by a specific date\, and to have a lessened impact on the environment. This mandate would create an incentive for investment in furthering AI technology development to allow the systems to be mightier (to support explainability) and lighter (to facilitate the integration of standards to reduce environmental impact.) 
\nJust as the integration of privacy notices became essential for compliance and trust\, and accountability in business transactions\, the inevitability of explainability notices and addenda in AI is undeniable. Both businesses and individuals require these notices ( and explainability attendant same) to operate effectively within an AI technology ecosystem. The evolution of privacy laws demonstrated that transparency and accountability are fundamental to building trust and meeting regulatory standards\, and analogous mechanisms are needed to facilitate explainability and accountability in business and consumer interactions in the age of AI. \n  \nProblem Statement:\nDespite current and evolving regulatory frameworks aiming to address transparency and accountability in AI\, the rapid advancement of technology continues to expand the gaps between regulatory requirements\, the needs of individuals and businesses\, and the ability of AI technologies to provide the necessary levels of explainability. These gaps hinder effective oversight\, erode the trust of users and impacted individuals\, and compromise environmental sustainability. Mandating explainability by design is essential to bridge these gaps\, ensuring that AI systems can be transparent\, explained\, and meet regulatory standards and societal and environmental expectations. 
\n  \nPre-Discussion Resources:\n\n\nHOME | I Statements (aideiprivacystatements.com) (Explainability by Design Resources are at the bottom of the page.)\nExplainability Case Studies\nExplainability Rubric\nAdvancing Explainability through AI Literacy and Design Resources\nA Taxonomy of Explanations to Support Explainability by Design\nOne proposed approach: Explainability by Design: A Methodology to Support Explanations in Decision-Making Systems (arxiv.org)\nExplainable Convolutional Neural Networks\n\n  \nSpeaker:\n\nShoshana Rosenberg \nShoshana Rosenberg is a thought leader at the intersection of privacy law and DEI\, a seasoned corporate attorney and business executive with a broad global purview and over 16 years in international data protection law. She is also the co-founder of Women in AI Governance and the founder of SafePorter\, a PICCASO EU award-winning B Corp enabling privacy-by-design DEI and engagement programs and true data minimization that protects individuals and eliminates organizational data risk around DEI. Shoshana is a founding member and strategic programs advisor at Logical AI Governance\, where she developed the PRISM Framework and the AI Governance training programs to enable privacy and compliance professionals to build\, operationalize\, and refine AI Governance programs. A U.S. Navy veteran\, data ethics and emerging technology fanatic\, and a member of the IAPP Diversity in Privacy Advisory Board\, Shoshana is a passionate advocate for social entrepreneurship and inclusion by design. \n  \nModerator:\nNicole Nguyen \nNicole Nguyen is an enthusiastic privacy professional\, deftly bridging technology\, business\, and law. Beginning in intellectual property and patent prosecution\, she gained practical legal insights and technical acumen\, which she later applied to data privacy across various regulated industries. 
She honed her skills across operations\, regulatory analysis\, and engineering to streamline data handling\, shape privacy policies\, and drive privacy-by-design principles. Recognized for her leadership and technical prowess\, Nicole guides strategic decisions\, resolves complex issues\, and bolsters business resilience. Her passion lies in crafting comprehensive data governance programs\, weaving security\, privacy\, and risk disciplines into robust frameworks that safeguard business value and promote a future-ready\, privacy-conscious culture. She holds a B.S. in Chemical and Electrical Engineering and an M.S. in Information Systems\, Cybersecurity and Business Intelligence. \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only event. Join as an Ambassador before the 1st of each month to get invited to this month’s event! Please reach out to a current member to be invited as a guest. If you are already a member\, subscribe to our PETed Mailing List for announcements and monthly invitations!
URL:https://instituteofprivacydesign.org/event/august-2024-privacy-engineering-technology-education-discussion-peted/
LOCATION:https://iopd.whereby.com/peted-discussion
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/06/1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20240731T150000
DTEND;TZID=America/New_York:20240731T160000
DTSTAMP:20260405T152756
CREATED:20240605T221416Z
LAST-MODIFIED:20240605T221534Z
UID:6769-1722438000-1722441600@instituteofprivacydesign.org
SUMMARY:Q3 All Hands Meeting 2024
DESCRIPTION:ALL HANDS MEETING 📢 Ambassadors\, Advisors\, Volunteers\, and Committee Members are all invited to join us Wednesday\, July 31st from 3:00 – 4:00 PM. We will meet to welcome our new members\, give updates on current projects\, and report on the new standard currently in development! \nPlease reach out to admin22@instituteofprivacydesign.org if you have yet to receive an invitation. Guest invites are first come\, first served. If you have a friend or colleague interested\, send us their email and we’ll add them to the guest list!
URL:https://instituteofprivacydesign.org/event/q3-all-hands-meeting-2024/
LOCATION:https://iopd.whereby.com/all-hands
CATEGORIES:All Hands Meeting,IOPD Events
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/06/Q3allhands.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Halifax:20240725T120000
DTEND;TZID=America/Halifax:20240725T130000
DTSTAMP:20260405T152756
CREATED:20240507T182637Z
LAST-MODIFIED:20240514T201320Z
UID:6579-1721908800-1721912400@instituteofprivacydesign.org
SUMMARY:Differential Privacy in Practice: Unlocking Insights from Data while Protecting Individual Privacy | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a recorded 10-minute introduction followed by a 40-minute informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy-related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. \nThe participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nJuly 25\, 2024 @ 12:00 PM EDT / 6:00 PM CEST \n  \nTopic:\nHow to deploy differential privacy in practice to unlock insights from sensitive\, regulated\, or proprietary data.  \n  \nSynopsis:\nIn this talk\, Gerome will explain how his team has successfully used differential privacy to unlock insights from highly sensitive data. This will include examples of practical deployments of the technology at major enterprises: the challenges\, solutions\, and lessons learned. Gerome will describe the benefits differential privacy can offer to a data custodian who is responsible for the management of sensitive data\, including clarity about the privacy risk involved in a data release and the ability to share higher-quality data. \n  \nProblem Statement:\nDifferential privacy allows sensitive data to be shared in a manner that preserves insights in the data while offering a rigorous guarantee of protection for contributing individuals. 
\n  \nPre-Discussion Resources:\n\nBlog post: A friendly introduction to differential privacy\nBlog post: An overview of various privacy-enhancing technologies\, showing where differential privacy fits\nBlog post: A look at some failures of legacy anonymization technologies\nA reference textbook on differential privacy\, focused on applications.\nA reference textbook on differential privacy\, focused on theory. \nNIST Differential Privacy Blog series \nNIST Guidelines for Evaluating Differential Privacy Guarantees\nTumult Labs Case Study: Publishing Wikipedia Usage Data with strong privacy\nA video tour of Tumult Analytics\, easy-to-use software for differentially private computation \n\n  \nSpeaker:\nGerome Miklau \nGerome Miklau is co-founder and CEO of Tumult Labs\, a start-up focused on commercializing privacy technology. He is on leave from his position as a Professor of Computer Science at the University of Massachusetts\, Amherst. Prior to founding Tumult Labs\, he consulted for the U.S. Census Bureau on disclosure avoidance algorithms used for the 2020 Decennial Census. \nHis academic research focuses on private\, secure\, and equitable data management. He designs algorithms to accurately learn from data without disclosing sensitive facts about individuals\, primarily in the model of differential privacy. He studies fair and responsible data management. He has also designed novel techniques for controlling access to data\, limiting retention of data\, and resisting forensic analysis. \nHe received his Ph.D. in Computer Science from the University of Washington in 2005. He earned Bachelor’s degrees in Mathematics and in Rhetoric from the University of California\, Berkeley\, in 1995. \n  \nModerator:\nMary Yip \nMary is a privacy officer who is passionate about privacy and data protection. She is responsible for the oversight of privacy compliance and privacy risks management for several entities within her organization. 
Mary’s diverse experience in privacy\, risk management\, auditing\, business analysis\, and project management contributes to her effectiveness in leading and supporting privacy programs across entities. \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only event. Join as an Ambassador before the 1st of each month to get invited to this month’s event! Please reach out to a current member to be invited as a guest. If you are already a member\, subscribe to our PETed Mailing List for announcements and monthly invitations!
URL:https://instituteofprivacydesign.org/event/differential-privacy-in-practice-unlocking-insights-from-data-while-protecting-individual-privacy-privacy-engineering-technology-education-discussion-peted/
LOCATION:https://iopd.whereby.com/peted-discussion
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/03/2-1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20240627T120000
DTEND;TZID=America/New_York:20240627T130000
DTSTAMP:20260405T152756
CREATED:20240605T213332Z
LAST-MODIFIED:20240605T213637Z
UID:6760-1719489600-1719493200@instituteofprivacydesign.org
SUMMARY:Design Process Standard Deep Dive | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a recorded 10-minute introduction followed by a 40-minute informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy-related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. \nThe participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nJune 27\, 2024 @ 12:00 PM EDT / 6:00 PM CEST \n  \nTopic:\nDesign Process Standard Deep Dive \n  \nSynopsis:\nIOPD President R Jason Cronk welcomes you to ask all your burning questions about the Design Process Standard published last year. This standard is the culmination of several factors; it details the components necessary in a design process to incorporate privacy considerations and reduce privacy risks to individuals. The process could be the design of products\, services\, or business processes and spans the lifecycle from ideation to deployment. This standard covers privacy and is not limited to “data protection” or any specific jurisdictional approach. Privacy is a broader concept than data protection and covers all interactions between individuals and others in society and the social norms governing those interactions. This standard is purposefully ambiguous in that regard. \n  \nProblem Statement:\nPrivacy by design is an international concept that has been promoted by regulators worldwide and has been adopted into laws and regulations. Until now\, it has been ‘squishy’\, hard to define\, and difficult to implement. \n  \nPre-Discussion Resources:\n\nDesign Process Standard V. 
1.0\nDesign Process Standard Launch Webinar Recording\n\n  \nSpeaker:\nR Jason Cronk \nWith over two decades of experience in privacy and trust consultancy\, Jason Cronk is a seasoned privacy engineer\, developer\, author of the IAPP textbook “Strategic Privacy by Design\,” Privacy Engineering Section Leader at the IAPP\, and founder and president of the Institute of Operational Privacy Design. His knowledge and involvement reach across the spectrum as an active member of the academic\, engineering\, legal\, and professional privacy communities and a pioneering voice in the development of privacy by design. \n  \nModerator:\nJanelle Hsia \nJanelle Hsia is the President and Founder of Privacy SWAN Consulting\, working as a trainer\, consultant\, and trusted advisor for strategic and tactical decision-making. While she is focused on the field of privacy and data protection\, Janelle Hsia is not a lawyer and brings a diverse background with strong leadership\, technical\, and business skills spanning 20 years in the areas of project management\, IT\, privacy\, security\, data governance\, and process improvement. Janelle Hsia is also Co-Founder and Vice-President of the Institute of Operational Privacy Design. \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only event. Join as an Ambassador before the 1st of each month to get invited to this month’s event! Please reach out to a current member to be invited as a guest. If you are already a member\, subscribe to our PETed Mailing List for announcements and monthly invitations!
URL:https://instituteofprivacydesign.org/event/design-process-standard-deep-dive-privacy-engineering-technology-education-discussion-peted/
LOCATION:https://iopd.whereby.com/peted-discussion
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/05/1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20240620T160000
DTEND;TZID=America/New_York:20240620T170000
DTSTAMP:20260405T152757
CREATED:20240606T201206Z
LAST-MODIFIED:20240620T195953Z
UID:6765-1718899200-1718902800@instituteofprivacydesign.org
SUMMARY:Let's Talk Privacy Certifications | Pop-Up Webinar
DESCRIPTION:Date & Time:\nJune 20\, 2024 @ 4:00 PM EDT \n  \nSynopsis:\nCertification can be a wonderful validation that a company is doing something right. But the certification market can be confusing and hard to interpret at times. Different certifications apply to different aspects of a company’s operations. The menagerie of players (certification bodies\, assurance assessors\, schema owners) can be mind-numbing. In our journey to create standards and certifications\, the IOPD has had to untangle this mess to find its own role in this vast and complex ecosystem. While we’ve blogged about this previously\, this webinar will go further\, looking at many of the privacy certifications currently on the market\, including: \n\n\n\n\nEuropean Data Protection Seal\nMSECB ISO 31700-1:2023 – Privacy by Design Framework\nLOCS:23 Standard\nCARU Safe Harbor Program\nTRUSTe Enterprise Privacy Certification\nISO 27701\nas well as our own Design Process Standard and forthcoming Privacy by Design and Default Trustmark\n\n\n  \nSpeaker:\nR Jason Cronk \nWith over two decades of experience in privacy and trust consultancy\, Jason Cronk is a seasoned privacy engineer\, developer\, author of the IAPP textbook “Strategic Privacy by Design\,” Privacy Engineering Section Leader at the IAPP\, and founder and president of the Institute of Operational Privacy Design. His knowledge and involvement reach across the spectrum as an active member of the academic\, engineering\, legal\, and professional privacy communities and a pioneering voice in the development of privacy by design.
URL:https://instituteofprivacydesign.org/event/lets-talk-privacy-certifications/
LOCATION:https://iopd.whereby.com/institute-of-operational-privacy-design
CATEGORIES:IOPD Events
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/06/grfwecedc.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Halifax:20240523T120000
DTEND;TZID=America/Halifax:20240523T130000
DTSTAMP:20260405T152757
CREATED:20240314T182713Z
LAST-MODIFIED:20240514T201232Z
UID:6408-1716465600-1716469200@instituteofprivacydesign.org
SUMMARY:Data Access and Deletion in the Large Scale Structured and Unstructured Datasets | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a recorded 10-minute introduction followed by a 40-minute informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy-related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. \nThe participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nMay 23\, 2024 @ 12:00 PM EDT / 6:00 PM CEST \n  \nTopic:\nData Access and Deletion in the Large Scale Structured and Unstructured Datasets \n  \nSynopsis:\nMany jurisdictions grant the public the rights to access and request deletion of their data. As privacy technologists\, it often falls to us to create the systems for responding to these requests. While a response may seem simple on its face (grab all of the data related to this person)\, there are a host of both technical and organizational challenges when responding to these requests at scale\, including: keeping an up-to-date map between people and the systems that store their data; issuing access and deletion requests without affecting production system performance; and redacting information unrelated to the subject from unstructured data. We’ll talk through these and other challenges and how privacy teams can tackle them. \n  \nProblem Statement:\nPrivacy technologists struggle to efficiently handle large-scale requests for data access and deletion due to challenges in mapping data across systems\, avoiding production disruptions\, and redacting unrelated information from unstructured data. 
This requires scalable solutions to ensure regulatory compliance and protect data rights without compromising system performance. \n  \nRelated PETs (Privacy-Enhancing Technologies):\nThere are a variety of PETs potentially involved\, most notably tools for anonymization/pseudonymization. There are also a large number of data and ML-related engineering technical topics to be discussed. \n  \nPre-Discussion Resources:\n\nLea Kissner writing for the IAPP about considerations for data retention/deletion: https://iapp.org/news/a/data-retention-in-a-distributed-system/\nKatharina Koerner speaking at PEPR 2023 about legal standards for anonymization/deidentification\, which are critical to understand in the context of determining what needs to be deleted: https://www.youtube.com/watch?v=m5u3AM5PaD4\nNandita Rao Narla at PEPR 2021 about data deletion: https://www.youtube.com/watch?v=f6-EUEmBuPw&list=PL_cjZ5iVWe7n0sU5g0o8zTZSMLAvfl4nL&index=8\nUK ICO FAQ on responding to employee access requests\, with some really helpful examples detailing just how much sifting through documents needs to be done: https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/employment/subject-access-request-q-and-as-for-employers/\n\n  \nSpeaker:\nJosh Schwartz\nJosh Schwartz is the CEO and cofounder of Phaselab\, a startup that helps privacy teams manage their unstructured data. Prior to founding Phaselab\, he served as CTO and Data Protection Officer of Chartbeat\, a leading analytics company\, where he led the operation of petabyte-scale data infrastructure. He studied machine learning as a PhD student at MIT’s Computer Science and Artificial Intelligence Lab. \n  \nModerator:\nNicole Nguyen \nNicole Nguyen is an enthusiastic privacy professional\, deftly bridging technology\, business\, and law. 
Beginning in intellectual property and patent prosecution\, she gained practical legal insights and technical acumen\, which she later applied to data privacy across various regulated industries. She honed her skills across operations\, regulatory analysis\, and engineering to streamline data handling\, shape privacy policies\, and drive privacy-by-design principles. Recognized for her leadership and technical prowess\, Nicole guides strategic decisions\, resolves complex issues\, and bolsters business resilience. Her passion lies in crafting comprehensive data governance programs\, weaving security\, privacy\, and risk disciplines into robust frameworks that safeguard business value and promote a future-ready\, privacy-conscious culture. She holds a B.S. in Chemical and Electrical Engineering and an M.S. in Information Systems\, Cybersecurity and Business Intelligence. \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only event. Join as an Ambassador before the 1st of each month to get invited to this month’s event! Please reach out to a current member to be invited as a guest. If you are already a member\, subscribe to our PETed Mailing List for announcements and monthly invitations!
URL:https://instituteofprivacydesign.org/event/data-access-and-deletion-in-the-large-scale-structured-and-unstructured-datasets-privacy-engineering-technology-education-discussion-peted/
LOCATION:https://iopd.whereby.com/peted-discussion
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/03/4.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Halifax:20240425T120000
DTEND;TZID=America/Halifax:20240425T130000
DTSTAMP:20260405T152757
CREATED:20240103T231906Z
LAST-MODIFIED:20240416T180307Z
UID:5410-1714046400-1714050000@instituteofprivacydesign.org
SUMMARY:From Permission Usage to Compliance Analysis | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a recorded 10-minute introduction followed by a 40-minute informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy-related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. \nThe participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nApril 25\, 2024 @ 12:00 PM EDT / 5:00 PM CET \n  \nTopic:\nFrom Permission Usage to Compliance Analysis: Lessons Learned Analyzing Android Apps for 10 years \n  \nSynopsis:\nWe have been analyzing Android apps for regulatory requirements for eight years. We have analyzed Android apps for COPPA\, CCPA\, and Health Compliance (HIPAA\, HBNR\, and FTC Act). In this talk\, I present the lessons learned after analyzing thousands of apps\, the technical challenges we face while analyzing Android apps\, patterns of non-compliance issues we uncovered\, and the likely root causes of non-compliance. The talk will touch upon challenges posed by third-party code in complying with regulatory requirements\, the importance of privacy assessment\, and how the technical realm has changed over time for privacy assessments. \n  \nProblem Statement:\nWhat are the risks posed by the use of third-party code in the mobile ecosystem? How can you identify those risks before they become a regulatory headache? 
\n  \nRelated PETs (Privacy-Enhancing Technologies):\n\nAccountability\nCode Transparency\nPermission Usage\nPrivacy Assessment\nDynamic Analysis\n\n  \nPre-Discussion Resources:\n\nhttps://petsymposium.org/popets/2018/popets-2018-0021.php \nhttps://www.usenix.org/system/files/sec19-reardon.pdf \nhttps://petsymposium.org/popets/2023/popets-2023-0072.php \nhttps://conpro23.ieee-security.org/papers/samarin-conpro23.pdf \nhttps://www.ieee-security.org/TC/SPW2021/ConPro/papers/samarin-conpro21.pdf \nhttps://www.ieee-security.org/TC/SPW2019/ConPro/papers/okoyomon-conpro19.pdf \nhttps://www.usenix.org/system/files/conference/usenixsecurity15/sec15-paper-wijesekera.pdf \nhttps://petsymposium.org/popets/2022/popets-2022-0108.pdf \nhttps://petsymposium.org/popets/2020/popets-2020-0050.pdf \nhttps://www.issa.org/event/taking-responsibility-for-someone-elses-code-studying-the-privacy-behaviors-of-mobile-apps-at-scale/ \n\n  \nSpeaker:\nPrimal Wijesekera\nPrimal Wijesekera is a research scientist in the Usable Security and Privacy Research Group at ICSI and holds an EECS appointment at the University of California\, Berkeley. His research exposes current privacy and security vulnerabilities and provides systematic solutions to meet consumers’ privacy expectations. He has extensive experience in mobile app analysis for privacy and security violations and implementing privacy protections for Android. He has published in top-tier security venues (IEEE S&P\, USENIX Security) and usable security and privacy venues (ACM CHI\, SOUPS\, PETS). He received his Ph.D. from the University of British Columbia\, although he carried out his Ph.D. research at UC Berkeley. His research on privacy on mobile platforms has received the Caspar Bowden Award for Outstanding Research in Privacy Enhancing Technologies\, the USENIX Security Distinguished Paper Award\, the AEPD Emilio Aced Personal Data Protection Research Award\, and the CNIL-INRIA Privacy Award. 
He is a PI/Co-PI on multiple NSF projects. He has also helped federal regulators in sensitive privacy investigations and has held an engineering position at Microsoft. \n  \nModerator:\nKimberly Lancaster \nTrusted Privacy Advisor who Guides Data Protection\, Drives Operational Excellence\, and Leads with Integrity by aligning with InfoSec\, Security\, GRC\, Compliance\, and Data Governance. Board Member\, Speaker\, and Author. \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only event. Join as an Ambassador before the 1st of each month to get invited to this month’s event! Please reach out to a current member to be invited as a guest. If you are already a member\, subscribe to our PETed Mailing List for announcements and monthly invitations!
URL:https://instituteofprivacydesign.org/event/march-2024-privacy-engineering-technology-education-discussion-peted/
LOCATION:https://iopd.whereby.com/peted-discussion
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/01/4-1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Halifax:20240424T150000
DTEND;TZID=America/Halifax:20240424T160000
DTSTAMP:20260405T152757
CREATED:20240401T194352Z
LAST-MODIFIED:20240401T194352Z
UID:6464-1713970800-1713974400@instituteofprivacydesign.org
SUMMARY:Q2 All Hands Meeting
DESCRIPTION:ALL HANDS MEETING 📢 Ambassadors\, Advisors\, Volunteers\, and Committee Members are all invited to join us Wednesday\, April 24th from 3:00 – 4:00 PM. We will meet to welcome our new members\, report on the new Standards Subcommittees\, and discuss the Privacy Design Seal currently in development! \nPlease reach out to admin22@instituteofprivacydesign.org if you have yet to receive an invitation. Guest invites are first come\, first served. If you have a friend or colleague interested\, send us their email and we’ll add them to the guest list!
URL:https://instituteofprivacydesign.org/event/q2-all-hands-meeting/
LOCATION:https://iopd.whereby.com/all-hands
CATEGORIES:All Hands Meeting
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/04/AllHandsApr24.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Halifax:20240328T130000
DTEND;TZID=America/Halifax:20240328T140000
DTSTAMP:20260405T152757
CREATED:20240131T192636Z
LAST-MODIFIED:20240312T181817Z
UID:5913-1711630800-1711634400@instituteofprivacydesign.org
SUMMARY:Deceptive Design – Dark Patterns Beyond the Interface | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a recorded 10-minute introduction followed by a 40-minute informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy-related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. \nThe participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nMarch 28\, 2024 @ 1:00 PM EDT / 6:00 PM CET \n  \nTopic:\nDeceptive Design – Dark Patterns Beyond the Interface \n  \nSynopsis:\nJoin us for a riveting exploration into the murky waters of ‘dark patterns’\, those cunning design strategies that steer us away from our true intentions online. Unravel the complexities of how these deceptive practices compromise not only our autonomy but also our wallets\, as we delve into the realms of consumer and data protection laws. With a keen eye on the European Union’s regulatory landscape\, Dr Leiser will dissect how current legislation like the General Data Protection Regulation\, the Unfair Commercial Practices Directive\, and the new Digital Services Act is being outmanoeuvred by Big Tech and ponder the necessary evolution of laws to curb these manipulative designs that lurk behind every click. From sneaky sign-ups to misleading layouts\, learn how to spot these digital traps and the legal mechanisms at our disposal to combat them. This talk is a clarion call for a more informed and vigilant digital citizenship in an age where technology’s reach is ubiquitous\, and its influence\, undeniable. 
\n  \nLessons Learned:\nThe talk is a clarion call for a multi-faceted approach to combating dark patterns\, combining legal reform\, education\, and ethical design principles to safeguard digital citizenship in an increasingly manipulative digital landscape. \n  \nPre-Webinar Resources:\n\n Deceptive Design website – https://deceptive.design \n\n  \nSpeaker:\nDr M.R. (Mark) Leiser\nDr M.R. Leiser is a regulatory theorist specialising in Digital\, Legal\, and Platform Regulation at Vrije Universiteit Amsterdam in the Department of Transnational Legal Studies (Amsterdam Law & Technology Institute). In his research and lecturing\, he focuses on law and digital technologies related to deceptive design\, dark patterns\, consumer protection\, and the use and regulation of AI and digital technologies. Alongside running the deceptive.design website\, he has published 30 pieces of academic work and several book chapters on AI\, company law\, deceptive design\, and data protection matters. He is a sought-after public speaker and has presented his findings to the UN\, the EU Parliament\, the Council of Europe\, and various regulators. \n  \nModerator:\nAndrei Dumitru \nAfter working more than ten years in the IT department of an International Organization\, I have focused in recent years on privacy and data protection. I think privacy is a fundamental right that cannot exist today without computer security\, so I am always searching for a balance between both and for ways privacy can be embedded into technology. I am particularly interested in privacy by design and dark patterns\, and always curious to see how privacy is integrated (or not) in current products and services. \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only event. Join as an Ambassador before the 1st of each month to get invited to this month’s event! Please reach out to a current member to be invited as a guest. 
If you are already a member\, subscribe to our PETed Mailing List for announcements and monthly invitations!
URL:https://instituteofprivacydesign.org/event/deceptive-design-dark-patterns-beyond-the-interface-privacy-engineering-technology-education-discussion-peted/
LOCATION:https://iopd.whereby.com/peted-discussion
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/01/4.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20240222T120000
DTEND;TZID=America/New_York:20240222T130000
DTSTAMP:20260405T152757
CREATED:20240103T231725Z
LAST-MODIFIED:20240207T204551Z
UID:5405-1708603200-1708606800@instituteofprivacydesign.org
SUMMARY:Deploying Decentralized Privacy-Preserving Contact Tracing | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a recorded 10-minute introduction followed by a 40-minute informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy-related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. \nThe participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nFebruary 22\, 2024 @ 12:00 PM ET / 6:00 PM CET \n  \nTopic:\nDeploying decentralized privacy-preserving contact tracing \n  \nProblem Statement:\nDigital contact tracing systems promised to help combat the COVID-19 pandemic\, but in doing so they introduce privacy risks. Privacy-friendly contact tracing systems enable notification of exposed people without privacy harms. \n  \nSynopsis:\nIn this webinar\, Wouter will talk about his experience designing a large-scale privacy-friendly digital contact tracing system that later led to the system adopted by Google and Apple\, and his experiences deploying such a privacy-friendly system in the wild. \n  \nPre-Webinar Resources:\n\nDeploying Decentralized\, Privacy-Preserving Proximity Tracing https://dl.acm.org/doi/pdf/10.1145/3524107\nDecentralized Privacy-Preserving Proximity Tracing (original DP3T whitepaper) https://arxiv.org/pdf/2005.12273.pdf\n\n  \nSpeaker:\nWouter Lueks\nWouter Lueks is a tenure-track faculty member at the CISPA Helmholtz Center for Information Security in Saarbrücken\, Germany. Before that\, he was a postdoctoral researcher at EPFL in Lausanne\, Switzerland\, where he worked with Carmela Troncoso. 
He is interested in solving real-world problems by designing end-to-end privacy-friendly systems. To do so he combines privacy\, applied cryptography\, and systems research. His work has real-world impact: for instance\, his designs for privacy-friendly contact tracing have been deployed on millions of phones around the world\, and his secure document search system is being deployed by a large organisation for investigative journalists. \n  \nModerator:\nAndrei Dumitru \nAfter working for more than ten years in the IT department of an international organization\, I have focused in recent years on privacy and data protection. I believe privacy is a fundamental right that cannot exist today without computer security\, so I am always seeking a balance between the two and looking at how privacy can be embedded into technology. I am particularly interested in privacy by design and dark patterns\, and always curious to see how privacy is integrated (or not) in current products and services. \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only event. Join as an Ambassador before the 1st of each month to get invited to the recurring monthly event. If you are already a member\, but need an invitation please contact admin22. 
URL:https://instituteofprivacydesign.org/event/dark-patterns-february-2024-privacy-engineering-technology-education-discussion-peted/
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/01/1-1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20240214T150000
DTEND;TZID=America/New_York:20240214T160000
DTSTAMP:20260405T152757Z
CREATED:20240117T190010Z
LAST-MODIFIED:20240117T190010Z
UID:5497-1707922800-1707926400@instituteofprivacydesign.org
SUMMARY:Q1 2024 All Hands Meeting
DESCRIPTION:Ambassadors\, Advisors\, Volunteers\, and Committee Members are all invited to join us Wednesday\, February 14th from 3:00 – 4:00 PM. We will meet to welcome our new members\, discuss community growth\, and report on the Privacy Design Seal in development! Agenda will be announced ASAP.  \nPlease reach out to admin22@instituteofprivacydesign.org if you are a member and have yet to receive invitation details. Guest invites limited. Ask a current member to nominate you to receive an invitation. If you are not a member\, apply to join as an Ambassador to get invited and see what’s up! 
URL:https://instituteofprivacydesign.org/event/q1-2024-all-hands-meeting/
CATEGORIES:All Hands Meeting
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/01/IOPD-AHM-Q12024.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20240125T120000
DTEND;TZID=America/New_York:20240125T130000
DTSTAMP:20260405T152757Z
CREATED:20240103T231752Z
LAST-MODIFIED:20240125T165323Z
UID:5403-1706184000-1706187600@instituteofprivacydesign.org
SUMMARY:Assurance Cases | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a one-hour informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases.\nThe participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nJanuary 25th\, 2024 @ 12:00 PM ET \n  \nTopic:\nAssurance Cases \n  \nSynopsis:\nAssurance cases are gaining traction as a means of certification in aerospace and other safety- and security-critical industries. However\, these assurance cases can become overwhelming and complicated\, even for moderately complex systems. Therefore\, there is a compelling need for new automation that can aid in creating and assessing assurance cases. In this introductory presentation\, intended to facilitate the subsequent discussion\, I introduce a rigorous framework that eliminates ad hoc construction of assurance cases\, with emphasis on the validity and soundness of the argumentation process\, confidence in the claims\, arguments\, and evidence\, and the systematic exploration of defeaters. I briefly discuss the tools and automation support for Assurance 2.0 that was developed in the CLARISSA project under the DARPA ARCOS program\, and finally highlight the key capabilities through examples. 
\n  \nPre-Webinar Resources:\n\nCLARISSA: Foundations\, Tools & Automation for Assurance Cases\, presented at the 42nd Digital Avionics Systems Conference (DASC)\, Barcelona\, Spain\, October 2023: https://www.csl.sri.com/~rushby/papers/clarissa-dasc23.pdf\nAdelard ASCE tool: https://www.adelard.com/asce/\nAdvoCATE NASA tool: https://ntrs.nasa.gov/citations/20220009664\nAssurance 2.0 methodology for assurance cases: https://arxiv.org/abs/2205.04522\nAssurance 2.0: A Manifesto: https://arxiv.org/abs/2004.10474 \n\n  \nSpeaker:\nVatsan Varadarajan\nDr. Srivatsan Varadarajan is a Technical Engineering Fellow within Advanced Technology for Honeywell Aerospace. His research areas include AI/ML technologies for autonomy\, certifiable software\, and formal methods (e.g.\, model checkers\, theorem provers) for automated verification. He has extensive expertise in the development of distributed\, fault-tolerant networks\, wireless communications\, and dependable\, embedded hardware and software platforms. In his current role\, he oversees specific research and technology development projects in the core areas of certifiable platforms and assurance technologies for autonomous avionics systems. He has over 30 publications in conferences and journals and 25 patents awarded to date. He has a PhD in Computer Science and MS degrees in Mathematics and in Computer Science and Engineering. \n  \nModerator:\nSteve Hickman\nSteve is the Founder of Epistimis\, a company dedicated to providing tools that support true Privacy by Design. Prior to founding Epistimis\, Steve worked in the privacy organization at Meta/Facebook\, where he saw first-hand how current approaches won’t scale. Prior to working at Meta\, Steve worked at Honeywell Aerospace Advanced Technology\, where he worked on design tooling for the US Army. (The Epistimis modeling tool uses the same conceptual foundation developed for the US military.) 
Prior to that\, he managed the Honeywell Aerospace invention/patent review process for 4 years\, which processed ~1000 invention disclosures each year. There he improved and automated processes\, created training\, and managed 12+ invention review teams consisting of 75+ technical experts from across Honeywell Aerospace. Prior to that\, Steve was one of the architects of HiLiTE\, a tool for test and code generation for data-flow and state designs used in Honeywell Aerospace’s flight and engine control software. Steve holds a BSEE from Rice University\, an MS in Computer Science from Southern Methodist University\, and a JD (Intellectual Property) from Mitchell Hamline School of Law. \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only event. Join as an Ambassador before the 1st of each month to get invited to the recurring monthly event. If you are already a member\, but need an invitation please contact admin22. 
URL:https://instituteofprivacydesign.org/event/assurance-cases-january-2024-privacy-engineering-technology-education-discussion-peted/
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2024/01/1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20231207T120000
DTEND;TZID=America/New_York:20231207T130000
DTSTAMP:20260405T152757Z
CREATED:20231025T221918Z
LAST-MODIFIED:20231025T223802Z
UID:4978-1701950400-1701954000@instituteofprivacydesign.org
SUMMARY:Vector Databases: AI Uses\, Privacy Risks\, and Mitigations | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our new IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a one-hour informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. The participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \nDate & Time:\nDecember 7th\, 2023 @ 12:00 PM ET \nTopic:\nVector Databases: AI Uses\, Privacy Risks\, and Mitigations \nProblem Statement:\nDevelopers are rushing to adopt new AI tools and techniques\, and private data for AI systems is shifting out of models and into vector databases. These new databases are immature from a security and privacy perspective\, and the attacks against them are numerous and growing by the day. Understanding\, controlling\, monitoring\, and protecting the data in these databases should be a top priority of security and privacy teams. \nSynopsis:\nA short discussion of what a vector database is\, how its popularity is soaring\, why organizations are adopting it\, and the many risks and misconceptions associated with the new types of data going into them. \nRelated PETs:\n\nHomomorphic Encryption\nData-in-Use Encryption\nCustomer-held Encryption Keys\n\nPre-Webinar Resources:\n\nGoogle: Meet AI’s multitool: Vector embeddings \nOpenAI: Introducing text and code embeddings \nIBM: What is retrieval-augmented generation? 
\nRetrieval Augmented Generation using Azure Machine Learning prompt flow \n“Embeddings Aren’t Human Readable” And Other Nonsense \nText Embeddings Reveal (Almost) As Much As Text \nInverting facial recognition models (Can we teach a neural net to convert face embedding vectors back to images?) \n\nSpeaker:\nPatrick Walsh \nPatrick Walsh has more than 20 years of experience building security products and enterprise software solutions. Most recently\, he ran an Engineering division at Oracle\, bringing productivity and insights to the world’s largest companies. Patrick now leads IronCore Labs\, a data protection platform that helps businesses take back control of their data so they can meet increasingly stringent data protection and privacy requirements. \nModerator:\nJanelle Hsia \nJanelle Hsia is the President and Founder of Privacy SWAN Consulting\, working as a trainer\, consultant\, and trusted advisor for strategic and tactical decision-making. While she is focused on the field of privacy and data protection\, Janelle Hsia is not a lawyer and brings a diverse background with strong leadership\, technical\, and business skills spanning 20 years in the areas of project management\, IT\, privacy\, security\, data governance\, and process improvement. Janelle Hsia is also Co-Founder and Vice-President of the Institute of Operational Privacy Design. \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only webinar. Join as an Ambassador to get invited to the recurring monthly event. If you are already a member\, but need an invitation please contact admin22@instituteofprivacydesign.org. 
URL:https://instituteofprivacydesign.org/event/vector-databases-ai-uses-privacy-risks-and-mitigations-privacy-engineering-technology-education-discussion-peted/
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2023/10/1-1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/New_York:20231102T120000
DTEND;TZID=America/New_York:20231102T130000
DTSTAMP:20260405T152757Z
CREATED:20231009T194457Z
LAST-MODIFIED:20231026T153729Z
UID:4893-1698926400-1698930000@instituteofprivacydesign.org
SUMMARY:What Can Go Wrong With Your AI? | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our new IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a one-hour informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. The participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n \nDate & Time:\nNovember 2nd\, 2023 @ 12:00 PM ET \n \nTopic:\nWhat Can Go Wrong With Your AI? \n \nProblem Statement:\nDo AI systems require a holistic risk-based approach? How can you identify and assess all the different risks? \n \nSynopsis:\nDuring this session we are going to discuss the problem of risk identification and assessment in AI systems. We will look into the ideas in Europe of introducing a Fundamental Rights Impact Assessment and we will explore some of the available tools. \n \nRelated PETs:\n\nPLOT4ai\nATLAS\n\n  \nPre-Webinar Resources:\n\nhttps://plot4.ai/\nhttps://datasociety.net/wp-content/uploads/2021/11/HUDIERA-Full-Paper_FINAL.pdf\nhttps://rm.coe.int/huderaf-coe-final-1-2752-6741-5300-v-1/1680a3f688\nhttps://www.ippapublicpolicy.org/file/paper/6476ec2fe5c7f.pdf\nhttps://www.government.nl/documents/reports/2022/03/31/impact-assessment-fundamental-rights-and-algorithms\n\n  \nSpeaker:\nIsabel Barberá \nIsabel Barberá is the co-founder of Rhite\, a consultancy specialised in Privacy Engineering and Responsible AI based in The Netherlands. She advises local and international organisations in the public and private sector on aspects related to Privacy Engineering\, Privacy and Security by Design and Responsible AI. 
Isabel is a member of the European Union Agency for Cybersecurity (ENISA) Working Group on Data Protection Engineering and a member of the European Data Protection Board (EDPB) Pool of Experts. She is also the author of the AI privacy threat modeling library PLOT4ai\, an open-source AI risk assessment tool created to help organisations build Responsible AI systems. Find her on LinkedIn at https://www.linkedin.com/in/isabelbarbera/ \n \n \nModerator:\nNicole Nguyen \nNicole Nguyen is an enthusiastic privacy professional\, deftly bridging technology\, business\, and law. Beginning in intellectual property and patent prosecution\, she gained practical legal insights and technical acumen\, which she later applied to data privacy across various regulated industries. She honed her skills across operations\, regulatory analysis\, and engineering to streamline data handling\, shape privacy policies\, and drive privacy-by-design principles. Recognized for her leadership and technical prowess\, Nicole guides strategic decisions\, resolves complex issues\, and bolsters business resilience. Her passion lies in crafting comprehensive data governance programs\, weaving security\, privacy\, and risk disciplines into robust frameworks that safeguard business value and promote a future-ready\, privacy-conscious culture. She holds a B.S. in Chemical and Electrical Engineering and an M.S. in Information Systems\, Cybersecurity\, and Business Intelligence. \n \n \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only webinar. Join as an Ambassador to get invited to the recurring monthly event. If you are already a member\, but need an invitation please contact admin22@instituteofprivacydesign.org. 
URL:https://instituteofprivacydesign.org/event/what-can-go-wrong-with-your-ai-privacy-engineering-technology-education-discussion-peted/
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2023/10/IOPD-LinkedIn.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Halifax:20230928T120000
DTEND;TZID=America/Halifax:20230928T130000
DTSTAMP:20260405T152757Z
CREATED:20230828T142102Z
LAST-MODIFIED:20230912T204835Z
UID:4645-1695902400-1695906000@instituteofprivacydesign.org
SUMMARY:Navigating the CRM Data Landscape: Security\, Compliance\, and Innovation | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our second webinar in the new IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a one-hour informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. \nThe participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nSeptember 28th\, 2023 @ 12:00 PM ET \n  \nTopic:\nNavigating the CRM Data Landscape: Security\, Compliance\, and Innovation \n  \nSynopsis:\nJoin us for an engaging and informative interactive session featuring Mike Smith\, Distinguished Security Architect at Salesforce. Mike is a Fellow of Information Privacy with the International Association of Privacy Professionals (IAPP)\, and is a seasoned expert in data security\, privacy\, and compliance\, particularly in the world of SaaS applications. In this session\, we will explore three critical aspects of managing data and protecting privacy in today’s digital world: \n\nProtecting Data in CRM: Discover practical strategies and best practices to safeguard your valuable customer data within your Customer Relationship Management (CRM) system. Learn how to mitigate risks\, prevent data breaches\, and build trust with your customers.\nComplying with Data Privacy Laws: Understand the tools that are available within Salesforce for privacy program management\, including data anonymization\, retention\, and managing Data Subjects Access Requests (DSARs). 
Learn how to ensure compliance while maintaining operational efficiency.\nUsing Generative AI with Enterprise Data\, in a Trusted and Secure Manner: Explore the possibilities of harnessing generative AI technologies for innovation within your enterprise\, while protecting personal data and other sensitive information that powers your business. Discover how to leverage AI in a secure\, responsible\, and ethical way to generate insights\, streamline tasks\, and drive business growth.\n\nOur host\, Nicole\, will moderate a dynamic Q&A session following the introduction\, where you’ll have the opportunity to ask questions and engage with Mike on these vital topics. Whether you’re a business leader\, IT professional\, or data enthusiast\, this interactive session is your gateway to navigating the complex world of data security\, compliance\, and innovation within Salesforce. Don’t miss out on this opportunity to stay informed and empowered in the data-driven era! \n  \nSpeaker:\nMike Smith \n  \nModerator:\nNicole Nguyen \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only event. Join as an Ambassador before the 1st of each month to get invited to the recurring monthly event. If you are already a member\, but need an invitation please contact admin22. 
URL:https://instituteofprivacydesign.org/event/september-privacy-engineering-technology-education-discussion-peted/
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2023/08/1-1.png
END:VEVENT
BEGIN:VEVENT
DTSTART;TZID=America/Halifax:20230824T120000
DTEND;TZID=America/Halifax:20230824T130000
DTSTAMP:20260405T152757Z
CREATED:20230817T222937Z
LAST-MODIFIED:20230818T181903Z
UID:4592-1692878400-1692882000@instituteofprivacydesign.org
SUMMARY:Using ChatGPT (or similar tools) with Privacy Compliance | Privacy Engineering & Technology Education Discussion (PETed)
DESCRIPTION:Join our first webinar in the new IOPD Privacy Engineering & Technology Education Discussion (PETed) Series! The format of the webinar will be a one-hour informal discussion and interaction with members of the IOPD. The goal will be a discussion on how to solve a specific privacy problem or privacy related resource topic and the latest implementation techniques for some of the biggest challenges like synthetic data\, zero-knowledge proofs\, homomorphic encryption\, and translucent databases. The participants will be asked to bring questions related to the topic. Come back the fourth Thursday of every month for a new discussion\, new speaker\, and new insights on the most cutting-edge privacy challenges! \n  \nDate & Time:\nAugust 24th\, 2023 @ 12:00 PM ET \n  \nTopic:\nUsing ChatGPT (or similar tools) with Privacy Compliance: Enabling companies to identify and use unstructured data for analysis and sharing. \n  \nSynopsis:\nDo your employees unknowingly put personal or sensitive information (like mobile number\, email address\, social security number\, etc.) in emails or PDF documents and accidentally expose that information to the wrong people? In a world where employees are using tools like ChatGPT for productivity\, do they need help generating synthetic data or learning what data to redact before uploading? Do you need help identifying and correcting privacy policy violations related to data sharing in real time AND being able to provide leadership metrics? Are you looking for ways to show accountability with your privacy program? If you need answers to questions like these\, please join us as we explore a new privacy tool\, with a live demonstration\, that has been called the Grammarly of data privacy. 
\n  \nRelated PETs:\n\nSynthetic data\nRedaction\nEncryption\nIdentify privacy policy violations\nReal-time personal data awareness/training\nMetrics and accountability\n\n  \nPre-Webinar Resources:\n\nhttps://www.entrepreneur.com/en-in/technology/this-platform-is-grammarly-for-data-privacy/441441\nhttps://www.unifi.ai/download-priviliy\nhttps://chrome.google.com/webstore/detail/privily/ckgpondnkhpodegigclccglbdflgmmld?hl=en&authuser=3 \n\n  \nSpeaker:\nPramod Misra \nPramod Misra is passionate about using data for automation\, analytics\, and decision-making. He is a co-founder of Unifi.ai\, based in Atlanta. Unifi.ai is on a mission to enable corporate employees to use AI tools in compliance with privacy requirements. His work has been quoted in Forbes\, Inc.\, Entrepreneur\, CEOWORLD\, and other leading publications. He has over 20 years of full-time work experience and has worked with companies like Vodafone\, Novartis\, and P&G. Pramod is also an Advisor of the IOPD. \n  \nModerator:\nJanelle RW Hsia \nJanelle Hsia is the President and Founder of Privacy SWAN Consulting\, working as a trainer\, consultant\, and trusted advisor for strategic and tactical decision-making. While she is focused on the field of privacy and data protection\, Janelle Hsia is not a lawyer and brings a diverse background with strong leadership\, technical\, and business skills spanning 20 years in the areas of project management\, IT\, privacy\, security\, data governance\, and process improvement. Janelle Hsia is also Co-Founder and Vice-President of the Institute of Operational Privacy Design. \n  \nThe IOPD Privacy Engineering & Technology Education Discussion (PETed) Series is a members-only webinar. Join as an Ambassador to get invited to the recurring monthly event. If you are already a member\, but need an invitation please contact admin22. 
URL:https://instituteofprivacydesign.org/event/peted-aug23-using-chatgpt-or-similar-tools-with-privacy-compliance/
CATEGORIES:PETed
ATTACH;FMTTYPE=image/png:https://instituteofprivacydesign.org/wp-content/uploads/2023/08/1.png
END:VEVENT
END:VCALENDAR