{"id":166,"date":"2021-05-24T02:53:45","date_gmt":"2021-05-24T02:53:45","guid":{"rendered":"http:\/\/www.podc.org\/podc2021\/?page_id=166"},"modified":"2021-10-29T14:41:40","modified_gmt":"2021-10-29T14:41:40","slug":"cynthia-dwork","status":"publish","type":"page","link":"https:\/\/www.podc.org\/podc2021\/cynthia-dwork\/","title":{"rendered":"Cynthia Dwork"},"content":{"rendered":"\n<figure class=\"wp-block-image size-large is-style-default\"><img loading=\"lazy\" decoding=\"async\" width=\"768\" height=\"1024\" src=\"https:\/\/www.podc.org\/podc2021\/wp-content\/uploads\/sites\/13\/2021\/06\/cynthia-min-768x1024.jpg\" alt=\"\" class=\"wp-image-198\" srcset=\"https:\/\/www.podc.org\/podc2021\/wp-content\/uploads\/sites\/13\/2021\/06\/cynthia-min-768x1024.jpg 768w, https:\/\/www.podc.org\/podc2021\/wp-content\/uploads\/sites\/13\/2021\/06\/cynthia-min-225x300.jpg 225w, https:\/\/www.podc.org\/podc2021\/wp-content\/uploads\/sites\/13\/2021\/06\/cynthia-min-1152x1536.jpg 1152w, https:\/\/www.podc.org\/podc2021\/wp-content\/uploads\/sites\/13\/2021\/06\/cynthia-min-1536x2048.jpg 1536w, https:\/\/www.podc.org\/podc2021\/wp-content\/uploads\/sites\/13\/2021\/06\/cynthia-min-1200x1600.jpg 1200w, https:\/\/www.podc.org\/podc2021\/wp-content\/uploads\/sites\/13\/2021\/06\/cynthia-min-scaled.jpg 1920w\" sizes=\"auto, (max-width: 768px) 85vw, 768px\" \/><\/figure>\n\n\n\n<p class=\"has-text-align-center\"><strong>Differential Privacy in Distributed Environments: An Overview and Open Questions<\/strong><br>Cynthia Dwork, Harvard University and Radcliffe Institute for Advanced Study<\/p>\n\n\n\n<h2 class=\"has-text-align-left wp-block-heading\">Talk Abstract<\/h2>\n\n\n\n<p>Differential privacy is a mathematically rigorous definition of privacy tailored to statistical analysis of large datasets. Differentially private algorithms are equipped with a parameter which controls the formal measure of privacy loss. 
All algorithms have utility\/privacy tradeoffs, and the goal of algorithmic research in differential privacy is to optimize these tradeoffs.<br><br>Differential privacy is most widely studied in the centralized model, in which a trusted and trustworthy curator has access to raw data. Deployment in industry has focused on the local model, where privacy is \u201crolled into\u201d the data on the client via randomization before being collected. There is a separation between the power of the centralized and local models.<br><br>After a brief recap of differential privacy and its properties, we will survey a few highlights of differential privacy in a variety of distributed settings that lie between the local and centralized models, and conclude with suggestions for future research.<\/p>\n\n\n\n<p><strong>Keywords<\/strong><br>Differential privacy; distributed algorithms<\/p>\n\n\n\n<h2 class=\"has-text-align-left wp-block-heading\">Speaker Biography<\/h2>\n\n\n\n<p>Cynthia Dwork, Gordon McKay Professor of Computer Science at the Harvard Paulson School of Engineering, Radcliffe Alumnae Professor at the Radcliffe Institute for Advanced Study, and Affiliated Faculty at Harvard Law School and Harvard Department of Statistics, uses theoretical computer science to place societal problems on a firm mathematical foundation. Dwork\u2019s earliest work in distributed computing established the pillars on which every fault-tolerant system has been built for decades. Her innovations have modernized cryptography for the ungoverned interactions of the internet and the era of quantum computing, and her invention of Differential Privacy has revolutionized privacy-preserving statistical data analysis. In 2012 she launched the theoretical investigation of algorithmic fairness. 
The winner of numerous awards, Dwork is a member of the NAS, the NAE, the American Academy of Arts and Sciences, and the American Philosophical Society.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Differential Privacy in Distributed Environments: An Overview and Open QuestionsCynthia Dwork, Harvard University and Radcliffe Institute for Advanced Study Talk Abstract Differential privacy is a mathematically rigorous definition of privacy tailored to statistical analysis of large datasets. Differentially private algorithms are equipped with a parameter which controls the formal measure of privacy loss. All algorithms &hellip; <a href=\"https:\/\/www.podc.org\/podc2021\/cynthia-dwork\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Cynthia Dwork&#8221;<\/span><\/a><\/p>\n","protected":false},"author":17,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-166","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/www.podc.org\/podc2021\/wp-json\/wp\/v2\/pages\/166","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.podc.org\/podc2021\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/www.podc.org\/podc2021\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/www.podc.org\/podc2021\/wp-json\/wp\/v2\/users\/17"}],"replies":[{"embeddable":true,"href":"https:\/\/www.podc.org\/podc2021\/wp-json\/wp\/v2\/comments?post=166"}],"version-history":[{"count":5,"href":"https:\/\/www.podc.org\/podc2021\/wp-json\/wp\/v2\/pages\/166\/revisions"}],"predecessor-version":[{"id":216,"href":"https:\/\/www.podc.org\/podc2021\/wp-json\/wp\/v2\/pages\/166\/revisions\/216"}],"wp:attachment":[{"href":"https:\/\/www.podc.org\/podc2021\/wp-json\/wp\/v2\/media?parent=166"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}