How Privacy-Enhancing Technologies Are Fulfilling Cryptography's Potential
Published on October 31, 2022 at 01:45AM
Here's the Guardian's report on new cryptographic techniques where "you can share data while keeping that data private" — known by the umbrella term "privacy-enhancing technologies" (or "Pets"). They offer opportunities for data holders to pool their data in new and useful ways. In the health sector, for example, strict rules prohibit hospitals from sharing patients' medical data. Yet if hospitals were able to combine their data into larger datasets, doctors would have more information, which would enable them to make better decisions on treatments. Indeed, a project in Switzerland using Pets has since June allowed medical researchers at four independent teaching hospitals to conduct analysis on their combined data of about 250,000 patients, with no loss of privacy between institutions.

Juan Troncoso, co-founder and CEO of Tune Insight, which runs the project, says: "The dream of personalised medicine relies on larger and higher-quality datasets. Pets can make this dream come true while complying with regulations and protecting people's privacy rights. This technology will be transformative for precision medicine and beyond."

The past couple of years have seen the emergence of dozens of Pet startups in advertising, insurance, marketing, machine learning, cybersecurity, fintech and cryptocurrencies. According to research firm Everest Group, the market for Pets was $2bn last year and will grow to more than $50bn in 2026. Governments are also getting interested. Last year, the United Nations launched its "Pet Lab", which had nothing to do with the welfare of domestic animals, but was instead a forum for national statistical offices to find ways to share their data across borders while protecting the privacy of their citizens. Jack Fitzsimons, founder of the UN Pet Lab, says: "Pets are one of the most important technologies of our generation. They have fundamentally changed the game, because they offer the promise that private data is only used for its intended purposes...."

The emergence of applications has driven the theory, which is now sufficiently well developed to be commercially viable. Microsoft, for example, uses fully homomorphic encryption when you register a new password: the password is encrypted and then sent to a server that checks whether or not that password is in a list of passwords that have been discovered in data breaches, without the server being able to identify your password. Over the last year or so, Meta, Google and Apple have also been introducing similar tools into some of their products.

The article offers quick explanations of zero-knowledge proofs, secure multiparty computation, and fully homomorphic encryption (which allows a second party to perform analytics on data without ever reading the data or learning the result). And "In addition to new cryptographic techniques, Pets also include advances in computational statistics such as 'differential privacy', an idea from 2006 in which noise is added to results in order to preserve the privacy of individuals."
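To make the secure multiparty computation idea concrete, here is a minimal Python sketch of additive secret sharing, the textbook building block behind such systems. It is an illustration only, not the protocol used by Tune Insight or any product named above; the three-hospital scenario and the patient counts are invented for the example.

```python
# A toy illustration of secure multiparty computation via additive secret
# sharing; the three-hospital scenario and the counts are hypothetical.
import secrets

MODULUS = 2**61 - 1  # all arithmetic is done modulo a large prime

def share(value, n_parties):
    """Split a private integer into n random-looking shares that sum to it."""
    shares = [secrets.randbelow(MODULUS) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MODULUS)
    return shares

# Each hospital's private patient count (never sent to the others in the clear).
private_counts = [1200, 875, 403]

# Every hospital splits its count and hands one share to each computing party;
# any single share, or any incomplete set of shares, reveals nothing on its own.
all_shares = [share(count, 3) for count in private_counts]

# Each computing party adds up the shares it holds...
partial_sums = [sum(column) % MODULUS for column in zip(*all_shares)]

# ...and only the combination of all partial sums yields the aggregate.
total = sum(partial_sums) % MODULUS
print(total)  # 2478: the joint statistic, with no individual count exposed
```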
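And a correspondingly small sketch of the core move in differential privacy, the 2006 idea quoted above: calibrated random noise is added to a query answer so that no single individual's record noticeably shifts the published result. The epsilon value and the counts here are hypothetical, chosen only to show the mechanism.

```python
# A toy illustration of differential privacy via the Laplace mechanism.
# The parameter values and counts are hypothetical, not from the article.
import numpy as np

def dp_count(true_count, epsilon, sensitivity=1.0):
    """Return a noisy count: Laplace noise with scale sensitivity/epsilon.

    Smaller epsilon means more noise and stronger privacy. A counting
    query changes by at most 1 when one person is added or removed,
    so its sensitivity is 1.
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Publish how many of the pooled patients match some condition without
# letting the output pin down any single patient's record.
true_answer = 1234  # hypothetical exact count
print(round(dp_count(true_answer, epsilon=0.5)))
```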
Read more of this story at Slashdot.