dp-transformers repository
Motivated by our recent work, we are releasing a repository for training transformer models with differential privacy. Our repository is based on integrating the Opacus library into the Hugging Face …
dp-transformers
Training transformer models with differential privacy
Transformer models have recently taken the field of Natural Language Processing (NLP) by storm, as large language models based on the transformer architecture have shown impressive performance across a…
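The core mechanism behind differentially private training is DP-SGD: each example's gradient is clipped to a fixed norm, the clipped gradients are summed, and calibrated Gaussian noise is added before averaging. The sketch below is an illustrative pure-NumPy version of that aggregation step, not code from the dp-transformers repository; the function name and parameters are our own.

```python
import numpy as np

def dp_sgd_aggregate(per_example_grads, clip_norm, noise_multiplier, rng):
    """Illustrative DP-SGD gradient aggregation (not dp-transformers code).

    Clips each per-example gradient to at most `clip_norm`, sums them,
    adds Gaussian noise with std noise_multiplier * clip_norm, and
    averages over the batch.
    """
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down only if the gradient exceeds the clipping bound.
        scale = min(1.0, clip_norm / (norm + 1e-12))
        clipped.append(g * scale)
    summed = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=summed.shape)
    return (summed + noise) / len(per_example_grads)
```

In a real training loop (e.g. Opacus with a Hugging Face model), per-example gradients are computed efficiently by the framework and a privacy accountant tracks the cumulative (epsilon, delta) budget across steps.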
Availability attacks create shortcuts
Accelerating the Delfs-Galbraith algorithm with fast subfield root detection
In this talk, we discuss the general supersingular isogeny problem, the foundational hardness assumption underpinning isogeny-based cryptography. We implement and optimize the best attack against this problem – the Delfs-Galbraith algorithm – to explicitly determine…