TENTATIVE PROGRAMME (subject to changes)

18/03 (Monday)
09:15-09:30: Welcome
09:30-10:30: Luca De Feo -- Isogeny graphs in cryptography (1)
11:00-12:00: Luca De Feo -- Isogeny graphs in cryptography (2)
14:00-17:00: Damien Vergnaud -- Introduction to public-key cryptography
18:00-19:00: Luca De Feo -- Isogeny graphs in cryptography (3)

19/03 (Tuesday)
08:45-10:15: Damien Stehlé -- Introduction to lattice-based cryptography
10:45-12:15: Pierrick Gaudry -- Analysis of discrete logarithm algorithms
17:30-18:30: Chloe Martindale -- Cryptographic applications of isogeny graphs of genus 2 and 3 curves
18:45-19:45: Zvika Brakerski -- Worst-case hardness for LPN and cryptographic hashing via code smoothing

20/03 (Wednesday)
08:45-10:15: Léo Ducas -- Lattice algorithms
10:45-12:15: Pierrick Gaudry -- Analysis of discrete logarithm algorithms
17:30-18:30: Christophe Petit -- Isogeny-based cryptography: cryptanalysis results
18:45-19:45: Benjamin Wesolowski -- The discrete logarithm problem in finite fields of small characteristic

21/03 (Thursday)
08:45-10:15: Damien Stehlé -- Introduction to lattice-based cryptography
10:45-12:15: Léo Ducas -- Lattice algorithms
17:30-18:30: Thijs Laarhoven -- SVP algorithms

22/03 (Friday)
09:30-10:30: Wouter Castryck -- Structured variants of LWE
11:00-12:00: Shweta Agrawal -- Mathematical assumptions underlying code obfuscation
ABSTRACTS

Analysis of discrete logarithm algorithms: arithmetic, analytic and geometric tools (mini-course, Pierrick Gaudry). The discrete logarithm problem in finite fields is an important algorithmic question upon which the security of many cryptographic protocols relies.

Isogeny graphs in cryptography (mini-course, Luca De Feo). Elliptic curves have been at the heart of public-key cryptography for more than 30 years. However, most of what can currently be achieved with elliptic curves will be rendered useless by the advent of large-scale quantum computers.

Introduction to lattice-based cryptography (mini-course, Damien Stehlé). Lattice-based cryptography is one of the most promising approaches to obtaining efficient cryptographic primitives conjectured to resist quantum computation. For instance, in the second round of the NIST standardization process for post-quantum cryptography, 12 out of the 26 remaining candidates rely on lattices. In the area of lattice-based cryptography, the Learning With Errors problem (LWE), introduced by Regev in 2005, plays a central role.

Reduction algorithms for lattices and ideal lattices (mini-course, Léo Ducas). Lattice reduction is the task of finding a short basis of a lattice. It is an important tool in computational number theory and in cryptanalysis.

Mathematical assumptions underlying code obfuscation (Shweta Agrawal). In recent years, there has been significant interest in constructing the cryptographic primitive of "indistinguishability obfuscation". Standard cryptographic hardness assumptions appear insufficient for this task, and we now have a variety of new mathematical conjectures to fill the gap. I will define the notion of indistinguishability obfuscation, briefly describe its importance, and discuss the new mathematical conjectures, hard distributions, known attacks and open problems.

Cryptographic applications of isogeny graphs of genus 2 and 3 curves (Chloe Martindale).
We will introduce some background theory on genus 2 and 3 curves and look at the structures of the isogeny graphs that occur for some cases of these curves. We will show how knowledge of the graph structure can be used for both pre- and post-quantum cryptographic applications. This will cover some current research topics as well as some open questions in the area.

The discrete logarithm problem in finite fields of small characteristic (Benjamin Wesolowski). The presumed difficulty of computing discrete logarithms in certain groups is essential for the security of a number of communication protocols deployed today. One of the most classic choices for the underlying group is the multiplicative group of a finite field. However, when the characteristic of the field is small, recent algorithms make it possible to compute logarithms efficiently. These methods are only heuristic: they seem to always work, yet we do not know how to prove it. We will discuss the gap between the fastest provable and heuristic algorithms, and recent directions towards a fast provable algorithm.

Structured variants of LWE (Wouter Castryck). The celebrated LWE problem is about solving perturbed linear systems of equations over finite fields. During the last decade, several variants of this problem have been proposed in which the linear part carries extra structure; the main goal is to speed up and compactify the cryptographic applications without (conjecturally) compromising security. Along with these proposals came various suggestions for the error distribution from which the perturbation is to be sampled. We will give an overview of the existing "structured" variants of LWE, discuss their similarities and differences, and point out some bad choices that do affect security.

Isogeny-based cryptography: cryptanalysis results (Christophe Petit).
In this talk we will survey known results on the security of isogeny-based protocols, focusing in particular on existing cryptanalysis approaches. We also aim to highlight which problems in this area are easy and which remain hard.

Worst-case hardness for LPN and cryptographic hashing via code smoothing (Zvika Brakerski). We present a worst-case decoding problem whose hardness reduces to that of solving the Learning Parity with Noise (LPN) problem, in some parameter regime. Prior to this work, no worst-case hardness result was known for LPN (as opposed to syntactically similar problems such as Learning with Errors). The caveat is that this worst-case problem is only mildly hard and in particular admits a quasi-polynomial time algorithm, whereas the LPN variant used in the reduction requires an extremely high noise rate of $1/2-1/\poly(n)$. Thus we can only show that "very hard" LPN is harder than some "very mildly hard" worst-case problem. We note that LPN with noise $1/2-1/\poly(n)$ already implies symmetric cryptography.
Specifically, we consider the $(n,m,w)$-nearest codeword problem ($(n,m,w)$-NCP) which takes as input a generating matrix for a binary linear code in $m$ dimensions and rank $n$, and a target vector which is very close to the code (Hamming distance at most $w$), and asks to find the codeword nearest to the target vector. We show that for balanced (unbiased) codes and for relative error $w/m \approx {\log^2 n}/{n}$, $(n,m,w)$-NCP can be solved given oracle access to an LPN distinguisher with noise ratio $1/2-1/\poly(n)$.
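The $(n,m,w)$-NCP defined above can be made concrete with a toy brute-force decoder. This is a minimal sketch with illustrative parameters of my own choosing, not one of the algorithms from the talk; it is exponential in $n$ and only viable at toy sizes.

```python
import itertools
import numpy as np

def nearest_codeword(G, t):
    """Brute-force (n, m, w)-NCP solver: G is an n x m generator matrix
    over F_2, t a length-m target vector; return the codeword c = x G mod 2
    minimising the Hamming distance to t. Exponential in n."""
    n, m = G.shape
    best, best_d = None, m + 1
    for x in itertools.product((0, 1), repeat=n):
        c = (np.array(x) @ G) % 2
        d = int(np.count_nonzero(c != t))
        if d < best_d:
            best, best_d = c, d
    return best, best_d

# Plant a codeword, flip one coordinate (w = 1), then decode.
rng = np.random.default_rng(1)
G = rng.integers(0, 2, size=(4, 12))
c = (rng.integers(0, 2, size=4) @ G) % 2
t = c.copy()
t[0] ^= 1
decoded, dist = nearest_codeword(G, t)
```

Since the planted codeword is at Hamming distance 1 from the target, the decoder always returns a codeword at distance at most 1.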
Our proof relies on a smoothing lemma for codes which we show to have further implications: we show that $(n,m,w)$-NCP with the aforementioned parameters lies in the complexity class $SearchBPP^{SZK}$ (i.e. it is reducible to a problem that has a statistical zero-knowledge protocol), implying that it is unlikely to be $NP$-hard. We then show that the hardness of LPN with very low noise rate $\log^2(n)/n$ implies the existence of collision-resistant hash functions (our aforementioned result implies that in this parameter regime LPN is also in $BPP^{SZK}$).
Joint work with Vadim Lyubashevsky, Vinod Vaikuntanathan and Daniel Wichs.
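LPN instances of the kind appearing in the abstract above (and the LWE analogue from Stehlé's mini-course) are easy to write down. The following is a hedged sketch with toy parameters of my own choosing: it generates LPN samples $b = As + e \bmod 2$ with i.i.d. Bernoulli noise.

```python
import numpy as np

def lpn_samples(n=16, m=64, tau=0.125, rng=None):
    """Generate m LPN samples (A, b = A s + e mod 2) for a secret s in F_2^n.

    e has i.i.d. Bernoulli(tau) coordinates. Toy parameters for illustration;
    the hard regime in the talk uses noise rate tau = 1/2 - 1/poly(n).
    """
    rng = rng or np.random.default_rng(0)
    s = rng.integers(0, 2, size=n)              # secret vector
    A = rng.integers(0, 2, size=(m, n))         # uniform public matrix
    e = (rng.random(m) < tau).astype(int)       # Bernoulli(tau) noise
    b = (A @ s + e) % 2
    return A, b, s, e

A, b, s, e = lpn_samples()
# Without e, s is recovered by Gaussian elimination over F_2;
# the noise is what makes the problem conjecturally hard.
```

By construction the residual $b - As \bmod 2$ equals the noise vector $e$ exactly.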
Finding a Nash Equilibrium is No Easier than Breaking Fiat-Shamir (Alon Rosen). The Fiat-Shamir heuristic transforms a public-coin interactive proof into a non-interactive argument by replacing the verifier with a cryptographic hash function that is applied to the protocol's transcript. Constructing hash functions for which this transformation is sound is a central and long-standing open question in cryptography.

Evolutionary lattice algorithms (Thijs Laarhoven). The security of lattice-based cryptography relies on the hardness of lattice problems such as the shortest vector problem, and on our inability to solve these problems efficiently with state-of-the-art lattice algorithms. To accurately assess the security of these schemes, it is important to understand the underlying cryptanalytic algorithms well and to see how they relate to methods from other research areas. In this talk we will look at lattice problems from an artificial-intelligence point of view and discuss how various techniques and tools from evolutionary algorithms naturally appear in the current best methods for solving lattice problems in practice.
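The Fiat-Shamir transform described in Rosen's abstract can be illustrated with a toy non-interactive Schnorr proof, where hashing the transcript with SHA-256 replaces the verifier's random challenge. This is a minimal sketch under assumptions of my own: the group parameters are insecure toy values, and the commitment nonce would have to be fresh and secret in practice.

```python
import hashlib

# Toy Schnorr group: g = 4 generates the subgroup of prime order q = 509
# inside (Z/1019)^*. Insecure parameters, for illustration only.
p, q, g = 1019, 509, 4

def fs_challenge(*parts):
    """Fiat-Shamir step: derive the challenge by hashing the transcript,
    replacing the verifier's random coin."""
    data = b"|".join(str(x).encode() for x in parts)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def prove(x, r):
    """Non-interactive proof of knowledge of x with y = g^x mod p.
    r is the commitment nonce (fresh and secret in a real protocol)."""
    y = pow(g, x, p)
    a = pow(g, r, p)                  # prover's commitment
    c = fs_challenge(g, y, a)         # hash of the transcript as challenge
    z = (r + c * x) % q
    return y, (a, z)

def verify(y, proof):
    a, z = proof
    c = fs_challenge(g, y, a)         # verifier recomputes the same challenge
    return pow(g, z, p) == a * pow(y, c, p) % p

y, proof = prove(x=7, r=123)
```

Correctness follows from $g^z = g^{r + cx} = a \cdot y^c \bmod p$; tampering with any part of the proof changes the check.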