Hardness of Bounded Distance Decoding on Lattices in l_p Norms
Monday, October 19, 2020 1:30pm to 2:45pm

About this Event
Bounded Distance Decoding (BDD_{p,\alpha}) is the problem of decoding a lattice when the target point is promised to be within an \alpha factor of the minimum distance of the lattice, in the l_p norm. We prove that BDD_{p,\alpha} is NP-hard under randomized reductions where \alpha goes to 1/2 as p goes to \infty (and for \alpha = 1/2 when p = \infty), thereby showing the hardness of decoding for distances approaching the unique-decoding radius for large p. We also show fine-grained hardness for BDD_{p,\alpha}. For example, we prove that for all p \in [1,\infty) \setminus 2\mathbb{Z} and constants C > 1, \epsilon > 0, there is no 2^{(1-\epsilon)n/C}-time algorithm for BDD_{p,\alpha} for some constant \alpha (which approaches 1/2 as p goes to \infty), assuming the randomized Strong Exponential Time Hypothesis (SETH). Moreover, essentially all of our results also hold (under analogous non-uniform assumptions) for BDD with preprocessing, in which unbounded precomputation can be applied to the lattice before the target is available.
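For concreteness, the promise problem can be stated as follows; this is a standard formulation, with \lambda_1^{(p)}(\mathcal{L}) denoting the minimum distance of the lattice \mathcal{L} in the l_p norm (notation assumed here, not taken from the abstract):

% BDD_{p,\alpha}: given a lattice and a target promised to lie within
% \alpha times the minimum distance, find the close lattice vector.
\[
  \mathrm{BDD}_{p,\alpha}: \text{given a basis } B \text{ of } \mathcal{L} = \mathcal{L}(B) \subseteq \mathbb{R}^n \text{ and } t \in \mathbb{R}^n \text{ with } \mathrm{dist}_p(t, \mathcal{L}) \le \alpha \cdot \lambda_1^{(p)}(\mathcal{L}),
\]
\[
  \text{where } \lambda_1^{(p)}(\mathcal{L}) = \min_{v \in \mathcal{L} \setminus \{0\}} \|v\|_p, \quad \text{find } v \in \mathcal{L} \text{ with } \|t - v\|_p = \mathrm{dist}_p(t, \mathcal{L}).
\]

Note that for \alpha < 1/2 the closest lattice vector to t is unique: two distinct lattice vectors each within distance \alpha \lambda_1^{(p)} of t would, by the triangle inequality, be within distance less than \lambda_1^{(p)} of each other. This is why \alpha = 1/2 marks the unique-decoding radius referenced above.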
Compared to prior work on the hardness of BDD_{p,\alpha} by Liu, Lyubashevsky, and Micciancio (APPROX-RANDOM 2008), our results improve the values of \alpha for which the problem is known to be NP-hard for all p > p_1 \approx 4.2773, and give the very first fine-grained hardness for BDD (in any norm). Our reductions rely on a special family of "locally dense" lattices in l_p norms, which we construct by modifying the integer-lattice sparsification technique of Aggarwal and Stephens-Davidowitz (STOC 2018).
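For intuition about the construction: sparsification passes from \mathbb{Z}^n to a random sublattice cut out by a single linear congruence. A minimal sketch, in our own notation (the exact parameters and modifications in the paper differ):

% Random sparsification of the integer lattice: for a prime q and a
% uniformly random z \in \mathbb{Z}_q^n, keep only the integer points
% orthogonal to z modulo q.
\[
  \mathcal{L}_{z,q} = \{ x \in \mathbb{Z}^n : \langle z, x \rangle \equiv 0 \pmod{q} \},
\]

an index-q sublattice of \mathbb{Z}^n. Over the random choice of z, any fixed point x \in \mathbb{Z}^n with x \not\equiv 0 \pmod{q} survives with probability exactly 1/q, so one can thin out \mathbb{Z}^n while retaining, in expectation, a dense cluster of short vectors near a target.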
Joint work with Chris Peikert. This work appeared at CCC 2020 and is available at https://arxiv.org/abs/2003.07903.