
Impact of the modulus switching technique on some attacks against learning problems

journal contribution
posted on 2024-11-15, 22:30 authored by Quoc Huy Le, Pradeep Mishra, Satoshi Nakamura, Koha Kinjo, Steven Duong, Masaya Yasuda
© The Institution of Engineering and Technology 2019. The modulus switching technique has been used in some cryptographic applications as well as in cryptanalysis. For cryptanalysis against the learning with errors (LWE) problem and the learning with rounding (LWR) problem, it is not clear whether the technique is really useful. This work supplies a complete view of the impact of this technique on the decoding attack, the dual attack and the primal attack against both LWE and LWR. For each attack, the authors give the optimal formula for the switching modulus. The formulas involve the number of LWE/LWR samples, which differs from the known formulas in the literature. They also obtain the corresponding sufficient conditions indicating when one should utilise the technique. Surprisingly, for LWE/LWR problems in which the secret vector is much shorter than the error vector, they also show that performing modulus switching before the so-called rescaling technique in the dual attack and the primal attack makes these attacks worse than exploiting the rescaling technique alone, as reported by Bai and Galbraith at the Australasian Conference on Information Security and Privacy (ACISP) 2014. As an application, they theoretically assess the influence of modulus switching on the LWE/LWR-based second-round NIST PQC submissions.
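For context, the modulus switching referred to above is, in its standard form, the following map on an LWE sample; this is a generic sketch for illustration and is not quoted from the paper. Given a sample $(\mathbf{A},\ \mathbf{b} \equiv \mathbf{A}\mathbf{s} + \mathbf{e} \pmod{q})$ and a target modulus $p < q$, one forms
\[
\Big(\big\lfloor \tfrac{p}{q}\,\mathbf{A} \big\rceil,\ \big\lfloor \tfrac{p}{q}\,\mathbf{b} \big\rceil\Big) \pmod{p},
\]
which is again an LWE sample modulo $p$ whose error is approximately $\tfrac{p}{q}\,\mathbf{e}$ plus a rounding term whose size grows with $\lVert \mathbf{s} \rVert$. Choosing the switching modulus $p$, as a function of the attack and the number of available samples, is the optimisation question the paper addresses.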

History

Citation

Le, Q., Mishra, P., Nakamura, S., Kinjo, K., Duong, H. & Yasuda, M. (2020). Impact of the modulus switching technique on some attacks against learning problems. IET Information Security, 14 (3), 286-303.

Language

English

RIS ID

142776
