
I know that the $\mathrm{LLL}$ algorithm can find a basis of reasonably short (though not necessarily shortest) vectors in polynomial time.

My question is: if we construct a lattice from $\textbf{A}$ and then run $\mathrm{LLL}$ on it, would that help us find a solution to the $\mathrm{SIS}$ problem, or at least a clue toward one (especially for small-dimension matrices)?

$\mathbb{Z}^{n}_{q}$: the set of $n$-dimensional vectors with entries modulo $q$ (for simplicity, say $q$ is prime and $n = m$).

$\textbf{Goal}$: find nontrivial short vector $z \in \mathbb{Z}^m$ such that:

$$\begin{pmatrix} & & \\ \cdots & \textbf{A} & \cdots \\ & & \end{pmatrix} \times \begin{pmatrix} \vdots \\ z \\ \vdots \end{pmatrix} = 0 \in \mathbb{Z}^{n}_{q}$$

Note that $\textbf{A}$ is an $n \times m$ matrix and $z$ is an $m \times 1$ column vector.
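For concreteness, here is one standard way to build a basis of the lattice $\Lambda_q^\perp(\textbf{A}) = \{z \in \mathbb{Z}^m : \textbf{A}z \equiv 0 \pmod q\}$ that one would then feed to a reduction algorithm. This is a sketch, not part of the question: it assumes $m > n$ (so the lattice has nontrivial short vectors) and that the first $n$ columns of $\textbf{A}$ are invertible mod $q$; all function names are my own.

```python
import numpy as np

def modinv_matrix(M, q):
    """Invert a square integer matrix modulo a prime q via Gauss-Jordan."""
    n = M.shape[0]
    A = np.hstack([M % q, np.eye(n, dtype=np.int64)]) % q
    for col in range(n):
        piv = next(r for r in range(col, n) if A[r, col] % q != 0)
        A[[col, piv]] = A[[piv, col]]                     # pivot swap
        A[col] = (A[col] * pow(int(A[col, col]), -1, q)) % q
        for r in range(n):
            if r != col:
                A[r] = (A[r] - A[r, col] * A[col]) % q    # eliminate
    return A[:, n:]

def sis_lattice_basis(A, q):
    """Columns form a basis of {z in Z^m : A z = 0 mod q}, assuming the
    first n columns of A are invertible mod q. Determinant is q^n."""
    n, m = A.shape
    A1, A2 = A[:, :n], A[:, n:]
    T = (-modinv_matrix(A1, q) @ A2) % q                  # n x (m-n)
    top = np.hstack([q * np.eye(n, dtype=np.int64), T])
    bot = np.hstack([np.zeros((m - n, n), dtype=np.int64),
                     np.eye(m - n, dtype=np.int64)])
    return np.vstack([top, bot])                          # m x m

# Toy instance (illustrative numbers only)
q = 97
A = np.array([[1, 2, 3, 4],
              [5, 6, 7, 8]]) % q
B = sis_lattice_basis(A, q)
assert np.all((A @ B) % q == 0)   # every basis column lies in the lattice
```

Every integer combination of the columns of `B` is a solution of $\textbf{A}z \equiv 0 \pmod q$, so a short vector in this lattice is exactly a SIS solution.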

I understand that $\mathrm{LLL}$ breaks the Merkle–Hellman knapsack cryptosystem, but that is a different problem from $\mathrm{SIS}$.

Patriot

1 Answer


As mentioned by 111 in the comments, solving SIS comes down to finding a short vector in the underlying lattice of some prescribed norm $\beta$ or less. Lattice basis reduction methods like LLL help in the sense that they can reduce a basis with long lattice vectors to a basis with shorter, more orthogonal lattice vectors.

If the reduction is strong enough (LLL, or BKZ with large enough block size) then you expect that the first basis vector of the reduced basis is a solution to SIS. (How strong your basis reduction should be to find a solution depends on $n, m, q$ and on the required length of the short lattice vector, $\beta$.)

So to answer your overall question: LLL can solve the easiest SIS instances (with large $\beta$), and can assist in solving harder SIS instances (with small $\beta$) by finding a shorter basis, which makes finding even shorter lattice vectors a bit easier.
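To make this concrete, here is a minimal textbook LLL sketch in pure Python, run on the $q$-ary lattice of a toy instance $\textbf{A} = (1\ 2\ 3)$, $q = 97$. It is deliberately naive (full Gram-Schmidt recomputation, `Fraction` arithmetic) and only suitable for toy sizes; in practice one would use a library such as fpylll. The basis rows $(97,0,0)$, $(95,1,0)$, $(94,0,1)$ all satisfy $z_1 + 2z_2 + 3z_3 \equiv 0 \pmod{97}$.

```python
from fractions import Fraction

def lll(basis, delta=Fraction(3, 4)):
    """Textbook LLL reduction; rows of `basis` are the lattice vectors."""
    b = [[Fraction(x) for x in v] for v in basis]
    n = len(b)

    def dot(u, v):
        return sum(x * y for x, y in zip(u, v))

    def gram_schmidt():
        bstar, mu = [], [[Fraction(0)] * n for _ in range(n)]
        for i in range(n):
            w = b[i][:]
            for j in range(i):
                mu[i][j] = dot(b[i], bstar[j]) / dot(bstar[j], bstar[j])
                w = [wi - mu[i][j] * bj for wi, bj in zip(w, bstar[j])]
            bstar.append(w)
        return bstar, mu

    bstar, mu = gram_schmidt()
    k = 1
    while k < n:
        for j in range(k - 1, -1, -1):           # size reduction
            r = round(mu[k][j])
            if r:
                b[k] = [x - r * y for x, y in zip(b[k], b[j])]
                bstar, mu = gram_schmidt()
        if dot(bstar[k], bstar[k]) >= (delta - mu[k][k - 1] ** 2) * dot(bstar[k - 1], bstar[k - 1]):
            k += 1
        else:                                    # Lovász condition failed: swap
            b[k - 1], b[k] = b[k], b[k - 1]
            bstar, mu = gram_schmidt()
            k = max(k - 1, 1)
    return [[int(x) for x in v] for v in b]

# q-ary lattice of A = (1 2 3) mod 97: rows span {z : z1 + 2 z2 + 3 z3 = 0 mod 97}
basis = [[97, 0, 0], [95, 1, 0], [94, 0, 1]]
red = lll(basis)
v = red[0]
assert (v[0] + 2 * v[1] + 3 * v[2]) % 97 == 0 and any(v)
```

Since LLL only performs integer row operations, every reduced row is still a lattice vector, i.e. still a SIS solution; the LLL guarantee $\|b_1\| \le 2^{(n-1)/2}\lambda_1$ then bounds how short the first one is.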

TMM