I've got two questions regarding the paper FHEW: Bootstrapping Homomorphic Encryption in less than a second.

First, the final error of a ciphertext after the refresh procedure is stated to follow a Gaussian of standard deviation:

$\beta = \sqrt{\dfrac{q^2}{Q^2}\left( \zeta^2 \cdot \dfrac{B_{r}^2}{12} \cdot nd_r\cdot \dfrac{q}{2} \cdot 2Nd' + \sigma^2Nd_{ks}\right) + \dfrac{\|\mathbf{s}\|^2+1}{12}}$

Then, after giving their chosen parameters they conclude that $\beta = 6.94$. I don't understand where this result comes from. Even if we ignore the (positive) first term in the square root, as $\|s\| \leq n/2$ (with $n=500$ in the parameters), the standard deviation $\beta$ should be way bigger than $6.94$... Where does this result come from?

Second, they then say that the probability of error per homomorphic NAND is $p = 1 - erf(r/\sqrt{2})$ where $r = \dfrac{q/8}{\sqrt{2}\beta}$. I know that $p$ is the probability to get a sample outside of the interval $[-r;r]$, but I don't get why they set $r$ that way. What does it represent?

Patriot
Binou

1 Answer


Apologies for seeing this question just now. For your first question, well, there seems to be a typo: I should have written $\|s\|^2 \leq n/2$, and then we have $\sqrt{(\|s\|^2 +1) /12 } = \sqrt{251/12} \approx 4.57$.
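As a sanity check, the corrected rounding-noise term can be evaluated numerically with the paper's parameter $n = 500$ (the variable names below are just for illustration):

```python
from math import sqrt

n = 500                 # LWE dimension from the paper's parameters
norm_s_sq = n // 2      # with the typo fixed: ||s||^2 <= n/2 = 250

# Rounding-noise contribution sqrt((||s||^2 + 1) / 12)
rounding_std = sqrt((norm_s_sq + 1) / 12)
print(rounding_std)     # ~4.57
```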

Now, we take two ciphertexts with LWE error of standard deviation $\beta$, and sum them. Assuming independence, the standard deviation of the sum of the errors is $\sqrt 2 \beta$. For correctness, we need this sum of errors to be less than $q/8$ (here, to save every last bit, we relax the naive condition stated in Lemma 7, and directly bound the sum $q/8 + |e_0 + e_1|$ by $q/4$ rather than bounding $e_0$ and $e_1$ separately).
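Putting this together numerically: the sum $e_0 + e_1$ has standard deviation $\sqrt{2}\beta$, so the bound $q/8$ corresponds to $r = \frac{q/8}{\sqrt{2}\beta}$ standard deviations, and the two-sided Gaussian tail beyond $r$ standard deviations is $1 - \mathrm{erf}(r/\sqrt 2)$. A quick check with $\beta = 6.94$ and $q = 512$ (the value of $q$ is taken from the FHEW parameter set; treat it as an assumption if your parameters differ):

```python
import math

q = 512        # ciphertext modulus (assumed FHEW parameter)
beta = 6.94    # std-dev of the refreshed error, from the paper

# The sum e_0 + e_1 of two independent errors has std-dev sqrt(2)*beta,
# so the correctness bound q/8 sits at r standard deviations:
r = (q / 8) / (math.sqrt(2) * beta)

# Probability that a Gaussian sample falls outside [-r, r] std-devs:
p = math.erfc(r / math.sqrt(2))   # = 1 - erf(r / sqrt(2))

print(r, p)
```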

Answer to the comment on decreasing the error by swapping $\textsf{KeySwitch}$ and $\textsf{HomNAND}$:

Just for the sake of this explanation, let me lie about the actual noise propagation, and assume that each of the operations $\textsf{HomNAND}$ (HN) and $\textsf{KeySwitch}$ (KS) just adds a constant amount of noise, denoted $h$ and $k$ respectively.

The "easy to describe" scheme computes $\textsf{HomNAND}(KS(X_1), KS(X_2))$, which leads to an error of the form $(x_1 + k) + (x_2 + k) + h = x_1 + x_2 + h + 2k$.

The "better but less easy to describe" scheme computes $KS(\textsf{HomNAND}(X_1, X_2))$, which leads to an error $(x_1 + x_2 + h) + k$. That is one KS less.

It turns out that even with the actual formula, we also gain a little bit on the final error with this trick.
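The toy additive model above can be checked directly. The constants $h$, $k$ and input noises $x_1$, $x_2$ below are the illustrative quantities from the explanation, not the paper's actual noise formula:

```python
# Toy model: HomNAND adds a constant h of noise, KeySwitch adds k.
def noise_easy(x1, x2, h, k):
    # HomNAND(KS(X1), KS(X2)): each input pays one KS before the NAND
    return (x1 + k) + (x2 + k) + h

def noise_better(x1, x2, h, k):
    # KS(HomNAND(X1, X2)): a single KS after the NAND
    return (x1 + x2 + h) + k

x1, x2, h, k = 1.0, 1.5, 3.0, 2.0
print(noise_easy(x1, x2, h, k) - noise_better(x1, x2, h, k))  # exactly k: one KS saved
```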

LeoDucas