
In general (if it is possible to determine), which would make an output harder to turn back into plain text with a computer?

  • Extreme Length (minor complexity change)
  • Extreme algorithmic complexity (minor length change)

(Assuming these patterns are hidden, of course.)

A possible simple example for length: take a sentence and turn each character into numerals via its Unicode identifier (like A = 0041), translate that back into words, i.e. zero zero four one, and for good measure turn each character of those words back into Unicode.

Using the letter A as an example:

=0041
=Zerozerofourone
=005A00200065002000720020006F0020007A00200065002000720020006F002000660020006F00200075002000720020006F0020006E00200065

So, in this example each character becomes extremely long, and a couple more iterations would cause it to become even longer.
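A minimal Python sketch of that expansion (my own rendering of the process described above; it omits the space separators used in the worked example, so the exact output differs, but the growth behaviour is the same):

```python
# Sketch of the hypothetical expansion scheme: replace each character
# with its 4-hex-digit Unicode code point, spell the digits out as
# words, and repeat. Each round multiplies the length.

DIGIT_WORDS = {
    "0": "zero", "1": "one", "2": "two", "3": "three", "4": "four",
    "5": "five", "6": "six", "7": "seven", "8": "eight", "9": "nine",
}

def to_codepoints(text: str) -> str:
    """Replace each character with its 4-hex-digit Unicode code point."""
    return "".join(f"{ord(c):04X}" for c in text)

def spell_out(code: str) -> str:
    """Spell each digit as an English word (hex letters A-F kept as-is)."""
    return "".join(DIGIT_WORDS.get(ch, ch) for ch in code)

def expand(text: str, rounds: int) -> str:
    """Apply codepoints-then-words repeatedly; length grows each round."""
    for _ in range(rounds):
        text = spell_out(to_codepoints(text))
    return text

for r in range(3):
    print(r, len(expand("A", rounds=r)))  # 1, 15, then far longer
```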

I do not have an algorithm to use as an opposing example, but assume the output is the same length in characters as the input, while the way the characters are encrypted is complex (not necessarily duplicating, a large base of characters to choose from, etc.).
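For illustration, a length-preserving transformation of this kind could look like the following toy stream-cipher-style XOR (my own example, not a vetted construction, and insecure if the key is ever reused): the output is exactly as long as the input, and all of the difficulty lives in the keyed keystream.

```python
# Toy length-preserving encryption: XOR the plaintext with a keystream
# derived from the key. Ciphertext length == plaintext length.
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudorandom bytes by hashing key || counter."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """XOR with the keystream; decryption is the same operation."""
    ks = keystream(key, len(plaintext))
    return bytes(p ^ k for p, k in zip(plaintext, ks))

msg = b"attack at dawn"
ct = encrypt(b"toy key", msg)
assert len(ct) == len(msg)              # same length in and out
assert encrypt(b"toy key", ct) == msg   # XOR is its own inverse
```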

I hope this hasn't been too cryptic.

Mike Edward Moras
No Time

1 Answer


Basically, "length" increases the time a brute force or other generic attack takes, while "complexity" makes it more difficult or less likely to find attacks faster than those.

However:

  1. The length in question varies depending on the algorithm or operation. It can be the length of keys, the block size of a block cipher, the state size of a PRG or the output length of a MAC or hash function.

    Typically a string is not encrypted into a longer ciphertext (except for overhead such as an IV or padding). You could do that, but there is no real reason to: the number of possible ways to encrypt an $n$-bit plaintext as an $n$-bit ciphertext is already so large for $n \ge 128$ that there is no need to expand it (a rough count is sketched after this list).

  2. The complexity in question isn't ordinary algorithmic complexity (Kolmogorov complexity). For example, many cryptographic algorithms must be non-linear to be secure. No matter how complex you make a linear algorithm, it will lose to a simpler non-linear algorithm in such cases (a toy demonstration follows this list).

    Instead, it is computational complexity. For an encryption algorithm to be secure, decryption must be infeasible for computationally bounded adversaries without the key. The algorithm itself can be fairly simple (e.g. modular exponentiation in ElGamal, sketched below) as long as there is no way to reverse it without the key.
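To put a rough number on the count in point 1 (a Stirling-approximation estimate of mine, not a figure from the question): a fixed key of an $n$-bit block cipher selects one permutation of $\{0,1\}^n$, and the number of such permutations is $(2^n)!$. By Stirling,

$$\log_2\big((2^n)!\big) \approx 2^n\,(n - \log_2 e), \qquad \text{so for } n = 128 \text{ this is roughly } 2^{135} \text{ bits},$$

vastly more than any practical key length, so there is no shortage of $n$-bit-to-$n$-bit mappings for a cipher to hide among.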
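To illustrate the non-linearity remark in point 2 with a toy of my own (the names `E`, `M`, `c` are mine, and this is not a real cipher): any purely linear map over GF(2), however elaborate its internal description, is pinned down exactly by $n + 1$ chosen plaintexts.

```python
# Any function of the form E(x) = M*x XOR c over bits is fully
# recovered from a handful of chosen plaintexts. Here E is a toy
# 8-bit linear "cipher" with a secret 8x8 bit matrix M and constant c.
import random

N = 8  # block size in bits

def rand_linear():
    """Secret linear layer: 8x8 bit matrix (rows as bitmasks) plus a constant."""
    M = [random.randrange(256) for _ in range(N)]  # row i as a bitmask
    c = random.randrange(256)
    def E(x: int) -> int:
        y = 0
        for i in range(N):
            # bit i of the output = parity of (row i AND x)
            y |= (bin(M[i] & x).count("1") & 1) << i
        return y ^ c
    return E

E = rand_linear()

# Attack: N + 1 chosen plaintexts determine E completely.
c = E(0)                                   # E(0) = c
cols = [E(1 << j) ^ c for j in range(N)]   # image of each basis vector

def E_recovered(x: int) -> int:
    y = 0
    for j in range(N):
        if (x >> j) & 1:
            y ^= cols[j]                   # linearity: XOR of columns
    return y ^ c

assert all(E(x) == E_recovered(x) for x in range(256))
```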
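And to show how simple the ElGamal operation in point 2 really is, here is textbook ElGamal with toy parameters of my choosing (far too small for real use, where the group must be large and properly chosen): encryption is just two modular exponentiations, yet decrypting without $x$ requires solving a discrete logarithm.

```python
# Textbook ElGamal with toy parameters -- illustration only.
import random

p = 2_147_483_647          # toy prime (2^31 - 1); far too small in practice
g = 7                      # toy base, assumed for illustration

x = random.randrange(2, p - 1)   # private key
h = pow(g, x, p)                 # public key h = g^x mod p

def elgamal_encrypt(m: int) -> tuple[int, int]:
    """Encrypt m < p: two modular exponentiations."""
    y = random.randrange(2, p - 1)           # fresh per-message randomness
    return pow(g, y, p), (m * pow(h, y, p)) % p

def elgamal_decrypt(c1: int, c2: int) -> int:
    """Recover m = c2 * c1^(-x) using the private key x."""
    return (c2 * pow(c1, p - 1 - x, p)) % p  # c1^(-x) via Fermat's little theorem

c1, c2 = elgamal_encrypt(123456)
assert elgamal_decrypt(c1, c2) == 123456
```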

otus