
I'm building a cloud storage application that will store files on my servers.

I want these files to be encrypted with AES in CTR mode, and the key will always be the same.

I'm wondering what would be the best way to derive an IV.

The encryption will happen on many servers, so I can't use sequential values (e.g. incrementing the first 8 bytes for every encrypted file), since I can't keep a counter in sync across servers.

The number of stored files can reach tens of billions.

What's the best practice in this scenario for deriving the IV (or maybe a key + IV)?

  1. Should I use only one key that never changes?
  2. Should I derive a new key for every encrypted file (from the master key)?

Edit:

If the key mustn't always be the same, would the following be secure:

  1. A new 256-bit key is randomly generated for every file.
  2. This key is encrypted with the master key in ECB mode (since it's high-entropy data, do we need CBC?).
  3. The encrypted file key is stored at the beginning of the file.
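
For concreteness, here is a rough sketch of what I mean in Python (using the cryptography package; the function name, the file layout, and the random IV are just illustration, since the IV handling is exactly what I'm asking about):

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_file(master_key: bytes, plaintext: bytes) -> bytes:
    file_key = os.urandom(32)                              # 1. fresh random 256-bit key per file
    wrap = Cipher(algorithms.AES(master_key), modes.ECB()).encryptor()
    wrapped_key = wrap.update(file_key) + wrap.finalize()  # 2. file key encrypted under the master key (ECB)
    iv = os.urandom(16)                                    # initial counter block for CTR
    enc = Cipher(algorithms.AES(file_key), modes.CTR(iv)).encryptor()
    return wrapped_key + iv + enc.update(plaintext) + enc.finalize()  # 3. wrapped key stored at the front
```
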
user28875

1 Answer


You may want to rethink your whole setup. As mentioned in the comments, there is a lot more to securely storing files than throwing encryption at them. In particular, you need to be concerned with authentication (CTR without authentication is malleable). I would also look into client-side encryption for anything that puts user data in the cloud.
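
To make the malleability point concrete, here is a small sketch in Python (using the cryptography package; the message and byte index are made up for the example) showing that flipping ciphertext bits in CTR mode flips exactly the corresponding plaintext bits:

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def aes_ctr(key: bytes, iv: bytes, data: bytes) -> bytes:
    # CTR encryption and decryption are the same XOR-with-keystream operation
    c = Cipher(algorithms.AES(key), modes.CTR(iv)).encryptor()
    return c.update(data) + c.finalize()

key, iv = os.urandom(32), os.urandom(16)
ct = aes_ctr(key, iv, b"pay alice $100")

# An attacker who knows the plaintext layout can flip bits without knowing the key:
tampered = bytearray(ct)
tampered[11] ^= ord("1") ^ ord("9")          # the byte under the '1' of "$100"

print(aes_ctr(key, iv, bytes(tampered)))     # b'pay alice $900'
```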

Still, there are probably some situations where just using CTR to encrypt a bunch of files is the right thing to do, so I will answer your questions below.


What's the best practice in this scenario for deriving the IV (or maybe a key + IV)?

Usually in CTR mode you have a nonce concatenated with a counter. If you want to support arbitrary file sizes, you probably want the counter to be 64 bits. However, that does not leave enough space for the nonce: random 64-bit nonces start colliding with significant probability after a few billion files. If e.g. your API restricts you to this setup, you would be better off with per-file keys.
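
To put numbers on that, a quick birthday-bound estimate in Python (approximate, but close enough for sizing):

```python
import math

def collision_probability(num_files: int, nonce_bits: int) -> float:
    # Birthday bound: p ~ 1 - exp(-n*(n-1) / (2 * 2^bits))
    return 1.0 - math.exp(-num_files * (num_files - 1) / (2 * 2 ** nonce_bits))

for n in (10**9, 4 * 10**9, 10**10, 5 * 10**10):
    print(f"{n:>12} files, 64-bit nonces: p(collision) ~ {collision_probability(n, 64):.2f}")
# roughly 0.03, 0.35, 0.93 and ~1.00 -- far too risky at tens of billions of files
```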

If you can use random 128-bit nonces, with e.g. the lower half incremented as the counter, that should be fine. You would also avoid per-file key setup, which is an advantage, especially with small files.
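
A minimal sketch of that approach with a single long-term key, again in Python with the cryptography package (which increments the whole 128-bit block as one big-endian counter, so in effect the low bits act as the counter):

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_with_single_key(key: bytes, plaintext: bytes) -> bytes:
    iv = os.urandom(16)   # fresh random 128-bit initial counter block per file
    enc = Cipher(algorithms.AES(key), modes.CTR(iv)).encryptor()
    return iv + enc.update(plaintext) + enc.finalize()   # store the IV in front of the ciphertext
```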

If the key mustn't always be the same, would the following be secure:

  1. A new 256-bit key is randomly generated for every file.
  2. This key is encrypted with the master key in ECB mode (since it's high-entropy data, do we need CBC?).
  3. The encrypted file key is stored at the beginning of the file.

ECB would be acceptable for encrypting uniformly random data like a key, but the concerns about authentication remain. Alternatively, you could derive the per-file key from the master key with a key derivation function; that way a random 128-bit salt per file is enough even with 256-bit keys.
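
A minimal sketch of that key-derivation alternative, using HKDF from the Python cryptography package (the info label and the file layout are my own choices, and it still lacks authentication):

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def encrypt_with_derived_key(master_key: bytes, plaintext: bytes) -> bytes:
    salt = os.urandom(16)                       # random 128-bit per-file salt, stored in the clear
    file_key = HKDF(algorithm=hashes.SHA256(), length=32,
                    salt=salt, info=b"per-file key").derive(master_key)
    iv = os.urandom(16)                         # initial counter block for CTR
    enc = Cipher(algorithms.AES(file_key), modes.CTR(iv)).encryptor()
    return salt + iv + enc.update(plaintext) + enc.finalize()
```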

otus