I'm using this code in Java to generate the ciphertext using the plain AES algorithm:
import java.security.GeneralSecurityException;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public String encrypt(String message, String key) throws GeneralSecurityException {
    SecretKeySpec skeySpec = new SecretKeySpec(HexUtil.HexfromString(key), "AES");
    // With the default SunJCE provider, "AES" resolves to "AES/ECB/PKCS5Padding"
    Cipher cipher = Cipher.getInstance("AES");
    cipher.init(Cipher.ENCRYPT_MODE, skeySpec); // 1 is the value of Cipher.ENCRYPT_MODE
    // message.getBytes() uses the platform default charset
    byte[] encstr = cipher.doFinal(message.getBytes());
    return HexUtil.HextoString(encstr);
}
And the function HexfromString is:
public static byte[] HexfromString(String s) {
    int i = s.length();
    byte[] byte0 = new byte[(i + 1) / 2];
    int j = 0;
    int k = 0;
    // Odd-length input: the first digit becomes a byte on its own (low nibble)
    if (i % 2 == 1)
        byte0[k++] = (byte) HexfromDigit(s.charAt(j++));
    // Remaining digits are consumed in pairs: high nibble, then low nibble
    while (j < i) {
        int v1 = HexfromDigit(s.charAt(j++)) << 4;
        int v2 = HexfromDigit(s.charAt(j++));
        byte0[k++] = (byte) (v1 | v2);
    }
    return byte0;
}
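For reference, Go's encoding/hex package provides the same decoding. One behavioral difference worth noting: hex.DecodeString returns an error for odd-length input instead of treating the first digit as a lone byte the way HexfromString does. A minimal sketch, using a made-up key string:

package main

import (
	"encoding/hex"
	"fmt"
)

func main() {
	// Hypothetical 32-hex-character string, i.e. a 16-byte AES-128 key
	key, err := hex.DecodeString("000102030405060708090a0b0c0d0e0f")
	if err != nil {
		panic(err) // odd-length or non-hex input lands here
	}
	fmt.Println(len(key)) // 16
}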
I wrote the following code in Go to mimic the result above:
import (
	"crypto/aes"
	"encoding/hex"
)

func EncryptAES(secretKey string, plaintext string) string {
	key, err := hex.DecodeString(secretKey) // DecodeString returns ([]byte, error)
	CheckError(err)
	c, err := aes.NewCipher(key)
	CheckError(err)
	// Encrypt handles one 16-byte block only: no padding, and it
	// panics if plaintext is shorter than aes.BlockSize
	out := make([]byte, len(plaintext))
	c.Encrypt(out, []byte(plaintext))
	return hex.EncodeToString(out)
}
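One difference between the two snippets is the mode and padding: with the default SunJCE provider, Java's Cipher.getInstance("AES") means "AES/ECB/PKCS5Padding", so the message is padded to a multiple of 16 bytes and each block is encrypted independently, while the Go code above encrypts a single raw block with no padding. A sketch of what mirroring Java's default could look like in Go; pkcs5Pad and encryptECB are helpers written for this sketch, not standard-library functions, and ECB appears here only to reproduce the Java output, not as a mode to choose for real data:

import (
	"bytes"
	"crypto/aes"
)

// pkcs5Pad appends N bytes of value N so the length becomes a
// multiple of blockSize (PKCS#5/PKCS#7 padding)
func pkcs5Pad(data []byte, blockSize int) []byte {
	n := blockSize - len(data)%blockSize
	return append(data, bytes.Repeat([]byte{byte(n)}, n)...)
}

// encryptECB pads the plaintext and encrypts each 16-byte block
// independently, mirroring Java's "AES/ECB/PKCS5Padding"
func encryptECB(key, plaintext []byte) ([]byte, error) {
	c, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	padded := pkcs5Pad(plaintext, aes.BlockSize)
	out := make([]byte, len(padded))
	for i := 0; i < len(padded); i += aes.BlockSize {
		c.Encrypt(out[i:i+aes.BlockSize], padded[i:i+aes.BlockSize])
	}
	return out, nil
}

Hex-encoding the result should then line up with the Java output for the same key, as long as the message bytes also match (message.getBytes() in Java uses the platform default charset, while []byte(plaintext) in Go is always UTF-8).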
But the issue is that the []byte key returned from hex.DecodeString() holds unsigned values (Go's byte is an alias for uint8), whereas in Java the byte values are signed. And obviously, the encrypted results are also different, even though every input is the same.
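For comparison, the signed and unsigned views describe the same eight bits; only the printed decimal value differs, and the hex encoding is unaffected. A quick illustration:

package main

import "fmt"

func main() {
	b := byte(0xAB)
	fmt.Println(b)          // 171: Go's unsigned view of the bits
	fmt.Println(int8(b))    // -85: the same bits read as Java's signed byte
	fmt.Printf("%02x\n", b) // ab: the hex encoding is identical either way
}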