HBASE-25263 Various improvements of column family encryption

This PR is a follow-up of HBASE-25181 (#2539), where several issues were
discussed on the PR:

1. Currently we use the PBKDF2WithHmacSHA1 key derivation algorithm to generate a
secret key for HFile / WAL file encryption, when the user defines a string
encryption key in the HBase shell. This algorithm is not secure enough and is
not allowed in certain environments (e.g. on FIPS-compliant clusters). We are
changing it to PBKDF2WithHmacSHA384. This will not break backward compatibility:
even tables created by the shell using the new algorithm will be able to load
(e.g. during bulk load / replication) HFiles serialized with a key generated
by the old algorithm, as the HFiles themselves already contain the key
necessary for their decryption.
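For illustration, the key derivation being swapped here can be sketched with plain JDK APIs (a sketch, not HBase code; class and method names are illustrative):

```java
import java.security.SecureRandom;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;

public class Pbkdf2Demo {
  // Derive a key of the requested length from a passphrase and a random salt.
  static byte[] deriveKey(String algorithm, char[] password, int keyLengthBytes) throws Exception {
    byte[] salt = new byte[keyLengthBytes];
    new SecureRandom().nextBytes(salt);
    // 10,000 iterations; note PBEKeySpec takes the key length in bits.
    PBEKeySpec spec = new PBEKeySpec(password, salt, 10000, keyLengthBytes * 8);
    return SecretKeyFactory.getInstance(algorithm).generateSecret(spec).getEncoded();
  }

  public static void main(String[] args) throws Exception {
    byte[] oldStyle = deriveKey("PBKDF2WithHmacSHA1", "secret".toCharArray(), 16);
    byte[] newStyle = deriveKey("PBKDF2WithHmacSHA384", "secret".toCharArray(), 16);
    // Only the underlying PRF changes; both yield a key of the requested length.
    System.out.println(oldStyle.length + " " + newStyle.length);
  }
}
```

Both algorithm names are standard JDK `SecretKeyFactory` algorithms, which is why the switch needs no new dependencies.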

Smaller issues fixed by this commit:

2. Improve the documentation, e.g. with the changes introduced by HBASE-25181
and with some points discussed on the HBASE-25263 Jira ticket.

3. In EncryptionUtil.createEncryptionContext, the various encryption config
checks should throw IllegalStateException instead of RuntimeException.

4. Test cases in TestEncryptionTest.java should be broken down into smaller
tests.

5. TestEncryptionDisabled.java should use ExpectedException JUnit rule to
validate exceptions.

closes #2676

Signed-off-by: Peter Somogyi <psomogyi@apache.org>
Author: Mate Szalay-Beko
Date: 2020-11-17 20:05:44 +01:00
Committer: Peter Somogyi
parent b142f5dcd2
commit 451a4b06b1
7 changed files with 175 additions and 80 deletions


@@ -192,6 +192,7 @@ public final class EncryptionUtil {
* @param family The current column descriptor.
* @return The created encryption context.
* @throws IOException if an encryption key for the column cannot be unwrapped
+ * @throws IllegalStateException in case of encryption related configuration errors
*/
public static Encryption.Context createEncryptionContext(Configuration conf,
ColumnFamilyDescriptor family) throws IOException {
@@ -199,7 +200,7 @@ public final class EncryptionUtil {
String cipherName = family.getEncryptionType();
if (cipherName != null) {
if(!Encryption.isEncryptionEnabled(conf)) {
-throw new RuntimeException("Encryption for family '" + family.getNameAsString()
+throw new IllegalStateException("Encryption for family '" + family.getNameAsString()
+ "' configured with type '" + cipherName + "' but the encryption feature is disabled");
}
Cipher cipher;
@@ -211,13 +212,13 @@ public final class EncryptionUtil {
// Use the algorithm the key wants
cipher = Encryption.getCipher(conf, key.getAlgorithm());
if (cipher == null) {
-throw new RuntimeException("Cipher '" + key.getAlgorithm() + "' is not available");
+throw new IllegalStateException("Cipher '" + key.getAlgorithm() + "' is not available");
}
// Fail if misconfigured
// We use the encryption type specified in the column schema as a sanity check on
// what the wrapped key is telling us
if (!cipher.getName().equalsIgnoreCase(cipherName)) {
-throw new RuntimeException("Encryption for family '" + family.getNameAsString()
+throw new IllegalStateException("Encryption for family '" + family.getNameAsString()
+ "' configured with type '" + cipherName + "' but key specifies algorithm '"
+ cipher.getName() + "'");
}
@@ -225,7 +226,7 @@ public final class EncryptionUtil {
// Family does not provide key material, create a random key
cipher = Encryption.getCipher(conf, cipherName);
if (cipher == null) {
-throw new RuntimeException("Cipher '" + cipherName + "' is not available");
+throw new IllegalStateException("Cipher '" + cipherName + "' is not available");
}
key = cipher.getRandomKey();
}


@@ -37,6 +37,7 @@ import org.apache.commons.io.IOUtils;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HConstants;
+import org.apache.hadoop.hbase.io.crypto.aes.AES;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.hbase.util.Pair;
import org.apache.hadoop.util.ReflectionUtils;
@@ -245,21 +246,11 @@ public final class Encryption {
*
*/
public static byte[] pbkdf128(String... args) {
-byte[] salt = new byte[128];
-Bytes.random(salt);
StringBuilder sb = new StringBuilder();
for (String s: args) {
sb.append(s);
}
-PBEKeySpec spec = new PBEKeySpec(sb.toString().toCharArray(), salt, 10000, 128);
-try {
-return SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1")
-.generateSecret(spec).getEncoded();
-} catch (NoSuchAlgorithmException e) {
-throw new RuntimeException(e);
-} catch (InvalidKeySpecException e) {
-throw new RuntimeException(e);
-}
+return generateSecretKey("PBKDF2WithHmacSHA1", AES.KEY_LENGTH, sb.toString().toCharArray());
}
/**
@@ -268,19 +259,69 @@ public final class Encryption {
*
*/
public static byte[] pbkdf128(byte[]... args) {
-byte[] salt = new byte[128];
-Bytes.random(salt);
StringBuilder sb = new StringBuilder();
for (byte[] b: args) {
sb.append(Arrays.toString(b));
}
-PBEKeySpec spec = new PBEKeySpec(sb.toString().toCharArray(), salt, 10000, 128);
+return generateSecretKey("PBKDF2WithHmacSHA1", AES.KEY_LENGTH, sb.toString().toCharArray());
}
/**
* Return a key derived from the concatenation of the supplied arguments using
* PBKDF2WithHmacSHA384 key derivation algorithm at 10,000 iterations.
*
* The length of the returned key is determined based on the need of the cypher algorithm.
* E.g. for the default "AES" we will need a 128 bit long key, while if the user is using
* a custom cipher, we might generate keys with other length.
*
* This key generation method is used currently e.g. in the HBase Shell (admin.rb) to generate a
* column family data encryption key, if the user provided an ENCRYPTION_KEY parameter.
*/
public static byte[] generateSecretKey(Configuration conf, String cypherAlg, String... args) {
StringBuilder sb = new StringBuilder();
for (String s: args) {
sb.append(s);
}
int keyLengthBytes = Encryption.getCipher(conf, cypherAlg).getKeyLength();
return generateSecretKey("PBKDF2WithHmacSHA384", keyLengthBytes, sb.toString().toCharArray());
}
/**
* Return a key derived from the concatenation of the supplied arguments using
* PBKDF2WithHmacSHA384 key derivation algorithm at 10,000 iterations.
*
* The length of the returned key is determined based on the need of the cypher algorithm.
* E.g. for the default "AES" we will need a 128 bit long key, while if the user is using
* a custom cipher, we might generate keys with other length.
*
* This key generation method is used currently e.g. in the HBase Shell (admin.rb) to generate a
* column family data encryption key, if the user provided an ENCRYPTION_KEY parameter.
*/
public static byte[] generateSecretKey(Configuration conf, String cypherAlg, byte[]... args) {
StringBuilder sb = new StringBuilder();
for (byte[] b: args) {
sb.append(Arrays.toString(b));
}
int keyLength = Encryption.getCipher(conf, cypherAlg).getKeyLength();
return generateSecretKey("PBKDF2WithHmacSHA384", keyLength, sb.toString().toCharArray());
}
/**
* Return a key (byte array) derived from the supplied password argument using the given
* algorithm with a random salt at 10,000 iterations.
*
* @param algorithm the secret key generation algorithm to use
* @param keyLengthBytes the length of the key to be derived (in bytes, not in bits)
* @param password char array to use as password for the key generation algorithm
* @return secret key encoded as a byte array
*/
private static byte[] generateSecretKey(String algorithm, int keyLengthBytes, char[] password) {
byte[] salt = new byte[keyLengthBytes];
Bytes.random(salt);
PBEKeySpec spec = new PBEKeySpec(password, salt, 10000, keyLengthBytes*8);
try {
-return SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1")
-.generateSecret(spec).getEncoded();
-} catch (NoSuchAlgorithmException e) {
-throw new RuntimeException(e);
-} catch (InvalidKeySpecException e) {
+return SecretKeyFactory.getInstance(algorithm).generateSecret(spec).getEncoded();
+} catch (NoSuchAlgorithmException | InvalidKeySpecException e) {
throw new RuntimeException(e);
}
}


@@ -66,7 +66,7 @@ public class EncryptionTest {
throw new IOException("Key provider " + providerClassName + " failed test: " +
e.getMessage(), e);
}
-} else if (result.booleanValue() == false) {
+} else if (!result) {
throw new IOException("Key provider " + providerClassName + " previously failed test");
}
}
@@ -91,7 +91,7 @@ public class EncryptionTest {
throw new IOException("Cipher provider " + providerClassName + " failed test: " +
e.getMessage(), e);
}
-} else if (result.booleanValue() == false) {
+} else if (!result) {
throw new IOException("Cipher provider " + providerClassName + " previously failed test");
}
}
@@ -154,7 +154,7 @@ public class EncryptionTest {
cipherResults.put(cipher, false);
throw new IOException("Cipher " + cipher + " failed test: " + e.getMessage(), e);
}
-} else if (result.booleanValue() == false) {
+} else if (!result) {
throw new IOException("Cipher " + cipher + " previously failed test");
}
}


@@ -17,9 +17,6 @@
*/
package org.apache.hadoop.hbase.regionserver;
-import static org.junit.Assert.assertTrue;
-import static org.junit.Assert.fail;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.DoNotRetryIOException;
import org.apache.hadoop.hbase.HBaseClassTestRule;
@@ -37,8 +34,10 @@ import org.apache.hadoop.hbase.util.TableDescriptorChecker;
import org.junit.AfterClass;
import org.junit.BeforeClass;
import org.junit.ClassRule;
+import org.junit.Rule;
import org.junit.Test;
import org.junit.experimental.categories.Category;
+import org.junit.rules.ExpectedException;
@Category({MasterTests.class, MediumTests.class})
public class TestEncryptionDisabled {
@@ -47,6 +46,9 @@ public class TestEncryptionDisabled {
public static final HBaseClassTestRule CLASS_RULE =
HBaseClassTestRule.forClass(TestEncryptionDisabled.class);
+@Rule
+public ExpectedException exception = ExpectedException.none();
private static final HBaseTestingUtility TEST_UTIL = new HBaseTestingUtility();
private static Configuration conf = TEST_UTIL.getConfiguration();
private static TableDescriptorBuilder tdb;
@@ -82,15 +84,9 @@ public class TestEncryptionDisabled {
tdb.setColumnFamily(columnFamilyDescriptorBuilder.build());
// Create the test table, we expect to get back an exception
-try {
-TEST_UTIL.getAdmin().createTable(tdb.build());
-} catch (DoNotRetryIOException e) {
-assertTrue(e.getMessage().contains("encryption is disabled on the cluster"));
-return;
-} catch (Exception e) {
-throw new RuntimeException("create table command failed for the wrong reason", e);
-}
-fail("create table command unexpectedly succeeded");
+exception.expect(DoNotRetryIOException.class);
+exception.expectMessage("encryption is disabled on the cluster");
+TEST_UTIL.getAdmin().createTable(tdb.build());
}
@Test


@@ -19,6 +19,7 @@ package org.apache.hadoop.hbase.util;
import static org.junit.Assert.fail;
import java.io.IOException;
import java.security.Key;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseClassTestRule;
@@ -44,39 +45,37 @@ public class TestEncryptionTest {
HBaseClassTestRule.forClass(TestEncryptionTest.class);
@Test
-public void testTestKeyProvider() {
+public void testTestKeyProvider() throws Exception {
Configuration conf = HBaseConfiguration.create();
-try {
-conf.set(HConstants.CRYPTO_KEYPROVIDER_CONF_KEY, KeyProviderForTesting.class.getName());
-EncryptionTest.testKeyProvider(conf);
-} catch (Exception e) {
-fail("Instantiation of test key provider should have passed");
-}
-try {
-conf.set(HConstants.CRYPTO_KEYPROVIDER_CONF_KEY, FailingKeyProvider.class.getName());
-EncryptionTest.testKeyProvider(conf);
-fail("Instantiation of bad test key provider should have failed check");
-} catch (Exception e) { }
+conf.set(HConstants.CRYPTO_KEYPROVIDER_CONF_KEY, KeyProviderForTesting.class.getName());
+EncryptionTest.testKeyProvider(conf);
}
@Test(expected = IOException.class)
public void testBadKeyProvider() throws Exception {
Configuration conf = HBaseConfiguration.create();
conf.set(HConstants.CRYPTO_KEYPROVIDER_CONF_KEY, FailingKeyProvider.class.getName());
EncryptionTest.testKeyProvider(conf);
fail("Instantiation of bad test key provider should have failed check");
}
@Test
-public void testTestCipherProvider() {
+public void testDefaultCipherProvider() throws Exception {
Configuration conf = HBaseConfiguration.create();
-try {
-conf.set(HConstants.CRYPTO_CIPHERPROVIDER_CONF_KEY, DefaultCipherProvider.class.getName());
-EncryptionTest.testCipherProvider(conf);
-} catch (Exception e) {
-fail("Instantiation of test cipher provider should have passed");
-}
-try {
-conf.set(HConstants.CRYPTO_CIPHERPROVIDER_CONF_KEY, FailingCipherProvider.class.getName());
-EncryptionTest.testCipherProvider(conf);
-fail("Instantiation of bad test cipher provider should have failed check");
-} catch (Exception e) { }
+conf.set(HConstants.CRYPTO_CIPHERPROVIDER_CONF_KEY, DefaultCipherProvider.class.getName());
+EncryptionTest.testCipherProvider(conf);
}
@Test(expected = IOException.class)
public void testBadCipherProvider() throws Exception {
Configuration conf = HBaseConfiguration.create();
conf.set(HConstants.CRYPTO_CIPHERPROVIDER_CONF_KEY, FailingCipherProvider.class.getName());
EncryptionTest.testCipherProvider(conf);
fail("Instantiation of bad test cipher provider should have failed check");
}
@Test
-public void testTestCipher() {
+public void testAESCipher() {
Configuration conf = HBaseConfiguration.create();
conf.set(HConstants.CRYPTO_KEYPROVIDER_CONF_KEY, KeyProviderForTesting.class.getName());
String algorithm =
@@ -86,14 +85,18 @@ public class TestEncryptionTest {
} catch (Exception e) {
fail("Test for cipher " + algorithm + " should have succeeded");
}
-try {
-EncryptionTest.testEncryption(conf, "foobar", null);
-fail("Test for bogus cipher should have failed");
-} catch (Exception e) { }
}
@Test(expected = IOException.class)
public void testUnknownCipher() throws Exception {
Configuration conf = HBaseConfiguration.create();
conf.set(HConstants.CRYPTO_KEYPROVIDER_CONF_KEY, KeyProviderForTesting.class.getName());
EncryptionTest.testEncryption(conf, "foobar", null);
fail("Test for bogus cipher should have failed");
}
@Test
-public void testTestEnabled() {
+public void testTestEnabledWithDefaultConfig() {
Configuration conf = HBaseConfiguration.create();
conf.set(HConstants.CRYPTO_KEYPROVIDER_CONF_KEY, KeyProviderForTesting.class.getName());
String algorithm =
@@ -104,7 +107,14 @@ public class TestEncryptionTest {
fail("Test for cipher " + algorithm + " should have succeeded, when " +
Encryption.CRYPTO_ENABLED_CONF_KEY + " is not set");
}
}
@Test
public void testTestEnabledWhenCryptoIsExplicitlyEnabled() {
Configuration conf = HBaseConfiguration.create();
conf.set(HConstants.CRYPTO_KEYPROVIDER_CONF_KEY, KeyProviderForTesting.class.getName());
String algorithm =
conf.get(HConstants.CRYPTO_KEY_ALGORITHM_CONF_KEY, HConstants.CIPHER_AES);
conf.setBoolean(Encryption.CRYPTO_ENABLED_CONF_KEY, true);
try {
EncryptionTest.testEncryption(conf, algorithm, null);
@@ -112,15 +122,19 @@ public class TestEncryptionTest {
fail("Test for cipher " + algorithm + " should have succeeded, when " +
Encryption.CRYPTO_ENABLED_CONF_KEY + " is set to true");
}
-conf.setBoolean(Encryption.CRYPTO_ENABLED_CONF_KEY, false);
-try {
-EncryptionTest.testEncryption(conf, algorithm, null);
-fail("Test for cipher " + algorithm + " should have failed, when " +
-Encryption.CRYPTO_ENABLED_CONF_KEY + " is set to false");
-} catch (Exception e) { }
}
@Test(expected = IOException.class)
public void testTestEnabledWhenCryptoIsExplicitlyDisabled() throws Exception {
Configuration conf = HBaseConfiguration.create();
conf.set(HConstants.CRYPTO_KEYPROVIDER_CONF_KEY, KeyProviderForTesting.class.getName());
String algorithm =
conf.get(HConstants.CRYPTO_KEY_ALGORITHM_CONF_KEY, HConstants.CIPHER_AES);
conf.setBoolean(Encryption.CRYPTO_ENABLED_CONF_KEY, false);
EncryptionTest.testEncryption(conf, algorithm, null);
}
public static class FailingKeyProvider implements KeyProvider {
@Override


@@ -1137,8 +1137,8 @@ module Hbase
algorithm = arg.delete(ColumnFamilyDescriptorBuilder::ENCRYPTION).upcase
cfdb.setEncryptionType(algorithm)
if arg.include?(ColumnFamilyDescriptorBuilder::ENCRYPTION_KEY)
-key = org.apache.hadoop.hbase.io.crypto.Encryption.pbkdf128(
-arg.delete(ColumnFamilyDescriptorBuilder::ENCRYPTION_KEY)
+key = org.apache.hadoop.hbase.io.crypto.Encryption.generateSecretKey(
+@conf, algorithm, arg.delete(ColumnFamilyDescriptorBuilder::ENCRYPTION_KEY)
)
cfdb.setEncryptionKey(org.apache.hadoop.hbase.security.EncryptionUtil.wrapKey(@conf, key,
algorithm))
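This shell code path runs when a user supplies a string key when creating or altering a table; for example (a sketch — table and family names are illustrative):

```
hbase> create 'mytable', {NAME => 'cf', ENCRYPTION => 'AES', ENCRYPTION_KEY => 'strongsecret'}
```

With this change, the passphrase is stretched with PBKDF2WithHmacSHA384 instead of PBKDF2WithHmacSHA1 before being wrapped and stored in the column descriptor.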


@@ -1585,6 +1585,23 @@ It is also possible to encrypt the WAL.
Even though WALs are transient, it is necessary to encrypt the WALEdits to avoid circumventing HFile protections for encrypted column families, in the event that the underlying filesystem is compromised.
When WAL encryption is enabled, all WALs are encrypted, regardless of whether the relevant HFiles are encrypted.
==== Enable or disable the feature.
The "Transparent Encryption of Data At Rest" feature is enabled by default, meaning the users can
define tables with column families where the HFiles and WAL files will be encrypted by HBase,
assuming the feature is properly configured (see <<hbase.encryption.server.configuration>>).
In some cases (e.g. due to custom security policies), the operator of the HBase cluster might wish
to only rely on an encryption at rest mechanism outside of HBase (e.g. those offered by HDFS) and
wants to ensure that HBase's encryption at rest system is inactive. Since
link:https://issues.apache.org/jira/browse/HBASE-25181[HBASE-25181] it is possible to explicitly
disable HBase's own encryption by setting `hbase.crypto.enabled` to `false`. This configuration is
`true` by default. If it is set to `false`, the users won't be able to create any table
(column family) with HFile and WAL file encryption and the related create table shell (or API)
commands will fail if they try.
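For example, an operator who relies solely on filesystem-level encryption can keep HBase's own encryption at rest inactive by adding the following to _hbase-site.xml_ (this simply makes the default explicit):
+
[source,xml]
----
<property>
  <name>hbase.crypto.enabled</name>
  <value>false</value>
</property>
----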
[[hbase.encryption.server.configuration]]
==== Server-Side Configuration
This procedure assumes you are using the default Java keystore implementation.
@@ -1687,6 +1704,25 @@ You can include these in the HMaster's _hbase-site.xml_ as well, but the HMaster
</property>
----
. (Optional) Configure encryption key hash algorithm.
+
Since link:https://issues.apache.org/jira/browse/HBASE-25181[HBASE-25181] it is possible to use
custom encryption key hash algorithm instead of the default MD5 algorithm. This hash is needed to
verify the secret key during decryption. The MD5 algorithm is considered weak, and can not be used
in some (e.g. FIPS compliant) clusters.
+
The hash is set via the configuration option `hbase.crypto.key.hash.algorithm`. It should be set to
a JDK `MessageDigest` algorithm like "MD5", "SHA-384" or "SHA-512". The default is "MD5" for
backward compatibility. An example of this configuration parameter on a FIPS-compliant cluster:
+
[source,xml]
----
<property>
<name>hbase.crypto.key.hash.algorithm</name>
<value>SHA-384</value>
</property>
----
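The configured algorithm names a standard JDK `MessageDigest`; a minimal sketch of such a key hash, using plain JDK APIs rather than HBase internals (the key bytes are illustrative):

```java
import java.security.MessageDigest;

public class KeyHashDemo {
  public static void main(String[] args) throws Exception {
    byte[] key = new byte[16]; // example 128-bit key material
    // MD5 produces a 16-byte digest; SHA-384 a 48-byte one.
    byte[] md5 = MessageDigest.getInstance("MD5").digest(key);
    byte[] sha384 = MessageDigest.getInstance("SHA-384").digest(key);
    System.out.println(md5.length + " " + sha384.length);
  }
}
```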
. Configure permissions on the _hbase-site.xml_ file.
+
Because the keystore password is stored in the hbase-site.xml, you need to ensure that only the HBase user can read the _hbase-site.xml_ file, using file ownership and permissions.
@@ -1713,12 +1749,20 @@ Refer to the official API for usage instructions.
Enable Encryption on a Column Family::
To enable encryption on a column family, you can either use HBase Shell or the Java API.
After enabling encryption, trigger a major compaction.
-When the major compaction completes, the HFiles will be encrypted.
+When the major compaction completes, the compacted new HFiles will be encrypted.
However, depending on the compaction settings, it is possible that not all the HFiles will be
rewritten during a major compaction and there still might remain some old unencrypted HFiles.
Also please note, that the snapshots are immutable. So the snapshots taken before you enabled the
encryption will still contain the unencrypted HFiles.
Rotate the Data Key::
To rotate the data key, first change the ColumnFamily key in the column descriptor, then trigger a major compaction.
-When compaction is complete, all HFiles will be re-encrypted using the new data key.
Until the compaction completes, the old HFiles will still be readable using the old key.
+During compaction, the compacted HFiles will be re-encrypted using the new data key.
However, depending on the compaction settings, it is possible that not all the HFiles will be
rewritten during a major compaction and there still might remain some old HFiles encrypted with the old key.
Also please note, that the snapshots are immutable. So the snapshots taken before the changing of
the encryption key will still contain the HFiles written using the old key.
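In shell terms, the rotation described above amounts to (names illustrative):

```
hbase> alter 'mytable', {NAME => 'cf', ENCRYPTION_KEY => 'newsecret'}
hbase> major_compact 'mytable'
```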
Switching Between Using a Random Data Key and Specifying A Key::
If you configured a column family to use a specific key and you want to return to the default behavior of using a randomly-generated key for that column family, use the Java API to alter the `HColumnDescriptor` so that no value is sent with the key `ENCRYPTION_KEY`.
@@ -1728,7 +1772,6 @@ Rotate the Master Key::
Then update the KeyStore to contain a new master key, and keep the old master key in the KeyStore using a different alias.
Next, configure fallback to the old master key in the _hbase-site.xml_ file.
-::
[[hbase.secure.bulkload]]
=== Secure Bulk Load