HBASE-15244 More doc around native lib setup and check and crc
[[hadoop.native.lib]]
=== Making use of Hadoop Native Libraries in HBase

The Hadoop shared library has a bunch of facilities, including compression libraries and fast CRC'ing -- hardware CRC'ing if your chipset supports it.
To make this facility available to HBase, do the following. HBase/Hadoop will fall back to using alternatives if it cannot find the native library
versions -- or fail outright if you are asking for an explicit compressor and there is no alternative available.

First make sure of your Hadoop install. Fix this message if you see it when starting Hadoop processes:
----
16/02/09 22:40:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
----
It means Hadoop is not properly pointing at its native libraries, or the native libs were compiled for another platform.
Fix this first.
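One way to check the Hadoop side is to ask Hadoop itself whether its native library loads. This is a sketch, not a prescription: `hadoop checknative` is available in Hadoop 2.x and later, and the `HADOOP_HOME` default below is an assumption -- point it at your own install.

```shell
# Sketch: ask Hadoop whether libhadoop and the compression codecs load.
# The HADOOP_HOME default is an assumption -- adjust for your install.
HADOOP_HOME=${HADOOP_HOME:-/usr/local/hadoop}
if [ -x "$HADOOP_HOME/bin/hadoop" ]; then
  "$HADOOP_HOME/bin/hadoop" checknative -a
else
  echo "no hadoop launcher at $HADOOP_HOME/bin/hadoop -- adjust HADOOP_HOME"
fi
```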

Then if you see the following in your HBase logs, you know that HBase was unable to locate the Hadoop native libraries:
[source]
----
2014-08-07 09:26:20,139 WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
----
If the libraries loaded successfully, the WARN message does not show. Usually this means you are good to go but read on.

Let's presume your Hadoop shipped with a native library that suits the platform you are running HBase on.
To check if the Hadoop native library is available to HBase, run the following tool (available in Hadoop 2.1 and greater):
[source]
----
...
bzip2: false
----
The above shows that the native hadoop library is not available in the HBase context.

The above NativeLibraryChecker tool may come back saying all is hunky-dory
-- i.e. all libs show 'true', that they are available -- but follow the below
prescription anyway to ensure the native libs are available in the HBase context
when it goes to use them.
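To run that same checker in the HBase context, launch it through the `hbase` script so it executes with HBase's classpath and environment. A sketch, assuming `HBASE_HOME` points at your install:

```shell
# Sketch: run Hadoop's NativeLibraryChecker via the hbase launcher so it
# reports what HBase itself would see. HBASE_HOME default is an assumption.
HBASE_HOME=${HBASE_HOME:-/usr/local/hbase}
if [ -x "$HBASE_HOME/bin/hbase" ]; then
  "$HBASE_HOME/bin/hbase" org.apache.hadoop.util.NativeLibraryChecker
else
  echo "no hbase launcher at $HBASE_HOME/bin/hbase -- adjust HBASE_HOME"
fi
```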

To fix the above, either copy the Hadoop native libraries local or symlink to them if the Hadoop and HBase installs are adjacent in the filesystem.
You could also point at their location by setting the `LD_LIBRARY_PATH` environment variable in your hbase-env.sh.

Where the JVM looks to find native libraries is "system dependent" (see `java.lang.System#loadLibrary(name)`). On linux, by default, it is going to look in _lib/native/PLATFORM_ where `PLATFORM` is the label for the platform your HBase is installed on.
On a local linux machine, it seems to be the concatenation of the java properties `os.name` and `os.arch` followed by whether 32 or 64 bit.
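As a rough shell approximation of that PLATFORM label (a sketch, not the JVM's exact logic; it assumes a 64-bit JVM and that `uname` mirrors `os.name`/`os.arch`, with x86_64 reported as amd64):

```shell
# Approximate the PLATFORM label: os.name + "-" + os.arch + "-" + data model.
# Assumes a 64-bit JVM; the JVM reports x86_64 as amd64.
OS_NAME=$(uname -s)
OS_ARCH=$(uname -m | sed 's/^x86_64$/amd64/')
PLATFORM="${OS_NAME}-${OS_ARCH}-64"
echo "$PLATFORM"
```

On a 64-bit x86 Linux box this prints `Linux-amd64-64`, matching the directory name used below.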
For example:
[source]
----
...
----
So in this case, the PLATFORM string is `Linux-amd64-64`.
Copying the Hadoop native libraries or symlinking at _lib/native/Linux-amd64-64_ will ensure they are found.
Check with the Hadoop _NativeLibraryChecker_.
Do a rolling restart after you have made this change.

Here is an example of how you would set up the symlinks.
Let the hadoop and hbase installs be in your home directory. Assume your hadoop native libs
are at ~/hadoop/lib/native. Assume you are on a Linux-amd64-64 platform. In this case,
you would do the following to link the hadoop native lib so hbase could find them.
----
...
$ mkdir -p ~/hbase/lib/native
$ cd ~/hbase/lib/native/
$ ln -s ~/hadoop/lib/native Linux-amd64-64
$ ls -la
# Linux-amd64-64 -> /home/USER/hadoop/lib/native
...
----

If you see PureJavaCrc32C in a stack trace or if you see something like the below in a perf trace, then native is not working; you are using the java CRC functions rather than native:
----
5.02% perf-53601.map [.] Lorg/apache/hadoop/util/PureJavaCrc32C;.update
----
See link:https://issues.apache.org/jira/browse/HBASE-11927[HBASE-11927 Use Native Hadoop Library for HFile checksum (And flip default from CRC32 to CRC32C)]
for more on native checksumming support. See in particular the release note for how to check whether your processor has support for hardware CRCs.
Or check out the Apache link:https://blogs.apache.org/hbase/entry/saving_cpu_using_native_hadoop[Checksums in HBase] blog post.

Here is an example of how to point at the Hadoop libs with the `LD_LIBRARY_PATH` environment variable:
[source]
----
...
----
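As a minimal sketch of such an hbase-env.sh entry (the /usr/local/hadoop path is an assumption -- substitute the location of your own Hadoop native libs):

```shell
# Sketch for conf/hbase-env.sh: have the JVM search the Hadoop native lib dir.
# The /usr/local/hadoop path is an assumption -- use your Hadoop install.
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/local/hadoop/lib/native"
```

Restart the HBase processes after editing hbase-env.sh so the new environment takes effect.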