HDFS-4852. Merging change r1619967 from trunk to branch-2.

git-svn-id: https://svn.apache.org/repos/asf/hadoop/common/branches/branch-2@1619968 13f79535-47bb-0310-9956-ffa450edef68
Chris Nauroth 2014-08-23 05:30:59 +00:00
parent 33229c1299
commit b2d86ebf78
2 changed files with 20 additions and 11 deletions


@@ -285,6 +285,8 @@ Release 2.6.0 - UNRELEASED
    HDFS-6829. DFSAdmin refreshSuperUserGroupsConfiguration failed in
    security cluster (zhaoyunjiong via Arpit Agarwal)

+   HDFS-4852. libhdfs documentation is out of date. (cnauroth)

Release 2.5.0 - 2014-08-11

  INCOMPATIBLE CHANGES


@@ -26,14 +26,17 @@ C API libhdfs
  (HDFS). It provides C APIs to a subset of the HDFS APIs to manipulate
  HDFS files and the filesystem. libhdfs is part of the Hadoop
  distribution and comes pre-compiled in
- <<<${HADOOP_PREFIX}/libhdfs/libhdfs.so>>> .
+ <<<${HADOOP_HDFS_HOME}/lib/native/libhdfs.so>>> . libhdfs is compatible with
+ Windows and can be built on Windows by running <<<mvn compile>>> within the
+ <<<hadoop-hdfs-project/hadoop-hdfs>>> directory of the source tree.

* The APIs

- The libhdfs APIs are a subset of: {{{hadoop fs APIs}}}.
+ The libhdfs APIs are a subset of the
+ {{{../../api/org/apache/hadoop/fs/FileSystem.html}Hadoop FileSystem APIs}}.

  The header file for libhdfs describes each API in detail and is
- available in <<<${HADOOP_PREFIX}/src/c++/libhdfs/hdfs.h>>>
+ available in <<<${HADOOP_HDFS_HOME}/include/hdfs.h>>>.

* A Sample Program
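A minimal libhdfs client in the spirit of the documentation's sample program is sketched below for context. It is only an illustration: the NameNode name (<<<"default">>>), the file path, and the message are placeholders rather than text taken from the actual sample. It compiles against <<<${HADOOP_HDFS_HOME}/include/hdfs.h>>> and links with <<<-lhdfs>>>, as described in the next hunk.

----
#include "hdfs.h"   /* installed under ${HADOOP_HDFS_HOME}/include */

#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(int argc, char **argv) {
    /* "default" resolves the NameNode from fs.defaultFS in the
       configuration found on the CLASSPATH. */
    hdfsFS fs = hdfsConnect("default", 0);
    if (!fs) {
        fprintf(stderr, "Failed to connect to HDFS\n");
        exit(1);
    }

    /* Placeholder path used only for this sketch. */
    const char *writePath = "/tmp/above_sample.txt";
    hdfsFile writeFile = hdfsOpenFile(fs, writePath, O_WRONLY | O_CREAT, 0, 0, 0);
    if (!writeFile) {
        fprintf(stderr, "Failed to open %s for writing\n", writePath);
        exit(1);
    }

    const char *buffer = "Hello from libhdfs";
    tSize written = hdfsWrite(fs, writeFile, (void *)buffer, strlen(buffer) + 1);
    if (written < 0 || hdfsFlush(fs, writeFile)) {
        fprintf(stderr, "Failed to write %s\n", writePath);
        exit(1);
    }

    hdfsCloseFile(fs, writeFile);
    hdfsDisconnect(fs);
    return 0;
}
----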
@@ -61,18 +64,22 @@ C API libhdfs
* How To Link With The Library

- See the Makefile for <<<hdfs_test.c>>> in the libhdfs source directory
- (<<<${HADOOP_PREFIX}/src/c++/libhdfs/Makefile>>>) or something like:
- <<<gcc above_sample.c -I${HADOOP_PREFIX}/src/c++/libhdfs -L${HADOOP_PREFIX}/libhdfs -lhdfs -o above_sample>>>
+ See the CMake file for <<<test_libhdfs_ops.c>>> in the libhdfs source
+ directory (<<<hadoop-hdfs-project/hadoop-hdfs/src/CMakeLists.txt>>>) or
+ something like:
+ <<<gcc above_sample.c -I${HADOOP_HDFS_HOME}/include -L${HADOOP_HDFS_HOME}/lib/native -lhdfs -o above_sample>>>

* Common Problems

  The most common problem is the <<<CLASSPATH>>> is not set properly when
  calling a program that uses libhdfs. Make sure you set it to all the
- Hadoop jars needed to run Hadoop itself. Currently, there is no way to
- programmatically generate the classpath, but a good bet is to include
- all the jar files in <<<${HADOOP_PREFIX}>>> and <<<${HADOOP_PREFIX}/lib>>> as well
- as the right configuration directory containing <<<hdfs-site.xml>>>
+ Hadoop jars needed to run Hadoop itself as well as the right configuration
+ directory containing <<<hdfs-site.xml>>>. It is not valid to use wildcard
+ syntax for specifying multiple jars. It may be useful to run
+ <<<hadoop classpath --glob>>> or <<<hadoop classpath --jar <path>>>> to
+ generate the correct classpath for your deployment. See
+ {{{../hadoop-common/CommandsManual.html#classpath}Hadoop Commands Reference}}
+ for more information on this command.

* Thread Safe
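As a rough illustration of the classpath guidance added in the Common Problems hunk above, a binary built with the gcc command shown earlier (assumed here to be named <<<above_sample>>>) could be run along these lines, assuming the <<<hadoop>>> command is on the <<<PATH>>>:

----
# Generate the wildcard-free classpath for this deployment; it includes the
# configuration directory containing hdfs-site.xml.
export CLASSPATH=$(hadoop classpath --glob)

# Run the libhdfs-based program compiled earlier.
./above_sample
----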