HDFS-4852. Merging change r1619967 from trunk to branch-2.
git-svn-id: https://svn.apache.org/repos/asf/hadoop/common/branches/branch-2@1619968 13f79535-47bb-0310-9956-ffa450edef68
parent 33229c1299
commit b2d86ebf78
@@ -285,6 +285,8 @@ Release 2.6.0 - UNRELEASED
     HDFS-6829. DFSAdmin refreshSuperUserGroupsConfiguration failed in
     security cluster (zhaoyunjiong via Arpit Agarwal)

+    HDFS-4852. libhdfs documentation is out of date. (cnauroth)
+
 Release 2.5.0 - 2014-08-11

   INCOMPATIBLE CHANGES
@@ -26,14 +26,17 @@ C API libhdfs
    (HDFS). It provides C APIs to a subset of the HDFS APIs to manipulate
    HDFS files and the filesystem. libhdfs is part of the Hadoop
    distribution and comes pre-compiled in
-   <<<${HADOOP_PREFIX}/libhdfs/libhdfs.so>>> .
+   <<<${HADOOP_HDFS_HOME}/lib/native/libhdfs.so>>> . libhdfs is compatible with
+   Windows and can be built on Windows by running <<<mvn compile>>> within the
+   <<<hadoop-hdfs-project/hadoop-hdfs>>> directory of the source tree.

 * The APIs

-   The libhdfs APIs are a subset of: {{{hadoop fs APIs}}}.
+   The libhdfs APIs are a subset of the
+   {{{../../api/org/apache/hadoop/fs/FileSystem.html}Hadoop FileSystem APIs}}.

    The header file for libhdfs describes each API in detail and is
-   available in <<<${HADOOP_PREFIX}/src/c++/libhdfs/hdfs.h>>>
+   available in <<<${HADOOP_HDFS_HOME}/include/hdfs.h>>>.
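
   As a quick illustration of that API surface, the minimal sketch below
   (assuming a reachable HDFS configured as the default filesystem; the path is
   an arbitrary placeholder) checks for a path with <<<hdfsExists>>>, much as
   <<<FileSystem#exists>>> would:

----
    #include "hdfs.h"
    #include <stdio.h>

    int main(int argc, char **argv) {
        /* Connect to the default filesystem named in the loaded configuration. */
        hdfsFS fs = hdfsConnect("default", 0);
        if (!fs) {
            fprintf(stderr, "Failed to connect to HDFS!\n");
            return 1;
        }
        /* hdfsExists returns 0 when the path exists, mirroring FileSystem#exists. */
        const char *path = "/tmp/testfile.txt";
        if (hdfsExists(fs, path) == 0) {
            printf("%s exists\n", path);
        } else {
            printf("%s does not exist\n", path);
        }
        hdfsDisconnect(fs);
        return 0;
    }
----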

 * A Sample Program

@@ -55,24 +58,28 @@ C API libhdfs
                fprintf(stderr, "Failed to 'flush' %s\n", writePath);
               exit(-1);
         }
-       hdfsCloseFile(fs, writeFile);
+        hdfsCloseFile(fs, writeFile);
     }

 ----

 * How To Link With The Library

-   See the Makefile for <<<hdfs_test.c>>> in the libhdfs source directory
-   (<<<${HADOOP_PREFIX}/src/c++/libhdfs/Makefile>>>) or something like:
-   <<<gcc above_sample.c -I${HADOOP_PREFIX}/src/c++/libhdfs -L${HADOOP_PREFIX}/libhdfs -lhdfs -o above_sample>>>
+   See the CMake file for <<<test_libhdfs_ops.c>>> in the libhdfs source
+   directory (<<<hadoop-hdfs-project/hadoop-hdfs/src/CMakeLists.txt>>>) or
+   something like:
+   <<<gcc above_sample.c -I${HADOOP_HDFS_HOME}/include -L${HADOOP_HDFS_HOME}/lib/native -lhdfs -o above_sample>>>
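
   Concretely, assuming <<<HADOOP_HDFS_HOME>>> points at the installed
   distribution and the sample above was saved as <<<above_sample.c>>>, a build
   and run might look like the sketch below (the <<<libjvm.so>>> directory is an
   assumption that varies by JDK):

----
    # Compile against the header and library locations given above.
    gcc above_sample.c -I${HADOOP_HDFS_HOME}/include \
        -L${HADOOP_HDFS_HOME}/lib/native -lhdfs -o above_sample

    # At run time the dynamic linker must find libhdfs.so and the JVM's libjvm.so.
    export LD_LIBRARY_PATH=${HADOOP_HDFS_HOME}/lib/native:${JAVA_HOME}/jre/lib/amd64/server
    ./above_sample
----

   The program also needs <<<CLASSPATH>>> set, as described under Common
   Problems below.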

 * Common Problems

    The most common problem is the <<<CLASSPATH>>> is not set properly when
    calling a program that uses libhdfs. Make sure you set it to all the
-   Hadoop jars needed to run Hadoop itself. Currently, there is no way to
-   programmatically generate the classpath, but a good bet is to include
-   all the jar files in <<<${HADOOP_PREFIX}>>> and <<<${HADOOP_PREFIX}/lib>>> as well
-   as the right configuration directory containing <<<hdfs-site.xml>>>
+   Hadoop jars needed to run Hadoop itself as well as the right configuration
+   directory containing <<<hdfs-site.xml>>>. It is not valid to use wildcard
+   syntax for specifying multiple jars. It may be useful to run
+   <<<hadoop classpath --glob>>> or <<<hadoop classpath --jar <path>>>> to
+   generate the correct classpath for your deployment. See
+   {{{../hadoop-common/CommandsManual.html#classpath}Hadoop Commands Reference}}
+   for more information on this command.
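
   For example, assuming the <<<hadoop>>> command is on the <<<PATH>>> and the
   configuration directory is a hypothetical <<</etc/hadoop/conf>>>, a launch
   might look like:

----
    # --glob expands wildcard entries into explicit jar paths, which the JVM
    # embedded by libhdfs requires; the configuration directory holding
    # hdfs-site.xml must also be on the classpath.
    export CLASSPATH=$(hadoop classpath --glob):/etc/hadoop/conf

    ./above_sample
----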

 * Thread Safe
