HDFS-11223. Fix typos in HttpFs documentations. Contributed by Yiqun Lin.
(cherry picked from commit 4c2cf5560f)
(cherry picked from commit 0478597ea9)
parent 59b0857c6b
commit 39bf840398
@@ -50,7 +50,7 @@ IMPORTANT: Replace `#HTTPFSUSER#` with the Unix user that will start the HttpFS
 Restart Hadoop
 --------------
 
-You need to restart Hadoop for the proxyuser configuration ot become active.
+You need to restart Hadoop for the proxyuser configuration to become active.
 
 Start/Stop HttpFS
 -----------------
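Note: the corrected sentence above refers to the `hadoop.proxyuser.*` entries that let the HttpFS service user impersonate end users once Hadoop is restarted. A minimal, hypothetical Java sketch of that impersonation mechanism follows; the user name "alice", the path, and the default Configuration are illustrative assumptions, not part of this commit.

    import java.security.PrivilegedExceptionAction;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.security.UserGroupInformation;

    public class ProxyUserSketch {
        public static void main(String[] args) throws Exception {
            // The real user is the service user the HttpFS server runs as; it must be
            // allowed by hadoop.proxyuser.<user>.hosts/groups on the cluster side.
            UserGroupInformation realUser = UserGroupInformation.getCurrentUser();
            // "alice" is a placeholder end user being impersonated.
            UserGroupInformation proxyUgi =
                UserGroupInformation.createProxyUser("alice", realUser);

            boolean exists = proxyUgi.doAs((PrivilegedExceptionAction<Boolean>) () -> {
                try (FileSystem fs = FileSystem.get(new Configuration())) {
                    return fs.exists(new Path("/user/alice"));
                }
            });
            System.out.println("/user/alice exists: " + exists);
        }
    }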
@@ -15,7 +15,7 @@
 Hadoop HDFS over HTTP - Documentation Sets
 ==========================================
 
-HttpFS is a server that provides a REST HTTP gateway supporting all HDFS File System operations (read and write). And it is inteoperable with the **webhdfs** REST HTTP API.
+HttpFS is a server that provides a REST HTTP gateway supporting all HDFS File System operations (read and write). And it is interoperable with the **webhdfs** REST HTTP API.
 
 HttpFS can be used to transfer data between clusters running different versions of Hadoop (overcoming RPC versioning issues), for example using Hadoop DistCP.
 
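Note: to make the interoperability claim in the hunk above concrete, here is a hedged sketch of calling an HttpFS endpoint over plain HTTP from Java using only the JDK. The host name, the port 14000 (the HttpFS default), and the user name "foo" are assumptions for illustration.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class HttpFsRestSketch {
        public static void main(String[] args) throws Exception {
            // GETFILESTATUS is a standard webhdfs operation; user.name selects the
            // pseudo-authentication user. Host, port and user are placeholders.
            URL url = new URL(
                "http://httpfs-host:14000/webhdfs/v1/user/foo?op=GETFILESTATUS&user.name=foo");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line); // JSON FileStatus payload
                }
            } finally {
                conn.disconnect();
            }
        }
    }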
@@ -23,9 +23,9 @@ HttpFS can be used to access data in HDFS on a cluster behind of a firewall (the
 
 HttpFS can be used to access data in HDFS using HTTP utilities (such as curl and wget) and HTTP libraries Perl from other languages than Java.
 
-The **webhdfs** client FileSytem implementation can be used to access HttpFS using the Hadoop filesystem command (`hadoop fs`) line tool as well as from Java applications using the Hadoop FileSystem Java API.
+The **webhdfs** client FileSystem implementation can be used to access HttpFS using the Hadoop filesystem command (`hadoop fs`) line tool as well as from Java applications using the Hadoop FileSystem Java API.
 
-HttpFS has built-in security supporting Hadoop pseudo authentication and HTTP SPNEGO Kerberos and other pluggable authentication mechanims. It also provides Hadoop proxy user support.
+HttpFS has built-in security supporting Hadoop pseudo authentication and HTTP SPNEGO Kerberos and other pluggable authentication mechanisms. It also provides Hadoop proxy user support.
 
 How Does HttpFS Works?
 ----------------------
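Note: the corrected FileSystem sentence above describes the client-side path. Below is a minimal sketch of reading a file through HttpFS with the Hadoop FileSystem Java API; the webhdfs:// URI, host name, port 14000 (the HttpFS default), and file path are illustrative assumptions.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HttpFsClientSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Point the webhdfs client at the HttpFS server rather than a NameNode.
            // Host, port and path are placeholders.
            try (FileSystem fs = FileSystem.get(URI.create("webhdfs://httpfs-host:14000"), conf);
                 BufferedReader in = new BufferedReader(
                     new InputStreamReader(fs.open(new Path("/user/foo/README.txt"))))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }

A roughly equivalent command-line form, per the doc text, would be `hadoop fs -ls webhdfs://httpfs-host:14000/`.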