Add to 'getting started' a note about HBase being a file handle hog.

git-svn-id: https://svn.apache.org/repos/asf/hadoop/hbase/trunk@648746 13f79535-47bb-0310-9956-ffa450edef68
Michael Stack 2008-04-16 16:45:00 +00:00
parent baa84cb4a1
commit ebbdb04cc9
1 changed file with 5 additions and 0 deletions


@@ -32,6 +32,11 @@
ssh must be installed and sshd must be running to use Hadoop's
scripts to manage remote Hadoop daemons.
</li>
<li>HBase is currently a file handle hog. The usual default of
1024 on *nix systems is insufficient if you are loading any significant
amount of data into regionservers. See the
<a href="http://wiki.apache.org/hadoop/Hbase/FAQ#6">FAQ: Why do I see "java.io.IOException...(Too many open files)" in my logs?</a>
for how to raise the limit (a sketch follows below the list).</li>
</ul>
<h2><a name="getting_started" >Getting Started</a></h2>