HADOOP-8436. NPE In getLocalPathForWrite ( path, conf ) when the required context item is not configured. Contributed by Brahma Reddy Battula. (harsh)

git-svn-id: https://svn.apache.org/repos/asf/hadoop/common/trunk@1389799 13f79535-47bb-0310-9956-ffa450edef68
(cherry picked from commit a7e450c7cc)

Conflicts:
	hadoop-common-project/hadoop-common/CHANGES.txt
Authored by Harsh J, 2012-09-25 11:10:11 +00:00; committed by Zhihai Xu
parent 3c06162259
commit 9bb6fba759
3 changed files with 24 additions and 0 deletions

hadoop-common-project/hadoop-common/CHANGES.txt

@@ -575,6 +575,10 @@ Release 2.8.0 - UNRELEASED
     HADOOP-12386. RetryPolicies.RETRY_FOREVER should be able to specify a
     retry interval. (Sunil G via wangda)

+    HADOOP-8436. NPE In getLocalPathForWrite ( path, conf ) when the
+    required context item is not configured
+    (Brahma Reddy Battula via harsh)
+
   OPTIMIZATIONS

     HADOOP-12051. ProtobufRpcEngine.invoke() should use Exception.toString()

hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/fs/LocalDirAllocator.java

@@ -265,6 +265,9 @@ public class LocalDirAllocator {
     private synchronized void confChanged(Configuration conf)
         throws IOException {
       String newLocalDirs = conf.get(contextCfgItemName);
+      if (null == newLocalDirs) {
+        throw new IOException(contextCfgItemName + " not configured");
+      }
       if (!newLocalDirs.equals(savedLocalDirs)) {
         localDirs = StringUtils.getTrimmedStrings(newLocalDirs);
         localFS = FileSystem.getLocal(conf);
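From a caller's perspective, the added null check turns a missing context item into a fast, descriptive IOException instead of a NullPointerException deep inside confChanged(). A minimal sketch of such a caller, assuming the commonly used context key "mapred.local.dir" and an arbitrary requested path (both illustrative):

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.LocalDirAllocator;
import org.apache.hadoop.fs.Path;

public class LocalDirAllocatorSketch {
  public static void main(String[] args) throws Exception {
    // Allocator bound to a context configuration item (key name is illustrative).
    LocalDirAllocator allocator = new LocalDirAllocator("mapred.local.dir");

    Configuration conf = new Configuration();
    try {
      // The key is never set on conf, so with this fix the call fails fast with
      // an IOException ("mapred.local.dir not configured") rather than an NPE.
      Path p = allocator.getLocalPathForWrite("/test", conf);
      System.out.println("Allocated: " + p);
    } catch (IOException e) {
      System.err.println("Expected failure: " + e.getMessage());
    }
  }
}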

hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/fs/TestLocalDirAllocator.java

@@ -299,6 +299,23 @@ public class TestLocalDirAllocator {
     }
   }

+  /*
+   * Test when mapred.local.dir not configured and called
+   * getLocalPathForWrite
+   */
+  @Test
+  public void testShouldNotthrowNPE() throws Exception {
+    Configuration conf1 = new Configuration();
+    try {
+      dirAllocator.getLocalPathForWrite("/test", conf1);
+      fail("Exception not thrown when " + CONTEXT + " is not set");
+    } catch (IOException e) {
+      assertEquals(CONTEXT + " not configured", e.getMessage());
+    } catch (NullPointerException e) {
+      fail("Lack of configuration should not have thrown an NPE.");
+    }
+  }
+
   /** Test no side effect files are left over. After creating a temp
    * temp file, remove both the temp file and its parent. Verify that
    * no files or directories are left over as can happen when File objects
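
For contrast with the new testShouldNotthrowNPE above, a test-style sketch of the configured case (assuming the same dirAllocator field and CONTEXT constant from the test class, statically imported JUnit asserts, and an illustrative writable directory value):

  @Test
  public void testConfiguredContextAllocates() throws Exception {
    Configuration conf2 = new Configuration();
    // Illustrative value; any writable local directory (or comma-separated list) works.
    conf2.set(CONTEXT, "/tmp/test-local-dirs");
    // With the context item set, confChanged() picks up the directories and
    // getLocalPathForWrite returns a path instead of throwing.
    Path result = dirAllocator.getLocalPathForWrite("/test", conf2);
    assertNotNull(result);
  }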