LUCENE-5965: CorruptIndexException requires a String or DataInput resource

git-svn-id: https://svn.apache.org/repos/asf/lucene/dev/trunk@1626372 13f79535-47bb-0310-9956-ffa450edef68
Robert Muir 2014-09-20 00:25:48 +00:00
parent 079ad96d33
commit 8310bd892c
44 changed files with 230 additions and 350 deletions
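The pattern this commit applies across all 44 files can be sketched as follows: before, every call site hand-appended `" (resource=" + input + ")"` to the message (and many forgot to); afterwards the `CorruptIndexException` constructor takes the resource itself, either a `DataInput` or a `String`, and formats it once, centrally. The class below is an illustrative stand-in, not the real `org.apache.lucene.index.CorruptIndexException`, and the resource name in `main` is hypothetical; the constructor shapes are inferred from the call sites in this diff.

```java
// Illustrative stand-in (NOT the real org.apache.lucene.index.CorruptIndexException):
// call sites stop hand-building " (resource=...)" and instead pass the resource
// to the constructor, which appends it uniformly.
public class CorruptionDemo {
    /** Minimal sketch of the new contract: a resource description is mandatory. */
    static class DemoCorruptIndexException extends RuntimeException {
        DemoCorruptIndexException(String message, String resourceDescription) {
            super(message + " (resource=" + resourceDescription + ")");
        }
        // Call sites holding a DataInput pass it directly; its toString()
        // becomes the resource description.
        DemoCorruptIndexException(String message, Object dataInput) {
            this(message, String.valueOf(dataInput));
        }
    }

    // A typical converted call site, mirroring the valueSize checks in this diff.
    static void checkValueSize(int valueSize, Object input) {
        if (valueSize != 4) {
            throw new DemoCorruptIndexException("invalid valueSize: " + valueSize, input);
        }
    }

    public static void main(String[] args) {
        try {
            checkValueSize(8, "MMapIndexInput(\"_0.dat\")"); // hypothetical resource name
        } catch (DemoCorruptIndexException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

This mirrors why call sites such as `Lucene40PostingsWriter` pass `freqOut.toString()`: when the resource at hand is not a `DataInput`, the `String` overload is used instead.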


@@ -7,25 +7,8 @@ http://s.apache.org/luceneversions
New Features
* LUCENE-5945: All file handling converted to NIO.2 apis. (Robert Muir)
* LUCENE-5946: SimpleFSDirectory now uses Files.newByteChannel, for
portability with custom FileSystemProviders. If you want the old
non-interruptible behavior of RandomAccessFile, use RAFDirectory
in the misc/ module. (Uwe Schindler, Robert Muir)
* SOLR-3359: Added analyzer attribute/property to SynonymFilterFactory.
(Ryo Onodera via Koji Sekiguchi)
* LUCENE-5648: Index and search date ranges, particularly multi-valued ones. It's
implemented in the spatial module as DateRangePrefixTree used with
NumberRangePrefixTreeStrategy. (David Smiley)
API Changes
* LUCENE-4535: oal.util.FilterIterator is now an internal API.
(Adrien Grand)
* LUCENE-3312: The API of oal.document was restructured to
differentiate between stored documents and indexed documents.
IndexReader.document(int) now returns StoredDocument
@@ -34,65 +17,6 @@ API Changes
(Nikola Tanković, Uwe Schindler, Chris Male, Mike McCandless,
Robert Muir)
* LUCENE-4924: DocIdSetIterator.docID() must now return -1 when the iterator is
not positioned. This change affects all classes that inherit from
DocIdSetIterator, including DocsEnum and DocsAndPositionsEnum. (Adrien Grand)
* LUCENE-5127: Reduce RAM usage of FixedGapTermsIndex. Remove
IndexWriterConfig.setTermIndexInterval, IndexWriterConfig.setReaderTermsIndexDivisor,
and termsIndexDivisor from StandardDirectoryReader. These options have been no-ops
with the default codec since Lucene 4.0. If you want to configure the interval for
this term index, pass it directly in your codec, where it can also be configured
per-field. (Robert Muir)
* LUCENE-5388: Remove Reader from Tokenizer's constructor and from
Analyzer's createComponents. TokenStreams now always get their input
via setReader.
(Benson Margulies via Robert Muir - pull request #16)
* LUCENE-5527: The Collector API has been refactored to use a dedicated Collector
per leaf. (Shikhar Bhushan, Adrien Grand)
* LUCENE-4246: IndexWriter.close now always closes, even if it throws
an exception. The new IndexWriterConfig.setCommitOnClose (default
true) determines whether close() should commit before closing.
* LUCENE-5608, LUCENE-5565: Refactor SpatialPrefixTree/Cell API. Doesn't use Strings
as tokens anymore, and now iterates cells on-demand during indexing instead of
building a collection. RPT now has more setters. (David Smiley)
* LUCENE-5666: Change uninverted access (sorting, faceting, grouping, etc)
to use the DocValues API instead of FieldCache. For FieldCache functionality,
use UninvertingReader in lucene/misc (or implement your own FilterReader).
UninvertingReader is more efficient: supports multi-valued numeric fields,
detects when a multi-valued field is single-valued, reuses caches
of compatible types (e.g. SORTED also supports BINARY and SORTED_SET access
without insanity). "Insanity" is no longer possible unless you explicitly want it.
Rename FieldCache* and DocTermOrds* classes in the search package to DocValues*.
Move SortedSetSortField to core and add SortedSetFieldSource to queries/, which
takes the same selectors. Add helper methods to DocValues.java that are better
suited for search code (never return null, etc). (Mike McCandless, Robert Muir)
* LUCENE-5871: Remove Version from IndexWriterConfig. Use
IndexWriterConfig.setCommitOnClose to change the behavior of IndexWriter.close().
The default has been changed to match that of 4.x.
(Ryan Ernst, Mike McCandless)
Documentation
* LUCENE-5392: Add/improve analysis package documentation to reflect
analysis API changes. (Benson Margulies via Robert Muir - pull request #17)
Other
* LUCENE-5563: Removed the sep layout, which has fallen behind on features and doesn't
perform as well as other options. (Robert Muir)
* LUCENE-5858: Moved compatibility codecs to 'lucene-backward-codecs.jar'.
(Adrien Grand, Robert Muir)
* LUCENE-5915: Remove Pulsing postings format. (Robert Muir)
======================= Lucene 5.0.0 =======================
New Features
@@ -133,6 +57,9 @@ API Changes:
* LUCENE-5900: Deprecated more constructors taking Version in *InfixSuggester and
ICUCollationKeyAnalyzer, and removed TEST_VERSION_CURRENT from the test framework.
(Ryan Ernst)
* LUCENE-5965: CorruptIndexException requires a String or DataInput resource.
(Robert Muir)
Bug Fixes


@@ -159,7 +159,7 @@ final class Lucene40DocValuesReader extends DocValuesProducer {
}
};
} else {
throw new CorruptIndexException("invalid VAR_INTS header byte: " + header + " (resource=" + input + ")");
throw new CorruptIndexException("invalid VAR_INTS header byte: " + header, input);
}
}
@@ -169,7 +169,7 @@ final class Lucene40DocValuesReader extends DocValuesProducer {
Lucene40DocValuesFormat.INTS_VERSION_CURRENT);
int valueSize = input.readInt();
if (valueSize != 1) {
throw new CorruptIndexException("invalid valueSize: " + valueSize);
throw new CorruptIndexException("invalid valueSize: " + valueSize, input);
}
int maxDoc = state.segmentInfo.getDocCount();
final byte values[] = new byte[maxDoc];
@@ -191,7 +191,7 @@ final class Lucene40DocValuesReader extends DocValuesProducer {
Lucene40DocValuesFormat.INTS_VERSION_CURRENT);
int valueSize = input.readInt();
if (valueSize != 2) {
throw new CorruptIndexException("invalid valueSize: " + valueSize);
throw new CorruptIndexException("invalid valueSize: " + valueSize, input);
}
int maxDoc = state.segmentInfo.getDocCount();
final short values[] = new short[maxDoc];
@@ -215,7 +215,7 @@ final class Lucene40DocValuesReader extends DocValuesProducer {
Lucene40DocValuesFormat.INTS_VERSION_CURRENT);
int valueSize = input.readInt();
if (valueSize != 4) {
throw new CorruptIndexException("invalid valueSize: " + valueSize);
throw new CorruptIndexException("invalid valueSize: " + valueSize, input);
}
int maxDoc = state.segmentInfo.getDocCount();
final int values[] = new int[maxDoc];
@@ -239,7 +239,7 @@ final class Lucene40DocValuesReader extends DocValuesProducer {
Lucene40DocValuesFormat.INTS_VERSION_CURRENT);
int valueSize = input.readInt();
if (valueSize != 8) {
throw new CorruptIndexException("invalid valueSize: " + valueSize);
throw new CorruptIndexException("invalid valueSize: " + valueSize, input);
}
int maxDoc = state.segmentInfo.getDocCount();
final long values[] = new long[maxDoc];
@@ -263,7 +263,7 @@ final class Lucene40DocValuesReader extends DocValuesProducer {
Lucene40DocValuesFormat.FLOATS_VERSION_CURRENT);
int valueSize = input.readInt();
if (valueSize != 4) {
throw new CorruptIndexException("invalid valueSize: " + valueSize);
throw new CorruptIndexException("invalid valueSize: " + valueSize, input);
}
int maxDoc = state.segmentInfo.getDocCount();
final int values[] = new int[maxDoc];
@@ -287,7 +287,7 @@ final class Lucene40DocValuesReader extends DocValuesProducer {
Lucene40DocValuesFormat.FLOATS_VERSION_CURRENT);
int valueSize = input.readInt();
if (valueSize != 8) {
throw new CorruptIndexException("invalid valueSize: " + valueSize);
throw new CorruptIndexException("invalid valueSize: " + valueSize, input);
}
int maxDoc = state.segmentInfo.getDocCount();
final long values[] = new long[maxDoc];


@@ -99,7 +99,7 @@ class Lucene40FieldInfosReader extends FieldInfosReader {
}
if (oldNormsType.mapping != null) {
if (oldNormsType.mapping != DocValuesType.NUMERIC) {
throw new CorruptIndexException("invalid norm type: " + oldNormsType + " (resource=" + input + ")");
throw new CorruptIndexException("invalid norm type: " + oldNormsType, input);
}
attributes.put(LEGACY_NORM_TYPE_KEY, oldNormsType.name());
}


@@ -58,7 +58,7 @@ public class Lucene40SegmentInfoReader extends SegmentInfoReader {
final Version version = Version.parse(input.readString());
final int docCount = input.readInt();
if (docCount < 0) {
throw new CorruptIndexException("invalid docCount: " + docCount + " (resource=" + input + ")");
throw new CorruptIndexException("invalid docCount: " + docCount, input);
}
final boolean isCompoundFile = input.readByte() == SegmentInfo.YES;
final Map<String,String> diagnostics = input.readStringStringMap();


@@ -128,7 +128,7 @@ public final class Lucene40StoredFieldsReader extends StoredFieldsReader impleme
this.size = (int) (indexSize >> 3);
// Verify two sources of "maxDoc" agree:
if (this.size != si.getDocCount()) {
throw new CorruptIndexException("doc counts differ for segment " + segment + ": fieldsReader shows " + this.size + " but segmentInfo shows " + si.getDocCount());
throw new CorruptIndexException("doc counts differ for segment " + segment + ": fieldsReader shows " + this.size + " but segmentInfo shows " + si.getDocCount(), indexStream);
}
numTotalDocs = (int) (indexSize >> 3);
success = true;
@@ -221,7 +221,7 @@ public final class Lucene40StoredFieldsReader extends StoredFieldsReader impleme
visitor.doubleField(info, Double.longBitsToDouble(fieldsStream.readLong()));
return;
default:
throw new CorruptIndexException("Invalid numeric type: " + Integer.toHexString(numeric));
throw new CorruptIndexException("Invalid numeric type: " + Integer.toHexString(numeric), fieldsStream);
}
} else {
final int length = fieldsStream.readVInt();
@@ -248,7 +248,7 @@ public final class Lucene40StoredFieldsReader extends StoredFieldsReader impleme
fieldsStream.readLong();
return;
default:
throw new CorruptIndexException("Invalid numeric type: " + Integer.toHexString(numeric));
throw new CorruptIndexException("Invalid numeric type: " + Integer.toHexString(numeric), fieldsStream);
}
} else {
final int length = fieldsStream.readVInt();


@@ -125,10 +125,10 @@ public class Lucene40TermVectorsReader extends TermVectorsReader implements Clos
assert HEADER_LENGTH_DOCS == tvd.getFilePointer();
assert HEADER_LENGTH_FIELDS == tvf.getFilePointer();
if (tvxVersion != tvdVersion) {
throw new CorruptIndexException("version mismatch: tvx=" + tvxVersion + " != tvd=" + tvdVersion + " (resource=" + tvd + ")");
throw new CorruptIndexException("version mismatch: tvx=" + tvxVersion + " != tvd=" + tvdVersion, tvd);
}
if (tvxVersion != tvfVersion) {
throw new CorruptIndexException("version mismatch: tvx=" + tvxVersion + " != tvf=" + tvfVersion + " (resource=" + tvf + ")");
throw new CorruptIndexException("version mismatch: tvx=" + tvxVersion + " != tvf=" + tvfVersion, tvf);
}
numTotalDocs = (int) (tvx.length()-HEADER_LENGTH_INDEX >> 4);


@@ -144,7 +144,7 @@ class Lucene42DocValuesProducer extends DocValuesProducer {
VERSION_START,
VERSION_CURRENT);
if (version != version2) {
throw new CorruptIndexException("Format versions mismatch");
throw new CorruptIndexException("Format versions mismatch: meta=" + version + ", data=" + version2, data);
}
if (version >= VERSION_CHECKSUM) {
@@ -172,7 +172,7 @@ class Lucene42DocValuesProducer extends DocValuesProducer {
if (info == null) {
// trickier to validate more: because we re-use for norms, because we use multiple entries
// for "composite" types like sortedset, etc.
throw new CorruptIndexException("Invalid field number: " + fieldNumber + " (resource=" + meta + ")");
throw new CorruptIndexException("Invalid field number: " + fieldNumber, meta);
}
int fieldType = meta.readByte();
if (fieldType == NUMBER) {
@@ -186,7 +186,7 @@ class Lucene42DocValuesProducer extends DocValuesProducer {
case UNCOMPRESSED:
break;
default:
throw new CorruptIndexException("Unknown format: " + entry.format + ", input=" + meta);
throw new CorruptIndexException("Unknown format: " + entry.format, meta);
}
if (entry.format != UNCOMPRESSED) {
entry.packedIntsVersion = meta.readVInt();
@@ -209,7 +209,7 @@ class Lucene42DocValuesProducer extends DocValuesProducer {
entry.numOrds = meta.readVLong();
fsts.put(info.name, entry);
} else {
throw new CorruptIndexException("invalid entry type: " + fieldType + ", input=" + meta);
throw new CorruptIndexException("invalid entry type: " + fieldType, meta);
}
fieldNumber = meta.readVInt();
}
@@ -260,7 +260,7 @@ class Lucene42DocValuesProducer extends DocValuesProducer {
case TABLE_COMPRESSED:
int size = data.readVInt();
if (size > 256) {
throw new CorruptIndexException("TABLE_COMPRESSED cannot have more than 256 distinct values, input=" + data);
throw new CorruptIndexException("TABLE_COMPRESSED cannot have more than 256 distinct values, got=" + size, data);
}
final long decode[] = new long[size];
for (int i = 0; i < decode.length; i++) {


@@ -117,7 +117,7 @@ final class Lucene42FieldInfosReader extends FieldInfosReader {
} else if (b == 4) {
return DocValuesType.SORTED_SET;
} else {
throw new CorruptIndexException("invalid docvalues byte: " + b + " (resource=" + input + ")");
throw new CorruptIndexException("invalid docvalues byte: " + b, input);
}
}
}


@@ -138,7 +138,7 @@ class Lucene45DocValuesProducer extends DocValuesProducer implements Closeable {
Lucene45DocValuesFormat.VERSION_START,
Lucene45DocValuesFormat.VERSION_CURRENT);
if (version != version2) {
throw new CorruptIndexException("Format versions mismatch");
throw new CorruptIndexException("Format versions mismatch: meta=" + version + ", data=" + version2, data);
}
if (version >= Lucene45DocValuesFormat.VERSION_CHECKSUM) {
@@ -162,19 +162,19 @@ class Lucene45DocValuesProducer extends DocValuesProducer implements Closeable {
private void readSortedField(int fieldNumber, IndexInput meta, FieldInfos infos) throws IOException {
// sorted = binary + numeric
if (meta.readVInt() != fieldNumber) {
throw new CorruptIndexException("sorted entry for field: " + fieldNumber + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sorted entry for field: " + fieldNumber + " is corrupt", meta);
}
if (meta.readByte() != Lucene45DocValuesFormat.BINARY) {
throw new CorruptIndexException("sorted entry for field: " + fieldNumber + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sorted entry for field: " + fieldNumber + " is corrupt", meta);
}
BinaryEntry b = readBinaryEntry(meta);
binaries.put(fieldNumber, b);
if (meta.readVInt() != fieldNumber) {
throw new CorruptIndexException("sorted entry for field: " + fieldNumber + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sorted entry for field: " + fieldNumber + " is corrupt", meta);
}
if (meta.readByte() != Lucene45DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sorted entry for field: " + fieldNumber + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sorted entry for field: " + fieldNumber + " is corrupt", meta);
}
NumericEntry n = readNumericEntry(meta);
ords.put(fieldNumber, n);
@@ -183,28 +183,28 @@ class Lucene45DocValuesProducer extends DocValuesProducer implements Closeable {
private void readSortedSetFieldWithAddresses(int fieldNumber, IndexInput meta, FieldInfos infos) throws IOException {
// sortedset = binary + numeric (addresses) + ordIndex
if (meta.readVInt() != fieldNumber) {
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt", meta);
}
if (meta.readByte() != Lucene45DocValuesFormat.BINARY) {
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt", meta);
}
BinaryEntry b = readBinaryEntry(meta);
binaries.put(fieldNumber, b);
if (meta.readVInt() != fieldNumber) {
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt", meta);
}
if (meta.readByte() != Lucene45DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt", meta);
}
NumericEntry n1 = readNumericEntry(meta);
ords.put(fieldNumber, n1);
if (meta.readVInt() != fieldNumber) {
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt", meta);
}
if (meta.readByte() != Lucene45DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt", meta);
}
NumericEntry n2 = readNumericEntry(meta);
ordIndexes.put(fieldNumber, n2);
@@ -218,7 +218,7 @@ class Lucene45DocValuesProducer extends DocValuesProducer implements Closeable {
if ((lenientFieldInfoCheck && fieldNumber < 0) || (!lenientFieldInfoCheck && infos.fieldInfo(fieldNumber) == null)) {
// trickier to validate more: because we re-use for norms, because we use multiple entries
// for "composite" types like sortedset, etc.
throw new CorruptIndexException("Invalid field number: " + fieldNumber + " (resource=" + meta + ")");
throw new CorruptIndexException("Invalid field number: " + fieldNumber, meta);
}
byte type = meta.readByte();
if (type == Lucene45DocValuesFormat.NUMERIC) {
@@ -235,17 +235,17 @@ class Lucene45DocValuesProducer extends DocValuesProducer implements Closeable {
readSortedSetFieldWithAddresses(fieldNumber, meta, infos);
} else if (ss.format == SORTED_SET_SINGLE_VALUED_SORTED) {
if (meta.readVInt() != fieldNumber) {
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt", meta);
}
if (meta.readByte() != Lucene45DocValuesFormat.SORTED) {
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + fieldNumber + " is corrupt", meta);
}
readSortedField(fieldNumber, meta, infos);
} else {
throw new AssertionError();
}
} else {
throw new CorruptIndexException("invalid type: " + type + ", resource=" + meta);
throw new CorruptIndexException("invalid type: " + type, meta);
}
fieldNumber = meta.readVInt();
}
@@ -267,11 +267,11 @@ class Lucene45DocValuesProducer extends DocValuesProducer implements Closeable {
break;
case TABLE_COMPRESSED:
if (entry.count > Integer.MAX_VALUE) {
throw new CorruptIndexException("Cannot use TABLE_COMPRESSED with more than MAX_VALUE values, input=" + meta);
throw new CorruptIndexException("Cannot use TABLE_COMPRESSED with more than MAX_VALUE values, got=" + entry.count, meta);
}
final int uniqueValues = meta.readVInt();
if (uniqueValues > 256) {
throw new CorruptIndexException("TABLE_COMPRESSED cannot have more than 256 distinct values, input=" + meta);
throw new CorruptIndexException("TABLE_COMPRESSED cannot have more than 256 distinct values, got=" + uniqueValues, meta);
}
entry.table = new long[uniqueValues];
for (int i = 0; i < uniqueValues; ++i) {
@@ -281,7 +281,7 @@ class Lucene45DocValuesProducer extends DocValuesProducer implements Closeable {
case DELTA_COMPRESSED:
break;
default:
throw new CorruptIndexException("Unknown format: " + entry.format + ", input=" + meta);
throw new CorruptIndexException("Unknown format: " + entry.format, meta);
}
return entry;
}
@@ -309,7 +309,7 @@ class Lucene45DocValuesProducer extends DocValuesProducer implements Closeable {
entry.blockSize = meta.readVInt();
break;
default:
throw new CorruptIndexException("Unknown format: " + entry.format + ", input=" + meta);
throw new CorruptIndexException("Unknown format: " + entry.format, meta);
}
return entry;
}
@@ -322,7 +322,7 @@ class Lucene45DocValuesProducer extends DocValuesProducer implements Closeable {
entry.format = SORTED_SET_WITH_ADDRESSES;
}
if (entry.format != SORTED_SET_SINGLE_VALUED_SORTED && entry.format != SORTED_SET_WITH_ADDRESSES) {
throw new CorruptIndexException("Unknown format: " + entry.format + ", input=" + meta);
throw new CorruptIndexException("Unknown format: " + entry.format, meta);
}
return entry;
}


@@ -122,7 +122,7 @@ class Lucene49DocValuesProducer extends DocValuesProducer implements Closeable {
Lucene49DocValuesFormat.VERSION_START,
Lucene49DocValuesFormat.VERSION_CURRENT);
if (version != version2) {
throw new CorruptIndexException("Format versions mismatch");
throw new CorruptIndexException("Format versions mismatch: meta=" + version + ", data=" + version2, data);
}
// NOTE: data file is too costly to verify checksum against all the bytes on open,
@@ -144,19 +144,19 @@ class Lucene49DocValuesProducer extends DocValuesProducer implements Closeable {
private void readSortedField(FieldInfo info, IndexInput meta) throws IOException {
// sorted = binary + numeric
if (meta.readVInt() != info.number) {
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene49DocValuesFormat.BINARY) {
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt", meta);
}
BinaryEntry b = readBinaryEntry(meta);
binaries.put(info.name, b);
if (meta.readVInt() != info.number) {
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene49DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt", meta);
}
NumericEntry n = readNumericEntry(meta);
ords.put(info.name, n);
@@ -165,28 +165,28 @@ class Lucene49DocValuesProducer extends DocValuesProducer implements Closeable {
private void readSortedSetFieldWithAddresses(FieldInfo info, IndexInput meta) throws IOException {
// sortedset = binary + numeric (addresses) + ordIndex
if (meta.readVInt() != info.number) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene49DocValuesFormat.BINARY) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
BinaryEntry b = readBinaryEntry(meta);
binaries.put(info.name, b);
if (meta.readVInt() != info.number) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene49DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
NumericEntry n1 = readNumericEntry(meta);
ords.put(info.name, n1);
if (meta.readVInt() != info.number) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene49DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
NumericEntry n2 = readNumericEntry(meta);
ordIndexes.put(info.name, n2);
@@ -200,7 +200,7 @@ class Lucene49DocValuesProducer extends DocValuesProducer implements Closeable {
FieldInfo info = infos.fieldInfo(fieldNumber);
if (info == null) {
// trickier to validate more: because we use multiple entries for "composite" types like sortedset, etc.
throw new CorruptIndexException("Invalid field number: " + fieldNumber + " (resource=" + meta + ")");
throw new CorruptIndexException("Invalid field number: " + fieldNumber, meta);
}
byte type = meta.readByte();
if (type == Lucene49DocValuesFormat.NUMERIC) {
@@ -217,10 +217,10 @@ class Lucene49DocValuesProducer extends DocValuesProducer implements Closeable {
readSortedSetFieldWithAddresses(info, meta);
} else if (ss.format == SORTED_SINGLE_VALUED) {
if (meta.readVInt() != fieldNumber) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene49DocValuesFormat.SORTED) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
readSortedField(info, meta);
} else {
@@ -230,18 +230,18 @@ class Lucene49DocValuesProducer extends DocValuesProducer implements Closeable {
SortedSetEntry ss = readSortedSetEntry(meta);
sortedNumerics.put(info.name, ss);
if (meta.readVInt() != fieldNumber) {
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene49DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt", meta);
}
numerics.put(info.name, readNumericEntry(meta));
if (ss.format == SORTED_WITH_ADDRESSES) {
if (meta.readVInt() != fieldNumber) {
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene49DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt", meta);
}
NumericEntry ordIndex = readNumericEntry(meta);
ordIndexes.put(info.name, ordIndex);
@@ -249,7 +249,7 @@ class Lucene49DocValuesProducer extends DocValuesProducer implements Closeable {
throw new AssertionError();
}
} else {
throw new CorruptIndexException("invalid type: " + type + ", resource=" + meta);
throw new CorruptIndexException("invalid type: " + type, meta);
}
fieldNumber = meta.readVInt();
}
@@ -271,7 +271,7 @@ class Lucene49DocValuesProducer extends DocValuesProducer implements Closeable {
case TABLE_COMPRESSED:
final int uniqueValues = meta.readVInt();
if (uniqueValues > 256) {
throw new CorruptIndexException("TABLE_COMPRESSED cannot have more than 256 distinct values, input=" + meta);
throw new CorruptIndexException("TABLE_COMPRESSED cannot have more than 256 distinct values, got=" + uniqueValues, meta);
}
entry.table = new long[uniqueValues];
for (int i = 0; i < uniqueValues; ++i) {
@@ -288,7 +288,7 @@ class Lucene49DocValuesProducer extends DocValuesProducer implements Closeable {
entry.blockSize = meta.readVInt();
break;
default:
throw new CorruptIndexException("Unknown format: " + entry.format + ", input=" + meta);
throw new CorruptIndexException("Unknown format: " + entry.format, meta);
}
entry.endOffset = meta.readLong();
return entry;
@@ -317,7 +317,7 @@ class Lucene49DocValuesProducer extends DocValuesProducer implements Closeable {
entry.blockSize = meta.readVInt();
break;
default:
throw new CorruptIndexException("Unknown format: " + entry.format + ", input=" + meta);
throw new CorruptIndexException("Unknown format: " + entry.format, meta);
}
return entry;
}
@@ -326,7 +326,7 @@ class Lucene49DocValuesProducer extends DocValuesProducer implements Closeable {
SortedSetEntry entry = new SortedSetEntry();
entry.format = meta.readVInt();
if (entry.format != SORTED_SINGLE_VALUED && entry.format != SORTED_WITH_ADDRESSES) {
throw new CorruptIndexException("Unknown format: " + entry.format + ", input=" + meta);
throw new CorruptIndexException("Unknown format: " + entry.format, meta);
}
return entry;
}


@@ -186,7 +186,7 @@ public final class Lucene40PostingsWriter extends PushPostingsWriterBase {
final int delta = docID - lastDocID;
if (docID < 0 || (df > 0 && delta <= 0)) {
throw new CorruptIndexException("docs out of order (" + docID + " <= " + lastDocID + " ) (freqOut: " + freqOut + ")");
throw new CorruptIndexException("docs out of order (" + docID + " <= " + lastDocID + " )", freqOut.toString());
}
if ((++df % skipInterval) == 0) {


@@ -145,7 +145,7 @@ public class BlockTermsReader extends FieldsProducer {
final int numFields = in.readVInt();
if (numFields < 0) {
throw new CorruptIndexException("invalid number of fields: " + numFields + " (resource=" + in + ")");
throw new CorruptIndexException("invalid number of fields: " + numFields, in);
}
for(int i=0;i<numFields;i++) {
final int field = in.readVInt();
@@ -158,17 +158,17 @@ public class BlockTermsReader extends FieldsProducer {
final int docCount = in.readVInt();
final int longsSize = version >= BlockTermsWriter.VERSION_META_ARRAY ? in.readVInt() : 0;
if (docCount < 0 || docCount > info.getDocCount()) { // #docs with field must be <= #docs
throw new CorruptIndexException("invalid docCount: " + docCount + " maxDoc: " + info.getDocCount() + " (resource=" + in + ")");
throw new CorruptIndexException("invalid docCount: " + docCount + " maxDoc: " + info.getDocCount(), in);
}
if (sumDocFreq < docCount) { // #postings must be >= #docs with field
throw new CorruptIndexException("invalid sumDocFreq: " + sumDocFreq + " docCount: " + docCount + " (resource=" + in + ")");
throw new CorruptIndexException("invalid sumDocFreq: " + sumDocFreq + " docCount: " + docCount, in);
}
if (sumTotalTermFreq != -1 && sumTotalTermFreq < sumDocFreq) { // #positions must be >= #postings
throw new CorruptIndexException("invalid sumTotalTermFreq: " + sumTotalTermFreq + " sumDocFreq: " + sumDocFreq + " (resource=" + in + ")");
throw new CorruptIndexException("invalid sumTotalTermFreq: " + sumTotalTermFreq + " sumDocFreq: " + sumDocFreq, in);
}
FieldReader previous = fields.put(fieldInfo.name, new FieldReader(fieldInfo, numTerms, termsStartPointer, sumTotalTermFreq, sumDocFreq, docCount, longsSize));
if (previous != null) {
throw new CorruptIndexException("duplicate fields: " + fieldInfo.name + " (resource=" + in + ")");
throw new CorruptIndexException("duplicate fields: " + fieldInfo.name, in);
}
}
success = true;


@@ -92,7 +92,7 @@ public class FixedGapTermsIndexReader extends TermsIndexReaderBase {
indexInterval = in.readVInt();
if (indexInterval < 1) {
throw new CorruptIndexException("invalid indexInterval: " + indexInterval + " (resource=" + in + ")");
throw new CorruptIndexException("invalid indexInterval: " + indexInterval, in);
}
packedIntsVersion = in.readVInt();
blocksize = in.readVInt();
@@ -102,26 +102,26 @@ public class FixedGapTermsIndexReader extends TermsIndexReaderBase {
// Read directory
final int numFields = in.readVInt();
if (numFields < 0) {
throw new CorruptIndexException("invalid numFields: " + numFields + " (resource=" + in + ")");
throw new CorruptIndexException("invalid numFields: " + numFields, in);
}
//System.out.println("FGR: init seg=" + segment + " div=" + indexDivisor + " nF=" + numFields);
for(int i=0;i<numFields;i++) {
final int field = in.readVInt();
final long numIndexTerms = in.readVInt(); // TODO: change this to a vLong if we fix writer to support > 2B index terms
if (numIndexTerms < 0) {
throw new CorruptIndexException("invalid numIndexTerms: " + numIndexTerms + " (resource=" + in + ")");
throw new CorruptIndexException("invalid numIndexTerms: " + numIndexTerms, in);
}
final long termsStart = in.readVLong();
final long indexStart = in.readVLong();
final long packedIndexStart = in.readVLong();
final long packedOffsetsStart = in.readVLong();
if (packedIndexStart < indexStart) {
throw new CorruptIndexException("invalid packedIndexStart: " + packedIndexStart + " indexStart: " + indexStart + "numIndexTerms: " + numIndexTerms + " (resource=" + in + ")");
throw new CorruptIndexException("invalid packedIndexStart: " + packedIndexStart + " indexStart: " + indexStart + " numIndexTerms: " + numIndexTerms, in);
}
final FieldInfo fieldInfo = fieldInfos.fieldInfo(field);
FieldIndexData previous = fields.put(fieldInfo.name, new FieldIndexData(in, termBytes, indexStart, termsStart, packedIndexStart, packedOffsetsStart, numIndexTerms));
if (previous != null) {
throw new CorruptIndexException("duplicate field: " + fieldInfo.name + " (resource=" + in + ")");
throw new CorruptIndexException("duplicate field: " + fieldInfo.name, in);
}
}
success = true;

@ -71,7 +71,7 @@ public class VariableGapTermsIndexReader extends TermsIndexReaderBase {
// Read directory
final int numFields = in.readVInt();
if (numFields < 0) {
throw new CorruptIndexException("invalid numFields: " + numFields + " (resource=" + in + ")");
throw new CorruptIndexException("invalid numFields: " + numFields, in);
}
for(int i=0;i<numFields;i++) {
@ -80,7 +80,7 @@ public class VariableGapTermsIndexReader extends TermsIndexReaderBase {
final FieldInfo fieldInfo = fieldInfos.fieldInfo(field);
FieldIndexData previous = fields.put(fieldInfo.name, new FieldIndexData(in, fieldInfo, indexStart));
if (previous != null) {
throw new CorruptIndexException("duplicate field: " + fieldInfo.name + " (resource=" + in + ")");
throw new CorruptIndexException("duplicate field: " + fieldInfo.name, in);
}
}
success = true;

@ -99,7 +99,7 @@ public final class OrdsBlockTreeTermsReader extends FieldsProducer {
OrdsBlockTreeTermsWriter.VERSION_START,
OrdsBlockTreeTermsWriter.VERSION_CURRENT);
if (indexVersion != version) {
throw new CorruptIndexException("mixmatched version files: " + in + "=" + version + "," + indexIn + "=" + indexVersion);
throw new CorruptIndexException("mismatched version files: " + in + "=" + version + "," + indexIn + "=" + indexVersion, indexIn);
}
// verify
@ -121,7 +121,7 @@ public final class OrdsBlockTreeTermsReader extends FieldsProducer {
final int numFields = in.readVInt();
if (numFields < 0) {
throw new CorruptIndexException("invalid numFields: " + numFields + " (resource=" + in + ")");
throw new CorruptIndexException("invalid numFields: " + numFields, in);
}
for(int i=0;i<numFields;i++) {
@ -146,20 +146,20 @@ public final class OrdsBlockTreeTermsReader extends FieldsProducer {
BytesRef minTerm = readBytesRef(in);
BytesRef maxTerm = readBytesRef(in);
if (docCount < 0 || docCount > info.getDocCount()) { // #docs with field must be <= #docs
throw new CorruptIndexException("invalid docCount: " + docCount + " maxDoc: " + info.getDocCount() + " (resource=" + in + ")");
throw new CorruptIndexException("invalid docCount: " + docCount + " maxDoc: " + info.getDocCount(), in);
}
if (sumDocFreq < docCount) { // #postings must be >= #docs with field
throw new CorruptIndexException("invalid sumDocFreq: " + sumDocFreq + " docCount: " + docCount + " (resource=" + in + ")");
throw new CorruptIndexException("invalid sumDocFreq: " + sumDocFreq + " docCount: " + docCount, in);
}
if (sumTotalTermFreq != -1 && sumTotalTermFreq < sumDocFreq) { // #positions must be >= #postings
throw new CorruptIndexException("invalid sumTotalTermFreq: " + sumTotalTermFreq + " sumDocFreq: " + sumDocFreq + " (resource=" + in + ")");
throw new CorruptIndexException("invalid sumTotalTermFreq: " + sumTotalTermFreq + " sumDocFreq: " + sumDocFreq, in);
}
final long indexStartFP = indexIn.readVLong();
OrdsFieldReader previous = fields.put(fieldInfo.name,
new OrdsFieldReader(this, fieldInfo, numTerms, rootCode, sumTotalTermFreq, sumDocFreq, docCount,
indexStartFP, longsSize, indexIn, minTerm, maxTerm));
if (previous != null) {
throw new CorruptIndexException("duplicate field: " + fieldInfo.name + " (resource=" + in + ")");
throw new CorruptIndexException("duplicate field: " + fieldInfo.name, in);
}
}
indexIn.close();

@ -118,7 +118,7 @@ class DirectDocValuesProducer extends DocValuesProducer {
VERSION_START,
VERSION_CURRENT);
if (version != version2) {
throw new CorruptIndexException("Format versions mismatch");
throw new CorruptIndexException("Format versions mismatch: meta=" + version + ", data=" + version2, data);
}
// NOTE: data file is too costly to verify checksum against all the bytes on open,
@ -221,7 +221,7 @@ class DirectDocValuesProducer extends DocValuesProducer {
SortedNumericEntry entry = readSortedNumericEntry(meta, true);
sortedNumerics.put(info.name, entry);
} else {
throw new CorruptIndexException("invalid entry type: " + fieldType + ", field= " + info.name + ", input=" + meta);
throw new CorruptIndexException("invalid entry type: " + fieldType + ", field=" + info.name, meta);
}
fieldNumber = meta.readVInt();
}

@ -145,18 +145,18 @@ public class FSTOrdTermsReader extends FieldsProducer {
private void checkFieldSummary(SegmentInfo info, IndexInput indexIn, IndexInput blockIn, TermsReader field, TermsReader previous) throws IOException {
// #docs with field must be <= #docs
if (field.docCount < 0 || field.docCount > info.getDocCount()) {
throw new CorruptIndexException("invalid docCount: " + field.docCount + " maxDoc: " + info.getDocCount() + " (resource=" + indexIn + ", " + blockIn + ")");
throw new CorruptIndexException("invalid docCount: " + field.docCount + " maxDoc: " + info.getDocCount() + " (blockIn=" + blockIn + ")", indexIn);
}
// #postings must be >= #docs with field
if (field.sumDocFreq < field.docCount) {
throw new CorruptIndexException("invalid sumDocFreq: " + field.sumDocFreq + " docCount: " + field.docCount + " (resource=" + indexIn + ", " + blockIn + ")");
throw new CorruptIndexException("invalid sumDocFreq: " + field.sumDocFreq + " docCount: " + field.docCount + " (blockIn=" + blockIn + ")", indexIn);
}
// #positions must be >= #postings
if (field.sumTotalTermFreq != -1 && field.sumTotalTermFreq < field.sumDocFreq) {
throw new CorruptIndexException("invalid sumTotalTermFreq: " + field.sumTotalTermFreq + " sumDocFreq: " + field.sumDocFreq + " (resource=" + indexIn + ", " + blockIn + ")");
throw new CorruptIndexException("invalid sumTotalTermFreq: " + field.sumTotalTermFreq + " sumDocFreq: " + field.sumDocFreq + " (blockIn=" + blockIn + ")", indexIn);
}
if (previous != null) {
throw new CorruptIndexException("duplicate fields: " + field.fieldInfo.name + " (resource=" + indexIn + ", " + blockIn + ")");
throw new CorruptIndexException("duplicate fields: " + field.fieldInfo.name + " (blockIn=" + blockIn + ")", indexIn);
}
}

@ -129,18 +129,18 @@ public class FSTTermsReader extends FieldsProducer {
private void checkFieldSummary(SegmentInfo info, IndexInput in, TermsReader field, TermsReader previous) throws IOException {
// #docs with field must be <= #docs
if (field.docCount < 0 || field.docCount > info.getDocCount()) {
throw new CorruptIndexException("invalid docCount: " + field.docCount + " maxDoc: " + info.getDocCount() + " (resource=" + in + ")");
throw new CorruptIndexException("invalid docCount: " + field.docCount + " maxDoc: " + info.getDocCount(), in);
}
// #postings must be >= #docs with field
if (field.sumDocFreq < field.docCount) {
throw new CorruptIndexException("invalid sumDocFreq: " + field.sumDocFreq + " docCount: " + field.docCount + " (resource=" + in + ")");
throw new CorruptIndexException("invalid sumDocFreq: " + field.sumDocFreq + " docCount: " + field.docCount, in);
}
// #positions must be >= #postings
if (field.sumTotalTermFreq != -1 && field.sumTotalTermFreq < field.sumDocFreq) {
throw new CorruptIndexException("invalid sumTotalTermFreq: " + field.sumTotalTermFreq + " sumDocFreq: " + field.sumDocFreq + " (resource=" + in + ")");
throw new CorruptIndexException("invalid sumTotalTermFreq: " + field.sumTotalTermFreq + " sumDocFreq: " + field.sumDocFreq, in);
}
if (previous != null) {
throw new CorruptIndexException("duplicate fields: " + field.fieldInfo.name + " (resource=" + in + ")");
throw new CorruptIndexException("duplicate fields: " + field.fieldInfo.name, in);
}
}

@ -141,7 +141,7 @@ class MemoryDocValuesProducer extends DocValuesProducer {
VERSION_START,
VERSION_CURRENT);
if (version != version2) {
throw new CorruptIndexException("Format versions mismatch");
throw new CorruptIndexException("Format versions mismatch: meta=" + version + ", data=" + version2, data);
}
// NOTE: data file is too costly to verify checksum against all the bytes on open,
@ -175,7 +175,7 @@ class MemoryDocValuesProducer extends DocValuesProducer {
case GCD_COMPRESSED:
break;
default:
throw new CorruptIndexException("Unknown format: " + entry.format + ", input=" + meta);
throw new CorruptIndexException("Unknown format: " + entry.format, meta);
}
entry.packedIntsVersion = meta.readVInt();
entry.count = meta.readLong();
@ -215,7 +215,7 @@ class MemoryDocValuesProducer extends DocValuesProducer {
numEntries++;
FieldInfo info = infos.fieldInfo(fieldNumber);
if (info == null) {
throw new CorruptIndexException("invalid field number: " + fieldNumber + " (resource=" + meta + ")");
throw new CorruptIndexException("invalid field number: " + fieldNumber, meta);
}
int fieldType = meta.readByte();
if (fieldType == NUMBER) {
@ -245,7 +245,7 @@ class MemoryDocValuesProducer extends DocValuesProducer {
entry.singleton = true;
sortedNumerics.put(info.name, entry);
} else {
throw new CorruptIndexException("invalid entry type: " + fieldType + ", fieldName=" + info.name + ", input=" + meta);
throw new CorruptIndexException("invalid entry type: " + fieldType + ", fieldName=" + info.name, meta);
}
fieldNumber = meta.readVInt();
}
@ -295,7 +295,7 @@ class MemoryDocValuesProducer extends DocValuesProducer {
case TABLE_COMPRESSED:
int size = data.readVInt();
if (size > 256) {
throw new CorruptIndexException("TABLE_COMPRESSED cannot have more than 256 distinct values, input=" + data);
throw new CorruptIndexException("TABLE_COMPRESSED cannot have more than 256 distinct values, got=" + size, data);
}
final long decode[] = new long[size];
for (int i = 0; i < decode.length; i++) {

@ -178,7 +178,7 @@ class SimpleTextDocValuesReader extends DocValuesProducer {
try {
bd = (BigDecimal) decoder.parse(scratch.get().utf8ToString());
} catch (ParseException pe) {
throw new CorruptIndexException("failed to parse BigDecimal value (resource=" + in + ")", pe);
throw new CorruptIndexException("failed to parse BigDecimal value", in, pe);
}
SimpleTextUtil.readLine(in, scratch); // read the line telling us if its real or not
return BigInteger.valueOf(field.minValue).add(bd.toBigIntegerExact()).longValue();
@ -241,7 +241,7 @@ class SimpleTextDocValuesReader extends DocValuesProducer {
try {
len = decoder.parse(new String(scratch.bytes(), LENGTH.length, scratch.length() - LENGTH.length, StandardCharsets.UTF_8)).intValue();
} catch (ParseException pe) {
throw new CorruptIndexException("failed to parse int length (resource=" + in + ")", pe);
throw new CorruptIndexException("failed to parse int length", in, pe);
}
term.grow(len);
term.setLength(len);
@ -271,7 +271,7 @@ class SimpleTextDocValuesReader extends DocValuesProducer {
try {
len = decoder.parse(new String(scratch.bytes(), LENGTH.length, scratch.length() - LENGTH.length, StandardCharsets.UTF_8)).intValue();
} catch (ParseException pe) {
throw new CorruptIndexException("failed to parse int length (resource=" + in + ")", pe);
throw new CorruptIndexException("failed to parse int length", in, pe);
}
// skip past bytes
byte bytes[] = new byte[len];
@ -318,7 +318,7 @@ class SimpleTextDocValuesReader extends DocValuesProducer {
try {
return (int) ordDecoder.parse(scratch.get().utf8ToString()).longValue()-1;
} catch (ParseException pe) {
throw new CorruptIndexException("failed to parse ord (resource=" + in + ")", pe);
throw new CorruptIndexException("failed to parse ord", in, pe);
}
} catch (IOException ioe) {
throw new RuntimeException(ioe);
@ -338,7 +338,7 @@ class SimpleTextDocValuesReader extends DocValuesProducer {
try {
len = decoder.parse(new String(scratch.bytes(), LENGTH.length, scratch.length() - LENGTH.length, StandardCharsets.UTF_8)).intValue();
} catch (ParseException pe) {
throw new CorruptIndexException("failed to parse int length (resource=" + in + ")", pe);
throw new CorruptIndexException("failed to parse int length", in, pe);
}
term.grow(len);
term.setLength(len);
@ -447,7 +447,7 @@ class SimpleTextDocValuesReader extends DocValuesProducer {
try {
len = decoder.parse(new String(scratch.bytes(), LENGTH.length, scratch.length() - LENGTH.length, StandardCharsets.UTF_8)).intValue();
} catch (ParseException pe) {
throw new CorruptIndexException("failed to parse int length (resource=" + in + ")", pe);
throw new CorruptIndexException("failed to parse int length", in, pe);
}
term.grow(len);
term.setLength(len);

@ -87,14 +87,14 @@ class SimpleTextUtil {
String expectedChecksum = String.format(Locale.ROOT, "%020d", input.getChecksum());
SimpleTextUtil.readLine(input, scratch);
if (StringHelper.startsWith(scratch.get(), CHECKSUM) == false) {
throw new CorruptIndexException("SimpleText failure: expected checksum line but got " + scratch.get().utf8ToString() + " (resource=" + input + ")");
throw new CorruptIndexException("SimpleText failure: expected checksum line but got " + scratch.get().utf8ToString(), input);
}
String actualChecksum = new BytesRef(scratch.bytes(), CHECKSUM.length, scratch.length() - CHECKSUM.length).utf8ToString();
if (!expectedChecksum.equals(actualChecksum)) {
throw new CorruptIndexException("SimpleText checksum failure: " + actualChecksum + " != " + expectedChecksum + " (resource=" + input + ")");
throw new CorruptIndexException("SimpleText checksum failure: " + actualChecksum + " != " + expectedChecksum, input);
}
if (input.length() != input.getFilePointer()) {
throw new CorruptIndexException("Unexpected stuff at the end of file, please be careful with your text editor! (resource=" + input + ")");
throw new CorruptIndexException("Unexpected stuff at the end of file, please be careful with your text editor!", input);
}
}
}
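The `%020d` format in the checksum check above zero-pads the checksum to a fixed 20-character decimal line, so SimpleText can compare the expected and actual values as equal-length strings. A small standalone illustration — the CRC32 input here is arbitrary, not something Lucene writes:

```java
import java.nio.charset.StandardCharsets;
import java.util.Locale;
import java.util.zip.CRC32;

public class ChecksumLineDemo {
    public static void main(String[] args) {
        CRC32 crc = new CRC32();
        crc.update("hello".getBytes(StandardCharsets.US_ASCII));
        // Zero-pad the checksum to a fixed-width 20-digit decimal line,
        // matching the String.format(Locale.ROOT, "%020d", ...) call above.
        String line = String.format(Locale.ROOT, "%020d", crc.getValue());
        System.out.println(line.length()); // prints: 20
        System.out.println(line);
    }
}
```

Because the width is fixed, a truncated or edited file fails the comparison rather than parsing a shorter number.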

@ -133,7 +133,7 @@ public final class CodecUtil {
// Safety to guard against reading a bogus string:
final int actualHeader = in.readInt();
if (actualHeader != CODEC_MAGIC) {
throw new CorruptIndexException("codec header mismatch: actual header=" + actualHeader + " vs expected header=" + CODEC_MAGIC + " (resource: " + in + ")");
throw new CorruptIndexException("codec header mismatch: actual header=" + actualHeader + " vs expected header=" + CODEC_MAGIC, in);
}
return checkHeaderNoMagic(in, codec, minVersion, maxVersion);
}
@ -145,7 +145,7 @@ public final class CodecUtil {
public static int checkHeaderNoMagic(DataInput in, String codec, int minVersion, int maxVersion) throws IOException {
final String actualCodec = in.readString();
if (!actualCodec.equals(codec)) {
throw new CorruptIndexException("codec mismatch: actual codec=" + actualCodec + " vs expected codec=" + codec + " (resource: " + in + ")");
throw new CorruptIndexException("codec mismatch: actual codec=" + actualCodec + " vs expected codec=" + codec, in);
}
final int actualVersion = in.readInt();
@ -209,11 +209,10 @@ public final class CodecUtil {
long expectedChecksum = in.readLong();
if (expectedChecksum != actualChecksum) {
throw new CorruptIndexException("checksum failed (hardware problem?) : expected=" + Long.toHexString(expectedChecksum) +
" actual=" + Long.toHexString(actualChecksum) +
" (resource=" + in + ")");
" actual=" + Long.toHexString(actualChecksum), in);
}
if (in.getFilePointer() != in.length()) {
throw new CorruptIndexException("did not read all bytes from file: read " + in.getFilePointer() + " vs size " + in.length() + " (resource: " + in + ")");
throw new CorruptIndexException("did not read all bytes from file: read " + in.getFilePointer() + " vs size " + in.length(), in);
}
return actualChecksum;
}
@ -232,12 +231,12 @@ public final class CodecUtil {
private static void validateFooter(IndexInput in) throws IOException {
final int magic = in.readInt();
if (magic != FOOTER_MAGIC) {
throw new CorruptIndexException("codec footer mismatch: actual footer=" + magic + " vs expected footer=" + FOOTER_MAGIC + " (resource: " + in + ")");
throw new CorruptIndexException("codec footer mismatch: actual footer=" + magic + " vs expected footer=" + FOOTER_MAGIC, in);
}
final int algorithmID = in.readInt();
if (algorithmID != 0) {
throw new CorruptIndexException("codec footer mismatch: unknown algorithmID: " + algorithmID);
throw new CorruptIndexException("codec footer mismatch: unknown algorithmID: " + algorithmID, in);
}
}
@ -249,7 +248,7 @@ public final class CodecUtil {
@Deprecated
public static void checkEOF(IndexInput in) throws IOException {
if (in.getFilePointer() != in.length()) {
throw new CorruptIndexException("did not read all bytes from file: read " + in.getFilePointer() + " vs size " + in.length() + " (resource: " + in + ")");
throw new CorruptIndexException("did not read all bytes from file: read " + in.getFilePointer() + " vs size " + in.length(), in);
}
}

@ -116,7 +116,7 @@ public final class BlockTreeTermsReader extends FieldsProducer {
ioContext);
int indexVersion = readIndexHeader(indexIn);
if (indexVersion != version) {
throw new CorruptIndexException("mixmatched version files: " + in + "=" + version + "," + indexIn + "=" + indexVersion);
throw new CorruptIndexException("mismatched version files: " + in + "=" + version + "," + indexIn + "=" + indexVersion, indexIn);
}
// verify
@ -142,32 +142,32 @@ public final class BlockTreeTermsReader extends FieldsProducer {
final int numFields = in.readVInt();
if (numFields < 0) {
throw new CorruptIndexException("invalid numFields: " + numFields + " (resource=" + in + ")");
throw new CorruptIndexException("invalid numFields: " + numFields, in);
}
for(int i=0;i<numFields;i++) {
final int field = in.readVInt();
final long numTerms = in.readVLong();
if (numTerms <= 0) {
throw new CorruptIndexException("Illegal numTerms for field number: " + field + " (resource=" + in + ")");
throw new CorruptIndexException("Illegal numTerms for field number: " + field, in);
}
final int numBytes = in.readVInt();
if (numBytes < 0) {
throw new CorruptIndexException("invalid rootCode for field number: " + field + ", numBytes=" + numBytes + " (resource=" + in + ")");
throw new CorruptIndexException("invalid rootCode for field number: " + field + ", numBytes=" + numBytes, in);
}
final BytesRef rootCode = new BytesRef(new byte[numBytes]);
in.readBytes(rootCode.bytes, 0, numBytes);
rootCode.length = numBytes;
final FieldInfo fieldInfo = fieldInfos.fieldInfo(field);
if (fieldInfo == null) {
throw new CorruptIndexException("invalid field number: " + field + ", resource=" + in + ")");
throw new CorruptIndexException("invalid field number: " + field, in);
}
final long sumTotalTermFreq = fieldInfo.getIndexOptions() == IndexOptions.DOCS_ONLY ? -1 : in.readVLong();
final long sumDocFreq = in.readVLong();
final int docCount = in.readVInt();
final int longsSize = version >= BlockTreeTermsWriter.VERSION_META_ARRAY ? in.readVInt() : 0;
if (longsSize < 0) {
throw new CorruptIndexException("invalid longsSize for field: " + fieldInfo.name + ", longsSize=" + longsSize + " (resource=" + in + ")");
throw new CorruptIndexException("invalid longsSize for field: " + fieldInfo.name + ", longsSize=" + longsSize, in);
}
BytesRef minTerm, maxTerm;
if (version >= BlockTreeTermsWriter.VERSION_MIN_MAX_TERMS) {
@ -177,20 +177,20 @@ public final class BlockTreeTermsReader extends FieldsProducer {
minTerm = maxTerm = null;
}
if (docCount < 0 || docCount > info.getDocCount()) { // #docs with field must be <= #docs
throw new CorruptIndexException("invalid docCount: " + docCount + " maxDoc: " + info.getDocCount() + " (resource=" + in + ")");
throw new CorruptIndexException("invalid docCount: " + docCount + " maxDoc: " + info.getDocCount(), in);
}
if (sumDocFreq < docCount) { // #postings must be >= #docs with field
throw new CorruptIndexException("invalid sumDocFreq: " + sumDocFreq + " docCount: " + docCount + " (resource=" + in + ")");
throw new CorruptIndexException("invalid sumDocFreq: " + sumDocFreq + " docCount: " + docCount, in);
}
if (sumTotalTermFreq != -1 && sumTotalTermFreq < sumDocFreq) { // #positions must be >= #postings
throw new CorruptIndexException("invalid sumTotalTermFreq: " + sumTotalTermFreq + " sumDocFreq: " + sumDocFreq + " (resource=" + in + ")");
throw new CorruptIndexException("invalid sumTotalTermFreq: " + sumTotalTermFreq + " sumDocFreq: " + sumDocFreq, in);
}
final long indexStartFP = indexIn.readVLong();
FieldReader previous = fields.put(fieldInfo.name,
new FieldReader(this, fieldInfo, numTerms, rootCode, sumTotalTermFreq, sumDocFreq, docCount,
indexStartFP, longsSize, indexIn, minTerm, maxTerm));
if (previous != null) {
throw new CorruptIndexException("duplicate field: " + fieldInfo.name + " (resource=" + in + ")");
throw new CorruptIndexException("duplicate field: " + fieldInfo.name, in);
}
}
indexIn.close();

@ -22,7 +22,6 @@ import static org.apache.lucene.util.BitUtil.zigZagDecode;
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import org.apache.lucene.index.CorruptIndexException;
@ -85,7 +84,7 @@ public final class CompressingStoredFieldsIndexReader implements Cloneable, Acco
avgChunkDocs[blockCount] = fieldsIndexIn.readVInt();
final int bitsPerDocBase = fieldsIndexIn.readVInt();
if (bitsPerDocBase > 32) {
throw new CorruptIndexException("Corrupted bitsPerDocBase (resource=" + fieldsIndexIn + ")");
throw new CorruptIndexException("Corrupted bitsPerDocBase: " + bitsPerDocBase, fieldsIndexIn);
}
docBasesDeltas[blockCount] = PackedInts.getReaderNoHeader(fieldsIndexIn, PackedInts.Format.PACKED, packedIntsVersion, numChunks, bitsPerDocBase);
@ -94,7 +93,7 @@ public final class CompressingStoredFieldsIndexReader implements Cloneable, Acco
avgChunkSizes[blockCount] = fieldsIndexIn.readVLong();
final int bitsPerStartPointer = fieldsIndexIn.readVInt();
if (bitsPerStartPointer > 64) {
throw new CorruptIndexException("Corrupted bitsPerStartPointer (resource=" + fieldsIndexIn + ")");
throw new CorruptIndexException("Corrupted bitsPerStartPointer: " + bitsPerStartPointer, fieldsIndexIn);
}
startPointersDeltas[blockCount] = PackedInts.getReaderNoHeader(fieldsIndexIn, PackedInts.Format.PACKED, packedIntsVersion, numChunks, bitsPerStartPointer);

@ -135,7 +135,7 @@ public final class CompressingStoredFieldsReader extends StoredFieldsReader {
fieldsStream = d.openInput(fieldsStreamFN, context);
if (version >= VERSION_CHECKSUM) {
if (maxPointer + CodecUtil.footerLength() != fieldsStream.length()) {
throw new CorruptIndexException("Invalid fieldsStream maxPointer (file truncated?): maxPointer=" + maxPointer + ", length=" + fieldsStream.length());
throw new CorruptIndexException("Invalid fieldsStream maxPointer (file truncated?): maxPointer=" + maxPointer + ", length=" + fieldsStream.length(), fieldsStream);
}
} else {
maxPointer = fieldsStream.length();
@ -144,7 +144,7 @@ public final class CompressingStoredFieldsReader extends StoredFieldsReader {
final String codecNameDat = formatName + CODEC_SFX_DAT;
final int fieldsVersion = CodecUtil.checkHeader(fieldsStream, codecNameDat, VERSION_START, VERSION_CURRENT);
if (version != fieldsVersion) {
throw new CorruptIndexException("Version mismatch between stored fields index and data: " + version + " != " + fieldsVersion);
throw new CorruptIndexException("Version mismatch between stored fields index and data: " + version + " != " + fieldsVersion, fieldsStream);
}
assert CodecUtil.headerLength(codecNameDat) == fieldsStream.getFilePointer();
@ -256,7 +256,7 @@ public final class CompressingStoredFieldsReader extends StoredFieldsReader {
|| docBase + chunkDocs > numDocs) {
throw new CorruptIndexException("Corrupted: docID=" + docID
+ ", docBase=" + docBase + ", chunkDocs=" + chunkDocs
+ ", numDocs=" + numDocs + " (resource=" + fieldsStream + ")");
+ ", numDocs=" + numDocs, fieldsStream);
}
final int numStoredFields, offset, length, totalLength;
@ -270,7 +270,7 @@ public final class CompressingStoredFieldsReader extends StoredFieldsReader {
if (bitsPerStoredFields == 0) {
numStoredFields = fieldsStream.readVInt();
} else if (bitsPerStoredFields > 31) {
throw new CorruptIndexException("bitsPerStoredFields=" + bitsPerStoredFields + " (resource=" + fieldsStream + ")");
throw new CorruptIndexException("bitsPerStoredFields=" + bitsPerStoredFields, fieldsStream);
} else {
final long filePointer = fieldsStream.getFilePointer();
final PackedInts.Reader reader = PackedInts.getDirectReaderNoHeader(fieldsStream, PackedInts.Format.PACKED, packedIntsVersion, chunkDocs, bitsPerStoredFields);
@ -284,7 +284,7 @@ public final class CompressingStoredFieldsReader extends StoredFieldsReader {
offset = (docID - docBase) * length;
totalLength = chunkDocs * length;
} else if (bitsPerLength > 31) {
throw new CorruptIndexException("bitsPerLength=" + bitsPerLength + " (resource=" + fieldsStream + ")");
throw new CorruptIndexException("bitsPerLength=" + bitsPerLength, fieldsStream);
} else {
final PackedInts.ReaderIterator it = PackedInts.getReaderIteratorNoHeader(fieldsStream, PackedInts.Format.PACKED, packedIntsVersion, chunkDocs, bitsPerLength, 1);
int off = 0;
@ -302,7 +302,7 @@ public final class CompressingStoredFieldsReader extends StoredFieldsReader {
}
if ((length == 0) != (numStoredFields == 0)) {
throw new CorruptIndexException("length=" + length + ", numStoredFields=" + numStoredFields + " (resource=" + fieldsStream + ")");
throw new CorruptIndexException("length=" + length + ", numStoredFields=" + numStoredFields, fieldsStream);
}
if (numStoredFields == 0) {
// nothing to do
@ -450,7 +450,7 @@ public final class CompressingStoredFieldsReader extends StoredFieldsReader {
|| docBase + chunkDocs > numDocs) {
throw new CorruptIndexException("Corrupted: current docBase=" + this.docBase
+ ", current numDocs=" + this.chunkDocs + ", new docBase=" + docBase
+ ", new numDocs=" + chunkDocs + " (resource=" + fieldsStream + ")");
+ ", new numDocs=" + chunkDocs, fieldsStream);
}
this.docBase = docBase;
this.chunkDocs = chunkDocs;
@ -469,7 +469,7 @@ public final class CompressingStoredFieldsReader extends StoredFieldsReader {
if (bitsPerStoredFields == 0) {
Arrays.fill(numStoredFields, 0, chunkDocs, fieldsStream.readVInt());
} else if (bitsPerStoredFields > 31) {
throw new CorruptIndexException("bitsPerStoredFields=" + bitsPerStoredFields + " (resource=" + fieldsStream + ")");
throw new CorruptIndexException("bitsPerStoredFields=" + bitsPerStoredFields, fieldsStream);
} else {
final PackedInts.ReaderIterator it = PackedInts.getReaderIteratorNoHeader(fieldsStream, PackedInts.Format.PACKED, packedIntsVersion, chunkDocs, bitsPerStoredFields, 1);
for (int i = 0; i < chunkDocs; ++i) {
@ -481,7 +481,7 @@ public final class CompressingStoredFieldsReader extends StoredFieldsReader {
if (bitsPerLength == 0) {
Arrays.fill(lengths, 0, chunkDocs, fieldsStream.readVInt());
} else if (bitsPerLength > 31) {
throw new CorruptIndexException("bitsPerLength=" + bitsPerLength);
throw new CorruptIndexException("bitsPerLength=" + bitsPerLength, fieldsStream);
} else {
final PackedInts.ReaderIterator it = PackedInts.getReaderIteratorNoHeader(fieldsStream, PackedInts.Format.PACKED, packedIntsVersion, chunkDocs, bitsPerLength, 1);
for (int i = 0; i < chunkDocs; ++i) {
@ -511,7 +511,7 @@ public final class CompressingStoredFieldsReader extends StoredFieldsReader {
decompressor.decompress(fieldsStream, chunkSize, 0, chunkSize, bytes);
}
if (bytes.length != chunkSize) {
throw new CorruptIndexException("Corrupted: expected chunk size = " + chunkSize() + ", got " + bytes.length + " (resource=" + fieldsStream + ")");
throw new CorruptIndexException("Corrupted: expected chunk size = " + chunkSize() + ", got " + bytes.length, fieldsStream);
}
}

@ -382,7 +382,7 @@ public final class CompressingStoredFieldsWriter extends StoredFieldsWriter {
// decompress
it.decompress();
if (startOffsets[it.chunkDocs - 1] + it.lengths[it.chunkDocs - 1] != it.bytes.length) {
throw new CorruptIndexException("Corrupted: expected chunk size=" + startOffsets[it.chunkDocs - 1] + it.lengths[it.chunkDocs - 1] + ", got " + it.bytes.length);
throw new CorruptIndexException("Corrupted: expected chunk size=" + (startOffsets[it.chunkDocs - 1] + it.lengths[it.chunkDocs - 1]) + ", got " + it.bytes.length, it.fieldsStream);
}
// copy non-deleted docs
for (; docID < it.docBase + it.chunkDocs; docID = nextLiveDoc(docID + 1, liveDocs, maxDoc)) {

@ -131,7 +131,7 @@ public final class CompressingTermVectorsReader extends TermVectorsReader implem
final String codecNameDat = formatName + CODEC_SFX_DAT;
int version2 = CodecUtil.checkHeader(vectorsStream, codecNameDat, VERSION_START, VERSION_CURRENT);
if (version != version2) {
throw new CorruptIndexException("Version mismatch between stored fields index and data: " + version + " != " + version2);
throw new CorruptIndexException("Version mismatch between stored fields index and data: " + version + " != " + version2, vectorsStream);
}
assert CodecUtil.headerLength(codecNameDat) == vectorsStream.getFilePointer();
@ -220,7 +220,7 @@ public final class CompressingTermVectorsReader extends TermVectorsReader implem
final int docBase = vectorsStream.readVInt();
final int chunkDocs = vectorsStream.readVInt();
if (doc < docBase || doc >= docBase + chunkDocs || docBase + chunkDocs > numDocs) {
throw new CorruptIndexException("docBase=" + docBase + ",chunkDocs=" + chunkDocs + ",doc=" + doc + " (resource=" + vectorsStream + ")");
throw new CorruptIndexException("docBase=" + docBase + ",chunkDocs=" + chunkDocs + ",doc=" + doc, vectorsStream);
}
final int skip; // number of fields to skip

@ -134,7 +134,7 @@ public abstract class CompressionMode {
}
final int decompressedLength = LZ4.decompress(in, offset + length, bytes.bytes, 0);
if (decompressedLength > originalLength) {
throw new CorruptIndexException("Corrupted: lengths mismatch: " + decompressedLength + " > " + originalLength + " (resource=" + in + ")");
throw new CorruptIndexException("Corrupted: lengths mismatch: " + decompressedLength + " > " + originalLength, in);
}
bytes.offset = offset;
bytes.length = length;
@ -222,7 +222,7 @@ public abstract class CompressionMode {
}
}
if (bytes.length != originalLength) {
throw new CorruptIndexException("Lengths mismatch: " + bytes.length + " != " + originalLength + " (resource=" + in + ")");
throw new CorruptIndexException("Lengths mismatch: " + bytes.length + " != " + originalLength, in);
}
bytes.offset = offset;
bytes.length = length;

@ -29,7 +29,6 @@ import org.apache.lucene.store.IOContext;
import org.apache.lucene.store.IndexInput;
import org.apache.lucene.store.IndexOutput;
import org.apache.lucene.util.BitUtil;
import org.apache.lucene.util.IOUtils;
import org.apache.lucene.util.MutableBits;
/** Optimized implementation of a vector of bits. This is more-or-less like
@ -216,8 +215,7 @@ final class BitVector implements Cloneable, MutableBits {
#BitVector(Directory, String, IOContext)}. */
public final void write(Directory d, String name, IOContext context) throws IOException {
assert !(d instanceof CompoundFileDirectory);
IndexOutput output = d.createOutput(name, context);
try {
try (IndexOutput output = d.createOutput(name, context)) {
output.writeInt(-2);
CodecUtil.writeHeader(output, CODEC, VERSION_CURRENT);
if (isSparse()) {
@ -228,8 +226,6 @@ final class BitVector implements Cloneable, MutableBits {
}
CodecUtil.writeFooter(output);
assert verifyCount();
} finally {
IOUtils.close(output);
}
}
@ -330,9 +326,7 @@ final class BitVector implements Cloneable, MutableBits {
<code>d</code>, as written by the {@link #write} method.
*/
public BitVector(Directory d, String name, IOContext context) throws IOException {
ChecksumIndexInput input = d.openChecksumInput(name, context);
try {
try (ChecksumIndexInput input = d.openChecksumInput(name, context)) {
final int firstInt = input.readInt();
if (firstInt == -2) {
@ -363,8 +357,6 @@ final class BitVector implements Cloneable, MutableBits {
CodecUtil.checkEOF(input);
}
assert verifyCount();
} finally {
input.close();
}
}
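The BitVector changes above replace the explicit try/finally + `IOUtils.close` idiom with try-with-resources. A minimal standalone sketch of that pattern (`SketchOutput` is a hypothetical stand-in for Lucene's `IndexOutput`, not Lucene code):

```java
import java.io.Closeable;

// Sketch of the refactor above: try-with-resources closes the resource
// whether the body succeeds or throws, removing the manual finally block.
public class TryWithResourcesSketch {
    // Hypothetical stand-in for Lucene's IndexOutput.
    static class SketchOutput implements Closeable {
        boolean closed = false;
        void writeInt(int i) { /* pretend to write bytes */ }
        @Override public void close() { closed = true; }
    }

    static SketchOutput write() {
        SketchOutput output = new SketchOutput();
        try (SketchOutput o = output) {
            o.writeInt(-2);   // mirrors BitVector.write's first call
        }                     // o.close() runs here, on success or failure
        return output;
    }

    public static void main(String[] args) {
        System.out.println(write().closed);  // prints: true
    }
}
```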


@ -90,10 +90,10 @@ public class Lucene40LiveDocsFormat extends LiveDocsFormat {
String filename = IndexFileNames.fileNameFromGeneration(info.info.name, DELETES_EXTENSION, info.getDelGen());
final BitVector liveDocs = new BitVector(dir, filename, context);
if (liveDocs.length() != info.info.getDocCount()) {
throw new CorruptIndexException("liveDocs.length()=" + liveDocs.length() + "info.docCount=" + info.info.getDocCount() + " (filename=" + filename + ")");
throw new CorruptIndexException("liveDocs.length()=" + liveDocs.length() + " info.docCount=" + info.info.getDocCount(), filename);
}
if (liveDocs.count() != info.info.getDocCount() - info.getDelCount()) {
throw new CorruptIndexException("liveDocs.count()=" + liveDocs.count() + " info.docCount=" + info.info.getDocCount() + " info.getDelCount()=" + info.getDelCount() + " (filename=" + filename + ")");
throw new CorruptIndexException("liveDocs.count()=" + liveDocs.count() + " info.docCount=" + info.info.getDocCount() + " info.getDelCount()=" + info.getDelCount(), filename);
}
return liveDocs;
}


@ -281,7 +281,7 @@ public final class Lucene41PostingsWriter extends PushPostingsWriterBase {
final int docDelta = docID - lastDocID;
if (docID < 0 || (docCount > 0 && docDelta <= 0)) {
throw new CorruptIndexException("docs out of order (" + docID + " <= " + lastDocID + " ) (docOut: " + docOut + ")");
throw new CorruptIndexException("docs out of order (" + docID + " <= " + lastDocID + ")", docOut.toString());
}
docDeltaBuffer[docBufferUpto] = docDelta;


@ -96,11 +96,10 @@ class Lucene410DocValuesProducer extends DocValuesProducer implements Closeable
/** expert: instantiates a new reader */
Lucene410DocValuesProducer(SegmentReadState state, String dataCodec, String dataExtension, String metaCodec, String metaExtension) throws IOException {
String metaName = IndexFileNames.segmentFileName(state.segmentInfo.name, state.segmentSuffix, metaExtension);
// read in the entries from the metadata file.
ChecksumIndexInput in = state.directory.openChecksumInput(metaName, state.context);
this.maxDoc = state.segmentInfo.getDocCount();
boolean success = false;
try {
// read in the entries from the metadata file.
try (ChecksumIndexInput in = state.directory.openChecksumInput(metaName, state.context)) {
version = CodecUtil.checkHeader(in, metaCodec,
Lucene410DocValuesFormat.VERSION_START,
Lucene410DocValuesFormat.VERSION_CURRENT);
@ -113,24 +112,17 @@ class Lucene410DocValuesProducer extends DocValuesProducer implements Closeable
numFields = readFields(in, state.fieldInfos);
CodecUtil.checkFooter(in);
success = true;
} finally {
if (success) {
IOUtils.close(in);
} else {
IOUtils.closeWhileHandlingException(in);
}
}
String dataName = IndexFileNames.segmentFileName(state.segmentInfo.name, state.segmentSuffix, dataExtension);
this.data = state.directory.openInput(dataName, state.context);
success = false;
boolean success = false;
try {
final int version2 = CodecUtil.checkHeader(data, dataCodec,
Lucene410DocValuesFormat.VERSION_START,
Lucene410DocValuesFormat.VERSION_CURRENT);
if (version != version2) {
throw new CorruptIndexException("Format versions mismatch");
throw new CorruptIndexException("Format versions mismatch: meta=" + version + ", data=" + version2, data);
}
// NOTE: data file is too costly to verify checksum against all the bytes on open,
@ -152,19 +144,19 @@ class Lucene410DocValuesProducer extends DocValuesProducer implements Closeable
private void readSortedField(FieldInfo info, IndexInput meta) throws IOException {
// sorted = binary + numeric
if (meta.readVInt() != info.number) {
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene410DocValuesFormat.BINARY) {
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt", meta);
}
BinaryEntry b = readBinaryEntry(meta);
binaries.put(info.name, b);
if (meta.readVInt() != info.number) {
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene410DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sorted entry for field: " + info.name + " is corrupt", meta);
}
NumericEntry n = readNumericEntry(meta);
ords.put(info.name, n);
@ -173,28 +165,28 @@ class Lucene410DocValuesProducer extends DocValuesProducer implements Closeable
private void readSortedSetFieldWithAddresses(FieldInfo info, IndexInput meta) throws IOException {
// sortedset = binary + numeric (addresses) + ordIndex
if (meta.readVInt() != info.number) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene410DocValuesFormat.BINARY) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
BinaryEntry b = readBinaryEntry(meta);
binaries.put(info.name, b);
if (meta.readVInt() != info.number) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene410DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
NumericEntry n1 = readNumericEntry(meta);
ords.put(info.name, n1);
if (meta.readVInt() != info.number) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene410DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
NumericEntry n2 = readNumericEntry(meta);
ordIndexes.put(info.name, n2);
@ -208,7 +200,7 @@ class Lucene410DocValuesProducer extends DocValuesProducer implements Closeable
FieldInfo info = infos.fieldInfo(fieldNumber);
if (info == null) {
// trickier to validate more: because we use multiple entries for "composite" types like sortedset, etc.
throw new CorruptIndexException("Invalid field number: " + fieldNumber + " (resource=" + meta + ")");
throw new CorruptIndexException("Invalid field number: " + fieldNumber, meta);
}
byte type = meta.readByte();
if (type == Lucene410DocValuesFormat.NUMERIC) {
@ -225,10 +217,10 @@ class Lucene410DocValuesProducer extends DocValuesProducer implements Closeable
readSortedSetFieldWithAddresses(info, meta);
} else if (ss.format == SORTED_SINGLE_VALUED) {
if (meta.readVInt() != fieldNumber) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene410DocValuesFormat.SORTED) {
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortedset entry for field: " + info.name + " is corrupt", meta);
}
readSortedField(info, meta);
} else {
@ -238,18 +230,18 @@ class Lucene410DocValuesProducer extends DocValuesProducer implements Closeable
SortedSetEntry ss = readSortedSetEntry(meta);
sortedNumerics.put(info.name, ss);
if (meta.readVInt() != fieldNumber) {
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene410DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt", meta);
}
numerics.put(info.name, readNumericEntry(meta));
if (ss.format == SORTED_WITH_ADDRESSES) {
if (meta.readVInt() != fieldNumber) {
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt", meta);
}
if (meta.readByte() != Lucene410DocValuesFormat.NUMERIC) {
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt (resource=" + meta + ")");
throw new CorruptIndexException("sortednumeric entry for field: " + info.name + " is corrupt", meta);
}
NumericEntry ordIndex = readNumericEntry(meta);
ordIndexes.put(info.name, ordIndex);
@ -257,7 +249,7 @@ class Lucene410DocValuesProducer extends DocValuesProducer implements Closeable
throw new AssertionError();
}
} else {
throw new CorruptIndexException("invalid type: " + type + ", resource=" + meta);
throw new CorruptIndexException("invalid type: " + type, meta);
}
fieldNumber = meta.readVInt();
}
@ -279,7 +271,7 @@ class Lucene410DocValuesProducer extends DocValuesProducer implements Closeable
case TABLE_COMPRESSED:
final int uniqueValues = meta.readVInt();
if (uniqueValues > 256) {
throw new CorruptIndexException("TABLE_COMPRESSED cannot have more than 256 distinct values, input=" + meta);
throw new CorruptIndexException("TABLE_COMPRESSED cannot have more than 256 distinct values, got=" + uniqueValues, meta);
}
entry.table = new long[uniqueValues];
for (int i = 0; i < uniqueValues; ++i) {
@ -296,7 +288,7 @@ class Lucene410DocValuesProducer extends DocValuesProducer implements Closeable
entry.blockSize = meta.readVInt();
break;
default:
throw new CorruptIndexException("Unknown format: " + entry.format + ", input=" + meta);
throw new CorruptIndexException("Unknown format: " + entry.format, meta);
}
entry.endOffset = meta.readLong();
return entry;
@ -325,7 +317,7 @@ class Lucene410DocValuesProducer extends DocValuesProducer implements Closeable
entry.blockSize = meta.readVInt();
break;
default:
throw new CorruptIndexException("Unknown format: " + entry.format + ", input=" + meta);
throw new CorruptIndexException("Unknown format: " + entry.format, meta);
}
return entry;
}
@ -334,7 +326,7 @@ class Lucene410DocValuesProducer extends DocValuesProducer implements Closeable
SortedSetEntry entry = new SortedSetEntry();
entry.format = meta.readVInt();
if (entry.format != SORTED_SINGLE_VALUED && entry.format != SORTED_WITH_ADDRESSES) {
throw new CorruptIndexException("Unknown format: " + entry.format + ", input=" + meta);
throw new CorruptIndexException("Unknown format: " + entry.format, meta);
}
return entry;
}


@ -33,7 +33,6 @@ import org.apache.lucene.store.ChecksumIndexInput;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.IOContext;
import org.apache.lucene.store.IndexInput;
import org.apache.lucene.util.IOUtils;
/**
* Lucene 4.6 FieldInfos reader.
@ -50,10 +49,7 @@ final class Lucene46FieldInfosReader extends FieldInfosReader {
@Override
public FieldInfos read(Directory directory, String segmentName, String segmentSuffix, IOContext context) throws IOException {
final String fileName = IndexFileNames.segmentFileName(segmentName, segmentSuffix, Lucene46FieldInfosFormat.EXTENSION);
ChecksumIndexInput input = directory.openChecksumInput(fileName, context);
boolean success = false;
try {
try (ChecksumIndexInput input = directory.openChecksumInput(fileName, context)) {
int codecVersion = CodecUtil.checkHeader(input, Lucene46FieldInfosFormat.CODEC_NAME,
Lucene46FieldInfosFormat.FORMAT_START,
Lucene46FieldInfosFormat.FORMAT_CURRENT);
@ -65,7 +61,7 @@ final class Lucene46FieldInfosReader extends FieldInfosReader {
String name = input.readString();
final int fieldNumber = input.readVInt();
if (fieldNumber < 0) {
throw new CorruptIndexException("invalid field number for field: " + name + ", fieldNumber=" + fieldNumber + " (resource=" + input + ")");
throw new CorruptIndexException("invalid field number for field: " + name + ", fieldNumber=" + fieldNumber, input);
}
byte bits = input.readByte();
boolean isIndexed = (bits & Lucene46FieldInfosFormat.IS_INDEXED) != 0;
@ -100,15 +96,7 @@ final class Lucene46FieldInfosReader extends FieldInfosReader {
} else {
CodecUtil.checkEOF(input);
}
FieldInfos fieldInfos = new FieldInfos(infos);
success = true;
return fieldInfos;
} finally {
if (success) {
input.close();
} else {
IOUtils.closeWhileHandlingException(input);
}
return new FieldInfos(infos);
}
}
@ -126,7 +114,7 @@ final class Lucene46FieldInfosReader extends FieldInfosReader {
} else if (b == 5) {
return DocValuesType.SORTED_NUMERIC;
} else {
throw new CorruptIndexException("invalid docvalues byte: " + b + " (resource=" + input + ")");
throw new CorruptIndexException("invalid docvalues byte: " + b, input);
}
}
}


@ -29,7 +29,6 @@ import org.apache.lucene.index.IndexFileNames;
import org.apache.lucene.store.IndexOutput;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.IOContext;
import org.apache.lucene.util.IOUtils;
/**
* Lucene 4.6 FieldInfos writer.
@ -46,9 +45,7 @@ final class Lucene46FieldInfosWriter extends FieldInfosWriter {
@Override
public void write(Directory directory, String segmentName, String segmentSuffix, FieldInfos infos, IOContext context) throws IOException {
final String fileName = IndexFileNames.segmentFileName(segmentName, segmentSuffix, Lucene46FieldInfosFormat.EXTENSION);
IndexOutput output = directory.createOutput(fileName, context);
boolean success = false;
try {
try (IndexOutput output = directory.createOutput(fileName, context)) {
CodecUtil.writeHeader(output, Lucene46FieldInfosFormat.CODEC_NAME, Lucene46FieldInfosFormat.FORMAT_CURRENT);
output.writeVInt(infos.size());
for (FieldInfo fi : infos) {
@ -82,13 +79,6 @@ final class Lucene46FieldInfosWriter extends FieldInfosWriter {
output.writeStringStringMap(fi.attributes());
}
CodecUtil.writeFooter(output);
success = true;
} finally {
if (success) {
output.close();
} else {
IOUtils.closeWhileHandlingException(output);
}
}
}


@ -29,7 +29,6 @@ import org.apache.lucene.index.SegmentInfo;
import org.apache.lucene.store.ChecksumIndexInput;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.IOContext;
import org.apache.lucene.util.IOUtils;
import org.apache.lucene.util.Version;
/**
@ -47,16 +46,14 @@ public class Lucene46SegmentInfoReader extends SegmentInfoReader {
@Override
public SegmentInfo read(Directory dir, String segment, IOContext context) throws IOException {
final String fileName = IndexFileNames.segmentFileName(segment, "", Lucene46SegmentInfoFormat.SI_EXTENSION);
final ChecksumIndexInput input = dir.openChecksumInput(fileName, context);
boolean success = false;
try {
try (ChecksumIndexInput input = dir.openChecksumInput(fileName, context)) {
int codecVersion = CodecUtil.checkHeader(input, Lucene46SegmentInfoFormat.CODEC_NAME,
Lucene46SegmentInfoFormat.VERSION_START,
Lucene46SegmentInfoFormat.VERSION_CURRENT);
final Version version = Version.parse(input.readString());
final int docCount = input.readInt();
if (docCount < 0) {
throw new CorruptIndexException("invalid docCount: " + docCount + " (resource=" + input + ")");
throw new CorruptIndexException("invalid docCount: " + docCount, input);
}
final boolean isCompoundFile = input.readByte() == SegmentInfo.YES;
final Map<String,String> diagnostics = input.readStringStringMap();
@ -78,16 +75,7 @@ public class Lucene46SegmentInfoReader extends SegmentInfoReader {
final SegmentInfo si = new SegmentInfo(dir, version, segment, docCount, isCompoundFile, null, diagnostics, id);
si.setFiles(files);
success = true;
return si;
} finally {
if (!success) {
IOUtils.closeWhileHandlingException(input);
} else {
input.close();
}
}
}
}


@ -67,30 +67,22 @@ class Lucene49NormsProducer extends NormsProducer {
Lucene49NormsProducer(SegmentReadState state, String dataCodec, String dataExtension, String metaCodec, String metaExtension) throws IOException {
maxDoc = state.segmentInfo.getDocCount();
String metaName = IndexFileNames.segmentFileName(state.segmentInfo.name, state.segmentSuffix, metaExtension);
// read in the entries from the metadata file.
ChecksumIndexInput in = state.directory.openChecksumInput(metaName, state.context);
boolean success = false;
ramBytesUsed = new AtomicLong(RamUsageEstimator.shallowSizeOfInstance(getClass()));
try {
// read in the entries from the metadata file.
try (ChecksumIndexInput in = state.directory.openChecksumInput(metaName, state.context)) {
version = CodecUtil.checkHeader(in, metaCodec, VERSION_START, VERSION_CURRENT);
readFields(in, state.fieldInfos);
CodecUtil.checkFooter(in);
success = true;
} finally {
if (success) {
IOUtils.close(in);
} else {
IOUtils.closeWhileHandlingException(in);
}
}
String dataName = IndexFileNames.segmentFileName(state.segmentInfo.name, state.segmentSuffix, dataExtension);
this.data = state.directory.openInput(dataName, state.context);
success = false;
boolean success = false;
try {
final int version2 = CodecUtil.checkHeader(data, dataCodec, VERSION_START, VERSION_CURRENT);
if (version != version2) {
throw new CorruptIndexException("Format versions mismatch");
throw new CorruptIndexException("Format versions mismatch: meta=" + version + ",data=" + version2, data);
}
// NOTE: data file is too costly to verify checksum against all the bytes on open,
@ -112,9 +104,9 @@ class Lucene49NormsProducer extends NormsProducer {
while (fieldNumber != -1) {
FieldInfo info = infos.fieldInfo(fieldNumber);
if (info == null) {
throw new CorruptIndexException("Invalid field number: " + fieldNumber + " (resource=" + meta + ")");
throw new CorruptIndexException("Invalid field number: " + fieldNumber, meta);
} else if (!info.hasNorms()) {
throw new CorruptIndexException("Invalid field: " + info.name + " (resource=" + meta + ")");
throw new CorruptIndexException("Invalid field: " + info.name, meta);
}
NormsEntry entry = new NormsEntry();
entry.format = meta.readByte();
@ -126,7 +118,7 @@ class Lucene49NormsProducer extends NormsProducer {
case DELTA_COMPRESSED:
break;
default:
throw new CorruptIndexException("Unknown format: " + entry.format + ", input=" + meta);
throw new CorruptIndexException("Unknown format: " + entry.format, meta);
}
norms.put(info.name, entry);
fieldNumber = meta.readVInt();
@ -197,7 +189,7 @@ class Lucene49NormsProducer extends NormsProducer {
int packedVersion = data.readVInt();
int size = data.readVInt();
if (size > 256) {
throw new CorruptIndexException("TABLE_COMPRESSED cannot have more than 256 distinct values, input=" + data);
throw new CorruptIndexException("TABLE_COMPRESSED cannot have more than 256 distinct values, got=" + size, data);
}
final long decode[] = new long[size];
for (int i = 0; i < decode.length; i++) {


@ -18,6 +18,9 @@ package org.apache.lucene.index;
*/
import java.io.IOException;
import java.util.Objects;
import org.apache.lucene.store.DataInput;
/**
* This exception is thrown when Lucene detects
@ -25,12 +28,22 @@ import java.io.IOException;
*/
public class CorruptIndexException extends IOException {
/** Create exception with a message only */
public CorruptIndexException(String message) {
super(message);
public CorruptIndexException(String message, DataInput input) {
this(message, input, null);
}
/** Create exception with message and root cause. */
public CorruptIndexException(String message, Throwable cause) {
super(message, cause);
public CorruptIndexException(String message, DataInput input, Throwable cause) {
this(message, Objects.toString(input), cause);
}
/** Create exception with a message only */
public CorruptIndexException(String message, String resourceDescription) {
this(message, resourceDescription, null);
}
/** Create exception with message and root cause. */
public CorruptIndexException(String message, String resourceDescription, Throwable cause) {
super(message + " (resource=" + resourceDescription + ")", cause);
}
}
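The new constructor pair above appends the resource description to the exception message. A standalone sketch (mirroring, not importing, the Lucene class) shows the resulting message composition:

```java
import java.io.IOException;
import java.util.Objects;

// Standalone sketch mirroring the new CorruptIndexException constructors,
// so the message composition can be checked without Lucene on the classpath.
public class CorruptIndexSketch extends IOException {
    public CorruptIndexSketch(String message, Object input) {
        this(message, Objects.toString(input), null);
    }
    public CorruptIndexSketch(String message, String resourceDescription, Throwable cause) {
        super(message + " (resource=" + resourceDescription + ")", cause);
    }

    public static void main(String[] args) {
        // "_0.si" is an invented resource name for illustration only.
        CorruptIndexSketch e = new CorruptIndexSketch("invalid docCount: -1", "_0.si");
        System.out.println(e.getMessage());
        // prints: invalid docCount: -1 (resource=_0.si)
    }
}
```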


@ -217,7 +217,7 @@ final class IndexFileDeleter implements Closeable {
try {
sis.read(directory, currentSegmentsFile);
} catch (IOException e) {
throw new CorruptIndexException("failed to locate current segments_N file \"" + currentSegmentsFile + "\"");
throw new CorruptIndexException("unable to read current segments_N file", currentSegmentsFile, e);
}
if (infoStream.isEnabled("IFD")) {
infoStream.message("IFD", "forced open of current segments file " + segmentInfos.getSegmentsFileName());


@ -34,8 +34,8 @@ public class IndexFormatTooNewException extends CorruptIndexException {
*
* @lucene.internal */
public IndexFormatTooNewException(String resourceDesc, int version, int minVersion, int maxVersion) {
super("Format version is not supported (resource: " + resourceDesc + "): "
+ version + " (needs to be between " + minVersion + " and " + maxVersion + ")");
super("Format version is not supported: "
+ version + " (needs to be between " + minVersion + " and " + maxVersion + ")", resourceDesc);
assert resourceDesc != null;
}


@ -32,8 +32,8 @@ public class IndexFormatTooOldException extends CorruptIndexException {
*
* @lucene.internal */
public IndexFormatTooOldException(String resourceDesc, String version) {
super("Format version is not supported (resource: " + resourceDesc + "): " +
version + ". This version of Lucene only supports indexes created with release 4.0 and later.");
super("Format version is not supported: " +
version + ". This version of Lucene only supports indexes created with release 4.0 and later.", resourceDesc);
assert resourceDesc != null;
}
@ -56,9 +56,9 @@ public class IndexFormatTooOldException extends CorruptIndexException {
*
* @lucene.internal */
public IndexFormatTooOldException(String resourceDesc, int version, int minVersion, int maxVersion) {
super("Format version is not supported (resource: " + resourceDesc + "): " +
super("Format version is not supported: " +
version + " (needs to be between " + minVersion + " and " + maxVersion +
"). This version of Lucene only supports indexes created with release 4.0 and later.");
"). This version of Lucene only supports indexes created with release 4.0 and later.", resourceDesc);
assert resourceDesc != null;
}


@ -19,7 +19,6 @@ package org.apache.lucene.index;
import java.io.IOException;
import java.io.PrintStream;
import java.nio.file.NoSuchFileException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;
@ -302,7 +301,7 @@ public final class SegmentInfos implements Cloneable, Iterable<SegmentCommitInfo
counter = input.readInt();
int numSegments = input.readInt();
if (numSegments < 0) {
throw new CorruptIndexException("invalid segment count: " + numSegments + " (resource: " + input + ")");
throw new CorruptIndexException("invalid segment count: " + numSegments, input);
}
for (int seg = 0; seg < numSegments; seg++) {
String segName = input.readString();
@ -313,7 +312,7 @@ public final class SegmentInfos implements Cloneable, Iterable<SegmentCommitInfo
long delGen = input.readLong();
int delCount = input.readInt();
if (delCount < 0 || delCount > info.getDocCount()) {
throw new CorruptIndexException("invalid deletion count: " + delCount + " vs docCount=" + info.getDocCount() + " (resource: " + input + ")");
throw new CorruptIndexException("invalid deletion count: " + delCount + " vs docCount=" + info.getDocCount(), input);
}
long fieldInfosGen = -1;
if (format >= VERSION_46) {
@ -372,7 +371,8 @@ public final class SegmentInfos implements Cloneable, Iterable<SegmentCommitInfo
final long checksumNow = input.getChecksum();
final long checksumThen = input.readLong();
if (checksumNow != checksumThen) {
throw new CorruptIndexException("checksum mismatch in segments file (resource: " + input + ")");
throw new CorruptIndexException("checksum failed (hardware problem?) : expected=" + Long.toHexString(checksumThen) +
" actual=" + Long.toHexString(checksumNow), input);
}
CodecUtil.checkEOF(input);
}


@ -147,7 +147,7 @@ public final class CompoundFileDirectory extends BaseDirectory {
final String id = entriesStream.readString();
FileEntry previous = mapping.put(id, fileEntry);
if (previous != null) {
throw new CorruptIndexException("Duplicate cfs entry id=" + id + " in CFS: " + entriesStream);
throw new CorruptIndexException("Duplicate cfs entry id=" + id + " in CFS", entriesStream);
}
fileEntry.offset = entriesStream.readLong();
fileEntry.length = entriesStream.readLong();


@ -135,26 +135,26 @@ class TaxonomyIndexArrays extends ParallelTaxonomyArrays {
// shouldn't really happen, if it does, something's wrong
if (positions == null || positions.advance(first) == DocIdSetIterator.NO_MORE_DOCS) {
throw new CorruptIndexException("Missing parent data for category " + first);
throw new CorruptIndexException("Missing parent data for category " + first, reader.toString());
}
int num = reader.maxDoc();
for (int i = first; i < num; i++) {
if (positions.docID() == i) {
if (positions.freq() == 0) { // shouldn't happen
throw new CorruptIndexException("Missing parent data for category " + i);
throw new CorruptIndexException("Missing parent data for category " + i, reader.toString());
}
parents[i] = positions.nextPosition();
if (positions.nextDoc() == DocIdSetIterator.NO_MORE_DOCS) {
if (i + 1 < num) {
throw new CorruptIndexException("Missing parent data for category "+ (i + 1));
throw new CorruptIndexException("Missing parent data for category "+ (i + 1), reader.toString());
}
break;
}
} else { // this shouldn't happen
throw new CorruptIndexException("Missing parent data for category " + i);
throw new CorruptIndexException("Missing parent data for category " + i, reader.toString());
}
}
}


@ -92,7 +92,7 @@ public final class VersionBlockTreeTermsReader extends FieldsProducer {
ioContext);
int indexVersion = readIndexHeader(indexIn);
if (indexVersion != version) {
throw new CorruptIndexException("mixmatched version files: " + in + "=" + version + "," + indexIn + "=" + indexVersion);
throw new CorruptIndexException("mismatched version files: " + in + "=" + version + "," + indexIn + "=" + indexVersion, indexIn);
}
// verify
@ -113,7 +113,7 @@ public final class VersionBlockTreeTermsReader extends FieldsProducer {
final int numFields = in.readVInt();
if (numFields < 0) {
throw new CorruptIndexException("invalid numFields: " + numFields + " (resource=" + in + ")");
throw new CorruptIndexException("invalid numFields: " + numFields, in);
}
for(int i=0;i<numFields;i++) {
@ -137,20 +137,20 @@ public final class VersionBlockTreeTermsReader extends FieldsProducer {
BytesRef minTerm = readBytesRef(in);
BytesRef maxTerm = readBytesRef(in);
if (docCount < 0 || docCount > info.getDocCount()) { // #docs with field must be <= #docs
throw new CorruptIndexException("invalid docCount: " + docCount + " maxDoc: " + info.getDocCount() + " (resource=" + in + ")");
throw new CorruptIndexException("invalid docCount: " + docCount + " maxDoc: " + info.getDocCount(), in);
}
if (sumDocFreq < docCount) { // #postings must be >= #docs with field
throw new CorruptIndexException("invalid sumDocFreq: " + sumDocFreq + " docCount: " + docCount + " (resource=" + in + ")");
throw new CorruptIndexException("invalid sumDocFreq: " + sumDocFreq + " docCount: " + docCount, in);
}
if (sumTotalTermFreq != -1 && sumTotalTermFreq < sumDocFreq) { // #positions must be >= #postings
throw new CorruptIndexException("invalid sumTotalTermFreq: " + sumTotalTermFreq + " sumDocFreq: " + sumDocFreq + " (resource=" + in + ")");
throw new CorruptIndexException("invalid sumTotalTermFreq: " + sumTotalTermFreq + " sumDocFreq: " + sumDocFreq, in);
}
final long indexStartFP = indexIn.readVLong();
VersionFieldReader previous = fields.put(fieldInfo.name,
new VersionFieldReader(this, fieldInfo, numTerms, rootCode, sumTotalTermFreq, sumDocFreq, docCount,
indexStartFP, longsSize, indexIn, minTerm, maxTerm));
if (previous != null) {
throw new CorruptIndexException("duplicate field: " + fieldInfo.name + " (resource=" + in + ")");
throw new CorruptIndexException("duplicate field: " + fieldInfo.name, in);
}
}
indexIn.close();