Introduced in SOLR-14033 by including commons-compress as a compile-time
dependency in Solr core instead of as a test-only dependency.
Signed-off-by: Kevin Risden <krisden@apache.org>
* Better logging and error reporting
* Fix the deploy command to handle previously undeployed packages
* The test now uses the @LogLevel annotation (see the sketch after this list)
* The deploy command had a hard-coded collection name by mistake; fix it
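A minimal sketch of the @LogLevel usage mentioned above. The test class name, base class, and the logger/level in the annotation value are assumptions for illustration, not the exact values used by the package-manager test.

```java
import org.apache.solr.cloud.SolrCloudTestCase;
import org.apache.solr.util.LogLevel;
import org.junit.Test;

// Assumed logger and level; the annotation raises the log level for this test
// class and the test framework restores it afterwards.
@LogLevel("org.apache.solr.pkg=DEBUG")
public class PackageDeployLoggingTest extends SolrCloudTestCase {

  @Test
  public void testDeployLogsAtDebug() throws Exception {
    // test body elided; the point is the declarative log-level override above
  }
}
```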
* split "resource-and-plugin-loading.adoc" into "resource-loading.adoc" and "libs.adoc" then overhauled both.
* enhanced "config-sets.adoc", moving some content in from elsewhere; bit of an overhaul.
* solr-plugins.adoc is now top-level; overhauled content
* Move resource-loading.adoc up a level in the TOC to underneath "The Well-Configured Solr Instance.
* Separate out the leading sentence.
The default configset no longer has the following:
- Library inclusions (<lib ../>) for extraction, solr-cell libs, clustering, velocity and language identifier
- /browse, /tvrh and /update/extract handlers
- TermVector component (if someone wants it, it can be added using the Config APIs)
- XSLT response writer
- Velocity response writer
If you want to use them in your collections, please add them to your configset manually or through the Config APIs.
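As a sketch of the Config API route, the snippet below re-adds the TermVector component and a /tvrh handler to a collection. The host, the collection name "mycoll", and the exact handler wiring are assumptions for illustration; adapt them to your own configset.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class AddTermVectorComponent {
  public static void main(String[] args) throws Exception {
    // Config API commands: add-searchcomponent and add-requesthandler
    String body = "{"
        + "\"add-searchcomponent\": {\"name\": \"tvComponent\", \"class\": \"solr.TermVectorComponent\"},"
        + "\"add-requesthandler\": {\"name\": \"/tvrh\", \"class\": \"solr.SearchHandler\","
        + "  \"defaults\": {\"tv\": true}, \"last-components\": [\"tvComponent\"]}"
        + "}";
    URL url = new URL("http://localhost:8983/solr/mycoll/config"); // assumed host/collection
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("POST");
    conn.setRequestProperty("Content-Type", "application/json");
    conn.setDoOutput(true);
    try (OutputStream out = conn.getOutputStream()) {
      out.write(body.getBytes(StandardCharsets.UTF_8));
    }
    System.out.println("HTTP " + conn.getResponseCode());
  }
}
```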
* Using collapse with grouping would cause inconsistent behavior.
This is because grouping calls the same postfilter twice without
resetting the internal state of the DocValues cache (see the sketch after this list).
* Using expand with grouping would cause an NPE.
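A hedged sketch of the collector side of the fix idea, not the actual CollapsingQParserPlugin code: the collector returned from PostFilter#getFilterCollector must not carry cached DocValues state across calls, since grouping can invoke the post filter twice per request. The class name and the "group_s" field are made up.

```java
import java.io.IOException;
import org.apache.lucene.index.DocValues;
import org.apache.lucene.index.LeafReaderContext;
import org.apache.lucene.index.SortedDocValues;
import org.apache.solr.search.DelegatingCollector;

public class FreshStateCollapseCollector extends DelegatingCollector {
  private SortedDocValues groupValues; // per-segment view, never reused across calls

  @Override
  protected void doSetNextReader(LeafReaderContext context) throws IOException {
    super.doSetNextReader(context);
    // re-open DocValues for every segment instead of reusing a cached instance
    groupValues = DocValues.getSorted(context.reader(), "group_s"); // assumed field
  }

  @Override
  public void collect(int doc) throws IOException {
    if (groupValues.advanceExact(doc)) {
      // collapse bookkeeping for this document's group would go here
    }
    super.collect(doc); // pass the doc down the delegate chain
  }
}
```

The key point is to construct a new collector like this on every getFilterCollector call, so the second grouping pass starts from a clean state.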
Solr tests now have a similar policy to Lucene: loopback use only. If a
test tries to resolve or connect to the internet, it will get a SecurityException.
Some Solr tests explicitly try to talk to dead nodes with real
networking. This is not good and is asking for trouble, but they now use low
loopback port numbers instead of multicast addresses; the idea is that it
fails faster. Move these to constants so that the values aren't copy-pasted
everywhere, in case we have to do something different later.
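A sketch of the kind of shared constants this means; the class name and port values here are illustrative, not the exact ones in the test tree.

```java
// Low loopback ports are essentially never bound, so connection attempts fail
// fast instead of hanging on an unroutable multicast address.
public final class DeadHostConstants {
  private DeadHostConstants() {}

  public static final String DEAD_HOST_1 = "127.0.0.1:4";
  public static final String DEAD_HOST_2 = "127.0.0.1:6";
  public static final String DEAD_HOST_3 = "127.0.0.1:8";
}
```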
This removes the Solr security manager hacks
for Hadoop. It does so by:
* Using a fake group mapping class instead of ShellGroupMapping (see the sketch after this message)
* Copying a few Hadoop classes and modifying them so the tests don't need a shell
* Nulling out some of the static variables in the tests
The Hadoop files were copied from Apache Hadoop 3.2.0
into the test package so they are only picked up
during tests. They were modified to remove the need to
shell out for access. The assumption is that these
HDFS integration tests only run on Unix-based systems,
so Windows compatibility was removed from some
of the modified classes. The long-term goal is to remove
these custom Hadoop classes. All the copied classes are
in the org.apache.hadoop package.
Signed-off-by: Kevin Risden <krisden@apache.org>
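A hedged sketch of the fake group mapping idea: implement Hadoop's GroupMappingServiceProvider without forking a shell. The class name and the static group value are assumptions; the real class in the Solr test tree may differ.

```java
import java.io.IOException;
import java.util.Collections;
import java.util.List;
import org.apache.hadoop.security.GroupMappingServiceProvider;

public class FakeGroupMapping implements GroupMappingServiceProvider {

  @Override
  public List<String> getGroups(String user) throws IOException {
    // never shell out; just report a static group for every user
    return Collections.singletonList("supergroup");
  }

  @Override
  public void cacheGroupsRefresh() throws IOException {
    // nothing cached, nothing to refresh
  }

  @Override
  public void cacheGroupsAdd(List<String> groups) throws IOException {
    // no-op: no external cache to populate
  }
}
```

Wiring it in is then just setting "hadoop.security.group.mapping" to this class name in the test's Hadoop Configuration.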
There are applications of ExpandComponent that intentionally do not
involve prior collapsing of results on the expand field, which can lead
to an NPE in ExpandComponent when expand.field has fewer unique values
(among the matched docs) than the number of matched docs.
This commit refines the approach taken in SOLR-13877, which addressed
the same underlying issue.
* Rename the stdDev and variance methods to reflect what they actually compute
* Add util functions to compute the corrected stdDev and variance (see the sketch after this list)
* Use DocValuesIterator#advanceExact to check whether a value exists for the doc
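A sketch of the corrected vs. uncorrected statistics, computed from accumulated sums; the class and method names here are illustrative, not the actual utility names added by the commit.

```java
public final class VarianceUtil {
  private VarianceUtil() {}

  /** Uncorrected (population) variance: divide by n. */
  public static double variance(double sumSq, double sum, long count) {
    double mean = sum / count;
    return sumSq / count - mean * mean;
  }

  /** Corrected (sample) variance: divide by n - 1 (Bessel's correction). */
  public static double correctedVariance(double sumSq, double sum, long count) {
    double mean = sum / count;
    return (sumSq - count * mean * mean) / (count - 1);
  }

  public static double correctedStdDev(double sumSq, double sum, long count) {
    return Math.sqrt(correctedVariance(sumSq, sum, count));
  }
}
```

On the DocValues side, Lucene's DocValuesIterator#advanceExact(doc) returns false when the doc has no value, so docs without values can be skipped instead of being counted with a bogus zero.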
whoami displays a warning if the effective UID is not in /etc/passwd.
This can happen in certain situations when running in a Docker
container. This replaces the 'whoami' usage with a safer check.
The Solr permissions are weak because of the huge number of features, third-party dependencies, etc.,
so the code has broad access to do many things. For "scripting" such as Velocity we have to take a more aggressive stance:
Step 1: Can we wrap a sandbox around the whole thing and call it a day?
Step 2: Separate the "engine" from "untrusted code" and restrict only the latter.
Step 3: Java's security model is weak; contain that classloader and whitelist access.
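A minimal sketch of the step 2 idea using the standard java.security APIs: the engine keeps the JVM's normal permissions, while untrusted template code is evaluated inside a context that grants nothing. The class name and the renderUntrusted helper are assumptions for illustration.

```java
import java.security.AccessControlContext;
import java.security.AccessController;
import java.security.CodeSource;
import java.security.Permissions;
import java.security.PrivilegedAction;
import java.security.ProtectionDomain;
import java.util.function.Supplier;

public class ScriptSandbox {

  // A context whose only protection domain has an empty permission set.
  private static final AccessControlContext NO_PERMISSIONS_CONTEXT =
      new AccessControlContext(new ProtectionDomain[] {
          new ProtectionDomain(
              new CodeSource(null, (java.security.cert.Certificate[]) null),
              new Permissions()) // grant nothing
      });

  public static String renderUntrusted(Supplier<String> untrusted) {
    // any checked operation the untrusted code attempts now fails the security
    // manager check, regardless of what the surrounding engine is allowed to do
    return AccessController.doPrivileged(
        (PrivilegedAction<String>) untrusted::get, NO_PERMISSIONS_CONTEXT);
  }
}
```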
Restrict this to only minimal paths, like Lucene's. This is the defense against directory traversal attacks.
It will also help find bad bugs where things read the filesystem from the wrong locations.
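The actual change is expressed as read grants in the security policy, not code, but as a sketch of the underlying idea, here is the normalize-and-check pattern that a minimal path whitelist enforces; the class name and root path are assumptions.

```java
import java.io.IOException;
import java.nio.file.Path;
import java.nio.file.Paths;

public final class PathGuard {
  private PathGuard() {}

  /** Resolve a requested path and refuse anything that escapes the allowed root. */
  public static Path requireUnder(String allowedRoot, String requested) throws IOException {
    Path root = Paths.get(allowedRoot).toRealPath();
    Path resolved = root.resolve(requested).normalize();
    if (!resolved.startsWith(root)) {
      throw new SecurityException("Path escapes allowed root: " + resolved);
    }
    return resolved;
  }
}
```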
On Windows, these objects can't be inspected due to security restrictions, so the test runner fails the tests since it does not know how big the leak is.
Unfortunately, as a first step this is very weak protection against
e.g. XSS, because some 'unsafe-xxx' rules must be present due
to the insecurity of AngularJS: until SOLR-13987 is fixed, XSS & co. are
still easy.
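An illustrative sketch only: the directive values below are made up to show how the AngularJS-required 'unsafe-*' allowances undercut the policy; they are not the exact header Solr sends, and the filter class is hypothetical.

```java
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletResponse;

public class CspHeaderFilter implements Filter {

  @Override
  public void init(FilterConfig filterConfig) {}

  @Override
  public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
      throws IOException, ServletException {
    ((HttpServletResponse) resp).setHeader("Content-Security-Policy",
        // the 'unsafe-*' allowances are exactly what keeps XSS "still easy"
        "default-src 'none'; script-src 'self' 'unsafe-eval'; style-src 'self' 'unsafe-inline'");
    chain.doFilter(req, resp);
  }

  @Override
  public void destroy() {}
}
```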