Merge branch 'master' into feature/autoscaling

# Conflicts:
#	solr/CHANGES.txt
#	solr/core/src/java/org/apache/solr/cloud/ZkController.java
Shalin Shekhar Mangar 2017-06-27 10:48:05 +05:30
commit e863d0f548
195 changed files with 2350 additions and 10756 deletions

solr/CHANGES.txt

@@ -111,6 +111,11 @@ Upgrading from Solr 6.x
 * The unused 'valType' option has been removed from ExternalFileField, if you have this in your schema you
   can safely remove it. see SOLR-10929 for more details.
+* SOLR-10574: basic_configs and data_driven_schema_configs have now been merged into _default. It has data driven nature
+  enabled by default, and can be turned off (after creating a collection) with:
+  curl http://host:8983/solr/mycollection/config -d '{"set-user-property": {"update.autoCreateFields":"false"}}'
+  Please see SOLR-10574 for details.
 New Features
 ----------------------
 * SOLR-9857, SOLR-9858: Collect aggregated metrics from nodes and shard leaders in overseer. (ab)
@@ -175,6 +180,11 @@ New Features
 * SOLR-10406: v2 API error messages list the URL request path as /solr/____v2/... when the original path was /v2/... (Cao Manh Dat, noble)
+* SOLR-10574: New _default config set replacing basic_configs and data_driven_schema_configs.
+  (Ishan Chattopadhyaya, noble, shalin, hossman, David Smiley, Jan Hoydahl, Alexandre Rafalovich)
+* SOLR-10272: Use _default config set if no collection.configName is specified with CREATE (Ishan Chattopadhyaya)
 * SOLR-10496: New ComputePlanAction for autoscaling which uses the policy framework to compute cluster
   operations upon a trigger fire. (Noble Paul, shalin)
@@ -212,6 +222,8 @@ Bug Fixes
   thus disabling the global check, and replaces it with specific checks where desired via
   QueryUtils.build(). (yonik)
+* SOLR-10948: Fix extraction component to treat DatePointField the same as TrieDateField (hossman)
 * SOLR-10602: Triggers should be able to restore state from old instances when taking over. (shalin)
 * SOLR-10714: OverseerTriggerThread does not start triggers on overseer start until autoscaling
@@ -239,12 +251,6 @@ Optimizations
 * SOLR-10727: Avoid polluting the filter cache for certain types of faceting (typically ranges) when
   the base docset is empty. (David Smiley)
-* SOLR-9981: Performance improvements and bug fixes for the Analytics component. Performance fix that
-  stops the reading of ALL lucene segments over and again for each stats collector. The AtomicReaderContext
-  that refers to the "current " segment is reused. This fix shows an improvement of about 25% in query
-  time for a dataset of ~10M (=9.8M) records. Given the nature of the fix, the improvement should get
-  better as the dataset increases. Fix for the NPE during comparison (Houston Putman)
 Other Changes
 ----------------------
 * SOLR-10236: Removed FieldType.getNumericType(). Use getNumberType() instead. (Tomás Fernández Löbbe)
@@ -350,6 +356,12 @@ Other Changes
   increase the visibility of builder elements to be protected so extending the builder, and the clients is possible.
   (Jason Gerlowski, Anshum Gupta)
+* SOLR-10823: Add reporting period to SolrMetricReporter base class. (Christine Poerschke)
+* SOLR-10807: Randomize Points based numeric field types in (more) test schemas
+  - SOLR-10946: Randomize the usage of Points based numerics in solrj test schemas (hossman)
+  - SOLR-10947: Randomize the usage of Points based numerics in contrib test schemas (hossman)
 ================== 6.7.0 ==================
 Consult the LUCENE_CHANGES.txt file for additional, low level, changes in this release.

bin/solr

@@ -419,12 +419,11 @@ function print_usage() {
 echo ""
 echo " -d <confdir> Configuration directory to copy when creating the new core, built-in options are:"
 echo ""
-echo " basic_configs: Minimal Solr configuration"
-echo " data_driven_schema_configs: Managed schema with field-guessing support enabled"
+echo " _default: Minimal configuration, which supports enabling/disabling field-guessing support"
 echo " sample_techproducts_configs: Example configuration with many optional features enabled to"
 echo " demonstrate the full power of Solr"
 echo ""
-echo " If not specified, default is: data_driven_schema_configs"
+echo " If not specified, default is: _default"
 echo ""
 echo " Alternatively, you can pass the path to your own configuration directory instead of using"
 echo " one of the built-in configurations, such as: bin/solr create_core -c mycore -d /tmp/myconfig"
@@ -441,12 +440,11 @@ function print_usage() {
 echo ""
 echo " -d <confdir> Configuration directory to copy when creating the new collection, built-in options are:"
 echo ""
-echo " basic_configs: Minimal Solr configuration"
-echo " data_driven_schema_configs: Managed schema with field-guessing support enabled"
+echo " _default: Minimal configuration, which supports enabling/disabling field-guessing support"
 echo " sample_techproducts_configs: Example configuration with many optional features enabled to"
 echo " demonstrate the full power of Solr"
 echo ""
-echo " If not specified, default is: data_driven_schema_configs"
+echo " If not specified, default is: _default"
 echo ""
 echo " Alternatively, you can pass the path to your own configuration directory instead of using"
 echo " one of the built-in configurations, such as: bin/solr create_collection -c mycoll -d /tmp/myconfig"
@@ -934,15 +932,13 @@ if [[ "$SCRIPT_CMD" == "create" || "$SCRIPT_CMD" == "create_core" || "$SCRIPT_CM
 done
 fi
-if [ -z "$CREATE_CONFDIR" ]; then
-CREATE_CONFDIR='data_driven_schema_configs'
-fi
-# validate the confdir arg
+# validate the confdir arg (if provided)
+if ! [ -z "$CREATE_CONFDIR" ]; then
 if [[ ! -d "$SOLR_TIP/server/solr/configsets/$CREATE_CONFDIR" && ! -d "$CREATE_CONFDIR" ]]; then
 echo -e "\nSpecified configuration directory $CREATE_CONFDIR not found!\n"
 exit 1
 fi
+fi
 if [ -z "$CREATE_NAME" ]; then
 echo "Name (-c) argument is required!"
@@ -950,11 +946,6 @@ if [[ "$SCRIPT_CMD" == "create" || "$SCRIPT_CMD" == "create_core" || "$SCRIPT_CM
 exit 1
 fi
-# If not defined, use the collection name for the name of the configuration in Zookeeper
-if [ -z "$CREATE_CONFNAME" ]; then
-CREATE_CONFNAME="$CREATE_NAME"
-fi
 if [ -z "$CREATE_PORT" ]; then
 for ID in `ps auxww | grep java | grep start\.jar | awk '{print $2}' | sort -r`
 do
@@ -1663,6 +1654,11 @@ else
 fi
 fi
+# Set the default configset dir to be bootstrapped as _default
+if [ -z "$DEFAULT_CONFDIR" ]; then
+DEFAULT_CONFDIR="$SOLR_SERVER_DIR/solr/configsets/_default/conf"
+fi
 # This is quite hacky, but examples rely on a different log4j.properties
 # so that we can write logs for examples to $SOLR_HOME/../logs
 if [ -z "$SOLR_LOGS_DIR" ]; then
@@ -1913,7 +1909,7 @@ function launch_solr() {
 "-Djetty.port=$SOLR_PORT" "-DSTOP.PORT=$stop_port" "-DSTOP.KEY=$STOP_KEY" \
 "${SOLR_HOST_ARG[@]}" "-Duser.timezone=$SOLR_TIMEZONE" \
 "-Djetty.home=$SOLR_SERVER_DIR" "-Dsolr.solr.home=$SOLR_HOME" "-Dsolr.data.home=$SOLR_DATA_HOME" "-Dsolr.install.dir=$SOLR_TIP" \
-"${LOG4J_CONFIG[@]}" "${SOLR_OPTS[@]}")
+"-Dsolr.default.confdir=$DEFAULT_CONFDIR" "${LOG4J_CONFIG[@]}" "${SOLR_OPTS[@]}")
 if [ "$SOLR_MODE" == "solrcloud" ]; then
 IN_CLOUD_MODE=" in SolrCloud mode"
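The script changes above hand the JVM a new `-Dsolr.default.confdir` system property pointing at the `_default` configset. A hypothetical server-side reader sketching how such a property could be consumed (the fallback path mirrors the script's default layout; `defaultConfDir` and `installDir` are illustrative names, not Solr's actual API):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class DefaultConfDirDemo {
    // Resolve the default configset directory, preferring the
    // -Dsolr.default.confdir value passed in by bin/solr or solr.cmd.
    static Path defaultConfDir(Path installDir) {
        String prop = System.getProperty("solr.default.confdir");
        if (prop != null && !prop.isEmpty()) {
            return Paths.get(prop);
        }
        // Illustrative fallback mirroring the script's default layout.
        return installDir.resolve("server/solr/configsets/_default/conf");
    }

    public static void main(String[] args) {
        System.setProperty("solr.default.confdir",
                "/opt/solr/server/solr/configsets/_default/conf");
        System.out.println(defaultConfDir(Paths.get("/opt/solr")));
    }
}
```

This is only a sketch of the property-plumbing pattern; the real consumer lives inside Solr's core.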

bin/solr.cmd

@@ -404,12 +404,11 @@ echo -c name Name of core to create
 echo.
 echo -d confdir Configuration directory to copy when creating the new core, built-in options are:
 echo.
-echo basic_configs: Minimal Solr configuration
-echo data_driven_schema_configs: Managed schema with field-guessing support enabled
+echo _default: Minimal configuration, which supports enabling/disabling field-guessing support
 echo sample_techproducts_configs: Example configuration with many optional features enabled to
 echo demonstrate the full power of Solr
 echo.
-echo If not specified, default is: data_driven_schema_configs
+echo If not specified, default is: _default
 echo.
 echo Alternatively, you can pass the path to your own configuration directory instead of using
 echo one of the built-in configurations, such as: bin\solr create_core -c mycore -d c:/tmp/myconfig
@@ -428,12 +427,11 @@ echo -c name Name of collection to create
 echo.
 echo -d confdir Configuration directory to copy when creating the new collection, built-in options are:
 echo.
-echo basic_configs: Minimal Solr configuration
-echo data_driven_schema_configs: Managed schema with field-guessing support enabled
+echo _default: Minimal configuration, which supports enabling/disabling field-guessing support
 echo sample_techproducts_configs: Example configuration with many optional features enabled to
 echo demonstrate the full power of Solr
 echo.
-echo If not specified, default is: data_driven_schema_configs
+echo If not specified, default is: _default
 echo.
 echo Alternatively, you can pass the path to your own configuration directory instead of using
 echo one of the built-in configurations, such as: bin\solr create_collection -c mycoll -d c:/tmp/myconfig
@@ -1214,13 +1212,15 @@ IF "%JAVA_VENDOR%" == "IBM J9" (
 set GCLOG_OPT="-Xloggc:!SOLR_LOGS_DIR!\solr_gc.log" -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=9 -XX:GCLogFileSize=20M
 )
+IF "%DEFAULT_CONFDIR%"=="" set "DEFAULT_CONFDIR=%SOLR_SERVER_DIR%\solr\configsets\_default\conf"
 IF "%FG%"=="1" (
 REM run solr in the foreground
 title "Solr-%SOLR_PORT%"
 echo %SOLR_PORT%>"%SOLR_TIP%"\bin\solr-%SOLR_PORT%.port
 "%JAVA%" %SERVEROPT% %SOLR_JAVA_MEM% %START_OPTS% %GCLOG_OPT% ^
 -Dlog4j.configuration="%LOG4J_CONFIG%" -DSTOP.PORT=!STOP_PORT! -DSTOP.KEY=%STOP_KEY% ^
--Dsolr.solr.home="%SOLR_HOME%" -Dsolr.install.dir="%SOLR_TIP%" ^
+-Dsolr.solr.home="%SOLR_HOME%" -Dsolr.install.dir="%SOLR_TIP%" -Dsolr.default.confdir="%DEFAULT_CONFDIR%" ^
 -Djetty.host=%SOLR_JETTY_HOST% -Djetty.port=%SOLR_PORT% -Djetty.home="%SOLR_SERVER_DIR%" ^
 -Djava.io.tmpdir="%SOLR_SERVER_DIR%\tmp" -jar start.jar "%SOLR_JETTY_CONFIG%" "%SOLR_JETTY_ADDL_CONFIG%"
 ) ELSE (
@@ -1228,13 +1228,13 @@ IF "%FG%"=="1" (
 "%JAVA%" %SERVEROPT% %SOLR_JAVA_MEM% %START_OPTS% %GCLOG_OPT% ^
 -Dlog4j.configuration="%LOG4J_CONFIG%" -DSTOP.PORT=!STOP_PORT! -DSTOP.KEY=%STOP_KEY% ^
 -Dsolr.log.muteconsole ^
--Dsolr.solr.home="%SOLR_HOME%" -Dsolr.install.dir="%SOLR_TIP%" ^
+-Dsolr.solr.home="%SOLR_HOME%" -Dsolr.install.dir="%SOLR_TIP%" -Dsolr.default.confdir="%DEFAULT_CONFDIR%" ^
 -Djetty.host=%SOLR_JETTY_HOST% -Djetty.port=%SOLR_PORT% -Djetty.home="%SOLR_SERVER_DIR%" ^
 -Djava.io.tmpdir="%SOLR_SERVER_DIR%\tmp" -jar start.jar "%SOLR_JETTY_CONFIG%" "%SOLR_JETTY_ADDL_CONFIG%" > "!SOLR_LOGS_DIR!\solr-%SOLR_PORT%-console.log"
 echo %SOLR_PORT%>"%SOLR_TIP%"\bin\solr-%SOLR_PORT%.port
 REM now wait to see Solr come online ...
-"%JAVA%" %SOLR_SSL_OPTS% %AUTHC_OPTS% %SOLR_ZK_CREDS_AND_ACLS% -Dsolr.install.dir="%SOLR_TIP%" ^
+"%JAVA%" %SOLR_SSL_OPTS% %AUTHC_OPTS% %SOLR_ZK_CREDS_AND_ACLS% -Dsolr.install.dir="%SOLR_TIP%" -Dsolr.default.confdir="%DEFAULT_CONFDIR%"^
 -Dlog4j.configuration="file:%DEFAULT_SERVER_DIR%\scripts\cloud-scripts\log4j.properties" ^
 -classpath "%DEFAULT_SERVER_DIR%\solr-webapp\webapp\WEB-INF\lib\*;%DEFAULT_SERVER_DIR%\lib\ext\*" ^
 org.apache.solr.util.SolrCLI status -maxWaitSecs 30 -solr !SOLR_URL_SCHEME!://%SOLR_TOOL_HOST%:%SOLR_PORT%/solr
@@ -1404,10 +1404,8 @@ IF "!CREATE_NAME!"=="" (
 set "SCRIPT_ERROR=Name (-c) is a required parameter for %SCRIPT_CMD%"
 goto invalid_cmd_line
 )
-IF "!CREATE_CONFDIR!"=="" set CREATE_CONFDIR=data_driven_schema_configs
 IF "!CREATE_NUM_SHARDS!"=="" set CREATE_NUM_SHARDS=1
 IF "!CREATE_REPFACT!"=="" set CREATE_REPFACT=1
-IF "!CREATE_CONFNAME!"=="" set CREATE_CONFNAME=!CREATE_NAME!
 REM Find a port that Solr is running on
 if "!CREATE_PORT!"=="" (
@@ -1433,7 +1431,7 @@ if "%SCRIPT_CMD%"=="create_core" (
 org.apache.solr.util.SolrCLI create_core -name !CREATE_NAME! -solrUrl !SOLR_URL_SCHEME!://%SOLR_TOOL_HOST%:!CREATE_PORT!/solr ^
 -confdir !CREATE_CONFDIR! -configsetsDir "%SOLR_TIP%\server\solr\configsets"
 ) else (
-"%JAVA%" %SOLR_SSL_OPTS% %AUTHC_OPTS% %SOLR_ZK_CREDS_AND_ACLS% -Dsolr.install.dir="%SOLR_TIP%" ^
+"%JAVA%" %SOLR_SSL_OPTS% %AUTHC_OPTS% %SOLR_ZK_CREDS_AND_ACLS% -Dsolr.install.dir="%SOLR_TIP%" -Dsolr.default.confdir="%DEFAULT_CONFDIR%"^
 -Dlog4j.configuration="file:%DEFAULT_SERVER_DIR%\scripts\cloud-scripts\log4j.properties" ^
 -classpath "%DEFAULT_SERVER_DIR%\solr-webapp\webapp\WEB-INF\lib\*;%DEFAULT_SERVER_DIR%\lib\ext\*" ^
 org.apache.solr.util.SolrCLI create -name !CREATE_NAME! -shards !CREATE_NUM_SHARDS! -replicationFactor !CREATE_REPFACT! ^

FacetingAccumulator.java

@@ -28,7 +28,7 @@ import java.util.List;
 import java.util.Map;
 import java.util.Map.Entry;
 import java.util.Set;
-import java.util.HashMap;
+import java.util.TreeMap;
 import com.google.common.collect.Iterables;
 import org.apache.lucene.index.LeafReaderContext;
@@ -98,7 +98,7 @@ public class FacetingAccumulator extends BasicAccumulator implements FacetValueA
 List<RangeFacetRequest> rangeFreqs = request.getRangeFacets();
 List<QueryFacetRequest> queryFreqs = request.getQueryFacets();
-this.fieldFacetExpressions = new HashMap<>();
+this.fieldFacetExpressions = new TreeMap<>();
 this.rangeFacetExpressions = new LinkedHashMap<>(rangeFreqs.size());
 this.queryFacetExpressions = new LinkedHashMap<>(queryFreqs.size());
 this.fieldFacetCollectors = new LinkedHashMap<>(fieldFreqs.size());
@@ -120,8 +120,8 @@ public class FacetingAccumulator extends BasicAccumulator implements FacetValueA
 final SchemaField ff = fr.getField();
 final FieldFacetAccumulator facc = FieldFacetAccumulator.create(searcher, this, ff);
 facetAccumulators.add(facc);
-fieldFacetExpressions.put(freq.getName(), new HashMap<String, Expression[]>() );
-fieldFacetCollectors.put(freq.getName(), new HashMap<String,StatsCollector[]>());
+fieldFacetExpressions.put(freq.getName(), new TreeMap<String, Expression[]>() );
+fieldFacetCollectors.put(freq.getName(), new TreeMap<String,StatsCollector[]>());
 }
 /**
  * For each range and query facet request add a bucket to the corresponding
@@ -299,22 +299,6 @@ public class FacetingAccumulator extends BasicAccumulator implements FacetValueA
 @Override
 public int compare(Entry<String,Expression[]> o1, Entry<String,Expression[]> o2) {
-// Handle nulls. Null is treated as an infinitely big number so that in case of ASCENDING sorts,
-// Nulls will appear last. In case of DESC sorts, Nulls will appear last.
-boolean firstIsNull = false;
-if (o1 == null || o1.getValue() == null || o1.getValue()[comparatorExpressionPlace] == null)
-firstIsNull = true;
-boolean secondIsNull = false;
-if (o2 == null || o2.getValue() == null || o2.getValue()[comparatorExpressionPlace] == null)
-secondIsNull = true;
-if (firstIsNull && secondIsNull)
-return 0;
-else if (firstIsNull)
-return 1;
-else if (secondIsNull)
-return -1;
 return comp.compare(o1.getValue()[comparatorExpressionPlace], o2.getValue()[comparatorExpressionPlace]);
 }
 }
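The HashMap-to-TreeMap swap in this file changes the iteration order of facet buckets. A minimal plain-Java sketch (illustrative names only, not Solr classes) of why TreeMap gives deterministic, sorted traversal where HashMap does not:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

public class FacetOrderDemo {
    public static void main(String[] args) {
        // Same entries inserted in the same order into both maps.
        Map<String, Integer> hashed = new HashMap<>();
        Map<String, Integer> sorted = new TreeMap<>();
        for (String name : new String[]{"price", "author", "category"}) {
            hashed.put(name, name.length());
            sorted.put(name, name.length());
        }
        // TreeMap iterates keys in natural (sorted) order, so output
        // built from it is stable across JVMs and runs.
        System.out.println(sorted.keySet()); // [author, category, price]
        // HashMap order depends on hash codes and capacity; it is not
        // guaranteed to match insertion or sorted order.
        System.out.println(hashed.keySet());
    }
}
```

Deterministic ordering like this is a common reason to prefer TreeMap when results are compared in tests or rendered to users.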

Expression.java

@@ -29,19 +29,10 @@ public abstract class Expression {
 public Comparator<Expression> comparator(final FacetSortDirection direction) {
 return (a, b) -> {
-boolean aIsNull = a.getValue() == null;
-boolean bIsNull = b.getValue() == null;
-if (aIsNull && bIsNull) return 0;
-if( direction == FacetSortDirection.ASCENDING ){ // nulls are last for ASC sort
-return aIsNull ? 1
-: bIsNull ? -1
-: a.getValue().compareTo(b.getValue());
+if( direction == FacetSortDirection.ASCENDING ){
+return a.getValue().compareTo(b.getValue());
 } else {
-return aIsNull ? -1
-: bIsNull ? 1
-: b.getValue().compareTo(a.getValue());
+return b.getValue().compareTo(a.getValue());
 }
 };
 }
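The rewritten comparator drops the null handling and keeps only the direction dispatch, so it assumes non-null values. The same ascending/descending logic can be sketched with `java.util.Comparator` (illustrative types, not the Solr API):

```java
import java.util.Arrays;
import java.util.Comparator;

public class DirectionComparatorDemo {
    enum Direction { ASCENDING, DESCENDING }

    // Equivalent of the simplified comparator() body: compareTo for ASC,
    // reversed operands for DESC. Values are assumed non-null.
    static <T extends Comparable<T>> Comparator<T> comparator(Direction d) {
        return (a, b) -> d == Direction.ASCENDING ? a.compareTo(b) : b.compareTo(a);
    }

    public static void main(String[] args) {
        Integer[] vals = {3, 1, 2};
        Arrays.sort(vals, comparator(Direction.ASCENDING));
        System.out.println(Arrays.toString(vals)); // [1, 2, 3]
        Arrays.sort(vals, comparator(Direction.DESCENDING));
        System.out.println(Arrays.toString(vals)); // [3, 2, 1]
    }
}
```

Swapping the operands (`b.compareTo(a)`) is the standard way to invert a natural ordering without negating, which avoids overflow pitfalls of `-compare(...)`.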

MinMaxStatsCollector.java

@@ -37,33 +37,20 @@ public class MinMaxStatsCollector implements StatsCollector{
 protected MutableValue value;
 protected final Set<String> statsList;
 protected final ValueSource source;
+protected FunctionValues function;
 protected ValueFiller valueFiller;
-private CollectorState state;
-public MinMaxStatsCollector(ValueSource source, Set<String> statsList, CollectorState state) {
+public MinMaxStatsCollector(ValueSource source, Set<String> statsList) {
 this.source = source;
 this.statsList = statsList;
-this.state = state;
 }
 public void setNextReader(LeafReaderContext context) throws IOException {
-state.setNextReader(source, context);
-valueFiller = state.function.getValueFiller();
+function = source.getValues(null, context);
+valueFiller = function.getValueFiller();
 value = valueFiller.getValue();
 }
-public static class CollectorState {
-FunctionValues function;
-LeafReaderContext context = null;
-public void setNextReader(ValueSource source, LeafReaderContext context) throws IOException {
-if (this.context != context) {
-this.context = context;
-this.function = source.getValues(null, context);
-}
-}
-}
 public void collect(int doc) throws IOException {
 valueFiller.fillValue(doc);
 if( value.exists ){
@@ -114,7 +101,7 @@ public class MinMaxStatsCollector implements StatsCollector{
 @Override
 public FunctionValues getFunction() {
-return state.function;
+return function;
 }
 public String valueSourceString() {
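After this change, `setNextReader` unconditionally re-derives `FunctionValues` from the `ValueSource` for each leaf segment; the shared `CollectorState` cache is gone. A stand-in sketch with hypothetical types (not Lucene's), showing the per-segment reset plus the min/max accumulation pattern the class follows:

```java
import java.util.function.IntToDoubleFunction;

public class MinMaxDemo {
    // Stand-in for a Lucene leaf segment (hypothetical type).
    static class Segment {
        final double[] docValues;
        Segment(double... v) { this.docValues = v; }
    }

    IntToDoubleFunction function; // re-derived per segment, like FunctionValues
    double min = Double.POSITIVE_INFINITY;
    double max = Double.NEGATIVE_INFINITY;

    // Analogue of setNextReader: always rebuild the value function for
    // the new segment instead of caching it across segments.
    void setNextSegment(Segment seg) {
        function = doc -> seg.docValues[doc];
    }

    // Analogue of collect(doc): read the current segment's value.
    void collect(int doc) {
        double v = function.applyAsDouble(doc);
        if (v < min) min = v;
        if (v > max) max = v;
    }

    public static void main(String[] args) {
        MinMaxDemo c = new MinMaxDemo();
        for (Segment seg : new Segment[]{new Segment(5, 2), new Segment(9, 4)}) {
            c.setNextSegment(seg);
            for (int d = 0; d < seg.docValues.length; d++) c.collect(d);
        }
        System.out.println(c.min + " " + c.max); // 2.0 9.0
    }
}
```

The per-segment reset matters because doc IDs are segment-relative; stale per-segment state would read values from the wrong segment.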

NumericStatsCollector.java

@@ -29,16 +29,14 @@ public class NumericStatsCollector extends MinMaxStatsCollector {
 protected double sumOfSquares = 0;
 protected double mean = 0;
 protected double stddev = 0;
-protected CollectorState state;
-public NumericStatsCollector(ValueSource source, Set<String> statsList, CollectorState state) {
-super(source, statsList, state);
-this.state = state;
+public NumericStatsCollector(ValueSource source, Set<String> statsList) {
+super(source, statsList);
 }
 public void collect(int doc) throws IOException {
 super.collect(doc);
-double value = state.function.doubleVal(doc);
+double value = function.doubleVal(doc);
 sum += value;
 sumOfSquares += (value * value);
 }
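`NumericStatsCollector` only accumulates `sum` and `sumOfSquares` per document; mean and standard deviation follow from the usual single-pass identities. A self-contained sketch of that arithmetic (assuming the population-variance identity E[x²] − E[x]², which the single-pass accumulation supports; this is illustration, not Solr's exact code path):

```java
public class RunningStatsDemo {
    public static void main(String[] args) {
        double sum = 0, sumOfSquares = 0;
        long count = 0;
        for (double value : new double[]{2, 4, 4, 4, 5, 5, 7, 9}) {
            sum += value;
            sumOfSquares += value * value; // same accumulation as collect(doc)
            count++;
        }
        double mean = sum / count;
        // Population standard deviation via E[x^2] - E[x]^2.
        double stddev = Math.sqrt(sumOfSquares / count - mean * mean);
        System.out.println(mean);   // 5.0
        System.out.println(stddev); // 2.0
    }
}
```

The single-pass identity is convenient for streaming collectors, though it can lose precision for large values; Welford's algorithm is the numerically safer alternative.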

StatsCollectorSupplierFactory.java

@@ -33,7 +33,6 @@ import org.apache.lucene.queries.function.valuesource.IntFieldSource;
 import org.apache.lucene.queries.function.valuesource.LongFieldSource;
 import org.apache.solr.analytics.expression.ExpressionFactory;
 import org.apache.solr.analytics.request.ExpressionRequest;
-import org.apache.solr.analytics.statistics.MinMaxStatsCollector.CollectorState;
 import org.apache.solr.analytics.util.AnalyticsParams;
 import org.apache.solr.analytics.util.valuesource.AbsoluteValueDoubleFunction;
 import org.apache.solr.analytics.util.valuesource.AddDoubleFunction;
@@ -214,32 +213,25 @@ public class StatsCollectorSupplierFactory {
 }
 }
 }
-final CollectorState states[] = new CollectorState[statsArr.length];
-for (int count = 0; count < statsArr.length; count++) {
-states[count] = new CollectorState();
-}
 // Making the Supplier
 return new Supplier<StatsCollector[]>() {
-private final CollectorState collectorState[] = states;
 public StatsCollector[] get() {
 StatsCollector[] collectors = new StatsCollector[statsArr.length];
 for (int count = 0; count < statsArr.length; count++) {
 if(numericBools[count]){
-StatsCollector sc = new NumericStatsCollector(sourceArr[count], statsArr[count], collectorState[count]);
+StatsCollector sc = new NumericStatsCollector(sourceArr[count], statsArr[count]);
 if(uniqueBools[count]) sc = new UniqueStatsCollector(sc);
 if(medianBools[count]) sc = new MedianStatsCollector(sc);
 if(percsArr[count]!=null) sc = new PercentileStatsCollector(sc,percsArr[count],percsNames[count]);
 collectors[count]=sc;
 } else if (dateBools[count]) {
-StatsCollector sc = new MinMaxStatsCollector(sourceArr[count], statsArr[count], collectorState[count]);
+StatsCollector sc = new MinMaxStatsCollector(sourceArr[count], statsArr[count]);
 if(uniqueBools[count]) sc = new UniqueStatsCollector(sc);
 if(medianBools[count]) sc = new DateMedianStatsCollector(sc);
 if(percsArr[count]!=null) sc = new PercentileStatsCollector(sc,percsArr[count],percsNames[count]);
 collectors[count]=sc;
 } else {
-StatsCollector sc = new MinMaxStatsCollector(sourceArr[count], statsArr[count], collectorState[count]);
+StatsCollector sc = new MinMaxStatsCollector(sourceArr[count], statsArr[count]);
 if(uniqueBools[count]) sc = new UniqueStatsCollector(sc);
 if(medianBools[count]) sc = new MedianStatsCollector(sc);
 if(percsArr[count]!=null) sc = new PercentileStatsCollector(sc,percsArr[count],percsNames[count]);
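The supplier body above layers optional wrappers (unique, median, percentile) over a base collector, each one wrapping the previous. The decorator pattern it uses can be sketched with illustrative interfaces (not Solr's actual classes):

```java
public class DecoratorDemo {
    interface Stats { String describe(); }

    static class Base implements Stats {
        public String describe() { return "minmax"; }
    }

    // Each decorator wraps the previous collector, mirroring
    // `if(uniqueBools[count]) sc = new UniqueStatsCollector(sc);` etc.
    static class Unique implements Stats {
        final Stats inner;
        Unique(Stats inner) { this.inner = inner; }
        public String describe() { return "unique(" + inner.describe() + ")"; }
    }

    static class Median implements Stats {
        final Stats inner;
        Median(Stats inner) { this.inner = inner; }
        public String describe() { return "median(" + inner.describe() + ")"; }
    }

    public static void main(String[] args) {
        boolean wantUnique = true, wantMedian = true;
        Stats sc = new Base();
        if (wantUnique) sc = new Unique(sc);
        if (wantMedian) sc = new Median(sc);
        System.out.println(sc.describe()); // median(unique(minmax))
    }
}
```

Because each wrapper only depends on the `Stats` interface, features compose in any combination without a class per combination.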


@@ -1,4 +0,0 @@
-o.ar.s.min=min(double_dd)
-o.ar.s.max=max(long_ld)
-o.ar.ff=string_sd
-o.ar.ff.string_sd.sortstatistic=min


@@ -1,14 +0,0 @@
-<?xml version="1.0" encoding="UTF-8"?>
-<analyticsRequestEnvelope stats="true" olap="true">
-<analyticsRequest>
-<name>MinMax Request</name>
-<statistic>
-<expression>min(double(double_dd))</expression>
-<name>min</name>
-</statistic>
-<statistic>
-<expression>max(long(long_ld))</expression>
-<name>max</name>
-</statistic>
-</analyticsRequest>
-</analyticsRequestEnvelope>


@@ -39,14 +39,14 @@
 These are provided more for backward compatability, allowing one
 to create a schema that matches an existing lucene index.
 -->
-<fieldType name="int" class="solr.TrieIntField" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
-<fieldType name="float" class="solr.TrieFloatField" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
-<fieldType name="long" class="solr.TrieLongField" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
-<fieldType name="double" class="solr.TrieDoubleField" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
+<fieldType name="int" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
+<fieldType name="float" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
+<fieldType name="long" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
+<fieldType name="double" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
 <!-- format for date is 1995-12-31T23:59:59.999Z and only the fractional
 seconds part (.999) is optional.
 -->
-<fieldType name="date" class="solr.TrieDateField" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
+<fieldType name="date" class="${solr.tests.DateFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
 <fieldType name="boolean" class="solr.BoolField"/>
 <fieldType name="string" class="solr.StrField"/>
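The randomized test schema above relies on `${property}` substitution: each test run sets `solr.tests.IntegerFieldType` and friends to either a Trie or a Point class before the schema is parsed. A toy regex-based substituter showing the mechanism (illustration only, not Solr's actual property-expansion code):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class PropertySubstDemo {
    private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([^}]+)\\}");

    // Replace each ${name} with its value from props; unknown names are
    // left intact here (Solr itself can fail or apply a default instead).
    static String substitute(String text, Map<String, String> props) {
        Matcher m = PLACEHOLDER.matcher(text);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String value = props.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(out, Matcher.quoteReplacement(value));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        String line = "<fieldType name=\"int\" class=\"${solr.tests.IntegerFieldType}\""
                + " docValues=\"${solr.tests.numeric.dv}\"/>";
        Map<String, String> props = Map.of(
                "solr.tests.IntegerFieldType", "solr.IntPointField",
                "solr.tests.numeric.dv", "true");
        System.out.println(substitute(line, props));
        // -> <fieldType name="int" class="solr.IntPointField" docValues="true"/>
    }
}
```

Randomizing the property values per run is what lets one schema file exercise both Trie and Point numeric implementations.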

AbstractAnalyticsStatsTest.java

@@ -48,6 +48,7 @@ import org.xml.sax.SAXException;
 import com.google.common.collect.ObjectArrays;
+@SolrTestCaseJ4.SuppressPointFields(bugUrl="https://issues.apache.org/jira/browse/SOLR-10949")
 public class AbstractAnalyticsStatsTest extends SolrTestCaseJ4 {
 protected static final String[] BASEPARMS = new String[]{ "q", "*:*", "indent", "true", "olap", "true", "rows", "0" };


@@ -16,7 +16,6 @@
  */
 package org.apache.solr.analytics;
 import java.util.ArrayList;
 import java.util.List;
@@ -60,7 +59,7 @@ public class NoFacetTest extends AbstractAnalyticsStatsTest {
   @BeforeClass
   public static void beforeClass() throws Exception {
-    initCore("solrconfig-analytics.xml","schema-analytics.xml");
+    initCore("solrconfig-basic.xml","schema-analytics.xml");
     h.update("<delete><query>*:*</query></delete>");
     defaults.put("int_id", new Integer(0));
     defaults.put("long_ld", new Long(0));

@@ -48,7 +48,7 @@ public class ExpressionTest extends AbstractAnalyticsStatsTest {
   @BeforeClass
   public static void beforeClass() throws Exception {
-    initCore("solrconfig-analytics.xml", "schema-analytics.xml");
+    initCore("solrconfig-basic.xml", "schema-analytics.xml");
     h.update("<delete><query>*:*</query></delete>");
     for (int j = 0; j < NUM_LOOPS; ++j) {

@@ -52,6 +52,7 @@ import javax.xml.xpath.XPathConstants;
 import javax.xml.xpath.XPathExpressionException;
 import javax.xml.xpath.XPathFactory;
+@SolrTestCaseJ4.SuppressPointFields(bugUrl="https://issues.apache.org/jira/browse/SOLR-10949")
 public class AbstractAnalyticsFacetTest extends SolrTestCaseJ4 {
   protected static final HashMap<String,Object> defaults = new HashMap<>();
@@ -312,19 +313,4 @@ public class AbstractAnalyticsFacetTest extends SolrTestCaseJ4 {
       IOUtils.closeWhileHandlingException(file, in);
     }
   }
-  protected void removeNodes(String xPath, List<Double> string) throws XPathExpressionException {
-    NodeList missingNodes = getNodes(xPath);
-    List<Double> result = new ArrayList<Double>();
-    for (int idx = 0; idx < missingNodes.getLength(); ++idx) {
-      result.add(Double.parseDouble(missingNodes.item(idx).getTextContent()));
-    }
-    string.removeAll(result);
-  }
-  protected NodeList getNodes(String xPath) throws XPathExpressionException {
-    StringBuilder sb = new StringBuilder(xPath);
-    return (NodeList) xPathFact.newXPath().compile(sb.toString()).evaluate(doc, XPathConstants.NODESET);
-  }
 }


@@ -1,56 +0,0 @@
-/*
- * Licensed to the Apache Software Foundation (ASF) under one or more
- * contributor license agreements. See the NOTICE file distributed with
- * this work for additional information regarding copyright ownership.
- * The ASF licenses this file to You under the Apache License, Version 2.0
- * (the "License"); you may not use this file except in compliance with
- * the License. You may obtain a copy of the License at
- *
- *     http://www.apache.org/licenses/LICENSE-2.0
- *
- * Unless required by applicable law or agreed to in writing, software
- * distributed under the License is distributed on an "AS IS" BASIS,
- * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- * See the License for the specific language governing permissions and
- * limitations under the License.
- */
-package org.apache.solr.analytics.facet;
-import org.apache.lucene.util.LuceneTestCase.SuppressCodecs;
-import org.apache.solr.analytics.AbstractAnalyticsStatsTest;
-import org.apache.solr.analytics.expression.ExpressionTest;
-import org.junit.BeforeClass;
-import org.junit.Test;
-@SuppressCodecs({"Lucene3x","Lucene40","Lucene41","Lucene42","Appending","Asserting"})
-public class FacetSortingTest extends AbstractAnalyticsStatsTest {
-  private static String fileName = "/analytics/requestFiles/facetSorting.txt";
-  @BeforeClass
-  public static void beforeClass() throws Exception {
-    initCore("solrconfig-analytics.xml", "schema-analytics.xml");
-    h.update("<delete><query>*:*</query></delete>");
-    // The data set below is so generated that in bucket corresponding fieldFacet B, double_dd column has null values
-    // and in bucket C corresponding to fieldFacet C has null values for column long_ld.
-    // FieldFaceting occurs on string_sd field
-    assertU(adoc("id", "1001", "string_sd", "A", "double_dd", "" + 3, "long_ld", "" + 1));
-    assertU(adoc("id", "1002", "string_sd", "A", "double_dd", "" + 25, "long_ld", "" + 2));
-    assertU(adoc("id", "1003", "string_sd", "B", "long_ld", "" + 3));
-    assertU(adoc("id", "1004", "string_sd", "B", "long_ld", "" + 4));
-    assertU(adoc("id", "1005", "string_sd", "C", "double_dd", "" + 17));
-    assertU(commit());
-    String response = h.query(request(fileToStringArr(ExpressionTest.class, fileName)));
-    System.out.println("Response=" + response);
-    setResponse(response);
-  }
-  @Test
-  public void addTest() throws Exception {
-    Double minResult = (Double) getStatResult("ar", "min", VAL_TYPE.DOUBLE);
-    Long maxResult = (Long) getStatResult("ar", "max", VAL_TYPE.LONG);
-    assertEquals(Double.valueOf(minResult), Double.valueOf(3.0));
-    assertEquals(Long.valueOf(maxResult),Long.valueOf(4));
-  }
-}


@@ -44,7 +44,7 @@ public class FieldFacetExtrasTest extends AbstractAnalyticsFacetTest {
   @BeforeClass
   public static void beforeClass() throws Exception {
-    initCore("solrconfig-analytics.xml","schema-analytics.xml");
+    initCore("solrconfig-basic.xml","schema-analytics.xml");
     h.update("<delete><query>*:*</query></delete>");
     //INT

@@ -24,7 +24,6 @@ import java.util.List;
 import org.junit.Assert;
 import org.junit.BeforeClass;
 import org.junit.Test;
-import org.w3c.dom.Node;
 public class FieldFacetTest extends AbstractAnalyticsFacetTest{
@@ -88,7 +87,7 @@ public class FieldFacetTest extends AbstractAnalyticsFacetTest{
   @BeforeClass
   public static void beforeClass() throws Exception {
-    initCore("solrconfig-analytics.xml","schema-analytics.xml");
+    initCore("solrconfig-basic.xml","schema-analytics.xml");
     h.update("<delete><query>*:*</query></delete>");
     defaults.put("int", new Integer(0));
@@ -1038,25 +1037,21 @@ public class FieldFacetTest extends AbstractAnalyticsFacetTest{
   public void missingFacetTest() throws Exception {
     //int MultiDate
     String xPath = "/response/lst[@name='stats']/lst[@name='missingf']/lst[@name='fieldFacets']/lst[@name='date_dtdm']/lst[@name='(MISSING)']";
-    Node missingNodeXPath = getNode(xPath);
-    assertNotNull(getRawResponse(), missingNodeXPath);
+    assertNotNull(getRawResponse(), getNode(xPath));
     ArrayList<Double> string = getDoubleList("missingf", "fieldFacets", "date_dtdm", "double", "mean");
-    super.removeNodes(xPath, string);
+    string.remove(0);
     ArrayList<Double> stringTest = calculateNumberStat(multiDateTestStart, "mean");
     assertEquals(getRawResponse(), string,stringTest);
     //Int String
     xPath = "/response/lst[@name='stats']/lst[@name='missingf']/lst[@name='fieldFacets']/lst[@name='string_sd']/lst[@name='(MISSING)']";
-    missingNodeXPath = getNode(xPath);
-    String missingNodeXPathStr = xPath;
-    assertNotNull(getRawResponse(), missingNodeXPath);
+    assertNotNull(getRawResponse(), getNode(xPath));
     xPath = "/response/lst[@name='stats']/lst[@name='missingf']/lst[@name='fieldFacets']/lst[@name='string_sd']/lst[@name='str0']";
     assertNull(getRawResponse(), getNode(xPath));
     List<Double> intString = getDoubleList("missingf", "fieldFacets", "string_sd", "double", "mean");
-    removeNodes(missingNodeXPathStr, intString);
+    intString.remove(0);
     ArrayList<Double> intStringTest = calculateNumberStat(intStringTestStart, "mean");
     assertEquals(getRawResponse(), intString,intStringTest);
@@ -1065,6 +1060,8 @@
     ArrayList<ArrayList<Double>> intDateMissingTestStart = (ArrayList<ArrayList<Double>>) intDateTestStart.clone();
     ArrayList<Double> intDateTest = calculateNumberStat(intDateMissingTestStart, "mean");
     assertEquals(getRawResponse(),intDate,intDateTest);
   }
   private void checkStddevs(ArrayList<Double> list1, ArrayList<Double> list2) {
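The refactor in this hunk replaces the XPath-driven removeNodes helper with a plain List.remove(0), which relies on the (MISSING) facet bucket's stat appearing first in the returned list. A small self-contained sketch of that assumption, with made-up values:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class MissingBucketSketch {
    public static void main(String[] args) {
        // Assumed ordering: the (MISSING) bucket's mean comes first,
        // followed by the means of the real facet buckets.
        List<Double> means = new ArrayList<>(Arrays.asList(9.9, 1.5, 2.5));
        means.remove(0); // drop the (MISSING) bucket before comparing
        System.out.println(means); // [1.5, 2.5]
    }
}
```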


@@ -35,7 +35,7 @@ public class QueryFacetTest extends AbstractAnalyticsFacetTest {
   @BeforeClass
   public static void beforeClass() throws Exception {
-    initCore("solrconfig-analytics.xml","schema-analytics.xml");
+    initCore("solrconfig-basic.xml","schema-analytics.xml");
   }
   @SuppressWarnings("unchecked")


@@ -46,7 +46,7 @@ public class RangeFacetTest extends AbstractAnalyticsFacetTest {
   @BeforeClass
   public static void beforeClass() throws Exception {
-    initCore("solrconfig-analytics.xml","schema-analytics.xml");
+    initCore("solrconfig-basic.xml","schema-analytics.xml");
     h.update("<delete><query>*:*</query></delete>");
     //INT


@@ -35,7 +35,7 @@ public class FunctionTest extends AbstractAnalyticsStatsTest {
   @BeforeClass
   public static void beforeClass() throws Exception {
-    initCore("solrconfig-analytics.xml","schema-analytics.xml");
+    initCore("solrconfig-basic.xml","schema-analytics.xml");
     h.update("<delete><query>*:*</query></delete>");
     for (int j = 0; j < NUM_LOOPS; ++j) {


@@ -72,10 +72,10 @@
 <!--
  Default numeric field types. For faster range queries, consider the tint/tfloat/tlong/tdouble types.
 -->
-<fieldType name="int" class="solr.TrieIntField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="float" class="solr.TrieFloatField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="long" class="solr.TrieLongField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="double" class="solr.TrieDoubleField" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="int" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="float" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="long" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="double" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
 <!--
  Numeric field types that index each value at various levels of precision
@@ -87,10 +87,10 @@
  indexed per value, slightly larger index size, and faster range queries.
  A precisionStep of 0 disables indexing at different precision levels.
 -->
-<fieldType name="tint" class="solr.TrieIntField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tfloat" class="solr.TrieFloatField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tlong" class="solr.TrieLongField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tdouble" class="solr.TrieDoubleField" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tint" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tfloat" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tlong" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tdouble" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
 <!-- The format for this date field is of the form 1995-12-31T23:59:59Z, and
@@ -113,7 +113,7 @@
  Consult the TrieDateField javadocs for more information.
 -->
-<fieldType name="date" class="solr.TrieDateField" sortMissingLast="true" omitNorms="true"/>
+<fieldType name="date" class="${solr.tests.DateFieldType}" docValues="${solr.tests.numeric.dv}" sortMissingLast="true" omitNorms="true"/>
 <!-- The "RandomSortField" is not used to store or search any


@@ -73,10 +73,10 @@
 <!--
  Default numeric field types. For faster range queries, consider the tint/tfloat/tlong/tdouble types.
 -->
-<fieldType name="int" class="solr.TrieIntField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="float" class="solr.TrieFloatField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="long" class="solr.TrieLongField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="double" class="solr.TrieDoubleField" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="int" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="float" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="long" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="double" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
 <fieldType name="latLon" class="solr.LatLonType" subFieldType="double"/>
@@ -90,10 +90,10 @@
  indexed per value, slightly larger index size, and faster range queries.
  A precisionStep of 0 disables indexing at different precision levels.
 -->
-<fieldType name="tint" class="solr.TrieIntField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tfloat" class="solr.TrieFloatField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tlong" class="solr.TrieLongField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tdouble" class="solr.TrieDoubleField" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tint" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tfloat" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tlong" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tdouble" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
 <!-- The format for this date field is of the form 1995-12-31T23:59:59Z, and
@@ -116,7 +116,7 @@
  Consult the TrieDateField javadocs for more information.
 -->
-<fieldType name="date" class="solr.TrieDateField" sortMissingLast="true" omitNorms="true"/>
+<fieldType name="date" class="${solr.tests.DateFieldType}" docValues="${solr.tests.numeric.dv}" sortMissingLast="true" omitNorms="true"/>
 <!-- The "RandomSortField" is not used to store or search any


@@ -2,11 +2,11 @@
 <fieldType name="string" class="solr.StrField" sortMissingLast="true" omitNorms="true"/>
 <fieldType name="boolean" class="solr.BoolField" sortMissingLast="true" omitNorms="true"/>
-<fieldType name="tint" class="solr.TrieIntField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tfloat" class="solr.TrieFloatField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tlong" class="solr.TrieLongField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tdouble" class="solr.TrieDoubleField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="date" class="solr.TrieDateField" sortMissingLast="true" omitNorms="true"/>
+<fieldType name="tint" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tfloat" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tlong" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tdouble" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="date" class="${solr.tests.DateFieldType}" docValues="${solr.tests.numeric.dv}" sortMissingLast="true" omitNorms="true"/>
 <fieldType name="text" class="solr.TextField" positionIncrementGap="100">
   <analyzer type="index">
     <tokenizer class="solr.MockTokenizerFactory"/>


@@ -73,10 +73,10 @@
 <!--
  Default numeric field types. For faster range queries, consider the tint/tfloat/tlong/tdouble types.
 -->
-<fieldType name="int" class="solr.TrieIntField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="float" class="solr.TrieFloatField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="long" class="solr.TrieLongField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="double" class="solr.TrieDoubleField" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="int" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="float" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="long" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="double" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
 <!--
  Numeric field types that index each value at various levels of precision
@@ -88,10 +88,10 @@
  indexed per value, slightly larger index size, and faster range queries.
  A precisionStep of 0 disables indexing at different precision levels.
 -->
-<fieldType name="tint" class="solr.TrieIntField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tfloat" class="solr.TrieFloatField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tlong" class="solr.TrieLongField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tdouble" class="solr.TrieDoubleField" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tint" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tfloat" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tlong" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tdouble" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
 <!-- The format for this date field is of the form 1995-12-31T23:59:59Z, and
@@ -114,7 +114,7 @@
  Consult the TrieDateField javadocs for more information.
 -->
-<fieldType name="date" class="solr.TrieDateField" sortMissingLast="true" omitNorms="true"/>
+<fieldType name="date" class="${solr.tests.DateFieldType}" docValues="${solr.tests.numeric.dv}" sortMissingLast="true" omitNorms="true"/>
 <!-- The "RandomSortField" is not used to store or search any


@@ -32,7 +32,7 @@ import org.apache.solr.common.SolrInputDocument;
 import org.apache.solr.common.params.SolrParams;
 import org.apache.solr.schema.IndexSchema;
 import org.apache.solr.schema.SchemaField;
-import org.apache.solr.schema.TrieDateField;
+import org.apache.solr.schema.NumberType;
 import org.apache.tika.metadata.Metadata;
 import org.apache.tika.metadata.TikaMetadataKeys;
 import org.slf4j.Logger;
@@ -321,7 +321,7 @@ public class SolrContentHandler extends DefaultHandler implements ExtractingPara
    */
   protected String transformValue(String val, SchemaField schFld) {
     String result = val;
-    if (schFld != null && schFld.getType() instanceof TrieDateField) {
+    if (schFld != null && NumberType.DATE.equals(schFld.getType().getNumberType())) {
       //try to transform the date
       try {
         Date date = ExtractionDateUtil.parseDate(val, dateFormats); // may throw
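The hunk above swaps an instanceof check against the concrete TrieDateField class for a comparison on the field type's NumberType, so any date-capable type (Trie or Point) takes the same branch. A self-contained sketch of the pattern; the classes below are simplified stand-ins for Solr's schema API, not the real types:

```java
// Stand-ins mirroring the shape of Solr's NumberType/FieldType contract;
// these class names are assumptions for illustration only.
enum NumberType { INTEGER, LONG, FLOAT, DOUBLE, DATE }

abstract class FieldTypeSketch {
    abstract NumberType getNumberType();
}

class DateFieldSketch extends FieldTypeSketch {
    NumberType getNumberType() { return NumberType.DATE; }
}

class StringFieldSketch extends FieldTypeSketch {
    NumberType getNumberType() { return null; } // non-numeric type
}

public class TransformSketch {
    static boolean isDate(FieldTypeSketch type) {
        // Compare from the constant so a null NumberType is handled safely.
        return NumberType.DATE.equals(type.getNumberType());
    }

    public static void main(String[] args) {
        System.out.println(isDate(new DateFieldSketch()));   // true
        System.out.println(isDate(new StringFieldSketch())); // false
    }
}
```

Calling equals on the enum constant (rather than on getNumberType()) avoids a NullPointerException for field types that return no NumberType.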


@@ -31,10 +31,10 @@
 <!--
  Default numeric field types. For faster range queries, consider the tint/tfloat/tlong/tdouble types.
 -->
-<fieldType name="int" class="solr.TrieIntField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="float" class="solr.TrieFloatField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="long" class="solr.TrieLongField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="double" class="solr.TrieDoubleField" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="int" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="float" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="long" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="double" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
 <!--
  Numeric field types that index each value at various levels of precision
@@ -46,10 +46,10 @@
  indexed per value, slightly larger index size, and faster range queries.
  A precisionStep of 0 disables indexing at different precision levels.
 -->
-<fieldType name="tint" class="solr.TrieIntField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tfloat" class="solr.TrieFloatField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tlong" class="solr.TrieLongField" precisionStep="8" positionIncrementGap="0"/>
-<fieldType name="tdouble" class="solr.TrieDoubleField" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tint" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tfloat" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tlong" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
+<fieldType name="tdouble" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="8" positionIncrementGap="0"/>
 <!-- Field type demonstrating an Analyzer failure -->
 <fieldType name="failtype1" class="solr.TextField">
@@ -105,7 +105,7 @@
 <!-- format for date is 1995-12-31T23:59:59.999Z and only the fractional
      seconds part (.999) is optional.
 -->
-<fieldType name="date" class="solr.TrieDateField" sortMissingLast="true" omitNorms="true"/>
+<fieldType name="date" class="${solr.tests.DateFieldType}" docValues="${solr.tests.numeric.dv}" sortMissingLast="true" omitNorms="true"/>
 <!-- solr.TextField allows the specification of custom
      text analyzers specified as a tokenizer and a list


@@ -41,11 +41,11 @@
 <types>
 <fieldType name="string" class="solr.StrField" sortMissingLast="true" />
 <fieldType name="boolean" class="solr.BoolField" sortMissingLast="true"/>
-<fieldType name="int" class="solr.TrieIntField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="float" class="solr.TrieFloatField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="long" class="solr.TrieLongField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="double" class="solr.TrieDoubleField" precisionStep="0" positionIncrementGap="0"/>
-<fieldType name="date" class="solr.TrieDateField" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="int" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="float" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="long" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="double" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
+<fieldType name="date" class="${solr.tests.DateFieldType}" docValues="${solr.tests.numeric.dv}" precisionStep="0" positionIncrementGap="0"/>
 <fieldtype name="binary" class="solr.BinaryField"/>
 <fieldType name="text_ws" class="solr.TextField" positionIncrementGap="100">


@@ -94,13 +94,13 @@
  Default numeric field types. For faster range queries, consider
  the tint/tfloat/tlong/tdouble types.
 -->
-<fieldType name="int" class="solr.TrieIntField"
-  precisionStep="0" omitNorms="true" positionIncrementGap="0" />
-<fieldType name="float" class="solr.TrieFloatField"
-  precisionStep="0" omitNorms="true" positionIncrementGap="0" />
-<fieldType name="long" class="solr.TrieLongField"
-  precisionStep="0" omitNorms="true" positionIncrementGap="0" />
-<fieldType name="double" class="solr.TrieDoubleField"
-  precisionStep="0" omitNorms="true" positionIncrementGap="0" />
+<fieldType name="int" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}"
+  precisionStep="0" omitNorms="true" positionIncrementGap="0" />
+<fieldType name="float" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}"
+  precisionStep="0" omitNorms="true" positionIncrementGap="0" />
+<fieldType name="long" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}"
+  precisionStep="0" omitNorms="true" positionIncrementGap="0" />
+<fieldType name="double" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}"
+  precisionStep="0" omitNorms="true" positionIncrementGap="0" />
 <!--
@@ -113,13 +113,13 @@
  queries. A precisionStep of 0 disables indexing at different
  precision levels.
 -->
-<fieldType name="tint" class="solr.TrieIntField"
-  precisionStep="8" omitNorms="true" positionIncrementGap="0" />
-<fieldType name="tfloat" class="solr.TrieFloatField"
-  precisionStep="8" omitNorms="true" positionIncrementGap="0" />
-<fieldType name="tlong" class="solr.TrieLongField"
-  precisionStep="8" omitNorms="true" positionIncrementGap="0" />
-<fieldType name="tdouble" class="solr.TrieDoubleField"
-  precisionStep="8" omitNorms="true" positionIncrementGap="0" />
+<fieldType name="tint" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}"
+  precisionStep="8" omitNorms="true" positionIncrementGap="0" />
+<fieldType name="tfloat" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}"
+  precisionStep="8" omitNorms="true" positionIncrementGap="0" />
+<fieldType name="tlong" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}"
+  precisionStep="8" omitNorms="true" positionIncrementGap="0" />
+<fieldType name="tdouble" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}"
+  precisionStep="8" omitNorms="true" positionIncrementGap="0" />
 <!--
@@ -137,14 +137,14 @@
  the TrieDateField javadocs for more information. Note: For faster
  range queries, consider the tdate type
 -->
-<fieldType name="date" class="solr.TrieDateField"
-  omitNorms="true" precisionStep="0" positionIncrementGap="0" />
+<fieldType name="date" class="${solr.tests.DateFieldType}" docValues="${solr.tests.numeric.dv}"
+  omitNorms="true" precisionStep="0" positionIncrementGap="0" />
 <!--
  A Trie based date field for faster date range queries and date
  faceting.
 -->
-<fieldType name="tdate" class="solr.TrieDateField"
-  omitNorms="true" precisionStep="6" positionIncrementGap="0" />
+<fieldType name="tdate" class="${solr.tests.DateFieldType}" docValues="${solr.tests.numeric.dv}"
+  omitNorms="true" precisionStep="6" positionIncrementGap="0" />
 <!--

View File

@@ -94,13 +94,13 @@
 Default numeric field types. For faster range queries, consider
 the tint/tfloat/tlong/tdouble types.
 -->
-<fieldType name="int" class="solr.TrieIntField"
+<fieldType name="int" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}"
 precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
-<fieldType name="float" class="solr.TrieFloatField"
+<fieldType name="float" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}"
 precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
-<fieldType name="long" class="solr.TrieLongField"
+<fieldType name="long" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}"
 precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
-<fieldType name="double" class="solr.TrieDoubleField"
+<fieldType name="double" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}"
 precisionStep="0" omitNorms="true" positionIncrementGap="0"/>
 <!--
@@ -113,13 +113,13 @@
 queries. A precisionStep of 0 disables indexing at different
 precision levels.
 -->
-<fieldType name="tint" class="solr.TrieIntField"
+<fieldType name="tint" class="${solr.tests.IntegerFieldType}" docValues="${solr.tests.numeric.dv}"
 precisionStep="8" omitNorms="true" positionIncrementGap="0"/>
-<fieldType name="tfloat" class="solr.TrieFloatField"
+<fieldType name="tfloat" class="${solr.tests.FloatFieldType}" docValues="${solr.tests.numeric.dv}"
 precisionStep="8" omitNorms="true" positionIncrementGap="0"/>
-<fieldType name="tlong" class="solr.TrieLongField"
+<fieldType name="tlong" class="${solr.tests.LongFieldType}" docValues="${solr.tests.numeric.dv}"
 precisionStep="8" omitNorms="true" positionIncrementGap="0"/>
-<fieldType name="tdouble" class="solr.TrieDoubleField"
+<fieldType name="tdouble" class="${solr.tests.DoubleFieldType}" docValues="${solr.tests.numeric.dv}"
 precisionStep="8" omitNorms="true" positionIncrementGap="0"/>
 <!--
@@ -137,14 +137,14 @@
 the TrieDateField javadocs for more information. Note: For faster
 range queries, consider the tdate type
 -->
-<fieldType name="date" class="solr.TrieDateField"
+<fieldType name="date" class="${solr.tests.DateFieldType}" docValues="${solr.tests.numeric.dv}"
 omitNorms="true" precisionStep="0" positionIncrementGap="0"/>
 <!--
 A Trie based date field for faster date range queries and date
 faceting.
 -->
-<fieldType name="tdate" class="solr.TrieDateField"
+<fieldType name="tdate" class="${solr.tests.DateFieldType}" docValues="${solr.tests.numeric.dv}"
 omitNorms="true" precisionStep="6" positionIncrementGap="0"/>
 <!--

View File

@@ -22,6 +22,7 @@ import java.lang.invoke.MethodHandles;
 import java.util.ArrayList;
 import java.util.Collections;
 import java.util.HashMap;
+import java.util.HashSet;
 import java.util.LinkedHashMap;
 import java.util.List;
 import java.util.Map;
@@ -51,6 +52,7 @@ import org.apache.solr.common.util.ContentStreamBase;
 import org.apache.solr.common.util.NamedList;
 import org.apache.solr.common.util.SimpleOrderedMap;
 import org.apache.solr.common.util.Utils;
+import org.apache.solr.handler.admin.ConfigSetsHandlerApi;
 import org.apache.solr.handler.component.ShardHandler;
 import org.apache.solr.handler.component.ShardRequest;
 import org.apache.solr.request.LocalSolrQueryRequest;
@@ -306,18 +308,47 @@ public class CreateCollectionCmd implements Cmd {
 List<String> configNames = null;
 try {
 configNames = ocmh.zkStateReader.getZkClient().getChildren(ZkConfigManager.CONFIGS_ZKNODE, null, true);
-if (configNames != null && configNames.size() == 1) {
+if (configNames.contains(ConfigSetsHandlerApi.DEFAULT_CONFIGSET_NAME)) {
+if (!".system".equals(coll)) {
+copyDefaultConfigSetTo(configNames, coll);
+}
+return coll;
+} else if (configNames != null && configNames.size() == 1) {
 configName = configNames.get(0);
 // no config set named, but there is only 1 - use it
 log.info("Only one config set found in zk - using it:" + configName);
+} else if (configNames.contains(coll)) {
+configName = coll;
 }
 } catch (KeeperException.NoNodeException e) {
 }
 }
-return configName;
+return "".equals(configName)? null: configName;
 }

+/**
+ * Copies the _default configset to the specified configset name (overwrites if pre-existing)
+ */
+private void copyDefaultConfigSetTo(List<String> configNames, String targetConfig) {
+ZkConfigManager configManager = new ZkConfigManager(ocmh.zkStateReader.getZkClient());
+
+// if a configset named coll exists, delete the configset so that _default can be copied over
+if (configNames.contains(targetConfig)) {
+log.info("There exists a configset by the same name as the collection we're trying to create: " + targetConfig +
+", deleting it so that we can copy the _default configs over and create the collection.");
+try {
+configManager.deleteConfigDir(targetConfig);
+} catch (Exception e) {
+throw new SolrException(ErrorCode.INVALID_STATE, "Error while deleting configset: " + targetConfig, e);
+}
+} else {
+log.info("Only _default config set found, using it.");
+}
+// Copy _default into targetConfig
+try {
+configManager.copyConfigDir(ConfigSetsHandlerApi.DEFAULT_CONFIGSET_NAME, targetConfig, new HashSet<>());
+} catch (Exception e) {
+throw new SolrException(ErrorCode.INVALID_STATE, "Error while copying _default to " + targetConfig, e);
+}
+}

 public static void createCollectionZkNode(SolrZkClient zkClient, String collection, Map<String,String> params) {
@@ -413,26 +444,34 @@ public class CreateCollectionCmd implements Cmd {
 }
 }
-// if there is only one conf, use that
 try {
 configNames = zkClient.getChildren(ZkConfigManager.CONFIGS_ZKNODE, null,
 true);
 } catch (NoNodeException e) {
 // just keep trying
 }
-if (configNames != null && configNames.size() == 1) {
-// no config set named, but there is only 1 - use it
-log.info("Only one config set found in zk - using it:" + configNames.get(0));
-collectionProps.put(ZkController.CONFIGNAME_PROP, configNames.get(0));
-break;
-}
+
+// check if there's a config set with the same name as the collection
 if (configNames != null && configNames.contains(collection)) {
 log.info(
 "Could not find explicit collection configName, but found config name matching collection name - using that set.");
 collectionProps.put(ZkController.CONFIGNAME_PROP, collection);
 break;
 }
+
+// if _default exists, use that
+if (configNames != null && configNames.contains(ConfigSetsHandlerApi.DEFAULT_CONFIGSET_NAME)) {
+log.info(
+"Could not find explicit collection configName, but found _default config set - using that set.");
+collectionProps.put(ZkController.CONFIGNAME_PROP, ConfigSetsHandlerApi.DEFAULT_CONFIGSET_NAME);
+break;
+}
+
+// if there is only one conf, use that
+if (configNames != null && configNames.size() == 1) {
+// no config set named, but there is only 1 - use it
+log.info("Only one config set found in zk - using it:" + configNames.get(0));
+collectionProps.put(ZkController.CONFIGNAME_PROP, configNames.get(0));
+break;
+}

 log.info("Could not find collection configName - pausing for 3 seconds and trying again - try: " + retry);
 Thread.sleep(3000);
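The createCollectionZkNode hunk above reorders the fallback logic when no collection.configName is given: a configset matching the collection name still wins, then the new _default set, and only then the lone-configset shortcut. A minimal standalone sketch of that precedence (ConfigNameResolver and resolve are illustrative names for this sketch, not Solr's API):

```java
import java.util.Arrays;
import java.util.List;

// Models the config-name fallback order introduced by this patch.
public class ConfigNameResolver {
    // stands in for ConfigSetsHandlerApi.DEFAULT_CONFIGSET_NAME
    static final String DEFAULT = "_default";

    // Returns the configset to use for the collection, or null (caller retries).
    public static String resolve(List<String> configNames, String collection) {
        if (configNames.contains(collection)) {
            return collection;         // configset named after the collection
        }
        if (configNames.contains(DEFAULT)) {
            return DEFAULT;            // new: fall back to _default
        }
        if (configNames.size() == 1) {
            return configNames.get(0); // only one configset in ZK: use it
        }
        return null;
    }
}
```

Note that the first hunk in this file (determining the configset for the CREATE command) uses a slightly different order, preferring _default and copying it to a collection-named configset.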

View File

@@ -16,15 +16,19 @@
 */
 package org.apache.solr.cloud;

+import java.io.File;
 import java.io.IOException;
 import java.io.UnsupportedEncodingException;
 import java.lang.invoke.MethodHandles;
 import java.net.InetAddress;
 import java.net.NetworkInterface;
+import java.net.URISyntaxException;
+import java.net.URL;
 import java.net.URLEncoder;
 import java.net.UnknownHostException;
 import java.nio.charset.StandardCharsets;
 import java.nio.file.Path;
+import java.nio.file.Paths;
 import java.util.ArrayList;
 import java.util.Collection;
 import java.util.Collections;
@@ -57,6 +61,17 @@ import org.apache.solr.common.SolrException;
 import org.apache.solr.common.SolrException.ErrorCode;
 import org.apache.solr.common.cloud.*;
 import org.apache.solr.common.cloud.Replica.Type;
+import org.apache.solr.common.cloud.Slice;
+import org.apache.solr.common.cloud.SolrZkClient;
+import org.apache.solr.common.cloud.ZkACLProvider;
+import org.apache.solr.common.cloud.ZkCmdExecutor;
+import org.apache.solr.common.cloud.ZkConfigManager;
+import org.apache.solr.common.cloud.ZkCoreNodeProps;
+import org.apache.solr.common.cloud.ZkCredentialsProvider;
+import org.apache.solr.common.cloud.ZkMaintenanceUtils;
+import org.apache.solr.common.cloud.ZkNodeProps;
+import org.apache.solr.common.cloud.ZkStateReader;
+import org.apache.solr.common.cloud.ZooKeeperException;
 import org.apache.solr.common.params.CollectionParams;
 import org.apache.solr.common.params.CommonParams;
 import org.apache.solr.common.params.SolrParams;
@@ -71,7 +86,9 @@ import org.apache.solr.core.CoreContainer;
 import org.apache.solr.core.CoreDescriptor;
 import org.apache.solr.core.SolrCore;
 import org.apache.solr.core.SolrCoreInitializationException;
+import org.apache.solr.handler.admin.ConfigSetsHandlerApi;
 import org.apache.solr.logging.MDCLoggingContext;
+import org.apache.solr.servlet.SolrDispatchFilter;
 import org.apache.solr.update.UpdateLog;
 import org.apache.zookeeper.CreateMode;
 import org.apache.zookeeper.KeeperException;
@@ -654,7 +671,7 @@ public class ZkController {
 * @throws KeeperException if there is a Zookeeper error
 * @throws InterruptedException on interrupt
 */
-public static void createClusterZkNodes(SolrZkClient zkClient) throws KeeperException, InterruptedException {
+public static void createClusterZkNodes(SolrZkClient zkClient) throws KeeperException, InterruptedException, IOException {
 ZkCmdExecutor cmdExecutor = new ZkCmdExecutor(zkClient.getZkClientTimeout());
 cmdExecutor.ensureExists(ZkStateReader.LIVE_NODES_ZKNODE, zkClient);
 cmdExecutor.ensureExists(ZkStateReader.COLLECTIONS_ZKNODE, zkClient);
@@ -667,6 +684,48 @@ public class ZkController {
 cmdExecutor.ensureExists(ZkStateReader.CLUSTER_STATE, emptyJson, CreateMode.PERSISTENT, zkClient);
 cmdExecutor.ensureExists(ZkStateReader.SOLR_SECURITY_CONF_PATH, emptyJson, CreateMode.PERSISTENT, zkClient);
 cmdExecutor.ensureExists(ZkStateReader.SOLR_AUTOSCALING_CONF_PATH, emptyJson, CreateMode.PERSISTENT, zkClient);
+bootstrapDefaultConfigSet(zkClient);
 }

+private static void bootstrapDefaultConfigSet(SolrZkClient zkClient) throws KeeperException, InterruptedException, IOException {
+if (zkClient.exists("/configs/_default", true) == false) {
+String configDirPath = getDefaultConfigDirPath();
+if (configDirPath == null) {
+log.warn("The _default configset could not be uploaded. Please provide 'solr.default.confdir' parameter that points to a configset" +
+" intended to be the default. Current 'solr.default.confdir' value: {}", System.getProperty(SolrDispatchFilter.SOLR_DEFAULT_CONFDIR_ATTRIBUTE));
+} else {
+ZkMaintenanceUtils.upConfig(zkClient, Paths.get(configDirPath), ConfigSetsHandlerApi.DEFAULT_CONFIGSET_NAME);
+}
+}
+}
+
+/**
+ * Gets the absolute filesystem path of the _default configset to bootstrap from.
+ * First tries the sysprop "solr.default.confdir". If not found, tries to find
+ * the _default dir relative to the sysprop "solr.install.dir".
+ * If that fails as well, tries to get the _default from the
+ * classpath. Returns null if not found anywhere.
+ */
+private static String getDefaultConfigDirPath() {
+String configDirPath = null;
+String serverSubPath = "solr" + File.separator +
+"configsets" + File.separator + "_default" +
+File.separator + "conf";
+String subPath = File.separator + "server" + File.separator + serverSubPath;
+if (System.getProperty(SolrDispatchFilter.SOLR_DEFAULT_CONFDIR_ATTRIBUTE) != null && new File(System.getProperty(SolrDispatchFilter.SOLR_DEFAULT_CONFDIR_ATTRIBUTE)).exists()) {
+configDirPath = new File(System.getProperty(SolrDispatchFilter.SOLR_DEFAULT_CONFDIR_ATTRIBUTE)).getAbsolutePath();
+} else if (System.getProperty(SolrDispatchFilter.SOLR_INSTALL_DIR_ATTRIBUTE) != null &&
+new File(System.getProperty(SolrDispatchFilter.SOLR_INSTALL_DIR_ATTRIBUTE) + subPath).exists()) {
+configDirPath = new File(System.getProperty(SolrDispatchFilter.SOLR_INSTALL_DIR_ATTRIBUTE) + subPath).getAbsolutePath();
+} else { // find "_default" in the classpath. This one is used for tests
+URL classpathUrl = Thread.currentThread().getContextClassLoader().getResource(serverSubPath);
+try {
+if (classpathUrl != null && new File(classpathUrl.toURI()).exists()) {
+configDirPath = new File(classpathUrl.toURI()).getAbsolutePath();
+}
+} catch (URISyntaxException ex) {}
+}
+return configDirPath;
+}
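The new getDefaultConfigDirPath() above resolves the on-disk _default configset in three steps: the solr.default.confdir sysprop, then server/solr/configsets/_default/conf under solr.install.dir, then the classpath. A hedged, self-contained sketch of the first two steps, with the sysprop values passed in as parameters so it can be exercised directly (DefaultConfDirLookup is an illustrative name):

```java
import java.io.File;

// Models the filesystem part of the _default configset lookup this patch adds.
public class DefaultConfDirLookup {
    // defaultConfDir ~ solr.default.confdir, installDir ~ solr.install.dir
    public static String lookup(String defaultConfDir, String installDir) {
        String serverSubPath = "solr" + File.separator + "configsets"
            + File.separator + "_default" + File.separator + "conf";
        String subPath = File.separator + "server" + File.separator + serverSubPath;
        if (defaultConfDir != null && new File(defaultConfDir).exists()) {
            return new File(defaultConfDir).getAbsolutePath(); // explicit sysprop wins
        }
        if (installDir != null && new File(installDir + subPath).exists()) {
            return new File(installDir + subPath).getAbsolutePath(); // install-dir layout
        }
        return null; // the real code then falls back to the classpath (used by tests)
    }
}
```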
private void init(CurrentCoreDescriptorProvider registerOnReconnect) { private void init(CurrentCoreDescriptorProvider registerOnReconnect) {

View File

@@ -31,6 +31,8 @@ import org.apache.solr.response.SolrQueryResponse;
 public class ConfigSetsHandlerApi extends BaseHandlerApiSupport {

+final public static String DEFAULT_CONFIGSET_NAME = "_default";
+
 final ConfigSetsHandler configSetHandler;
 static Collection<ApiCommand> apiCommands = createMapping();

View File

@@ -216,6 +216,7 @@ public class ExpandComponent extends SearchComponent implements PluginInfoInitia
 if(CollapsingQParserPlugin.HINT_TOP_FC.equals(hint)) {
 Map<String, UninvertingReader.Type> mapping = new HashMap();
 mapping.put(field, UninvertingReader.Type.SORTED);
+@SuppressWarnings("resource")
 UninvertingReader uninvertingReader = new UninvertingReader(new ReaderWrapper(searcher.getSlowAtomicReader(), field), mapping);
 values = uninvertingReader.getSortedDocValues(field);
 } else {
@@ -386,6 +387,7 @@ public class ExpandComponent extends SearchComponent implements PluginInfoInitia
 if(CollapsingQParserPlugin.HINT_TOP_FC.equals(hint)) {
 Map<String, UninvertingReader.Type> mapping = new HashMap();
 mapping.put(field, UninvertingReader.Type.SORTED);
+@SuppressWarnings("resource")
 UninvertingReader uninvertingReader = new UninvertingReader(new ReaderWrapper(searcher.getSlowAtomicReader(), field), mapping);
 values = uninvertingReader.getSortedDocValues(field);
 } else {

View File

@@ -36,6 +36,7 @@ public abstract class SolrMetricReporter implements Closeable, PluginInfoInitial
 protected final SolrMetricManager metricManager;
 protected PluginInfo pluginInfo;
 protected boolean enabled = true;
+protected int period = SolrMetricManager.DEFAULT_CLOUD_REPORTER_PERIOD;

 /**
 * Create a reporter for metrics managed in a named registry.
@@ -85,6 +86,20 @@ public abstract class SolrMetricReporter implements Closeable, PluginInfoInitial
 }
 }

+/**
+ * @param period - in seconds
+ */
+public void setPeriod(int period) {
+this.period = period;
+}
+
+/**
+ * @return period, in seconds
+ */
+public int getPeriod() {
+return period;
+}
+
 /**
 * Get the effective {@link PluginInfo} instance that was used for
 * initialization of this plugin.
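The hunks above and in the following reporter files consolidate the duplicated 'period' field and setter into the SolrMetricReporter base class; SolrJmxReporter sets it to 0 ("not applicable") and rejects any other value in validate(). A hedged sketch of that pattern (class names here are illustrative, not Solr's):

```java
// Models the period consolidation this commit performs across the reporters.
public class ReporterDemo {
    // stands in for SolrMetricManager.DEFAULT_CLOUD_REPORTER_PERIOD
    static final int DEFAULT_PERIOD = 60;

    abstract static class Reporter {
        protected int period = DEFAULT_PERIOD;       // now lives in the base class
        public void setPeriod(int period) { this.period = period; } // in seconds
        public int getPeriod() { return period; }
        public void validate() { /* (period < 1) means "don't start reporter" */ }
    }

    // analogous to SolrGraphiteReporter etc.: no period field of its own anymore
    static class PeriodicReporter extends Reporter { }

    // analogous to SolrJmxReporter: period is meaningless for JMX
    static class JmxLikeReporter extends Reporter {
        JmxLikeReporter() { period = 0; } // zero marks "not applicable"
        @Override public void validate() {
            if (period != 0) throw new IllegalStateException("'period' is not supported");
        }
    }
}
```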

View File

@@ -35,7 +35,6 @@ public class SolrGangliaReporter extends SolrMetricReporter {
 private String host = null;
 private int port = -1;
 private boolean multicast;
-private int period = 60;
 private String instancePrefix = null;
 private List<String> filters = new ArrayList<>();
 private boolean testing;
@@ -88,10 +87,6 @@ public class SolrGangliaReporter extends SolrMetricReporter {
 }
 }

-public void setPeriod(int period) {
-this.period = period;
-}
-
 public void setMulticast(boolean multicast) {
 this.multicast = multicast;
 }

View File

@@ -36,7 +36,6 @@ public class SolrGraphiteReporter extends SolrMetricReporter {
 private String host = null;
 private int port = -1;
-private int period = 60;
 private boolean pickled = false;
 private String instancePrefix = null;
 private List<String> filters = new ArrayList<>();
@@ -90,10 +89,6 @@ public class SolrGraphiteReporter extends SolrMetricReporter {
 this.pickled = pickled;
 }

-public void setPeriod(int period) {
-this.period = period;
-}
-
 @Override
 protected void doInit() {
 if (reporter != null) {

View File

@@ -70,6 +70,7 @@ public class SolrJmxReporter extends SolrMetricReporter {
 */
 public SolrJmxReporter(SolrMetricManager metricManager, String registryName) {
 super(metricManager, registryName);
+period = 0; // setting to zero to indicate not applicable
 setDomain(registryName);
 }
@@ -151,7 +152,9 @@ public class SolrJmxReporter extends SolrMetricReporter {
 */
 @Override
 protected void validate() throws IllegalStateException {
-// Nothing to validate
+if (period != 0) {
+throw new IllegalStateException("Init argument 'period' is not supported for "+getClass().getCanonicalName());
+}
 }

View File

@@ -47,7 +47,6 @@ public class SolrSlf4jReporter extends SolrMetricReporter {
 @SuppressWarnings("unused") // we need this to pass validate-source-patterns
 private static final Logger log = LoggerFactory.getLogger(MethodHandles.lookup().lookupClass());

-private int period = 60;
 private String instancePrefix = null;
 private String logger = null;
 private List<String> filters = new ArrayList<>();
@@ -91,10 +90,6 @@ public class SolrSlf4jReporter extends SolrMetricReporter {
 this.logger = logger;
 }

-public void setPeriod(int period) {
-this.period = period;
-}
-
 @Override
 protected void doInit() {
 if (instancePrefix == null) {

View File

@@ -124,7 +124,6 @@ public class SolrClusterReporter extends SolrMetricReporter {
 }};

 private String handler = MetricsCollectorHandler.HANDLER_PATH;
-private int period = SolrMetricManager.DEFAULT_CLOUD_REPORTER_PERIOD;
 private List<SolrReporter.Report> reports = new ArrayList<>();

 private SolrReporter reporter;
@@ -143,10 +142,6 @@ public class SolrClusterReporter extends SolrMetricReporter {
 this.handler = handler;
 }

-public void setPeriod(int period) {
-this.period = period;
-}
-
 public void setReport(List<Map> reportConfig) {
 if (reportConfig == null || reportConfig.isEmpty()) {
 return;
@@ -169,11 +164,6 @@ public class SolrClusterReporter extends SolrMetricReporter {
 }
 }

-// for unit tests
-int getPeriod() {
-return period;
-}
-
 List<SolrReporter.Report> getReports() {
 return reports;
 }
@@ -187,7 +177,7 @@ public class SolrClusterReporter extends SolrMetricReporter {
 @Override
 protected void validate() throws IllegalStateException {
-// Nothing to validate
+// (period < 1) means "don't start reporter" and so no (period > 0) validation needed
 }

 @Override

View File

@@ -70,7 +70,6 @@ public class SolrShardReporter extends SolrMetricReporter {
 }};

 private String handler = MetricsCollectorHandler.HANDLER_PATH;
-private int period = SolrMetricManager.DEFAULT_CLOUD_REPORTER_PERIOD;
 private List<String> filters = new ArrayList<>();

 private SolrReporter reporter;
@@ -90,10 +89,6 @@ public class SolrShardReporter extends SolrMetricReporter {
 this.handler = handler;
 }

-public void setPeriod(int period) {
-this.period = period;
-}
-
 public void setFilter(List<String> filterConfig) {
 if (filterConfig == null || filterConfig.isEmpty()) {
 return;
@@ -107,11 +102,6 @@ public class SolrShardReporter extends SolrMetricReporter {
 }
 }

-// for unit tests
-int getPeriod() {
-return period;
-}
-
 @Override
 protected void doInit() {
 if (filters.isEmpty()) {
@@ -122,7 +112,7 @@ public class SolrShardReporter extends SolrMetricReporter {
 @Override
 protected void validate() throws IllegalStateException {
-// Nothing to validate
+// (period < 1) means "don't start reporter" and so no (period > 0) validation needed
 }

 @Override

View File

@@ -133,6 +133,10 @@ public class SolrDispatchFilter extends BaseSolrFilter {
 public static final String SOLRHOME_ATTRIBUTE = "solr.solr.home";

+public static final String SOLR_INSTALL_DIR_ATTRIBUTE = "solr.install.dir";
+
+public static final String SOLR_DEFAULT_CONFDIR_ATTRIBUTE = "solr.default.confdir";
+
 public static final String SOLR_LOG_MUTECONSOLE = "solr.log.muteconsole";
 public static final String SOLR_LOG_LEVEL = "solr.log.level";
@@ -223,7 +227,7 @@ public class SolrDispatchFilter extends BaseSolrFilter {
 private void logWelcomeBanner() {
 log.info(" ___ _ Welcome to Apache Solr™ version {}", solrVersion());
 log.info("/ __| ___| |_ _ Starting in {} mode on port {}", isCloudMode() ? "cloud" : "standalone", getSolrPort());
-log.info("\\__ \\/ _ \\ | '_| Install dir: {}", System.getProperty("solr.install.dir"));
+log.info("\\__ \\/ _ \\ | '_| Install dir: {}, Default config dir: {}", System.getProperty(SOLR_INSTALL_DIR_ATTRIBUTE), System.getProperty(SOLR_DEFAULT_CONFDIR_ATTRIBUTE));
 log.info("|___/\\___/_|_| Start time: {}", Instant.now().toString());
 }

View File

@ -993,7 +993,7 @@ public class SolrCLI {
} }
} // end ApiTool class } // end ApiTool class
private static final String DEFAULT_CONFIG_SET = "data_driven_schema_configs"; private static final String DEFAULT_CONFIG_SET = "_default";
private static final long MS_IN_MIN = 60 * 1000L; private static final long MS_IN_MIN = 60 * 1000L;
private static final long MS_IN_HOUR = MS_IN_MIN * 60L; private static final long MS_IN_HOUR = MS_IN_MIN * 60L;
@ -1503,17 +1503,23 @@ public class SolrCLI {
maxShardsPerNode = ((numShards*replicationFactor)+numNodes-1)/numNodes; maxShardsPerNode = ((numShards*replicationFactor)+numNodes-1)/numNodes;
} }
String confname = cli.getOptionValue("confname", collectionName); String confname = cli.getOptionValue("confname");
boolean configExistsInZk = String confdir = cli.getOptionValue("confdir");
String configsetsDir = cli.getOptionValue("configsetsDir");
boolean configExistsInZk = confname != null && !"".equals(confname.trim()) &&
cloudSolrClient.getZkStateReader().getZkClient().exists("/configs/" + confname, true); cloudSolrClient.getZkStateReader().getZkClient().exists("/configs/" + confname, true);
if (".system".equals(collectionName)) { if (".system".equals(collectionName)) {
//do nothing //do nothing
} else if (configExistsInZk) { } else if (configExistsInZk) {
echo("Re-using existing configuration directory "+confname); echo("Re-using existing configuration directory "+confname);
} else { } else if (confdir != null && !"".equals(confdir.trim())){
Path confPath = ZkConfigManager.getConfigsetPath(cli.getOptionValue("confdir", DEFAULT_CONFIG_SET), if (confname == null || "".equals(confname.trim())) {
cli.getOptionValue("configsetsDir")); confname = collectionName;
}
Path confPath = ZkConfigManager.getConfigsetPath(confdir,
configsetsDir);
echo("Uploading " + confPath.toAbsolutePath().toString() + echo("Uploading " + confPath.toAbsolutePath().toString() +
" for config " + confname + " to ZooKeeper at " + cloudSolrClient.getZkHost()); " for config " + confname + " to ZooKeeper at " + cloudSolrClient.getZkHost());
@@ -1531,13 +1537,15 @@ public class SolrCLI {
       // doesn't seem to exist ... try to create
       String createCollectionUrl =
           String.format(Locale.ROOT,
-              "%s/admin/collections?action=CREATE&name=%s&numShards=%d&replicationFactor=%d&maxShardsPerNode=%d&collection.configName=%s",
+              "%s/admin/collections?action=CREATE&name=%s&numShards=%d&replicationFactor=%d&maxShardsPerNode=%d",
               baseUrl,
               collectionName,
               numShards,
               replicationFactor,
-              maxShardsPerNode,
-              confname);
+              maxShardsPerNode);
+      if (confname != null && !"".equals(confname.trim())) {
+        createCollectionUrl = createCollectionUrl + String.format(Locale.ROOT, "&collection.configName=%s", confname);
+      }

       echo("\nCreating new collection '"+collectionName+"' using command:\n"+createCollectionUrl+"\n");
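The CREATE URL now only carries `collection.configName` when a configset was named, so an unspecified configset lets the server fall back to `_default` (SOLR-10272). A minimal, self-contained sketch of that conditional URL building (class and method names here are illustrative, not SolrCLI's):

```java
import java.util.Locale;

// Sketch of the optional collection.configName handling in the hunk above.
public class CreateUrlSketch {
  static String buildCreateUrl(String baseUrl, String collection, int numShards,
                               int replicationFactor, int maxShardsPerNode, String confname) {
    String url = String.format(Locale.ROOT,
        "%s/admin/collections?action=CREATE&name=%s&numShards=%d&replicationFactor=%d&maxShardsPerNode=%d",
        baseUrl, collection, numShards, replicationFactor, maxShardsPerNode);
    // Only pin a configset when the user supplied one; otherwise the server
    // chooses the _default configset.
    if (confname != null && !"".equals(confname.trim())) {
      url = url + String.format(Locale.ROOT, "&collection.configName=%s", confname);
    }
    return url;
  }

  public static void main(String[] args) {
    System.out.println(buildCreateUrl("http://localhost:8983/solr", "mycoll", 2, 2, 4, null));
    System.out.println(buildCreateUrl("http://localhost:8983/solr", "mycoll", 2, 2, 4, "conf1"));
  }
}
```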
@@ -2681,7 +2689,7 @@ public class SolrCLI {
     File exDir = setupExampleDir(serverDir, exampleDir, exampleName);
     String collectionName = "schemaless".equals(exampleName) ? "gettingstarted" : exampleName;
     String configSet =
-        "techproducts".equals(exampleName) ? "sample_techproducts_configs" : "data_driven_schema_configs";
+        "techproducts".equals(exampleName) ? "sample_techproducts_configs" : "_default";

     boolean isCloudMode = cli.hasOption('c');
     String zkHost = cli.getOptionValue('z');
@@ -3054,7 +3062,7 @@ public class SolrCLI {
     // yay! numNodes SolrCloud nodes running
     int numShards = 2;
     int replicationFactor = 2;
-    String cloudConfig = "data_driven_schema_configs";
+    String cloudConfig = "_default";
     String collectionName = "gettingstarted";
     File configsetsDir = new File(serverDir, "solr/configsets");
@@ -3089,7 +3097,7 @@ public class SolrCLI {
         "How many replicas per shard would you like to create? [2] ", "a replication factor", 2, 1, 4);

     echo("Please choose a configuration for the "+collectionName+" collection, available options are:");
-    String validConfigs = "basic_configs, data_driven_schema_configs, or sample_techproducts_configs ["+cloudConfig+"] ";
+    String validConfigs = "_default or sample_techproducts_configs ["+cloudConfig+"] ";
     cloudConfig = prompt(readInput, validConfigs, cloudConfig);

     // validate the cloudConfig name


@@ -0,0 +1,37 @@
+<?xml version="1.0" encoding="UTF-8" ?>
+<!--
+ Licensed to the Apache Software Foundation (ASF) under one or more
+ contributor license agreements.  See the NOTICE file distributed with
+ this work for additional information regarding copyright ownership.
+ The ASF licenses this file to You under the Apache License, Version 2.0
+ (the "License"); you may not use this file except in compliance with
+ the License.  You may obtain a copy of the License at
+
+     http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing, software
+ distributed under the License is distributed on an "AS IS" BASIS,
+ WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ See the License for the specific language governing permissions and
+ limitations under the License.
+-->
+<schema name="default-config" version="1.6">
+  <field name="id" type="string" indexed="true" stored="true" required="true" multiValued="false" />
+  <field name="_version_" type="long" indexed="false" stored="false"/>
+  <field name="_root_" type="string" indexed="true" stored="false" docValues="false" />
+  <field name="_text_" type="text_general" indexed="true" stored="false" multiValued="true"/>
+
+  <fieldType name="string" class="solr.StrField" sortMissingLast="true" docValues="true" />
+  <fieldType name="long" class="solr.TrieLongField" docValues="true" precisionStep="0" positionIncrementGap="0"/>
+  <fieldType name="text_general" class="solr.TextField" positionIncrementGap="100" multiValued="true">
+    <analyzer type="index">
+      <tokenizer class="solr.StandardTokenizerFactory"/>
+      <filter class="solr.LowerCaseFilterFactory"/>
+    </analyzer>
+    <analyzer type="query">
+      <tokenizer class="solr.StandardTokenizerFactory"/>
+      <filter class="solr.LowerCaseFilterFactory"/>
+    </analyzer>
+  </fieldType>
+</schema>


@ -1,4 +1,5 @@
<?xml version="1.0" encoding="UTF-8" ?> <?xml version="1.0" ?>
<!-- <!--
Licensed to the Apache Software Foundation (ASF) under one or more Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with contributor license agreements. See the NOTICE file distributed with
@ -16,27 +17,16 @@
limitations under the License. limitations under the License.
--> -->
<!-- If this file is found in the config directory, it will only be <config>
loaded once at startup. If it is found in Solr's data
directory, it will be re-loaded every commit.
See http://wiki.apache.org/solr/QueryElevationComponent for more info <directoryFactory name="DirectoryFactory" class="${solr.directoryFactory:solr.RAMDirectoryFactory}"/>
--> <luceneMatchVersion>${tests.luceneMatchVersion:LATEST}</luceneMatchVersion>
<elevate>
<!-- Query elevation examples
<query text="foo bar">
<doc id="1" />
<doc id="2" />
<doc id="3" />
</query>
for use with techproducts example <requestHandler name="my_error_handler" class="solr.ThrowErrorOnInitRequestHandler">
<str name="error">This is the _default configset, which is designed to throw error upon collection creation.</str>
</requestHandler>
<query text="ipod"> <schemaFactory class="ClassicIndexSchemaFactory"/>
<doc id="MA147LL/A" /> put the actual ipod at the top
<doc id="IW-02" exclude="true" /> exclude this cable
</query>
-->
</elevate> </config>


@@ -432,7 +432,7 @@ public class BaseCdcrDistributedZkTest extends AbstractDistribZkTestBase {
         REPLICATION_FACTOR, replicationFactor,
         CREATE_NODE_SET, createNodeSetStr,
         MAX_SHARDS_PER_NODE, maxShardsPerNode),
-        client, null);
+        client, "conf1");
   }

   private CollectionAdminResponse createCollection(Map<String, List<Integer>> collectionInfos, String collectionName,
@@ -588,7 +588,7 @@ public class BaseCdcrDistributedZkTest extends AbstractDistribZkTestBase {
     try (SolrClient client = createCloudClient(temporaryCollection)) {
       assertEquals(0, CollectionAdminRequest
-          .createCollection(temporaryCollection, shardCount, 1)
+          .createCollection(temporaryCollection, "conf1", shardCount, 1)
           .setCreateNodeSet("")
           .process(client).getStatus());
       for (int i = 0; i < jettys.size(); i++) {


@@ -157,7 +157,7 @@ public class BasicDistributedZk2Test extends AbstractFullDistribZkTestBase {
   private void testNodeWithoutCollectionForwarding() throws Exception {
     assertEquals(0, CollectionAdminRequest
-        .createCollection(ONE_NODE_COLLECTION, 1, 1)
+        .createCollection(ONE_NODE_COLLECTION, "conf1", 1, 1)
         .setCreateNodeSet("")
         .process(cloudClient).getStatus());
     assertTrue(CollectionAdminRequest


@@ -576,7 +576,7 @@ public class BasicDistributedZkTest extends AbstractFullDistribZkTestBase {
   protected void createCores(final HttpSolrClient client,
       ThreadPoolExecutor executor, final String collection, final int numShards, int cnt) {
     try {
-      assertEquals(0, CollectionAdminRequest.createCollection(collection, numShards, 1)
+      assertEquals(0, CollectionAdminRequest.createCollection(collection, "conf1", numShards, 1)
          .setCreateNodeSet("")
          .process(client).getStatus());
    } catch (SolrServerException | IOException e) {
@@ -614,8 +614,9 @@ public class BasicDistributedZkTest extends AbstractFullDistribZkTestBase {
     return url2;
   }

+  @Override
   protected CollectionAdminResponse createCollection(Map<String, List<Integer>> collectionInfos,
-      String collectionName, int numShards, int numReplicas, int maxShardsPerNode, SolrClient client, String createNodeSetStr) throws SolrServerException, IOException {
+      String collectionName, String configSetName, int numShards, int numReplicas, int maxShardsPerNode, SolrClient client, String createNodeSetStr) throws SolrServerException, IOException {
     // TODO: Use CollectionAdminRequest for this test
     ModifiableSolrParams params = new ModifiableSolrParams();
     params.set("action", CollectionAction.CREATE.toString());
@@ -633,6 +634,7 @@ public class BasicDistributedZkTest extends AbstractFullDistribZkTestBase {
       collectionInfos.put(collectionName, list);
     }
     params.set("name", collectionName);
+    params.set("collection.configName", configSetName);
     SolrRequest request = new QueryRequest(params);
     request.setPath("/admin/collections");
@@ -795,7 +797,7 @@ public class BasicDistributedZkTest extends AbstractFullDistribZkTestBase {
   private void testANewCollectionInOneInstanceWithManualShardAssignement() throws Exception {
     log.info("### STARTING testANewCollectionInOneInstanceWithManualShardAssignement");
-    assertEquals(0, CollectionAdminRequest.createCollection(oneInstanceCollection2, 2, 2)
+    assertEquals(0, CollectionAdminRequest.createCollection(oneInstanceCollection2, "conf1", 2, 2)
       .setCreateNodeSet("")
       .setMaxShardsPerNode(4)
       .process(cloudClient).getStatus());
@@ -921,7 +923,7 @@ public class BasicDistributedZkTest extends AbstractFullDistribZkTestBase {
   private void testANewCollectionInOneInstance() throws Exception {
     log.info("### STARTING testANewCollectionInOneInstance");
-    CollectionAdminResponse response = CollectionAdminRequest.createCollection(oneInstanceCollection, 2, 2)
+    CollectionAdminResponse response = CollectionAdminRequest.createCollection(oneInstanceCollection, "conf1", 2, 2)
       .setCreateNodeSet(jettys.get(0).getNodeName())
       .setMaxShardsPerNode(4)
       .process(cloudClient);
@@ -1087,7 +1089,7 @@ public class BasicDistributedZkTest extends AbstractFullDistribZkTestBase {
   private void createNewCollection(final String collection) throws InterruptedException {
     try {
       assertEquals(0, CollectionAdminRequest
-          .createCollection(collection, 2, 1)
+          .createCollection(collection, "conf1", 2, 1)
          .setCreateNodeSet("")
          .process(cloudClient).getStatus());
     } catch (Exception e) {


@@ -61,7 +61,7 @@ public class ClusterStateUpdateTest extends SolrCloudTestCase {
   public void testCoreRegistration() throws Exception {
     System.setProperty("solrcloud.update.delay", "1");

-    assertEquals(0, CollectionAdminRequest.createCollection("testcore", 1,1)
+    assertEquals(0, CollectionAdminRequest.createCollection("testcore", "conf", 1, 1)
         .setCreateNodeSet(cluster.getJettySolrRunner(0).getNodeName())
         .process(cluster.getSolrClient()).getStatus());


@@ -55,6 +55,25 @@ public class CollectionsAPISolrJTest extends SolrCloudTestCase {
         .configure();
   }

+  /**
+   * When a config name is not specified during collection creation, the _default should
+   * be used.
+   */
+  @Test
+  public void testCreateWithDefaultConfigSet() throws Exception {
+    String collectionName = "solrj_default_configset";
+    CollectionAdminResponse response = CollectionAdminRequest.createCollection(collectionName, 2, 2) // no configset specified
+        .process(cluster.getSolrClient());
+    // The _default configset (for the tests) is designed to error out upon collection creation,
+    // so we just ensure that the correct error message was obtained.
+    assertFalse(response.isSuccess());
+    System.out.println("Errors are: "+response.getErrorMessages());
+    assertTrue(response.getErrorMessages() != null && response.getErrorMessages().size() > 0);
+    assertTrue(response.getErrorMessages().getVal(0).contains("This is the _default configset, which is designed"
+        + " to throw error upon collection creation."));
+  }
+
   @Test
   public void testCreateAndDeleteCollection() throws Exception {
     String collectionName = "solrj_test";


@@ -80,7 +80,7 @@ public class ForceLeaderTest extends HttpPartitionTest {
     handle.put("timestamp", SKIPVAL);

     String testCollectionName = "forceleader_test_collection";
-    createCollection(testCollectionName, 1, 3, 1);
+    createCollection(testCollectionName, "conf1", 1, 3, 1);
     cloudClient.setDefaultCollection(testCollectionName);

     try {
@@ -166,7 +166,7 @@ public class ForceLeaderTest extends HttpPartitionTest {
     handle.put("timestamp", SKIPVAL);

     String testCollectionName = "forceleader_last_published";
-    createCollection(testCollectionName, 1, 3, 1);
+    createCollection(testCollectionName, "conf1", 1, 3, 1);
     cloudClient.setDefaultCollection(testCollectionName);
     log.info("Collection created: " + testCollectionName);


@@ -152,7 +152,7 @@ public class HttpPartitionTest extends AbstractFullDistribZkTestBase {
   protected void testLeaderInitiatedRecoveryCRUD() throws Exception {
     String testCollectionName = "c8n_crud_1x2";
     String shardId = "shard1";
-    createCollectionRetry(testCollectionName, 1, 2, 1);
+    createCollectionRetry(testCollectionName, "conf1", 1, 2, 1);
     cloudClient.setDefaultCollection(testCollectionName);

     Replica leader = cloudClient.getZkStateReader().getLeaderRetry(testCollectionName, shardId);
@@ -204,7 +204,7 @@ public class HttpPartitionTest extends AbstractFullDistribZkTestBase {
   protected void testMinRf() throws Exception {
     // create a collection that has 1 shard and 3 replicas
     String testCollectionName = "collMinRf_1x3";
-    createCollection(testCollectionName, 1, 3, 1);
+    createCollection(testCollectionName, "conf1", 1, 3, 1);
     cloudClient.setDefaultCollection(testCollectionName);

     sendDoc(1, 2);
@@ -290,7 +290,7 @@ public class HttpPartitionTest extends AbstractFullDistribZkTestBase {
   protected void testRf2() throws Exception {
     // create a collection that has 1 shard but 2 replicas
     String testCollectionName = "c8n_1x2";
-    createCollectionRetry(testCollectionName, 1, 2, 1);
+    createCollectionRetry(testCollectionName, "conf1", 1, 2, 1);
     cloudClient.setDefaultCollection(testCollectionName);

     sendDoc(1);
@@ -386,7 +386,7 @@ public class HttpPartitionTest extends AbstractFullDistribZkTestBase {
   protected void testRf3() throws Exception {
     // create a collection that has 1 shard but 2 replicas
     String testCollectionName = "c8n_1x3";
-    createCollectionRetry(testCollectionName, 1, 3, 1);
+    createCollectionRetry(testCollectionName, "conf1", 1, 3, 1);
     cloudClient.setDefaultCollection(testCollectionName);
@@ -437,7 +437,7 @@ public class HttpPartitionTest extends AbstractFullDistribZkTestBase {
   protected void testLeaderZkSessionLoss() throws Exception {
     String testCollectionName = "c8n_1x2_leader_session_loss";
-    createCollectionRetry(testCollectionName, 1, 2, 1);
+    createCollectionRetry(testCollectionName, "conf1", 1, 2, 1);
     cloudClient.setDefaultCollection(testCollectionName);

     sendDoc(1);


@@ -61,7 +61,7 @@ public class LeaderFailoverAfterPartitionTest extends HttpPartitionTest {
     // kill the leader ... see what happens

     // create a collection that has 1 shard but 3 replicas
     String testCollectionName = "c8n_1x3_lf"; // _lf is leader fails
-    createCollection(testCollectionName, 1, 3, 1);
+    createCollection(testCollectionName, "conf1", 1, 3, 1);
     cloudClient.setDefaultCollection(testCollectionName);

     sendDoc(1);


@@ -62,7 +62,7 @@ public class LeaderInitiatedRecoveryOnCommitTest extends BasicDistributedZkTest {
     // create a collection that has 2 shard and 2 replicas
     String testCollectionName = "c8n_2x2_commits";
-    createCollection(testCollectionName, 2, 2, 1);
+    createCollection(testCollectionName, "conf1", 2, 2, 1);
     cloudClient.setDefaultCollection(testCollectionName);

     List<Replica> notLeaders =
@@ -105,7 +105,7 @@ public class LeaderInitiatedRecoveryOnCommitTest extends BasicDistributedZkTest {
     // create a collection that has 1 shard and 3 replicas
     String testCollectionName = "c8n_1x3_commits";
-    createCollection(testCollectionName, 1, 3, 1);
+    createCollection(testCollectionName, "conf1", 1, 3, 1);
     cloudClient.setDefaultCollection(testCollectionName);

     List<Replica> notLeaders =


@@ -86,7 +86,7 @@ public class LeaderInitiatedRecoveryOnShardRestartTest extends AbstractFullDistr
     String testCollectionName = "all_in_lir";
     String shardId = "shard1";
-    createCollection(testCollectionName, 1, 3, 1);
+    createCollection(testCollectionName, "conf1", 1, 3, 1);

     waitForRecoveriesToFinish(testCollectionName, false);


@@ -98,12 +98,12 @@ public class ReplicationFactorTest extends AbstractFullDistribZkTestBase {
     String shardId = "shard1";
     int minRf = 2;

-    CollectionAdminResponse resp = createCollection(testCollectionName, numShards, replicationFactor, maxShardsPerNode);
+    CollectionAdminResponse resp = createCollection(testCollectionName, "conf1", numShards, replicationFactor, maxShardsPerNode);
     if (resp.getResponse().get("failure") != null) {
       CollectionAdminRequest.deleteCollection(testCollectionName).process(cloudClient);

-      resp = createCollection(testCollectionName, numShards, replicationFactor, maxShardsPerNode);
+      resp = createCollection(testCollectionName, "conf1", numShards, replicationFactor, maxShardsPerNode);

       if (resp.getResponse().get("failure") != null) {
         fail("Could not create " + testCollectionName);
@@ -184,7 +184,7 @@ public class ReplicationFactorTest extends AbstractFullDistribZkTestBase {
     String shardId = "shard1";
     int minRf = 2;

-    createCollection(testCollectionName, numShards, replicationFactor, maxShardsPerNode);
+    createCollection(testCollectionName, "conf1", numShards, replicationFactor, maxShardsPerNode);
     cloudClient.setDefaultCollection(testCollectionName);

     List<Replica> replicas =


@@ -63,7 +63,7 @@ public class ShardRoutingCustomTest extends AbstractFullDistribZkTestBase {
       setupJettySolrHome(jettyDir);
       JettySolrRunner j = createJetty(jettyDir, createTempDir().toFile().getAbsolutePath(), "shardA", "solrconfig.xml", null);
       assertEquals(0, CollectionAdminRequest
-          .createCollection(DEFAULT_COLLECTION, 1, 1)
+          .createCollection(DEFAULT_COLLECTION, "conf1", 1, 1)
          .setStateFormat(Integer.parseInt(getStateFormat()))
          .setCreateNodeSet("")
          .process(cloudClient).getStatus());


@@ -523,7 +523,7 @@ public class ShardSplitTest extends BasicDistributedZkTest {
     log.info("Starting testSplitShardWithRule");
     String collectionName = "shardSplitWithRule";
-    CollectionAdminRequest.Create createRequest = CollectionAdminRequest.createCollection(collectionName,1,2)
+    CollectionAdminRequest.Create createRequest = CollectionAdminRequest.createCollection(collectionName, "conf1", 1, 2)
         .setRule("shard:*,replica:<2,node:*");
     CollectionAdminResponse response = createRequest.process(cloudClient);
     assertEquals(0, response.getStatus());


@@ -55,7 +55,8 @@ import static org.apache.solr.common.util.Utils.getObjectByPath;
 /**
  * Emulates bin/solr -e cloud -noprompt; bin/post -c gettingstarted example/exampledocs/*.xml;
  * this test is useful for catching regressions in indexing the example docs in collections that
- * use data-driven schema and managed schema features provided by configsets/data_driven_schema_configs.
+ * use data driven functionality and managed schema features of the default configset
+ * (configsets/_default).
  */
 public class SolrCloudExampleTest extends AbstractFullDistribZkTestBase {
@@ -73,8 +74,8 @@ public class SolrCloudExampleTest extends AbstractFullDistribZkTestBase {
     log.info("testLoadDocsIntoGettingStartedCollection initialized OK ... running test logic");

     String testCollectionName = "gettingstarted";
-    File data_driven_schema_configs = new File(ExternalPaths.SCHEMALESS_CONFIGSET);
-    assertTrue(data_driven_schema_configs.getAbsolutePath()+" not found!", data_driven_schema_configs.isDirectory());
+    File defaultConfigs = new File(ExternalPaths.DEFAULT_CONFIGSET);
+    assertTrue(defaultConfigs.getAbsolutePath()+" not found!", defaultConfigs.isDirectory());

     Set<String> liveNodes = cloudClient.getZkStateReader().getClusterState().getLiveNodes();
     if (liveNodes.isEmpty())
@@ -88,8 +89,8 @@ public class SolrCloudExampleTest extends AbstractFullDistribZkTestBase {
         "-shards", "2",
         "-replicationFactor", "2",
         "-confname", testCollectionName,
-        "-confdir", "data_driven_schema_configs",
-        "-configsetsDir", data_driven_schema_configs.getParentFile().getParentFile().getAbsolutePath(),
+        "-confdir", "_default",
+        "-configsetsDir", defaultConfigs.getParentFile().getParentFile().getAbsolutePath(),
         "-solrUrl", solrUrl
     };


@@ -695,7 +695,7 @@ public class TestConfigSetsAPI extends SolrTestCaseJ4 {
       ConfigSetAdminRequest.List list = new ConfigSetAdminRequest.List();
       ConfigSetAdminResponse.List response = list.process(solrClient);
       Collection<String> actualConfigSets = response.getConfigSets();
-      assertEquals(0, actualConfigSets.size());
+      assertEquals(1, actualConfigSets.size()); // only the _default configset

       // test multiple
       Set<String> configSets = new HashSet<String>();
@@ -706,8 +706,8 @@ public class TestConfigSetsAPI extends SolrTestCaseJ4 {
       }
       response = list.process(solrClient);
       actualConfigSets = response.getConfigSets();
-      assertEquals(configSets.size(), actualConfigSets.size());
-      assertTrue(configSets.containsAll(actualConfigSets));
+      assertEquals(configSets.size() + 1, actualConfigSets.size());
+      assertTrue(actualConfigSets.containsAll(configSets));
     } finally {
       zkClient.close();
     }
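The flipped containment check above matters because the listed configsets are now a strict superset of the uploaded ones: `_default` is always present even though it was never uploaded. A small standalone sketch of that relationship (class and helper names are illustrative):

```java
import java.util.HashSet;
import java.util.Set;

// Illustrates why the test asserts actual ⊇ uploaded rather than uploaded ⊇ actual:
// the server's list always includes the implicit _default configset.
public class ConfigSetListSketch {
  static Set<String> listedConfigSets(Set<String> uploaded) {
    Set<String> listed = new HashSet<>(uploaded);
    listed.add("_default"); // present even though never uploaded
    return listed;
  }

  public static void main(String[] args) {
    Set<String> uploaded = new HashSet<>();
    uploaded.add("configSet1");
    uploaded.add("configSet2");
    Set<String> listed = listedConfigSets(uploaded);
    System.out.println(listed.containsAll(uploaded)); // true: uploaded ⊆ listed
    System.out.println(uploaded.containsAll(listed)); // false: _default is extra
  }
}
```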


@@ -63,7 +63,7 @@ public class TestOnReconnectListenerSupport extends AbstractFullDistribZkTestBas
     String testCollectionName = "c8n_onreconnect_1x1";
     String shardId = "shard1";
-    createCollectionRetry(testCollectionName, 1, 1, 1);
+    createCollectionRetry(testCollectionName, "conf1", 1, 1, 1);
     cloudClient.setDefaultCollection(testCollectionName);

     Replica leader = getShardLeader(testCollectionName, shardId, 30 /* timeout secs */);


@@ -130,9 +130,9 @@ public class TestPullReplica extends SolrCloudTestCase {
         break;
       case 1:
         // Sometimes use v1 API
-        String url = String.format(Locale.ROOT, "%s/admin/collections?action=CREATE&name=%s&numShards=%s&pullReplicas=%s&maxShardsPerNode=%s",
+        String url = String.format(Locale.ROOT, "%s/admin/collections?action=CREATE&name=%s&collection.configName=%s&numShards=%s&pullReplicas=%s&maxShardsPerNode=%s",
             cluster.getRandomJetty(random()).getBaseUrl(),
-            collectionName,
+            collectionName, "conf",
             2,    // numShards
             3,    // pullReplicas
             100); // maxShardsPerNode
@@ -143,8 +143,8 @@ public class TestPullReplica extends SolrCloudTestCase {
       case 2:
         // Sometimes use V2 API
         url = cluster.getRandomJetty(random()).getBaseUrl().toString() + "/____v2/c";
-        String requestBody = String.format(Locale.ROOT, "{create:{name:%s, numShards:%s, pullReplicas:%s, maxShardsPerNode:%s %s}}",
-            collectionName,
+        String requestBody = String.format(Locale.ROOT, "{create:{name:%s, config:%s, numShards:%s, pullReplicas:%s, maxShardsPerNode:%s %s}}",
+            collectionName, "conf",
             2,    // numShards
             3,    // pullReplicas
             100,  // maxShardsPerNode


@@ -73,11 +73,11 @@ public class TestRandomRequestDistribution extends AbstractFullDistribZkTestBase
    */
   private void testRequestTracking() throws Exception {

-    CollectionAdminRequest.createCollection("a1x2",1,2)
+    CollectionAdminRequest.createCollection("a1x2", "conf1", 1, 2)
         .setCreateNodeSet(nodeNames.get(0) + ',' + nodeNames.get(1))
         .process(cloudClient);

-    CollectionAdminRequest.createCollection("b1x1",1,1)
+    CollectionAdminRequest.createCollection("b1x1", "conf1", 1, 1)
         .setCreateNodeSet(nodeNames.get(2))
         .process(cloudClient);
@@ -128,7 +128,7 @@ public class TestRandomRequestDistribution extends AbstractFullDistribZkTestBase
   private void testQueryAgainstDownReplica() throws Exception {

     log.info("Creating collection 'football' with 1 shard and 2 replicas");
-    CollectionAdminRequest.createCollection("football",1,2)
+    CollectionAdminRequest.createCollection("football", "conf1", 1, 2)
         .setCreateNodeSet(nodeNames.get(0) + ',' + nodeNames.get(1))
         .process(cloudClient);


@@ -174,7 +174,7 @@ public class TestSolrCloudWithKerberosAlt extends LuceneTestCase {
       String configName = "solrCloudCollectionConfig";
       miniCluster.uploadConfigSet(SolrTestCaseJ4.TEST_PATH().resolve("collection1/conf"), configName);

-      CollectionAdminRequest.Create createRequest = CollectionAdminRequest.createCollection(collectionName,NUM_SHARDS,REPLICATION_FACTOR);
+      CollectionAdminRequest.Create createRequest = CollectionAdminRequest.createCollection(collectionName, configName, NUM_SHARDS,REPLICATION_FACTOR);
       Properties properties = new Properties();
       properties.put(CoreDescriptor.CORE_CONFIG, "solrconfig-tlog.xml");
       properties.put("solr.tests.maxBufferedDocs", "100000");


@@ -157,9 +157,9 @@ public class TestTlogReplica extends SolrCloudTestCase {
         break;
       case 1:
         // Sometimes don't use SolrJ
-        String url = String.format(Locale.ROOT, "%s/admin/collections?action=CREATE&name=%s&numShards=%s&tlogReplicas=%s&maxShardsPerNode=%s",
+        String url = String.format(Locale.ROOT, "%s/admin/collections?action=CREATE&name=%s&collection.configName=%s&numShards=%s&tlogReplicas=%s&maxShardsPerNode=%s",
             cluster.getRandomJetty(random()).getBaseUrl(),
-            collectionName,
+            collectionName, "conf",
             2,    // numShards
             4,    // tlogReplicas
             100); // maxShardsPerNode
@@ -170,8 +170,8 @@ public class TestTlogReplica extends SolrCloudTestCase {
       case 2:
         // Sometimes use V2 API
         url = cluster.getRandomJetty(random()).getBaseUrl().toString() + "/____v2/c";
-        String requestBody = String.format(Locale.ROOT, "{create:{name:%s, numShards:%s, tlogReplicas:%s, maxShardsPerNode:%s}}",
-            collectionName,
+        String requestBody = String.format(Locale.ROOT, "{create:{name:%s, config:%s, numShards:%s, tlogReplicas:%s, maxShardsPerNode:%s}}",
+            collectionName, "conf",
             2,    // numShards
             4,    // tlogReplicas
             100); // maxShardsPerNode


@@ -109,7 +109,7 @@ public class UnloadDistributedZkTest extends BasicDistributedZkTest {
     final String coreName1 = collection+"_1";
     final String coreName2 = collection+"_2";
-    assertEquals(0, CollectionAdminRequest.createCollection(collection, numShards, 1)
+    assertEquals(0, CollectionAdminRequest.createCollection(collection, "conf1", numShards, 1)
         .setCreateNodeSet("")
         .process(cloudClient).getStatus());
     assertTrue(CollectionAdminRequest.addReplicaToShard(collection, "shard1")
@@ -168,7 +168,7 @@ public class UnloadDistributedZkTest extends BasicDistributedZkTest {
     JettySolrRunner jetty1 = jettys.get(0);
     assertEquals(0, CollectionAdminRequest
-        .createCollection("unloadcollection", 1,1)
+        .createCollection("unloadcollection", "conf1", 1,1)
         .setCreateNodeSet(jetty1.getNodeName())
         .process(cloudClient).getStatus());
     ZkStateReader zkStateReader = getCommonCloudSolrClient().getZkStateReader();


@@ -26,7 +26,6 @@ import java.util.Map;
 import org.apache.solr.client.solrj.SolrClient;
 import org.apache.solr.client.solrj.SolrRequest;
 import org.apache.solr.client.solrj.SolrResponse;
-import org.apache.solr.client.solrj.SolrServerException;
 import org.apache.solr.client.solrj.impl.CloudSolrClient;
 import org.apache.solr.client.solrj.impl.HttpSolrClient;
 import org.apache.solr.client.solrj.request.CollectionAdminRequest;
@@ -52,10 +51,11 @@ import static org.apache.solr.common.util.Utils.getObjectByPath;
  * Test for AutoScalingHandler
  */
 public class AutoScalingHandlerTest extends SolrCloudTestCase {
+  final static String CONFIGSET_NAME = "conf";

   @BeforeClass
   public static void setupCluster() throws Exception {
     configureCluster(2)
-        .addConfig("conf", configset("cloud-minimal"))
+        .addConfig(CONFIGSET_NAME, configset("cloud-minimal"))
         .configure();
   }
@@ -455,20 +455,9 @@ public class AutoScalingHandlerTest extends SolrCloudTestCase {
     try {
       solrClient.request(req);
       fail("Adding a policy with 'cores' attribute should not have succeeded.");
-    } catch (SolrServerException e) {
-      // todo one of these catch blocks should not be needed after SOLR-10768
-      if (e.getRootCause() instanceof HttpSolrClient.RemoteSolrException) {
-        HttpSolrClient.RemoteSolrException rootCause = (HttpSolrClient.RemoteSolrException) e.getRootCause();
-        // expected
-        assertTrue(rootCause.getMessage().contains("cores is only allowed in 'cluster-policy'"));
-      } else {
-        throw e;
-      }
     } catch (HttpSolrClient.RemoteSolrException e) {
       // expected
       assertTrue(e.getMessage().contains("cores is only allowed in 'cluster-policy'"));
-    } catch (Exception e) {
-      throw e;
     }

     setPolicyCommand = "{'set-policy': {" +
@@ -670,7 +659,7 @@ public class AutoScalingHandlerTest extends SolrCloudTestCase {
     assertEquals(0, violations.size());

     // lets create a collection which violates the rule replicas < 2
-    CollectionAdminRequest.Create create = CollectionAdminRequest.Create.createCollection("readApiTestViolations", 1, 6);
+    CollectionAdminRequest.Create create = CollectionAdminRequest.Create.createCollection("readApiTestViolations", CONFIGSET_NAME, 1, 6);
     create.setMaxShardsPerNode(10);
     CollectionAdminResponse adminResponse = create.process(solrClient);
     assertTrue(adminResponse.isSuccess());


@@ -73,7 +73,7 @@ public class TestPolicyCloud extends SolrCloudTestCase {
     cluster.getSolrClient().request(AutoScalingHandlerTest.createAutoScalingRequest(SolrRequest.METHOD.POST, commands));

     String collectionName = "testCreateCollectionAddReplica";
-    CollectionAdminRequest.createCollection(collectionName, 1, 1)
+    CollectionAdminRequest.createCollection(collectionName, "conf", 1, 1)
         .setPolicy("c1")
         .process(cluster.getSolrClient());
@@ -102,7 +102,7 @@ public class TestPolicyCloud extends SolrCloudTestCase {
     assertEquals("success", response.get("result"));

     String collectionName = "testCreateCollectionSplitShard";
-    CollectionAdminRequest.createCollection(collectionName, 1, 2)
+    CollectionAdminRequest.createCollection(collectionName, "conf", 1, 2)
         .setPolicy("c1")
         .setMaxShardsPerNode(10)
         .process(cluster.getSolrClient());
@@ -140,7 +140,7 @@ public class TestPolicyCloud extends SolrCloudTestCase {
     Map<String, Object> json = Utils.getJson(cluster.getZkClient(), ZkStateReader.SOLR_AUTOSCALING_CONF_PATH, true);
     assertEquals("full json:"+ Utils.toJSONString(json) , "#EACH",
         Utils.getObjectByPath(json, true, "/policies/c1[0]/shard"));
-    CollectionAdminRequest.createCollectionWithImplicitRouter("policiesTest", null, "s1,s2", 1)
+    CollectionAdminRequest.createCollectionWithImplicitRouter("policiesTest", "conf", "s1,s2", 1)
         .setPolicy("c1")
         .process(cluster.getSolrClient());


@@ -66,7 +66,7 @@ public class HdfsNNFailoverTest extends BasicDistributedZkTest {
   @Test
   public void test() throws Exception {
-    createCollection(COLLECTION, 1, 1, 1);
+    createCollection(COLLECTION, "conf1", 1, 1, 1);

     waitForRecoveriesToFinish(COLLECTION, false);


@@ -95,7 +95,7 @@ public class HdfsWriteToMultipleCollectionsTest extends BasicDistributedZkTest {
     int docCount = random().nextInt(1313) + 1;
     int cnt = random().nextInt(4) + 1;
     for (int i = 0; i < cnt; i++) {
-      createCollection(ACOLLECTION + i, 2, 2, 9);
+      createCollection(ACOLLECTION + i, "conf1", 2, 2, 9);
     }
     for (int i = 0; i < cnt; i++) {
       waitForRecoveriesToFinish(ACOLLECTION + i, false);


@@ -107,7 +107,7 @@ public class StressHdfsTest extends BasicDistributedZkTest {
     Timer timer = new Timer();
     try {
-      createCollection(DELETE_DATA_DIR_COLLECTION, 1, 1, 1);
+      createCollection(DELETE_DATA_DIR_COLLECTION, "conf1", 1, 1, 1);

       waitForRecoveriesToFinish(DELETE_DATA_DIR_COLLECTION, false);
@@ -154,7 +154,7 @@ public class StressHdfsTest extends BasicDistributedZkTest {
       if (nShards == 0) nShards = 1;
     }

-    createCollection(DELETE_DATA_DIR_COLLECTION, nShards, rep, maxReplicasPerNode);
+    createCollection(DELETE_DATA_DIR_COLLECTION, "conf1", nShards, rep, maxReplicasPerNode);

     waitForRecoveriesToFinish(DELETE_DATA_DIR_COLLECTION, false);


@@ -42,6 +42,10 @@ public class ThrowErrorOnInitRequestHandler extends RequestHandlerBase
   @Override
   public void init(NamedList args) {
+    String errorMessage = (String) args.get("error");
+    if (errorMessage != null) {
+      throw new Error(errorMessage);
+    }
     throw new Error("Doing my job, throwing a java.lang.Error");
   }
 }


@@ -65,7 +65,7 @@ public class TestTrackingShardHandlerFactory extends AbstractFullDistribZkTestBa
       assertSame(trackingQueue, trackingShardHandlerFactory.getTrackingQueue());
     }

-    createCollection(collectionName, 2, 1, 1);
+    createCollection(collectionName, "conf1", 2, 1, 1);
     waitForRecoveriesToFinish(collectionName, true);


@@ -249,6 +249,11 @@ public class SolrMetricManagerTest extends SolrTestCaseJ4 {
   }

+  @Test
+  public void testDefaultCloudReporterPeriodUnchanged() throws Exception {
+    assertEquals(60, SolrMetricManager.DEFAULT_CLOUD_REPORTER_PERIOD);
+  }
+
   private PluginInfo createPluginInfo(String name, String group, String registry) {
     Map<String,String> attrs = new HashMap<>();
     attrs.put("name", name);


@@ -53,6 +53,9 @@ public class MockMetricReporter extends SolrMetricReporter {
     if (configurable == null) {
       throw new IllegalStateException("MockMetricReporter::configurable not defined.");
     }
+    if (period < 1) {
+      throw new IllegalStateException("Init argument 'period' is in time unit 'seconds' and must be at least 1.");
+    }
   }

   public void setConfigurable(String configurable) {


@@ -73,7 +73,7 @@ public class TestDefaultStatsCache extends BaseDistributedSearchTestCase {
     commit();

     if (aDocId != null) {
-      dfQuery("q", "id:"+aDocId,"debugQuery", "true", "fl", "*,score");
+      dfQuery("q", "{!cache=false}id:"+aDocId,"debugQuery", "true", "fl", "*,score");
     }
     dfQuery("q", "a_t:one a_t:four", "debugQuery", "true", "fl", "*,score");
   }
} }


@@ -418,7 +418,7 @@ public class TestSolrCLIRunExample extends SolrTestCaseJ4 {
     // sthis test only support launching one SolrCloud node due to how MiniSolrCloudCluster works
     // and the need for setting the host and port system properties ...
-    String userInput = "1\n"+bindPort+"\n"+collectionName+"\n2\n2\ndata_driven_schema_configs\n";
+    String userInput = "1\n"+bindPort+"\n"+collectionName+"\n2\n2\n_default\n";

     // simulate user input from stdin
     InputStream userInputSim = new ByteArrayInputStream(userInput.getBytes(StandardCharsets.UTF_8));


@@ -62,12 +62,11 @@ server/solr/configsets
   Directories containing different configuration options for running Solr.

-  basic_configs : Bare minimum configuration settings needed to run Solr.
-
-  data_driven_schema_configs : Field-guessing and managed schema mode; use this configuration if you want
-    to start indexing data in Solr without having to design a schema upfront.
-    You can use the REST API to manage your schema as you refine your index
-    requirements.
+  _default : Bare minimum configurations with field-guessing and managed schema turned
+    on by default, so as to start indexing data in Solr without having to design
+    a schema upfront. You can use the REST API to manage your schema as you refine your index
+    requirements. You can turn off the field guessing (for a collection, say mycollection) by:
+    curl http://host:8983/solr/mycollection/config -d '{"set-user-property": {"update.autoCreateFields":"false"}}'

   sample_techproducts_configs : Comprehensive example configuration that demonstrates many of the powerful
     features of Solr, based on the use case of building a search solution for

Some files were not shown because too many files have changed in this diff.