[TEST] Lower ML model memory limit in HLRC datafeed tests (#55210)

The MachineLearningIT.testStopDatafeed test was creating 3
jobs, each with the default model memory limit of 1GB. With
the default ML memory percentage of 30%, those 3GB of ML
memory meant the test would not run on a machine with less
than 10GB of RAM.

This change reduces the model memory limit for these jobs to
0.5GB, so the combined requirement drops to 1.5GB and the
test will run on a machine with only 5GB of RAM.

Relates to https://discuss.elastic.co/t/failed-ml-tests-when-running-the-gradle-check-task-against-unchanged-repo-code/227829
David Roberts 2020-04-15 09:43:29 +01:00
parent 48048646e7
commit 8c33cad2b2
1 changed file with 2 additions and 0 deletions

@@ -162,6 +162,7 @@ import org.elasticsearch.client.ml.inference.trainedmodel.RegressionConfig;
 import org.elasticsearch.client.ml.inference.trainedmodel.TargetType;
 import org.elasticsearch.client.ml.inference.trainedmodel.langident.LangIdentNeuralNetwork;
 import org.elasticsearch.client.ml.job.config.AnalysisConfig;
+import org.elasticsearch.client.ml.job.config.AnalysisLimits;
 import org.elasticsearch.client.ml.job.config.DataDescription;
 import org.elasticsearch.client.ml.job.config.Detector;
 import org.elasticsearch.client.ml.job.config.Job;
@@ -2572,6 +2573,7 @@ public class MachineLearningIT extends ESRestHighLevelClientTestCase {
         //should not be random, see:https://github.com/elastic/ml-cpp/issues/208
         configBuilder.setBucketSpan(new TimeValue(5, TimeUnit.SECONDS));
         builder.setAnalysisConfig(configBuilder);
+        builder.setAnalysisLimits(new AnalysisLimits(512L, 4L));
         DataDescription.Builder dataDescription = new DataDescription.Builder();
         dataDescription.setTimeFormat(DataDescription.EPOCH_MS);