Commit Graph

11 Commits

Author SHA1 Message Date
Zoltan Haindrich 5f3b310115
Build reliability fixes (#15048)
* disable parallel builds; enable batch mode to get rid of transfer progress

* restore .m2 from setup-java if not found

* some change to sql

* add ws

* fix quote

* fix quote

* undo querytest change

* nullhandling in mvtest

* init more

* skip commitid plugin

* add back 1.0C to build; remove redundant skips from copy-resources; add comment (example invocation below)
2023-09-28 12:27:52 -07:00
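
For context on the build options mentioned in the bullets above, the sketch below shows what a batch-mode Maven build with one thread per core looks like. It is illustrative only, not the exact CI invocation from #15048; the goals and `-DskipTests` are assumptions for the example.

```sh
# Illustrative only: -B (batch mode) suppresses interactive output such as transfer
# progress, and -T 1.0C runs one build thread per CPU core, as referenced in the
# commit bullets above. Goals and -DskipTests are assumptions, not taken from the PR.
mvn -B -T 1.0C clean install -DskipTests
```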
Tejaswini Bandlamudi a45b25fa1d
Removes support for Hadoop 2 (#14763)
Removing Hadoop 2 support as discussed in https://lists.apache.org/list?dev@druid.apache.org:lte=1M:hadoop
2023-08-09 17:47:52 +05:30
Gian Merlino 63ee69b4e8
Claim full support for Java 17. (#14384)
* Claim full support for Java 17.

No production code has changed, except the startup scripts.

Changes:

1) Allow Java 17 without DRUID_SKIP_JAVA_CHECK.

2) Include the full list of opens and exports on both Java 11 and 17.

3) Document that Java 17 is both supported and preferred.

4) Switch some tests from Java 11 to 17 to get better coverage on the
   preferred version.

* Doc update.

* Update errorprone.

* Update docker_build_containers.sh.

* Update errorprone in licenses.yaml.

* Add some more run-javas.

* Additional run-javas.

* Update errorprone.

* Suppress new errorprone error.

* Add exports and opens in ForkingTaskRunner for Java 11+.

Test, doc changes.

* Additional errorprone updates.

* Update for errorprone.

* Restore old formatting in LdapCredentialsValidator.

* Copy bin/ too.

* Fix Java 15, 17 build line in docker_build_containers.sh.

* Update busybox image.

* One more java command.

* Fix interpolation.

* IT commandline refinements.

* Switch to busybox 1.34.1-glibc.

* POM adjustments, build and test one IT on 17.

* Additional debugging.

* Fix silly thing.

* Adjust command line.

* Add exports and opens one more place.

* Additional harmonization of strong encapsulation parameters (flag form sketched below).
2023-07-07 12:52:35 -07:00
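
The "opens and exports" referenced in this commit are JVM strong-encapsulation flags. The sketch below shows only the flag form; the specific modules and the launch command are illustrative assumptions, not the actual list added by #14384.

```sh
# Illustrative flag form only; the modules shown are examples, not Druid's actual list.
java \
  --add-opens=java.base/java.lang=ALL-UNNAMED \
  --add-exports=java.base/jdk.internal.misc=ALL-UNNAMED \
  -cp "lib/*" org.apache.druid.cli.Main server broker
```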
Vadim Ogievetsky b9edfe34a4
be consistent about referring to the web console by its name (#13118) 2022-09-19 15:02:17 -07:00
Paul Rogers f83fab699e
Add IT-related changes pulled out of PR #12368 (#12673)
This commit contains changes made to the existing ITs to support the new ITs.

Changes:
- Make the "custom node role" code usable by the new ITs. 
- Use flag `-DskipITs` to skip the integration tests but run unit tests.
- Use flag `-DskipUTs` to skip unit tests but run the "new" integration tests.
- Expand the existing Druid profile, `-P skip-tests`, to skip both ITs and UTs (example invocations below).
2022-06-26 02:13:59 +05:30
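
The flags described in this commit translate to Maven invocations along these lines; the build goal (`install`) is an assumption for illustration.

```sh
mvn install -P skip-tests   # skip both unit tests and integration tests
mvn install -DskipITs       # run unit tests, skip integration tests
mvn install -DskipUTs       # skip unit tests, run the "new" integration tests
```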
Jihoon Son 3d9e3dbad9
Fix hadoop library location for integration tests (#12497) 2022-06-23 10:39:54 -05:00
Jihoon Son cacfcfcdab
ignore hadoop-gcs directory already exists error for integration tests (#12169) 2022-01-19 09:35:50 -08:00
Jihoon Son 58378aa967
Move gcs-connector from lib to hadoop-dependencies for integration test (#12144) 2022-01-12 16:47:34 -08:00
Jihoon Son 4a74c5adcc
Use Druid's extension loading for integration test instead of maven (#12095)
* Use Druid's extension loading for integration test instead of maven

* fix maven command

* override config path

* load input format extensions and kafka by default; add prepopulated-data group

* all docker-composes are overridable

* fix s3 configs

* override config for all

* fix docker_compose_args

* fix security tests

* turn off debug logs for overlord api calls

* clean up stuff

* revert docker-compose.yml

* fix override config for query error test; fix circular dependency in docker compose

* add back some dependencies in docker compose

* new maven profile for integration test (see the example invocation below)

* example file filter
2022-01-05 23:33:04 -08:00
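
As a rough illustration of the override-config and integration-test-profile bullets above, a run might look like the sketch below. The profile name `integration-tests` and the property `override.config.path` follow the conventions of Druid's integration-tests module and are assumptions here, not quoted from this commit.

```sh
# Hedged sketch: the profile and property names are assumptions (see note above).
mvn verify -P integration-tests \
  -Doverride.config.path=/path/to/override-config.env
```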
Karan Kumar cf27366b35
Fixing typos in docker build scripts (#11866) 2021-11-02 23:50:52 +05:30
Karan Kumar 90640bb316
Support for hadoop 3 via maven profiles (#11794)
Add support for Hadoop 3 profiles. Most of the details are captured in #11791.
We use a combination of Maven profiles and resource filtering to achieve this. Hadoop 2 is supported by default, and a new Maven profile named hadoop3 is created. This allows the user to choose the profile best suited for the use case.
2021-10-30 22:46:24 +05:30
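
Based on the description above, selecting the Hadoop 3 build would look roughly like this; the goals are illustrative, and Hadoop 2 remains the default when no profile is given.

```sh
# Illustrative: choose the hadoop3 Maven profile described in #11794.
mvn clean install -P hadoop3
```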