Revised Integration Tests

This directory builds a Docker image for Druid, then uses that image, along with test configuration, to run tests. This version greatly evolves the integration tests from the earlier form. See the History section for details.

Shortcuts

List of the most common commands once you're familiar with the framework. If you are new to the framework, see Quickstart for an explanation.

Build Druid

To make the text a bit simpler, define a variable for the standard settings:

```bash
export MAVEN_IGNORE="-P skip-static-checks,skip-tests -Dmaven.javadoc.skip=true"
```

Then build the Druid distribution:

```bash
mvn clean package -P dist $MAVEN_IGNORE -T1.0C
```
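
To confirm that the build produced a distribution, look under `distribution/target`; the exact tarball name depends on the Druid version in your checkout, so the pattern below is an assumption:

```bash
# The version embedded in the file name will vary with your checkout.
ls distribution/target/apache-druid-*-bin.tar.gz
```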

Build the Test Image

```bash
cd $DRUID_DEV/integration-tests-ex/image
mvn install -P test-image $MAVEN_IGNORE
```
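
The commands here and below assume that `DRUID_DEV` points to the root of your Druid source tree, for example:

```bash
# Adjust the path to wherever you cloned the Druid repository.
export DRUID_DEV=$HOME/git/druid
```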

Run an IT from the Command Line

```bash
mvn verify -P IT-<category> -pl :druid-it-cases $MAVEN_IGNORE
```

Where <category> is one of the test categories.
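
For example, to run the batch indexing tests, assuming your checkout defines a category named `BatchIndex` (check the test sources under `integration-tests-ex/cases` for the current list):

```bash
mvn verify -P IT-BatchIndex -pl :druid-it-cases $MAVEN_IGNORE
```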

Or

```bash
cd $DRUID_DEV/integration-tests-ex/cases
mvn verify -P skip-static-checks,docker-tests,IT-<category> \
    -Dmaven.javadoc.skip=true -DskipUTs=true \
    -pl :druid-it-cases
```

Run an IT from the IDE

Start the cluster:

```bash
cd $DRUID_DEV/integration-tests-ex/cases
./cluster.sh <category> up
```

Where <category> is one of the test categories. Then launch the test as a JUnit test.
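
When you are finished, bring the cluster back down with the same script; the `down` subcommand shown here mirrors how `up` is used above, but check `cluster.sh` if your copy behaves differently:

```bash
cd $DRUID_DEV/integration-tests-ex/cases
./cluster.sh <category> down
```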

Contents

Background information

Goals

The goal of the present version is to simplify development.

  • Speed up the Druid test image build by avoiding download of dependencies. (Instead, any such dependencies are managed by Maven and reside in the local build cache.)
  • Use official images for dependencies to avoid the need to download, install, and manage those dependencies.
  • Make it easy to manually build the image, launch a cluster, and run a test against the cluster.
  • Convert tests to JUnit so that they will easily run in your favorite IDE, just like other Druid tests.
  • Use the actual Druid build from distribution so we know what is tested.
  • Leverage, don't fight, Maven.
  • Run the integration tests easily on a typical development machine.

By meeting these goals, you can quickly:

  • Build the Druid distribution.
  • Build the Druid image. (< 1 minute)
  • Launch the cluster for the particular test. (a few seconds)
  • Run the test any number of times in your debugger.
  • Clean up the test artifacts.

The result is that the fastest path to develop a Druid patch or feature is:

  • Create a normal unit test and run it to verify your code.
  • Create an integration test that double-checks the code in a live cluster.