# Revised Integration Tests
This directory builds a Docker image for Druid, then uses that image, along with test configuration, to run tests. This version greatly evolves the integration tests from the earlier form. See the History section for details.
## Shortcuts
List of the most common commands once you're familiar with the framework. If you are new to the framework, see Quickstart for an explanation.
### Build Druid

    ./it.sh build
### Build the Test Image

    ./it.sh image
Note: If you are running on Apple Silicon processors, you will also need to uncomment all occurrences of `platform: linux/x86_64` in `dependencies.yaml`.
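If you would rather not edit the file by hand, something along the following lines can do the uncommenting for you. This is only a sketch: it assumes the relevant lines in `dependencies.yaml` are commented out with a leading `#`, so check the file first and adjust the pattern if needed.

```bash
# Uncomment every "platform: linux/x86_64" line in dependencies.yaml,
# keeping a .bak copy of the original file. Assumes a leading "#" marks
# the commented-out lines; verify against the actual file before running.
sed -i.bak 's/^\( *\)# *\(platform: linux\/x86_64\)/\1\2/' dependencies.yaml
```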
### Run an IT from the Command Line

    ./it.sh test <category>

Where `<category>` is one of the test categories. You can see the list of test categories at `src/test/java/org/apache/druid/testsEx/categories`. The corresponding test classes are also annotated with `@Category`, like `@Category(HighAvailability.class)`.
For example, the command to run the IT for `@Category(HighAvailability.class)` would be:

    ./it.sh test HighAvailability
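To discover the available categories, one quick option is simply to list that package directory. This is a sketch that assumes you are running from the module that contains the `src/test` tree referenced above:

```bash
# Each class in this package defines one test category; the file names
# match the category names passed to it.sh.
ls src/test/java/org/apache/druid/testsEx/categories/
```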
### Run an IT from the IDE

Start the cluster:

    ./it.sh up <category>

Where `<category>` is one of the test categories. Then launch the test as a JUnit test.
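For example, for the `HighAvailability` category the flow is roughly the following sketch:

```bash
# Bring up the cluster for the HighAvailability category...
./it.sh up HighAvailability
# ...then run or debug the @Category(HighAvailability.class) JUnit test
# classes directly from the IDE, as many times as needed, against the
# running cluster.
```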
## Contents
- Goals
- Quickstart
- Create a new test
- Maven configuration
- Docker image
- Druid configuration
- Docker Compose configuration
- Test configuration
- Test structure
- Test runtime semantics
- Scripts
- Dependencies
- Debugging
Background information
- Next steps
- Test conversion - How to convert existing tests.
- History - Comparison with prior integration tests.
## Goals
The goal of the present version is to simplify development.
- Speed up the Druid test image build by avoiding download of dependencies. (Instead, any such dependencies are managed by Maven and reside in the local build cache.)
- Use official images for dependencies to avoid the need to download, install, and manage those dependencies.
- Make it easy to manually build the image, launch a cluster, and run a test against the cluster.
- Convert tests to JUnit so that they will easily run in your favorite IDE, just like other Druid tests.
- Use the actual Druid build from `distribution` so we know what is tested.
- Leverage, don't fight, Maven.
- Run the integration tests easily on a typical development machine.
By meeting these goals, you can quickly:
- Build the Druid distribution.
- Build the Druid image. (< 1 minute)
- Launch the cluster for the particular test. (a few seconds)
- Run the test any number of times in your debugger.
- Clean up the test artifacts.
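Put together, the cycle above corresponds to a command sequence roughly like the following, using `HighAvailability` as an example category. The final cleanup step via `down` is an assumption about `it.sh`; adjust it to whatever cleanup subcommand your checkout provides.

```bash
# Build the Druid distribution.
./it.sh build

# Build the Druid test image.
./it.sh image

# Launch the cluster for one test category.
./it.sh up HighAvailability

# Run the tests for that category (or run them from your IDE/debugger).
./it.sh test HighAvailability

# Clean up the cluster and test artifacts (subcommand name assumed).
./it.sh down HighAvailability
```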
The result is that the fastest path to develop a Druid patch or feature is:
- Create a normal unit test and run it to verify your code.
- Create an integration test that double-checks the code in a live cluster.