druid/integration-tests/docker
Jihoon Son 73ce5df22d
Add support for authorizing query context params (#12396)
The query context is a way for the user to give hints to the Druid query engine, either to enforce a certain behavior or to steer the engine toward a preferred plan during query planning. Today, there are three types of query context params, listed below.

Default context params. These are set via druid.query.default.context in the runtime properties. Any user context param can also be set as a default param (see the example after this list).
User context params. These are set in the user's query request. See https://druid.apache.org/docs/latest/querying/query-context.html for the available parameters.
System context params. These are set by the Druid query engine itself during query processing, and they override the other context params.
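
As a rough illustration (the property value, datasource, and interval below are made up), a default context param is set as a JSON map in the runtime properties:

druid.query.default.context={"maxSubqueryRows": 100000}

while a user context param is carried in the "context" block of the query request:

{
  "queryType" : "timeseries",
  "dataSource" : "wikipedia",
  "intervals" : [ "2022-01-01/2022-02-01" ],
  "granularity" : "all",
  "aggregations" : [ { "type" : "count", "name" : "rows" } ],
  "context" : { "maxSubqueryRows" : 100000 }
}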
Today, users are allowed to set any context params. This can cause
1) a bad UX if the context param is not mature yet, or
2) in the worst case, query failures or even system faults if a sensitive param is abused (e.g., maxSubqueryRows).

This PR adds the ability to limit context params per user role. That means a query will fail if it sets a context param that the issuing user is not allowed to use. To do that, this PR adds a new built-in resource type, QUERY_CONTEXT. The resource to authorize has the name of the context param (such as maxSubqueryRows) and the type QUERY_CONTEXT. To allow a certain context param for a user, the user must be granted WRITE permission on that context param resource. Here is an example of such a permission:

{
  "resourceAction" : {
    "resource" : {
      "name" : "maxSubqueryRows",
      "type" : "QUERY_CONTEXT"
    },
    "action" : "WRITE"
  },
  "resourceNamePattern" : "maxSubqueryRows"
}
Each role can have multiple permissions for context params; each permission should cover a different context param.
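
For example, with the druid-basic-security extension, such a permission can be attached to a role through the Coordinator's permission API. The authorizer name, role name, and host below are placeholders, and note that this endpoint replaces the role's whole permission list, so any existing permissions must be included as well:

curl -X POST \
  -H 'Content-Type: application/json' \
  -d '[ { "resource" : { "name" : "maxSubqueryRows", "type" : "QUERY_CONTEXT" }, "action" : "WRITE" } ]' \
  http://COORDINATOR_HOST:8081/druid-ext/basic-security/authorization/db/MyBasicMetadataAuthorizer/roles/someRole/permissions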

When a query is issued with a query context param X, the query will fail if the issuing user does not have WRITE permission on X. In this case:

HTTP endpoints will return a 403 response code.
JDBC will throw a ForbiddenException.
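
For instance, a SQL query submitted over HTTP with a context param the user is not allowed to set would be rejected (the datasource and host below are placeholders):

curl -X POST http://BROKER_HOST:8082/druid/v2/sql \
  -H 'Content-Type: application/json' \
  -d '{ "query" : "SELECT COUNT(*) FROM wikipedia", "context" : { "maxSubqueryRows" : 100000 } }'
# => 403 Forbidden when authorization is enabled and the user's role lacks WRITE on maxSubqueryRows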
Note: there is a context param called brokerService that is used only by the Router. This param pins your query to a specific Broker. Because authorization is performed in the Broker rather than in the Router, if brokerService is set in your query without the proper permission, the query will fail in the Broker after routing is done. Technically this is not quite right, because the authorization is checked only after the context param has already taken effect. However, it should not cause any user-facing issue: the query still fails if the user doesn't have permission for brokerService.

Context param authorization can be enabled via druid.auth.authorizeQueryContextParams. It is disabled by default to avoid surprises for anyone who upgrades their cluster without reading the release notes.
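
For reference, enabling it is a single runtime property (shown here with its non-default value):

druid.auth.authorizeQueryContextParams=true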
2022-04-21 14:21:16 +05:30
environment-configs Add support for authorizing query context params (#12396) 2022-04-21 14:21:16 +05:30
ldap-configs Add support for authorizing query context params (#12396) 2022-04-21 14:21:16 +05:30
schema-registry add avro + kafka + schema registry integration test (#10929) 2021-03-08 08:12:12 -08:00
service-supervisords remove ZooKeeper 3.4 support + pass tests with Java 15 (#11073) 2021-05-25 12:49:49 -07:00
test-data Increase default DatasourceCompactionConfig.inputSegmentSizeBytes to Long.MAX_VALUE (#12381) 2022-04-04 16:28:53 +05:30
tls Upgrade RSA Key from 1024 bit to 4096 to eliminate warnings (#11743) 2022-01-11 13:24:09 +08:00
Dockerfile Use Druid's extension loading for integration test instead of maven (#12095) 2022-01-05 23:33:04 -08:00
base-setup.sh remove ZooKeeper 3.4 support + pass tests with Java 15 (#11073) 2021-05-25 12:49:49 -07:00
docker-compose.base.yml Use Druid's extension loading for integration test instead of maven (#12095) 2022-01-05 23:33:04 -08:00
docker-compose.cli-indexer.yml integration test for coordinator and overlord leadership client (#10680) 2020-12-17 22:50:12 -08:00
docker-compose.druid-hadoop.yml Integration Tests. (#9854) 2020-06-02 09:38:53 -07:00
docker-compose.high-availability.yml Use Druid's extension loading for integration test instead of maven (#12095) 2022-01-05 23:33:04 -08:00
docker-compose.ldap-security.yml Use Druid's extension loading for integration test instead of maven (#12095) 2022-01-05 23:33:04 -08:00
docker-compose.query-error-test.yml Use Druid's extension loading for integration test instead of maven (#12095) 2022-01-05 23:33:04 -08:00
docker-compose.query-retry-test.yml Use Druid's extension loading for integration test instead of maven (#12095) 2022-01-05 23:33:04 -08:00
docker-compose.schema-registry-indexer.yml add avro + kafka + schema registry integration test (#10929) 2021-03-08 08:12:12 -08:00
docker-compose.schema-registry.yml add avro + kafka + schema registry integration test (#10929) 2021-03-08 08:12:12 -08:00
docker-compose.security.yml integration test for coordinator and overlord leadership client (#10680) 2020-12-17 22:50:12 -08:00
docker-compose.yml Use Druid's extension loading for integration test instead of maven (#12095) 2022-01-05 23:33:04 -08:00
druid.sh Use Druid's extension loading for integration test instead of maven (#12095) 2022-01-05 23:33:04 -08:00
run-mysql.sh add missing license headers, in particular to MD files; clean up RAT … (#6563) 2018-11-13 09:38:37 -08:00
supervisord.conf Integration tests for JDK 11 (#9249) 2020-02-12 16:36:31 -08:00
wiki-simple-lookup.json refactor lookups to be more chill to router (#7222) 2019-04-05 14:49:41 -07:00