AWS RDS token based password provider (#9518)

* refresh db pwd

* aws iam token password provider

* fix analyze-dependencies build

* fix doc build

* add ut for BasicDataSourceExt

* more doc updates

* more doc updates

* moving aws token password provider to new extension

* remove duplicate changes

* make all config inline

* extension docs

* refresh db password in SQL Firehose code path as well

* add ut

* fix build

* add new extension to distribution

* rds lib is not provided

* fix license build

* add version to license

* change parent version to 0.19.0-snapshot

* address review comments

* fix core/ code coverage

* Update server/src/main/java/org/apache/druid/metadata/BasicDataSourceExt.java

Co-authored-by: Clint Wylie <cjwylie@gmail.com>

* address review comments

* fix spellchecker

* remove inadvertent website file change

Co-authored-by: Clint Wylie <cjwylie@gmail.com>
Himanshu 2021-01-06 21:15:29 -08:00, committed by GitHub
parent 48e576a307
commit c7b1212a43
16 changed files with 710 additions and 3 deletions

@@ -230,6 +230,8 @@
<argument>-c</argument>
<argument>org.apache.druid.extensions:druid-s3-extensions</argument>
<argument>-c</argument>
<argument>org.apache.druid.extensions:druid-aws-rds-extensions</argument>
<argument>-c</argument>
<argument>org.apache.druid.extensions:druid-ec2-extensions</argument>
<argument>-c</argument>
<argument>org.apache.druid.extensions:druid-google-extensions</argument>

@@ -0,0 +1,38 @@
---
id: druid-aws-rds
title: "Druid AWS RDS Module"
---
<!--
~ Licensed to the Apache Software Foundation (ASF) under one
~ or more contributor license agreements. See the NOTICE file
~ distributed with this work for additional information
~ regarding copyright ownership. The ASF licenses this file
~ to you under the Apache License, Version 2.0 (the
~ "License"); you may not use this file except in compliance
~ with the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing,
~ software distributed under the License is distributed on an
~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
~ KIND, either express or implied. See the License for the
~ specific language governing permissions and limitations
~ under the License.
-->
[AWS RDS](https://aws.amazon.com/rds/) is a managed service for operating relational databases such as PostgreSQL and MySQL. These databases can be accessed using a static database password or via [AWS IAM](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/UsingWithRDS.IAMDBAuth.html) temporary tokens. This module provides an AWS RDS token [password provider](../../operations/password-provider.md) implementation to be used with the [mysql-metadata-store](mysql.md) or [postgresql-metadata-store](postgresql.md) when MySQL or PostgreSQL is run on AWS RDS.
```json
{ "type": "aws-rds-token", "user": "USER", "host": "HOST", "port": PORT, "region": "AWS_REGION" }
```
Before using this password provider, make sure that the database user is set up to connect using IAM token authentication.
See the [AWS Guide](https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/UsingWithRDS.IAMDBAuth.html).
To use this extension, make sure you [include](../../development/extensions.md#loading-extensions) it in your configuration file along with the other extensions, e.g.:
```
druid.extensions.loadList=["druid-aws-rds-extensions", "postgresql-metadata-storage", ...]
```
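For example, with the PostgreSQL metadata store the provider can be supplied inline as the connector password. A sketch only; the host, port, user, and region values below are placeholders for your own cluster:

```properties
druid.metadata.storage.type=postgresql
druid.metadata.storage.connector.connectURI=jdbc:postgresql://my-cluster.cluster-abc.us-east-1.rds.amazonaws.com:5432/druid
druid.metadata.storage.connector.user=druid
druid.metadata.storage.connector.password={"type":"aws-rds-token","user":"druid","host":"my-cluster.cluster-abc.us-east-1.rds.amazonaws.com","port":5432,"region":"us-east-1"}
```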

@@ -56,6 +56,7 @@ Core extensions are maintained by Druid committers.
|druid-ranger-security|Support for access control through Apache Ranger.|[link](../development/extensions-core/druid-ranger-security.md)|
|druid-s3-extensions|Interfacing with data in AWS S3, and using S3 as deep storage.|[link](../development/extensions-core/s3.md)|
|druid-ec2-extensions|Interfacing with AWS EC2 for autoscaling middle managers|UNDOCUMENTED|
|druid-aws-rds-extensions|Support for AWS token based access to AWS RDS DB Cluster.|[link](../development/extensions-core/druid-aws-rds.md)|
|druid-stats|Statistics related module including variance and standard deviation.|[link](../development/extensions-core/stats.md)|
|mysql-metadata-storage|MySQL metadata store.|[link](../development/extensions-core/mysql.md)|
|postgresql-metadata-storage|PostgreSQL metadata store.|[link](../development/extensions-core/postgresql.md)|

@@ -0,0 +1,80 @@
<?xml version="1.0" encoding="UTF-8"?>
<!--
~ Licensed to the Apache Software Foundation (ASF) under one
~ or more contributor license agreements. See the NOTICE file
~ distributed with this work for additional information
~ regarding copyright ownership. The ASF licenses this file
~ to you under the Apache License, Version 2.0 (the
~ "License"); you may not use this file except in compliance
~ with the License. You may obtain a copy of the License at
~
~ http://www.apache.org/licenses/LICENSE-2.0
~
~ Unless required by applicable law or agreed to in writing,
~ software distributed under the License is distributed on an
~ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
~ KIND, either express or implied. See the License for the
~ specific language governing permissions and limitations
~ under the License.
-->
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.apache.druid.extensions</groupId>
<artifactId>druid-aws-rds-extensions</artifactId>
<name>druid-aws-rds-extensions</name>
<description>druid-aws-rds-extensions</description>
<parent>
<groupId>org.apache.druid</groupId>
<artifactId>druid</artifactId>
<version>0.21.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>
<dependencies>
<dependency>
<groupId>org.apache.druid</groupId>
<artifactId>druid-core</artifactId>
<version>${project.parent.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-rds</artifactId>
<version>${aws.sdk.version}</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.google.inject</groupId>
<artifactId>guice</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-core</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-annotations</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
</project>

@@ -0,0 +1,47 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.apache.druid.aws.rds;
import com.fasterxml.jackson.databind.Module;
import com.fasterxml.jackson.databind.jsontype.NamedType;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.google.common.collect.ImmutableList;
import com.google.inject.Binder;
import org.apache.druid.initialization.DruidModule;
import java.util.List;
public class AWSRDSModule implements DruidModule
{
@Override
public List<? extends Module> getJacksonModules()
{
return ImmutableList.of(
new SimpleModule("DruidAwsRdsExtentionsModule").registerSubtypes(
new NamedType(AWSRDSTokenPasswordProvider.class, "aws-rds-token")
)
);
}
@Override
public void configure(Binder binder)
{
}
}

@@ -0,0 +1,123 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.apache.druid.aws.rds;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.services.rds.auth.GetIamAuthTokenRequest;
import com.amazonaws.services.rds.auth.RdsIamAuthTokenGenerator;
import com.fasterxml.jackson.annotation.JacksonInject;
import com.fasterxml.jackson.annotation.JsonCreator;
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.google.common.base.Preconditions;
import org.apache.druid.java.util.common.RE;
import org.apache.druid.java.util.common.logger.Logger;
import org.apache.druid.metadata.PasswordProvider;
/**
* Generates an AWS token, the same as the AWS CLI command
* aws rds generate-db-auth-token --hostname HOST --port PORT --region REGION --username USER
* and returns it as the password.
*
* Before using this, make sure the database user is set up to connect using IAM token authentication.
* See https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/UsingWithRDS.IAMDBAuth.html
*/
public class AWSRDSTokenPasswordProvider implements PasswordProvider
{
private static final Logger LOGGER = new Logger(AWSRDSTokenPasswordProvider.class);
private final String user;
private final String host;
private final int port;
private final String region;
private final AWSCredentialsProvider awsCredentialsProvider;
@JsonCreator
public AWSRDSTokenPasswordProvider(
@JsonProperty("user") String user,
@JsonProperty("host") String host,
@JsonProperty("port") int port,
@JsonProperty("region") String region,
@JacksonInject AWSCredentialsProvider awsCredentialsProvider
)
{
this.user = Preconditions.checkNotNull(user, "null metadataStorage user");
this.host = Preconditions.checkNotNull(host, "null metadataStorage host");
Preconditions.checkArgument(port > 0, "must provide port");
this.port = port;
this.region = Preconditions.checkNotNull(region, "null region");
LOGGER.info("AWS RDS Config user[%s], host[%s], port[%d], region[%s]", this.user, this.host, port, this.region);
this.awsCredentialsProvider = Preconditions.checkNotNull(awsCredentialsProvider, "null AWSCredentialsProvider");
}
@JsonProperty
public String getUser()
{
return user;
}
@JsonProperty
public String getHost()
{
return host;
}
@JsonProperty
public int getPort()
{
return port;
}
@JsonProperty
public String getRegion()
{
return region;
}
@JsonIgnore
@Override
public String getPassword()
{
try {
RdsIamAuthTokenGenerator generator = RdsIamAuthTokenGenerator
.builder()
.credentials(awsCredentialsProvider)
.region(region)
.build();
String authToken = generator.getAuthToken(
GetIamAuthTokenRequest
.builder()
.hostname(host)
.port(port)
.userName(user)
.build()
);
return authToken;
}
catch (Exception ex) {
LOGGER.error(ex, "Couldn't generate AWS token.");
throw new RE(ex, "Couldn't generate AWS token.");
}
}
}

@@ -0,0 +1,16 @@
# Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# The ASF licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
org.apache.druid.aws.rds.AWSRDSModule

@@ -0,0 +1,82 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.apache.druid.aws.rds;
import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.fasterxml.jackson.databind.InjectableValues;
import com.fasterxml.jackson.databind.Module;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.druid.metadata.PasswordProvider;
import org.junit.Assert;
import org.junit.Test;
import java.io.IOException;
public class AWSRDSTokenPasswordProviderTest
{
@Test
public void testSerde() throws IOException
{
ObjectMapper jsonMapper = new ObjectMapper();
for (Module module : new AWSRDSModule().getJacksonModules()) {
jsonMapper.registerModule(module);
}
jsonMapper.setInjectableValues(
new InjectableValues.Std().addValue(AWSCredentialsProvider.class, new AWSCredentialsProvider()
{
@Override
public AWSCredentials getCredentials()
{
return null;
}
@Override
public void refresh()
{
}
})
);
String jsonStr = "{\n"
+ " \"type\": \"aws-rds-token\",\n"
+ " \"user\": \"testuser\",\n"
+ " \"host\": \"testhost\",\n"
+ " \"port\": 5273,\n"
+ " \"region\": \"testregion\"\n"
+ "}\n";
PasswordProvider pp = jsonMapper.readValue(
jsonMapper.writeValueAsString(
jsonMapper.readValue(jsonStr, PasswordProvider.class)
),
PasswordProvider.class
);
AWSRDSTokenPasswordProvider awsPwdProvider = (AWSRDSTokenPasswordProvider) pp;
Assert.assertEquals("testuser", awsPwdProvider.getUser());
Assert.assertEquals("testhost", awsPwdProvider.getHost());
Assert.assertEquals(5273, awsPwdProvider.getPort());
Assert.assertEquals("testregion", awsPwdProvider.getRegion());
}
}

@@ -147,6 +147,16 @@ source_paths:
---
name: AWS RDS SDK for Java
license_category: source
module: extensions/druid-aws-rds-extensions
license_name: Apache License version 2.0
version: 1.11.199
libraries:
- com.amazonaws: aws-java-sdk-rds
---
name: LDAP string encoding function from OWASP ESAPI
license_category: source
module: extensions/druid-basic-security

@@ -112,6 +112,7 @@
<powermock.version>2.0.2</powermock.version>
<aws.sdk.version>1.11.199</aws.sdk.version>
<caffeine.version>2.8.0</caffeine.version>
<jacoco.version>0.8.5</jacoco.version>
<!-- Curator requires 3.4.x ZooKeeper clients to maintain compatibility with 3.4.x ZooKeeper servers,
If we upgrade to 3.5.x clients, curator requires 3.5.x servers, which would break backwards compatibility
see http://curator.apache.org/zk-compatibility.html -->
@@ -168,6 +169,7 @@
<module>extensions-core/lookups-cached-single</module>
<module>extensions-core/ec2-extensions</module>
<module>extensions-core/s3-extensions</module>
<module>extensions-core/druid-aws-rds-extensions</module>
<module>extensions-core/simple-client-sslcontext</module>
<module>extensions-core/druid-basic-security</module>
<module>extensions-core/google-extensions</module>
@@ -1269,7 +1271,7 @@
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<version>0.8.5</version>
<version>${jacoco.version}</version>
<configuration>
<excludes>
<!-- Ignore initialization classes, these are tested by the integration tests. -->

@@ -459,6 +459,17 @@
<build>
<plugins>
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<version>${jacoco.version}</version>
<configuration>
<excludes>
<!-- There are UTs for this class but it is very difficult to get required branch coverage -->
<exclude>org/apache/druid/metadata/BasicDataSourceExt.class</exclude>
</excludes>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>

@@ -0,0 +1,179 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.apache.druid.metadata;
import com.google.common.annotations.VisibleForTesting;
import org.apache.commons.dbcp2.BasicDataSource;
import org.apache.commons.dbcp2.ConnectionFactory;
import org.apache.druid.java.util.common.RE;
import org.apache.druid.java.util.common.logger.Logger;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Properties;
/**
* This class exists so that {@link MetadataStorageConnectorConfig} is asked for the password every time a brand new
* connection is established with the DB. {@link PasswordProvider} impls such as those based on AWS tokens refresh the
* underlying token periodically, since each token is valid only for a certain period of time.
* So, this class overrides (and largely copies, due to lack of extensibility) the methods from the base class in
* order to keep track of connection properties and call {@link MetadataStorageConnectorConfig#getPassword()} every
* time a new connection is set up.
*/
public class BasicDataSourceExt extends BasicDataSource
{
private static final Logger LOGGER = new Logger(BasicDataSourceExt.class);
private Properties connectionProperties;
private final MetadataStorageConnectorConfig connectorConfig;
public BasicDataSourceExt(MetadataStorageConnectorConfig connectorConfig)
{
this.connectorConfig = connectorConfig;
this.connectionProperties = new Properties();
}
@Override
public void addConnectionProperty(String name, String value)
{
connectionProperties.put(name, value);
super.addConnectionProperty(name, value);
}
@Override
public void removeConnectionProperty(String name)
{
connectionProperties.remove(name);
super.removeConnectionProperty(name);
}
@Override
public void setConnectionProperties(String connectionProperties)
{
if (connectionProperties == null) {
throw new NullPointerException("connectionProperties is null");
}
String[] entries = connectionProperties.split(";");
Properties properties = new Properties();
for (String entry : entries) {
if (entry.length() > 0) {
int index = entry.indexOf('=');
if (index > 0) {
String name = entry.substring(0, index);
String value = entry.substring(index + 1);
properties.setProperty(name, value);
} else {
// no value is empty string which is how java.util.Properties works
properties.setProperty(entry, "");
}
}
}
this.connectionProperties = properties;
super.setConnectionProperties(connectionProperties);
}
@VisibleForTesting
public Properties getConnectionProperties()
{
return connectionProperties;
}
@Override
protected ConnectionFactory createConnectionFactory() throws SQLException
{
Driver driverToUse = getDriver();
if (driverToUse == null) {
Class<?> driverFromCCL = null;
if (getDriverClassName() != null) {
try {
try {
if (getDriverClassLoader() == null) {
driverFromCCL = Class.forName(getDriverClassName());
} else {
driverFromCCL = Class.forName(
getDriverClassName(), true, getDriverClassLoader());
}
}
catch (ClassNotFoundException cnfe) {
driverFromCCL = Thread.currentThread(
).getContextClassLoader().loadClass(
getDriverClassName());
}
}
catch (Exception t) {
String message = "Cannot load JDBC driver class '" +
getDriverClassName() + "'";
LOGGER.error(t, message);
throw new SQLException(message, t);
}
}
try {
if (driverFromCCL == null) {
driverToUse = DriverManager.getDriver(getUrl());
} else {
// Usage of DriverManager is not possible, as it does not
// respect the ContextClassLoader
// N.B. This cast may cause ClassCastException which is handled below
driverToUse = (Driver) driverFromCCL.newInstance();
if (!driverToUse.acceptsURL(getUrl())) {
throw new SQLException("No suitable driver", "08001");
}
}
}
catch (Exception t) {
String message = "Cannot create JDBC driver of class '" +
(getDriverClassName() != null ? getDriverClassName() : "") +
"' for connect URL '" + getUrl() + "'";
LOGGER.error(t, message);
throw new SQLException(message, t);
}
}
if (driverToUse == null) {
throw new RE("Failed to find the DB Driver");
}
final Driver finalDriverToUse = driverToUse;
return () -> {
String user = connectorConfig.getUser();
if (user != null) {
connectionProperties.put("user", user);
} else {
log("DBCP DataSource configured without a 'username'");
}
// Note: This is the main point of this class where we are getting fresh password before setting up
// every new connection.
String password = connectorConfig.getPassword();
if (password != null) {
connectionProperties.put("password", password);
} else {
log("DBCP DataSource configured without a 'password'");
}
return finalDriverToUse.connect(connectorConfig.getConnectURI(), connectionProperties);
};
}
}
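The `createConnectionFactory()` override above boils down to one idea: read the password inside the factory lambda rather than once at pool setup, so a token-based provider can hand out a fresh credential per connection. A minimal self-contained sketch of that pattern, with hypothetical names and no real JDBC driver involved:

```java
import java.util.Properties;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

public class FreshPasswordDemo
{
  // Stand-in for dbcp2's ConnectionFactory; returns the properties a real
  // factory would pass to Driver.connect().
  interface PropsFactory
  {
    Properties newConnectionProps();
  }

  // The password supplier is invoked inside the lambda, so every new
  // "connection" sees a freshly generated credential -- the same trick
  // BasicDataSourceExt plays with MetadataStorageConnectorConfig#getPassword().
  static PropsFactory factory(String user, Supplier<String> passwordProvider)
  {
    return () -> {
      Properties props = new Properties();
      props.put("user", user);
      props.put("password", passwordProvider.get()); // fetched per connection, never cached
      return props;
    };
  }

  public static void main(String[] args)
  {
    AtomicInteger tokenCounter = new AtomicInteger();
    PropsFactory f = factory("druid", () -> "token-" + tokenCounter.incrementAndGet());
    System.out.println(f.newConnectionProps().getProperty("password")); // token-1
    System.out.println(f.newConnectionProps().getProperty("password")); // token-2
  }
}
```

If the password were captured once, outside the lambda, every pooled connection would reuse the token that was current at startup and fail once it expired.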

@@ -64,7 +64,7 @@ public abstract class SQLFirehoseDatabaseConnector
protected BasicDataSource getDatasource(MetadataStorageConnectorConfig connectorConfig)
{
BasicDataSource dataSource = new BasicDataSource();
BasicDataSource dataSource = new BasicDataSourceExt(connectorConfig);
dataSource.setUsername(connectorConfig.getUser());
dataSource.setPassword(connectorConfig.getPassword());
String uri = connectorConfig.getConnectURI();

@@ -654,7 +654,7 @@ public abstract class SQLMetadataConnector implements MetadataStorageConnector
if (dbcpProperties != null) {
dataSource = BasicDataSourceFactory.createDataSource(dbcpProperties);
} else {
dataSource = new BasicDataSource();
dataSource = new BasicDataSourceExt(connectorConfig);
}
}
catch (Exception e) {

@@ -0,0 +1,113 @@
/*
* Licensed to the Apache Software Foundation (ASF) under one
* or more contributor license agreements. See the NOTICE file
* distributed with this work for additional information
* regarding copyright ownership. The ASF licenses this file
* to you under the Apache License, Version 2.0 (the
* "License"); you may not use this file except in compliance
* with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing,
* software distributed under the License is distributed on an
* "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
* KIND, either express or implied. See the License for the
* specific language governing permissions and limitations
* under the License.
*/
package org.apache.druid.metadata;
import org.apache.commons.dbcp2.ConnectionFactory;
import org.assertj.core.util.Lists;
import org.easymock.Capture;
import org.easymock.EasyMock;
import org.junit.Assert;
import org.junit.Test;
import java.sql.Driver;
import java.util.List;
import java.util.Properties;
public class BasicDataSourceExtTest
{
@Test
public void testCreateConnectionFactory() throws Exception
{
MetadataStorageConnectorConfig connectorConfig = new MetadataStorageConnectorConfig()
{
private final List<String> passwords = Lists.newArrayList("pwd1", "pwd2");
@Override
public String getUser()
{
return "testuser";
}
@Override
public String getPassword()
{
return passwords.remove(0);
}
};
BasicDataSourceExt basicDataSourceExt = new BasicDataSourceExt(connectorConfig);
basicDataSourceExt.setConnectionProperties("p1=v1");
basicDataSourceExt.addConnectionProperty("p2", "v2");
Driver driver = EasyMock.mock(Driver.class);
Capture<String> uriArg = Capture.newInstance();
Capture<Properties> propsArg = Capture.newInstance();
EasyMock.expect(driver.connect(EasyMock.capture(uriArg), EasyMock.capture(propsArg))).andReturn(null).times(2);
EasyMock.replay(driver);
basicDataSourceExt.setDriver(driver);
ConnectionFactory connectionFactory = basicDataSourceExt.createConnectionFactory();
Properties expectedProps = new Properties();
expectedProps.put("p1", "v1");
expectedProps.put("p2", "v2");
expectedProps.put("user", connectorConfig.getUser());
Assert.assertNull(connectionFactory.createConnection());
Assert.assertEquals(connectorConfig.getConnectURI(), uriArg.getValue());
expectedProps.put("password", "pwd1");
Assert.assertEquals(expectedProps, propsArg.getValue());
Assert.assertNull(connectionFactory.createConnection());
Assert.assertEquals(connectorConfig.getConnectURI(), uriArg.getValue());
expectedProps.put("password", "pwd2");
Assert.assertEquals(expectedProps, propsArg.getValue());
}
@Test
public void testConnectionPropertiesHanding()
{
BasicDataSourceExt basicDataSourceExt = new BasicDataSourceExt(EasyMock.mock(MetadataStorageConnectorConfig.class));
Properties expectedProps = new Properties();
basicDataSourceExt.setConnectionProperties("");
Assert.assertEquals(expectedProps, basicDataSourceExt.getConnectionProperties());
basicDataSourceExt.setConnectionProperties("p0;p1=v1;p2=v2;p3=v3");
basicDataSourceExt.addConnectionProperty("p4", "v4");
basicDataSourceExt.addConnectionProperty("p5", "v5");
basicDataSourceExt.removeConnectionProperty("p2");
basicDataSourceExt.removeConnectionProperty("p5");
expectedProps.put("p0", "");
expectedProps.put("p1", "v1");
expectedProps.put("p3", "v3");
expectedProps.put("p4", "v4");
Assert.assertEquals(expectedProps, basicDataSourceExt.getConnectionProperties());
}
}

@@ -88,6 +88,7 @@ HLL
HashSet
Homebrew
HyperLogLog
IAM
IANA
IETF
IP
@@ -152,6 +153,7 @@ ParseSpecs
Protobuf
RDBMS
RDDs
RDS
Rackspace
Redis
S3
@@ -872,6 +874,7 @@ DistinctCount
artifactId
com.example
common.runtime.properties
druid-aws-rds-extensions
druid-cassandra-storage
druid-distinctcount
druid-ec2-extensions