Merge pull request #191 from dreab8/wip/6.0_merged_12

Wip/6.0 merged
This commit is contained in:
Andrea Boriero 2019-11-08 15:54:35 +00:00 committed by GitHub
commit 7120b8bd40
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
41 changed files with 1972 additions and 174 deletions

View File

@@ -3,6 +3,55 @@ Hibernate 5 Changelog
Note: Please refer to JIRA to learn more about each issue.
Changes in 5.4.8.Final (October 28, 2019)
------------------------------------------------------------------------------------------------------------------------
https://hibernate.atlassian.net/projects/HHH/versions/31804/tab/release-report-done
** Bug
* [HHH-12965] - Hibernate Envers Audit tables are created with foreign key with the entity. Because of this I am not able to delete any entries from the entity tables.
* [HHH-13446] - java.lang.VerifyError from compile-time enhanced @Entity
* [HHH-13651] - NPE on flushing when ElementCollection field contains null element
* [HHH-13695] - DDL export forgets to close a Statement
* [HHH-13696] - Multiple OSGi bundles initializing concurrently would overlap classloaders
** Improvement
* [HHH-13686] - Upgrade to Agroal 1.6
Changes in 5.4.7.Final (October 21, 2019)
------------------------------------------------------------------------------------------------------------------------
https://hibernate.atlassian.net/projects/HHH/versions/31799/tab/release-report-done
** Bug
* [HHH-4235] - MapBinder.createFormulatedValue() does not honor DB schema name when creating query
* [HHH-13633] - Bugs join-fetching a collection when scrolling with a stateless session using enhancement as proxy
* [HHH-13634] - PersistenceContext can get cleared before load completes using StatelessSessionImpl
* [HHH-13640] - Uninitialized HibernateProxy mapped as NO_PROXY gets initialized when reloaded with enhancement-as-proxy enabled
* [HHH-13653] - Uninitialized entity does not get initialized when a setter is called with enhancement-as-proxy enabled
* [HHH-13655] - Envers Map<Enum, Integer> causes NullPointerException when mapped with @MapKeyEnumerated since Hibernate 5.4.6
* [HHH-13663] - Session#setHibernateFlushMode() method not callable without an active transaction
* [HHH-13665] - Selecting an entity annotated with @Immutable but not with @Cachable causes a NPE when use_reference_entries is enabled
* [HHH-13672] - The temporary PersistenceContext of a StatelessSession is not cleared after a refresh operation
* [HHH-13675] - Optimize PersistentBag.groupByEqualityHash()
** New Feature
* [HHH-10398] - _MOD columns not named correctly when using custom column names
** Task
* [HHH-13680] - Upgrade to Byte Buddy 1.10.2
* [HHH-13681] - Upgrade to Byteman 4.0.8
** Improvement
* [HHH-12858] - integration overrides during JPA bootstrap ought to override all logically related settings
* [HHH-13432] - Have EntityManagerFactory expose persistence.xml `jta-data-source` element as a `javax.persistence.nonJtaDataSource` property
* [HHH-13660] - Reduce allocation costs of IdentityMaps used by ResultSetProcessingContextImpl
* [HHH-13662] - Avoid initializing XmlMappingBinderAccess when no XML mappings are defined
* [HHH-13666] - AssertionFailure: Exception releasing cache locks upon After/BeforeTransactionCompletionProcess failure
* [HHH-13673] - Cryptic error when providing import.sql file without a terminal char at the end of each line
Changes in 5.4.6.Final (September 30, 2019)
------------------------------------------------------------------------------------------------------------------------

View File

@@ -4,7 +4,7 @@
== Preface
Working with both Object-Oriented software and Relational Databases can be cumbersome and time-consuming.
-Development costs are significantly higher due to a paradigm mismatch between how data is represented in objects
+Development costs are significantly higher due to a number of "paradigm mismatches" between how data is represented in objects
versus relational databases. Hibernate is an Object/Relational Mapping (ORM) solution for Java environments. The
term Object/Relational Mapping refers to the technique of mapping data between an object model representation to
a relational data model representation. See http://en.wikipedia.org/wiki/Object-relational_mapping for a good
@@ -14,7 +14,10 @@ takes a look at many of the mismatch problems.
Although having a strong background in SQL is not required to use Hibernate, having a basic understanding of the
concepts can help you understand Hibernate more quickly and fully. An understanding of data modeling principles
is especially important. Both http://www.agiledata.org/essays/dataModeling101.html and
-http://en.wikipedia.org/wiki/Data_modeling are good starting points for understanding these data modeling principles.
+http://en.wikipedia.org/wiki/Data_modeling are good starting points for understanding these data modeling
+principles. If you are completely new to database access in Java,
+https://www.marcobehler.com/guides/a-guide-to-accessing-databases-in-java contains a good overview of the various parts,
+pieces and options.
Hibernate takes care of the mapping from Java classes to database tables, and from Java data types to SQL data
types. In addition, it provides data query and retrieval facilities. It can significantly reduce development
@@ -32,4 +35,4 @@ representation to a graph of objects.
See http://hibernate.org/orm/contribute/ for information on getting involved.
IMPORTANT: The projects and code for the tutorials referenced in this guide are available as link:hibernate-tutorials.zip[]

View File

@@ -14,7 +14,7 @@ ext {
junit5Version = '5.3.1'
h2Version = '1.4.196'
-bytemanVersion = '4.0.3' //Compatible with JDK10
+bytemanVersion = '4.0.8' //Compatible with JDK14
jnpVersion = '5.0.6.CR1'
hibernateCommonsVersion = '5.1.0.Final'
@@ -26,7 +26,9 @@ ext {
weldVersion = '3.0.0.Final'
javassistVersion = '3.24.0-GA'
-byteBuddyVersion = '1.9.11'
+byteBuddyVersion = '1.10.2'
agroalVersion = '1.6'
geolatteVersion = '1.4.0'
@@ -146,8 +148,8 @@ ext {
proxool: "proxool:proxool:0.8.3",
hikaricp: "com.zaxxer:HikariCP:3.2.0",
vibur: "org.vibur:vibur-dbcp:22.2",
-agroal_api: "io.agroal:agroal-api:1.4",
+agroal_api: "io.agroal:agroal-api:${agroalVersion}",
-agroal_pool: "io.agroal:agroal-pool:1.4",
+agroal_pool: "io.agroal:agroal-pool:${agroalVersion}",
atomikos: "com.atomikos:transactions:4.0.6",
atomikos_jta: "com.atomikos:transactions-jta:4.0.6",

View File

@@ -0,0 +1,6 @@
STMT_END=1
MULTILINE_COMMENT=2
LINE_COMMENT=3
NEWLINE=4
WORD=5
QUOTED_TEXT=6

View File

@@ -58,9 +58,10 @@ static Implementation wrap(
String mappedBy = getMappedBy( persistentField, targetEntity, enhancementContext );
if ( mappedBy == null || mappedBy.isEmpty() ) {
log.infof(
-"Could not find bi-directional association for field [%s#%s]",
+"Bi-directional association not managed for field [%s#%s]: Could not find target field in [%s]",
managedCtClass.getName(),
-persistentField.getName()
+persistentField.getName(),
targetEntity.getCanonicalName()
);
return implementation;
}
@@ -101,7 +102,7 @@ static Implementation wrap(
if ( persistentField.getType().asErasure().isAssignableTo( Map.class ) || targetType.isAssignableTo( Map.class ) ) {
log.infof(
-"Bi-directional association for field [%s#%s] not managed: @ManyToMany in java.util.Map attribute not supported ",
+"Bi-directional association not managed for field [%s#%s]: @ManyToMany in java.util.Map attribute not supported ",
managedCtClass.getName(),
persistentField.getName()
);
@@ -145,7 +146,7 @@ public static TypeDescription getTargetEntityClass(TypeDescription managedCtClas
if ( targetClass == null ) {
log.infof(
-"Could not find type of bi-directional association for field [%s#%s]",
+"Bi-directional association not managed for field [%s#%s]: Could not find target type",
managedCtClass.getName(),
persistentField.getName()
);
@@ -163,7 +164,7 @@ else if ( !targetClass.resolve( TypeDescription.class ).represents( void.class )
private static TypeDescription.Generic target(AnnotatedFieldDescription persistentField) {
AnnotationDescription.Loadable<Access> access = persistentField.getDeclaringType().asErasure().getDeclaredAnnotations().ofType( Access.class );
-if ( access != null && access.loadSilent().value() == AccessType.FIELD ) {
+if ( access != null && access.load().value() == AccessType.FIELD ) {
return persistentField.getType();
}
else {
@@ -183,7 +184,20 @@ private static String getMappedBy(AnnotatedFieldDescription target, TypeDescript
return getMappedByManyToMany( target, targetEntity, context );
}
else {
-return mappedBy;
+// HHH-13446 - mappedBy from annotation may not be a valid bi-directional association, verify by calling isValidMappedBy()
return isValidMappedBy( target, targetEntity, mappedBy, context ) ? mappedBy : "";
}
}
private static boolean isValidMappedBy(AnnotatedFieldDescription persistentField, TypeDescription targetEntity, String mappedBy, ByteBuddyEnhancementContext context) {
try {
FieldDescription f = FieldLocator.ForClassHierarchy.Factory.INSTANCE.make( targetEntity ).locate( mappedBy ).getField();
AnnotatedFieldDescription annotatedF = new AnnotatedFieldDescription( context, f );
return context.isPersistentField( annotatedF ) && persistentField.getDeclaringType().asErasure().isAssignableTo( entityType( f.getType() ) );
}
catch ( IllegalStateException e ) {
return false;
}
}
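The new isValidMappedBy guard resolves the `mappedBy` name against the target entity's class hierarchy and checks type compatibility before trusting the annotation. A rough stand-in for that check, using plain java.lang.reflect instead of Byte Buddy's FieldLocator (the Author/Book types below are hypothetical illustrations, not Hibernate code):

```java
import java.lang.reflect.Field;

public class MappedByCheck {
    // Simplified analogue of the HHH-13446 guard: the mappedBy value is only
    // accepted if the target entity really declares a field with that name
    // whose type can hold the owning entity. The real enhancer also walks the
    // class hierarchy and handles collection-valued fields; this sketch does not.
    static boolean isValidMappedBy(Class<?> owner, Class<?> targetEntity, String mappedBy) {
        try {
            Field field = targetEntity.getDeclaredField(mappedBy);
            return field.getType().isAssignableFrom(owner);
        }
        catch (NoSuchFieldException e) {
            // Mirrors the catch in the patch: a bad mappedBy means
            // "no manageable bi-directional association", not an error.
            return false;
        }
    }

    static class Author { }
    static class Book { Author writer; }

    public static void main(String[] args) {
        System.out.println(isValidMappedBy(Author.class, Book.class, "writer")); // true
        System.out.println(isValidMappedBy(Author.class, Book.class, "editor")); // false
    }
}
```

The enhancer's version additionally requires the located field to be a persistent field according to the enhancement context; that extra predicate is omitted here.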

View File

@@ -514,7 +514,7 @@ private AnnotationList getAnnotations() {
private AnnotationList doGetAnnotations() {
AnnotationDescription.Loadable<Access> access = fieldDescription.getDeclaringType().asErasure()
.getDeclaredAnnotations().ofType( Access.class );
-if ( access != null && access.loadSilent().value() == AccessType.PROPERTY ) {
+if ( access != null && access.load().value() == AccessType.PROPERTY ) {
Optional<MethodDescription> getter = getGetter();
if ( getter.isPresent() ) {
return getter.get().getDeclaredAnnotations();
@@ -523,7 +523,7 @@ private AnnotationList doGetAnnotations() {
return fieldDescription.getDeclaredAnnotations();
}
}
-else if ( access != null && access.loadSilent().value() == AccessType.FIELD ) {
+else if ( access != null && access.load().value() == AccessType.FIELD ) {
return fieldDescription.getDeclaredAnnotations();
}
else {

View File

@@ -325,7 +325,7 @@ private void handleBiDirectionalAssociation(CtClass managedCtClass, CtField pers
final CtClass targetEntity = PersistentAttributesHelper.getTargetEntityClass( managedCtClass, persistentField );
if ( targetEntity == null ) {
log.infof(
-"Could not find type of bi-directional association for field [%s#%s]",
+"Bi-directional association not managed for field [%s#%s]: Could not find target type",
managedCtClass.getName(),
persistentField.getName()
);
@@ -334,9 +334,10 @@ private void handleBiDirectionalAssociation(CtClass managedCtClass, CtField pers
final String mappedBy = PersistentAttributesHelper.getMappedBy( persistentField, targetEntity, enhancementContext );
if ( mappedBy == null || mappedBy.isEmpty() ) {
log.infof(
-"Could not find bi-directional association for field [%s#%s]",
+"Bi-directional association not managed for field [%s#%s]: Could not find target field in [%s]",
managedCtClass.getName(),
-persistentField.getName()
+persistentField.getName(),
targetEntity.getName()
);
return;
}
@@ -459,7 +460,7 @@ private void handleBiDirectionalAssociation(CtClass managedCtClass, CtField pers
if ( PersistentAttributesHelper.isAssignable( persistentField.getType(), Map.class.getName() ) ||
PersistentAttributesHelper.isAssignable( targetEntity.getField( mappedBy ).getType(), Map.class.getName() ) ) {
log.infof(
-"Bi-directional association for field [%s#%s] not managed: @ManyToMany in java.util.Map attribute not supported ",
+"Bi-directional association not managed for field [%s#%s]: @ManyToMany in java.util.Map attribute not supported ",
managedCtClass.getName(),
persistentField.getName()
);

View File

@@ -209,7 +209,23 @@ public static boolean isPossibleBiDirectionalAssociation(CtField persistentField
public static String getMappedBy(CtField persistentField, CtClass targetEntity, JavassistEnhancementContext context) throws NotFoundException {
final String local = getMappedByFromAnnotation( persistentField );
-return local.isEmpty() ? getMappedByFromTargetEntity( persistentField, targetEntity, context ) : local;
+if ( local == null || local.isEmpty() ) {
return getMappedByFromTargetEntity( persistentField, targetEntity, context );
}
else {
// HHH-13446 - mappedBy from annotation may not be a valid bi-directional association, verify by calling isValidMappedBy()
return isValidMappedBy( persistentField, targetEntity, local, context ) ? local : "";
}
}
private static boolean isValidMappedBy(CtField persistentField, CtClass targetEntity, String mappedBy, JavassistEnhancementContext context) {
try {
CtField f = targetEntity.getField( mappedBy );
return context.isPersistentField( f ) && isAssignable( persistentField.getDeclaringClass(), inferFieldTypeName( f ) );
}
catch ( NotFoundException e ) {
return false;
}
}
private static String getMappedByFromAnnotation(CtField persistentField) {

View File

@@ -11,11 +11,12 @@
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.ListIterator;
import java.util.Map;
-import java.util.stream.Collectors;
import org.hibernate.HibernateException;
import org.hibernate.engine.spi.SessionImplementor;
@@ -68,6 +69,7 @@ public PersistentBag(SharedSessionContractImplementor session) {
public PersistentBag(SessionImplementor session) {
this( (SharedSessionContractImplementor) session );
}
/**
* Constructs a PersistentBag
*
@@ -95,7 +97,7 @@ public PersistentBag(SharedSessionContractImplementor session, Collection coll)
* @param coll The base elements.
*
* @deprecated {@link #PersistentBag(SharedSessionContractImplementor, Collection)}
* should be used instead.
*/
@Deprecated
public PersistentBag(SessionImplementor session, Collection coll) {
@@ -128,7 +130,7 @@ public Object readFrom(ResultSet rs, CollectionPersister persister, CollectionAl
throws HibernateException, SQLException {
// note that if we load this collection from a cartesian product
// the multiplicity would be broken ... so use an idbag instead
-final Object element = persister.readElement( rs, owner, descriptor.getSuffixedElementAliases(), getSession() ) ;
+final Object element = persister.readElement( rs, owner, descriptor.getSuffixedElementAliases(), getSession() );
if ( element != null ) {
bag.add( element );
}
@@ -159,7 +161,7 @@ public void beforeInitialize(CollectionPersister persister, int anticipatedSize)
}
@Override
-@SuppressWarnings( "unchecked" )
+@SuppressWarnings("unchecked")
public boolean equalsSnapshot(CollectionPersister persister) throws HibernateException {
final Type elementType = persister.getElementType();
final List<Object> sn = (List<Object>) getSnapshot();
@@ -199,7 +201,8 @@ public boolean equalsSnapshot(CollectionPersister persister) throws HibernateExc
instance,
instancesBag,
elementType,
-countOccurrences( instance, instancesSn, elementType ) ) ) {
+countOccurrences( instance, instancesSn, elementType )
) ) {
return false;
}
}
@@ -213,7 +216,28 @@ public boolean equalsSnapshot(CollectionPersister persister) throws HibernateExc
* @return Map of "equality" hashCode to List of objects
*/
private Map<Integer, List<Object>> groupByEqualityHash(List<Object> searchedBag, Type elementType) {
-return searchedBag.stream().collect( Collectors.groupingBy( elementType::getHashCode ) );
+if ( searchedBag.isEmpty() ) {
return Collections.emptyMap();
}
Map<Integer, List<Object>> map = new HashMap<>();
for ( Object o : searchedBag ) {
map.computeIfAbsent( nullableHashCode( o, elementType ), k -> new ArrayList<>() ).add( o );
}
return map;
}
/**
* @param o
* @param elementType
* @return the default elementType hashcode of the object o, or null if the object is null
*/
private Integer nullableHashCode(Object o, Type elementType) {
if ( o == null ) {
return null;
}
else {
return elementType.getHashCode( o );
}
}
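The HHH-13675 rewrite above replaces `Collectors.groupingBy`, which throws a NullPointerException when the classifier maps an element to null, with a hand-rolled loop whose null-tolerant key comes from `nullableHashCode`. The same null-safe grouping in isolation, using `Object.hashCode` as a stand-in for Hibernate's `Type.getHashCode`:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class GroupByNullableHash {
    // Groups elements by a nullable Integer key; Collectors.groupingBy rejects
    // null classifier results, so the map is populated manually, as in the patch.
    static Map<Integer, List<Object>> groupByHash(List<?> bag) {
        Map<Integer, List<Object>> map = new HashMap<>();
        for (Object o : bag) {
            Integer key = (o == null) ? null : o.hashCode(); // null elements share the null bucket
            map.computeIfAbsent(key, k -> new ArrayList<>()).add(o);
        }
        return map;
    }

    public static void main(String[] args) {
        Map<Integer, List<Object>> grouped = groupByHash(Arrays.asList("a", null, "a"));
        System.out.println(grouped.get(null));           // [null]
        System.out.println(grouped.get("a".hashCode())); // [a, a]
    }
}
```

HashMap permits a null key, which is what makes the `computeIfAbsent` call safe where the stream collector was not.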
@Override @Override
@@ -264,7 +288,7 @@ public Collection getOrphans(Serializable snapshot, String entityName) throws Hi
public Object disassemble(CollectionPersister persister) {
final int length = bag.size();
final Object[] result = new Object[length];
-for ( int i=0; i<length; i++ ) {
+for ( int i = 0; i < length; i++ ) {
result[i] = persister.getElementType().disassemble( bag.get( i ), getSession(), null );
}
return result;
@@ -306,13 +330,13 @@ public Iterator getDeletes(CollectionPersister persister, boolean indexIsFormula
final ArrayList deletes = new ArrayList();
final List sn = (List) getSnapshot();
final Iterator olditer = sn.iterator();
-int i=0;
+int i = 0;
while ( olditer.hasNext() ) {
final Object old = olditer.next();
final Iterator newiter = bag.iterator();
boolean found = false;
-if ( bag.size()>i && elementType.isSame( old, bag.get( i++ ) ) ) {
+if ( bag.size() > i && elementType.isSame( old, bag.get( i++ ) ) ) {
//a shortcut if its location didn't change!
found = true;
}
else {
@@ -368,7 +392,7 @@ public int size() {
@Override
public boolean isEmpty() {
-return readSize() ? getCachedSize()==0 : bag.isEmpty();
+return readSize() ? getCachedSize() == 0 : bag.isEmpty();
}
@Override
@@ -431,7 +455,7 @@ public boolean containsAll(Collection c) {
@Override
@SuppressWarnings("unchecked")
public boolean addAll(Collection values) {
-if ( values.size()==0 ) {
+if ( values.size() == 0 ) {
return false;
}
if ( !isOperationQueueEnabled() ) {
@@ -442,14 +466,14 @@ public boolean addAll(Collection values) {
for ( Object value : values ) {
queueOperation( new SimpleAdd( value ) );
}
-return values.size()>0;
+return values.size() > 0;
}
}
@Override
@SuppressWarnings("unchecked")
public boolean removeAll(Collection c) {
-if ( c.size()>0 ) {
+if ( c.size() > 0 ) {
initialize( true );
if ( bag.removeAll( c ) ) {
elementRemoved = true;
@@ -486,7 +510,7 @@ public void clear() {
}
else {
initialize( true );
-if ( ! bag.isEmpty() ) {
+if ( !bag.isEmpty() ) {
bag.clear();
dirty();
}
@@ -495,7 +519,7 @@ public void clear() {
@Override
public Object getIndex(Object entry, int i, CollectionPersister persister) {
-throw new UnsupportedOperationException("Bags don't have indexes");
+throw new UnsupportedOperationException( "Bags don't have indexes" );
}
@Override
@@ -608,7 +632,7 @@ public List subList(int start, int end) {
@Override
public boolean entryExists(Object entry, int i) {
-return entry!=null;
+return entry != null;
}
@Override
@@ -622,8 +646,9 @@ public String toString() {
* JVM instance comparison to do the equals.
* The semantic is broken not to have to initialize a
* collection for a simple equals() operation.
-* @see java.lang.Object#equals(java.lang.Object)
*
+* @see java.lang.Object#equals(java.lang.Object)
+* <p>
* {@inheritDoc}
*/
@Override
@@ -649,7 +674,7 @@ public Object getAddedInstance() {
@Override
public Object getOrphan() {
-throw new UnsupportedOperationException("queued clear cannot be used with orphan delete");
+throw new UnsupportedOperationException( "queued clear cannot be used with orphan delete" );
}
}

View File

@@ -532,7 +532,8 @@ public SQLExceptionConversionDelegate buildSQLExceptionConversionDelegate() {
@Override
public JDBCException convert(SQLException sqlException, String message, String sql) {
switch ( sqlException.getErrorCode() ) {
-case 1205: {
+case 1205:
case 3572: {
return new PessimisticLockException( message, sqlException, sql );
}
case 1207:

View File

@@ -13,6 +13,7 @@
import java.io.Serializable;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.HashMap;
import java.util.HashSet;
import java.util.IdentityHashMap;
@@ -91,31 +92,42 @@ public class StatefulPersistenceContext implements PersistenceContext {
private static final int INIT_COLL_SIZE = 8;
/*
Eagerly Initialized Fields
the following fields are used in all circumstances, and are not worth (or not suited) to being converted into lazy
*/
private SharedSessionContractImplementor session;
-// Loaded entity instances, by EntityKey
-private Map<EntityKey, Object> entitiesByKey;
-// Loaded entity instances, by EntityUniqueKey
-private Map<EntityUniqueKey, Object> entitiesByUniqueKey;
private EntityEntryContext entityEntryContext;
/*
Everything else below should be carefully initialized only on first need;
this optimisation is very effective as null checks are free, while allocation costs
are very often the dominating cost of an application using ORM.
This is not general advice, but it's worth the added maintenance burden in this case
as this is a very central component of our library.
*/
// Loaded entity instances, by EntityKey
private HashMap<EntityKey, Object> entitiesByKey;
// Loaded entity instances, by EntityUniqueKey
private HashMap<EntityUniqueKey, Object> entitiesByUniqueKey;
// Entity proxies, by EntityKey
-private ConcurrentMap<EntityKey, Object> proxiesByKey;
+private ConcurrentReferenceHashMap<EntityKey, Object> proxiesByKey;
// Snapshots of current database state for entities
// that have *not* been loaded
-private Map<EntityKey, Object> entitySnapshotsByKey;
+private HashMap<EntityKey, Object> entitySnapshotsByKey;
// Identity map of array holder ArrayHolder instances, by the array instance
-private Map<Object, PersistentCollection> arrayHolders;
+private IdentityHashMap<Object, PersistentCollection> arrayHolders;
// Identity map of CollectionEntry instances, by the collection wrapper
private IdentityMap<PersistentCollection, CollectionEntry> collectionEntries;
// Collection wrappers, by the CollectionKey
-private Map<CollectionKey, PersistentCollection> collectionsByKey;
+private HashMap<CollectionKey, PersistentCollection> collectionsByKey;
// Set of EntityKeys of deleted objects
private HashSet<EntityKey> nullifiableEntityKeys;
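The comment introduced by this hunk describes the allocation strategy behind all the field-type changes: most of these maps are now created only on first write, because a null check on read is effectively free while eager HashMap allocation dominates the cost of short-lived persistence contexts. A toy version of the pattern (the field and method names here are illustrative, not Hibernate's):

```java
import java.util.HashMap;
import java.util.Map;

public class LazyContext {
    private static final int INIT_COLL_SIZE = 8;

    // Allocated only when the first entry is stored, mirroring the
    // deferred-initialization comment in StatefulPersistenceContext.
    private Map<String, Object> entitiesByKey;

    public void addEntity(String key, Object entity) {
        if (entitiesByKey == null) {
            entitiesByKey = new HashMap<>(INIT_COLL_SIZE);
        }
        entitiesByKey.put(key, entity);
    }

    public Object getEntity(String key) {
        // reads tolerate the not-yet-allocated state with a cheap null check
        return entitiesByKey == null ? null : entitiesByKey.get(key);
    }

    public void clear() {
        // drop the map instead of clearing it, as the patched clear() does
        entitiesByKey = null;
    }
}
```

This matches the later hunks in the file, where `clear()` nulls the fields and `getDatabaseSnapshot`/`getCachedDatabaseSnapshot` guard every read with a null check.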
@@ -125,15 +137,15 @@ public class StatefulPersistenceContext implements PersistenceContext {
// A list of collection wrappers that were instantiating during result set
// processing, that we will need to initialize at the end of the query
-private List<PersistentCollection> nonlazyCollections;
+private ArrayList<PersistentCollection> nonlazyCollections;
// A container for collections we load up when the owning entity is not
// yet loaded ... for now, this is purely transient!
-private Map<CollectionKey,PersistentCollection> unownedCollections;
+private HashMap<CollectionKey,PersistentCollection> unownedCollections;
// Parent entities cache by their child for cascading
// May be empty or not contains all relation
-private Map<Object,Object> parentsByChild;
+private IdentityHashMap<Object,Object> parentsByChild;
private int cascading;
private int loadCounter;
@@ -146,7 +158,6 @@ public class StatefulPersistenceContext implements PersistenceContext {
private LoadContexts loadContexts;
private BatchFetchQueue batchFetchQueue;
/**
* Constructs a PersistentContext, bound to the given session.
*
@@ -154,12 +165,7 @@ public class StatefulPersistenceContext implements PersistenceContext {
*/
public StatefulPersistenceContext(SharedSessionContractImplementor session) {
this.session = session;
this.entityEntryContext = new EntityEntryContext( this );
-entitiesByKey = new HashMap<>( INIT_COLL_SIZE );
-entitySnapshotsByKey = new HashMap<>( INIT_COLL_SIZE );
-entityEntryContext = new EntityEntryContext( this );
-collectionsByKey = new HashMap<>( INIT_COLL_SIZE );
}
private ConcurrentMap<EntityKey, Object> getOrInitializeProxiesByKey() {
@ -241,12 +247,12 @@ public void clear() {
} }
arrayHolders = null; arrayHolders = null;
entitiesByKey.clear(); entitiesByKey = null;
entitiesByUniqueKey = null; entitiesByUniqueKey = null;
entityEntryContext.clear(); entityEntryContext.clear();
parentsByChild = null; parentsByChild = null;
entitySnapshotsByKey.clear(); entitySnapshotsByKey = null;
collectionsByKey.clear(); collectionsByKey = null;
nonlazyCollections = null; nonlazyCollections = null;
collectionEntries = null; collectionEntries = null;
unownedCollections = null; unownedCollections = null;
@@ -306,12 +312,15 @@ public void afterTransactionCompletion() {
 	@Override
 	public Object[] getDatabaseSnapshot(Object id, EntityPersister persister) throws HibernateException {
 		final EntityKey key = session.generateEntityKey( id, persister );
-		final Object cached = entitySnapshotsByKey.get( key );
+		final Object cached = entitySnapshotsByKey == null ? null : entitySnapshotsByKey.get( key );
 		if ( cached != null ) {
 			return cached == NO_ROW ? null : (Object[]) cached;
 		}
 		else {
 			final Object[] snapshot = persister.getDatabaseSnapshot( id, session );
+			if ( entitySnapshotsByKey == null ) {
+				entitySnapshotsByKey = new HashMap<>( INIT_COLL_SIZE );
+			}
 			entitySnapshotsByKey.put( key, snapshot == null ? NO_ROW : snapshot );
 			return snapshot;
 		}

@@ -370,7 +379,7 @@ private EntityPersister locateProperPersister(EntityPersister persister) {
 	@Override
 	public Object[] getCachedDatabaseSnapshot(EntityKey key) {
-		final Object snapshot = entitySnapshotsByKey.get( key );
+		final Object snapshot = entitySnapshotsByKey == null ? null : entitySnapshotsByKey.get( key );
 		if ( snapshot == NO_ROW ) {
 			throw new IllegalStateException(
 					"persistence context reported no row snapshot for "

@@ -382,43 +391,56 @@ public Object[] getCachedDatabaseSnapshot(EntityKey key) {
 	@Override
 	public void addEntity(EntityKey key, Object entity) {
+		if ( entitiesByKey == null ) {
+			entitiesByKey = new HashMap<>( INIT_COLL_SIZE );
+		}
 		entitiesByKey.put( key, entity );
-		if( batchFetchQueue != null ) {
-			getBatchFetchQueue().removeBatchLoadableEntityKey(key);
+		final BatchFetchQueue fetchQueue = this.batchFetchQueue;
+		if ( fetchQueue != null ) {
+			fetchQueue.removeBatchLoadableEntityKey( key );
 		}
 	}

 	@Override
 	public Object getEntity(EntityKey key) {
-		return entitiesByKey.get( key );
+		return entitiesByKey == null ? null : entitiesByKey.get( key );
 	}

 	@Override
 	public boolean containsEntity(EntityKey key) {
-		return entitiesByKey.containsKey( key );
+		return entitiesByKey == null ? false : entitiesByKey.containsKey( key );
 	}

 	@Override
 	public Object removeEntity(EntityKey key) {
-		final Object entity = entitiesByKey.remove( key );
-		if ( entitiesByUniqueKey != null ) {
-			final Iterator itr = entitiesByUniqueKey.values().iterator();
-			while ( itr.hasNext() ) {
-				if ( itr.next() == entity ) {
-					itr.remove();
-				}
-			}
-		}
+		final Object entity;
+		if ( entitiesByKey != null ) {
+			entity = entitiesByKey.remove( key );
+			if ( entitiesByUniqueKey != null ) {
+				final Iterator itr = entitiesByUniqueKey.values().iterator();
+				while ( itr.hasNext() ) {
+					if ( itr.next() == entity ) {
+						itr.remove();
+					}
+				}
+			}
+		}
+		else {
+			entity = null;
+		}
 		// Clear all parent cache
 		parentsByChild = null;
-		entitySnapshotsByKey.remove( key );
+		if ( entitySnapshotsByKey != null ) {
+			entitySnapshotsByKey.remove( key );
+		}
 		if ( nullifiableEntityKeys != null ) {
 			nullifiableEntityKeys.remove( key );
 		}
-		if( batchFetchQueue != null ) {
-			getBatchFetchQueue().removeBatchLoadableEntityKey( key );
-			getBatchFetchQueue().removeSubselect( key );
+		final BatchFetchQueue fetchQueue = this.batchFetchQueue;
+		if ( fetchQueue != null ) {
+			fetchQueue.removeBatchLoadableEntityKey( key );
+			fetchQueue.removeSubselect( key );
 		}
 		return entity;
 	}
@@ -752,6 +774,9 @@ public Object proxyFor(Object impl) throws HibernateException {
 	@Override
 	public void addEnhancedProxy(EntityKey key, PersistentAttributeInterceptable entity) {
+		if ( entitiesByKey == null ) {
+			entitiesByKey = new HashMap<>( INIT_COLL_SIZE );
+		}
 		entitiesByKey.put( key, entity );
 	}

@@ -885,7 +910,7 @@ public void addNewCollection(CollectionPersister persister, PersistentCollection
 	private void addCollection(PersistentCollection coll, CollectionEntry entry, Object key) {
 		getOrInitializeCollectionEntries().put( coll, entry );
 		final CollectionKey collectionKey = new CollectionKey( entry.getLoadedPersister(), key );
-		final PersistentCollection old = collectionsByKey.put( collectionKey, coll );
+		final PersistentCollection old = addCollectionByKey( collectionKey, coll );
 		if ( old != null ) {
 			if ( old == coll ) {
 				throw new AssertionFailure( "bug adding collection twice" );

@@ -942,7 +967,7 @@ public CollectionEntry addInitializedCollection(CollectionPersister persister, P
 	@Override
 	public PersistentCollection getCollection(CollectionKey collectionKey) {
-		return collectionsByKey.get( collectionKey );
+		return collectionsByKey == null ? null : collectionsByKey.get( collectionKey );
 	}

 	@Override

@@ -1037,9 +1062,10 @@ public void addProxy(EntityKey key, Object proxy) {
 	@Override
 	public Object removeProxy(EntityKey key) {
-		if ( batchFetchQueue != null ) {
-			batchFetchQueue.removeBatchLoadableEntityKey( key );
-			batchFetchQueue.removeSubselect( key );
+		final BatchFetchQueue fetchQueue = this.batchFetchQueue;
+		if ( fetchQueue != null ) {
+			fetchQueue.removeBatchLoadableEntityKey( key );
+			fetchQueue.removeSubselect( key );
 		}
 		return removeProxyByKey( key );
 	}

@@ -1052,9 +1078,25 @@ public HashSet getNullifiableEntityKeys() {
 		return nullifiableEntityKeys;
 	}

+	/**
+	 * @deprecated this will be removed: it provides too wide access, making it hard to optimise the internals
+	 * for specific access needs. Consider using #iterateEntities instead.
+	 * @return
+	 */
+	@Deprecated
 	@Override
 	public Map getEntitiesByKey() {
-		return entitiesByKey;
+		return entitiesByKey == null ? Collections.emptyMap() : entitiesByKey;
+	}
+
+	@Override
+	public Iterator managedEntitiesIterator() {
+		if ( entitiesByKey == null ) {
+			return Collections.emptyIterator();
+		}
+		else {
+			return entitiesByKey.values().iterator();
+		}
 	}

 	@Override
@@ -1094,7 +1136,12 @@ public void forEachCollectionEntry(BiConsumer<PersistentCollection, CollectionEn
 	@Override
 	public Map getCollectionsByKey() {
-		return collectionsByKey;
+		if ( collectionsByKey == null ) {
+			return Collections.emptyMap();
+		}
+		else {
+			return collectionsByKey;
+		}
 	}

 	@Override

@@ -1185,8 +1232,9 @@ public boolean isLoadFinished() {
 	@Override
 	public String toString() {
-		return "PersistenceContext[entityKeys=" + entitiesByKey.keySet()
-				+ ",collectionKeys=" + collectionsByKey.keySet() + "]";
+		final String entityKeySet = entitiesByKey == null ? "[]" : entitiesByKey.keySet().toString();
+		final String collectionsKeySet = collectionsByKey == null ? "[]" : collectionsByKey.keySet().toString();
+		return "PersistenceContext[entityKeys=" + entityKeySet + ", collectionKeys=" + collectionsKeySet + "]";
 	}

 	@Override

@@ -1483,7 +1531,7 @@ private void setEntityReadOnly(Object entity, boolean readOnly) {
 	@Override
 	public void replaceDelayedEntityIdentityInsertKeys(EntityKey oldKey, Object generatedId) {
-		final Object entity = entitiesByKey.remove( oldKey );
+		final Object entity = entitiesByKey == null ? null : entitiesByKey.remove( oldKey );
 		final EntityEntry oldEntry = entityEntryContext.removeEntityEntry( entity );
 		this.parentsByChild = null;
@@ -1516,13 +1564,18 @@ public void serialize(ObjectOutputStream oos) throws IOException {
 		oos.writeBoolean( defaultReadOnly );
 		oos.writeBoolean( hasNonReadOnlyEntities );

-		oos.writeInt( entitiesByKey.size() );
-		if ( LOG.isTraceEnabled() ) {
-			LOG.trace( "Starting serialization of [" + entitiesByKey.size() + "] entitiesByKey entries" );
-		}
-		for ( Map.Entry<EntityKey,Object> entry : entitiesByKey.entrySet() ) {
-			entry.getKey().serialize( oos );
-			oos.writeObject( entry.getValue() );
+		if ( entitiesByKey == null ) {
+			oos.writeInt( 0 );
+		}
+		else {
+			oos.writeInt( entitiesByKey.size() );
+			if ( LOG.isTraceEnabled() ) {
+				LOG.trace( "Starting serialization of [" + entitiesByKey.size() + "] entitiesByKey entries" );
+			}
+			for ( Map.Entry<EntityKey,Object> entry : entitiesByKey.entrySet() ) {
+				entry.getKey().serialize( oos );
+				oos.writeObject( entry.getValue() );
+			}
 		}

 		if ( entitiesByUniqueKey == null ) {

@@ -1553,24 +1606,34 @@ public void serialize(ObjectOutputStream oos) throws IOException {
 			}
 		}

-		oos.writeInt( entitySnapshotsByKey.size() );
-		if ( LOG.isTraceEnabled() ) {
-			LOG.trace( "Starting serialization of [" + entitySnapshotsByKey.size() + "] entitySnapshotsByKey entries" );
-		}
-		for ( Map.Entry<EntityKey,Object> entry : entitySnapshotsByKey.entrySet() ) {
-			entry.getKey().serialize( oos );
-			oos.writeObject( entry.getValue() );
+		if ( entitySnapshotsByKey == null ) {
+			oos.writeInt( 0 );
+		}
+		else {
+			oos.writeInt( entitySnapshotsByKey.size() );
+			if ( LOG.isTraceEnabled() ) {
+				LOG.trace( "Starting serialization of [" + entitySnapshotsByKey.size() + "] entitySnapshotsByKey entries" );
+			}
+			for ( Map.Entry<EntityKey,Object> entry : entitySnapshotsByKey.entrySet() ) {
+				entry.getKey().serialize( oos );
+				oos.writeObject( entry.getValue() );
+			}
 		}

 		entityEntryContext.serialize( oos );

-		oos.writeInt( collectionsByKey.size() );
-		if ( LOG.isTraceEnabled() ) {
-			LOG.trace( "Starting serialization of [" + collectionsByKey.size() + "] collectionsByKey entries" );
-		}
-		for ( Map.Entry<CollectionKey,PersistentCollection> entry : collectionsByKey.entrySet() ) {
-			entry.getKey().serialize( oos );
-			oos.writeObject( entry.getValue() );
+		if ( collectionsByKey == null ) {
+			oos.writeInt( 0 );
+		}
+		else {
+			oos.writeInt( collectionsByKey.size() );
+			if ( LOG.isTraceEnabled() ) {
+				LOG.trace( "Starting serialization of [" + collectionsByKey.size() + "] collectionsByKey entries" );
+			}
+			for ( Map.Entry<CollectionKey, PersistentCollection> entry : collectionsByKey.entrySet() ) {
+				entry.getKey().serialize( oos );
+				oos.writeObject( entry.getValue() );
+			}
 		}

 		if ( collectionEntries == null ) {

@@ -1831,6 +1894,32 @@ public CollectionEntry removeCollectionEntry(PersistentCollection collection) {
 		}
 	}

+	@Override
+	public void clearCollectionsByKey() {
+		if ( collectionsByKey != null ) {
+			//A valid alternative would be to set this to null, like we do on close.
+			//The difference being that in this case we expect the collection will be used again, so we bet that clear()
+			//might allow us to skip having to re-allocate the collection.
+			collectionsByKey.clear();
+		}
+	}
+
+	@Override
+	public PersistentCollection addCollectionByKey(CollectionKey collectionKey, PersistentCollection persistentCollection) {
+		if ( collectionsByKey == null ) {
+			collectionsByKey = new HashMap<>( INIT_COLL_SIZE );
+		}
+		final PersistentCollection old = collectionsByKey.put( collectionKey, persistentCollection );
+		return old;
+	}
+
+	@Override
+	public void removeCollectionByKey(CollectionKey collectionKey) {
+		if ( collectionsByKey != null ) {
+			collectionsByKey.remove( collectionKey );
+		}
+	}
+
 	private void cleanUpInsertedKeysAfterTransaction() {
 		if ( insertedKeysMap != null ) {
 			insertedKeysMap.clear();
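The changes above apply one pattern throughout StatefulPersistenceContext: each map stays null until its first write, every read is null-guarded, and clear() drops the reference instead of keeping an empty map alive. A standalone sketch of that pattern (class and method names are illustrative, not Hibernate's):

```java
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of the lazy-allocation idiom used in the patch: allocate
// on first write, tolerate null on reads, and release memory on clear().
public class LazyMapHolder<K, V> {
	private Map<K, V> entries; // stays null until the first put()

	public V put(K key, V value) {
		if ( entries == null ) {
			entries = new HashMap<>();
		}
		return entries.put( key, value );
	}

	public V get(K key) {
		return entries == null ? null : entries.get( key );
	}

	public boolean containsKey(K key) {
		return entries != null && entries.containsKey( key );
	}

	public Map<K, V> asMap() {
		// Mirror getEntitiesByKey(): hand callers an empty map, never null.
		return entries == null ? Collections.emptyMap() : entries;
	}

	public void clear() {
		entries = null; // drop the allocation, as clear() does in the patch
	}
}
```

The payoff is that a persistence context which never touches, say, entity snapshots never pays for the map at all.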
View File
@@ -159,8 +159,12 @@ protected void releaseStatements() {
 			clearBatch( statement );
 			resourceRegistry.release( statement );
 		}
-		jdbcCoordinator.afterStatementExecution();
+		// IMPL NOTE: If the statements are not cleared and JTA is being used, then
+		// jdbcCoordinator.afterStatementExecution() will abort the batch and a
+		// warning will be logged. To avoid the warning, clear statements first,
+		// before calling jdbcCoordinator.afterStatementExecution().
 		statements.clear();
+		jdbcCoordinator.afterStatementExecution();
 	}

 	protected void clearBatch(PreparedStatement statement) {
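The IMPL NOTE above is purely about ordering: a coordinator that still sees registered statements treats the batch as abandoned. A toy sketch of why the reorder silences the warning (illustrative classes, not Hibernate's):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of releaseStatements() ordering: the coordinator warns when it
// observes leftover statements, so clearing must happen before notifying it.
class BatchSketch {
	final List<String> statements = new ArrayList<>();
	final List<String> warnings = new ArrayList<>();

	void afterStatementExecution() {
		if ( !statements.isEmpty() ) {
			warnings.add( "forcing batch to be released" );
		}
	}

	// Old order: the coordinator runs while statements are still registered.
	void releaseWrongOrder() {
		afterStatementExecution();
		statements.clear();
	}

	// Patched order: clear first, then notify the coordinator.
	void releaseFixedOrder() {
		statements.clear();
		afterStatementExecution();
	}
}
```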
View File
@@ -124,4 +124,13 @@ protected JDBCException convertSqlException(String message, SQLException e) {
 	}

 	protected abstract Connection makeConnection(String url, Properties connectionProps);

+	/**
+	 * Exposed for testing purposes only.
+	 * @return
+	 */
+	public Properties getConnectionProperties() {
+		return new Properties( connectionProps );
+	}
+
 }
View File
@@ -417,6 +417,7 @@ else if ( CONDITIONAL_PROPERTIES.containsKey( key ) ) {
 		SPECIAL_PROPERTIES.add( AvailableSettings.ISOLATION );
 		SPECIAL_PROPERTIES.add( AvailableSettings.DRIVER );
 		SPECIAL_PROPERTIES.add( AvailableSettings.USER );
+		SPECIAL_PROPERTIES.add( AvailableSettings.CONNECTION_PROVIDER_DISABLES_AUTOCOMMIT );

 		ISOLATION_VALUE_MAP = new ConcurrentHashMap<String, Integer>();
 		ISOLATION_VALUE_MAP.put( "TRANSACTION_NONE", Connection.TRANSACTION_NONE );
View File
@@ -215,6 +215,16 @@ protected void finalize() throws Throwable {
 	}
 	//CHECKSTYLE:END_ALLOW_FINALIZER

+	/**
+	 * Exposed to facilitate testing only.
+	 * @return
+	 */
+	public Properties getConnectionProperties() {
+		BasicConnectionCreator connectionCreator = (BasicConnectionCreator) this.state.pool.connectionCreator;
+		return connectionCreator.getConnectionProperties();
+	}
+
 	public static class PooledConnections {

 		private final ConcurrentLinkedQueue<Connection> allConnections = new ConcurrentLinkedQueue<Connection>();
View File
@@ -9,6 +9,7 @@
 import java.io.Serializable;
 import java.util.Collection;
 import java.util.HashSet;
+import java.util.Iterator;
 import java.util.Map;
 import java.util.function.BiConsumer;
 import java.util.function.Supplier;

@@ -487,7 +488,10 @@ CollectionEntry addInitializedCollection(
 	/**
 	 * Get the mapping from key value to entity instance
+	 * @deprecated this will be removed: it provides too wide access, making it hard to optimise the internals
+	 * for specific access needs. Consider using #iterateEntities instead.
 	 */
+	@Deprecated
 	Map getEntitiesByKey();

 	/**

@@ -526,7 +530,12 @@ CollectionEntry addInitializedCollection(
 	/**
 	 * Get the mapping from collection key to collection instance
+	 * @deprecated this method should be removed; alternative methods are available that better express the intent, allowing
+	 * for better optimisations. Not aggressively removing this as it's an SPI, but also useful for testing and other
+	 * contexts which are not performance sensitive.
+	 * N.B. This might return an immutable map: do not use for mutations!
 	 */
+	@Deprecated
 	Map getCollectionsByKey();

 	/**

@@ -769,6 +778,31 @@ CollectionEntry addInitializedCollection(
 	 */
 	CollectionEntry removeCollectionEntry(PersistentCollection collection);

+	/**
+	 * Remove all state of the collections-by-key map.
+	 */
+	void clearCollectionsByKey();
+
+	/**
+	 * Adds a collection in the collections-by-key map.
+	 * @param collectionKey
+	 * @param persistentCollection
+	 * @return the previous collection, it the key was already mapped.
+	 */
+	PersistentCollection addCollectionByKey(CollectionKey collectionKey, PersistentCollection persistentCollection);
+
+	/**
+	 * Remove a collection-by-key mapping.
+	 * @param collectionKey the key to clear
+	 */
+	void removeCollectionByKey(CollectionKey collectionKey);
+
+	/**
+	 * A read-only iterator on all entities managed by this persistence context
+	 * @return
+	 */
+	Iterator managedEntitiesIterator();
+
 	/**
 	 * Provides centralized access to natural-id-related functionality.
 	 */
View File
@@ -37,7 +37,6 @@
 import org.hibernate.event.spi.FlushEvent;
 import org.hibernate.internal.CoreMessageLogger;
 import org.hibernate.internal.util.EntityPrinter;
-import org.hibernate.internal.util.collections.LazyIterator;
 import org.hibernate.persister.entity.EntityPersister;

 import org.jboss.logging.Logger;

@@ -77,7 +76,7 @@ protected void flushEverythingToExecutions(FlushEvent event) throws HibernateExc
 		EventSource session = event.getSession();

 		final PersistenceContext persistenceContext = session.getPersistenceContextInternal();
-		session.getInterceptor().preFlush( new LazyIterator( persistenceContext.getEntitiesByKey() ) );
+		session.getInterceptor().preFlush( persistenceContext.managedEntitiesIterator() );

 		prepareEntityFlushes( session, persistenceContext );
 		// we could move this inside if we wanted to

@@ -369,7 +368,7 @@ protected void postFlush(SessionImplementor session) throws HibernateException {
 		LOG.trace( "Post flush" );

 		final PersistenceContext persistenceContext = session.getPersistenceContextInternal();
-		persistenceContext.getCollectionsByKey().clear();
+		persistenceContext.clearCollectionsByKey();

 		// the database has changed now, so the subselect results need to be invalidated
 		// the batch fetching queues should also be cleared - especially the collection batch fetching one

@@ -390,13 +389,14 @@ protected void postFlush(SessionImplementor session) throws HibernateException {
 								collectionEntry.getLoadedPersister(),
 								collectionEntry.getLoadedKey()
 						);
-						persistenceContext.getCollectionsByKey().put( collectionKey, persistentCollection );
+						persistenceContext.addCollectionByKey( collectionKey, persistentCollection );
 					}
 				}, true
 		);
 	}

 	protected void postPostFlush(SessionImplementor session) {
-		session.getInterceptor().postFlush( new LazyIterator( session.getPersistenceContextInternal().getEntitiesByKey() ) );
+		session.getInterceptor().postFlush( session.getPersistenceContextInternal().managedEntitiesIterator() );
 	}
 }
View File
@@ -81,9 +81,7 @@ private void evictCollection(PersistentCollection collection) {
 		}
 		if ( ce.getLoadedPersister() != null && ce.getLoadedKey() != null ) {
 			//TODO: is this 100% correct?
-			persistenceContext.getCollectionsByKey().remove(
-					new CollectionKey( ce.getLoadedPersister(), ce.getLoadedKey() )
-			);
+			persistenceContext.removeCollectionByKey( new CollectionKey( ce.getLoadedPersister(), ce.getLoadedKey() ) );
 		}
 	}
View File
@@ -2850,7 +2850,7 @@ public <T> T find(Class<T> entityClass, Object primaryKey, LockModeType lockMode
 			throw getExceptionConverter().convert( new IllegalArgumentException( e.getMessage(), e ) );
 		}
 		catch ( JDBCException e ) {
-			if ( accessTransaction().getRollbackOnly() ) {
+			if ( accessTransaction().isActive() && accessTransaction().getRollbackOnly() ) {
 				// assume this is the similar to the WildFly / IronJacamar "feature" described under HHH-12472
 				return null;
 			}
View File
@@ -55,7 +55,6 @@ public String getInternalFetchProfile() {
 		@Override
 		public void setInternalFetchProfile(String internalFetchProfile) {
 		}
-
 	};

 	private final PersistenceContext temporaryPersistenceContext = new StatefulPersistenceContext( this );

@@ -240,6 +239,9 @@ public void refresh(String entityName, Object entity, LockMode lockMode) {
 			this.getLoadQueryInfluencers().setInternalFetchProfile( previousFetchProfile );
 		}
 		UnresolvableObjectException.throwIfNull( result, id, persister.getEntityName() );
+		if ( temporaryPersistenceContext.isLoadFinished() ) {
+			temporaryPersistenceContext.clear();
+		}
 	}

 	@Override

@@ -274,11 +276,12 @@ public Object internalLoad(
 			boolean nullable) throws HibernateException {
 		checkOpen();

-		EntityPersister persister = getFactory().getMetamodel().entityPersister( entityName );
+		final EntityPersister persister = getFactory().getMetamodel().entityPersister( entityName );
 		final EntityKey entityKey = generateEntityKey( id, persister );

 		// first, try to load it from the temp PC associated to this SS
-		Object loaded = temporaryPersistenceContext.getEntity( entityKey );
+		final PersistenceContext persistenceContext = getPersistenceContext();
+		Object loaded = persistenceContext.getEntity( entityKey );
 		if ( loaded != null ) {
 			// we found it in the temp PC. Should indicate we are in the midst of processing a result set
 			// containing eager fetches via join fetch

@@ -298,7 +301,6 @@ public Object internalLoad(
 			// if the entity defines a HibernateProxy factory, see if there is an
 			// existing proxy associated with the PC - and if so, use it
 			if ( persister.getRepresentationStrategy().getProxyFactory() != null ) {
-				final PersistenceContext persistenceContext = getPersistenceContext();
 				final Object proxy = persistenceContext.getProxy( entityKey );

 				if ( proxy != null ) {

@@ -329,7 +331,6 @@ else if ( !entityMetamodel.hasSubclasses() ) {
 			}
 			else {
 				if ( persister.hasProxy() ) {
-					final PersistenceContext persistenceContext = getPersistenceContext();
 					final Object existingProxy = persistenceContext.getProxy( entityKey );
 					if ( existingProxy != null ) {
 						return persistenceContext.narrowProxy( existingProxy, persister, entityKey, null );

@@ -342,7 +343,16 @@ else if ( !entityMetamodel.hasSubclasses() ) {
 		}

 		// otherwise immediately materialize it
-		return get( entityName, id );
+
+		// IMPLEMENTATION NOTE: increment/decrement the load count before/after getting the value
+		// to ensure that #get does not clear the PersistenceContext.
+		persistenceContext.beforeLoad();
+		try {
+			return get( entityName, id );
+		}
+		finally {
+			persistenceContext.afterLoad();
+		}
 	}

 	private Object createProxy(EntityKey entityKey) {
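The beforeLoad()/afterLoad() bracket above is a load counter: clearing the temporary persistence context is suppressed while any load is in flight, so #get cannot wipe state that internalLoad is still using. A toy version of the counter guard (illustrative names, not Hibernate's):

```java
// Sketch of the load-counter guard: clear only proceeds when no load is
// in flight, which is what beforeLoad()/afterLoad() bracket around get().
class LoadGuardSketch {
	private int loadCounter;
	private boolean cleared;

	void beforeLoad() { loadCounter++; }
	void afterLoad() { loadCounter--; }

	boolean isLoadFinished() { return loadCounter == 0; }

	void clearIfFinished() {
		if ( isLoadFinished() ) {
			cleared = true; // stands in for actually clearing the context
		}
	}

	boolean wasCleared() { return cleared; }
}
```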
View File
@@ -1,40 +0,0 @@
-/*
- * Hibernate, Relational Persistence for Idiomatic Java
- *
- * License: GNU Lesser General Public License (LGPL), version 2.1 or later.
- * See the lgpl.txt file in the root directory or <http://www.gnu.org/licenses/lgpl-2.1.html>.
- */
-package org.hibernate.internal.util.collections;
-
-import java.util.Iterator;
-import java.util.Map;
-
-public final class LazyIterator implements Iterator {
-	private final Map map;
-	private Iterator iterator;
-
-	private Iterator getIterator() {
-		if (iterator==null) {
-			iterator = map.values().iterator();
-		}
-		return iterator;
-	}
-
-	public LazyIterator(Map map) {
-		this.map = map;
-	}
-
-	public boolean hasNext() {
-		return getIterator().hasNext();
-	}
-
-	public Object next() {
-		return getIterator().next();
-	}
-
-	public void remove() {
-		throw new UnsupportedOperationException();
-	}
-}
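LazyIterator existed to defer map.values().iterator() until first use; with managedEntitiesIterator() now returning Collections.emptyIterator() when the map is null, the wrapper has no remaining callers and the file is deleted. A minimal sketch of the replacement idea (illustrative, not Hibernate's code):

```java
import java.util.Collections;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: rather than wrapping a possibly-empty map in a lazy iterator,
// hand out a shared empty iterator when there is nothing to iterate.
class ManagedEntities {
	private Map<String, Object> entitiesByKey; // null until first entity

	void add(String key, Object entity) {
		if ( entitiesByKey == null ) {
			entitiesByKey = new LinkedHashMap<>();
		}
		entitiesByKey.put( key, entity );
	}

	Iterator<Object> managedEntitiesIterator() {
		return entitiesByKey == null
				? Collections.emptyIterator()
				: entitiesByKey.values().iterator();
	}
}
```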
View File
@@ -86,6 +86,15 @@ private Statement jdbcStatement() {
 	@Override
 	public void release() {
+		if ( jdbcStatement != null ) {
+			try {
+				jdbcStatement.close();
+				jdbcStatement = null;
+			}
+			catch (SQLException e) {
+				throw ddlTransactionIsolator.getJdbcContext().getSqlExceptionHelper().convert( e, "Unable to close JDBC Statement after DDL execution" );
+			}
+		}
 		if ( releaseAfterUse ) {
 			ddlTransactionIsolator.release();
 		}
View File
@@ -7,6 +7,7 @@
 package org.hibernate.jpa.test.connection;

 import java.util.Map;
+import java.util.Properties;
 import javax.persistence.Entity;
 import javax.persistence.EntityManager;
 import javax.persistence.Id;

@@ -15,10 +16,13 @@
 import javax.persistence.criteria.CriteriaQuery;

 import org.hibernate.cfg.AvailableSettings;
+import org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl;
+import org.hibernate.engine.jdbc.connections.spi.ConnectionProvider;
 import org.hibernate.exception.SQLGrammarException;
 import org.hibernate.jpa.test.BaseEntityManagerFunctionalTestCase;

 import org.hibernate.testing.TestForIssue;
+import org.junit.Assert;
 import org.junit.Test;

 /**

@@ -46,6 +50,19 @@ protected void addConfigOptions(Map options) {
 				AvailableSettings.POOL_SIZE,
 				Integer.valueOf( CONNECTION_POOL_SIZE )
 		);
+		options.put( "hibernate.connection.customProperty", "x" );
+		options.put( AvailableSettings.CONNECTION_PROVIDER_DISABLES_AUTOCOMMIT, "true" );
+	}
+
+	@Test
+	@TestForIssue(jiraKey = "HHH-13700")
+	public void testConnectionPoolPropertyFiltering() {
+		ConnectionProvider cp = serviceRegistry().getService( ConnectionProvider.class );
+		DriverManagerConnectionProviderImpl dmcp = (DriverManagerConnectionProviderImpl) cp;
+		Properties connectionProperties = dmcp.getConnectionProperties();
+		Assert.assertEquals( "x", connectionProperties.getProperty( "customProperty" ) );
+		Assert.assertNull( connectionProperties.getProperty( "pool_size" ) );
+		Assert.assertNull( connectionProperties.getProperty( "provider_disables_autocommit" ) );
 	}

 	@Test
View File
@ -18,12 +18,16 @@
import javax.persistence.spi.PersistenceUnitInfo; import javax.persistence.spi.PersistenceUnitInfo;
import javax.sql.DataSource; import javax.sql.DataSource;
import org.hibernate.boot.spi.MetadataImplementor;
import org.hibernate.boot.spi.SessionFactoryOptions;
import org.hibernate.cache.spi.access.AccessType; import org.hibernate.cache.spi.access.AccessType;
import org.hibernate.cfg.AvailableSettings; import org.hibernate.cfg.AvailableSettings;
import org.hibernate.dialect.Dialect; import org.hibernate.dialect.Dialect;
import org.hibernate.engine.jdbc.connections.internal.DatasourceConnectionProviderImpl; import org.hibernate.engine.jdbc.connections.internal.DatasourceConnectionProviderImpl;
import org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl; import org.hibernate.engine.jdbc.connections.internal.DriverManagerConnectionProviderImpl;
import org.hibernate.engine.jdbc.connections.spi.ConnectionProvider; import org.hibernate.engine.jdbc.connections.spi.ConnectionProvider;
import org.hibernate.engine.jdbc.connections.spi.JdbcConnectionAccess;
import org.hibernate.engine.jdbc.spi.JdbcServices;
import org.hibernate.engine.spi.SessionFactoryImplementor; import org.hibernate.engine.spi.SessionFactoryImplementor;
import org.hibernate.jpa.HibernatePersistenceProvider; import org.hibernate.jpa.HibernatePersistenceProvider;
import org.hibernate.persister.entity.EntityPersister; import org.hibernate.persister.entity.EntityPersister;
@ -88,6 +92,8 @@ public void testPassingIntegrationJtaDataSourceOverrideForJpaJdbcSettings() {
final DataSource integrationDataSource = new DataSourceStub( "integrationDataSource" ); final DataSource integrationDataSource = new DataSourceStub( "integrationDataSource" );
final HibernatePersistenceProvider provider = new HibernatePersistenceProvider(); final HibernatePersistenceProvider provider = new HibernatePersistenceProvider();
// todo (6.0) : fix for Oracle see HHH-13432
// puInfo.getProperties().setProperty( AvailableSettings.HQL_BULK_ID_STRATEGY, MultiTableBulkIdStrategyStub.class.getName() );
final EntityManagerFactory emf = provider.createContainerEntityManagerFactory( final EntityManagerFactory emf = provider.createContainerEntityManagerFactory(
puInfo, puInfo,
@@ -273,6 +279,8 @@ public DataSource getJtaDataSource() {
final Map integrationOverrides = new HashMap();
//noinspection unchecked
integrationOverrides.put( AvailableSettings.JPA_JTA_DATASOURCE, integrationDataSource );
// todo (6.0) : fix for Oracle see HHH-13432
// integrationOverrides.put( AvailableSettings.HQL_BULK_ID_STRATEGY, new MultiTableBulkIdStrategyStub() );
final EntityManagerFactory emf = provider.createContainerEntityManagerFactory(
new PersistenceUnitInfoAdapter(),
@@ -318,6 +326,8 @@ public DataSource getNonJtaDataSource() {
final DataSource override = new DataSourceStub( "integrationDataSource" );
final Map<String,Object> integrationSettings = new HashMap<>();
integrationSettings.put( AvailableSettings.JPA_NON_JTA_DATASOURCE, override );
// todo (6.0) : fix for Oracle see HHH-13432
// integrationSettings.put( AvailableSettings.HQL_BULK_ID_STRATEGY, new MultiTableBulkIdStrategyStub() );
final PersistenceProvider provider = new HibernatePersistenceProvider();
@@ -502,4 +512,34 @@ public void setName(String name) {
this.name = name;
}
}
// public static class MultiTableBulkIdStrategyStub implements MultiTableBulkIdStrategy {
//
// @Override
// public void prepare(
// JdbcServices jdbcServices,
// JdbcConnectionAccess connectionAccess,
// MetadataImplementor metadata,
// SessionFactoryOptions sessionFactoryOptions) {
//
// }
//
// @Override
// public void release(
// JdbcServices jdbcServices, JdbcConnectionAccess connectionAccess) {
//
// }
//
// @Override
// public UpdateHandler buildUpdateHandler(
// SessionFactoryImplementor factory, HqlSqlWalker walker) {
// return null;
// }
//
// @Override
// public DeleteHandler buildDeleteHandler(
// SessionFactoryImplementor factory, HqlSqlWalker walker) {
// return null;
// }
// }
}


@@ -73,6 +73,10 @@ private static class Customer {
String name;
// HHH-13446 - Type not validated in bi-directional association mapping
@OneToMany(cascade = CascadeType.ALL, mappedBy = "custId", fetch = FetchType.EAGER)
List<CustomerInventory> inventoryIdList = new ArrayList<>();
@OneToMany( mappedBy = "customer", cascade = CascadeType.ALL, fetch = FetchType.EAGER )
List<CustomerInventory> customerInventories = new ArrayList<>();


@@ -0,0 +1,323 @@
/*
* Hibernate, Relational Persistence for Idiomatic Java
*
* License: GNU Lesser General Public License (LGPL), version 2.1 or later
* See the lgpl.txt file in the root directory or http://www.gnu.org/licenses/lgpl-2.1.html
*/
package org.hibernate.test.bytecode.enhancement.lazy;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.Set;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.Id;
import javax.persistence.Inheritance;
import javax.persistence.InheritanceType;
import javax.persistence.JoinColumn;
import javax.persistence.ManyToOne;
import javax.persistence.OneToMany;
import javax.persistence.Table;
import org.hibernate.Hibernate;
import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.Session;
import org.hibernate.StatelessSession;
import org.hibernate.annotations.LazyToOne;
import org.hibernate.annotations.LazyToOneOption;
import org.hibernate.boot.MetadataSources;
import org.hibernate.boot.SessionFactoryBuilder;
import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
import org.hibernate.cfg.AvailableSettings;
import org.hibernate.dialect.DB2Dialect;
import org.hibernate.query.Query;
import org.hibernate.stat.spi.StatisticsImplementor;
import org.hibernate.testing.bytecode.enhancement.BytecodeEnhancerRunner;
import org.hibernate.testing.junit4.BaseNonConfigCoreFunctionalTestCase;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import static org.hamcrest.CoreMatchers.is;
import static org.hamcrest.MatcherAssert.assertThat;
/**
* @author Andrea Boriero
*/
@RunWith(BytecodeEnhancerRunner.class)
public class QueryScrollingWithInheritanceEagerManyToOneTest extends BaseNonConfigCoreFunctionalTestCase {
@Override
protected void configureStandardServiceRegistryBuilder(StandardServiceRegistryBuilder ssrb) {
super.configureStandardServiceRegistryBuilder( ssrb );
ssrb.applySetting( AvailableSettings.ALLOW_ENHANCEMENT_AS_PROXY, "false" );
}
@Override
protected void configureSessionFactoryBuilder(SessionFactoryBuilder sfb) {
super.configureSessionFactoryBuilder( sfb );
sfb.applyStatisticsSupport( true );
sfb.applySecondLevelCacheSupport( false );
sfb.applyQueryCacheSupport( false );
}
@Override
protected void applyMetadataSources(MetadataSources sources) {
super.applyMetadataSources( sources );
sources.addAnnotatedClass( EmployeeParent.class );
sources.addAnnotatedClass( Employee.class );
sources.addAnnotatedClass( OtherEntity.class );
}
@Test
public void testScrollableWithStatelessSession() {
final StatisticsImplementor stats = sessionFactory().getStatistics();
stats.clear();
ScrollableResults scrollableResults = null;
final StatelessSession statelessSession = sessionFactory().openStatelessSession();
try {
statelessSession.beginTransaction();
Query<Employee> query = statelessSession.createQuery(
"select distinct e from Employee e left join fetch e.otherEntities order by e.dept",
Employee.class
);
if ( getDialect() instanceof DB2Dialect ) {
/*
In order to check whether the ResultSet is empty, FetchingScrollableResultsImpl#next()
calls ResultSet#isBeforeFirst(). Support for ResultSet#isBeforeFirst() is optional for
ResultSets with a result set type of TYPE_FORWARD_ONLY, and DB2 does not support it.
*/
scrollableResults = query.scroll( ScrollMode.SCROLL_INSENSITIVE );
}
else {
scrollableResults = query.scroll( ScrollMode.FORWARD_ONLY );
}
while ( scrollableResults.next() ) {
final Employee employee = (Employee) scrollableResults.get();
assertThat( Hibernate.isPropertyInitialized( employee, "otherEntities" ), is( true ) );
assertThat( Hibernate.isInitialized( employee.getOtherEntities() ), is( true ) );
if ( "ENG1".equals( employee.getDept() ) ) {
assertThat( employee.getOtherEntities().size(), is( 2 ) );
for ( OtherEntity otherEntity : employee.getOtherEntities() ) {
if ( "test1".equals( otherEntity.id ) ) {
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employee" ), is( false ) );
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employeeParent" ), is( true ) );
assertThat( otherEntity.employeeParent, is( employee ) );
}
else {
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employee" ), is( false ) );
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employeeParent" ), is( true ) );
assertThat( Hibernate.isInitialized( otherEntity.employeeParent ), is( true ) );
}
}
}
else {
assertThat( employee.getOtherEntities().size(), is( 0 ) );
}
}
statelessSession.getTransaction().commit();
assertThat( stats.getPrepareStatementCount(), is( 2L ) );
}
finally {
if ( scrollableResults != null ) {
scrollableResults.close();
}
if ( statelessSession.getTransaction().isActive() ) {
statelessSession.getTransaction().rollback();
}
statelessSession.close();
}
}
@Test
public void testScrollableWithSession() {
final StatisticsImplementor stats = sessionFactory().getStatistics();
stats.clear();
ScrollableResults scrollableResults = null;
final Session session = sessionFactory().openSession();
try {
session.beginTransaction();
Query<Employee> query = session.createQuery(
"select distinct e from Employee e left join fetch e.otherEntities order by e.dept",
Employee.class
);
if ( getDialect() instanceof DB2Dialect ) {
/*
In order to check whether the ResultSet is empty, FetchingScrollableResultsImpl#next()
calls ResultSet#isBeforeFirst(). Support for ResultSet#isBeforeFirst() is optional for
ResultSets with a result set type of TYPE_FORWARD_ONLY, and DB2 does not support it.
*/
scrollableResults = query.scroll( ScrollMode.SCROLL_INSENSITIVE );
}
else {
scrollableResults = query.scroll( ScrollMode.FORWARD_ONLY );
}
while ( scrollableResults.next() ) {
final Employee employee = (Employee) scrollableResults.get();
assertThat( Hibernate.isPropertyInitialized( employee, "otherEntities" ), is( true ) );
assertThat( Hibernate.isInitialized( employee.getOtherEntities() ), is( true ) );
if ( "ENG1".equals( employee.getDept() ) ) {
assertThat( employee.getOtherEntities().size(), is( 2 ) );
for ( OtherEntity otherEntity : employee.getOtherEntities() ) {
if ( "test1".equals( otherEntity.id ) ) {
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employee" ), is( false ) );
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employeeParent" ), is( true ) );
assertThat( otherEntity.employeeParent, is( employee ) );
}
else {
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employee" ), is( false ) );
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employeeParent" ), is( true ) );
assertThat( Hibernate.isInitialized( otherEntity.employeeParent ), is( true ) );
}
}
}
else {
assertThat( employee.getOtherEntities().size(), is( 0 ) );
}
}
session.getTransaction().commit();
assertThat( stats.getPrepareStatementCount(), is( 2L ) );
}
finally {
if ( scrollableResults != null ) {
scrollableResults.close();
}
if ( session.getTransaction().isActive() ) {
session.getTransaction().rollback();
}
session.close();
}
}
@Before
public void prepareTestData() {
inTransaction(
session -> {
Employee e1 = new Employee( "ENG1" );
Employee e2 = new Employee( "ENG2" );
OtherEntity other1 = new OtherEntity( "test1" );
OtherEntity other2 = new OtherEntity( "test2" );
e1.getOtherEntities().add( other1 );
e1.getOtherEntities().add( other2 );
e1.getParentOtherEntities().add( other1 );
e1.getParentOtherEntities().add( other2 );
other1.employee = e1;
other2.employee = e1;
other1.employeeParent = e1;
other2.employeeParent = e2;
session.persist( other1 );
session.persist( other2 );
session.persist( e1 );
session.persist( e2 );
}
);
}
@After
public void cleanUpTestData() {
inTransaction(
session -> {
session.createQuery( "delete from OtherEntity" ).executeUpdate();
session.createQuery( "delete from Employee" ).executeUpdate();
session.createQuery( "delete from EmployeeParent" ).executeUpdate();
}
);
}
@Entity(name = "EmployeeParent")
@Table(name = "EmployeeParent")
@Inheritance(strategy = InheritanceType.TABLE_PER_CLASS)
public static abstract class EmployeeParent {
@Id
private String dept;
@OneToMany(targetEntity = OtherEntity.class, mappedBy = "employeeParent", fetch = FetchType.LAZY)
protected Set<OtherEntity> parentOtherEntities = new HashSet<>();
public Set<OtherEntity> getParentOtherEntities() {
if ( parentOtherEntities == null ) {
parentOtherEntities = new LinkedHashSet<>();
}
return parentOtherEntities;
}
public void setOtherEntities(Set<OtherEntity> pParentOtherEntities) {
parentOtherEntities = pParentOtherEntities;
}
public String getDept() {
return dept;
}
protected void setDept(String dept) {
this.dept = dept;
}
}
@Entity(name = "Employee")
@Table(name = "Employee")
public static class Employee extends EmployeeParent {
@OneToMany(targetEntity = OtherEntity.class, mappedBy = "employee", fetch = FetchType.LAZY)
protected Set<OtherEntity> otherEntities = new HashSet<>();
public Employee(String dept) {
this();
setDept( dept );
}
protected Employee() {
// this form used by Hibernate
}
public Set<OtherEntity> getOtherEntities() {
if ( otherEntities == null ) {
otherEntities = new LinkedHashSet<>();
}
return otherEntities;
}
public void setOtherEntities(Set<OtherEntity> pOtherEntities) {
otherEntities = pOtherEntities;
}
}
@Entity(name = "OtherEntity")
@Table(name = "OtherEntity")
public static class OtherEntity {
@Id
private String id;
@ManyToOne(fetch = FetchType.LAZY)
@LazyToOne(LazyToOneOption.NO_PROXY)
@JoinColumn(name = "Employee_Id")
protected Employee employee = null;
@ManyToOne(fetch = FetchType.EAGER)
//@LazyToOne(LazyToOneOption.NO_PROXY)
@JoinColumn(name = "EmployeeParent_Id")
protected EmployeeParent employeeParent = null;
protected OtherEntity() {
// this form used by Hibernate
}
public OtherEntity(String id) {
this.id = id;
}
public String getId() {
return id;
}
}
}
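The dialect check in both tests above selects a scroll mode the driver can actually honor. The following standalone sketch models that selection; the `ScrollMode` enum and `choose` helper here are stand-ins for Hibernate's real types, named for illustration only.

```java
// Standalone sketch of the scroll-mode selection used in the tests above.
// Hibernate's Dialect and ScrollMode types are replaced by minimal stand-ins.
class ScrollModeChoice {
    enum ScrollMode { FORWARD_ONLY, SCROLL_INSENSITIVE }

    // DB2's TYPE_FORWARD_ONLY ResultSets do not support isBeforeFirst(),
    // which the forward-only scrolling path relies on to detect an empty
    // result, so a scroll-insensitive cursor is requested for that dialect.
    static ScrollMode choose(String dialectName) {
        if (dialectName.startsWith("DB2")) {
            return ScrollMode.SCROLL_INSENSITIVE;
        }
        return ScrollMode.FORWARD_ONLY;
    }

    public static void main(String[] args) {
        System.out.println(choose("DB2Dialect"));
        System.out.println(choose("H2Dialect"));
    }
}
```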


@@ -0,0 +1,326 @@
/*
* Hibernate, Relational Persistence for Idiomatic Java
*
* License: GNU Lesser General Public License (LGPL), version 2.1 or later
* See the lgpl.txt file in the root directory or http://www.gnu.org/licenses/lgpl-2.1.html
*/
package org.hibernate.test.bytecode.enhancement.lazy.proxy;
import java.util.HashSet;
import java.util.LinkedHashSet;
import java.util.Set;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.Id;
import javax.persistence.Inheritance;
import javax.persistence.InheritanceType;
import javax.persistence.JoinColumn;
import javax.persistence.ManyToOne;
import javax.persistence.OneToMany;
import javax.persistence.Table;
import org.hibernate.Hibernate;
import org.hibernate.ScrollMode;
import org.hibernate.ScrollableResults;
import org.hibernate.Session;
import org.hibernate.StatelessSession;
import org.hibernate.annotations.LazyToOne;
import org.hibernate.annotations.LazyToOneOption;
import org.hibernate.boot.MetadataSources;
import org.hibernate.boot.SessionFactoryBuilder;
import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
import org.hibernate.cfg.AvailableSettings;
import org.hibernate.dialect.DB2Dialect;
import org.hibernate.query.Query;
import org.hibernate.stat.spi.StatisticsImplementor;
import org.hibernate.testing.bytecode.enhancement.BytecodeEnhancerRunner;
import org.hibernate.testing.junit4.BaseNonConfigCoreFunctionalTestCase;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import static org.hamcrest.CoreMatchers.is;
import static org.hamcrest.MatcherAssert.assertThat;
/**
* @author Andrea Boriero
*/
@RunWith(BytecodeEnhancerRunner.class)
public class QueryScrollingWithInheritanceProxyEagerManyToOneTest extends BaseNonConfigCoreFunctionalTestCase {
@Override
protected void configureStandardServiceRegistryBuilder(StandardServiceRegistryBuilder ssrb) {
super.configureStandardServiceRegistryBuilder( ssrb );
ssrb.applySetting( AvailableSettings.ALLOW_ENHANCEMENT_AS_PROXY, "true" );
}
@Override
protected void configureSessionFactoryBuilder(SessionFactoryBuilder sfb) {
super.configureSessionFactoryBuilder( sfb );
sfb.applyStatisticsSupport( true );
sfb.applySecondLevelCacheSupport( false );
sfb.applyQueryCacheSupport( false );
}
@Override
protected void applyMetadataSources(MetadataSources sources) {
super.applyMetadataSources( sources );
sources.addAnnotatedClass( EmployeeParent.class );
sources.addAnnotatedClass( Employee.class );
sources.addAnnotatedClass( OtherEntity.class );
}
@Test
public void testScrollableWithStatelessSession() {
final StatisticsImplementor stats = sessionFactory().getStatistics();
stats.clear();
ScrollableResults scrollableResults = null;
final StatelessSession statelessSession = sessionFactory().openStatelessSession();
try {
statelessSession.beginTransaction();
Query<Employee> query = statelessSession.createQuery(
"select distinct e from Employee e left join fetch e.otherEntities order by e.dept",
Employee.class
);
if ( getDialect() instanceof DB2Dialect ) {
/*
In order to check whether the ResultSet is empty, FetchingScrollableResultsImpl#next()
calls ResultSet#isBeforeFirst(). Support for ResultSet#isBeforeFirst() is optional for
ResultSets with a result set type of TYPE_FORWARD_ONLY, and DB2 does not support it.
*/
scrollableResults = query.scroll( ScrollMode.SCROLL_INSENSITIVE );
}
else {
scrollableResults = query.scroll( ScrollMode.FORWARD_ONLY );
}
while ( scrollableResults.next() ) {
final Employee employee = (Employee) scrollableResults.get();
assertThat( Hibernate.isPropertyInitialized( employee, "otherEntities" ), is( true ) );
assertThat( Hibernate.isInitialized( employee.getOtherEntities() ), is( true ) );
if ( "ENG1".equals( employee.getDept() ) ) {
assertThat( employee.getOtherEntities().size(), is( 2 ) );
for ( OtherEntity otherEntity : employee.getOtherEntities() ) {
if ( "test1".equals( otherEntity.id ) ) {
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employee" ), is( true ) );
assertThat( otherEntity.employee, is( employee ) );
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employeeParent" ), is( true ) );
assertThat( otherEntity.employeeParent, is( employee ) );
}
else {
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employee" ), is( true ) );
assertThat( otherEntity.employee, is( employee ) );
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employeeParent" ), is( true ) );
assertThat( Hibernate.isInitialized( otherEntity.employeeParent ), is( true ) );
}
}
}
else {
assertThat( employee.getOtherEntities().size(), is( 0 ) );
}
}
statelessSession.getTransaction().commit();
assertThat( stats.getPrepareStatementCount(), is( 2L ) );
}
finally {
if ( scrollableResults != null ) {
scrollableResults.close();
}
if ( statelessSession.getTransaction().isActive() ) {
statelessSession.getTransaction().rollback();
}
statelessSession.close();
}
}
@Test
public void testScrollableWithSession() {
final StatisticsImplementor stats = sessionFactory().getStatistics();
stats.clear();
ScrollableResults scrollableResults = null;
final Session session = sessionFactory().openSession();
try {
session.beginTransaction();
Query<Employee> query = session.createQuery(
"select distinct e from Employee e left join fetch e.otherEntities order by e.dept",
Employee.class
);
if ( getDialect() instanceof DB2Dialect ) {
/*
In order to check whether the ResultSet is empty, FetchingScrollableResultsImpl#next()
calls ResultSet#isBeforeFirst(). Support for ResultSet#isBeforeFirst() is optional for
ResultSets with a result set type of TYPE_FORWARD_ONLY, and DB2 does not support it.
*/
scrollableResults = query.scroll( ScrollMode.SCROLL_INSENSITIVE );
}
else {
scrollableResults = query.scroll( ScrollMode.FORWARD_ONLY );
}
while ( scrollableResults.next() ) {
final Employee employee = (Employee) scrollableResults.get();
assertThat( Hibernate.isPropertyInitialized( employee, "otherEntities" ), is( true ) );
assertThat( Hibernate.isInitialized( employee.getOtherEntities() ), is( true ) );
if ( "ENG1".equals( employee.getDept() ) ) {
assertThat( employee.getOtherEntities().size(), is( 2 ) );
for ( OtherEntity otherEntity : employee.getOtherEntities() ) {
if ( "test1".equals( otherEntity.id ) ) {
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employee" ), is( true ) );
assertThat( otherEntity.employee, is( employee ) );
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employeeParent" ), is( true ) );
assertThat( otherEntity.employeeParent, is( employee ) );
}
else {
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employee" ), is( true ) );
assertThat( otherEntity.employee, is( employee ) );
assertThat( Hibernate.isPropertyInitialized( otherEntity, "employeeParent" ), is( true ) );
assertThat( Hibernate.isInitialized( otherEntity.employeeParent ), is( true ) );
}
}
}
else {
assertThat( employee.getOtherEntities().size(), is( 0 ) );
}
}
session.getTransaction().commit();
assertThat( stats.getPrepareStatementCount(), is( 2L ) );
}
finally {
if ( scrollableResults != null ) {
scrollableResults.close();
}
if ( session.getTransaction().isActive() ) {
session.getTransaction().rollback();
}
session.close();
}
}
@Before
public void prepareTestData() {
inTransaction(
session -> {
Employee e1 = new Employee( "ENG1" );
Employee e2 = new Employee( "ENG2" );
OtherEntity other1 = new OtherEntity( "test1" );
OtherEntity other2 = new OtherEntity( "test2" );
e1.getOtherEntities().add( other1 );
e1.getOtherEntities().add( other2 );
e1.getParentOtherEntities().add( other1 );
e1.getParentOtherEntities().add( other2 );
other1.employee = e1;
other2.employee = e1;
other1.employeeParent = e1;
other2.employeeParent = e2;
session.persist( other1 );
session.persist( other2 );
session.persist( e1 );
session.persist( e2 );
}
);
}
@After
public void cleanUpTestData() {
inTransaction(
session -> {
session.createQuery( "delete from OtherEntity" ).executeUpdate();
session.createQuery( "delete from Employee" ).executeUpdate();
session.createQuery( "delete from EmployeeParent" ).executeUpdate();
}
);
}
@Entity(name = "EmployeeParent")
@Table(name = "EmployeeParent")
@Inheritance(strategy = InheritanceType.TABLE_PER_CLASS)
public static abstract class EmployeeParent {
@Id
private String dept;
@OneToMany(targetEntity = OtherEntity.class, mappedBy = "employeeParent", fetch = FetchType.LAZY)
protected Set<OtherEntity> parentOtherEntities = new HashSet<>();
public Set<OtherEntity> getParentOtherEntities() {
if ( parentOtherEntities == null ) {
parentOtherEntities = new LinkedHashSet<>();
}
return parentOtherEntities;
}
public void setOtherEntities(Set<OtherEntity> pParentOtherEntities) {
parentOtherEntities = pParentOtherEntities;
}
public String getDept() {
return dept;
}
protected void setDept(String dept) {
this.dept = dept;
}
}
@Entity(name = "Employee")
@Table(name = "Employee")
public static class Employee extends EmployeeParent {
@OneToMany(targetEntity = OtherEntity.class, mappedBy = "employee", fetch = FetchType.LAZY)
protected Set<OtherEntity> otherEntities = new HashSet<>();
public Employee(String dept) {
this();
setDept( dept );
}
protected Employee() {
// this form used by Hibernate
}
public Set<OtherEntity> getOtherEntities() {
if ( otherEntities == null ) {
otherEntities = new LinkedHashSet<>();
}
return otherEntities;
}
public void setOtherEntities(Set<OtherEntity> pOtherEntities) {
otherEntities = pOtherEntities;
}
}
@Entity(name = "OtherEntity")
@Table(name = "OtherEntity")
public static class OtherEntity {
@Id
private String id;
@ManyToOne(fetch = FetchType.LAZY)
@LazyToOne(LazyToOneOption.NO_PROXY)
@JoinColumn(name = "Employee_Id")
protected Employee employee = null;
@ManyToOne(fetch = FetchType.EAGER)
@JoinColumn(name = "EmployeeParent_Id")
protected EmployeeParent employeeParent = null;
protected OtherEntity() {
// this form used by Hibernate
}
public OtherEntity(String id) {
this.id = id;
}
public String getId() {
return id;
}
}
}


@@ -9,6 +9,7 @@
import java.util.ArrayList;
import java.util.List;
import javax.persistence.CollectionTable;
import javax.persistence.Column;
import javax.persistence.ElementCollection;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
@@ -17,6 +18,9 @@
import javax.persistence.OrderBy;
import javax.persistence.Table;
import org.hibernate.Session;
import org.hibernate.Transaction;
import org.hibernate.testing.TestForIssue;
import org.hibernate.testing.junit4.BaseCoreFunctionalTestCase;
import org.junit.Test;
@@ -31,7 +35,8 @@ public class BagElementNullBasicTest extends BaseCoreFunctionalTestCase {
@Override
protected Class[] getAnnotatedClasses() {
return new Class[] {
AnEntity.class,
NullableElementsEntity.class
};
}
@@ -85,6 +90,19 @@ public void addNullValue() {
);
}
@Test
@TestForIssue(jiraKey = "HHH-13651")
public void addNullValueToNullableCollections() {
try (final Session s = sessionFactory().openSession()) {
final Transaction tx = s.beginTransaction();
NullableElementsEntity e = new NullableElementsEntity();
e.list.add( null );
s.persist( e );
s.flush();
tx.commit();
}
}
@Test
public void testUpdateNonNullValueToNull() {
int entityId = doInHibernate(
@@ -169,4 +187,17 @@ public static class AnEntity {
@OrderBy
private List<String> aCollection = new ArrayList<String>();
}
@Entity
@Table(name="NullableElementsEntity")
public static class NullableElementsEntity {
@Id
@GeneratedValue
private int id;
@ElementCollection
@CollectionTable(name="e_2_string", joinColumns=@JoinColumn(name="e_id"))
@Column(name="string_value", unique = false, nullable = true, insertable = true, updatable = true)
private List<String> list = new ArrayList<String>();
}
}


@@ -0,0 +1,119 @@
/*
* Hibernate, Relational Persistence for Idiomatic Java
*
* License: GNU Lesser General Public License (LGPL), version 2.1 or later.
* See the lgpl.txt file in the root directory or <http://www.gnu.org/licenses/lgpl-2.1.html>.
*/
package org.hibernate.test.connections;
import java.util.Map;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import org.hibernate.Session;
import org.hibernate.cfg.AvailableSettings;
import org.hibernate.cfg.Environment;
import org.hibernate.dialect.H2Dialect;
import org.hibernate.engine.jdbc.batch.internal.AbstractBatchImpl;
import org.hibernate.internal.CoreMessageLogger;
import org.hibernate.resource.jdbc.spi.PhysicalConnectionHandlingMode;
import org.hibernate.resource.transaction.backend.jta.internal.JtaTransactionCoordinatorBuilderImpl;
import org.hibernate.testing.RequiresDialect;
import org.hibernate.testing.TestForIssue;
import org.hibernate.testing.jta.TestingJtaBootstrap;
import org.hibernate.testing.jta.TestingJtaPlatformImpl;
import org.hibernate.testing.junit4.BaseNonConfigCoreFunctionalTestCase;
import org.hibernate.testing.logger.LoggerInspectionRule;
import org.hibernate.testing.logger.Triggerable;
import org.junit.Rule;
import org.junit.Test;
import org.jboss.logging.Logger;
import static org.junit.Assert.assertFalse;
@TestForIssue( jiraKey = "HHH-13307" )
@RequiresDialect(H2Dialect.class)
public class JdbcBatchingAgressiveReleaseTest extends BaseNonConfigCoreFunctionalTestCase {
@Rule
public LoggerInspectionRule logInspection = new LoggerInspectionRule(
Logger.getMessageLogger( CoreMessageLogger.class, AbstractBatchImpl.class.getName() )
);
private Triggerable triggerable = logInspection.watchForLogMessages( "HHH000010" );
@Override
@SuppressWarnings("unchecked")
protected void addSettings(Map settings) {
super.addSettings( settings );
TestingJtaBootstrap.prepare( settings );
settings.put( AvailableSettings.TRANSACTION_COORDINATOR_STRATEGY, JtaTransactionCoordinatorBuilderImpl.class.getName() );
settings.put( Environment.CONNECTION_HANDLING, PhysicalConnectionHandlingMode.DELAYED_ACQUISITION_AND_RELEASE_AFTER_STATEMENT.toString() );
settings.put( Environment.GENERATE_STATISTICS, "true" );
settings.put( Environment.STATEMENT_BATCH_SIZE, "500" );
}
@Test
public void testJdbcBatching() throws Throwable {
triggerable.reset();
TestingJtaPlatformImpl.INSTANCE.getTransactionManager().begin();
Session session = openSession();
// The following 2 entity inserts will be batched.
session.persist( new Person( 1, "Jane" ) );
session.persist( new Person( 2, "Sally" ) );
// The following entity has an IDENTITY ID, which cannot be batched.
// As a result the existing batch is forced to execute before the Thing can be
// inserted.
session.persist( new Thing( "it" ) );
session.close();
TestingJtaPlatformImpl.INSTANCE.getTransactionManager().commit();
assertFalse( triggerable.wasTriggered() );
}
@Override
protected Class[] getAnnotatedClasses() {
return new Class[] { Person.class, Thing.class };
}
@Entity( name = "Person")
public static class Person {
@Id
private int id;
private String name;
public Person() {
}
public Person(int id, String name) {
this.id = id;
this.name = name;
}
}
@Entity( name = "Thing")
public static class Thing {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private int id;
private String name;
public Thing() {
}
public Thing(String name) {
// the id is assigned by the database (IDENTITY generation)
this.name = name;
}
}
}
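The comments in `testJdbcBatching` above describe the behavior being exercised: batchable inserts accumulate, and an insert that cannot be batched (the IDENTITY-generated `Thing`) forces the pending batch to execute first. This toy model sketches that flow; the `BatchModel` class and its method names are illustrative only, not Hibernate API.

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of JDBC batching: batchable statements accumulate, and a
// non-batchable (IDENTITY) insert flushes the pending batch before
// executing on its own.
class BatchModel {
    private final List<String> pending = new ArrayList<>();
    private int statementsExecuted = 0;

    void addBatchable(String sql) {
        pending.add(sql);
    }

    // An IDENTITY insert must run immediately to obtain the generated key,
    // so the pending batch is executed first, then the insert itself.
    void addNonBatchable(String sql) {
        executeBatch();
        statementsExecuted++;
    }

    void executeBatch() {
        if (!pending.isEmpty()) {
            pending.clear();
            statementsExecuted++; // the whole batch goes over the wire once
        }
    }

    int statementsExecuted() {
        return statementsExecuted;
    }

    public static void main(String[] args) {
        BatchModel m = new BatchModel();
        m.addBatchable("insert into Person values (1, 'Jane')");
        m.addBatchable("insert into Person values (2, 'Sally')");
        m.addNonBatchable("insert into Thing (name) values ('it')");
        System.out.println(m.statementsExecuted()); // one batch + one identity insert
    }
}
```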


@@ -0,0 +1,176 @@
/*
* Hibernate, Relational Persistence for Idiomatic Java
*
* License: GNU Lesser General Public License (LGPL), version 2.1 or later
* See the lgpl.txt file in the root directory or http://www.gnu.org/licenses/lgpl-2.1.html
*/
package org.hibernate.test.fileimport;
import java.util.EnumSet;
import java.util.Map;
import org.hibernate.boot.Metadata;
import org.hibernate.boot.MetadataSources;
import org.hibernate.boot.registry.StandardServiceRegistry;
import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
import org.hibernate.cfg.AvailableSettings;
import org.hibernate.cfg.Environment;
import org.hibernate.dialect.H2Dialect;
import org.hibernate.engine.config.spi.ConfigurationService;
import org.hibernate.resource.transaction.backend.jta.internal.JtaTransactionCoordinatorBuilderImpl;
import org.hibernate.resource.transaction.spi.TransactionCoordinatorBuilder;
import org.hibernate.tool.hbm2ddl.ImportScriptException;
import org.hibernate.tool.hbm2ddl.MultipleLinesSqlCommandExtractor;
import org.hibernate.tool.hbm2ddl.grammar.SqlStatementParser;
import org.hibernate.tool.schema.SourceType;
import org.hibernate.tool.schema.TargetType;
import org.hibernate.tool.schema.internal.ExceptionHandlerLoggedImpl;
import org.hibernate.tool.schema.internal.SchemaCreatorImpl;
import org.hibernate.tool.schema.spi.ExceptionHandler;
import org.hibernate.tool.schema.spi.ExecutionOptions;
import org.hibernate.tool.schema.spi.SchemaCreator;
import org.hibernate.tool.schema.spi.ScriptSourceInput;
import org.hibernate.tool.schema.spi.ScriptTargetOutput;
import org.hibernate.tool.schema.spi.SourceDescriptor;
import org.hibernate.tool.schema.spi.TargetDescriptor;
import org.hibernate.testing.RequiresDialect;
import org.hibernate.testing.TestForIssue;
import org.hibernate.testing.jta.TestingJtaBootstrap;
import org.hibernate.testing.junit4.BaseUnitTestCase;
import org.hibernate.test.schemaupdate.CommentGenerationTest;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import static org.hamcrest.CoreMatchers.instanceOf;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.core.Is.is;
import static org.junit.Assert.fail;
/**
* @author Andrea Boriero
*/
@TestForIssue(jiraKey = "HHH-13673")
@RequiresDialect(value = H2Dialect.class,
jiraKey = "HHH-6286",
comment = "Only running the tests against H2, because the sql statements in the import file are not generic. " +
"This test should actually not test directly against the db")
public class StatementsWithoutTerminalCharsImportFileTest extends BaseUnitTestCase implements ExecutionOptions {
private StandardServiceRegistry ssr;
private static final String EXPECTED_ERROR_MESSAGE = "Import script Sql statements must terminate with a ';' char";
@Before
public void setUp() {
ssr = new StandardServiceRegistryBuilder()
.applySetting( Environment.HBM2DDL_AUTO, "none" )
.applySetting( Environment.DIALECT, CommentGenerationTest.SupportCommentDialect.class.getName() )
.applySetting(
Environment.HBM2DDL_IMPORT_FILES,
"/org/hibernate/test/fileimport/statements-without-terminal-chars.sql"
).applySetting( AvailableSettings.HBM2DDL_HALT_ON_ERROR, "true" )
.applySetting(
Environment.HBM2DDL_IMPORT_FILES_SQL_EXTRACTOR,
MultipleLinesSqlCommandExtractor.class.getName()
)
.build();
}
@Test
public void testImportFile() {
try {
final SchemaCreator schemaCreator = new SchemaCreatorImpl( ssr );
schemaCreator.doCreation(
buildMappings( ssr ),
this,
SourceDescriptorImpl.INSTANCE,
TargetDescriptorImpl.INSTANCE
);
fail( "ImportScriptException expected" );
}
catch (ImportScriptException e) {
final Throwable cause = e.getCause();
// todo (6.0) : fix it
// assertThat( cause, instanceOf( SqlStatementParser.StatementParserException.class ) );
assertThat( cause.getMessage(), is( EXPECTED_ERROR_MESSAGE ) );
}
}
@After
public void tearDown() {
if ( ssr != null ) {
StandardServiceRegistryBuilder.destroy( ssr );
}
}
@Override
public Map getConfigurationValues() {
return ssr.getService( ConfigurationService.class ).getSettings();
}
@Override
public boolean shouldManageNamespaces() {
return false;
}
@Override
public ExceptionHandler getExceptionHandler() {
return ExceptionHandlerLoggedImpl.INSTANCE;
}
private static class SourceDescriptorImpl implements SourceDescriptor {
/**
* Singleton access
*/
public static final SourceDescriptorImpl INSTANCE = new SourceDescriptorImpl();
@Override
public SourceType getSourceType() {
return SourceType.METADATA;
}
@Override
public ScriptSourceInput getScriptSourceInput() {
return null;
}
}
private static class TargetDescriptorImpl implements TargetDescriptor {
/**
* Singleton access
*/
public static final TargetDescriptorImpl INSTANCE = new TargetDescriptorImpl();
@Override
public EnumSet<TargetType> getTargetTypes() {
return EnumSet.of( TargetType.DATABASE );
}
@Override
public ScriptTargetOutput getScriptTargetOutput() {
return null;
}
}
private Metadata buildMappings(StandardServiceRegistry registry) {
return new MetadataSources( registry )
.buildMetadata();
}
protected StandardServiceRegistry buildJtaStandardServiceRegistry() {
StandardServiceRegistry registry = TestingJtaBootstrap.prepare().build();
assertThat(
registry.getService( TransactionCoordinatorBuilder.class ),
instanceOf( JtaTransactionCoordinatorBuilderImpl.class )
);
return registry;
}
}

@@ -0,0 +1,196 @@
/*
* Hibernate, Relational Persistence for Idiomatic Java
*
* License: GNU Lesser General Public License (LGPL), version 2.1 or later
* See the lgpl.txt file in the root directory or http://www.gnu.org/licenses/lgpl-2.1.html
*/
package org.hibernate.test.stateless;
import java.util.Collections;
import java.util.function.Consumer;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.ManyToOne;
import javax.persistence.Table;
import org.hibernate.StatelessSession;
import org.hibernate.Transaction;
import org.hibernate.engine.spi.PersistenceContext;
import org.hibernate.engine.spi.SharedSessionContractImplementor;
import org.hibernate.testing.TestForIssue;
import org.hibernate.testing.junit4.BaseCoreFunctionalTestCase;
import org.junit.Test;
import static org.junit.Assert.assertTrue;
/**
* @author Andrea Boriero
*/
public class StatelessSessionPersistentContextTest extends BaseCoreFunctionalTestCase {
@Override
protected Class<?>[] getAnnotatedClasses() {
return new Class[] { TestEntity.class, OtherEntity.class };
}
@Test
@TestForIssue(jiraKey = "HHH-13672")
public void testStatelessSessionPersistenceContextIsCleared() {
TestEntity testEntity = new TestEntity();
consumeAndCheckPersistenceContextIsClosed(
statelessSession -> {
testEntity.setName( "Fab" );
OtherEntity otherEntity = new OtherEntity();
otherEntity.setName( "other" );
testEntity.setOtherEntity( otherEntity );
statelessSession.insert( otherEntity );
statelessSession.insert( testEntity );
}
);
consumeAndCheckPersistenceContextIsClosed(
statelessSession -> {
statelessSession.get( TestEntity.class, testEntity.getId() );
}
);
consumeAndCheckPersistenceContextIsClosed(
statelessSession -> {
TestEntity p2 = (TestEntity) statelessSession.get( TestEntity.class, testEntity.getId() );
p2.setName( "Fabulous" );
statelessSession.update( p2 );
}
);
consumeAndCheckPersistenceContextIsClosed(
statelessSession -> {
TestEntity testEntity1 = (TestEntity) statelessSession.createQuery(
"select p from TestEntity p where id = :id" )
.setParameter( "id", testEntity.getId() )
.uniqueResult();
testEntity1.getOtherEntity();
}
);
consumeAndCheckPersistenceContextIsClosed(
statelessSession -> {
statelessSession.refresh( testEntity );
}
);
consumeAndCheckPersistenceContextIsClosed(
statelessSession -> {
statelessSession.delete( testEntity );
}
);
}
private void consumeAndCheckPersistenceContextIsClosed(Consumer<StatelessSession> consumer) {
Transaction transaction = null;
StatelessSession statelessSession = sessionFactory().openStatelessSession();
try {
transaction = statelessSession.beginTransaction();
consumer.accept( statelessSession );
transaction.commit();
}
catch (Exception e) {
if ( transaction != null && transaction.isActive() ) {
transaction.rollback();
}
throw e;
}
finally {
statelessSession.close();
}
assertThatPersistenceContextIsCleared( statelessSession );
}
private void assertThatPersistenceContextIsCleared(StatelessSession ss) {
PersistenceContext persistenceContextInternal = ( (SharedSessionContractImplementor) ss ).getPersistenceContextInternal();
assertTrue(
"StatelessSession: PersistenceContext has not been cleared",
persistenceContextInternal.getEntitiesByKey().isEmpty()
);
assertTrue(
"StatelessSession: PersistenceContext has not been cleared",
!persistenceContextInternal.managedEntitiesIterator().hasNext()
);
assertTrue(
"StatelessSession: PersistenceContext has not been cleared",
persistenceContextInternal.getCollectionsByKey().isEmpty()
);
}
@Entity(name = "TestEntity")
@Table(name = "TestEntity")
public static class TestEntity {
@Id
@GeneratedValue
private Long id;
private String name;
@ManyToOne
@JoinColumn
private OtherEntity otherEntity;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public OtherEntity getOtherEntity() {
return otherEntity;
}
public void setOtherEntity(OtherEntity otherEntity) {
this.otherEntity = otherEntity;
}
}
@Entity(name = "OtherEntity")
@Table(name = "OtherEntity")
public static class OtherEntity {
@Id
@GeneratedValue
private Long id;
private String name;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
}
}

@@ -71,7 +71,7 @@ public void testCreateUpdateReadDelete() {
assertEquals( "Blahs", doc2.getName() );
assertEquals( doc.getText(), doc2.getText() );
-doc2 = (Document) ss.createNativeQuery( "select * from Document" )
+doc2 = (Document) ss.createSQLQuery( "select * from Document" )
.addEntity( Document.class )
.uniqueResult();
assertEquals( "Blahs", doc2.getName() );

@@ -0,0 +1,3 @@
CREATE TABLE test_data ( id NUMBER NOT NULL PRIMARY KEY, text VARCHAR2(100) )
INSERT INTO test_data(id, text) VALUES (1, 'sample')
DELETE FROM test_data
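The fixture above intentionally omits statement terminators so the import triggers the expected error. For contrast, a version the multi-line extractor would accept looks like this (a sketch inferred from the expected error message, not part of the commit):

```sql
-- same statements, each terminated with the ';' the extractor requires
CREATE TABLE test_data ( id NUMBER NOT NULL PRIMARY KEY, text VARCHAR2(100) );
INSERT INTO test_data(id, text) VALUES (1, 'sample');
DELETE FROM test_data;
```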

@@ -311,9 +311,7 @@ boolean addManyToOne(
// HHH-11107
// Use FK hbm magic value 'none' to skip making foreign key constraints between the Envers
// schema and the base table schema when a @ManyToOne is present in an identifier.
-if ( mapper == null ) {
-manyToOneElement.addAttribute( "foreign-key", "none" );
-}
+manyToOneElement.addAttribute( "foreign-key", "none" );
MetadataTools.addColumns( manyToOneElement, value.getColumnIterator() );
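The `foreign-key` "magic value" used in the hunk above is written into the hbm.xml mapping that Envers generates for the audit schema. Roughly, the emitted mapping ends up looking like this (element and column names are illustrative, not taken from the commit):

```xml
<!-- sketch: foreign-key="none" suppresses the FK constraint from the
     audit table back to the base entity's table -->
<many-to-one name="rootLayer" foreign-key="none">
    <column name="root_layer_fk"/>
</many-to-one>
```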

@@ -66,6 +66,48 @@ public Serializable run() {
data.put( propertyData.getName(), entity );
}
@Override
public void mapToEntityFromEntity(Object objTo, Object objFrom) {
if ( objTo == null || objFrom == null ) {
return;
}
AccessController.doPrivileged(
new PrivilegedAction<Object>() {
@Override
public Object run() {
final Getter getter = ReflectionTools.getGetter(
objFrom.getClass(),
propertyData,
getServiceRegistry()
);
final Setter setter = ReflectionTools.getSetter(
objTo.getClass(),
propertyData,
getServiceRegistry()
);
// Get the value from the containing entity
final Object value = getter.get( objFrom );
if ( value == null ) {
return null;
}
if ( !value.getClass().equals( propertyData.getVirtualReturnClass() ) ) {
setter.set( objTo, getAssociatedEntityIdMapper().mapToIdFromEntity( value ), null );
}
else {
// This means we're setting the object
setter.set( objTo, value, null );
}
return null;
}
}
);
}
@Override
public boolean mapToEntityFromMap(Object obj, Map data) {
if ( data == null || obj == null ) {

@@ -0,0 +1,62 @@
/*
* Hibernate, Relational Persistence for Idiomatic Java
*
* License: GNU Lesser General Public License (LGPL), version 2.1 or later.
* See the lgpl.txt file in the root directory or <http://www.gnu.org/licenses/lgpl-2.1.html>.
*/
package org.hibernate.envers.test.integration.manytoone.foreignkey;
import java.time.LocalDate;
import java.util.ArrayList;
import org.hibernate.envers.test.BaseEnversJPAFunctionalTestCase;
import org.junit.Test;
import org.hibernate.testing.TestForIssue;
import static org.hibernate.testing.transaction.TransactionUtil.doInJPA;
/**
* Tests that no foreign key should be generated from audit schema to main schema.
*
* @author Chris Cranford
*/
@TestForIssue(jiraKey = "HHH-12965")
public class ForeignKeyExclusionTest extends BaseEnversJPAFunctionalTestCase {
private RootLayer rootLayer;
@Override
protected Class<?>[] getAnnotatedClasses() {
return new Class<?>[] { RootLayer.class, MiddleLayer.class, LeafLayer.class };
}
@Test
public void testRemovingAuditedEntityWithIdClassAndManyToOneForeignKeyConstraint() {
// Revision 1 - Add Root/Middle/Leaf layers
this.rootLayer = doInJPA( this::entityManagerFactory, entityManager -> {
final RootLayer rootLayer = new RootLayer();
rootLayer.setMiddleLayers( new ArrayList<>() );
MiddleLayer middleLayer = new MiddleLayer();
rootLayer.getMiddleLayers().add( middleLayer );
middleLayer.setRootLayer( rootLayer );
middleLayer.setValidFrom( LocalDate.of( 2019, 3, 19 ) );
middleLayer.setLeafLayers( new ArrayList<>() );
LeafLayer leafLayer = new LeafLayer();
leafLayer.setMiddleLayer( middleLayer );
middleLayer.getLeafLayers().add( leafLayer );
entityManager.persist( rootLayer );
return rootLayer;
} );
// Revision 2 - Delete Root/Middle/Leaf layers
// This causes FK violation
doInJPA( this::entityManagerFactory, entityManager -> {
final RootLayer rootLayer = entityManager.find( RootLayer.class, this.rootLayer.getId() );
entityManager.remove( rootLayer );
} );
}
}

@@ -0,0 +1,49 @@
/*
* Hibernate, Relational Persistence for Idiomatic Java
*
* License: GNU Lesser General Public License (LGPL), version 2.1 or later.
* See the lgpl.txt file in the root directory or <http://www.gnu.org/licenses/lgpl-2.1.html>.
*/
package org.hibernate.envers.test.integration.manytoone.foreignkey;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.JoinColumns;
import javax.persistence.ManyToOne;
import org.hibernate.envers.Audited;
/**
* @author Chris Cranford
*/
@Entity(name = "LeafLayer")
@Audited
public class LeafLayer {
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
private Long id;
@ManyToOne(optional = false)
@JoinColumns({
@JoinColumn(name = "middle_layer_valid_from_fk", referencedColumnName = "valid_from"),
@JoinColumn(name = "middle_layer_root_layer_fk", referencedColumnName = "root_layer_fk") })
private MiddleLayer middleLayer;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public MiddleLayer getMiddleLayer() {
return middleLayer;
}
public void setMiddleLayer(MiddleLayer middleLayer) {
this.middleLayer = middleLayer;
}
}

@@ -0,0 +1,63 @@
/*
* Hibernate, Relational Persistence for Idiomatic Java
*
* License: GNU Lesser General Public License (LGPL), version 2.1 or later.
* See the lgpl.txt file in the root directory or <http://www.gnu.org/licenses/lgpl-2.1.html>.
*/
package org.hibernate.envers.test.integration.manytoone.foreignkey;
import java.time.LocalDate;
import java.util.List;
import javax.persistence.CascadeType;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.IdClass;
import javax.persistence.JoinColumn;
import javax.persistence.ManyToOne;
import javax.persistence.OneToMany;
import org.hibernate.envers.Audited;
/**
* @author Chris Cranford
*/
@Audited
@Entity
@IdClass(MiddleLayerPK.class)
public class MiddleLayer {
@Id
@Column(name = "valid_from", nullable = false)
private LocalDate validFrom;
@Id
@ManyToOne
@JoinColumn(name = "root_layer_fk")
private RootLayer rootLayer;
@OneToMany(mappedBy = "middleLayer", cascade = CascadeType.ALL, orphanRemoval = true)
private List<LeafLayer> leafLayers;
public LocalDate getValidFrom() {
return validFrom;
}
public void setValidFrom(LocalDate validFrom) {
this.validFrom = validFrom;
}
public RootLayer getRootLayer() {
return rootLayer;
}
public void setRootLayer(RootLayer rootLayer) {
this.rootLayer = rootLayer;
}
public List<LeafLayer> getLeafLayers() {
return leafLayers;
}
public void setLeafLayers(List<LeafLayer> leafLayers) {
this.leafLayers = leafLayers;
}
}

@@ -0,0 +1,53 @@
/*
* Hibernate, Relational Persistence for Idiomatic Java
*
* License: GNU Lesser General Public License (LGPL), version 2.1 or later.
* See the lgpl.txt file in the root directory or <http://www.gnu.org/licenses/lgpl-2.1.html>.
*/
package org.hibernate.envers.test.integration.manytoone.foreignkey;
import java.io.Serializable;
import java.time.LocalDate;
import java.util.Objects;
/**
* @author Chris Cranford
*/
public class MiddleLayerPK implements Serializable {
private Long rootLayer;
private LocalDate validFrom;
public Long getRootLayer() {
return rootLayer;
}
public void setRootLayer(Long rootLayer) {
this.rootLayer = rootLayer;
}
public LocalDate getValidFrom() {
return validFrom;
}
public void setValidFrom(LocalDate validFrom) {
this.validFrom = validFrom;
}
@Override
public boolean equals(Object o) {
if ( this == o ) {
return true;
}
if ( o == null || getClass() != o.getClass() ) {
return false;
}
MiddleLayerPK that = (MiddleLayerPK) o;
return Objects.equals( rootLayer, that.rootLayer ) &&
Objects.equals( validFrom, that.validFrom );
}
@Override
public int hashCode() {
return Objects.hash( rootLayer, validFrom );
}
}

@@ -0,0 +1,47 @@
/*
* Hibernate, Relational Persistence for Idiomatic Java
*
* License: GNU Lesser General Public License (LGPL), version 2.1 or later.
* See the lgpl.txt file in the root directory or <http://www.gnu.org/licenses/lgpl-2.1.html>.
*/
package org.hibernate.envers.test.integration.manytoone.foreignkey;
import java.util.List;
import javax.persistence.CascadeType;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.OneToMany;
import org.hibernate.envers.Audited;
/**
* @author Chris Cranford
*/
@Entity(name = "RootLayer")
@Audited
public class RootLayer {
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
private Long id;
@OneToMany(mappedBy = "rootLayer", cascade = CascadeType.ALL, orphanRemoval = true)
private List<MiddleLayer> middleLayers;
public Long getId() {
return id;
}
public void setId(Long id) {
this.id = id;
}
public List<MiddleLayer> getMiddleLayers() {
return middleLayers;
}
public void setMiddleLayers(List<MiddleLayer> middleLayers) {
this.middleLayers = middleLayers;
}
}

@@ -85,15 +85,23 @@ public EntityManagerFactory createContainerEntityManagerFactory(PersistenceUnitI
final Map settings = generateSettings( properties );
// OSGi ClassLoaders must implement BundleReference
+final ClassLoader classLoader = info.getClassLoader();
settings.put(
org.hibernate.cfg.AvailableSettings.SCANNER,
-new OsgiScanner( ( (BundleReference) info.getClassLoader() ).getBundle() )
+new OsgiScanner( ( (BundleReference) classLoader ).getBundle() )
);
-osgiClassLoader.addClassLoader( info.getClassLoader() );
+osgiClassLoader.addClassLoader( classLoader );
-return Bootstrap.getEntityManagerFactoryBuilder( info, settings,
-new OSGiClassLoaderServiceImpl( osgiClassLoader, osgiServiceUtil ) ).build();
+final ClassLoader prevCL = Thread.currentThread().getContextClassLoader();
+try {
+Thread.currentThread().setContextClassLoader( classLoader );
+return Bootstrap.getEntityManagerFactoryBuilder( info, settings,
+new OSGiClassLoaderServiceImpl( osgiClassLoader, osgiServiceUtil ) ).build();
+}
+finally {
+Thread.currentThread().setContextClassLoader( prevCL );
+}
}
@SuppressWarnings("unchecked")