Implementors manage transactional access to cached data. Transactions pass in a timestamp indicating transaction start time. Two different implementation patterns are supported:
- A transaction-aware cache implementation might be wrapped by a "synchronous" concurrency strategy, where updates are written to the cache inside the transaction.
- A non-transaction-aware cache would be wrapped by an "asynchronous" concurrency strategy, where items are merely "soft locked" during the transaction and then updated during the "after transaction completion" phase; the soft lock is not an actual lock on the database row, only on the cached representation of the item. Both patterns are sketched below.
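The following sketch contrasts the two patterns. It is not the real SPI: every class, field, and method name here is hypothetical, and a plain map stands in for the cache provider.
<pre>{@code
// Illustrative only: these classes are not part of this SPI; all names and
// signatures are assumptions made for the sake of the example.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// "Synchronous" pattern: the underlying cache is transaction-aware, so the new
// state may be written through inside the transaction (a real provider would
// enlist this write in the transaction and roll it back on failure; the plain
// map here does not).
final class SynchronousStrategySketch {
    private final Map<Object, Object> txAwareRegion = new ConcurrentHashMap<>();

    void update(Object key, Object newState) {
        txAwareRegion.put(key, newState); // written inside the transaction
    }
}

// "Asynchronous" pattern: the cache is not transaction-aware, so the item is
// soft locked for the duration of the transaction and only refreshed in the
// "after transaction completion" phase.
final class AsynchronousStrategySketch {
    private static final Object SOFT_LOCK = new Object();
    private final Map<Object, Object> region = new ConcurrentHashMap<>();

    void lock(Object key) {
        region.put(key, SOFT_LOCK);        // block re-caching of stale state
    }

    Object get(Object key) {
        Object cached = region.get(key);
        return cached == SOFT_LOCK ? null : cached; // locked entries read as a miss
    }

    void afterUpdate(Object key, Object newState) {
        region.put(key, newState);         // publish the new state after completion
    }
}
}</pre>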
In terms of entity caches, the expected call sequences are as follows (a driver sketch appears after the collection-cache note below):
- DELETES : {@link #lock} -> {@link #evict} -> {@link #release}
- UPDATES : {@link #lock} -> {@link #update} -> {@link #afterUpdate}
- INSERTS : {@link #insert} -> {@link #afterInsert}
In terms of collection caches, all modification actions simply invalidate the entry (or entries). The call sequence here is: {@link #lock} -> {@link #evict} -> {@link #release}, the same sequence as for entity deletions.
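A combined driver for these sequences is sketched below. The EntityStrategy interface is hypothetical and only mirrors the method names linked above; the arguments of the real methods (values, versions, timestamps, soft-lock handles) are reduced to bare keys and values for brevity.
<pre>{@code
// Hypothetical driver showing the documented call sequences. EntityStrategy is
// not the real interface; it only echoes the method names referenced above.
interface EntityStrategy {
    Object lock(Object key);                            // returns an opaque soft lock
    void evict(Object key);
    boolean update(Object key, Object value);
    boolean insert(Object key, Object value);
    void release(Object key, Object softLock);
    boolean afterUpdate(Object key, Object value, Object softLock);
    boolean afterInsert(Object key, Object value);
}

final class CallSequences {
    // DELETES, and collection invalidation: lock -> evict -> release
    static void delete(EntityStrategy s, Object key) {
        Object softLock = s.lock(key);   // before the database write
        s.evict(key);                    // drop the stale cached state
        s.release(key, softLock);        // after transaction completion
    }

    // UPDATES: lock -> update -> afterUpdate
    static void update(EntityStrategy s, Object key, Object newValue) {
        Object softLock = s.lock(key);
        s.update(key, newValue);                 // inside the transaction
        s.afterUpdate(key, newValue, softLock);  // after transaction completion
    }

    // INSERTS: insert -> afterInsert
    static void insert(EntityStrategy s, Object key, Object value) {
        s.insert(key, value);            // inside the transaction
        s.afterInsert(key, value);       // after transaction completion
    }
}
}</pre>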
Note that, for an asynchronous cache, cache invalidation must be a two-step process (lock -> release, or lock -> afterUpdate), since this is the only way to guarantee consistency with the database for a non-transaction-aware cache implementation. For a synchronous cache, cache invalidation is a single-step process (evict, or update). Hence, this interface defines a three-step process to cater for both models.
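To make the consistency argument concrete, here is a self-contained sketch (again with hypothetical names) of the window the soft lock protects: between the database write and {@link #release}, a concurrent session that misses the cache re-reads the database, still sees the pre-commit state, and tries to re-cache it; the strategy must refuse that put while the key is locked, or stale data would survive the invalidation.
<pre>{@code
// Hypothetical sketch of the window the soft lock protects between the two steps.
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

final class TwoStepInvalidationSketch {
    private static final Object LOCKED = new Object();
    private final Map<Object, Object> region = new ConcurrentHashMap<>();

    void lock(Object key)    { region.put(key, LOCKED); }      // step 1: before the DB write
    void release(Object key) { region.remove(key, LOCKED); }   // step 2: after completion

    /** Items loaded from the database may only be cached when no soft lock is held. */
    boolean put(Object key, Object value) {
        return region.putIfAbsent(key, value) == null;         // LOCKED blocks the put
    }
}
}</pre>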
Note that query result caching does not go through a concurrency strategy; query results are managed directly against the underlying {@link Cache cache regions}.
@deprecated As of 3.3; see the package-level documentation for details.