Enters shortcut entries into a cache to minimize the number of operations that future searches require.
A cache search sequence is a single logical read operation that requires a sequence of one or more cache read operations to find a matching entry. The performance of such a search sequence can be measured by the number of required cache read operations: the fewer the cache read operations, the better the performance. The Cache Search Sequence pattern therefore describes a strategy for inserting shortcut entries into a cache to minimize the number of cache read operations that later searches require.
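As a minimal sketch of the idea (the keys, the region encoding, and the plain Hashtable standing in for the cache are all assumptions, not part of the pattern's source), a single logical read may take several physical cache reads the first time, and one read once a shortcut entry exists:

```csharp
using System;
using System.Collections;

class SearchSequenceSketch
{
    // Probes the cache with each key in turn, the way one logical
    // read plays out as a sequence of physical cache reads.
    public static int CountReads(Hashtable cache, string[] sequence)
    {
        int reads = 0;
        foreach (string key in sequence)
        {
            reads++;
            if (cache[key] != null) break;   // a hit ends the sequence
        }
        return reads;
    }

    static void Main()
    {
        Hashtable cache = new Hashtable();
        cache["Country=US"] = "us-data";     // only the general key is cached

        string[] sequence = { "City=Reno", "State=NV", "Country=US" };
        Console.WriteLine(CountReads(cache, sequence));   // 3 reads the first time

        // A shortcut entry under the specific key makes the next
        // identical logical read finish in a single cache read.
        cache["City=Reno"] = cache["Country=US"];
        Console.WriteLine(CountReads(cache, sequence));   // 1 read afterward
    }
}
```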
The following figure illustrates the static structure for the Cache Search Sequence pattern:
The structure is similar to that of the Cache Accessor pattern, with the primary addition of the CacheSearchSequence class. Clients issue cache operations through the CacheSearchSequence class rather than interacting directly with the CacheAccessor or Cache classes. The CacheSearchSequence class maintains the state of a single sequence of associated cache read operations. It keeps a collection of the keys that make up the sequence and uses them to add shortcut entries to the Cache when the sequence ends. The IKey interface declares the Generalizes operation, which the ConcreteKey class implements to indicate whether the key is a generalization of another key. The Strategies section defines what generalization means in this context and describes how CacheSearchSequence uses this information.
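The source does not show a concrete key class, so the following is only a sketch of how the Generalizes operation might be implemented. RegionKey and its prefix-based region encoding are hypothetical; the pattern only requires that a key can report whether its applicable cache entries also apply to another key:

```csharp
using System;

// IKey declares the Generalizes operation from the pattern's structure.
interface IKey
{
    bool Generalizes(IKey other);
}

// Hypothetical implementation: a broader region generalizes a
// narrower one, because any entry that applies to "US" also
// applies to "US/NV".
class RegionKey : IKey
{
    public readonly string Region;   // e.g. "US" or "US/NV/Reno"
    public RegionKey(string region) { Region = region; }

    public bool Generalizes(IKey other)
    {
        RegionKey o = other as RegionKey;
        return o != null && o.Region.StartsWith(Region);
    }
}

class GeneralizesDemo
{
    static void Main()
    {
        IKey us = new RegionKey("US");
        IKey nv = new RegionKey("US/NV");
        Console.WriteLine(us.Generalizes(nv));   // True
        Console.WriteLine(nv.Generalizes(us));   // False
    }
}
```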
The following figure illustrates the sequence diagram when a client uses a CacheSearchSequence instance to read cached data but the entry requested by the client does not exist in the cache. CacheSearchSequence uses CacheAccessor to read the data and adds a key that describes this data to its collection of keys that make up the sequence:
The following figure illustrates the sequence diagram when a client uses a CacheSearchSequence instance to read cached data and the entry requested by the client does exist in the cache:
When a CacheSearchSequence instance reads cached data and the entry requested by the client does exist in the cache, this condition indicates the end of the sequence, so CacheSearchSequence adds shortcut entries to the Cache for each key in its collection. The next time a client issues a search sequence that involves any of these keys, it finds these shortcut entries and resolves the logical read operation faster.
If a client finishes its search sequence without finding an entry, it calls the CacheSearchSequence.End operation to add shortcut entries, but these shortcut entries are just placeholders that indicate no matches. They allow clients to conclude quickly, without repeating the entire search sequence, that no matching entries exist.
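A sketch of this negative-caching idea (the NoMatch sentinel, the keys, and the plain Hashtable are assumptions; the source only says the placeholders indicate no matches):

```csharp
using System;
using System.Collections;

class PlaceholderSketch
{
    // Hypothetical sentinel that distinguishes "cached as absent"
    // from "never searched".
    public static readonly object NoMatch = new object();

    // Records a placeholder shortcut entry for every key the failed
    // sequence tried, as End() does when nothing is found.
    public static void EndWithoutMatch(Hashtable cache, string[] keys)
    {
        foreach (string key in keys)
            if (cache[key] == null)
                cache[key] = NoMatch;
    }

    static void Main()
    {
        Hashtable cache = new Hashtable();
        string[] sequence = { "City=Reno", "State=NV", "Country=US" };
        EndWithoutMatch(cache, sequence);

        // A later search sees the placeholder on its very first read
        // and concludes "no match" without replaying the sequence.
        Console.WriteLine(cache["City=Reno"] == NoMatch);   // True
    }
}
```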
public class CacheSearchSequence
{
    // Data members
    private CacheAccessor m_accessor = null;
    private Hashtable m_htCache = new Hashtable();
    private IKey m_LastKey = null;
    private DirectedGraph m_graph = new DirectedGraph();
    private bool m_bDone = false;

    public CacheSearchSequence(CacheAccessor a, Hashtable cache)
    {
        m_accessor = a;
        m_htCache = cache;
    }

    // Public interface

    // Read data from the cache
    public DataRow Read( IKey key )
    {
        if ( m_bDone )
            throw new InvalidOperationException( "Search sequence done" );

        // Add this key to the list of those in the sequence
        m_graph.Add( key );
        m_LastKey = key;

        // Get data using the cache accessor
        DataRow row = m_accessor.Read( key );

        // If values are found, then end this sequence
        if ( row != null )
            End( row );
        return row;
    }

    // Mark the sequence as done. Call this if no values are found
    // after trying every key in the sequence.
    public void End()
    {
        // Mark the end of the sequence with an empty DataRow to
        // indicate that no values were found
        if ( !m_bDone ) End( null );
    }

    // Mark the sequence as done
    private void End( DataRow row )
    {
        if ( m_LastKey != null )
        {
            // Place a shortcut entry in the cache for each key that
            // the last key in the sequence generalizes
            // (enumeration of the graph's keys via a Keys property is assumed)
            foreach ( IKey key in m_graph.Keys )
                if ( m_LastKey.Generalizes( key ) )
                    m_htCache[key] = row;
        }
        m_bDone = true;
    }
}
Use this pattern when:
Sometimes a linear list of keys for a particular search sequence does not adequately represent the set of shortcut entries that provide accurate results for future search sequences. A potential solution when this problem arises is to use a directed graph (a generalized form of a tree). A directed graph defines the keys for a specific sequence along with their generalization relationships. Each key programmatically indicates whether its applicable cache entries also apply to another key.
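The DirectedGraph type is not shown in the source. One minimal reading of it (the method name KeysGeneralizedBy, the PrefixKey class, and its path encoding are all assumptions) records the keys added during a sequence and, given the key that finally matched, enumerates every recorded key the match generalizes, so each can receive a shortcut entry:

```csharp
using System;
using System.Collections;

interface IKey { bool Generalizes(IKey other); }

// Minimal stand-in for the graph used by CacheSearchSequence.
class DirectedGraph
{
    private ArrayList m_keys = new ArrayList();

    public void Add(IKey key) { m_keys.Add(key); }

    // Returns every recorded key whose applicable cache entries
    // the given key's entries also cover.
    public ArrayList KeysGeneralizedBy(IKey last)
    {
        ArrayList result = new ArrayList();
        foreach (IKey key in m_keys)
            if (last.Generalizes(key))
                result.Add(key);
        return result;
    }
}

// Hypothetical key: a shorter path prefix generalizes a longer one.
class PrefixKey : IKey
{
    public readonly string Path;
    public PrefixKey(string path) { Path = path; }
    public bool Generalizes(IKey other)
    {
        PrefixKey o = other as PrefixKey;
        return o != null && o.Path.StartsWith(Path);
    }
}

class GraphDemo
{
    static void Main()
    {
        DirectedGraph graph = new DirectedGraph();
        graph.Add(new PrefixKey("US/NV/Reno"));
        graph.Add(new PrefixKey("US/NV"));
        graph.Add(new PrefixKey("US"));

        // A match on the broadest key covers every narrower key in
        // the sequence, so each would receive a shortcut entry.
        Console.WriteLine(graph.KeysGeneralizedBy(new PrefixKey("US")).Count);   // 3
    }
}
```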