You may never run into a situation where a cache-oblivious structure is strictly needed, but knowing that some algorithms are cache oblivious can still pay off.
For example, if you represent a set of values as a sorted array, you can perform many set operations with the (array) merge algorithm, which is cache oblivious. You avoid pointer chasing, but your structure becomes static rather than dynamic as with, say, red-black trees. That is already a win: you save memory and compute if at least one of your sets rarely changes.
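As a rough sketch of what this looks like (the function name is mine, not from any particular library): set union over two sorted arrays is a single linear merge pass, i.e. two sequential scans with no pointer chasing.

```python
def merge_union(a, b):
    """Union of two sorted lists via one linear merge pass.

    Sequential scans over both inputs; no pointer chasing,
    and the access pattern is cache oblivious.
    """
    out = []
    i = j = 0
    while i < len(a) and j < len(b):
        if a[i] < b[j]:
            out.append(a[i]); i += 1
        elif a[i] > b[j]:
            out.append(b[j]); j += 1
        else:  # present in both: emit once
            out.append(a[i]); i += 1; j += 1
    out.extend(a[i:])
    out.extend(b[j:])
    return out
```

Intersection and difference follow the same shape; only the branch that decides what to emit changes.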
But if you apply the logarithmic method [1], you get a dynamic data structure (built on top of the static one, a sorted array) and can still perform most operations without pointer chasing and with cache-oblivious merging.
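A minimal sketch of the idea, assuming the standard formulation of the logarithmic method (class and function names here are illustrative): keep O(log n) sorted runs whose sizes are distinct powers of two; an insert creates a singleton run and merges equal-size runs, like carry propagation in a binary counter.

```python
from bisect import bisect_left

def merge(a, b):
    # linear, cache-oblivious merge of two sorted lists
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:]); out.extend(b[j:])
    return out

class LogSet:
    """Dynamic set as O(log n) static sorted arrays of doubling sizes."""

    def __init__(self):
        # runs kept in decreasing size order; sizes are distinct powers of 2
        self.runs = []

    def insert(self, x):
        run = [x]
        # merge while a run of the same size exists (binary-counter carry)
        while self.runs and len(self.runs[-1]) == len(run):
            run = merge(self.runs.pop(), run)
        self.runs.append(run)

    def contains(self, x):
        # binary search in each run: O(log^2 n) total per lookup
        for run in self.runs:
            i = bisect_left(run, x)
            if i < len(run) and run[i] == x:
                return True
        return False
```

Each element is rewritten O(log n) times over its lifetime, so inserts are O(log n) amortized, and all the heavy work happens in sequential merges of sorted arrays.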
[1] https://www.scilit.com/publications/4dd8074c2d05ecc9ed96b5cf...