Acronis Resource Center

What's a Disk Cache And Why Is It Important?

Technologists often talk about "cache" when discussing CPUs, disk drives, and computer systems. But what is it and why should you care?

A disk cache is RAM built into your hard disk. It evens out the flow of data between the relatively slow disk drive and the rest of the computer by accepting data faster than the disk itself can read or write it. To help the cache do its job, the drive normally includes logic that analyzes disk usage and pre-fetches data that is likely to be needed soon.
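To make the idea concrete, here is a minimal sketch in Python of a tiny read cache that pre-fetches the next few blocks whenever a read misses. The block counts, cache size, and read-ahead window are made-up numbers purely for illustration, and real drive firmware is far more sophisticated, but the effect is the same: once the first block of a sequential run has been read, the following blocks are already waiting in fast memory.

    from collections import OrderedDict

    # Illustrative numbers only: a "disk" of 1,000 blocks, a cache that
    # holds 8 blocks, and a read-ahead window of 4 blocks.
    DISK_BLOCKS = 1000
    CACHE_BLOCKS = 8
    READ_AHEAD = 4

    cache = OrderedDict()          # block number -> data, oldest entry first
    stats = {"hits": 0, "misses": 0}

    def read_from_platter(block):
        """Stand-in for a slow physical read; returns dummy data."""
        return f"data-{block}"

    def read_block(block):
        """Read one block, filling the cache and pre-fetching on a miss."""
        if block in cache:
            stats["hits"] += 1
            cache.move_to_end(block)       # keep recently used blocks warm
            return cache[block]

        stats["misses"] += 1
        # On a miss, fetch the requested block plus the next few blocks,
        # since sequential access makes them likely to be needed soon.
        for b in range(block, min(block + 1 + READ_AHEAD, DISK_BLOCKS)):
            if b not in cache:
                cache[b] = read_from_platter(b)
                if len(cache) > CACHE_BLOCKS:
                    cache.popitem(last=False)  # evict the oldest block
        return cache[block]

    # A sequential read of 100 blocks: only every fifth block misses,
    # because each miss pre-fetches the next four blocks.
    for blk in range(100):
        read_block(blk)
    print(stats)   # prints {'hits': 80, 'misses': 20}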

As long as the cache isn't full, the disk drive's data transfer rate is a good deal higher than its actual read/write speed.

Don't confuse the cache with the size of the disk drive or with your system RAM. The cache is a separate component: typically memory chips mounted inside the enclosure that houses the drive itself. Disk drive specifications usually list the amount of on-board cache.

This built-in cache is also different from setting aside a portion of the system RAM to use as a disk buffer, even though this is occasionally referred to as "disk cache" as well.

The disk cache is important because it has a major impact on how well your system performs on data-intensive applications, including games. Although we usually think of disk performance in terms of rotational speed and seek time, the amount of cache on the disk is at least as important to performance. In fact, you can argue that in many applications, cache size matters even more than seek time. This is especially true for applications that read and write very large files sequentially.

The benefit of cache lasts only until the cache is full. If the system is trying to send or receive data faster than the disk can write or read it, the cache will eventually fill up and the disk's performance will fall off significantly. But because disk activity on desktop systems is usually bursty, that is, it comes in bursts rather than as a steady stream of data, it is possible to put enough cache memory on a disk to stay ahead of the write requests most of the time.
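The following toy simulation, again in Python with invented throughput and cache figures, illustrates the point: as long as the bursts are spaced out enough for the cache to drain to the platters in between, every burst completes at the host's speed; with a steady stream, the cache fills up after a few bursts and the drive falls back to its slower sustained write rate.

    # A toy model of bursty writes hitting a fixed-size write cache.
    # Hypothetical figures: the host sends 4 MB bursts at 200 MB/s,
    # the platters sustain 60 MB/s, and the cache holds 8 MB.
    CACHE_MB = 8.0
    PLATTER_MB_PER_S = 60.0
    BURST_MB = 4.0
    HOST_MB_PER_S = 200.0

    def run(idle_between_bursts_s, bursts=10):
        """Count how many bursts complete at host speed vs. platter speed."""
        in_cache = 0.0
        fast, slow = 0, 0
        for _ in range(bursts):
            # The cache drains to the platters while the host is idle.
            in_cache = max(0.0, in_cache - idle_between_bursts_s * PLATTER_MB_PER_S)
            # It also keeps draining during the burst itself.
            burst_s = BURST_MB / HOST_MB_PER_S
            drained = burst_s * PLATTER_MB_PER_S
            if in_cache + BURST_MB - drained <= CACHE_MB:
                in_cache = max(0.0, in_cache + BURST_MB - drained)
                fast += 1                  # burst absorbed at host speed
            else:
                in_cache = CACHE_MB        # cache full: host waits on the platters
                slow += 1
        return fast, slow

    print(run(idle_between_bursts_s=0.1))  # (10, 0): every burst absorbed
    print(run(idle_between_bursts_s=0.0))  # (2, 8): the cache soon fills up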

What "most of the time" means depends in large part on how much cache the disk has. Ideally, the disk cache would be so large it would never fill up. But since RAM is relatively expensive compared to the other components in a disk drive (prices vary with the type of RAM used), that isn't very practical.

Disks differ widely in the size of their cache. The standard today is a minimum of 2 MB of cache, although many "high-performance" commodity drives available at retail include 8 MB. A drive with 16 MB of cache will give a further significant performance improvement, but such drives are unusual and cost more.

All other things being equal, the disk with the most cache will give you the best performance. You'll pay more, but your system will perform better.