Thermal recalibration is a technique that was used on hard drives in the early and mid-1990s. As disk track spacing became smaller in the late 1980s and early 1990s, differential thermal expansion of drive components could shift the read/write heads relative to the tracks by distances approaching or exceeding the track spacing. Since the heads were positioned based on track information recorded on a dedicated servo surface, head positioning became unreliable. It became necessary to periodically test the drive and determine corrections between the apparent head positions read from the servo surface and the actual head positions. The recalibration process was rapid and had little effect on overall data transfer rates, but it could significantly delay a specific data transfer. This plays havoc with applications such as streaming audio/visual (A/V) presentation or CD-ROM burning that require a constant data flow. It can also cause serious non-reproducible anomalies in disk benchmarking.
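The contrast between average throughput and worst-case delay can be illustrated with a toy model. The timing numbers below (transfer time, pause length, recalibration interval) are made up for illustration, not taken from any real drive:

```python
# Toy model of a drive that pauses for thermal recalibration.
# All timing constants are hypothetical, chosen only to show the effect.
TRANSFER_MS = 1.0      # assumed time to service one read
RECAL_MS = 500.0       # assumed length of a recalibration pause
RECAL_EVERY = 10_000   # assumed number of reads between recalibrations

def latencies(n_reads):
    """Per-read latency: most reads are fast, but any read that
    coincides with a recalibration absorbs the whole pause."""
    out = []
    for i in range(1, n_reads + 1):
        t = TRANSFER_MS
        if i % RECAL_EVERY == 0:
            t += RECAL_MS
        out.append(t)
    return out

lat = latencies(100_000)
avg = sum(lat) / len(lat)
worst = max(lat)
print(f"average latency: {avg:.3f} ms")       # 1.050 ms: throughput barely changes
print(f"worst-case latency: {worst:.1f} ms")  # 501.0 ms: one transfer stalls badly
```

The average barely moves, but a stream that needs every read back within a few milliseconds underflows whenever the 500 ms pause lands on it, which is exactly why A/V playback and CD-ROM burning suffered.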

Special (and expensive) A/V drives were developed that hid thermal recalibration through the use of special on-drive buffers. A/V drives also used other tricks to ensure steady data flow. Some of these were based on a "better a few errors than no data" philosophy that made their use for non-A/V data problematic.

Around 1995, hard drive vendors switched to a technology called "embedded servo" that buried the track IDs between data sectors on the data surfaces. This increased the available storage from the same number of platters, since one surface was no longer dedicated to track IDs. It also eliminated the need for thermal recalibration, because the read/write heads were now positioned from information read off the data surface itself rather than from a separate servo surface.

Copyright 1994-2002 by Donald Kenney.