Hitachi Deskstar 7K1000: Terabyte Storage Arrives on the Desktop
by Gary Key on March 19, 2007 8:00 AM EST - Posted in Storage
Hardware Setup
Our current test bed reflects changes in the marketplace over the past six months. With dual-core processors proliferating and the AMD and Intel roadmaps signaling the end of the single-core desktop processor, we settled on an AMD Opteron 170. This change will also allow us to expand our real-world multitasking benchmarks while providing a stable platform for the next six months. We are currently conducting preliminary benchmark testing under Vista with both 2GB and 4GB memory configurations. We will offer real-world Vista benchmarks once the driver situation matures, but our iPEAK results will continue to be XP-based as the application is not compatible with Vista.
Test Setup - Software
With the variety of disk drive benchmarks available, we needed a means of comparing the true performance of hard drives in real-world applications. While we will continue to utilize HDTach and PCMark05 for comparative benchmarks, our logical choice for application benchmarking is the Intel iPEAK Storage Performance Toolkit version 3. We originally started using this storage benchmark application in our Q2 2004 Desktop Hard Drive Comparison. The iPEAK test is designed to measure "pure" hard disk performance: we keep the host adapter consistent while varying the hard drive models, so any difference in results comes from the drives themselves.
We utilize the iPEAK WinTrace32 program to record precise I/O operations while running real-world benchmarks. We then utilize the iPEAK AnalyzeTrace program to review each disk trace file for integrity and ensure it has properly captured the activities we required. Intel's RankDisk utility is used to play back the workload of all I/O operations that took place during the recording. RankDisk reports results as a mean service time in milliseconds; in other words, it gives the average time each drive took to fulfill each I/O operation. To make the data more understandable, we report the scores as an average number of I/O operations per second, so that higher scores translate into better performance in all of our iPEAK results. While these measurements provide a score representing "pure" hard drive performance, the actual impact on real-world applications can and will be different.
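For clarity, the conversion we apply to the RankDisk output amounts to the following (a minimal sketch in Python; the 8.5ms figure is illustrative, not a measured result):

    # Convert RankDisk mean service time (ms per I/O) into I/O operations
    # per second. Lower service times yield higher IOPS, so higher is better.
    def service_time_to_iops(mean_service_time_ms: float) -> float:
        return 1000.0 / mean_service_time_ms

    # Example: a drive averaging 8.5 ms per I/O scores ~117.6 IOPS.
    print(round(service_time_to_iops(8.5), 1))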
Each drive is formatted before each test run, and three tests are completed to ensure consistency in the benchmark results. The high and low scores are discarded, and the remaining median score is our reported result. We utilize the NVIDIA nF4 SATA ports along with the NVIDIA IDE-SW driver to ensure consistency in our playback results when utilizing NCQ, TCQ, or RAID settings. Although we test NCQ capabilities, all of our reported results are generated with NCQ off unless otherwise noted. We will test our Deskstar 7K1000 with AAM and NCQ turned on, as AAM does not noticeably impact performance and this drive performs better with NCQ on in the majority of our tests.
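In code form, the reporting rule above reduces to taking the middle value of the three runs (a sketch with illustrative scores):

    # Report the median of three runs: sorting and taking the middle value
    # discards the high and low scores in one step.
    def reported_score(runs: list[float]) -> float:
        assert len(runs) == 3, "we complete exactly three test passes"
        return sorted(runs)[1]

    # Example with illustrative IOPS scores from three passes:
    print(reported_score([112.4, 117.6, 115.9]))  # -> 115.9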
Our iPEAK tests represent a fairly extensive cross section of applications and usage patterns for both the general and enthusiast user. We will continually tailor these benchmarks with an eye towards each drive's intended usage and feature set when compared to similar drives. In essence, although we will report results from our test suite for all drives, it is important to realize that a drive designed for PVR duty will generate significantly different scores in our gaming benchmarks than a drive designed with gaming in mind, such as the WD Raptor. This does not necessarily make the PVR drive a bad choice for those who capture and manipulate video while also gaming. Hopefully our comments in the results sections will offer proper guidance for making a purchasing decision in these situations. Our iPEAK test suite consists of the following benchmarks.
VeriTest Business Winstone 2004: trace file of the entire test suite that includes applications such as Microsoft Office XP, WinZip 8.1, and Norton Antivirus 2003.
VeriTest Multimedia Content Creation 2004: trace file of the entire test suite that includes applications such as Adobe Photoshop 7.01, Macromedia Director MX 9.0, Microsoft Windows Media Encoder 9.0, NewTek LightWave 3D 7.5b, and others.
AVG Antivirus 7.1.392: trace file of a complete antivirus scan on our test bed hard drive.
Microsoft Disk Defragmenter: trace file of the complete defragmentation process after the operating system and all applications were installed on our test bed hard drive.
WinRAR 3.51: trace file of creating a single compressed file consisting of 444 files in 10 different folders totaling 602MB. The test is split into the time it takes to compress the files and the time it takes to decompress the files.
File Transfer: individual trace files of transferring the Office Space DVD files to our source drive and transferring the files back to our test drive. The content being transferred consists of 29 files with a content size of 7.55GB.
AnyDVD 5.9.6: trace file of the time it takes to "rip" the Office Space DVD. We first copy the entire DVD over to our source drives, defragment the drive, and then measure the time it takes for AnyDVD to "rip" the contents to our test drive. While this is not ideal, it does remove the optical drive as a potential bottleneck during the extraction process and allows us to track the write performance of the drive.
Nero Recode 2: trace file of the time it takes to shrink the entire Office Space DVD that was extracted in the AnyDVD process into a single 4.5GB DVD image.
Game Installation: individual trace files of the time it takes to install Sims 2 and Battlefield 2. We copy each DVD to our secondary test drives, defragment the drive, and then install each game to our source drive.
Game Play: individual trace files that capture the startup and about 15 minutes of game play in each game. The Sims 2 trace file consists of the time it takes to select a pre-configured character, set up a university, downtown, and business from each expansion pack (pre-loaded), and then visit each section before returning home. Our final trace file utilizes Battlefield 2, where we play the Daqing Oilfields map in both single player and multiplayer mode.
Standard Test Bed - Playback of iPEAK Trace Files and Test Application Results
Processor: AMD Opteron 170 (utilized for all tests)
Motherboard: MSI K8N Diamond Plus
RAM: 2 x 1GB Corsair 3500LL PRO - DDR400 at 2.5-3-3-7, 1T
OS Hard Drive: 1 x Western Digital 7200RPM SATA (16MB buffer)
System Platform Drivers: NVIDIA Platform Driver 6.85
Video Card: 1 x ASUS 7600GS (PCI Express, utilized for all tests)
Video Drivers: NVIDIA ForceWare 84.21 WHQL
Optical Drive: BenQ DW1640
Cooling: Zalman CNPS9500
Power Supply: Corsair HX620W
Case: Gigabyte 3D Aurora
Operating System: Windows XP Professional SP2
74 Comments
goldfish2 - Tuesday, March 20, 2007 - link
Just noticed a problem you may wish to address with your charts; hope this hasn't already been mentioned. Take a look at the chart 'video application timing - time to transcode DVD'.
Your times are in minutes/seconds, but it seems your chart application has interpreted the numbers as decimals and set the bar lengths on that basis. Take a look at the bar for the WD5000YS 500GB. It says 4.59; I assume this means 4 minutes 59 seconds, making the WD740GD 2 seconds slower at 5 minutes 1 second. But the bar lengths are scaled as decimals, so the bar for the WD740GD is much longer. You'll have to see if you can get your graph package to think in minutes:seconds, or enter the bar lengths as decimals (i.e. 4:30 becomes 4.5 minutes) and label them in minutes for readability.
Thanks for the review though.
Gary Key - Tuesday, March 20, 2007 - link
We have a short blurb under the Application Performance section - "Our application benchmarks are designed to show application performance results with times being reported in minutes/seconds or seconds only, with lower scores being better. Our graph engine does not allow for a time format such as 1:05 (one minute, five seconds), so this time value will be represented as 1.05."
We know this is an issue and hopefully we can address it in our next engine update (coming soon, from what I understand). I used percentage values in a previous article and that was also confusing to some degree. Thanks for the comments; they have been passed on to our web team. ;)
PrinceGaz - Tuesday, March 20, 2007 - link
The simplest and most logical solution is just to enter the time in seconds rather than minutes and seconds; even if graphed correctly, comparing values composed of two units (minutes:seconds) is difficult compared to a single unit (seconds). If two results were 6:47 and 7:04 for instance, the difference between them is much clearer if you say 407 and 424 seconds. By giving the value in seconds only, you can see at a glance that there is a 17 second difference, which translates to just over 4% (17 divided by 407/100, or 17 divided by about 4.1).
Doing the same mental calculation with 6:47 and 7:04 first involves working out the difference, with the extra step of dealing with 60 seconds to a minute. Then you have a difference of 17 seconds out of a little under 7 minutes, which isn't very helpful until you convert the 7 minutes to seconds, as it should have been originally.
That's my opinion anyway.
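For reference, the conversion PrinceGaz describes comes down to a couple of lines (a sketch using the example times from the comment):

    # Convert a minutes:seconds string into total seconds, then compare.
    def to_seconds(mm_ss: str) -> int:
        minutes, seconds = mm_ss.split(":")
        return int(minutes) * 60 + int(seconds)

    a, b = to_seconds("6:47"), to_seconds("7:04")  # 407 and 424 seconds
    diff = b - a                                   # 17 seconds
    print(f"{diff} seconds slower, a {100 * diff / a:.1f}% difference")  # ~4.2%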
JarredWalton - Tuesday, March 20, 2007 - link
Hi Gary. I told you so! Damned if you do, damned if you don't. ;) (The rest of you can just ignore me.)
PrinceGaz - Tuesday, March 20, 2007 - link
How can you say only two years?
The 14 years you say it took to increase from 1GB to 500GB represents a doubling of capacity nine times, or roughly 1.56 years (19 months) per doubling. That means the two years (actually 20 months, as Hitachi released a 500GB drive in July 2005) it took to double again, from 500GB to 1TB, is actually marginally longer than average.
It would be more accurate to say that the trend of capacities doubling roughly every 18 months is continuing.
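Worked through, the doubling-rate arithmetic above looks like this (a quick sketch using the figures cited in the comment):

    import math

    # Going from 1GB to 500GB is log2(500) ~ 8.97, i.e. roughly nine doublings.
    doublings = math.log2(500 / 1)
    months_per_doubling = (14 * 12) / doublings  # 14 years of capacity growth
    print(f"{doublings:.2f} doublings, one every {months_per_doubling:.1f} months")
    # -> 8.97 doublings, one every 18.7 months (the "roughly 19 months" above)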
patentman - Tuesday, March 20, 2007 - link
The two-year remark is two years from the first commercial perpendicular recording drive. Perpendicular recording has been in the works for a long time. In fact, when I used to examine patent applications for a living, there was patent literature related to perpendicular recording all the way back in 1990-1991, albeit for relatively simple aspects of the device.
Gary Key - Tuesday, March 20, 2007 - link
The averaging of the time periods does work out to a doubling of capacities every 18-20 months, but the last doubling took about 26 months to go from 250GB to 500GB.
mino - Wednesday, March 21, 2007 - link
Yes, but the first 250GB drives were 4-platter 5400RPM ones (Maxtor?)... The first 500GB drives were 5-platter 7200RPM ones.
IMO there are little discrepancies in the trend caused by the worry over many-platter drives after the 75GXP. After a few years Hitachi came back with the 7K400 and the curve just returned to the values it lost before...
scott967 - Tuesday, March 20, 2007 - link
On these big drives, is NTFS performance an issue at all?
scott s.
AbRASiON - Monday, March 19, 2007 - link
Too slow, too much money, too little space. I've owned 3 and sold them.
When are we going to see a 15K RPM Savvio 2.5" review?
When will we see a 180GB-per-platter, 32MB, 10,000RPM new series Raptor?
Maybe WD should also make a 15K RPM 2.5", 32MB model.
These incremental speed upgrades on hard disks are terrible :( We need more, much much more.