Hitachi Deskstar 7K1000: Terabyte Storage Arrives on the Desktop
by Gary Key on March 19, 2007 8:00 AM EST - Posted in Storage
Hardware Setup
Standard Test Bed - Playback of iPEAK Trace Files and Test Application Results

Processor: AMD Opteron 170 (utilized for all tests)
Motherboard: MSI K8N Diamond Plus
RAM: 2 x 1GB Corsair 3500LL PRO - DDR400 at 2.5-3-3-7, 1T
OS Hard Drive: 1 x Western Digital 7200 RPM SATA (16MB buffer)
System Platform Drivers: NVIDIA Platform Driver 6.85
Video Card: 1 x Asus 7600GS (PCI Express, utilized for all tests)
Video Drivers: NVIDIA ForceWare 84.21 WHQL
Optical Drive: BenQ DW1640
Cooling: Zalman CNPS9500
Power Supply: Corsair HX620W
Case: Gigabyte 3D Aurora
Operating System: Windows XP Professional SP2
Our current test bed reflects changes in the marketplace over the past six months. With dual core processors proliferating and the AMD and Intel roadmaps signaling the imminent end of the single core desktop processor, we settled on an AMD Opteron 170. This change will also allow us to expand our real world multitasking benchmarks while providing a stable platform for the next six months. We are currently conducting preliminary benchmark testing under Vista with both 2GB and 4GB memory configurations. We will offer real-world Vista benchmarks once the driver situation matures, but iPEAK results will remain XP based as the application is not compatible with Vista.
Test Setup - Software
With the variety of disk drive benchmarks available, we needed a means of comparing the true performance of hard drives in real world applications. While we will continue to utilize HDTach and PCMark05 for comparative benchmarks, our logical choice for application benchmarking is the Intel iPEAK Storage Performance Toolkit version 3, which we first used in our Q2 2004 Desktop Hard Drive Comparison. The iPEAK test is designed to measure "pure" hard disk performance: we keep the host adapter constant while varying the hard drive models, so the results isolate the performance of each individual drive.
We utilize the iPEAK WinTrace32 program to record precise I/O operations while running real world benchmarks, then use the iPEAK AnalyzeTrace program to verify the integrity of each disk trace file and ensure it properly captured the activities we required. Intel's RankDisk utility plays back the workload of all I/O operations that took place during the recording. RankDisk reports results as a mean service time in milliseconds; in other words, the average time each drive took to fulfill each I/O operation. To make the data more understandable, we report the scores as an average number of I/O operations per second, so higher scores translate into better performance in all of our iPEAK results. While these measurements provide a score representing "pure" hard drive performance, the actual impact on real world applications can and will be different.
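Converting RankDisk's mean service time into the I/O rate we report is simple arithmetic: the reciprocal of the service time, scaled to seconds. The short Python sketch below illustrates the conversion; the function name and the 2.5ms sample value are ours for illustration, not part of the iPEAK toolkit or a measured result.

```python
def iops_from_service_time(mean_service_time_ms: float) -> float:
    """Convert a mean I/O service time (milliseconds) into the
    average number of I/O operations per second, so that higher
    scores mean better performance."""
    return 1000.0 / mean_service_time_ms

# Hypothetical example: a drive averaging 2.5 ms per I/O
# completes roughly 400 I/O operations per second.
print(iops_from_service_time(2.5))  # 400.0
```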
Each drive is formatted before each test run, and three runs are completed to ensure consistency in the benchmark results. The high and low scores are discarded, and the remaining median score becomes our reported result. We utilize the NVIDIA nF4 SATA ports along with the NVIDIA IDE-SW driver to ensure consistency in our playback results when utilizing NCQ, TCQ, or RAID settings. Although we test NCQ capabilities, all of our reported results are generated with NCQ off unless otherwise noted. We test the Deskstar 7K1000 with AAM and NCQ turned on, as AAM does not noticeably impact performance and this drive performs better with NCQ enabled in the majority of our tests.
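Discarding the high and low of three runs and keeping the middle value is the same as taking the median; below is a minimal sketch of how a reported score would be derived under that methodology (the run scores shown are placeholders, not actual results):

```python
def reported_score(runs: list[float]) -> float:
    """Drop the high and low of three benchmark runs and
    return the remaining median value as the reported score."""
    if len(runs) != 3:
        raise ValueError("methodology uses exactly three runs")
    return sorted(runs)[1]

# Placeholder I/O-per-second scores from three hypothetical runs:
print(reported_score([412.0, 398.0, 405.0]))  # 405.0
```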
Our iPEAK tests represent a fairly extensive cross section of applications and usage patterns for both the general and enthusiast user. We will continually tailor these benchmarks with an eye towards each drive's intended usage and feature set when compared to similar drives. In essence, although we will report results from our test suite for all drives, it is important to realize that a drive designed for PVR duty will generate significantly different scores in our gaming benchmarks than a drive designed with gaming in mind, such as the WD Raptor. This does not necessarily make the PVR drive a bad choice for those who capture and manipulate video while also gaming. Hopefully our comments in the results sections will offer proper guidance for making a purchasing decision in these situations. Our iPEAK test suite consists of the following benchmarks.
VeriTest Business Winstone 2004: trace file of the entire test suite that includes applications such as Microsoft Office XP, WinZip 8.1, and Norton Antivirus 2003.
VeriTest Multimedia Content Creation 2004: trace file of the entire test suite that includes applications such as Adobe Photoshop 7.01, Macromedia Director MX 9.0, Microsoft Windows Media Encoder 9.0, Newtek Lightwave 3D 7.5b, and others.
AVG Antivirus 7.1.392: trace file of a complete antivirus scan on our test bed hard drive.
Microsoft Disk Defragmenter: trace file of the complete defragmentation process after the operating system and all applications were installed on our test bed hard drive.
WinRAR 3.51: trace file of creating a single compressed file consisting of 444 files in 10 different folders totaling 602MB. The test is split into the time it takes to compress the files and the time it takes to decompress the files.
File Transfer: individual trace files of transferring the Office Space DVD files to our source drive and then back to our test drive. The transfer consists of 29 files totaling 7.55GB (see the throughput sketch after this list).
AnyDVD 5.9.6: trace file of the time it takes to "rip" the Office Space DVD. We first copy the entire DVD over to our source drives, defragment the drive, and then measure the time it takes for AnyDVD to "rip" the contents to our test drive. While this is not ideal, it does remove the optical drive as a potential bottleneck during the extraction process and allows us to track the write performance of the drive.
Nero Recode 2: trace file of the time it takes to shrink the entire Office Space DVD that was extracted in the AnyDVD process into a single 4.5GB DVD image.
Game Installation: individual trace files of the time it takes to install Sims 2 and Battlefield 2. We copy each DVD to our secondary test drive, defragment it, and then install each game to our source drive.
Game Play: individual trace files that capture the startup and about 15 minutes of game play in each game. The Sims 2 trace file covers the time it takes to select a pre-configured character; set up a university, downtown, and business from each expansion pack (pre-loaded); and then visit each section before returning home. Our final trace file uses Battlefield 2, playing the Daqing Oilfield map in both single player and multiplayer modes.
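For the timed tests above that move a fixed payload, such as the 7.55GB File Transfer test, the measured time also implies an average transfer rate. A minimal Python sketch of that conversion follows; the elapsed time used in the example is hypothetical, not a measured result.

```python
def average_throughput_mb_s(payload_gb: float, elapsed_s: float) -> float:
    """Average transfer rate in MB/s for a timed file copy
    (convert GB to MB, then divide by elapsed seconds)."""
    return payload_gb * 1024.0 / elapsed_s

# The 7.55GB payload matches our File Transfer test; the elapsed
# time here is a hypothetical example, not a measured result.
print(f"{average_throughput_mb_s(7.55, 130.0):.1f} MB/s")  # 59.5 MB/s
```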
74 Comments
Justin Case - Monday, March 19, 2007
"Considering the importance of data integrity in today's systems"...? You mean like, in yesterday's (or perhaps tomorrow's) systems, data corruption was considered normal or acceptable?Gary Key - Tuesday, March 20, 2007 - link
It was not meant to imply that data integrity was not or will not be important.

Spoelie - Tuesday, March 20, 2007
No, but if you lost a hard drive before, the amount of data that would be gone is nothing compared to the amount of data you lose with current hard drives. It's always a BAD thing to lose data, but it's BAD² to lose data². So it's important² to keep data² safe ;p

Justin Case - Wednesday, March 21, 2007
"Data integrity" and "drive failure" are two different things. Most data integrity issues are related to bad sectors and corrupted data (and that is why Hitachi chose to go with more platters and lower areal density - less chance of localized data corruption, but actually a slightly higher chance of "catastrophic" drive failure - namely a head crash or a dead motor). The article's author got _that_ part right.The problem was what came after it. It was just as important to "keep data safe" last year (or the year before that, etc.) as it is this year, so qualifying it as "in today's systems" makes no sense.
Gary Key - Wednesday, March 21, 2007
I changed it back to the original text. ;)
Griswold - Monday, March 19, 2007
Looking at the benchmark charts, one thing that immediately catches the eye is that your world at AT, as far as HDDs are concerned, seems to revolve around Seagate and WD only. But there are quite a few other manufacturers out there that make good drives (that surpass many of the featured drives in one way or another) - this new Hitachi beast proves it.

Go ahead and test more Samsung, Fujitsu, Hitachi, and even ExcelStor drives.
Gholam - Thursday, March 22, 2007
ExcelStor drives are refurbished IBM/Hitachi.

Gary Key - Tuesday, March 20, 2007
We finally have agreements with Samsung and Hitachi to provide review samples, so expect to see reviews of their drives ramp up quickly. We are discussing a review format for SCSI based drives at this time, and if we can do it right then expect to see this drive category reviewed later this year. We will also be introducing SSD reviews into our storage mix in the coming weeks. While I am at it, our Actual Application Test Suite will undergo several changes and be introduced in the 500GB roundup. Thanks for the comments. :)
Final Hamlet - Monday, March 19, 2007
Hmm. The only vendor I am interested in seeing added is Samsung. They have quite a market share here in Germany.

JarredWalton - Monday, March 19, 2007
My personal take is that for 99% of users, it doesn't really matter which brand you use. Seagate may win a few benchmarks, WD some others, Samsung, etc. some as well. In reality, I don't notice the difference between any of the HDDs I own and use on a regular basis. I have purchased Samsung, WD, Seagate, Hitachi, and Maxtor. Outside of the Raptors being faster in a few specific instances, without running a low level diagnostic I would never notice a difference between the drives. I suppose I'm just not demanding enough of HDDs?