Disk corruption with 4 TB GPT drive

  • Thread starter Chuck Forsberg WA7KGX N2469R

Chuck Forsberg WA7KGX N2469R

I initialized a 4 TB drive with GPT, then put a single NTFS partition on
it with Win7 disk management. The result shows as a 4 TB drive.

I use this drive to record video from NASA select.
Unfortunately Windows 7 eventually corrupts this drive.

Some years ago I had the same problem with a 3 TB drive.

An older motherboard running Linux has no problems storing data on 3 TB
drives.

I am not trying to boot Windows from a 4 TB drive. I just want Windows
to store the data without corruption. Surely someone at MS must be
aware of what Windows 7 is doing.
 

Ken1943

Could you read the S.M.A.R.T. data? Maybe that would show something.
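
A quick built-in check is the WMIC disk status query; it only reports
the drive's own overall pass/fail verdict, so a dedicated SMART tool
will show much more detail. Something like:

wmic diskdrive get model,status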


KenW
 

charlie

Which version of Win 7 (32-bit or 64-bit), and so forth?
It may also have to do with the amount of RAM available.
 

Paul

I can paint you a scenario.

On WinXP, if I write a lot of NTFS data, memory fragmentation seems
to happen, such that the percentage of CPU used by the file system
grows over time. (Take note of CPU usage when the recording session
has just started, then check back in an hour or two and see if the
percentage is higher than it was. If it's rising, you're in trouble;
if it isn't, maybe that OS doesn't have this bug.)
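
If you'd rather log numbers than eyeball Task Manager, the built-in
typeperf counter tool can sample CPU usage on a schedule. The interval
and sample count here are just an illustration (one sample a minute
for 8 hours, written to a CSV file):

typeperf "\Processor(_Total)\% Processor Time" -si 60 -sc 480 -o cpulog.csv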

After writing continuously for around 8 hours, this becomes so bad
under WinXP that I end up with a "delayed write failure" event. That
means the write attempt was so slow (bandwidth drops to such a low level)
that it timed out (didn't complete in 5 seconds or whatever).

This problem doesn't seem to exist on Windows 8. I'm not
really sure about Windows 7. My laptop has the Windows 7 install,
and doesn't have a lot of I/O options for me to test with.
(Testing a large disk over USB2 would suck.)

I would recommend using a program like "dd" to test I/O
on your computer, and seeing if you can reproduce the problem
that way. It writes at a fair rate without using
a lot of CPU (for dd.exe at least).

http://www.chrysocome.net/dd

The largest file I've written with that program would be around
500GB, to my 2TB disk, and that worked fine in Windows 8.

The following would write a file of roughly 4TB. Try something in that
range and see how long it runs before dying, and check for a delayed
write failure in Event Viewer (or even on the screen).

dd if=/dev/zero of=K:\testbig.bin bs=1048576 count=4000000
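
If you want to check the System log from a command prompt instead of
the Event Viewer GUI, something along these lines should work. Event
ID 50 is the usual "delayed write failed" warning, though the exact
ID and source on your system are worth confirming:

wevtutil qe System /q:"*[System[(EventID=50)]]" /f:text /rd:true /c:5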

No matter what test cases you run, it's going to take a while.

You can use SMART statistics to evaluate the drive's physical health, but
problems like this can't all be blamed on the drive. For example,
the free version of HDTune can display SMART stats.

http://www.hdtune.com/files/hdtune_255.exe

In this example screenshot, the "Reallocated Sector Count" data column
is 0, and the "Current Pending Sector Count" data column is 0 as well.
That tells me the disk is OK. Even though there are "yellow marks"
in the screenshot, they're for things which involve a
misinterpretation of the SMART data. No program is perfect
at this sort of thing, and the free version of that program
doesn't receive any updates over time.

http://img94.imageshack.us/img94/2460/hdtunesample.gif

HTH,
Paul
 

Paul

Something else you can try as a workaround is to
reformat the partition as NTFS with a 64KB allocation unit size.
The normal size is 4KB. At least that has helped with the
I/O rate on one of my disks here. There's no reason to believe
it will help with your problem, though. It's just another
option for those cases where you're writing a small number
of very large files. It would not be a very good option for
a disk full of 2KB data files; only my backup partition is
formatted this way. Using a non-standard size also
disables the NTFS compression option. Using a non-standard
size is for when you want to throw "Compatibility with Everything"
out the window.

http://ss64.com/nt/format.html

/A:size Allocation unit size.
Default settings (via /F) are strongly recommended for general use.
NTFS supports 512, 1024, 2048, 4096, 8192, 16K, 32K, 64K.

The default is 4096.
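
As a concrete example, to reformat an existing volume with 64KB
clusters (drive letter K: is just a placeholder, and /Q does a quick
format, erasing the volume's contents):

format K: /FS:NTFS /A:64K /Q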

Paul
 

JJ

Since you had the same problem with other drives, the cause is probably a
bad driver. Try updating the driver for the disk controller if a newer one
exists. Bad RAM module(s) may also cause this, but that would likely cause
system crashes or freezes too. The worst case is a bad motherboard
design, even if the chipset itself is fine.
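
One way to see which storage driver is actually loaded is the built-in
driverquery tool. The driver names below (the Win7 in-box AHCI driver
plus common Intel and Marvell storage driver names) are only examples
of what to filter for:

driverquery /v /fo list | findstr /i "msahci iastor mv91"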
 

Chuck Forsberg WA7KGX N2469R

The 3 TB drives are working happily in a Linux system.

2 TB drives in the system in question have operated normally without
corrupting data.

For this next round I put the 4 TB drive on a port served by a Marvell
controller and formatted it with 64KB clusters. So presumably this will
involve a different driver.
 

Paul

Test it carefully before deciding it's fixed.

This is one of the reasons that, for people who own >2.2TB drives,
I *always* recommend a "test fill" of the drive, to make sure
you didn't miss anything. There is nothing worse than relying
on a new fat drive, perhaps not having backups as you go along,
then having the drive corrupt for some reason. You should test
the thing, filling it with fake files or the like, until you're
satisfied there isn't a problem hiding in the weeds, waiting for
the day.
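
As a rough sketch of a test fill, using the same dd.exe as earlier
(the drive letter, file size, and count are placeholders; 3800 one-GiB
files is roughly 3.7 TiB, enough to walk well past the 2.2TB line):

for /L %i in (1,1,3800) do dd if=/dev/zero of=K:\fill_%i.bin bs=1048576 count=1024

(Use %%i instead of %i if you run that from a batch file.)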

At one time, I used to test the 137GB barrier by filling with
1GB files to 136GB, then "walking across" the barrier, just in
case I missed something there. (I discovered a problem with a
Firewire enclosure that way.) Now, the challenge has moved
to 2.2TB.

You can find helpful docs on the disk manufacturer's web site as well.
For example, they mention that an older Intel driver isn't
happy above 2.2TB. That's likely fixed in any newer Intel driver.

http://www.hgst.com/tech/techlib.ns...0235119/$file/Deskstar_3TB_FAQ_finalwebv2.pdf

"Yes, as of this posting (November, 2010) the native
Intel HDD drivers do not support high capacity hard drives."

Paul
 
