Over the years I have recovered many hard drives formatted with NTFS. One of the leading reasons data recovery is performed on these drives is an anomaly in the Master File Table. This area of the drive is the single most important set of data stored on your system. The Master File Table houses the attributes and the cluster placement of every file on your system: security attributes, file name attributes, date and time stamps, and a run list, something like a miniature FAT, that points to every cluster where a particular file is stored.
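To make the "mini FAT" comparison concrete, here is a small sketch in Python of how an NTFS run list can be decoded. It assumes you already have the raw run-list bytes extracted from a non-resident attribute (such as a file's $DATA attribute); the function name is mine, not part of any standard tool.

```python
def decode_run_list(raw: bytes) -> list[tuple[int, int]]:
    """Decode an NTFS run list into (starting cluster, cluster count) pairs.

    Each run begins with a header byte: the low nibble is the size in
    bytes of the run-length field, the high nibble the size of the
    cluster-offset field. Offsets are signed deltas from the previous
    run's start, which keeps even fragmented files compact on disk.
    """
    runs = []
    pos = 0
    lcn = 0  # logical cluster number, accumulated across runs
    while pos < len(raw) and raw[pos] != 0x00:  # a 0x00 byte ends the list
        header = raw[pos]
        len_size = header & 0x0F
        off_size = header >> 4
        pos += 1
        count = int.from_bytes(raw[pos:pos + len_size], "little")
        pos += len_size
        delta = int.from_bytes(raw[pos:pos + off_size], "little", signed=True)
        pos += off_size
        lcn += delta
        runs.append((lcn, count))
    return runs

# A file stored in two fragments: 24 clusters at LCN 22068 (0x5634),
# then 16 clusters starting 32 clusters earlier (a negative delta).
print(decode_run_list(b"\x21\x18\x34\x56\x11\x10\xE0\x00"))
# → [(22068, 24), (22036, 16)]
```

Every cluster of every file is reachable this way, which is exactly why a saved copy of the Master File Table is so valuable: the run lists inside it are the map to the data.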
It has also been my experience that if a previous copy of the Master File Table had been saved off to a file on a remote site, I could have easily imported that file and used it to recover the data. In other words, it is rarely the case that an entire file system gets totally wiped out. It is usually some small piece of information, either corrupted or omitted from the Master File Table, that causes the problem. Even a restore disk that destroys all remnants of the file system on a hard drive cannot keep a backup copy of the Master File Table from recovering some data.
How, you may ask, can this be? Well, grasshopper, read on and see. Imagine a book, preferably a reference book. Now, let us define the attributes of a reference book. Let's see: there is a foreword, where the author may offer a few remarks so we know how intelligent he is. There is a table of contents, which gives you a general idea of what is in the book and where it is located. There is the body of the book, the actual information. Last but not least, there is an index: a detailed listing, with page numbers, that tells you exactly where the data you are looking for lives. For illustration purposes we can say that the index of the book is the Master File Table, and the body of the book is the data on your hard drive. If the index is ripped out of the back of the book, how would it be possible to find the information you are looking for? I suppose you could wade through the entire book and possibly, after several hours of searching, find the answers. I have done that with some of my older books where the back and the front of the book have disappeared. A book may have 200, 300, 400, maybe even 500 pages to look through, and if the information is important enough it is worth the look. However, wouldn't it have been easier if I had just photocopied the index and placed the copy in a nice safe place? Then, when the book gets old and I lose the index, I have this nice copy to help me find my information.
Leafing through a 500-page book may be time consuming, but it is feasible. Now apply that same logic of the index and the book to a hard drive. Who wants to scan through 234,000,000 sectors looking for data? If the data is fragmented, it is probably lost. Wouldn't it have been nice to have a copy of the Master File Table to use to find all of your old tax returns, or your doctoral thesis, or the only pictures of your grandson's birth? I would say, "Yeah!! It would've been nice!"
Please don’t get the wrong idea. This is not the same as a full backup on another set of media; there are holes in this approach. First, if the drive itself goes bad, then it will be difficult if not impossible to get the data back. Second, anything that writes over the data portion of the drive will make the saved Master File Table useless. However, it takes a long time to destroy a 250 GB hard drive's data area. Lastly, I have not been able to find a piece of software that simply dumps the Master File Table to a remote site. It looks like someone should write one.
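The first job such a tool would have is locating the Master File Table, which the NTFS boot sector tells you how to do. The sketch below, in Python, is one way it might start: the boot-sector field offsets are standard NTFS, but `dump_mft_start`, the volume path, and the output file are my own illustration of a hypothetical tool, not an existing program. Note that opening a raw volume such as `\\.\C:` on Windows requires administrator rights, and a real tool would follow the MFT's own run list to capture the whole, possibly fragmented, table rather than just its first records.

```python
import struct

def parse_ntfs_boot_sector(sector: bytes) -> dict:
    """Pull the fields needed to locate the MFT out of an NTFS boot sector."""
    bytes_per_sector = struct.unpack_from("<H", sector, 0x0B)[0]
    sectors_per_cluster = sector[0x0D]
    mft_lcn = struct.unpack_from("<Q", sector, 0x30)[0]  # MFT's first cluster
    # Clusters per MFT record is a signed byte; a negative value n means
    # each record is 2**(-n) bytes (commonly -10, i.e. 1024-byte records).
    clusters_per_record = struct.unpack_from("<b", sector, 0x40)[0]
    cluster_size = bytes_per_sector * sectors_per_cluster
    record_size = (2 ** -clusters_per_record if clusters_per_record < 0
                   else clusters_per_record * cluster_size)
    return {
        "cluster_size": cluster_size,
        "mft_offset": mft_lcn * cluster_size,  # byte offset of the MFT
        "record_size": record_size,
    }

def dump_mft_start(volume_path: str, out_path: str, records: int = 16) -> None:
    """Copy the first few MFT records to a backup file (hypothetical tool)."""
    with open(volume_path, "rb") as vol:  # e.g. r"\\.\C:" on Windows, as admin
        geo = parse_ntfs_boot_sector(vol.read(512))
        vol.seek(geo["mft_offset"])
        data = vol.read(records * geo["record_size"])
    with open(out_path, "wb") as out:  # ship this file somewhere safe
        out.write(data)
```

Saving that output file to another machine is the photocopied index from the book analogy: a small file, kept off the drive, that maps every cluster the drive's files occupy.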