Gary S. Terhune
Re: disk defragmenter
What a total crock. You haven't made any tests worth speaking of. You've
made some observations of one drive. Well, I've made hundreds of similar
observations and they contradict yours 99.99% of the time.
In the case of defragmenting wearing out the drive, you didn't even do any
tests. You just repeated some gossip.
--
Gary S. Terhune
MS-MVP Shell/User
www.grystmill.com
"Lord Turkey Cough" <spamdump@invalid.com> wrote in message
news:64Atj.541$%W6.439@newsfe2-gui.ntli.net...
> The proof of the puffing is in the eating.
> All the detailed explanations in the world will not change
> test results which contradict them.
>
> It's a case of once bitten, twice shy, and I have been bitten
> two or three times by the defragging myth. I don't intend to get
> bitten again.
>
>
>
> "MEB" <meb@not here@hotmail.com> wrote in message
> news:%23Q8LId4bIHA.4196@TK2MSFTNGP04.phx.gbl...
>>
>> I debated whether to get into this ridiculous argument, but I'll just
>> add this comment, which happens to coincide with most of the material
>> on the subject. [Oh boy, another web page]
>>
>> Fragmentation has a significant impact on hard drives, particularly in
>> the NT/XP environment, because files are not stored in the fashion the
>> uninformed would generally assume.
>>
>> Ponder these overly simplified explanations:
>>
>> Many files are created or modified each time their application is run
>> or accessed, and the system does not necessarily use the next available
>> hard drive space for those file segments/additions; it may place them
>> in any unused space on the disk. This creates files which might extend
>> from the base address [the FAT entry or MFT record] to anywhere else on
>> the partition, linked by chained cluster pointers [the FAT chain, or
>> NT's run lists] which indicate where the next needed segment lives.
>> Each segment may in turn point to a next segment that sits at the
>> opposite end of the disk/partition. Picture that happening several
>> dozen times during the access of that one file. Meanwhile, the hard
>> drive controller, the OS, and the allocation algorithms in use may
>> place other segments elsewhere on the disk, either temporarily or
>> permanently.
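>>
>> To make that chain-following concrete, here is a rough Python sketch of
>> walking a FAT-style cluster chain; the table values are invented for
>> illustration, and the fragment-counting helper is hypothetical, not
>> part of any real tool:
>>
>> # Hypothetical FAT-style chain: each entry names the next cluster of
>> # the file; -1 marks end-of-file. All values are invented.
>> fat = {100: 101, 101: 102, 102: 7040, 7040: 7041, 7041: 12, 12: -1}
>>
>> def walk_chain(fat, start):
>>     """Follow the chain from `start`, counting fragments. A new
>>     fragment begins whenever the next cluster is not the one
>>     immediately following the current cluster."""
>>     clusters, fragments = [], 1
>>     cur = start
>>     while cur != -1:
>>         clusters.append(cur)
>>         nxt = fat[cur]
>>         if nxt != -1 and nxt != cur + 1:
>>             fragments += 1   # non-contiguous jump: the head must seek
>>         cur = nxt
>>     return clusters, fragments
>>
>> clusters, fragments = walk_chain(fat, 100)
>> print(clusters)   # [100, 101, 102, 7040, 7041, 12]
>> print(fragments)  # 3 fragments -> 2 extra seeks for this one file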
>> Think of a large file and picture the number of additional head
>> movements needed to access JUST that one file, and the extra time
>> [additional milliseconds per seek] needed. Then consider that there are
>> likely a dozen or more additional files [DLLs and other EXEs, etc.]
>> needed for that one application, each also fragmented, each putting the
>> head through that same whipping motion, picking up a fragment here and
>> there...
>> Now picture that the application has a database; new information is
>> added to that database, but each addition is stored wherever it happens
>> to be created. After running that same application and saving those new
>> bits of data, the database now exists in several thousand
>> non-contiguous sectors of the hard drive. To view or access that
>> database, ALL those segments must be found and brought together for the
>> visual display, so these scattered bits are temporarily collected in
>> the swap file and/or memory.
>> All of this, of course, takes more head movement and time than if the
>> files were contiguous and the application's other needed files were
>> also closer together.
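>>
>> Do the arithmetic [the drive figures are typical late-'90s IDE values,
>> assumed here purely for illustration]:
>>
>> # Back-of-the-envelope seek overhead for a badly fragmented database.
>> avg_seek_ms = 12.0   # assumed average seek time for an IDE drive
>> fragments   = 2000   # assumed pieces the database has splintered into
>> extra_seeks = fragments - 1
>> overhead_s  = extra_seeks * avg_seek_ms / 1000.0
>> print(f"{overhead_s:.1f} s of pure seek time")  # ~24.0 s
>>
>> Twenty-odd seconds of nothing but head travel, before a single byte is
>> actually read.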
>>
>> A good indication is when intermittent Windows errors begin to show up
>> for no apparent reason, or hard drive access times become excessive. If
>> one boots to Safe Mode and shuts off Windows' handling of virtual
>> memory [the swap file], deletes the win386.swp file after a restart to
>> DOS, restarts to Safe Mode and defrags, then turns Windows management
>> back ON when done, behavior after restarting to Normal Mode will be
>> noticeably improved. Part of the reason is that the swap file is no
>> longer scattered all over the disk, but contiguous [FAT systems]. NT's
>> defragmentation is of course different, as are the results.
>>
>> Regarding new installations and defragging:
>>
>> A major misconception is that a newly installed OS is defragmented and
>> arranged compactly on the disk. As the files are expanded, various
>> areas of the disk are used to hold temporary copies of those files in
>> whatever space is available. Each file may first be copied, then
>> expanded, then added to the proper directory; or it may be placed in
>> temporary storage pending installation order, then listed as part of
>> some directory.
>> Each time a file is written, it takes up space on the drive, which may
>> or may not be the next contiguous area, and which may be several
>> scattered areas of the disk [segments of other files may already occupy
>> an area which might otherwise have been used].
>> The directories themselves [via the table] assign the "base" area, then
>> list the various temporary and permanent locations of the files under
>> the various directories. Nothing at this stage requires that these
>> files actually be assigned an area of the disk where all of a
>> directory's files sit within one specific segment, e.g., one file after
>> the other or one sector after the other. Continuing to use a newly
>> installed disk without ever defragmenting it will eventually cause
>> errors and, at minimum, slower loading times, noticeable after extended
>> usage.
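>>
>> A toy first-fit allocator shows why [a simulation only, NOT how any
>> real FAT driver is implemented]:
>>
>> # Toy first-fit allocator: the installer's interleaved copy/expand/
>> # delete pattern fragments files even on a brand-new, empty disk.
>> disk = [None] * 32   # None = free cluster
>>
>> def allocate(disk, name, n):
>>     """Grab the first n free clusters, wherever they happen to be."""
>>     placed = []
>>     for i, c in enumerate(disk):
>>         if c is None:
>>             disk[i] = name
>>             placed.append(i)
>>             if len(placed) == n:
>>                 break
>>     return placed
>>
>> def free(disk, name):
>>     for i, c in enumerate(disk):
>>         if c == name:
>>             disk[i] = None
>>
>> allocate(disk, "A", 4)         # temporary copy of an archive
>> allocate(disk, "B", 3)         # a file expanded right after it
>> free(disk, "A")                # the temporary copy is deleted...
>> print(allocate(disk, "C", 6))  # [0, 1, 2, 3, 7, 8] -- "C" lands in
>>                                # the hole AND past "B": fragmented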
>>
>> The first defragmentation done on a disk attempts to align the various
>> individual segments of the files into contiguous areas/segments. If one
>> has used something like "Align for faster Windows loading" [MS Defrag;
>> not a recommended setting], then the files are arranged according to
>> the monitored access patterns, the space required while running, and
>> other factors logged [by taskmon] in C:\WINDOWS\APPLOG\ [using a
>> Logitech mouse as an example, LOGI_MWX.LGC], to supposedly place each
>> file in an area conducive to its loading and to any additional space it
>> might require while loading/running [some EXE files temporarily expand
>> on disk and become fragmented in the process]. IF this is a fairly new
>> system, or taskmon has been disabled, then the files will NOT be
>> arranged properly, as there is not enough saved detail.
>> Successive defragmentations generally take less time and decrease file
>> movement. Watch a defragment tool: it checks the FATs, then folders,
>> then files, and only adjusts what is or has become fragmented [unless
>> one uses the "faster loading" option, which WILL constantly move files
>> around based upon the .LGC log files {which is why it's not
>> recommended}].
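>>
>> The basic move a defragmenter makes can be sketched in a few lines
>> [again a naive illustration with a toy file table, not MS Defrag's
>> actual routine, which must also survive a crash mid-move]:
>>
>> # Naive defrag pass: copy each file's clusters, in file order, into
>> # one contiguous run, then rewrite the (toy) file table to match.
>> def defragment(disk, table):
>>     """disk: list of cluster contents; table: {name: cluster indexes}"""
>>     new_disk = [None] * len(disk)
>>     new_table = {}
>>     cursor = 0
>>     for name, clusters in table.items():
>>         run = []
>>         for c in clusters:              # read the fragments in order
>>             new_disk[cursor] = disk[c]  # write them back contiguously
>>             run.append(cursor)
>>             cursor += 1
>>         new_table[name] = run
>>     return new_disk, new_table
>>
>> disk  = ["B0", None, "A0", "A1", None, "B1"]
>> table = {"A": [2, 3], "B": [0, 5]}
>> print(defragment(disk, table))
>> # (['A0', 'A1', 'B0', 'B1', None, None], {'A': [0, 1], 'B': [2, 3]})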
>>
>> Now, should any wish to complain that this provides no conclusive
>> proof, then they should get off their dead behinds and actually look at
>> a fragmented disk and a defragmented disk with a hex/disk editor. THEN
>> come back and post; maybe someone will listen, though I doubt it...
>> Or, if you prefer visual displays, run MS Defrag, look at the details,
>> and watch it move files around trying to place all the file segments
>> together.
>>
>> The short answer is: defragmentation can decrease OS access times and
>> reduce wear and tear on your hard disk. Load times ARE NOT a definitive
>> indicator of problems with defragmentation itself, but with the
>> routines used, INCLUDING a fragmented swap file.
>>
>> --
>>
>> MEB
>> http://peoplescounsel.orgfree.com
>> _________
>>
>>
>>
>>
>
>