Saving space
Does anyone know some good methods to save space on my hard drive? I have an 8GB hard drive partitioned into a 2GB C: drive and a 6GB D: drive. The D: drive has plenty of space, but on C: I have only 460MB left, and something is wrong with my Disk Cleanup option, so I can't use it. Does anyone know how to fix that, or have an idea for saving space? I'm running Windows 2000 with Service Pack 3.
This topic is archived. New comments cannot be posted and votes cannot be cast.
Responses to this topic
Quote: Ace, my boy, you have 3 choices:
1.) Figure out which files to transfer from C: to D:, or to CD, etc.
2.) Use a utility like Partition Magic to repartition your disk.
3.) Add another hard drive. LOL, HD storage is so cheap.
There IS a 4th choice, & it is reliable, as opposed to, say, DOS 6.x's DoubleSpace/DriveSpace single-logical-compressed-disk scheme:
4.) Windows NT 3.51/4.0 & Windows 2000/XP file-by-file integrated compression.
* It works, is stable, & in SOME cases could technically even improve read performance (smaller file to read up from the drive), but the data has to go through the decompression stage, which may offset the gain in speed... then again, today's CPUs are SO fast that it may actually be faster to use it!
(Tough call there: I have NOT tested it, but I have thought about writing a small program to do so, with a high-resolution timer counting the clock ticks needed to copy/move/read/write compressed files vs. uncompressed ones.)
I use it on data I do not access a lot (readme files, docs I don't modify often, texts I keep around with technical info, etc.) & that is not of an executable nature. I leave executable-type files (.dll, .ocx, .sys, .tlb, .com, .exe, etc.) uncompressed so they load as fast as possible!
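The clock-tick experiment described above can be sketched in a few lines. This is a minimal, hypothetical harness, not the program the poster describes: it uses zlib-compressed copies as a stand-in for NTFS's own (LZNT1-based) compression, since toggling the NTFS compression attribute isn't portable, and it times a plain read against a read-plus-decompress of the same payload. All file names here are made up for the demo.

```python
import os
import tempfile
import time
import zlib

def make_files(payload: bytes):
    """Write the same payload twice: once raw, once zlib-compressed."""
    d = tempfile.mkdtemp()
    raw = os.path.join(d, "raw.bin")
    comp = os.path.join(d, "comp.bin")
    with open(raw, "wb") as f:
        f.write(payload)
    with open(comp, "wb") as f:
        f.write(zlib.compress(payload))
    return raw, comp

def timed_read(path: str, decompress: bool = False) -> float:
    """Return seconds taken to read (and optionally decompress) a file."""
    t0 = time.perf_counter()
    with open(path, "rb") as f:
        data = f.read()
    if decompress:
        data = zlib.decompress(data)
    return time.perf_counter() - t0

# Repetitive text compresses very well, so the compressed copy is far
# smaller: less disk I/O, but extra CPU work to inflate it back.
payload = b"technical notes, readme text " * 200_000
raw, comp = make_files(payload)
print(f"raw: {os.path.getsize(raw)} bytes, compressed: {os.path.getsize(comp)} bytes")
print(f"raw read:        {timed_read(raw):.4f}s")
print(f"read+decompress: {timed_read(comp, decompress=True):.4f}s")
```

On a warm OS cache the raw read usually wins, which is one reason the call is genuinely tough: the I/O savings only dominate when the disk, not the CPU, is the bottleneck.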
I don't know anyone who uses that anymore. Again, storage is so cheap, it's just not worth the effort.
Quote: P.S. => What do you think about the "conundrum" I posed up there, about NTFS-compressed files being less of a read up off the disk (especially for .exe types), but the compression layer slowing that speed gain down some? Do you think that a compressed .exe, given today's disk & CPU speeds, GAINS by being compressed, OR loses overall due to having to be decompressed first & then assembled in RAM?? apk
With the exception of those folks out there who really can't afford newer equipment, I think the problems with space and speed that were relevant a few years ago have been made totally moot by today's hardware. That doesn't mean you don't need to manage your data, but things like cheap 600+MB CD storage & fast burn times have made even long-term archiving real easy. The compression becomes irrelevant for speed too. My two WD 80GB "JB" drives are so fast. Few people are really concerned about a +/- few ms access differential anymore.
I suppose that eventually OS bloat will make this all relevant again, but until then let's just enjoy ...
Quote: I suppose that eventually OS bloat will make this all relevant again, but until then let's just enjoy ...
Heh, "bloat"... yes, again, I agree.
This stuff gets better/faster ALL the time, hardware-wise at least... but give programmers more available power, and they'll figure out how to soak it up! Personally, I don't go for a lot of the "new" ideas in coding because of said "bloat"...
Let's remember that it's not just the coding that's the bloat. People want to see more and more 'realism', so more bits of data get added to the sound, to the visuals, etc., etc. Nothing wrong with that, but it's not just text files that we're dealing with now. True 3D imaging should be a real kicker. You said you're into games, so you should know that.
I already pity anyone who doesn't have broadband.
OK, I know this is slightly off-topic, but speaking of file-size bloat, coding, etc...
Have any of you seen intro demos made for competitions? There's one in particular that I am referring to, The Product, which runs for about 15 minutes and is only 64KB. If you take the time to download and watch it, I assure you that you'll be as amazed as I was the first time I saw it. I am seriously impressed by the programming talent shown!
Adam
It's not streamed...
8)
Quote: What kind of "competitions"? Is that an app name, or literally some kind of competition?
They're actual coding competitions; from what I understand, The Party is a major one, apparently.
There's some more information on the demo that I mentioned above, here.
The group that made that demo has a homepage, too, here.
Well, back on the original topic: I use file compression on old NT servers with arrays that weren't set up with large partitions to begin with. Or, in some cases, they might not be large disks (maybe two 4GB drives in RAID 1) and need to free up some space. So rather than spend $1,000 on two more disks to work with that array (yes, on some servers I come across, the compatible disks easily cost that much), we just compress the offending directories. No big deal, and it can be either a band-aid or a long-term solution. I have used this on one server for more than two years now without any issue.
Hmmm, I wonder what effect a larger L2 cache would have on commonly used compressed files? Most likely it would increase speed drastically and be even more of a benefit for a file server if you use compression.
I recently compressed all of my file and app servers and gained a couple of gigabytes (I'm talking from 650MB to 11GB free on one of the main ones). Almost all of the files are simply pictures and text: easily compressible data. That's what's so great about file-system compression: it's transparent to the user, but you still get the benefits of compression. Unfortunately, the compression isn't as good as, say, WinRAR's, but it's just about as good as, if not better than, WinZip's. (By the way, does anyone know of a way to modify the NTFS compression ratio? I know how to view it but not how to modify it.)
Quote: And, for sure, there is no use compressing a volume or folder containing incompressible data, such as JPG images, ZIP files, etc. Ideal data for compression are text and office documents, bitmap images, and other files consisting of lots of repeating characters.
Wrong.
.JPEGs??? Umm, there's still a lot of compression to be done there.
.ZIPs? One of the least effective compression formats around. NTFS compression does a lot of good on .zip files.
There are very few situations in which NTFS compression actually increases the size, and I don't see them very often. Try it out for yourselves. NTFS compression is effective almost all the time.
Quote: Sure, Mr. Gates called it a "cesspool" recently... but I figure he's out to sell their next idea of database filesystems, which is really trickle-down from the AS/400 & System/390 IBM mainframe & midrange DB2 database-filesystem world anyhow... but one using SQL Server-type databasing; that's the rumor I heard, anyhow!
Did Gates actually say that? Do you have a link?
Besides, what does Gates know about computers these days? Not much, if you ask me....
The only situation in which I would use NTFS compression would be on archived and/or rarely used data, or in situations where highly compressible data needs to be compressed for more space on a server. Any other use of NTFS compression (except under closely monitored conditions where you KNEW it would improve performance) just seems like a reduction in performance and overall efficiency. I didn't read the entire articles; I was just going by what you posted.
Quote: Did Gates actually say that? Do you have a link?
Wasn't doubting ya. Just wanted a link!
An SQL-like database for a filesystem seems like waaay too much for desktop systems. The amount of processing time required would far outweigh any benefits, I would think, especially considering that it would mostly be used for backwards compatibility... so little benefit would be seen by common users who wish to keep compatibility across multiple OSes. I see this as something that would be highly useful in a workplace environment, but for home use it would be more trouble than it's worth. In other words... I feel sorry for those with 500MHz-and-below systems!
The SQL-like filesystem being mentioned is the core of Yukon, the next-gen SQL Server from MS. As for speed, read up on ReiserFS for *nix, and you can see some comparisons between it and ext3 (ext2 with journaling) and XFS. The new version of ReiserFS (v4.x) is faster still and looks very promising. Essentially, all MS is promising is a B-tree-style file structure that apps can interact with, and that's what ReiserFS does already (it's a little more balanced, but the same idea). MS OSes and apps (like Exchange, AD, and SQL Server) will probably be more closely tied to the filesystem than, say, PostgreSQL on a RH box running ReiserFS, so it will be interesting to see the performance charts when it comes out.
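The appeal of a queryable filesystem can be illustrated without any of those products. This is a toy sketch of the idea only (nothing here reflects Yukon's or ReiserFS's actual design): it walks a directory tree, loads file metadata into an in-memory SQLite table, and then answers a question with plain SQL instead of a hand-rolled directory walk. The table layout and query are invented for the demo.

```python
import os
import sqlite3

def build_index(root: str) -> sqlite3.Connection:
    """Index file metadata into an in-memory SQLite table so it can be
    queried the way a database filesystem would allow."""
    db = sqlite3.connect(":memory:")
    db.execute(
        "CREATE TABLE files (path TEXT, name TEXT, ext TEXT, size INTEGER, mtime REAL)"
    )
    for dirpath, _, names in os.walk(root):
        for name in names:
            p = os.path.join(dirpath, name)
            try:
                st = os.stat(p)
            except OSError:
                continue  # skip files that vanished or can't be read
            db.execute(
                "INSERT INTO files VALUES (?,?,?,?,?)",
                (p, name, os.path.splitext(name)[1].lower(), st.st_size, st.st_mtime),
            )
    db.commit()
    return db

db = build_index(".")
# "Show me the ten biggest .py files": one SQL query, no custom traversal code.
for path, size in db.execute(
    "SELECT path, size FROM files WHERE ext = '.py' ORDER BY size DESC LIMIT 10"
):
    print(f"{size:>10,}  {path}")
```

The point of contention in the thread maps directly onto this sketch: the query is cheap once the index exists, but keeping the index in sync with every file write is the processing overhead the skeptics above are worried about.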