Windows 2000 Distributed File System Performance

59 Posts
Joined 2000-02-22
Hey Everyone
 
I'm looking for some opinions about the performance of Windows 2000 Distributed File System (DFS) when dealing with a very large number of small files. Please reply with your personal experiences using DFS and any impact you saw on server performance.
 
I plan to implement a DFS share to replicate the wwwroot on 3 web application servers. The wwwroot contains about 230,000 small files in 24,000 folders. Total size is about 3 GB, but the files rarely change. All three servers are large machines running Windows 2000 Server (SP3) with hardware SCSI RAID, connected via a 100 Mbit LAN. I fully understand how to implement this, but am looking more for input on the performance hit I can expect when replicating such a large number of files.
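 
As a rough back-of-the-envelope (assuming the whole 3 GB tree gets copied once to each replica over the 100 Mbit link):
 
3 GB x 8 bits/byte ≈ 24,000 Mbit
24,000 Mbit / 100 Mbit/s ≈ 240 s, call it 4-5 minutes of raw wire time per replica
 
So the raw transfer itself looks trivial. My worry is the per-file overhead, since FRS stages and checksums every file individually, and 230,000 small files could stretch the initial sync well past that.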
 
Thanks for any help,
 
Evan


Responses to this topic


1047 Posts
Joined 2000-04-17
I have never heard of DFS. Is it some other kind of file system, like NTFS?

1615 Posts
Joined 2000-03-25
Funny that you bring that up, because I use DFS on my .NET servers to do the exact same thing. I have the wwwroot for all the websites I have built on 2 computers using DFS. Updates to files happen almost instantaneously. What DFS does is bind the 2 or more folders together into what they call a domain root. Then you can work on that share and it will auto-replicate to the others. It is pretty cool too, because the share is virtual, like \\nameofdomain\nameofdomainroot, so it is really easy to get to. Then you can just point all your IIS home dirs at that share as well. It has always worked great for me on .NET; I am not sure how it works in 2k, but I imagine it is just the same.
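 
If you want to see it without the GUI, the setup looks roughly like this from the command line (dfscmd ships with Windows 2000 Server; the domain, root, and server names below are just placeholders, and creating the root itself plus turning on FRS replication is done in the DFS snap-in):
 
rem map a link under the domain root to the first server's share,
rem then add the other web servers as replica members
dfscmd /map \\MYDOMAIN\dfsroot\wwwroot \\WEB1\wwwroot "wwwroot replica set"
dfscmd /add \\MYDOMAIN\dfsroot\wwwroot \\WEB2\wwwroot
dfscmd /add \\MYDOMAIN\dfsroot\wwwroot \\WEB3\wwwroot
 
rem list the root and its links to double-check
dfscmd /view \\MYDOMAIN\dfsroot /full
 
Then each IIS home directory can point at \\MYDOMAIN\dfsroot\wwwroot instead of a local path (site number 1 here is just an example):
 
cscript c:\inetpub\adminscripts\adsutil.vbs set w3svc/1/root/path "\\MYDOMAIN\dfsroot\wwwroot"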

59 Posts
Joined 2000-02-22
OP
Thanks for your reply, Four and Twenty. How many files do you have in your wwwroot? Did you notice any performance hit?
 
pmistry, here's a great link to learn more about DFS: http://www.labmice.net/Windows2000/FileMgmt/DFS.htm. It's a pretty sweet way of giving yourself fault-tolerant file shares.

1615 Posts
Joined 2000-03-25

 
No performance hit that I can see.
DFS works great.

1615 Posts
Joined 2000-03-25
Holy crap, you have a lot of files!
I just write my web apps by myself for small local businesses.
What the hell kind of apps are you guys writing?

59 Posts
Joined 2000-02-22
OP
Hehe, yes, we have WAY too many files. The problem is that our web software was written by someone who thought it best that every client have their own folder (with a copy of the master version files) and every client project have its own sub-folder (also with a copy of the master version files). Multiply this by 3 major products (and multiple versions of each) and we have a lot of the same files in different folders, with a few customizations here and there. These files will eventually be archived, but not in the near future. Luckily our new versions of the software will be less file-intensive.