
File server management dos and don'ts

Jan Stafford

IT managers can blame operating system vendors' proprietary standards for some, but not all, of the headaches caused by file server management, said T. Reid Lewis, president and founder of Arlington, VA-based Group Logic, Inc., a file and print server software vendor. Also to blame are the poor performance and/or poor implementation of the protocols themselves.

Dealing with any one of these issues can cause a headache, but tackling all three at once can bring on a migraine. So, searchWindowsManageability asked Lewis to offer some headache relief by providing some expert tips on taming file servers.

Do learn from IT managers' past problems with proprietary standards. Lewis offered an example: "For years, Novell file servers required IPX/SPX, which clogged up a network already loaded with Microsoft protocols and, more often than not, TCP/IP." The solution? IT managers jumped ship and moved their Novell servers to TCP/IP to eliminate the conflict.

Do eliminate non-standard protocols, such as AppleTalk and IPX/SPX, from your network, since they are less efficient and slower than TCP/IP. Because OS-based file-sharing facilities for mixed networks, such as Services for Macintosh in Windows NT and 2000, often rely on these protocols, you may need to consider third-party file server products.
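
Before ripping a protocol out, it can help to see how much of it is actually left on the wire. The sketch below is a rough sampling tool, not part of any product mentioned here: it uses a Linux raw socket (run as root), assumes Ethernet II framing, and tallies EtherTypes so IPX (0x8137) and AppleTalk (0x809B) frames stand out against IPv4. The 500-frame sample size is an arbitrary placeholder.

    # Tally EtherTypes from a short packet sample to spot lingering
    # legacy-protocol traffic (IPX, AppleTalk) on the local segment.
    import socket
    import struct
    from collections import Counter

    ETHERTYPE_NAMES = {
        0x0800: "IPv4",
        0x0806: "ARP",
        0x8137: "IPX",
        0x809B: "AppleTalk",
    }

    def sample_ethertypes(count=500):
        # AF_PACKET + SOCK_RAW captures whole frames; 0x0003 is ETH_P_ALL.
        sock = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.ntohs(0x0003))
        tally = Counter()
        for _ in range(count):
            frame = sock.recv(65535)
            ethertype = struct.unpack("!H", frame[12:14])[0]  # bytes 12-13 of an Ethernet II frame
            tally[ETHERTYPE_NAMES.get(ethertype, hex(ethertype))] += 1
        sock.close()
        return tally

    if __name__ == "__main__":
        for proto, frames in sample_ethertypes().most_common():
            print(f"{proto}: {frames} frames")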

Do put as much memory on your file server as possible. "File servers typically use RAM for caching operations and optimized transfer, so your clients won't have to wait on the actual storage mechanism," Lewis explained.
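
As a toy illustration of that point (not how any particular file server implements its cache), the Python sketch below reads the same file twice through an in-memory LRU cache: the first read pays the storage latency, the second is served from RAM. The 4 MB sample file is a throwaway placeholder, and the operating system's own page cache may already blur some of the difference.

    # Compare a cold read (hits storage) with a warm read (served from a RAM cache).
    import os
    import tempfile
    import time
    from functools import lru_cache

    @lru_cache(maxsize=1024)              # RAM-backed cache, keyed by file path
    def read_file(path):
        with open(path, "rb") as f:       # a cold call pays the storage latency
            return f.read()

    if __name__ == "__main__":
        fd, path = tempfile.mkstemp()
        os.write(fd, os.urandom(4 * 1024 * 1024))   # 4 MB throwaway payload
        os.close(fd)

        for label in ("cold", "warm"):
            start = time.perf_counter()
            read_file(path)
            print(f"{label} read: {(time.perf_counter() - start) * 1e3:.2f} ms")

        os.remove(path)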

Do get the fastest storage mechanism you can. While file servers will use extra RAM to make up for a slow disk, eventually that RAM will fill up unless your disk is quick.
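
A crude way to check whether the disk is quick enough is a sequential-write test like the sketch below. It is a rough sanity check with placeholder sizes and file name, not a replacement for a proper I/O benchmark, and results will vary with caching and fragmentation.

    # Write a fixed amount of data in blocks, force it to disk, and report MB/s.
    import os
    import time

    def write_throughput(path="bench.tmp", total_mb=256, block_kb=64):
        block = b"\0" * (block_kb * 1024)
        start = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range((total_mb * 1024) // block_kb):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())   # make sure data reaches the storage mechanism, not just RAM
        elapsed = time.perf_counter() - start
        os.remove(path)
        return total_mb / elapsed

    if __name__ == "__main__":
        print(f"~{write_throughput():.0f} MB/s sequential write")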

Do invest in network bandwidth technology. Gigabit Ethernet on your server will deliver better performance, but you must complement it with appropriate switching throughout your organization. Make sure that all your systems that are connected to switches are running in full-duplex mode.
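
One quick way to spot duplex mismatches is the sketch below, which uses the third-party psutil package (pip install psutil) to list interfaces that are up but not confirmed full duplex. On some platforms psutil reports the duplex mode as unknown, so treat any hit as a prompt to check the switch port, not as proof of a problem.

    # Flag network interfaces that are up but not reporting full-duplex operation.
    import psutil

    def not_full_duplex():
        flagged = []
        for name, stats in psutil.net_if_stats().items():
            if stats.isup and stats.duplex != psutil.NIC_DUPLEX_FULL:
                flagged.append((name, stats.speed))
        return flagged

    if __name__ == "__main__":
        for name, speed in not_full_duplex():
            print(f"{name}: not confirmed full duplex (link speed {speed} Mb/s)")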

Don't create a single share with hundreds of thousands of files and directories. While file servers will work with such a configuration, you can achieve much better performance and scalability by distributing your shared files across multiple shares, partitions and even other servers. Use your file server as heavily as you need, but do what you can to spread the usage load.
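
Deciding where to split a bloated share starts with knowing where the files pile up. The sketch below is a simple inventory pass, with a placeholder UNC path, that counts files under each top-level folder of a share so you can see which subtrees are candidates for their own share, partition or server.

    # Count files per top-level folder of a share to plan how to split it up.
    import os
    from collections import Counter

    def files_per_top_dir(share_root):
        counts = Counter()
        for dirpath, _dirnames, filenames in os.walk(share_root):
            rel = os.path.relpath(dirpath, share_root)
            top = "." if rel == "." else rel.split(os.sep)[0]
            counts[top] += len(filenames)
        return counts

    if __name__ == "__main__":
        for folder, n in files_per_top_dir(r"\\server\bigshare").most_common(20):
            print(f"{folder}: {n} files")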

Don't set your Windows 2000 server's network settings to "Maximize for File Sharing." This may sound odd, but configuring your server this way allows the Windows file-serving components to consume most of the machine's memory, Lewis said, which makes it much harder for file and print management programs to perform well. Instead, configure it to "Maximize for Network Applications."
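
The article doesn't spell out what that setting changes under the hood. The registry values commonly documented as sitting behind it are the LanmanServer "Size" setting and the Memory Management "LargeSystemCache" flag, so the read-only, Windows-only sketch below peeks at those two values purely as an assumption worth verifying on your own server; it changes nothing.

    # Read the registry values generally associated with the file-sharing
    # optimization setting (key and value names are assumptions, not from the article).
    import winreg

    def read_value(subkey, name):
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, subkey) as key:
            value, _type = winreg.QueryValueEx(key, name)
            return value

    if __name__ == "__main__":
        size = read_value(r"SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters", "Size")
        cache = read_value(r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management",
                           "LargeSystemCache")
        # LargeSystemCache=1 favors the file-system cache ("file sharing");
        # 0 leaves more memory for applications ("network applications").
        print(f"LanmanServer Size={size}, LargeSystemCache={cache}")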

Do get as much traffic on TCP/IP as possible. File sharing performance problems result from the overhead required to support multiple protocols, Lewis explained. The server and client spend time processing multiple protocol stacks, the network has to switch them, and so on. Putting more traffic on TCP/IP will reduce the additional workload and improve overall performance.

