This is a WinNT/2000 question about memory usage as reported under the SYSTEM icon (Windows Task Manager/Performance) in the...
I work with two machines. One is a client to the other, the server. The server is strictly a back-end database server (WinNT Workstation, S.P. 6; Oracle7 7.2). The client runs either WinNT/S.P. 6 or Win2000. The client has much more horsepower: 511 M bytes RAM/1800 MHz. The server has 128 M bytes RAM/600 MHz. The client is on one subnet; the server is on another.
The client runs a Delphi application that links to the Oracle7 RDBMS back-end. The database is distributed across the C and D drives and is installed on C. (I realize Oracle7 7.2 is ancient history but that is a temporary constraint I have to work under.)
(1) The 128 M bytes is enough to install Oracle7, but, without knowing the app, do you feel it is nowhere near enough to run it? There is one function in the app that either does not complete (possibly from excessive memory use during sorting) or completes in under 45 minutes. On an Oracle7 7.3/UNIX deployment the same function always completes in about 40 minutes. The database instance SGA fits in memory with much room to spare. The virtual memory setting is reasonable -- it was 500 M bytes but I decreased it. There is ample free disk space on C and D, and I defragmented both drives.
(2) Can you explain why memory usage (Control Panel/SYSTEM/Task Mgr/Performance) on the server machine never gets below 45K, even after I stop services down to the bare minimum (even shutting down all databases and the TNS Listener service)? When one database is up and running, memory usage jumps to around 80-100 M bytes and sometimes hovers just under the 128 M bytes maximum. On the client machine, quiescent memory usage is much, much lower, say 10,000 or less.
(3) Why does the back-end seem to want to occupy so much memory when just basic processes are running and Windows services are started?
(4) Do you recommend more memory on the database back-end? Any idea of a minimum recommendation, given the detail provided?
Many people are confused about the way memory is allocated on back-end machines as opposed to client machines. The explanations usually revolve around several key facts:
Windows 2000 Server, and many server-level applications, stake out a great deal of RAM for internal processes, much more so than client systems do. IIS 5.0, for instance, is set by default to use half the system's physical RAM as a cache for Web operations. Many people don't know this and wonder why IIS is being such a memory hog; they don't realize that the memory allocation settings for IIS can be tuned manually. Many other applications, including Oracle, do the same, and can also have their memory usage tuned through internal settings. How to do this is fairly complicated and varies from program to program, so you would need to contact a guru for the app you're using to do that tuning.
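For Oracle specifically, those internal settings live in the instance's init.ora parameter file, which controls how big the SGA grows at startup. Here is a sketch with illustrative values only (these are examples, not recommendations for this particular server -- the right numbers depend on the app and should come from an Oracle DBA):

```ini
# init.ora -- illustrative SGA-related parameters (example values only,
# not a recommendation for this system)
db_block_buffers = 2000        # data block buffer cache, in database blocks
shared_pool_size = 9000000     # shared SQL/PL-SQL area, in bytes
log_buffer       = 32768       # redo log buffer, in bytes
sort_area_size   = 65536       # per-process sort memory, in bytes
```

Note that sort_area_size is per process, not part of the SGA, which is one reason a heavy sorting operation (like the 45-minute function described above) can push memory use well beyond what the SGA alone would suggest.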
128 M bytes is enough to run Win2K by itself, but probably not nearly enough to run Oracle as well. Databases are very memory-intensive programs; I would recommend at least 256 M bytes to run Oracle, or more if you're running extremely complex queries. Many people who move from a desktop database solution to something like Oracle are startled at how much more memory Oracle needs, but Oracle is designed to work with millions of records at once.
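To see why 128 M bytes gets tight so quickly, you can do the arithmetic yourself: the SGA is roughly the buffer cache plus the shared pool plus the redo log buffer, and that sits on top of what Windows and the Oracle background processes already use. A minimal sketch, using hypothetical parameter values (not figures from the asker's system):

```python
# Rough SGA sizing arithmetic. All parameter values below are
# hypothetical examples, not measurements from the server in question.
def approx_sga_bytes(db_block_buffers, db_block_size,
                     shared_pool_size, log_buffer):
    """Approximate SGA size: block buffer cache + shared pool +
    redo log buffer (ignores the small fixed SGA overhead)."""
    return db_block_buffers * db_block_size + shared_pool_size + log_buffer

# Example: 2000 buffers of 2 KB blocks, ~9 MB shared pool, 32 KB log buffer
sga = approx_sga_bytes(2000, 2048, 9 * 1024 * 1024, 32768)
print(f"approx SGA: {sga / (1024 * 1024):.1f} MB")  # roughly 13 MB
```

Even a modest SGA like this, added to the OS footprint, per-process sort areas, and client connections, leaves little headroom on a 128 M byte machine.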
Related Q&A from Serdar Yegulalp, WinIT