What is the difference between "CPU Load" and "Total Memory Used"?
Our system is a file processing system which reads an input file,
processes it based on different conditions, and writes an output file.
There are several Windows application services that do different jobs
during the processing. So far we have run only one large file of
300,000 lines. We had some performance issues but finally processed it
successfully. Now the target is to run files of 500,000 or even
1,000,000 lines/records.
I was asked to look for any performance issues. Here is a brief
description of what the system does.
1. We receive these files via FTP. A folder-watcher service keeps
monitoring the FTP folder, and as soon as a file arrives, it makes a
file entry in the database (typically inserts a new record).
2. Another service then FTPs the input file to the UNIX box where our
database sits, and it creates an external table with all the file
data, writing each line of input as a record in the external table.
3. Another service reads this table, picks up each record, performs
different checks, and writes into an output table. After reading all
the records, it writes the output records to an output file.
4. There are other services for reports and for other jobs.
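To make the first step concrete, here is a minimal sketch of the folder-watcher idea (step 1): poll a directory and insert a record per new file. This is only illustrative; the directory name, table schema, and the use of SQLite (as a stand-in for our real database) are assumptions, not our actual setup.

```python
import os
import sqlite3
import tempfile

def register_new_files(watch_dir, conn):
    """Record any file in watch_dir that is not yet in the files table."""
    cur = conn.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS files (name TEXT PRIMARY KEY)")
    # Files already registered from previous polling passes.
    registered = {row[0] for row in cur.execute("SELECT name FROM files")}
    new_files = []
    for name in sorted(os.listdir(watch_dir)):
        if name not in registered:
            # "Make the file entry into database": insert one record per file.
            cur.execute("INSERT INTO files (name) VALUES (?)", (name,))
            new_files.append(name)
    conn.commit()
    return new_files

# Example: drop one file into a temporary "FTP" folder and register it.
watch_dir = tempfile.mkdtemp()
with open(os.path.join(watch_dir, "input_001.txt"), "w") as f:
    f.write("line1\n")
conn = sqlite3.connect(":memory:")
print(register_new_files(watch_dir, conn))  # ['input_001.txt']
print(register_new_files(watch_dir, conn))  # [] (already registered)
```

In the real service this loop would run on a timer (or a filesystem-change notification) instead of being called once.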
My questions are:
How would I find the reason for the slow performance when we run a
huge file, say 500,000 or 1,000,000 lines?
I noticed the CPU load was never 100%, but it went up to 70% when we
ran the 300,000-line file. What do I have to do to keep the CPU load
at a reasonable level?
What is Percent Memory Used? I have seen that throughout the run we
maintain process memory around 17%, and sometimes it went up to 35%.
How can I check whether there is any I/O issue in the file transfers
from our server to the UNIX box, etc.?
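One thing I tried for a rough I/O baseline, sketched below under assumed sizes and paths: time a local copy of a test file and compute MB/s. The same timing could wrap the actual FTP call to the UNIX box to compare network transfer rate against local disk rate.

```python
import os
import shutil
import tempfile
import time

def timed_copy_mb_per_s(src, dst):
    """Copy src to dst and return the observed throughput in MB/s."""
    size_mb = os.path.getsize(src) / (1024 * 1024)
    start = time.perf_counter()
    shutil.copyfile(src, dst)
    elapsed = time.perf_counter() - start
    return size_mb / elapsed

# Build an 8 MB test file (size is arbitrary) and measure a local copy.
tmp = tempfile.mkdtemp()
src = os.path.join(tmp, "test.dat")
with open(src, "wb") as f:
    f.write(b"\0" * (8 * 1024 * 1024))
rate = timed_copy_mb_per_s(src, os.path.join(tmp, "copy.dat"))
print(f"local copy rate: {rate:.1f} MB/s")
```

If the FTP transfer to the UNIX box is far slower than the local-disk rate, that would point at the network leg rather than our disks.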
Our server has 5 GB of RAM and the OS is Windows Server 2003.
I appreciate your responses.
Thanks