Priya Bange
Guest
Hi Experts,
I am working on a file-processing task where I need to bulk load files into a SQL Server database. For this I am using a FileSystemWatcher, which works as required, but I am a newbie in C#.
The problem starts when the drive gets a heavy load: 10-20 files land in the monitored directory at once, each around 30-40 MB in size.
The CPU starts peaking at 50-60 percent. To avoid the CPU issue I am trying
file caching, but the issue still persists. I am thinking that if I delayed processing by a 1-2 second interval between each file, it might reduce the overhead on the processor. Please suggest the best way to add a delay if one is needed, or any other approach.
The process method simply reads the CSV --> uses CsvHelper to load it into a DataTable --> the DataTable is then loaded into SQL Server using a bulk insert.
Each file may contain 40-50k records. Please share optimisation techniques.
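One idea I have been looking at is streaming the CSV straight into SqlBulkCopy via CsvHelper's CsvDataReader, instead of building a full DataTable first, so the 40-50k rows never sit in memory at once. This is only a rough sketch of what I mean, not my actual Process method; _fullPath, _connectionString and the table name below are placeholders:

// Sketch only: stream CSV rows into SqlBulkCopy without materialising a DataTable.
// Requires: using System.IO; using System.Globalization;
//           using CsvHelper; using System.Data.SqlClient;
public void Process()
{
    using (var reader = new StreamReader(_fullPath))                 // _fullPath: placeholder field
    using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
    using (var csvDataReader = new CsvDataReader(csv))               // IDataReader over the CSV
    using (var connection = new SqlConnection(_connectionString))    // placeholder connection string
    {
        connection.Open();
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.MyTargetTable";     // placeholder table name
            bulkCopy.BatchSize = 10000;            // send rows in batches rather than all at once
            bulkCopy.WriteToServer(csvDataReader); // streams rows as they are read from the file
        }
    }
}

Would something like this be likely to reduce the CPU spikes, or is the DataTable approach fine for files of this size?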
// Requires: using System.Runtime.Caching;
private static readonly MemoryCache FilesToProcess = MemoryCache.Default;

private static void AddToCache(string fullPath)
{
    // The sliding expiration debounces duplicate FileSystemWatcher events:
    // the file is processed via the RemovedCallback only after it has been
    // quiet in the cache for 5 seconds.
    var item = new CacheItem(fullPath, fullPath);
    var policy = new CacheItemPolicy
    {
        RemovedCallback = ProcessFile,
        SlidingExpiration = TimeSpan.FromSeconds(5),
    };
    FilesToProcess.Add(item, policy);
}

private static void FileCreated(object sender, FileSystemEventArgs e)
{
    AddToCache(e.FullPath);
}

private static void ProcessFile(CacheEntryRemovedArguments args)
{
    // Only process entries that expired; ignore other removal reasons.
    if (args.RemovedReason == CacheEntryRemovedReason.Expired)
    {
        var fileProcessor = new DataProcessing(args.CacheItem.Key);
        fileProcessor.Process();
    }
}
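For the delay side of the question, rather than a fixed 1-2 second sleep between files, I was considering queueing the expired paths and having a single background task drain the queue, so only one file is processed at a time. This is just a rough sketch of what I mean (assuming the cache callback above is changed to queue the path, and StartConsumer is called once at startup; the names are placeholders):

// Sketch only: single-consumer queue so files are processed one at a time.
// Requires: using System.Collections.Concurrent; using System.Threading.Tasks;
private static readonly BlockingCollection<string> PendingFiles = new BlockingCollection<string>();

private static void StartConsumer()
{
    Task.Run(() =>
    {
        // GetConsumingEnumerable blocks until a path is queued, so files are
        // handled sequentially in arrival order instead of in parallel callbacks.
        foreach (var path in PendingFiles.GetConsumingEnumerable())
        {
            var fileProcessor = new DataProcessing(path);
            fileProcessor.Process();
        }
    });
}

private static void ProcessFile(CacheEntryRemovedArguments args)
{
    if (args.RemovedReason == CacheEntryRemovedReason.Expired)
    {
        // Queue the file instead of processing it inline on the cache's callback thread.
        PendingFiles.Add(args.CacheItem.Key);
    }
}

Is this a reasonable way to smooth out the load, or is there a better pattern for this?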
Thanks
Priya