Thursday, March 8, 2012

.NET Database/General Performance

Can someone please explain what I'm missing?

From my investigation on the net, it looks like 1,500 inserts per second is pretty much the upper limit of .NET's ability to insert records. Can anyone tell me why? I have tried UpdateBatchSize values of 300, 500, 1,500, and 4,500, all of which resulted in at most a 4-second difference over 90,000 records.
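
For reference, here is a minimal sketch of the kind of batched insert being tested, using SqlDataAdapter.UpdateBatchSize. The connection string, table, and column names are placeholders, not details from the original post:

    using System.Data;
    using System.Data.SqlClient;

    class BatchedInsert
    {
        static void Main()
        {
            // Placeholder connection string for this sketch.
            const string connStr = "Server=.;Database=TestDb;Integrated Security=true";

            // Build 90,000 rows in memory (hypothetical Items table).
            var table = new DataTable("Items");
            table.Columns.Add("Id", typeof(int));
            table.Columns.Add("Name", typeof(string));
            for (int i = 0; i < 90000; i++)
                table.Rows.Add(i, "row " + i);

            using (var conn = new SqlConnection(connStr))
            using (var adapter = new SqlDataAdapter())
            {
                var insert = new SqlCommand(
                    "INSERT INTO Items (Id, Name) VALUES (@Id, @Name)", conn);
                insert.Parameters.Add("@Id", SqlDbType.Int, 0, "Id");
                insert.Parameters.Add("@Name", SqlDbType.NVarChar, 100, "Name");
                insert.UpdatedRowSource = UpdateRowSource.None; // required when batching

                adapter.InsertCommand = insert;
                adapter.UpdateBatchSize = 500; // rows grouped per round trip

                conn.Open();
                adapter.Update(table); // issues the INSERTs in batches
            }
        }
    }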

Why can .NET only insert 1,500 rows per second when DTS can do roughly a million in a little over a minute, approximately ten times the throughput? Both were doing only simple inserts. Because of my processing needs and integration with other apps, I really need to do the loading from within the app.
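
As an aside, one in-app option that goes through the same bulk-load path as DTS/bcp is SqlBulkCopy. The sketch below assumes the same hypothetical Items table and placeholder connection string as above:

    using System.Data;
    using System.Data.SqlClient;

    class BulkLoad
    {
        static void Main()
        {
            const string connStr = "Server=.;Database=TestDb;Integrated Security=true";

            var table = new DataTable("Items");
            table.Columns.Add("Id", typeof(int));
            table.Columns.Add("Name", typeof(string));
            for (int i = 0; i < 90000; i++)
                table.Rows.Add(i, "row " + i);

            using (var bulk = new SqlBulkCopy(connStr))
            {
                bulk.DestinationTableName = "Items";
                bulk.BatchSize = 5000;            // rows committed per batch
                bulk.ColumnMappings.Add("Id", "Id");
                bulk.ColumnMappings.Add("Name", "Name");
                bulk.WriteToServer(table);        // streams rows via the bulk-load API
            }
        }
    }

Because WriteToServer streams rows through the bulk-load interface rather than issuing per-row INSERT statements, it typically lands much closer to DTS throughput than a batched data adapter does.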

I have also written a lot of automation in Visual Basic and C++ before, and when I rewrote it in .NET, the performance was abominable. I am concerned that .NET is not a great platform for performance-oriented tasks outside the core application; as far as I can tell, it does not interface smoothly or quickly with outside technologies.

Within the application itself, I don't seem to have a problem: I was able to write a parser that evaluates 100,000 boolean expressions (simple expressions, mind you) in less than half a second. That parser uses a lot of Regex matching and evaluation code, so it's not that managed code is inherently slow.
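
For illustration, here is one guess at what such a Regex-driven evaluator might look like for simple, parenthesis-free expressions. This is an assumption about the approach, not the original code:

    using System;
    using System.Text.RegularExpressions;

    class BoolEval
    {
        // Illustrative guess at a Regex-rewriting evaluator for simple
        // expressions such as "true AND false OR true" (no parentheses).
        static bool Evaluate(string expr)
        {
            string s = expr;
            // Resolve NOT first, then AND, then OR (standard precedence).
            s = Regex.Replace(s, @"NOT\s+(true|false)",
                m => m.Groups[1].Value == "true" ? "false" : "true");
            while (Regex.IsMatch(s, @"(true|false)\s+AND\s+(true|false)"))
                s = Regex.Replace(s, @"(true|false)\s+AND\s+(true|false)",
                    m => (m.Groups[1].Value == "true" && m.Groups[2].Value == "true")
                         ? "true" : "false");
            while (Regex.IsMatch(s, @"(true|false)\s+OR\s+(true|false)"))
                s = Regex.Replace(s, @"(true|false)\s+OR\s+(true|false)",
                    m => (m.Groups[1].Value == "true" || m.Groups[2].Value == "true")
                         ? "true" : "false");
            return s.Trim() == "true";
        }

        static void Main()
        {
            Console.WriteLine(Evaluate("true AND false OR true")); // True
            Console.WriteLine(Evaluate("NOT false AND true"));     // True
        }
    }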

Has anyone out there seen similar performance with .NET? Is there anyone from Microsoft who can help explain this?

Programming languages are not the best solution for bulk inserts.

.NET is a lot faster than plain old VB when used properly. Unmanaged C++ still rocks, of course.

There are some great articles on MSDN about data access (and general performance) with .NET. Maybe you could check those out.
