I have a performance problem. There is a huge insert trigger that does all kinds of things when an order is inserted. The trigger also inserts into a lot of other tables, which causes a lot of other insert triggers to fire. The strange thing is: the first time I insert something, performance is awful; the second time I insert into the table, it is much better. I suspect this has something to do with SQL Server caching something.
For example, inserting an order based on a huge calculation took 13 seconds. The second time I did this, it only took 2.8 seconds. We know this big 'trigger party' is a problem, but this is the way it works; the ~3 seconds is acceptable, but 13 is not!
I'd love to know what is improving performance this much (a cache of some sort?) and whether there is a way to always get that behaviour, e.g. by forcing things into the cache up front.
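One way to check whether the slow first run is paying plan-compilation cost is to look in SQL Server's plan cache after the first insert. A minimal sketch, assuming a trigger name of `YourTriggerName` (hypothetical; substitute your own) and using the standard `sys.dm_exec_cached_plans` / `sys.dm_exec_sql_text` DMVs:

```sql
-- Run after the first (slow) insert: a cached entry with usecounts = 1
-- suggests the 13-second run included compiling the trigger's plan.
SELECT  cp.usecounts,
        cp.objtype,
        cp.size_in_bytes,
        t.text
FROM    sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS t
WHERE   t.text LIKE '%YourTriggerName%';   -- hypothetical trigger name

-- On a DEV server only: empty the plan cache to reproduce the slow
-- "first insert" on demand and confirm compilation is the culprit.
-- DBCC FREEPROCCACHE;
```

If the second insert reuses the cached plan (`usecounts` goes up and the timing drops to ~3 seconds), that points at compilation rather than the trigger logic itself.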
Who can help me with this issue?
Solved
Bad performance on inserting. 2nd insert is way faster
Best answer by Robert Jan de Nie
Anne Buit wrote:
Compiling a query plan for a large trigger will cost a lot of CPU time. Moving this to a large procedure will probably not change much as the large procedure will also eat up a lot of compile time.
There are various ways to deal with this. However, I can't provide any silver bullet advice, it really depends on the situation.
I agree with this statement: complex logic stays complex, no matter where you put it. In my experience, however, a procedure is far easier to understand and optimise than a trigger that fires and in turn sets off multiple other triggers. There is also often less overhead, because a procedure is purpose-built for a specific job. Triggers fire regardless, so you at least have to add checks for the cases in which you don't want (part of) the trigger to do anything.
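To make the suggestion concrete, the idea of replacing a "do everything" insert trigger with a purpose-built procedure could look something like this. All table, column, and procedure names here are made up for illustration; this is a sketch of the pattern, not the actual schema:

```sql
-- Hypothetical: one procedure performs the insert plus the follow-up
-- work the trigger cascade used to do, explicitly and in one place.
CREATE PROCEDURE dbo.usp_insert_order
    @customer_id INT,
    @amount      DECIMAL(18, 2)
AS
BEGIN
    SET NOCOUNT ON;

    BEGIN TRANSACTION;

    INSERT INTO dbo.orders (customer_id, amount)
    VALUES (@customer_id, @amount);

    DECLARE @order_id INT = SCOPE_IDENTITY();

    -- Follow-up work, formerly hidden in nested triggers:
    INSERT INTO dbo.order_history (order_id, event)
    VALUES (@order_id, 'created');

    UPDATE dbo.customers
    SET    last_order_date = GETDATE()
    WHERE  customer_id = @customer_id;

    COMMIT TRANSACTION;
END;
```

Because the procedure only runs when an order is actually inserted through it, you skip the "fires regardless" checks a trigger needs, and you can profile and tune it as one unit.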