Forum Discussion
SQL 2019 Std Edition high recompilation and CPU spike
KanishkaB As Ronen mentioned, there are always subtle differences in how queries get optimized from version to version. You asked if anything changed in the engine between 2016 and 2019. A significant number of changes happened between those versions, as you'd hope. In general, we end up better off overall, but there's nothing to stop you falling into an edge case where that is not true. That's why it's so important to test in detail before upgrading.
Some code is more susceptible to becoming an edge case than other code. And so when we find these cases, we try to modify the code to avoid what's triggering the issues.
If the temp tables and their indexes are created only once and then just used, that's not the issue.
Temp tables trigger recompilations much more readily than other tables. A previous version of Microsoft's best practices analyzer used to suggest replacing temp tables with table variables every time it saw one. That was very poor advice as a general rule. A table variable avoids the recompiles precisely because it doesn't have the same level of statistics available, which can hurt the optimizer elsewhere.
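To make the trade-off concrete, here is a minimal T-SQL sketch of the same small working set held both ways. The table name and columns are hypothetical, purely for illustration:

```sql
-- Temp table version: column statistics exist, so data changes in
-- #Work can trigger statement-level recompiles of queries that use it.
CREATE TABLE #Work (Id int PRIMARY KEY, Amount money);
INSERT INTO #Work (Id, Amount) VALUES (1, 10.00), (2, 20.00);
SELECT SUM(Amount) AS Total FROM #Work;
DROP TABLE #Work;

-- Table variable version: no statistics, so no stats-driven
-- recompiles - but the optimizer also can't see the row distribution,
-- which can produce poor plans at larger row counts.
DECLARE @Work table (Id int PRIMARY KEY, Amount money);
INSERT INTO @Work (Id, Amount) VALUES (1, 10.00), (2, 20.00);
SELECT SUM(Amount) AS Total FROM @Work;
```

With only a handful of rows, as in this thread, the missing statistics rarely matter, which is why the table variable can come out ahead.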
In your case though, with a small number of rows, I'd be interested to see if using table variables would have an overall positive effect on the code. Other questions I'd have are around the other recompilation triggers that are mentioned in the whitepaper that I linked.
In particular, are you always calling the code with well-defined parameter data types?
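As an illustration of what "well-defined parameter data types" means in practice, here is a hedged sketch using `sp_executesql`; the table `dbo.Orders`, the column, and the value are hypothetical:

```sql
-- Well-defined: the parameter type and length are declared explicitly,
-- so one parameterized plan is cached and reused across calls.
EXEC sp_executesql
    N'SELECT OrderId FROM dbo.Orders WHERE CustomerCode = @code;',
    N'@code varchar(20)',
    @code = 'ACME01';

-- Poorly defined: the literal is embedded in the statement text, so
-- every distinct value can compile as a separate ad hoc statement,
-- and an implicit type mismatch (e.g. nvarchar literal against a
-- varchar column) can force conversions and extra compilations.
EXEC ('SELECT OrderId FROM dbo.Orders WHERE CustomerCode = ''ACME01'';');
```

Calling the code with inconsistent or implicit types is one of the recompilation and plan-cache-bloat triggers the whitepaper covers.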
To answer your last question: yes, we are.
Regarding the white paper link that you sent, I am checking whether all my tests adhere to it, and mostly I think they do.
On the 2019 RTM build without any CU the CPU remains around 40%, but on 2019 CU 15 or 16 it spikes to 100%, with everything else being the same.