Sep 02 2024 03:58 PM
SQL 2019 - SSMS 19.3.4.0
I was always wrongly under the impression that SSMS required an active server connection in Object Explorer in order to run a script against that server. We have databases with the same names on two servers as we're preparing for migration, and I accidentally ran a script on server B, even though there appeared to be no open connection to server B — only server A was connected in Object Explorer. I was then shocked to find that every new SQL script I opened was connected to server B, which had been closed out in Object Explorer.
What controls the default server for a script when opening it via File / Open in SSMS? What is the best way to lock a script to a specific server, or to make it more obvious which server the script is being applied to? I may need to get used to checking the bottom-right status bar where the SQL Server name is displayed, but I'd like to make it more foolproof.
I see that activating SQLCMD Mode on the Query menu is one option, but I wonder what the downside to this might be, such that it is not the default behaviour.
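For reference, SQLCMD Mode adds the `:CONNECT` command, which pins the batch that follows it to a named server regardless of which connection the query window was opened with. A minimal sketch, where `ServerA` and `MyDatabase` are placeholder names:

```sql
-- Requires SQLCMD Mode (Query > SQLCMD Mode in SSMS).
-- :CONNECT redirects execution to the named server for the
-- statements that follow, overriding the window's connection.
:CONNECT ServerA
USE MyDatabase;
SELECT @@SERVERNAME AS ConnectedServer;  -- confirm where this actually ran
```

One commonly cited downside is that SQLCMD Mode disables IntelliSense and changes how some syntax is parsed (lines beginning with `:` or `!!` are treated as SQLCMD commands), which is presumably why it is not enabled by default.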
Sep 03 2024 04:56 AM
Solution
Sep 05 2024 02:04 AM
@Javier_Villegas Thanks for your insights. I like the @@SERVERNAME idea as a passive solution that I think will help. I'm also more aware now, so the odds are much lower based on past experience. I always use BEGIN TRANSACTION and a --COMMIT that stays commented out until I've checked that the update was successful.
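The two safeguards mentioned above can be combined into one script template. This is a sketch, not the poster's exact script; the server name, table, and values are placeholders:

```sql
-- Guard: abort the batch if it isn't running on the intended server.
-- THROW (severity 16) terminates the current batch immediately.
IF @@SERVERNAME <> N'ServerA'
    THROW 50000, 'Wrong server - aborting script.', 1;

BEGIN TRANSACTION;

UPDATE dbo.MyTable
SET SomeColumn = 'NewValue'
WHERE Id = 42;

-- Inspect the result before making it permanent:
SELECT * FROM dbo.MyTable WHERE Id = 42;

-- COMMIT TRANSACTION;   -- uncomment only after verifying the update
-- ROLLBACK TRANSACTION; -- run this instead if the result looks wrong
```

Note that the guard only protects the batch it appears in; statements after a `GO` separator would still execute, so the check should be repeated per batch in multi-batch scripts.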