September 11, 2025 at 9:19 am
I have a GitHub Copilot subscription, which I can access in VS 2022 Community Edition.
I have been optimising my T-SQL in VS 2022 using GitHub Copilot.
Now I need to optimise my whole database, which has many tables and stored procedures.
September 11, 2025 at 10:28 am
Trying to optimize something "just because" is like trying to skate uphill into a blizzard.
As Brent Ozar keeps reminding us ... "What problem are you trying to solve?"
Every optimization comes at the expense of something else.
Also, in my limited experience, AI does not do too well with bulk tasks. You need to know exactly what you want, and direct AI carefully to achieve your expected outcome.
September 11, 2025 at 11:18 am
I need to know how AI can do those bulk tasks you mentioned.
September 11, 2025 at 12:26 pm
I have seen that the DTA can take a table as input, but somebody said that only one statement can be put inside each row of the workload table.
I need to analyse all stored procedures in one go. How can I do that?
September 11, 2025 at 1:32 pm
Just ask Copilot to look at all of the files in your solution and suggest potential performance improvements.
To be honest, I would not expect any useful feedback from a question as vague as that. Copilot knows nothing about your data volumes or distribution. It knows nothing about your read/write ratios. So it has no way to know if your code will perform badly or not.
September 11, 2025 at 4:27 pm
I'd also like to add that AI can give bad advice. Before making ANY changes to the system, you should fully understand the advice being given. For example, if AI says "change MAXDOP from 0 to 1", do you know the impact of that? It MAY benefit some queries, but it will likely hurt others. I've had AI tell me to switch to simple recovery on a production database that had a large volume of writes. Sure, that'll make my log file use less disk space, but it completely breaks my RPO. I've also had AI give me commands with syntax errors or that would have hurt performance.
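For context (a sketch only, not a recommendation), here is roughly what that MAXDOP change looks like at the server level, which is why it touches every query on the instance:

-- Check the current server-wide setting first
SELECT name, value_in_use
FROM sys.configurations
WHERE name = 'max degree of parallelism';

-- The change the AI suggested; it is an advanced option and it is instance-wide
EXEC sys.sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sys.sp_configure 'max degree of parallelism', 1;
RECONFIGURE;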
If it is just looking at tables, it MAY suggest adding one covering index per stored procedure/query. That MAY offer better performance for those SPs, but can you afford the disk cost of doing that? AND, as DesNorton pointed out, if some of the SPs or queries are run once per year (for example), they may not need optimizing.
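To illustrate (the table and column names below are made up), a covering index keys on the join/filter columns and carries the selected columns in its INCLUDE list, and every one of those indexes costs disk space and adds write overhead on the table:

-- Hypothetical example: covers a query that joins/filters on CustomerID and OrderDate
-- and only needs TotalDue in the SELECT list
CREATE NONCLUSTERED INDEX IX_Orders_CustomerID_OrderDate
ON dbo.Orders (CustomerID, OrderDate)
INCLUDE (TotalDue);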
IF you are looking to optimize everything, how will you know it is optimal? EVERYTHING you do to optimize (query tuning, adding/removing indexes, changing server settings) has pros and cons and a good DBA should know their environment well enough to know which suggestions to take and which ones to avoid.
Also, I really, really hope you are testing all of this in a dev environment before it hits prod... I personally would never give AI access to prod; especially after reading this story about how an AI agent deleted a production database even after being instructed not to: https://www.pcmag.com/news/vibe-coding-fiasco-replite-ai-agent-goes-rogue-deletes-company-database
The above is all just my opinion on what you should do.
As with all advice you find on a random internet forum, you shouldn't blindly follow it. Always test on a test server to see if there are negative side effects before making changes to live!
I recommend you NEVER run "random code" you found online on any system you care about UNLESS you understand and can verify the code OR you don't care if the code trashes your system.
September 11, 2025 at 6:43 pm
Don't you optimize workloads and not individual queries? Some queries may get slower if you change indexing, but on the whole, the workload efficiency will increase. I doubt that's something that ChatGPT can do for you - you actually have to know what you're doing.
September 12, 2025 at 6:04 am
I have a database with 400 stored procedures and I need to optimise it. I also found that the tables are missing suitable indexes,
for example keys on the join/WHERE/GROUP BY columns with the selected columns in the INCLUDE list.
So I thought AI or the DTA could suggest some indexes after reading my database, which has all the table statistics and stored procedures,
and after that I will review them one by one, so it can speed up the process.
September 15, 2025 at 8:14 pm
First, go to this link and download the "First Responder Kit". Read the instructions for how to install it and how to use it.
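As a rough example of what you might run once it is installed (the database name below is just a placeholder):

-- Overall server health check
EXEC dbo.sp_Blitz;

-- Index analysis for one database
EXEC dbo.sp_BlitzIndex @DatabaseName = 'YourDatabase';

-- Most expensive queries in the plan cache, sorted by CPU
EXEC dbo.sp_BlitzCache @SortOrder = 'cpu';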
Also, SQL Server actually comes with a kind of handy set of tools, although they are time sensitive for a given day. Get into SSMS, right-click on the instance name, select [Reports], and follow your nose to the "Performance" reports for the Top 10 Total CPU and IO queries.
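If you would rather have that as a query than a report, here is a sketch of the same idea against the plan cache DMVs; it only sees what is still cached since the last restart:

-- Top 10 cached statements by total CPU (total_worker_time is in microseconds)
SELECT TOP (10)
       qs.total_worker_time / 1000 AS total_cpu_ms,
       qs.execution_count,
       qs.total_logical_reads,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;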
All of these assume that your database is in service and being used. You can use things like AI to pre-scan your code, but it's going to be an overwhelming job if you do that, and it's also going to be very wrong in a huge number of cases, because AI was trained to be a "Consensus Engine". Just like the warnings not to believe everything that shows up in a Google or other internet search, the same is true for AI.
--Jeff Moden
Change is inevitable... Change for the better is not.