Honestly, the fact that ChatGPT makes the mistake of thinking that people age on 01 Jan, which is such a fundamental error, shows that it can't be relied on if you don't understand the "solutions" it gives you.
You caught onto another one of its errors: using
nvarchar without a length. That mistake is even more fatal; someone not paying attention could easily end up with significant data loss if they implemented that solution as-is.
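To illustrate why (a minimal sketch, not the article's actual code; the variable name is made up): when you omit the length, nvarchar defaults to 1 character in a declaration and 30 characters in a CAST/CONVERT, and the data is silently truncated with no error:

```sql
-- In a DECLARE, nvarchar with no length defaults to nvarchar(1)
DECLARE @FirstName nvarchar = N'Jonathan';
SELECT @FirstName;  -- returns just N'J'; the rest is silently lost

-- In CAST/CONVERT the default length is 30 instead
SELECT CAST(N'A deliberately long string well past thirty characters' AS nvarchar);
-- returns only the first 30 characters, again with no warning
```

That silence is exactly what makes it dangerous; nothing fails, the data just quietly disappears.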
"If any of the names contain multiple words, this code will only capture the first word as the first name, and the remaining words as the last name."
Honestly, it is good that it told you this, though; splitting full names into first (middle) and surname is an unsolved problem, and I'll admit it's nice that the text it generated notes that it effectively won't work correctly for people with 3+ names. It doesn't, however, handle names that don't contain a space at all, which a "good" solution would.
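To show what I mean (a hypothetical sketch of a typical first-space split; the column and sample names are my own, not from the article):

```sql
-- Split on the first space: everything before it is "first name",
-- everything after it is "last name"
SELECT FullName,
       LEFT(FullName, CHARINDEX(N' ', FullName) - 1)        AS FirstName,
       STUFF(FullName, 1, CHARINDEX(N' ', FullName), N'')   AS LastName
FROM (VALUES (N'Maria Garcia Lopez'),  -- "Garcia Lopez" all lands in LastName
             (N'Cher'))                -- no space: CHARINDEX returns 0, so
     AS V(FullName);                   -- LEFT(FullName, -1) throws an error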
The part on the
LOGIN with server admin privileges isn't right either. Presumably, by "serveradmin" you mean a system administrator, which is the server role
serveradmin; on a default instance that SQL would error.
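For reference, granting that role membership would look something like this (a hedged sketch; the login name and password are placeholders of my own, not from the article):

```sql
-- Create a login, then add it to the serveradmin fixed server role
CREATE LOGIN SomeAppLogin WITH PASSWORD = N'Str0ng!Placeh0lder';
ALTER SERVER ROLE serveradmin ADD MEMBER SomeAppLogin;
```

Note that serveradmin (server configuration) and sysadmin (full control) are different fixed server roles, which is exactly the kind of distinction someone blindly pasting the generated SQL would miss.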
Honestly, I see this article as little more than further cement that ChatGPT is quite a dangerous tool. For those who understand the output it gives, it's probably not actually that useful; they likely would have written the solution themselves. For those who don't understand its output, it can easily introduce fundamental flaws. We're going to end up with a "world" where, instead of people copy-pasting from sites like Stack Overflow, they're copying from ChatGPT, which (from my experience) ends up with lower reliability.
Excuse my typos and sometimes awful grammar. My fingers work faster than my brain does.