We have an ASP.NET web application running against a SQL Server 2005 Standard database server. We've had some issues recently where the application appears to be trying to insert several million characters of text into an ntext column. It's trying to report that an exception has occurred and is dumping all the information it has to the database in one record. Our best estimate is that this one column alone could be over 100MB in some cases, based simply on the amount of text. The application connects to one master DB, but the data is being inserted into another DB on the same server, referenced through a synonym.
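As an aside, one mitigation we're considering is clipping the exception dump before it ever reaches the database. This is just a hedged sketch (the cap size and function name are our own invention, not anything from the application), illustrating the idea of bounding a single log row; ntext stores UTF-16, so each character costs roughly 2 bytes on disk:

```python
MAX_DUMP_BYTES = 1 * 1024 * 1024  # hypothetical 1 MB cap per logged exception row

def truncate_dump(text: str, limit: int = MAX_DUMP_BYTES) -> str:
    """Clip an exception dump so a single log row stays bounded.

    ntext is UTF-16, so budget the byte limit as limit // 2 characters.
    """
    max_chars = limit // 2
    if len(text) <= max_chars:
        return text
    marker = "\n... [truncated]"
    return text[: max_chars - len(marker)] + marker

# A ~100 MB dump (as we estimated above) collapses to the cap:
big_dump = "x" * (100 * 1024 * 1024)
clipped = truncate_dump(big_dump)
assert len(clipped) * 2 <= MAX_DUMP_BYTES
assert clipped.endswith("[truncated]")
```

The insert itself is unchanged; only the parameter value shrinks, so nothing about the schema or the synonym has to move.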
We went back and found several large records that appear to have inserted correctly, but yesterday we had an instance where we believe it was attempting one of these inserts: the CPU spiked to 100% and any attempt to connect to the DB timed out. We ended up having to kill the SQL Server process to get things responsive again.
Our question is whether attempting to insert records that large could cause SQL Server to behave this way, and if so, why? The database server is an 8-core box with 32 GB of RAM. The main DB we run is around 150GB, and the one being inserted into through the synonym is about 70GB.