August 14, 2012 at 12:49 am
Just wondering: is this the right way to build a dynamic UPDATE query? Are there any potential bugs?
Thanks first to Phil Factor's parseJSON function, from "Consuming JSON Strings in SQL Server":
http://www.simple-talk.com/sql/t-sql-programming/consuming-json-strings-in-sql-server/
I have simple JSON data submitted from a web form, e.g.

```
{"FName":"sss","MName":"ddd","Surname":"test","Mdob":"1976-08-11","Mage":"","Country":"AU","Postcode":"5000"}
```

Using the parseJSON function in SQL Server 2005, I get a recordset as below:

```sql
SELECT ROW_NUMBER() OVER (ORDER BY element_id) AS RowNum, [NAME], stringvalue
FROM dbo.parseJSON('{"FName":"sss","MName":"ddd","Surname":"test","Mdob":"1976-08-11","Mage":"","Country":"AU","Postcode":"5000"}')
WHERE valuetype = 'string';
```

Result: http://i.stack.imgur.com/qm9UW.jpg
I also have a table named 'tblMother' which has around 50 columns; the JSON data may contain different columns each time, depending on what the user submitted.

So in my stored procedure I query COLUMN_NAME, DATA_TYPE and character_maximum_length from information_schema.columns, and then use a nested loop to compare the JSON names with the table column names to build the UPDATE query.

Loop in T-SQL: http://i.stack.imgur.com/sJUC2.jpg

There are two temp tables, #tmp_JSON and #tmp_TABLEDETAILS, which hold the parsed JSON rows and the column metadata (COLUMN_NAME, DATA_TYPE) respectively.
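As an aside, the nested loop could probably be collapsed into one set-based query. A rough sketch, assuming the same two temp tables (#tmp_JSON and #tmp_TABLEDETAILS) and using the SQL Server 2005 STUFF + FOR XML PATH concatenation idiom; this is an illustration, not tested against the real tblMother:

```sql
-- Set-based alternative to the nested loops: join the parsed JSON rows
-- to the column metadata on name and concatenate the SET clauses in one pass.
DECLARE @SetList nvarchar(MAX);

SELECT @SetList = STUFF((
    SELECT ', ' + t.ColName + '='
         + CASE
               WHEN j.stringvalue = '' THEN 'NULL'
               WHEN t.DATA_TYPE = 'datetime'
                   THEN 'CONVERT(DATETIME,''' + REPLACE(j.stringvalue, '''', '''''') + ''', 102)'
               WHEN t.CharLen IS NOT NULL   -- character column: quote the value
                   THEN '''' + REPLACE(j.stringvalue, '''', '''''') + ''''
               ELSE j.stringvalue           -- numeric column: emit as-is
           END
    FROM #tmp_JSON AS j
    JOIN #tmp_TABLEDETAILS AS t ON t.ColName = j.[NAME]
    ORDER BY j.RowNum
    FOR XML PATH(''), TYPE).value('.', 'nvarchar(max)'),  -- TYPE/.value avoids XML entitization
    1, 2, '');  -- strip the leading ", "
```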
***The full stored procedure I wrote:***
```sql
CREATE PROCEDURE [dbo].[SaveJSON]
    @MID       nvarchar(12),
    @TableName nvarchar(25),
    @JSONData  nvarchar(MAX)
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @CountJSON  int,
            @CountTbl   int,
            @RowNumJSON int,
            @RowNumTbl  int,
            @CharLen    nvarchar(4),
            @strSQL     nvarchar(MAX),
            @jCOLname   nvarchar(30),
            @jCOLval    nvarchar(255),
            @tCOLname   nvarchar(30),
            @tCOLtyp    nvarchar(255)

    SET @strSQL = 'UPDATE ' + @TableName + ' SET '

    -- Load the parsed JSON and the column metadata into two temp tables
    SELECT ROW_NUMBER() OVER (ORDER BY element_id) AS RowNum, [NAME], stringvalue
    INTO #tmp_JSON
    FROM dbo.parseJSON(@JSONData)
    WHERE valuetype = 'string';

    SELECT ROW_NUMBER() OVER (ORDER BY ORDINAL_POSITION) AS RowNum,
           COLUMN_NAME AS ColName, DATA_TYPE,
           character_maximum_length AS CharLen
    INTO #tmp_TABLEDETAILS
    FROM information_schema.columns
    WHERE table_name = @TableName;

    SELECT @CountJSON = COUNT([NAME])  FROM #tmp_JSON;
    SELECT @CountTbl  = COUNT(ColName) FROM #tmp_TABLEDETAILS;

    SET @RowNumJSON = 0
    SET @RowNumTbl  = 0

    -- Two nested loops to build the UPDATE statement
    WHILE @RowNumJSON < @CountJSON
    BEGIN
        SET @RowNumJSON = @RowNumJSON + 1
        SET @RowNumTbl = 0
        SELECT @jCOLname = [NAME], @jCOLval = stringvalue
        FROM #tmp_JSON WHERE RowNum = @RowNumJSON;

        -- Double up embedded single quotes so a value like O'Brien cannot
        -- break (or inject into) the generated statement
        SET @jCOLval = REPLACE(@jCOLval, '''', '''''')

        WHILE @RowNumTbl < @CountTbl
        BEGIN
            SET @RowNumTbl = @RowNumTbl + 1
            SELECT @tCOLname = ColName, @tCOLtyp = DATA_TYPE,
                   @CharLen = CAST(CharLen AS nvarchar(4))
            FROM #tmp_TABLEDETAILS WHERE RowNum = @RowNumTbl;

            IF @jCOLname = @tCOLname
            BEGIN
                -- An empty value maps to NULL for every data type; previously an
                -- empty value on a numeric or datetime column produced broken SQL
                IF @jCOLval = ''
                    SET @strSQL = @strSQL + @jCOLname + '=NULL, '
                ELSE IF @tCOLtyp = 'datetime'
                    SET @strSQL = @strSQL + @jCOLname + '=CONVERT(DATETIME,''' + @jCOLval + ''', 102), '
                ELSE IF @CharLen IS NOT NULL     -- character column: quote the value
                    SET @strSQL = @strSQL + @jCOLname + '=''' + @jCOLval + ''', '
                ELSE                             -- numeric column: emit unquoted
                    SET @strSQL = @strSQL + @jCOLname + '=' + @jCOLval + ', '
            END
        END
    END

    DROP TABLE #tmp_JSON
    DROP TABLE #tmp_TABLEDETAILS

    -- Trim the trailing ", " (LEN ignores the trailing space, so -1 drops the comma)
    SET @strSQL = LEFT(@strSQL, LEN(@strSQL) - 1)
    SET @strSQL = @strSQL + ' WHERE MotherID = ''' + REPLACE(@MID, '''', '''''') + ''';'

    PRINT @strSQL
END
```
**Executed with:**

```sql
EXEC dbo.SaveJSON '10000000', 'tblMother', '{"FName":"sss","MName":"ddd","Surname":"test","Mdob":"1976-08-11","Mage":"","Country":"AU","Postcode":"5000"}'
```
**The resulting @strSQL string is:**

```sql
UPDATE tblMother SET FName='sss', MName='ddd', Surname='test', Mdob=CONVERT(DATETIME,'1976-08-11', 102), Mage=NULL, Country='AU', Postcode=5000 WHERE MotherID = '10000000';
```
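The procedure only PRINTs the statement. To actually run it, sp_executesql could be used; a minimal sketch (the column list is hard-coded here purely for illustration), which also shows that at least the key value can stay a real parameter rather than being concatenated in:

```sql
-- Hypothetical execution of a generated statement via sp_executesql,
-- keeping MotherID as a true parameter instead of string concatenation.
DECLARE @stmt nvarchar(MAX);
SET @stmt = N'UPDATE tblMother SET FName = @FName WHERE MotherID = @MID';

EXEC sp_executesql @stmt,
     N'@FName nvarchar(255), @MID nvarchar(12)',
     @FName = N'sss',
     @MID   = N'10000000';
```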