Towards an efficient interface between application and database

  • Comments posted to this topic are about the item Towards an efficient interface between application and database

    Best wishes,
    Phil Factor

  • Hi Phil,

    What kind of connector are you using to communicate with the SQL database when writing C#?

    System.Data.SqlClient?

    I am asking because I am wondering how you are calling the stored procedures from your code. I guess you are passing the serialized JSON objects to an nvarchar procedure parameter?

    Cheers, Tarek

  • tsalha  Yes, I'm generally using System.Data.SqlClient for this. I use a temporary stored procedure on the connection (create or alter) and then execute it with parameters. I usually have just two parameters, one for the input JSON and one for the output, though you can get away with just the input parameter. You can either use an output parameter to get the JSON back, or define a variable for the output parameter in the procedure and simply do a SELECT on it, catching the output as the return from ExecuteScalar().
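A minimal sketch of the pattern described above, with hypothetical procedure and parameter names (the real code would shred the input and do its DML where the comment indicates):

```sql
-- Hypothetical temporary procedure created on the connection.
-- It takes the input JSON, does its work, and then SELECTs the output
-- JSON so the client can catch it via ExecuteScalar().
CREATE OR ALTER PROCEDURE #SaveCustomer
  @TheInput  nvarchar(max),          -- the JSON document from the application
  @TheOutput nvarchar(max) OUTPUT    -- optional: the JSON result as an output parameter
AS
BEGIN
  SET NOCOUNT ON;
  -- ...shred @TheInput with OPENJSON and perform the DML here...
  SELECT @TheOutput = (SELECT 'done' AS [status] FOR JSON PATH);
  SELECT @TheOutput;  -- caught as the return from ExecuteScalar()
END;
```

Because the procedure is temporary (the # prefix), it lives only for the lifetime of the connection that created it.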

     

    Best wishes,
    Phil Factor

  • Thanks Phil, I'm a big fan of your articles.  I completely agree that JSON can be a key element of efficient application data access.  As far as data validation goes, though... IMO, and AFAIK, SQL Server's current JSON feature set is quite complete.  Of course, this could be legitimately disagreed with, and for sure I'm extremely interested in counterarguments.

    Are the following validation steps [applied in this order] sufficient to reasonably ensure data integrity?  The app sends a JSON document to SQL Server, then:

    1) is the JSON document valid? -- ISJSON() function --> if yes, proceed to deserialize and parse the document

    2) are the correct data elements present? -- STRICT optional parameter(s) to the JSON parsing functions --> parse data elements

    3) can the data be persisted? -- DB Engine --> DB Engine DML
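The three steps above can be sketched in T-SQL roughly as follows (the document shape, variable names, and target table are invented for illustration):

```sql
DECLARE @json nvarchar(max) = N'{"InvoiceId": 42, "Amount": 99.95}';

-- 1) Is the document valid JSON at all?
IF ISJSON(@json) = 0
  THROW 50000, 'Payload is not valid JSON', 1;

-- 2) Are the correct data elements present? The 'strict' path mode makes
--    a missing element raise an error instead of silently returning NULL.
DECLARE @id  int   = JSON_VALUE(@json, 'strict $.InvoiceId'),
        @amt money = JSON_VALUE(@json, 'strict $.Amount');

-- 3) Can the data be persisted? The constraints on the target table
--    have the last word when the DML runs, e.g.:
-- INSERT INTO dbo.Invoice (InvoiceId, Amount) VALUES (@id, @amt);
```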

    Whether or not the document is valid compared to its own schema doesn't really enter the picture; it would be extra information.  Are the others listed ("Python, Go, Java, PHP, .NET. Even MongoDB...") transaction engines?  Maybe Mongo, I don't know.

    If Sql Server could generate a JSON schema for any query that uses FOR JSON that would be amazing.  I still refer to (and use some modified code from) this article you wrote about 2 years ago:

    https://www.red-gate.com/simple-talk/blogs/producing-data-and-schemas-in-json-array-of-array-format/

    Microsoft could expand on this and make the functionality built in.  That would be great schema support I'd love to see.

    No one shall be able to drive us from the paradise that Cantor created for us

  • scdecade

    Yes, ISJSON() is a good start. The 'strict' option is an excellent precaution against typos, but neither of these tackles the issue of how your data should behave.  In a database, you will be dealing with special datatypes that are merely based on SQL Server's built-in datatypes. When you specify an integer for a month, for example, and it has to be constrained between 1 and 12, it isn't sufficient to know that it is an integer.  A filename mustn't have certain characters in it, and so on. A credit card number must conform to Luhn's algorithm. Many columns will have values that are based on enumerations (e.g. Spring, Summer, Autumn, Winter).

    If you are reading data from a table, and that table has constraints on it, you can be confident that your data is clean. When you are processing invoices, you will know that they are dated in a very narrow range of dates, and are unlikely to date from Roman times. The invoice amount will have certain rules. You wouldn't want to get that wrong.  If you are reading from JSON, any basic security will lead you to do a number of check constraints.
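The point about behaviour rather than datatype can be illustrated with a hypothetical table: the constraints, not the JSON, define how the values must behave, so anything shredded from a JSON document gets checked on the way in.

```sql
-- Invented example table: each column carries a behavioural rule that a
-- plain JSON type-check (integer, string...) could never express.
CREATE TABLE dbo.Invoice
  (
  InvoiceId    int NOT NULL PRIMARY KEY,
  InvoiceDate  date NOT NULL
    CONSTRAINT InvoiceDate_InRange  -- no invoices from Roman times
      CHECK (InvoiceDate >= '2000-01-01'),
  InvoiceMonth int NOT NULL
    CONSTRAINT InvoiceMonth_1_to_12 -- an integer isn't enough; it must be 1-12
      CHECK (InvoiceMonth BETWEEN 1 AND 12),
  Season       varchar(10) NOT NULL
    CONSTRAINT Season_Enumeration   -- enumeration, not free text
      CHECK (Season IN ('Spring', 'Summer', 'Autumn', 'Winter'))
  );
```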

    I produced a procedure for generating a JSON Schema from SQL Server, here: https://www.red-gate.com/simple-talk/sql/t-sql-programming/transferring-data-with-json-in-sql-server/. This is just the start, because it is impossible to map all SQL constraints directly to a JSON schema. It really just checks the datatypes, rather than letting the stored procedure error out. Because JSON Schema allows regexes, you can do a fair amount, but you soon reach the limits.

     

    Best wishes,
    Phil Factor

  • One way or another, I would like to come up with a way to deal with JSON schemas.  If I observe how my time is spent and do the "five whys", what it inevitably points out is that the current workflow can't be fully automated unless we have the schemas of all of the JSON documents used by our APIs.  What I'm trying to work out is a workflow that adds input/output JSON document schemas, as well as example data, as extended properties of stored procedures.  Then, in C#, the API build process would read the SQL extended properties for each endpoint and append them to the swagger.json file.

    No one shall be able to drive us from the paradise that Cantor created for us

  • Steve Collins wrote:

    One way or another, I would like to come up with a way to deal with JSON schemas.  If I observe how my time is spent and do the "five whys", what it inevitably points out is that the current workflow can't be fully automated unless we have the schemas of all of the JSON documents used by our APIs.  What I'm trying to work out is a workflow that adds input/output JSON document schemas, as well as example data, as extended properties of stored procedures.  Then, in C#, the API build process would read the SQL extended properties for each endpoint and append them to the swagger.json file.

    There's a better way, IMO.  Appending JSON schema (and example data) from SQL to swagger.json would potentially be an improvement over what we're doing now, but it would only be a small optimization, not automation.   It seems what I'm trying to do really fits into a larger picture that many other developers have already figured out.  It's called "contract first" development.  In this case the "contract" is a YAML document that describes an API and follows the OpenAPI (3.0.3) standard.  YAML is a superset of JSON, which means every JSON document is also a valid YAML document.  The OpenAPI standard [which is itself a YAML document (with its own JSON schema)] defines the path, parameters (including JSON SCHEMA(!!!)), headers, content body, etc. of an HTTP request.  If the starting point of SQL development is a YAML/JSON document, then really full automation could be built.  That's what I'm going to try to do.  It's my New Year's resolution.

    One way to enforce parameter contracts within SQL Server would be to build a generic converter from JSON Schema to a SQL Server VIEW.  This is similar to what ScottPletcher described here:

    https://www.sqlservercentral.com/forums/topic/extract-data-with-variable-column-names-and-order/page/2/#post-3819741

    Then the VIEW(s) would be used in stored procedures to fulfill the data contract.
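One way the generated VIEW might look, assuming (since a view cannot take parameters) that the incoming documents are staged in a table first — all object and column names here are invented for illustration:

```sql
-- Hypothetical staging table for incoming documents; the check constraint
-- rejects anything that isn't valid JSON at the door.
CREATE TABLE dbo.JsonInbox
  (
  DocId int IDENTITY PRIMARY KEY,
  Doc nvarchar(max) NOT NULL
    CONSTRAINT Doc_IsJson CHECK (ISJSON(Doc) = 1)
  );
GO
-- The view plays the part of the generated "contract": it shreds each
-- document into typed columns, so a missing required element ('strict')
-- or a failed conversion surfaces as soon as the view is queried.
CREATE VIEW dbo.CustomerContract
AS
SELECT j.DocId, c.CustomerName, c.Email
FROM dbo.JsonInbox AS j
CROSS APPLY OPENJSON(j.Doc)
  WITH (CustomerName nvarchar(50)  'strict $.name',
        Email        nvarchar(128) '$.email') AS c;
```

A stored procedure fulfilling the contract would then read from dbo.CustomerContract rather than parsing the raw JSON itself.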

    No one shall be able to drive us from the paradise that Cantor created for us
