Forum Api Project

  • It appears there are additional SQL functions/procedures needed to make the codegen work, which I mistakenly omitted.  The SQL code generated by the codegen references the following:

    /* tvf to return error columns, if the transaction fails */
    create function dbo.get_error_metadata()
    returns table with schemabinding
    as
    return
    (select
        error_number() error_number,
        error_severity() error_severity,
        error_state() error_state,
        error_line() error_line,
        error_procedure() error_procedure,
        error_message() error_message,
        xact_state() xact_state);
    go

    /* tvf to return error columns, if test exec fails */
    drop function if exists frm.get_output_error_metadata;
    go
    create function frm.get_output_error_metadata(
        @output_response nvarchar(max))
    returns table with schemabinding as return
    select [error_number], [error_severity], [error_state], [error_line],
           [error_procedure], [error_message], [xact_state]
    from openjson(@output_response)
    with ([error_number] int,
          [error_severity] int,
          [error_state] int,
          [error_line] int,
          [error_procedure] nvarchar(128),
          [error_message] nvarchar(4000),
          [xact_state] smallint);
    go
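To see the JSON shape these two functions agree on, here is a small Python sketch (outside SQL Server; column names taken from the WITH clause above, sample values invented) that parses the same error payload the way frm.get_output_error_metadata's OPENJSON mapping does:

```python
import json

# Example error payload in the shape produced by
# dbo.get_error_metadata() ... for json path, without_array_wrapper
# (values here are invented for illustration)
output_response = json.dumps({
    "error_number": 2627,
    "error_severity": 14,
    "error_state": 1,
    "error_line": 42,
    "error_procedure": "frm.api_user_registration_post",
    "error_message": "Violation of PRIMARY KEY constraint ...",
    "xact_state": -1,
})

def get_output_error_metadata(output_response: str) -> dict:
    """Mirror of the OPENJSON ... WITH column mapping above."""
    doc = json.loads(output_response)
    columns = ["error_number", "error_severity", "error_state", "error_line",
               "error_procedure", "error_message", "xact_state"]
    return {c: doc.get(c) for c in columns}

row = get_output_error_metadata(output_response)
print(row["error_number"], row["xact_state"])  # prints: 2627 -1
```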

    /* procedure to compare openapi document to system tables */
    drop proc if exists dbo.api_procedure_codegen2_post;
    go
    create proc dbo.api_procedure_codegen2_post
        @open_api nvarchar(max),
        @test_id bigint output,
        @response nvarchar(max) output
    as
    set nocount on;
    set xact_abort on;

    begin transaction
    begin try
        ;with
        endpoints_cte(jv_tag, uri, http_verb, resource_handler, [procedure_name], method_name,
                      proc_string, schema_string, procedure_definition) as (
            select json_value(path_info.tags, '$[0]'), paths.[key], http_method.[key],
                   jas.*, obj.*, object_definition(sp.[object_id])
            from openjson(@open_api) with (openapi nvarchar(32),
                                           info nvarchar(max) as json,
                                           [servers] nvarchar(max) as json,
                                           paths nvarchar(max) as json,
                                           components nvarchar(max) as json,
                                           [security] nvarchar(max) as json,
                                           tags nvarchar(max) as json) bc
            cross apply openjson(bc.paths) paths
            cross apply openjson(paths.[value]) http_method
            cross apply openjson(http_method.[value]) with (tags nvarchar(max) as json,
                                                            operationId nvarchar(max),
                                                            [parameters] nvarchar(max) as json,
                                                            [responses] nvarchar(max) as json) path_info
            /* decode the base64-encoded operationId via xml/varbinary(max) */
            outer apply (values (cast(cast(cast(path_info.operationId
                as xml).value('.', 'varbinary(max)') as varchar(max)) as nvarchar(max)))) v(oid)
            outer apply openjson(v.oid) with (resource_handler nvarchar(100),
                                              [procedure_name] nvarchar(100),
                                              method_name nvarchar(100)) jas
            outer apply (values (parsename(jas.[procedure_name], 1),
                                 parsename(jas.[procedure_name], 2))) obj(proc_string, schema_string)
            left join sys.procedures sp on obj.proc_string=sp.[name]
            left join sys.schemas sch on sp.schema_id=sch.schema_id
                                     and obj.schema_string=sch.[name]),
        json_cte as (
            select t.[name] as jv_tag,
                   (select * from endpoints_cte e where t.[name]=e.jv_tag for json path) epoints
            from openjson(@open_api, N'$.tags') with ([name] nvarchar(64)) t
        )
        select @test_id=cast(1 as bigint),
               @response=(select * from json_cte for json path, root('endpoints'));

        commit transaction;
    end try
    begin catch
        select @test_id=cast(0 as bigint),
               @response=(select * from dbo.get_error_metadata() for json path, without_array_wrapper);

        rollback transaction;
    end catch

    set xact_abort off;
    set nocount off;
    go
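As a rough illustration of what the procedure above does (a hedged Python re-implementation, not the real codegen): the triple cast through xml/varbinary(max) is T-SQL's idiom for base64 decoding, so the proc is effectively walking paths and verbs, decoding each operationId into a small JSON payload, and splitting procedure_name the way parsename does. The sample OpenAPI fragment below is invented:

```python
import base64
import json

# Invented OpenAPI fragment; the operationId is base64-encoded JSON,
# matching what the T-SQL triple cast decodes.
operation = {"resource_handler": "JsonResourceResultsHandler",
             "procedure_name": "frm.api_user_registration_post",
             "method_name": "PostRegisterAsync"}
open_api = {
    "tags": [{"name": "Account"}],
    "paths": {
        "/api/a/register": {
            "post": {
                "tags": ["Account"],
                "operationId": base64.b64encode(
                    json.dumps(operation).encode()).decode(),
            }
        }
    },
}

endpoints = []
for uri, methods in open_api["paths"].items():          # cross apply openjson(bc.paths)
    for http_verb, info in methods.items():             # cross apply openjson(paths.[value])
        jas = json.loads(base64.b64decode(info["operationId"]))
        schema_string, proc_string = jas["procedure_name"].split(".", 1)
        endpoints.append({
            "jv_tag": info["tags"][0],
            "uri": uri,
            "http_verb": http_verb,
            "procedure_name": jas["procedure_name"],
            "schema_string": schema_string,  # like parsename(..., 2)
            "proc_string": proc_string,      # like parsename(..., 1)
        })

print(endpoints[0]["http_verb"], endpoints[0]["proc_string"])
```

The real proc additionally joins the decoded names against sys.procedures/sys.schemas to pull object_definition for each match.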


    Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können ("No one shall expel us from the paradise that Cantor has created for us")

  • With the 2 SQL functions and 1 procedure above in place, the following steps should result in a functional Forum API with 32 .NET Core 3.1 API endpoints wired up and connected to 32 SQL Server stored procedures.

    1. Clone the GitHub repo and open the RelatableCodeGeneratorApi.sln solution file.
    2. Add SQL Server connection strings to the 2 appsettings.json files.
    3. In the Solution Explorer, in the RelatableCodeGenerator project, right-click ControllersTemplate1.tt and select 'Run Custom Tool'.  Do the same (right-click, select 'Run Custom Tool') for ProceduresTemplate1.tt (in the Procedures folder).
    4. Execute the SQL code generated by step #3 in the RelatableCodeGeneratorApi project folder.  In this case there are 4 Controllers, so there are 4 SQL scripts in the Procedures folder.  The SQL scripts are organized to match the same code pattern as the Controllers.
    5. Set the Visual Studio launch settings to RelatableCodeGeneratorApi (per the screenshot above) and hit F5 to build and launch the API in Debug mode.  The launch settings direct Visual Studio to launch the Swagger browser after building the API executable.
    6. A SwaggerUI website should launch in a browser which allows testing of all 32 endpoints.  The endpoints all point at stored procedures which simply (re-)return whatever JSON data was sent to them.  This way the dataflow can be put in place without the specific logic for every controller having to be in place.

    Why would anyone care?  This tool could be used for any project of any complexity.  Considering the number of lines of SQL per API endpoint (2,237/32 = about 70) and of C# per endpoint (264/32 = about 8), for a project with 200 endpoints the codegen would create about 1,650 lines of C# and 14,000 lines of T-SQL code.  The potential developer time savings seem considerable.  My goal is to help promote set-based solutions and to give DBAs a much better approach to supporting application development.  In my career it's been frustrating for a long time.
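The extrapolation above can be checked with a couple of lines of arithmetic:

```python
# Lines generated per endpoint in this 32-endpoint project,
# projected out to a hypothetical 200-endpoint project.
sql_lines, csharp_lines, endpoints = 2237, 264, 32
sql_per_endpoint = sql_lines / endpoints    # ~70
cs_per_endpoint = csharp_lines / endpoints  # ~8

projected_endpoints = 200
print(round(cs_per_endpoint * projected_endpoints))   # ~1,650 lines of C#
print(round(sql_per_endpoint * projected_endpoints))  # ~14,000 lines of T-SQL
```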

    How could someone use the codegen as a 'black box' to generate code for their own project?  The true input to the codegen is a Google Sheet.  The spreadsheet is called APITemplate and it contains a JavaScript script (which I got on the internet) which turns the contents of the sheets into a JSON file.  Copy/paste the JSON (from the APITemplate sheet) into the ControllersTemplate1.json file in the RelatableCodeGenerator project.  However many endpoints are described in the sheet, C# Controllers are automatically generated for them in the API project.  Building the API project launches SwaggerUI.  SwaggerUI creates an underlying file (swagger.json) which gets copy/pasted into the codegen project (as ProceduresTemplate1.json).  With those inputs in place, the individually named and organized SQL procedures can be generated.  Once the SQL has been executed, re-launch the API and voila.

    The process workflow is new, though.  The repository is a combination of different pieces I've pulled together, partly from different past projects.  Six weeks ago none of these codegen parts existed.  What's not new are the code patterns that are generated.  Those are based on past experience, and as far as I know they're open to further enhancement.

    With the 32 procedures pointed at mock versions, the next step is to create the production versions.  The production procedures will execute against the DDL from the pre-codegen code.  For most of the endpoints, old procs exist which worked the old way, so there needs to be a conversion to the new approach.  Also, my b.s. way of updating the code can end: code updates to the Forum Api Project will now include a GitHub pull request to the repo.


  • It's been a few weeks since there's been an update.  We've had some work upheaval, plus my son's 6th birthday party and the Memorial Day holiday.  Workwise I've been updating code which could be useful for this project too.  Hopefully it will all fit in well at some point.

    In many ways this project thread is my response to this other thread, No-code Software Engineering, where I was frustrated communicating about the necessity of ORM, or object relational mapping.  It's my position that ORM is frequently unnecessary and often counterproductive.  In the past it was perhaps necessitated by framework omissions which have since been filled in.  This project isn't using traditional ORM.  In the repository there are 3 project files in the solution, one of which is JsonAutoService, a C# data access library I built to support the various work projects I've been involved in over the years (many years, alas).  It takes advantage of SQL Server's ability to serialize/deserialize JSON data.  It also takes advantage of recent C# .NET Core features, like MVC filters, the HTTP route value dictionary, and the SQL Types library.  How it works is so simple and direct that it should've been like this the whole time, imo.  The proceduralists haven't come up with a framework yet to support set-based development (seemingly no plans either), so out of necessity I've tried my best to make one.

    An API sends and receives data requests according to internet protocols.  People create APIs to encapsulate data services away from UI and other applications.  In this project so far we've used standard HTTPS REST CRUD requests.  The parts which vary between HTTP requests are the HTTP verb (put, post, get, etc.), the URL route, the route parameters, the headers, and the message body.  There's authentication and authorization too.  The library contains high-speed generic request filters which can be set up to intercept data requests and send them directly to SQL Server without any custom C# code.  SQL Server receives JSON data from the requests in a standardized set of variables and returns JSON data as the responses.  The library also contains different variations for different use cases.  Although I keep referencing a "library", the underlying code is contained in this solution (.sln) as its own project file (.csproj).  Nothing is hidden or needs to be decompiled to be seen.

    The PostRegisterAsync method of the C# AccountController class is the endpoint which receives user account registration information.  The C# code (lines 18-21) is as follows:

    [HttpPost("api/a/register")]
    [AllowAnonymous]
    [TypeFilter(typeof(JsonResourceResultsHandler), Arguments = new object[] { "frm.api_user_registration_post" })]
    public void PostRegisterAsync() { }

    There are 3 C# Attributes assigned to this method.  Syntax-wise, Attributes are enclosed in brackets [].  Attributes in C# are similar to Extended Properties of SQL Server stored procedures: they are "extended" information about the method.

    Attribute 1: [HttpPost("api/a/register")] specifies the route suffix of the request.  The API base address (https://apiserver.domain.ext) plus the suffix is where requests get sent.

    Attribute 2: [AllowAnonymous] permits unauthenticated users access to the method.

    Attribute 3: [TypeFilter(typeof(JsonResourceResultsHandler), Arguments = new object[] { "frm.api_user_registration_post" })]  In C# a TypeFilter Attribute is used to define a Resource Filter.  Filters declared with the TypeFilter Attribute are executed automatically by the MVC request pipeline.  JsonResourceResultsHandler is the code which gets executed to handle filtered (intercepted) HTTP requests.  The TypeFilter passes an array of type object as constructor arguments.  By convention, JsonResourceResultsHandler expects the name of the SQL Server stored procedure to execute for this endpoint as that parameter.
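For readers more at home in Python than C#, attributes driving behavior like this are loosely analogous to decorators. A hedged sketch (all names invented, not part of the library): attach a procedure name to an empty handler, and let generic machinery read it later.

```python
# Loose Python analogue of the TypeFilter pattern: the decorator attaches
# metadata to an empty handler; a generic dispatcher reads it at request time.
def json_resource(proc_name):
    def wrap(fn):
        fn.proc_name = proc_name  # like the TypeFilter Arguments array
        return fn
    return wrap

@json_resource("frm.api_user_registration_post")
def post_register_async():
    pass  # body intentionally empty; the "filter" does the work

print(post_register_async.proc_name)  # prints: frm.api_user_registration_post
```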

    [AttributeUsage(AttributeTargets.Class | AttributeTargets.Method, AllowMultiple = true, Inherited = true)]
    public class JsonResourceResultsHandler : Attribute, IAsyncResourceFilter
    {
        private readonly JsonAutoServiceOptions _options;
        private readonly IJsonAutoService _jsonAutoService;
        private readonly string procName;

        public JsonResourceResultsHandler(IOptionsMonitor<JsonAutoServiceOptions> options, IJsonAutoService jsonAutoService, object procName)
        {
            this._options = options.CurrentValue;
            this._jsonAutoService = jsonAutoService;
            this.procName = procName.ToString();
        }

        public async Task OnResourceExecutionAsync(ResourceExecutingContext context, ResourceExecutionDelegate next)
        {
            var httpContext = context.HttpContext;
            var request = httpContext.Request;
            var user = httpContext.User;

            //Serialize Routes
            var jsonRoutes = JsonConvert.SerializeObject(context.RouteData.Values);

            //Serialize Headers
            var headerDictionary = _jsonAutoService.GetHeadersDictionary(request.Headers, user);
            var jsonHeaders = JsonConvert.SerializeObject(headerDictionary);

            //Get connection string
            var sqlConnection = _options.ConnectionString;

            //Handle request
            using (var reader = new StreamReader(request.Body))
            {
                var body = await reader.ReadToEndAsync();

                switch (request.Method)
                {
                    case nameof(SupportedMethods.GET):
                        var getResponse = await _jsonAutoService.SqlGetAsync(sqlConnection, jsonHeaders, procName, jsonRoutes);
                        context.Result = _jsonAutoService.JsonGetContentResult(getResponse.ToString());
                        break;
                    case nameof(SupportedMethods.PUT):
                        var putResponse = await _jsonAutoService.SqlPutAsync(sqlConnection, jsonHeaders, procName, jsonRoutes, body);
                        context.Result = _jsonAutoService.JsonPutContentResult(putResponse, _options.Mode, _options.ErrorThreshold);
                        break;
                    case nameof(SupportedMethods.POST):
                        var postResponse = await _jsonAutoService.SqlPostAsync(sqlConnection, jsonHeaders, procName, jsonRoutes, body);
                        context.Result = _jsonAutoService.JsonPostContentResult(postResponse, _options.Mode, _options.ErrorThreshold);
                        break;
                    case nameof(SupportedMethods.DELETE):
                        var deleteResponse = await _jsonAutoService.SqlDeleteAsync(sqlConnection, jsonHeaders, procName, jsonRoutes);
                        context.Result = _jsonAutoService.JsonDeleteContentResult(deleteResponse, _options.Mode, _options.ErrorThreshold);
                        break;
                    case nameof(SupportedMethods.HEAD):
                        var headResponse = await _jsonAutoService.SqlHeadAsync(sqlConnection, jsonHeaders, procName, jsonRoutes);
                        context.Result = _jsonAutoService.JsonHeadContentResult((bool)headResponse);
                        break;
                    default:
                        context.Result = _jsonAutoService.JsonDefaultContentResult();
                        break;
                }
            }
        }
    }

    First, it serializes the Routes.  Second, it serializes the Headers.  Third, it gets the SQL Server connection string from configuration.  Fourth, it "handles" the request by switching on the request method (GET, PUT, POST, DELETE, etc.) and executing SQL using ADO.NET methods.
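The dispatch step can be sketched outside C# too (invented names; the real dispatch is the switch statement above): look up a handler by HTTP method, falling back to a default result for unsupported methods.

```python
# Sketch of the filter's dispatch: map the HTTP method string to a
# handler name, with a default result for anything unsupported.
SUPPORTED = {"GET", "PUT", "POST", "DELETE", "HEAD"}

def handle(method: str, proc_name: str) -> str:
    if method not in SUPPORTED:
        return "default_content_result"
    # e.g. "POST" -> "sql_post_async(frm.api_user_registration_post)"
    return f"sql_{method.lower()}_async({proc_name})"

print(handle("POST", "frm.api_user_registration_post"))
print(handle("PATCH", "frm.api_user_registration_post"))
```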

    The PostRegisterAsync method is a POST method.  To handle a POST request the code which gets executed is (lines 57-60)

    case nameof(SupportedMethods.POST):
        var postResponse = await _jsonAutoService.SqlPostAsync(sqlConnection, jsonHeaders, procName, jsonRoutes, body);
        context.Result = _jsonAutoService.JsonPostContentResult(postResponse, _options.Mode, _options.ErrorThreshold);
        break;

    First, SqlPostAsync is executed, which returns a PostResult object.  Then JsonPostContentResult is called and its return value is assigned to context.Result.  context.Result is the framework request finalizer which returns the assigned result as the response to the request.

    SqlPostAsync is a method in the data access service class.  This part should be familiar to lots of C#/SQL developers:

    public async Task<PostResult> SqlPostAsync(string conString, string jsonHeaders, string procName, string jsonParams, string jsonBody)
    {
        _logger.LogInformation($"Sql Post params: {jsonHeaders}, {procName}, {jsonParams}, {jsonBody}");

        using (var sqlCon = new SqlConnection(conString))
        using (var sqlCmd = new SqlCommand(procName, sqlCon))
        {
            sqlCmd.CommandType = CommandType.StoredProcedure;

            // JSON inputs: request headers, route values, and message body
            var h = sqlCmd.Parameters.Add("@headers", SqlDbType.NVarChar, -1);
            h.Direction = ParameterDirection.Input;
            h.Value = jsonHeaders;
            var p = sqlCmd.Parameters.Add("@params", SqlDbType.NVarChar, -1);
            p.Direction = ParameterDirection.Input;
            p.Value = jsonParams;
            var b = sqlCmd.Parameters.Add("@body", SqlDbType.NVarChar, -1);
            b.Direction = ParameterDirection.Input;
            b.Value = jsonBody;

            // outputs: success/failure flag and the JSON response
            var t_id = sqlCmd.Parameters.Add("@test_id", SqlDbType.BigInt);
            t_id.Direction = ParameterDirection.Output;
            var r = sqlCmd.Parameters.Add("@response", SqlDbType.NVarChar, -1);
            r.Direction = ParameterDirection.Output;

            await sqlCon.OpenAsync();
            await sqlCmd.ExecuteNonQueryAsync();
            sqlCon.Close();
            return new PostResult((SqlInt64)t_id.SqlValue, (SqlString)r.SqlValue);
        }
    }

    And finally, the result of the SqlPostAsync method is passed to the method which creates the ContentResult for the client (who sent the data request), along with some logging and error handling.

    public ContentResult JsonPostContentResult(PostResult postResponse, string mode, int errorThreshold)
    {
        if (postResponse.IsValid)
        {
            return new ContentResult
            {
                StatusCode = StatusCodes.Status201Created,
                Content = postResponse.Body.ToString(),
                ContentType = Json
            };
        }
        else
        {
            var errorResult = JsonConvert.DeserializeObject<ErrorResult>(postResponse.Body.ToString());
            var errorMessage = ErrorMessage(mode, errorThreshold, errorResult, SupportedMethods.POST);

            return new ContentResult
            {
                StatusCode = StatusCodes.Status400BadRequest,
                Content = JsonConvert.SerializeObject(errorMessage),
                ContentType = Json
            };
        }
    }
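The decision above boils down to: a valid PostResult passes the proc's JSON body through untouched as 201 Created, while an invalid one reshapes the error metadata into a 400 Bad Request. A hedged Python sketch of that logic (names and error shape invented for illustration):

```python
import json

def json_post_content_result(is_valid: bool, body: str) -> dict:
    """Sketch of the 201-vs-400 decision; not the real library code."""
    if is_valid:
        # success: pass the stored procedure's JSON response through
        return {"status": 201, "content": body,
                "content_type": "application/json"}
    # failure: the proc returned error metadata as JSON; reshape it
    error = json.loads(body)
    return {"status": 400,
            "content": json.dumps({"message": error.get("error_message")}),
            "content_type": "application/json"}

ok = json_post_content_result(True, '{"user_id": 7}')
bad = json_post_content_result(False, '{"error_message": "duplicate email"}')
print(ok["status"], bad["status"])  # prints: 201 400
```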

    The PostRegisterAsync Controller method itself returns void, takes no parameters, and has no code body, because by design the request gets intercepted and "handled" before the C# program delegate ever reaches the method.

            public void PostRegisterAsync() { }

    That's how it works.  At no point is there any object relational mapping.  Imo, ORM in this context would be wasteful.  Requests are intercepted, and inputs are JSON-serialized and sent to SQL Server at the earliest possible point in the request pipeline.  Responses are sent back to the client without C# deserializing the payload.


  • Hi Steve Jones, when these posts were consolidated from what were originally 7 separate threads, it appears a post (or posts) containing the DDL and explanation for having a system-versioned edition of the Posts table (and insert procedure) to compare with the regular table (and rCTE proc) was not included.  Now I see the individual posts were removed and maybe blown away forever?  If it's possible and not too involved, it would be nice to add that part back to this thread.  Thank you for everything you've done.

    Currently I'm creating a single DDL_and_Example_Data_Setup.sql script to handle setting up the schema, base tables, and examples.  Then it'll be added to the repository.  I went back through the old emails and old db instance to get the complete set of tables.  Some tables have been omitted so far because they're only needed for admin or to store ID cross-references.  There were a bunch of tables (and triggers) to store and handle upvotes/downvotes for the Threads and Posts (TP).  Other tables store the event of TP content being banned.  Other tables store the event of TP content being marked inappropriate.  These extra tables need to be added to the script, API endpoints need to be created for these and other admin purposes, and the system-versioned approach needs management tables added too.  Then rerun the C# codegen.  Then launch the API to get swagger.json.  Then copy/paste swagger.json into ProceduresTemplate1.json in the CodeGen project and create SQL scripts in the API Procedures folder.  Then execute the generated SQL.  Then relaunch the API and voila, a boatload of SQL procedures are ready to be written/modified.  What the old triggers were doing could be done in the procs.  A lot of SQL.


  • I'll try to search those out, but not sure. The threading in here isn't really a place to run a project or post the code. The GH repo is much better.

  • Yes, the repository is the better place for the code.  There are going to be many changes, and now this thread seems incomplete.  Maybe it makes sense to start over when it works and create a new repo as well.  Nobody forked it, so it doesn't matter.  The current RelatableCodeGeneratorApi repository is serving a sort of dual purpose as a training/demo app.  I'll create one specifically for this project.

