EF Code-First

  • What is the DBA world's consensus on using this to implement databases for an application? An application developer presented this as the "future of EF", but I do not know much about "EF Code-First"; it looks like it's coding the database as classes in a .NET app and then applying it to a database.

    I feel it's more for developers who do not have a database-savvy person to work with on implementing a database.

    Thank you in advance for any information/direction on this.

    Michael

  • EF Code First is, in the eyes of most database professionals, a paraphrase of *^%#Fxx&^%$& (get my drift?)

    😎

    EFCF may, in the simplest of scenarios, be usable for storing an application's users' configurations and preferences, but it normally does not scale for any amount of data. I've spent much more time rectifying EFCF systems than it would have taken to design them properly from the start. The point is to get a database professional into the loop as early as possible!

  • EFCF is great if you want everything to be NVARCHAR(256) and even True/False columns to be NUMERIC(18,0).  That's great for the economy, as well, because it keeps a raft of "Performance Tuning" consultants gainfully employed virtually forever.  It will also help you justify the need for monster servers in a large farm with dozens of CPUs and terabytes of memory, a SAN as big as a house, at least 100 NICs per server, and at least a thousand for the SAN.  You'll also never be bothered by troublesome upgrades to the code because no one will be able to actually find anything, and they'll just write all new code if they ever need to do something new.  Just think... you'll also be way ahead on continuous deployments and integration.  And you'll be able to justify buying the latest and greatest software tools on a quarterly basis to support it all, partially because of the short lifespan of such software and because you'll need it all to support the code you're writing.  (A concrete sketch of that kind of schema follows at the end of this post.)

    Heh... not sorry for the sarcasm here.  Every time I've seen the likes of EFCF used by people who don't understand the true meaning of Knuth's parable and misuse it to justify sheer lunacy, there have been serious problems with performance (especially in the face of scalability), accuracy of code, resource usage, and data accuracy.

    While certainly not true in all cases, I've found that a whole lot of Developers who consider themselves knowledgeable in SQL (any flavor) actually aren't, and if knowledge about how relational databases are supposed to work were gasoline, a high percentage of them wouldn't have enough to run a piss ant's mini-bike through a matchbox.

    Most DBAs and Database Developers wouldn't presume to know what's best for the front-end code.  Front-end coders need to stop making that presumption about back-end code and object structure.  A really good team has really good people on both sides of that.

    I agree with Eirikur... get the full-time data people involved from the get-go.  If you don't have at least one, hire one, or you'll need to hire many later on to fix all the garbage that's been produced.
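    To put the data-typing complaint at the top of this post into concrete terms, here is a hypothetical T-SQL sketch of the kind of schema a neglected code-first mapping can end up producing, next to a hand-designed equivalent.  The table and column names are invented for illustration, and EF's actual defaults depend on the version and configuration, so treat this as a caricature of the pattern being described rather than what any given EF setup will generate:

    -- Hypothetical: a code-first mapping nobody reviewed.
    create table dbo.CustomerFlags_Generated(
        Id       numeric(18,0) not null,  -- surrogate key stored as NUMERIC(18,0)
        Name     nvarchar(256) null,      -- every string defaulted wide and nullable
        IsActive numeric(18,0) null       -- a true/false flag stored as NUMERIC(18,0)
    );
    go

    -- Hand-designed equivalent with deliberate types and constraints.
    create table dbo.CustomerFlags_Designed(
        Id       int identity(1,1) not null constraint pk_CustomerFlags primary key,
        Name     varchar(50) not null,
        IsActive bit not null constraint df_CustomerFlags_IsActive default (0)
    );
    go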

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.

    Change is inevitable... Change for the better is not.


    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)

  • Jeff Moden wrote:

    EFCF is great if you want everything to be NVARCHAR(256) and even True/False columns to be NUMERIC(18,0). [...]

    CF normally doesn't stand for Code First; it's more commonly shorthand for Cluster Fxxk, but then again that would be the same thing.

    😎

     

  • Eirikur Eiriksson wrote:

    CF normally doesn't stand for Code First; it's more commonly shorthand for Cluster Fxxk, but then again that would be the same thing.

    😎

    Heh... I'll remind you that I spent almost 9 years in the Submarine Service.  The connotation wasn't lost on me.  I just decided to not take it totally vertical. 😀

    --Jeff Moden



  • SQL Espo wrote:

    What is the DBA world's consensus on using this to implement databases for an application? [...]

    EFCF has a lot of negative issues.  The alternatives are much better in terms of database design, but the workflow and the amount of code needed are usually much greater.  To help with this I've created an open-source (Apache 2.0) C# library (.NET Standard 2.0) called JsonAutoService.  Among other things it implements 3 different (depending on use case) generic SQL handlers as filters in the .NET Core http pipeline.  In most use cases SQL is handled by a Resource Filter that "short circuits" the http pipeline before UI binding and before Action Filters.  In .NET Core all "Controllers" are Action Filters.  JsonAutoService is faster, simpler, and more secure than any of the current methods, afaik.  Again, afaik there simply is not a better way to execute SQL stored procedures if you wish to make use of .NET Core authentication/authorization.

    The library is a work in progress and it hasn't really been officially released yet.  The Github repo currently contains the library code and an example project along with sample SQL scripts.  We're a few days away from having the public Nuget.org package available.  If you're interested you can view the C# and SQL code at JsonAutoService.

    The traditional alternatives to EF are ORMs like Dapper and NHibernate.  The no-framework approach is to write the ADO.NET code yourself for each procedure.  Because ADO.NET code is typically placed in (or called from) a .NET Controller, all processing happens AFTER JsonAutoService would've already exited the request pipeline.

     

    Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können ("No one shall expel us from the paradise that Cantor has created for us")

  • I could be wrong but I'm pretty sure that's not what the OP is talking about.  I believe he's talking about the front-end Developers creating the tables on the fly to match their code rather than doing a decent database design first and then matching code to use that... with or without the use of stored procedures, which is a whole 'nuther argument that I'm not going to get involved in because it won't change anyone's mind.

    --Jeff Moden



  • Jeff Moden wrote:

    I could be wrong but I'm pretty sure that's not what the OP is talking about.  I believe he's talking about the front-end Developers creating the tables on the fly to match their code rather than doing a decent database design first and then matching code to use that... with or without the use of stored procedures, which is a whole 'nuther argument that I'm not going to get involved in because it won't change anyone's mind.

    If it's the former, then my experience exactly parallels Eirikur's and yours: EFCF is junk.  You can steer a car with your feet, but that doesn't make it a good idea.

    Jeff Moden wrote:

    ...doing a decent database design first and then matching code to use that...

    The days of having to write "matching code" are over.  I've been developing this way for over a year now.  The resultant applications can speak for themselves.  Literally.  They can speak quite well and perform transactions based on speech.

    Jeff Moden wrote:

    "with or without the use of stored procedures, which is a whole 'nuther argument"

    The argument for procs is even more overwhelming than it is for integer surrogate keys!  To date, though, there hasn't been a design pattern which exploits the inherent advantages.  If nothing else, from a security point of view there isn't a serious alternative, IMO.

     

    Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können

  • I'll have to disagree with the stance you took (or at least the stance it looks like you took, I could be wrong) on "matching code".  I'm not talking about basic CRUD.  I'm talking about people using the front end designers to create tables.  Some people do it right but, for the most part, it's done by people that have no clue when it comes to database design.

    On your last point, I'm confused.  Are you saying that no one has created designs that exploit the inherent advantages of using stored procedures?  And, yes, I do agree about the security advantages procs provide.  I had the very rare privilege of working on an app where the application login only needed connect privileges and permission to execute certain stored procedures.  Security-wise, it was a joy to behold.
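    For anyone who hasn't seen that model, here is a minimal T-SQL sketch of the kind of least-privilege setup being described.  The role, procedure, and user names are hypothetical; the real application's schemas and objects would differ:

    -- Hypothetical least-privilege setup: the app's database user can execute
    -- specific procedures but has no direct table permissions at all.
    create role app_exec_only;

    -- No SELECT/INSERT/UPDATE/DELETE grants on any table.
    -- Only EXECUTE on the procedures the application is allowed to call.
    grant execute on object::dbo.usp_GetOrder  to app_exec_only;
    grant execute on object::dbo.usp_SaveOrder to app_exec_only;

    -- Add the application's database user (mapped from its login) to the role.
    alter role app_exec_only add member app_user;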

    --Jeff Moden



  • Jeff Moden wrote:

    I'm talking about people using the front end designers to create tables.  Some people do it right but, for the most part, it's done by people that have no clue when it comes to database design.

    What happened to your Red Adair quote?  That was the perfect response in this situation

    Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können

  • Steve Collins wrote:

    Jeff Moden wrote:

    I'm talking about people using the front end designers to create tables.  Some people do it right but, for the most part, it's done by people that have no clue when it comes to database design.

    What happened to your Red Adair quote?  That was the perfect response in this situation

    No one would stop hiring amateurs on the upswing of the Dunning-Kruger curve, so I thought no one was listening. 😀  With that being said, the new quote is now much more appropriate. 😀

    --Jeff Moden



  • Jeff Moden wrote:

    Are you saying that no one has created designs that exploit the inherent advantages of using stored procedures?

    From a SQL perspective I'm not saying that at all.  SQL developers have maxed out everything over the years and the community has done a great job.  It's the "matching code" part where things have fallen down.  For a long time Microsoft gave equal lip service to "database first" development.  No longer, though.  EF makes them money in more than one way, partly because it doesn't scale, among other issues.  And CRUD is really not "basic" even though it's been around for so long.  The workflow to implement CRUD with stored procedures hasn't changed much in 30 years.

    There's also a whole laundry list of technologies (ADO.NET, data readers, data adapters, DataTable objects, .NET Json, LINQ to SQL, etc.) that offer a small subset of sub-optimal SQL functionality, all in the name of making database development easier, and which don't create/use stored procedures.  EF, Dapper, and NHibernate can each manage and run procs, but they really do nothing to make SQL programmers' lives better.  Given the security situation, why are these tools so prevalent?  It's because building and maintaining proc-based systems has always required two experts.  The other guys have spent 20 years trying to replace DBAs.

    SQL needs its own answers to the historical issues using the latest, most modern platform: C#, .NET Core, and SQL type libraries.  All the pieces are available separately to build a really kickass library.  The problem with the old .NET full framework is/was a fundamental issue with the http request pipeline.  It isn't (and wasn't) possible to access the request route values from within the program delegate.  Also, the C# Sql type library didn't exist until 5 years ago.  So there were good reasons nothing happened for so long.

    Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können

  • Just to add more history here.  For application developers using SQL Server databases, there used to be something called the Microsoft Enterprise Library.  Among other things it included tools and a framework for running procs which were useful to SQL developers.  The last version came out in 2013, and some people (maybe just me) still inquire about a replacement.  Or they used to... it's been a while.

    https://en.wikipedia.org/wiki/Microsoft_Enterprise_Library

    JsonAutoService is completely unrelated to that.  It's very narrowly focused, just on SQL Server data access from C#.  In my head JsonAutoService is what somebody else should've done but didn't.  Who?  I don't know.  The word "enterprise" implies things, but JsonAutoService is completely free and permissively licensed (Apache 2.0).  The whole library is only 25.5 KB.  The description we've come up with now is just "The Database First (DF) API framework for .NET Core", because that's all it is.

    Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können

  • I'm intrigued, but I have absolutely no idea what this looks like in practice. Have you got a friendly introduction somewhere? Video? Some code to look at? And what does this sentence mean... "It isn't (and wasn't) possible to access the request route values from within the program delegate." ? Are you letting the DB respond directly to an HTTP request?

  • bbsimonbb wrote:

    I'm intrigued, but I have absolutely no idea what this looks like in practice. Have you got a friendly introduction somewhere? Video? Some code to look at? And what does this sentence mean... "It isn't (and wasn't) possible to access the request route values from within the program delegate." ? Are you letting the DB respond directly to an HTTP request?

    Hi Simon, thanks for asking about this.  For database developers the introduction is through 3 SQL scripts:

    1. JsonAutoService.sql
    2. JsonAutoServiceExamples.sql
    3. JsonAutoServiceExamplesSQLExecution.sql

    Script 1 contains the sample table definitions, procedures, functions, etc. needed to demonstrate the C# library's capabilities.  Script 2 contains examples of (non-framework) CRUD operations using the sample tables.  Script 3 contains SQL code to execute the exact same CRUD operations (as in Script 2) using procedures (created in Script 1) which follow the framework design pattern.  If the framework pattern is followed, then the C# library can execute stored procedures automatically (with only a route), faster and more securely than any other method which makes use of AspNetCore authentication.

    The built-in examples make use of 3 tables:

    • products
    create table jas.products(
        p_id bigint identity(1,1) constraint pk_products primary key not null,
        product_name nvarchar(20) unique not null,
        created_dt datetime2 not null);
    go

    • options
    create table jas.options(
        o_id bigint identity(1,1) constraint pk_ingredients primary key not null,
        option_name nvarchar(20) unique not null,
        created_dt datetime2 not null);
    go

    • product_options
    create table jas.product_options(
        po_id bigint identity(1,1) constraint pk_products_options primary key not null,
        p_id bigint constraint fk_products references jas.products(p_id) not null,
        o_id bigint constraint fk_options references jas.options(o_id) not null,
        product_option nvarchar(20) null,
        created_dt datetime2 not null,
        constraint unq_product_options_p_o unique (p_id, o_id));
    go

    Each table has a BIGINT IDENTITY(1, 1) NOT NULL primary key.  The examples in Script 2 demonstrate CRUD (Create/Read/Update/Delete) operations in SQL Server as INSERT/SELECT/UPDATE/DELETE statement(s) within simple explicit transactions.  The intention of Script 2 is to demonstrate CRUD in SQL Server as simply as possible.

    The first example in Script 2 is a single-line insert into the PRODUCTS table.  This is the non-framework SQL code:

    begin transaction
    insert jas.products values('Phone 1', sysutcdatetime());
    commit;

    The same code when incorporated into the framework pattern procedure code (from Script 1) is as follows:

    create proc jas.api_product_post
        @headers  nvarchar(max) = null,
        @params   nvarchar(max) = null,
        @body     nvarchar(max) = null,
        @test_id  bigint output,
        @response nvarchar(max) output
    as
    set nocount on;
    set xact_abort on;

    begin transaction
    begin try
        declare
            @p_id    bigint,
            @p_count bigint;

        insert jas.products(product_name, created_dt)
        values(json_value(@body, N'strict $.product_name'), sysutcdatetime());

        select @p_count = rowcount_big();
        select @p_id = cast(scope_identity() as bigint);

        if @p_count <> 1
            throw 50000, 'No rows inserted', 1;

        select
            @test_id  = @p_id,
            @response = (select N'Ok' reply_message, @p_id p_id for json path, without_array_wrapper);

        commit transaction;
    end try
    begin catch
        select
            @test_id  = cast(0 as bigint),
            @response = (select * from jas.error_metadata() for json path, without_array_wrapper);

        rollback transaction;
    end catch

    set xact_abort off;
    set nocount off;
    go

    Every CRUD procedure (Create, Read, Update, and Delete) uses the same inputs/outputs pattern.

    The different value inputs/outputs are all serialized to JSON.  By generalizing the interfaces between application requests/responses and the stored procedures, the automation can be made as close to 100% as possible, depending on the use case.
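    To make the "same pattern" point concrete, here is a minimal sketch of what a matching Read procedure could look like.  It is illustrative only and is not taken from Script 1, so the actual procedure there may differ in its details:

    -- Hypothetical Read procedure following the same inputs/outputs pattern
    -- as jas.api_product_post: JSON in, JSON out, plus an id output parameter.
    create proc jas.api_product_get
        @headers  nvarchar(max) = null,
        @params   nvarchar(max) = null,
        @body     nvarchar(max) = null,
        @test_id  bigint output,
        @response nvarchar(max) output
    as
    set nocount on;

    begin try
        -- The id to read is expected among the serialized route/query parameters.
        declare @p_id bigint = cast(json_value(@params, N'strict $.p_id') as bigint);

        select
            @test_id  = @p_id,
            @response = (select p_id, product_name, created_dt
                         from jas.products
                         where p_id = @p_id
                         for json path, without_array_wrapper);
    end try
    begin catch
        select
            @test_id  = cast(0 as bigint),
            @response = (select * from jas.error_metadata() for json path, without_array_wrapper);
    end catch

    set nocount off;
    go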

    Continuing the example, the SQL to perform the exact same insert into the PRODUCTS table as above ('Phone 1'), using the example framework procedure created by Script 1, is provided in Script 3:

    declare
        @out_test_id  bigint,
        @out_response nvarchar(max);

    exec jas.api_product_post
        @headers  = null,
        @params   = null,
        @body     = '{ "product_name": "Phone 1" }',
        @test_id  = @out_test_id output,
        @response = @out_response output;

    print (cast(@out_test_id as varchar(12)));
    print (@out_response);
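    Assuming the insert succeeds and, for the sake of illustration, the identity value generated happens to be 1, the two PRINT statements would produce output along these lines (the actual p_id depends on the table's identity state):

    1
    {"reply_message":"Ok","p_id":1}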

    If the framework pattern is followed, then the C# library can "automatically" send request inputs to the procedures in a "high speed" way by short-circuiting the http pipeline in .NET Core.  Because the inputs/outputs are serialized, there is no POCO necessary.  No SQL embedded in the application is necessary either.  JSON gets sent to SQL Server and JSON gets returned to the requester.

    This is a long post.  I'll post a follow-up which uses the C# library to execute the example above in an HTTP request.

    Aus dem Paradies, das Cantor uns geschaffen, soll uns niemand vertreiben können
