Invoking Stored Procedures through a One Line Native .NET Method Call

  • Comments posted to this topic are about the item Invoking Stored Procedures through a One Line Native .NET Method Call

  • Thanks Mark. This will be helpful to me. 🙂

  • Advertising by any other name......

  • I cannot find any independent editorial reviews. Is this product too new (think, untested) to have been reviewed yet? Or perhaps it is too unimportant to be reviewed?

    Paul DB

  • Nice Ad. You should have included the price and licence in the article.

    Anyway, I think that if you only use stored procedures it can be a good idea, but if you have ad-hoc queries too then I'd rather have it all in the same 'place' (Entity Framework, NHibernate, etc.).

  • Brilliant.

  • I've never seen a C# class file look like the one you have with an @ in front of every field and the class name.

    Hrm.

    Yeah. Nice ad. Prefer SMO.

  • First off, good job. I am starting to see this a lot; my company does something similar where they generate strongly-typed stored procedure classes. I've actually had a data-access framework for many years that requires no custom code or code generators and supports ORM. It has helped me get data access up and running over the years with only one AppSetting and very easy calling code.

    For example:

    Dictionary<string, object> parameters = new Dictionary<string, object>();
    parameters.Add("@OrderID", orderId);
    DataTable dtData = DataService.Fill("dbo.usp_MyStoredProc", CommandType.StoredProcedure, parameters);

    If I just want to execute the proc without any returns, it's just as easy:

    int rowsAffected = DataService.ExecuteNonQuery("dbo.usp_MyStoredProc", CommandType.StoredProcedure, parameters);
    System.Diagnostics.Debug.WriteLine("Rows Affected: " + rowsAffected.ToString());

    I've seen these strongly-typed stored-proc classes being generated in some people's frameworks now, and to me it's still too complicated. I have no extra code to maintain; the DAL does it all. It takes care of building the procs on the fly (caching them for a configurable period of time) so that subsequent calls to the same proc are as fast as possible. I've thought about posting the code for some time; I just wanted to get support in there for the IQueryable interface so that it could run LINQ queries as well.

    Oh, by the way, it also supports data paging out of the box and optimizes any ad-hoc SQL you write. It does not let any "SELECT *" statements come across the wire, and it also helps you parameterize all ad-hoc SQL statements if you choose to use them (and you will, once you start data paging).
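    For anyone wondering what the Fill helper actually does under the hood, here's a rough sketch. The DataService name, the connection-string key, and the exact signatures are illustrative; this is not the actual framework code, which also adds the command caching mentioned above:

    using System;
    using System.Collections.Generic;
    using System.Configuration;
    using System.Data;
    using System.Data.SqlClient;

    public static class DataService
    {
        // One connection-string setting is all the framework needs (key name is illustrative).
        private static readonly string ConnectionString =
            ConfigurationManager.ConnectionStrings["Default"].ConnectionString;

        // Run a proc (or parameterized ad-hoc SQL) and return the result set as a DataTable.
        public static DataTable Fill(string commandText, CommandType commandType,
                                     IDictionary<string, object> parameters)
        {
            using (var connection = new SqlConnection(ConnectionString))
            using (var command = new SqlCommand(commandText, connection))
            {
                command.CommandType = commandType;
                AddParameters(command, parameters);

                var table = new DataTable();
                using (var adapter = new SqlDataAdapter(command))
                    adapter.Fill(table); // SqlDataAdapter opens and closes the connection itself
                return table;
            }
        }

        // Same idea for procs with no result set; returns the rows-affected count.
        public static int ExecuteNonQuery(string commandText, CommandType commandType,
                                          IDictionary<string, object> parameters)
        {
            using (var connection = new SqlConnection(ConnectionString))
            using (var command = new SqlCommand(commandText, connection))
            {
                command.CommandType = commandType;
                AddParameters(command, parameters);
                connection.Open();
                return command.ExecuteNonQuery();
            }
        }

        private static void AddParameters(SqlCommand command, IDictionary<string, object> parameters)
        {
            if (parameters == null) return;
            foreach (KeyValuePair<string, object> p in parameters)
                command.Parameters.AddWithValue(p.Key, p.Value ?? DBNull.Value);
        }
    }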

    That's my 2 cents...

  • Am I missing something or does LINQ do this but better?

  • LINQ does do this. There are three flavors of LINQ: LINQ to SQL, LINQ to Objects, and LINQ to Entities (i.e. for ADO.NET). If you hook up SQL Profiler to a LINQ application, the queries hitting the database are horrible (except for those where you're actually calling a stored proc). The only thing I don't like about LINQ is that all the stored procs are methods at the root of the generated object model. You MIGHT be able to change this, but each time it regenerates the code it will overwrite it. So, if you have 500+ procs like we do, there's no way to logically separate them to make the code cleaner to call.
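    For reference, this is roughly the shape of what the designer emits for each mapped proc: one method hung directly off the DataContext, which is why hundreds of them end up side by side in one place. The context, proc, and result-type names below are made up for illustration.

    using System;
    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Reflection;

    public partial class AppDataContext : DataContext
    {
        public AppDataContext(string connectionString) : base(connectionString) { }

        // One of these is generated per proc; with 500+ procs they all sit at this level.
        [Function(Name = "dbo.usp_GetContactByID")]
        public ISingleResult<ContactResult> usp_GetContactByID(
            [Parameter(Name = "ContactID", DbType = "UniqueIdentifier")] Guid? contactID)
        {
            IExecuteResult result = this.ExecuteMethodCall(
                this, (MethodInfo)MethodInfo.GetCurrentMethod(), contactID);
            return (ISingleResult<ContactResult>)result.ReturnValue;
        }
    }

    // Simplified result type for the proc above.
    public class ContactResult
    {
        [Column(Name = "FirstName")] public string FirstName { get; set; }
        [Column(Name = "LastName")]  public string LastName  { get; set; }
    }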

  • This is definitely an interesting article, but, to echo some of the others, it does have the feel of an ad.

    Moreover, it seems that this would buy you relatively little that LINQ does not do for you. I agree that the SQL LINQ sends may not be fully optimized for most things, but it seems to handle stored procedure calls relatively elegantly.

    ---
    Timothy A Wiseman
    SQL Blog: http://timothyawiseman.wordpress.com/

  • As a software developer of 24 years moving into the DBA role, I was impressed with your solution and your understanding of the problems.

    I couldn't agree more about the need for seamless integration of the source code, IntelliSense, and the current underlying database schema for developers. In my 8 years as a SQL programmer I have used MFC, ADO, a self-designed set of datatype-sensitive container classes, and in one case purely dynamic SQL to work around some of the difficulties associated with using stored procedures. Getting data into and out of the database, from and to memory, was a difficult proposition fraught with the potential for programmer error. There were too many issues and variables to consider for every database access to do it reliably 100% of the time.

    Combine that with database alterations unknown to the software development staff and you have a failed application.

    You are absolutely correct. The errors should show up at compile time, not run time. And it should be much easier for programmers to implement database access correctly. That can only occur if they have the proper tools.

    Yours are a well-thought-out set of tools.

    LC

  • Just a word of warning. This all looks good at first, but you will find it largely unmaintainable in the future. Having to constantly regenerate "wrappers" for all your procs every time there is one little change to the parameters is daunting. Couple that with the fact that developers cannot even begin to code their business layers until you're done regenerating all the procs, and it becomes very, very unproductive.

    Our company does this today and has been using this strategy for a couple of years now. It's taken developer turnaround from a couple of days to a couple of weeks. Not to mention that the assembly they generate all these wrappers into has now become an extra 4-10 MB of bloat that must be passed around with all applications (I'm not worried about the disk space, but rather about how much memory is needed to host it). We will soon look into changing this, and they are very interested in the strategy I mentioned earlier. The DAL I created to do this was written long before LINQ existed, and having used LINQ myself, I will probably continue using my own because of the power it has over LINQ. For one, I have the ability to data-map my objects with simple attributes, and can map to columns (including multiple columns) and parameters. This allows for seamless integration between the business objects and DataReaders, DataTables, and Commands.

    Here is what a sample business object might look like. There is no code for data mapping and no direct invocation of stored procedures. The only requirement is that you override the "OnSave" base method. Also, how you fill the object is up to you (notice the constructors); usually I write a Manager / Director / BusinessService (whatever we're calling them these days) that has all the surrogate methods for my object (e.g. ContactDataManager.FillByID(Guid ID)). Take a look at the static method on the class (e.g. Contact c = Contact.GetByID(Guid.Empty)).

    [DataMap("dbo.usp_save_Contact", System.Data.StatementType.Insert)]

    [DataMap("dbo.usp_save_Contact", System.Data.StatementType.Update)]

    [DataMap("dbo.usp_delete_Contact", System.Data.StatementType.Delete)]

    public class Contact : BusinessBase

    {

    private string fname;

    [DataMap("FirstName"), DataMap("ContactFirstName"), DataMap("@FirstName")]

    [RequiredRule(), MinLengthRule(1), MaxLengthRule(25)]

    public string FirstName

    {

    get { return (this.fname == null ? string.Empty : this.fname); }

    set {

    if (this.fname == value) return;

    base.OnPropertyChanging("FirstName"); // support for DataBinding and LINQ

    this.fname = value;

    base.OnPropertyChanged("FirstName"); // support for DataBinding and LINQ

    }

    }

    protected overrides OnSave(EventArgs e)

    {

    base.OnSave(e);

    if (e.Cancel) return;

    // Save the object to the Database

    DataService.Update((IUpdatableObject)this);

    }

    public static Contact GetByID(Guid ID)

    {

    DataTable dt = ContactDataManager.GetByID(ID);

    if (dt.Rows.Count == 0) return null;

    return new Contact(dt.Rows[0]);

    }

    // Constructors

    public Contact() : base() {}

    public Contact(DbDataReader Data) : base(Data) {}

    public Contact(DataRow Data) : base(Data) {}

    }

    Example:

    Contact c = Contact.GetByID(Guid.Empty);
    if (c == null) c = new Contact();
    c.FirstName = "John";
    if (!c.Save())
    {
        var rules = c.BrokenRules();
        // TODO: display the broken rules
        // (see the Rule attributes applied to each property)
    }

    LINQ is nice, as it does a lot of this for you. However, if you ever need your business object to come from multiple sources and be filled from two different column names (not that uncommon when you work with tables AND views, or alias columns in procs), then you will have to create an entirely new set of LINQ objects, because they can't map that way (believe me, I tried). LINQ is best used in very simple scenarios to get developers going fast; once you need enterprise-level code, it severely falls short. I agree, if all you want is the nice little proc wrapper code it creates, then use it. By the way, the business object I wrote above is supported by LINQ and can also be passed into the base "System.Data.Linq.DataContext" and stored in a LINQ change set for later persistence.
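    To make the multiple-column-name idea concrete, here's a rough sketch of how an attribute-driven mapper can fill a property from whichever mapped column actually exists in the row. The attribute and helper below are simplified for illustration and are not the actual framework code (the real one also handles parameters and the class-level proc mappings shown above).

    using System;
    using System.Data;
    using System.Reflection;

    // Illustrative: a property can carry several of these, one per column name it may be filled from.
    [AttributeUsage(AttributeTargets.Property, AllowMultiple = true)]
    public class DataMapAttribute : Attribute
    {
        public string Name { get; private set; }
        public DataMapAttribute(string name) { Name = name; }
    }

    public static class DataMapper
    {
        // Fill the target's mapped properties from a DataRow, using the first DataMap name
        // that is present as a column (e.g. "FirstName" from a table, "ContactFirstName"
        // from a view or an aliased proc result).
        public static void Fill(object target, DataRow row)
        {
            foreach (PropertyInfo prop in target.GetType().GetProperties())
            {
                foreach (DataMapAttribute map in prop.GetCustomAttributes(typeof(DataMapAttribute), true))
                {
                    if (!row.Table.Columns.Contains(map.Name)) continue; // try the next alias
                    object value = row[map.Name];
                    if (value != DBNull.Value && prop.CanWrite)
                        prop.SetValue(target, value, null);
                    break;
                }
            }
        }
    }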

  • If you make a proxy class with the changes you want, it will not get overwritten when the code is regenerated. I have done this: make a new class in a separate file that inherits from the generated class.

  • If you're talking about "partial" classes, yes, I am aware of that. LINQ generates all the class definitions as "partial" classes so you can simply add extra files. You shouldn't need to inherit from the LINQ-generated classes.
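    For example, the two halves live in separate files and the hand-written one survives every regeneration (the entity and property names below are just for illustration):

    // Contact.designer.cs - simplified stand-in for what the designer generates.
    public partial class Contact
    {
        public string FirstName { get; set; }
        public string LastName { get; set; }
    }

    // Contact.Custom.cs - your own half; never touched when the designer regenerates.
    public partial class Contact
    {
        public string FullName
        {
            get { return FirstName + " " + LastName; }
        }
    }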
