Where the Provider Model Falls Short


Now that ASP.NET 2.0 has been released, a lot of developers will start to really dig into the provider model design pattern, its specification, and its various implementations. The provider model is really a blend of several design patterns, but most closely resembles the Abstract Factory.

Where the provider model really busts out the flashlight and shines is when an application (or subset of an application) has a fairly fixed API, but requires flexibility in the implementation. For example, the Membership Provider has a fixed API for dealing with users and roles, but depending on the configured provider, it could be manipulating users in a database, in Active Directory, or on a 4'x6' piece of plywood. The user of the provider doesn't need to know.
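The shape of the pattern is easy to sketch. Here is a minimal illustration in Python (not the actual ASP.NET Membership API; every class and method name below is invented for the example):

```python
from abc import ABC, abstractmethod

# The fixed API: callers program against this contract only.
class MembershipProvider(ABC):
    @abstractmethod
    def create_user(self, username: str) -> None: ...

    @abstractmethod
    def user_exists(self, username: str) -> bool: ...

# One possible implementation: users kept in memory.
class InMemoryMembershipProvider(MembershipProvider):
    def __init__(self):
        self._users = set()

    def create_user(self, username):
        self._users.add(username)

    def user_exists(self, username):
        return username in self._users

# Another implementation might talk to a database or Active Directory;
# here it is just a stub to show the shape.
class SqlMembershipProvider(MembershipProvider):
    def __init__(self, connection_string):
        self.connection_string = connection_string

    def create_user(self, username):
        raise NotImplementedError("would INSERT into a users table")

    def user_exists(self, username):
        raise NotImplementedError("would SELECT from a users table")

# The "abstract factory" part: configuration decides which provider is built.
PROVIDERS = {
    "memory": lambda cfg: InMemoryMembershipProvider(),
    "sql": lambda cfg: SqlMembershipProvider(cfg["connectionString"]),
}

def create_provider(config):
    return PROVIDERS[config["provider"]](config)
```

Application code calls the factory once and from then on sees only the `MembershipProvider` interface; whether the users live in a database, Active Directory, or plywood is the provider's business.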

Provider Misuse

However, one common area where I've seen providers misused is in an attempt to abstract the underlying database implementation away from the application. For example, in many open source products such as DotNetNuke, an underlying goal is to support multiple database providers. However, the provider model in these applications tends to become a free-for-all data access API. In the .TEXT (and currently Subtext) codebase, for example, there is one provider responsible for nearly all data access. It has a huge number of methods which aren't factored into well-organized, coherent APIs.

The other inherent flaw in many of these approaches is they violate a key principle of good object oriented design…

Good design seeks to insulate code from the impact of changes to other code.

Take Subtext as an example. Suppose we want to add a column to a table. Not only do we have to update the base provider to account for the change, we also have to update every single concrete provider that inherits from it (assuming we had some). Effectively, the provider model in this case amplifies the effect of a schema change. The result is that it makes your proverbial butt look fat.
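To make the amplification concrete, here is a hedged sketch (invented names, not actual Subtext code) of what adding one `subtitle` column does to a provider-based data layer:

```python
# Before the schema change: every concrete provider implements this contract.
class BlogProviderBase:
    def insert_post(self, title, body):
        raise NotImplementedError

class SqlServerProvider(BlogProviderBase):
    def insert_post(self, title, body):
        return "INSERT INTO posts (title, body) VALUES (@title, @body)"

class MySqlProvider(BlogProviderBase):
    def insert_post(self, title, body):
        return "INSERT INTO posts (title, body) VALUES (?, ?)"

# After adding a `subtitle` column, the base contract must change...
class BlogProviderBaseV2:
    def insert_post(self, title, subtitle, body):  # signature changed here...
        raise NotImplementedError
# ...and the same edit must then be repeated in SqlServerProvider,
# MySqlProvider, and every third-party provider that was written
# against the old contract. One schema change, N code changes.
```

That last comment is the whole complaint: the abstraction doesn't absorb the change, it multiplies it.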

This is why you see an appalling lack of concrete providers for applications such as DotNetNuke, .TEXT, Subtext, etc. Despite the fact that they all implement the provider model, very few developers take (or have) the time to implement a concrete provider.

A better way

For most professional web projects, this is not really an issue, since your client probably has little need to switch the database upon which the application is built. However, if you are building a system (such as an open source blogging engine) in which the user may want to plug in nearly any database, this is a much bigger issue.

After a bit of research and using an ORM tool on a project, I've stepped away from being a religious stored procedure fanatic and am much more open to looking at object/relational mapping tools and dynamic query engines. ORM tools such as LLBLGen Pro and NHibernate make use of dynamically generated prepared SQL statements. Now before you dismiss this approach, bear with me. Because the statements are prepared, the performance difference between these queries and a stored procedure is marginal. In fact, a dynamic SQL statement can often even outperform a stored proc because it is targeted to the specific case, whereas stored procs tend to support the general case. One example is a dynamic query that selects only the needed columns from a table rather than every column.
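The point is easy to demonstrate with any driver that supports parameterized statements. Here is a sketch using Python's built-in sqlite3 module, standing in for the prepared statements an ORM would emit (the table and data are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT, body TEXT)")
conn.execute(
    "INSERT INTO posts (title, body) VALUES (?, ?)",
    ("Provider Model", "a very long body..."),
)

# A dynamic, parameterized query: the SQL text and the values travel
# separately, so the plan can be cached on the SQL text and the
# parameter value can never be interpreted as SQL.
row = conn.execute("SELECT title FROM posts WHERE id = ?", (1,)).fetchone()

# Note it selects ONLY the column it needs; a stored proc written for
# the general case would typically return every column.
print(row[0])  # prints: Provider Model
```

The same targeted-query trick is what lets a generated statement occasionally beat the general-purpose stored procedure it replaces.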

Better Insulation

The key design benefit of such tools is that they insulate the main application from the impact of schema changes. Add a column to a table and perhaps you only need to change one class and a mapping file. The other key benefit is that these tools already support multiple databases. Every time the ORM vendor spends time implementing support for a new database system, your application supports that database for free! That's a lot of upside.
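A toy version of that insulation (not NHibernate's actual API; the mapper below is invented for illustration) shows why a schema change shrinks to a one-line edit — the column names live in one mapping instead of being scattered across every query:

```python
# The ONE place that knows how the Post class maps onto the posts table.
POST_MAPPING = {
    "table": "posts",
    "columns": {"id": "id", "title": "title", "body": "body"},
    # Schema gained a column? Add one entry here, e.g. "subtitle": "subtitle",
    # and no query in the application needs to be touched by hand.
}

def build_select(mapping, fields):
    # Generate a SELECT from the mapping rather than hand-writing SQL.
    cols = ", ".join(mapping["columns"][f] for f in fields)
    return f"SELECT {cols} FROM {mapping['table']}"

print(build_select(POST_MAPPING, ["id", "title"]))
# prints: SELECT id, title FROM posts
```

A real ORM layers database-specific dialects under the same mapping, which is where the multiple-database support comes from.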

Think about that for a moment. NHibernate currently supports DB2, Firebird, Access, MySQL, PostgreSQL, SQL Server 2000, and SQLite. Think about how much time and effort it would take to implement a provider for each of these databases.

The very fact that you don't see a plethora of database providers for DNN, .TEXT, etc. is convincing evidence that the provider model falls short of being a great solution for abstracting the database layer from application code. It is great for small, well-defined APIs, but not well suited for a generalized data API where there tends to be a lot of code churn.

To this end, the Subtext developers are investigating the potential for using NHibernate in a future version of Subtext.





11 responses

  1. Chad Humphries November 1st, 2005

    This is why all of our projects on .NET and Java have used iBATIS (http://ibatis.apache.org/) for the last year or so.

    In practice only minimal changes to the sql maps are needed to move between data providers. In some of our applications only a property addition and update of the result map were necessary.

    After getting into NHibernate for a while, I've not been satisfied. That's just me though; I much enjoy the simplicity of iBATIS.

    Finally, I would say that I enjoy NHibernate in the context of ActiveRecord. When working with the Castle Project, the ActiveRecord implementation is so simple and quick that it does make it a pleasure to work with.

  2. jayson knight November 1st, 2005

    The only trepidation I have about dynamic SQL is that there is a very real chance that the query optimizer won't get to work its magic on the statements. You also then have to roll your own security around the statements, plus check for injection attack scenarios, etc. Sprocs have that all built in, plus they're pretty much guaranteed to go through the optimizer.

    That being said I'd love to see some benchmarks, plus the upsides that you mention are compelling in their own right. In the end I believe it's definitely an attractive package for ISV's, but in the realm of IT where it's usually a very controlled environment, the added overhead and complexity generally isn't worth it IMO.

  3. Haacked November 1st, 2005

    Well when I say dynamic SQL, I mean via some sort of dynamic SQL engine as in an ORM mapper. I wouldn't dare generate dynamic SQL in my own code.

    Using something like NHibernate or iBatis or LLBLGEN Pro provides much of the benefits of stored procs. Via their APIs, you get security against SQL injection attacks, etc...

    You use the API, they generate the safe SQL using prepared statements. The first time a prepared statement runs, it takes a bit longer since the query is optimized. Afterwards it will run much faster.

    Even if the perf was a bit slower, the benefits are worth it in portability, and potentially faster coding.

  4. Jeff Atwood November 1st, 2005

    > The only trepidation I have about dynamic SQL is that there is a very real chance that the query optimizer won't get to work its magic on the statements.

    Not true of any modern database. Hashed string = optimization plan, if the query has been seen before. And particularly when it's parameterized..

    > You also then have to roll your own security around the statements, plus check for injection attack scenarios, etc.

    Not if you use parameterized SQL!

    > Sprocs have that all built in, plus they're pretty much guaranteed to go through the optimizer.

    Stored procs are an unnecessary productivity tax. They are DATABASE ASSEMBLY LANGUAGE. What adds insult to injury is that people keep repeating the same old misinformation about them, as if the clock had somehow mysteriously rolled back to 1993 and we're all still listening to Haddaway's "What Is Love".

    Who needs Stored Procedures, anyways?


    Give me parameterized SQL, or give me death


  5. chid November 2nd, 2005

    also, with dynamic SQL, you can set updates to only update the changed fields, which is especially good with tables that have a fair number of fields.

  6. Michael Teper November 11th, 2005

    I came to the same conclusion this past year. My two most recent projects have been built on top of NHibernate.

  7. Tom April 24th, 2006

    When you need to run several SQL statements that depend on each other, you can profit greatly from stored procedures. There is surely less traffic between the app's data layer and the database, and more room for query plan optimization. You are thinking only about simple SQL queries; that's your problem. Also, securing each SP is a better approach than securing each table and view.
    Sorry, but the arguments you linked to, Jeff, amount to "I feel it's so, and my mother and sister feel it's so, so it's true."

  8. Dust October 17th, 2006

    I've found that developers who are weaker in SQL, or who have grown up on modern entry-level databases (like MS SQL), often gravitate towards doing everything in SPs because they simply don't understand / aren't comfortable with SQL.
    Comments like "a dynamic statement might not get optimized"? Oh please tell me that you don't actually get paid for your technical input on such issues!! lol
    The fact of the matter is that there is almost no difference between using "dynamic" parameterized SQL and using SQL in an SP. Furthermore, shoddy SP scripting by shoddy developers often ends up in SPs that do more parser context switching than actual work.
    My two cents... if you know what you're doing, and you're not just spewing up tired edicts from some best-practices magazine you read once but never questioned, you should have all the tools you need in your head to understand which method will serve you best in whatever situation.

  9. Community Blogs November 16th, 2006

    In a recent post I talked about how good design attempts to minimize the impact of changes to a system,

  10. you've been HAACKED April 23rd, 2007

    Avoid Using a Database as an API Integration Point

  11. Community Blogs April 23rd, 2007

    Before I begin, I should clarify what I mean by using a database as an API integration point . In another