AJ's blog

April 20, 2008

LINQ to SQL Reality Check

Filed under: .NET, .NET Framework, C#, LINQ, Software Architecture, Software Development — ajdotnet @ 2:37 pm

After the introduction and checking LINQ to Objects, this is the second part of the Reality Check story.

Employment of LINQ to SQL

OK, off with the gloves and hard core LINQ 😉

But first some background: LINQ to SQL manifests itself as a .dbml file, supported in VS2008 by a respective designer. Via Server Explorer one can add tables, stored procedures, and other DB objects to the designer. SQLMetal, the code generation tool behind the scenes of LINQ to SQL, generates the data context class with methods for stored procedures. Tables are mapped to entity classes and columns to properties; the mapping is conceptually trivial, but the generated code is not. E.g. the CustomerID column:

[Column(Storage="_CustomerID", DbType="UniqueIdentifier NOT NULL")]
public System.Guid CustomerID
{
    get
    {
        return this._CustomerID;
    }
    set
    {
        if ((this._CustomerID != value))
        {
            if (this._Customer.HasLoadedOrAssignedValue)
            {
                throw new System.Data.Linq.ForeignKeyReferenceAlreadyHasValueException();
            }
            this.OnCustomerIDChanging(value);
            this.SendPropertyChanging();
            this._CustomerID = value;
            this.SendPropertyChanged("CustomerID");
            this.OnCustomerIDChanged();
        }
    }
}

As you can see, the set method goes to great lengths to announce the change of the value. The generated entity properties support various “events” by means of partial methods, and the entity classes implement the respective interfaces (INotifyPropertyChanging and INotifyPropertyChanged).
In addition to properties for columns, the entity classes also provide properties for relationships (supporting lazy loading), provided they have been modeled accordingly.
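
To give an idea, here is a minimal sketch of how such a relationship property might be used. The names (the AccountLinqs association, the DumpAccountsOfFirstCustomer method) are my assumptions, based on the entities appearing later in this post; this is illustrative, not code from the prototype:

void DumpAccountsOfFirstCustomer()
{
    DataClassesDataContext dc = new DataClassesDataContext();

    // Optional: switch from lazy to eager loading up front (assumed usage).
    // System.Data.Linq.DataLoadOptions options = new System.Data.Linq.DataLoadOptions();
    // options.LoadWith<CustomerLinq>(c => c.AccountLinqs);
    // dc.LoadOptions = options;

    CustomerLinq cust = dc.CustomerLinqs.First();

    // Touching the relationship property for the first time makes LINQ to SQL
    // issue a second query for the related accounts (lazy loading).
    foreach (AccountLinq acc in cust.AccountLinqs)
        Console.WriteLine(acc.SubAccount);
}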

All generated classes are realized as partial classes and can be adorned with custom code without problems. All classes are created in one big source file; one can, however, influence the class names, namespaces, and some other parameters that affect code generation.
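
For illustration, a minimal sketch of such an extension via a partial class, hooking one of the generated partial methods. I’m assuming the entity owning the CustomerID property shown above is called AccountLinq; the hook itself is declared by the generated code:

// Sketch only: AccountLinq is an assumed name, the partial method declaration
// comes from the generated file (it is called by the CustomerID setter above).
public partial class AccountLinq
{
    partial void OnCustomerIDChanged()
    {
        // Custom reaction to the change, e.g. simple tracing.
        System.Diagnostics.Debug.WriteLine("CustomerID is now " + this.CustomerID);
    }
}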

Now back to the prototype application…

The prototype application uses “poor man’s O/R mapping”, i.e. data readers to access the DB and manual mapping to POCOs. It doesn’t even use reflection (dude, it’s a prototype application), thus it is by no means representative.

Anyway, I’ll just present old and new code as before, again, relevant parts in bold. 

3. Example: Reading a customer (single object) from the database.

const string SELECT_CUSTOMER_BY_ACCOUNT = "select * from customer where MainAccount=@MainAccount";

Customer FindCustomer(int mainAccount)
{
     Customer c;
     using (SqlConnection con = OpenConnection())
     {
         using (SqlCommand command = new SqlCommand(SELECT_CUSTOMER_BY_ACCOUNT, con))
         {
             AddParameter(command, "MainAccount", mainAccount);

             using (SqlDataReader reader = command.ExecuteReader())
             {
                 while (reader.Read())
                 {
                     c = FillCustomer(reader);
                     return c;
                 }
             }
         }
     }
     return null;
}

private static Customer FillCustomer(SqlDataReader reader)
{
     Customer c = new Customer();
     string name;
     for (int i = 0; i < reader.FieldCount; ++i)
     {
         name = reader.GetName(i).ToLower();
         switch (name)
         {
             case "id": c.Id = reader.GetGuid(i); break;
             case "mainaccount": c.MainAccount = reader.GetInt32(i); break;
             case "name": c.Name = reader.GetString(i); break;
         }
     }
     return c;
}

Nothing but boilerplate code. And the rewrite using LINQ to SQL:

CustomerLinq FindCustomerLinq(int mainAccount)
{
     DataClassesDataContext dc = new DataClassesDataContext();
     var customerList = from cc in dc.CustomerLinqs where cc.MainAccount == mainAccount select cc;
     return customerList.FirstOrDefault();
}

About 35 LOC (still 20 LOC if the mapping is left out) vs. 6 LOC. No need to comment on that I guess…

To provide a complete picture (and because I didn’t want to change the whole application to new entity classes) I also tried the “separate entity assembly” approach. This code additionally maps the generated entities to layer-independent entities:

Customer FindCustomerLinq2(int mainAccount)
{
     DataClassesDataContext dc = new DataClassesDataContext();
     var customerList = from cc in dc.CustomerLinqs where cc.MainAccount == mainAccount select cc;
     var cl = customerList.FirstOrDefault();
     if (cl != null)
         return new Customer() { Id = cl.ID, MainAccount = cl.MainAccount, Name = cl.Name, TotalBalance = cl.TotalBalance };
     return null;
}

As there was no appropriate constructor, object initializers kicked in nicely. Of course this calls for the next mapping method, but that’s not too complicated and, contrary to the data reader mapping above, it is fully type safe (unless you employed reflection, of course).
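
A sketch of what such a mapping method could look like, written as an extension method; ToCustomer and EntityMapper are names I made up for illustration, CustomerLinq and Customer are the classes used in this post:

// Hypothetical mapping helper, not part of the prototype.
static class EntityMapper
{
    public static Customer ToCustomer(this CustomerLinq cl)
    {
        if (cl == null)
            return null;
        return new Customer()
        {
            Id = cl.ID,
            MainAccount = cl.MainAccount,
            Name = cl.Name,
            TotalBalance = cl.TotalBalance
        };
    }
}

With that in place, FindCustomerLinq2 would boil down to a single statement: return customerList.FirstOrDefault().ToCustomer();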

4. Example: Read all accounts (i.e. a list) for a customer.

const string SELECT_ACCOUNTS_BY_CUSTOMER = "select * from vAccount where CustomerID=@CustomerID order by SubAccount";

IList<Account> IAccountTable.GetAccounts(Guid customerId)
{
        IList<Account> list = new List<Account>();
        Account a;
        using (SqlConnection con = OpenConnection())
        {
                using (SqlCommand command = new SqlCommand(SELECT_ACCOUNTS_BY_CUSTOMER, con))
                {
                        AddParameter(command, "CustomerID", customerId);

                        using (SqlDataReader reader = command.ExecuteReader())
                        {
                                while (reader.Read())
                                {
                                        a = FillAccount(reader);
                                        list.Add(a);
                                }
                        }
                }
        }
        return list;
}

private static Account FillAccount(SqlDataReader reader)
{
        Account a = new Account();
        string name;
        for (int i = 0; i < reader.FieldCount; ++i)
        {
                name = reader.GetName(i).ToLower();
                switch (name)
                {
                        case "id": a.Id = reader.GetGuid(i); break;
                        case "mainaccount": a.MainAccount = reader.GetInt32(i); break;
                        case "subaccount": a.SubAccount = reader.GetInt32(i); break;
                        case "balance": a.Balance = reader.GetDecimal(i); break;
                }
        }
        return a;
}

With LINQ:

IList<AccountLinq> GetAccountsLinq(Guid customerId)
{
     DataClassesDataContext dc = new DataClassesDataContext();
     var accountsList = from aa in dc.AccountLinqs where aa.CustomerID == customerId orderby aa.SubAccount select aa;
     return accountsList.ToList();
}

The same LOC ratio as above. Mapping the generated entities may be a little more effort, but still far less than in the classical version. And since I called ToList() I didn’t even have to change my interfaces. With separate entities one would have to traverse the list and translate each entry (a sketch follows below). But again, no big deal and still less work than without LINQ.
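
For the record, here is what that translation could look like; the property names on the generated AccountLinq class (ID etc.) are assumptions on my part:

// Sketch of the separate-entities variant; illustrative only.
IList<Account> GetAccountsLinq2(Guid customerId)
{
    DataClassesDataContext dc = new DataClassesDataContext();
    var accountsList = from aa in dc.AccountLinqs
                       where aa.CustomerID == customerId
                       orderby aa.SubAccount
                       select aa;

    // Translate each generated entity into the layer-independent Account.
    return accountsList
        .Select(aa => new Account() { Id = aa.ID, MainAccount = aa.MainAccount,
                                      SubAccount = aa.SubAccount, Balance = aa.Balance })
        .ToList();
}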

Conclusion:

The rate of code reduction with LINQ to SQL is even higher than with LINQ to Objects. This is something that’s very hard to ignore.

On the other hand, there is an architectural issue to be solved. In my opinion the biggest obstacle with LINQ to SQL is the fact that all generated code, entities and data context alike, is created in one big source file, so I cannot separate them. I cannot put the entities into an entity assembly, as mandated by the architecture, without also putting the data context class into that assembly, exposing the database to anyone unduly interested. The alternative would be to maintain a second set of entities, one for data access and one to flow across the layers, plus the respective mapping code. Doable, as I checked, but not a nice prospect.

BTW, this time I used nearly every aspect of LINQ: the code contained Query Syntax, Extension Methods, Lambdas, Object Initializers, and Type Inference. And with System.Data.Linq a library is used that relies heavily on Expression Trees. Anonymous Types are the only missing piece.
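
Just for completeness, a quick sketch of where anonymous types would have come in, e.g. for a narrow projection that never leaves the method (again purely illustrative):

DataClassesDataContext dc = new DataClassesDataContext();
var balances = from aa in dc.AccountLinqs
               orderby aa.SubAccount
               select new { aa.SubAccount, aa.Balance };   // anonymous type

foreach (var b in balances)
    Console.WriteLine("{0}: {1}", b.SubAccount, b.Balance);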

Final judgement:

All in all, LINQ proved to be such a useful addition to the developer’s toolbox that it cannot be ignored, even if it has its own problems. And while it is often seen as merely a “better SQL”, that view is shortsighted. My earlier posts, as well as all the fuss about functional programming in general, should be indication enough that this is just where it begins.

That’s all for now folks,
AJ.NET



8 Comments

  1. In fact, you can (and should) do your “mapping” (projection) in the select operator:

    Customer FindCustomerLinq2(int mainAccount)
    {
        DataClassesDataContext dc = new DataClassesDataContext();
        return (from cc in dc.CustomerLinqs
                where cc.MainAccount == mainAccount
                select new Customer()
                {
                    Id = cc.ID,
                    MainAccount = cc.MainAccount,
                    Name = cc.Name,
                    TotalBalance = cc.TotalBalance
                }).FirstOrDefault();
    }

    Comment by Kris Vandermotten — April 20, 2008 @ 9:00 pm

  2. It would be nicer if you had also incorporated a bit more of the comparison with a layered architecture, instead of those 5 lines (although you do nail it there). In general LINQ is great, but we’ll still have to find a proper place for it in our architectures…

    Comment by JV — April 21, 2008 @ 2:36 pm

  3. “Real” old code would look a little neater with some helper-functions:

    Customer FindCustomer(System.Data.IDbConnection voConn, int mainAccount)
    {
        Customer c = null;
        string lsSql = "SELECT id, mainaccount, name FROM tabAccount WHERE id = " + mainAccount;
        SqlDataReader rst = clsSql.ExecuteReader(voConn, lsSql);
        if (rst.Read())
        {
            c = new Customer();
            c.Id = rst.GetInt32(0);
            c.MainAccount = rst.GetInt32(1);
            c.Name = rst.GetString(2);
        }
        rst.Close();
        return c;
    }

    Comment by titrat — April 21, 2008 @ 6:09 pm

  4. @JV: The introduction in the previous post said “I kind of need it as introduction to some other posts I have in mind.” Guess what they will be about…

    @titrat: “Real” real old code (if I had written it) would certainly not concat SQL statements with ‘+’, neither would it hardcode a dependency between a SQL string and another piece of code like the indexed field access does. It would probably look like this:

    Customer FindCustomer(int mainAccount)
    {
        using (DBHelper db = DBHelper.OpenConnection())
        {
            return db.ReadSingleRow(SELECT_CUSTOMER_BY_ACCOUNT, "MainAccount", mainAccount);
        }
    }

    Neat?

    AJ.NET

    Comment by ajdotnet — April 22, 2008 @ 7:19 am

  5. LINQ to SQL is explicitly targeted as a “quick (and dirty)” solution by Microsoft. It is sort of the successor to using DataSets, the former “shotgun” approach (you always end up hitting something).

    That said, from what I have seen so far I like LINQ to SQL much more than DataSets. And when using LINQ to SQL, you still can’t turn off your brain.

    If you are talking “architecture”, services, abstraction and the like, take a look at the ADO.NET Entity Framework, which is meant to be a full-blown solution with much greater capabilities.

    Comment by Brian — April 22, 2008 @ 3:54 pm

  6. Great post on LINQ to SQL. I find LINQ to SQL is surely a better way than the old code. But I have been using a product called Quick Objects, and with it I can write the mapping code shown above like this:

    Customer FindCustomer(int mainAccount)
    {
    using (Customer c = new Customer())
    {
    c.ObjectMode = ObjectModes.Search;
    c.MainAccount.Value = mainAccount;
    c.SetVisibleFields(c.ID, c.Name, c.TotalBalance);
    c.FindAndLoad();
    return c;
    }
    }

    Best of all, I could have added a bunch of joins, sorting, or server-side paging with the same kind of ease. I know the Quick Objects guys are working on LINQ support and I am really looking forward to that. I haven’t tried the ADO.NET Entity Framework yet, but I can already do most of the stuff using Quick Objects very easily.

    I am hoping to review the Entity Framework soon.

    Comment by instantcode — May 28, 2008 @ 7:40 am

  7. “In my opinion the biggest obstacle with LINQ to SQL is the fact that all code, entities and data context, are created in one big source file, thus I cannot separate them”

    This is not true. The VS designer tool, and the corresponding command line tool SqlMetal, do behave in the way you described. But there is nothing about LINQ to SQL that requires it to be this way. You can use your own persistence-ignorant classes if you want. Alternatively, you can leave all the classes in the one generated file and then use partial classes to add behaviour to them. This is not a disadvantage, since the classes are code-generated; having them generated into separate files would not make them editable.

    I too have had trouble trying to get the data context into a different assembly from the entities. However, once again I believe it is possible if you are prepared to walk away from the LINQ to SQL codegen tools.

    Comment by Liam McLennan — June 9, 2008 @ 12:11 pm

  8. Hi AJ. You wrote, “In my opinion the biggest obstacle with LINQ to SQL is the fact that all code, entities and data context, are created in one big source file, thus I cannot separate them.”

    There is actually a great way to separate them without jumping through hoops. I found a new code generation tool for LINQ to SQL that splits all of these parts into separate files, which you could even put into separate assemblies, and you can still use the O/R Designer in VS2008 to make changes. The code re-generation process preserves both the designer changes and your code changes. It’s pretty cool. It’s called PLINQO. You can read more about it here: http://kevinlawry.wordpress.com/2009/02/18/does-linq-stinq-not-with-plinqo/

    Comment by kevinlawry — February 20, 2009 @ 5:25 pm

