Matthew Groves: OR/M Comparisons

Matthew Groves has been working on a series of articles comparing some OR/M frameworks.  So far, he's done some posts on Entity Framework and Fluent NHibernate, but I believe he's planning on hitting a couple more, too.

If you're a .Net developer, you know what a "lively" subject this always is.  Data access consumes so much time and effort in most applications that it's worth educating yourself on the options so you can make informed decisions.  I'd also recommend trying a couple of these frameworks yourself, as merely reading about them isn't quite the same as using them on a project.

Ray Ozzie takes us back to Windows 1.0

This one is definitely worth a read:

[Image: Early Windows logo (1985-1992), via Wikipedia]

Ray Ozzie: "I'm back."

Be sure to check out the Windows 1.0 press kit that he scanned in.  I can't believe it's been 25 years, and Microsoft has managed to add -- what -- DirectX and multi-monitor support, I think.  I love how "Paint" is a big selling point, and CD technology is broken down like it's a flux capacitor or something.

Ahh, simpler times.

Google Irony

You've probably heard of Blogger.  It's a (duh) blogging tool, and it's one of the most popular platforms available for casual bloggers.  It's simple, easy to get started with, and it's hosted (and owned) by Google.

You know what else Google does?  They build a browser called Chrome.  Perhaps you've heard of it.  I've heard of it.  I've been using it for the last couple of years, and it's generally really nice.  'Cept here's the good part -- it doesn't like to work with Blogger.

[Screenshot: the comment form in Chrome -- note: no CAPTCHA image]

I happened to see a new blog post from Roger Sessions today -- it popped up in Google Reader (yeah, I drink the Kool-Aid) -- and I clicked over to his site to leave a comment.  So far, so good.  The comment link, however, opened a comment form in a popup window (a minor annoyance, by the way), and then I noticed that the Captcha-style image verification panel wasn't actually showing an image.

Hmmm....

This has got to really cut down on spam comments, right?  Intent on getting my message to Roger, though, I opened his blog in Firefox and clicked the comment link -- it turns out Google likes Firefox better than Chrome:

[Screenshot: the same comment form in Firefox -- no problem here]

Looks fine in Firefox, doesn't it?  Incidentally, I also tried in Internet Explorer and saw the same problem I saw in Chrome, but then, I hit the link a second time, and it was fine.  Having hit upon a possible solution, I tried again in Chrome, and sure enough -- it worked fine in Chrome, too, the second time around.

I'm not sure what the problem is here, but it's pretty telling that Google hasn't seen fit to update Blogger's commenting system.  After all, the native commenting system in WordPress has undergone continuous improvement since launch, and it's also stupid-easy to integrate commenting systems from folks like Disqus (which I use) and IntenseDebate.

If you're a blogger, you may want to look at giving Blogger's commenting system the boot in favor of one of these other systems until Google gets its act together.

Thoughts on blog comments

This morning, I saw a tweet from Mike Figliuolo where he was sounding off about an anonymous comment on one of his blog posts, indicating that he'd left a "scathing reply", and asking for reactions.

As so often is the case, I started to leave a comment on his blog, but as it grew, I figured I might as well turn it into a post of its own.  There are a number of core issues going on here, in my opinion.  If you've got a blog, or even if you just comment on others' blogs, it's worth considering how you feel about these issues.

Do you want comments?

If you've got a blog, it's your baby and you can do what you want with it, but I think it's important to be clear about your objectives with respect to comments.  If you're really interested in a public discussion of the thoughts presented on your blog, then you're somewhat obligated to embrace and foster an open exchange of ideas.  If, on the other hand, you're not really all that interested in an open discussion, then turn off comments.  I suppose a third possibility is that you want to see comments, but only the ones that agree with you.  In my opinion, this also really defeats the purpose of comments.

Whatever your objective, it stands to reason that if you respond to a negative comment by going nuclear, you're not going to encourage an open exchange of anything at all.  Sadly, it's a given that people behave more rudely and abusively on the web than they would in person (especially under the veil of web anonymity), and this shows up in comments.  When you see a comment that strikes you as truly abusive or destructive, you've got every right to moderate it, but in the case of Mike's anonymous commenter, that's not what I saw.  What I did see was Mike's response suggesting that the commenter hadn't read his post (or, apparently, all of Mike's previous posts) carefully enough to form a well-reasoned opinion.  That's not a great way to encourage more discussion.

How's the weather in your little echo chamber?

If you blog, why do you do it?

Although there are any number of reasons, I suspect that every blogger at some level wants to make an impact on people.  We want to share our opinions and, hopefully, sway some readers to consider them -- and, hope against hope, maybe even to adopt them.

So how much are you really accomplishing if you're only reaching people who think just like you?

But that's not what I said...

If someone reads your post and comments in a way that makes it seem like they read a different article than the one you wrote (which, I believe, would be Mike's assessment in this case), it may be because they just weren't paying attention.  It might, however, just mean that they're reading it with a different bias or perspective.  Contrary to your first reaction, this just might be your target audience.  Here's someone with a different opinion than yours, and they're sharing their thoughts on your blog.

Everyone's experienced conversations where we've said something that just didn't come out right, or perhaps it came out sounding just fine, but someone ended up hearing something completely unlike what we meant to say.  We call this communication.  Any time you utter a thought, it's just a stream of lonely, disembodied words until it comes to roost in someone else's noggin.  Here's the crazy part, though -- your listener/reader is going to interpret your words on the way into their grey matter, and they just might find meaning in your words that's a little different than you intended.

You can find volumes of material about this, but again, as a communicator, if you see that this is happening, consider the following:
  • Be thankful that you can see the impedance mismatch.  Most of the time, if someone doesn't get what you're saying, they'll just tune you out, and you'll never know it.
  • Review your message to see if there's any way you can change the delivery to clear up misunderstanding.
  • Engage the reader to try to understand why they interpreted your message differently than you intended.  Telling them that they're not too bright doesn't count as "engaging".

If you're looking for an example of open discourse done right, Robert Scoble is the best I've ever seen.  There are plenty of people who disagree with almost everything that comes out of his mouth, but Robert will engage any of them in an open conversation about what he's said, and he does it without taking negative comments personally.  As a result, Robert has made a name for himself as a communicator, and I suspect he may have even learned a thing or two along the way.

Who have you seen that manages comments well?

More thoughts on Microsoft LightSwitch

A couple of weeks ago, Microsoft announced a tool called LightSwitch, and the response in the development community has been almost universally tepid.  One of these responses really caught my eye, though.  It was from Bill Vaughn, who's been the patron saint of Microsoft data access for as long as there's been Microsoft data access.  This guy knows a thing or two about what works, and he was dumping all over LightSwitch.

Construction in Gibraltar.
Image via Wikipedia

I left a comment on his post, and I thought it had been swallowed up by the great spam filter in the sky, but Bill just resurrected it (thanks, Bill!) and responded.  It looks like the biggest fear with LightSwitch is that this tool is going to be used to create a bunch of garbage apps that "pros" have to come and clean up later.  This sentiment is echoed across many of the lukewarm comments I've seen elsewhere, too.

But even if all these bloggers are right about the quality of LightSwitch applications, I'm still not convinced that there isn't a place in Microsoft's developer tool portfolio for something like this.
Why?
  • We already have developers cranking out lousy prototypes, but with today's tools, they take longer.  In fact, they take long enough that an awful lot of developers look to platforms like Ruby on Rails to do prototyping, and that's not helping the MS developer tool position a whole lot.
  • If a tool like this were positioned specifically as a "non-production use" tool, you'd have at least a chance of setting proper expectations about what would need to happen in order to scale apps up for production deployment.
  • There are very few shortcomings in LightSwitch that couldn't be substantially addressed via some sort of code generation.  A modern-day "upsizing wizard" could solidify a database schema, generate an Entity Framework model, even generate stored procedures if you want them.  The work I've done recently with the Code-First functionality in Entity Framework's recent CTP convinced me that the concrete constraints we're used to between the database and data access code might go away soon.  I've also seen some really great schema transformation capabilities in Visual Studio's Database Projects.  When this stuff comes together, I can absolutely envision a "pro" developer sitting down with a LightSwitch application and refactoring it into a high-quality application (see the Code-First sketch just below this list).
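
To make that last bullet concrete, here's a minimal sketch of what CTP-era Code-First looks like (the Order and StoreContext names are hypothetical, and the API may well shift before release):

    // A hypothetical Code-First model: plain classes, no designer, no XML mapping.
    // The CTP can generate a database schema from this model by convention.
    using System.Data.Entity;   // EF Code-First CTP namespace

    public class Order
    {
        public int Id { get; set; }              // primary key, by convention
        public string CustomerName { get; set; }
        public decimal Total { get; set; }
    }

    public class StoreContext : DbContext
    {
        public DbSet<Order> Orders { get; set; } // maps to an Orders table, by convention
    }

It's easy to imagine an "upsizing wizard" starting from exactly this kind of model to solidify the schema or generate the stored procedures.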

All good developers would rather see an application start out with a proper foundation and high-quality architecture.  We hate seeing messy apps, and we hate cleaning up after junior developers, or worse -- amateur developer wannabes.

But there's a business problem that we can't ignore: we're expensive - especially when we sit down to "do something right."  There is a real need for business owners and managers to produce prototype applications at a reasonable cost.  If Microsoft doesn't provide that capability, someone else will.  You may not like that, but it's a fact.

And here's another fact:  If applications are being prototyped on someone else's application development stack, guess which stack is going to get first crack at upsizing those apps when they need to be scaled?

In its current form, LightSwitch might not be the prototyping tool we'd all like to see, but don't let that distract you from the fact that Microsoft needs to be present in this part of the market.  It's important for them, and it's important for you, too.

A schizophrenic development platform

It's an occupational hazard, I guess.  The .Net development platform is moving at an absolutely dizzying pace these days, and there's no end in sight.  Ordinarily, you might think that this is great news for Microsoft's customers, because we're getting more innovation than we can swallow.  That's good, right?

Umm...

[Image: ADO.NET Entity Framework stack, via Wikipedia]

Aside from the (somewhat) obvious problem that all the developers who work on this stack have to devote serious time to stay even close to current on this stuff, there's a whole other class of problem that's completely within Microsoft's control to manage, should anyone there choose to do so.  The trouble is that Microsoft's Developer Division has a product management problem.  There's so much stuff coming out of this area that it's lost any sense of cohesive design.  Allow me to explore a couple of recent announcements:

Entity Framework CTP (Code First)

I've been playing with this for the last couple of weeks, and there's some very cool stuff in there.  This is stuff that's targeted for some future .Net release (hence, the CTP), but it builds on Entity Framework 4.0 classes - especially data annotations.

One of the things I've been struggling with in working with this tool is figuring out how validation is supposed to be done when you move beyond simple property annotations.  In addition to not finding any guidance about advanced validation, when I tried to use one of the more recent validation interfaces in an n-tier sample application, I found that it wasn't marked for serialization over WCF, which really slowed me down.  There are now at least two flavors of attribute-based validation (that I've seen), and object validation includes the IValidatableObject interface and the Validation Application Block in the Enterprise Library on CodePlex.  Which one is the one validation to rule them all?
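
For reference, here's a minimal sketch of the property-level and object-level flavors I'm talking about (the Customer class is hypothetical; this is just the shape of the APIs as I understand them today):

    using System;
    using System.Collections.Generic;
    using System.ComponentModel.DataAnnotations;

    public class Customer : IValidatableObject
    {
        // Flavor 1: property-level data annotations.
        [Required]
        [StringLength(50)]
        public string Name { get; set; }

        public DateTime? MemberSince { get; set; }

        // Flavor 2: object-level validation, for rules that span properties.
        public IEnumerable<ValidationResult> Validate(ValidationContext validationContext)
        {
            if (MemberSince.HasValue && MemberSince.Value > DateTime.Now)
                yield return new ValidationResult(
                    "MemberSince can't be in the future.", new[] { "MemberSince" });
        }
    }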

There's also a real sense of work-in-progress around the database generation stuff.  It feels like that capability is intended mainly for prototyping, but the transition to a "real" project is very unclear at this point.  Again, tons and tons of potential, but it's not crystal clear what the positioning and role of this tech will be going forward.

Visual Studio LightSwitch

I read about this one today on Jason Zander's WebLog.  This looks like a next-generation Access-like tool.  The usage scenarios described for this tool are slanted distinctly toward hobbyist programmers, and there's undoubtedly a market there.  Once again, though, it's not quite clear where this tool is intended to fit in the Visual Studio lineup, or what the "upscaling" path looks like.

In both of these cases, we're seeing some hints of really cool technology, but it's happening in a bit of a vacuum.  Either or both of these, for instance, could wind up usurping an existing VS technology, or living shoulder-to-shoulder with it, and either one could obviously end up withering on the vine, but am I the only one who feels like Microsoft is just throwing this stuff out there to see what sticks?

A technology that time forgot?

Do you remember Dynamic Data Entity Projects?  I wrote about my first impressions of Dynamic Data back in December of 2007, though these bits didn't reach release status until .Net 3.5 SP1 shipped in August 2008, and by most standards, this stuff didn't get usably stable until .Net 4.0 came out.  We're now less than two years removed from that original release, and it seems like the whole world has already moved on -- Dynamic Data is the next best thing to legacy code now.

Dynamic Data sites clearly seem to target prototype sites and/or hobbyists, but given that I don't hear "boo" about Dynamic Data any more, it would appear that Microsoft didn't really light those groups on fire.  Rather than evolving that platform to meet the needs of those customers, though, it looks like Microsoft's strategy is to throw some more bits out to see if anything sticks any better.  It's clear that some of the ideas in Dynamic Data found their way into ASP.Net MVC, and then on to EF4 Code-First and LightSwitch, too, but if we end up with three or four products out there jockeying for the same real estate, Microsoft's message is going to be lost.

Remember the Kin

You could make the argument that mobile is the most important front Microsoft is fighting on today.  They're badly behind, and they really need a solid base hit to stay in the mobile game.  Despite this, they gave us the Kin.  And then they took it away.  This wasn't the Developer Division, obviously, but it speaks to the same overall lack of strategic product line management that's showing up in developer tools in a big way.

What customers need

Microsoft, if you're listening, here's what we, the decision-making developers and architects of the world, need from you:

  1. Publish a product roadmap.  Let us know how you see these new previews fitting into the Visual Studio line.  If they're intended to replace some existing technology, then say so.  If the technology is going to get folded into a future release, such that the "preview" project disappears altogether, then tell us that, too.  If you really don't know, then I guess I'd just as soon you admit that, though I should warn you that printing "TBD" next to "Future Direction:" on all the lines of your spreadsheet won't do much to convince folks that there's someone steering the boat.
  2. Each of the new tools you introduce has limitations.  Please tell us where they are.  Product "X", for instance, might not do well in an n-tier deployment.  Product "Y" might have a hard time handling certain kinds of data relationships.  Whatever the case, we'd love to know about the limitations up front, rather than finding out after the fact that we've dug ourselves an architectural hole.
  3. Document how and when you expect development technology to die.  Most of the developers I know would agree that the world would be a better place without DataSets, yet they live on.  Despite the fact that DataSets are recognized as Model-T technology, there's no document from Microsoft that I can show a manager to explain that we're trying to get off these things because they've been around since men were drawing flow charts on cave walls, and that EF has now solidly replaced them as a mainstream technology.  Web Services were fantastic when that's all we had, but there are now all sorts of applications where WCF would be preferred -- can we get this sort of information in a *really* simple format?

If Microsoft were able to articulate direction on a couple of these items just a bit more clearly, I'd bet it would go a long way toward creating a big ol' developer comfort zone.  In the meantime, I'm back to figuring out which validation interface I'm going to use.

Visual Studio locked up?

I could have kicked myself last week.  I was running Visual Studio 2010 in a VMware VM, and it kept locking up - it was driving me batty.  The problem started out as one of those "once every four or five hours" kinds of lock-ups, but it eventually got to the point where I'd open the solution, click "X" on one file, and I'd be locked up.

[Image: Visual Studio 2010's new UI, via Wikipedia]

I tried everything.  I googled the problem till the cows came home -- nothing.  I tried updating VMware Tools.  I tried rebuilding the VM.  I tried 64-bit and 32-bit.  I even uninstalled VMware Server and installed VMware Player, thinking that the lack of DirectX support in Server might be causing the WPF-based VS2010 some issues.  No dice.  Nothing worked.

Just about the time I was ready to chuck the whole works out the window, a nagging memory surfaced.  Somewhere in my distant past (ok, it was three or four years ago), I remember having lock-up problems with Visual Studio 2005.  I never did figure out what was causing the lock-ups, but I had a sure-fire workaround:  just delete the .SUO file for the solution and re-launch.

So, having already exhausted all of my "good" ideas, I gave it a try, and just like that, I was back in business.

Wouldn't you think that at this point, there'd be some way for Visual Studio to detect that something was pretty badly hosed in the .SUO file?  Me too.
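
If you hit this often enough, the workaround is easy to automate.  Here's a quick throwaway sketch (the solution path is whatever yours happens to be; note that .SUO files are hidden, so the attribute has to be cleared before deleting):

    using System;
    using System.IO;

    class CleanSuo
    {
        static void Main(string[] args)
        {
            // Delete the hidden .suo file(s) sitting next to the .sln file.
            var solutionDir = args.Length > 0 ? args[0] : ".";

            foreach (var suo in Directory.GetFiles(solutionDir, "*.suo"))
            {
                File.SetAttributes(suo, FileAttributes.Normal);  // clear the Hidden flag
                File.Delete(suo);
                Console.WriteLine("Deleted " + suo);
            }
        }
    }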

Thanks for the help

If you're hypothetically considering changing the interface of your application so completely that users can't find the stuff they need in the places where they're used to finding it (Office, I'm looking at you...), you might expect your users to need "help" a bit more than usual.

And if you expect your users to need to use your help system, and if you choose to make that help system an internet experience, and if that internet experience is going to use a browser that's had so many restrictions placed upon it in the name of security that you can't sneeze without seeing some sort of exception (IE, I'm looking at you....), you should probably consider the likelihood that your users (already frustrated because they can no longer accomplish something that they used to be able to accomplish on their own) are going to encounter a help system that looks something like this:

[Screenshot: the resulting help page.  Thanks a lot, Microsoft.]

Don’t get hosed by “Someday”

For the last couple days, I've been helping another project team track down a memory leak in their app. Memory leaks always seem to show up at the most inopportune moments, of course, and this one was no exception.

This memory leak, however, was self-inflicted.  The source of the leak appears to be rooted in an open-source project the team was using to manage detached POCOs.  The detached POCOs, it turns out, weren't actually needed when they were built into the architecture; they were there to support a yet-to-be-built offline client.  The team addressed that anticipated requirement with a clever project that filled a widely-recognized gap in Entity Framework 3.5.  For shops that absolutely needed this functionality, there weren't too many choices, and this one was right at the top of the list.  The extra functionality, though, added code and complexity, and in this case, a memory leak.

As a final insult, it now looks like this offline client won't be needing the detached POCOs after all, and on top of everything else, the POCO support project is now essentially deprecated in favor of native support for POCOs in EF 4.0.  The lesson is repeated every day in development shops around the world: someday never comes (you had no idea CCR was singing about software development, did you?).

Everyone who designs software has done this, but if you're designing software, think about these points to avoid this all-too-common pitfall:

  • Beware the temptation to build for future requirements.  I'll be the first to admit that it's pretty tempting to put in that hook or extra layer when you just know you're going to need it, but consider the risk when you do this.  Requirements change all the time after they're real, and the ones that aren't real yet are even more tenuous.
  • If you're using a third-party tool or library, open-source or not, be sure to assess the present and future stability of the software.  On Scott Guthrie's blog today, he announced a new view engine called "Razor".  Buried in the article was a little gem of a quote:  "Razor will be one of the view engine options we ship built-into ASP.NET MVC."  If you don't have a quote like that for your library, you have to factor in the risk that the software you're counting on will go away (perhaps with little notice).
  • Be especially vigilant about architecture decisions that permeate your software deeply or create extra complexity in your design.  These decisions are critical and difficult to correct later.

When you build for the future and guess right, you might wind up looking like a genius, but be careful, because far more often, "someday" will bite you.

Automated patterns considered harmful

A couple years ago, if you read some of the "best practices" stuff coming out of Redmond, you'd have thought that software factories were going to transform software development.  Thankfully, this turns out not to have been the case.  I never met a software factory I didn't detest almost immediately, and I'm glad the idea hasn't really caught on any more than it did.

Software factories generally consist of some tools to assist software development, but a central theme of these tools is that they generate code for you.  I've always felt that automation in software development is a very good thing, but it's vitally important that you understand what you're left with when you're done.  If automation results in code that you never have to touch or see, then you're probably more productive as a result, but if you're generating code that you're going to have to maintain, there's a very good chance that you're taking one step forward and two steps back.

Most developers are familiar with "Don't Repeat Yourself" (DRY).  The objective of DRY, of course, is not just a reduction in code (and thus, effort) when developing new software, but a centralization of logic that pays dividends as you maintain the software.  Code generation often delivers deceptively attractive initial productivity for new development, but the generated code is typically littered with repetition.  This code is "free" when generated, but it's an anchor around your neck every time you have to maintain it, or even step through it while debugging.  It clutters your project, reduces readability, and inhibits your capacity to maintain, revise, and refactor your software.  There's a reason why repetitive code is usually right at the top of code-smell lists.

Given my strong preference for software craftsmanship, it shouldn't be surprising that I consider software factories that barf out projects and classes that you're supposed to maintain an absolute blight upon the landscape of software development, but there are any number of other automation tools available to us.  Here are some thoughts on a few of these:

  • Designers.  These remain the best example of code generation done right, in my opinion.  Designers, when properly executed, produce wholly standalone code files that we can largely ignore.  Classes are declared as "partial" so that if you need to modify them, you can do so without touching the generated code.  Some designers will add [DebuggerStepThrough] attributes so you don't see this code when debugging.  All of these things help the generated code disappear unless we're specifically looking for something, and that's a very good thing (there's a small sketch of this pattern right after this list).
  • Snippets.  I remain mixed on these little gems.  When used correctly, they can be a big help, but in practice, they're almost always a sign that you should be doing things differently.  Whether or not you're using a snippet to generate code, you don't want to end up with code that violates DRY, and this means that the opportunities to use snippets effectively are few and far between.
  • T4.  The best usage I've seen of this generation tool (built into Visual Studio, by the way) is Rob Conery's SubSonic data access project.  T4 creates code from templates written in an ASP-like syntax, and is great for iterating over a database or other object structure to crank out code in foreach loops.  Like designer output, this code is intended to be read-only (you shouldn't modify the generated code), and it can be marked with [DebuggerStepThrough] (see the template sketch after this list).
  • Reflection.  Yes, it's a little bit of a stretch to call reflection an automation tool, but creative use of reflection can help you achieve some Ruby on Rails-like productivity by adding behavior dynamically.  Reflection is commonly maligned as a performance-killer, but Rocky Lhotka has been doing great things with reflection in CSLA for years with minimal impact on performance.
  • Copy-paste coding.  If you're not even using a tool to help you with your unnecessary code duplication, you're definitely doing it wrong.  'Nuff said, I think.
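
Here's the designer pattern from the first bullet in miniature (the Customer class is hypothetical): generated code lives in one partial class file, hand-written code in another, so regenerating never stomps on your work:

    using System.Diagnostics;

    // ---- Customer.Designer.cs (generated; don't edit) ----
    public partial class Customer
    {
        private string _name;

        public string Name
        {
            [DebuggerStepThrough]
            get { return _name; }
            [DebuggerStepThrough]
            set { _name = value; }
        }
    }

    // ---- Customer.cs (hand-written; survives regeneration) ----
    public partial class Customer
    {
        public string DisplayName
        {
            get { return "Customer: " + Name; }
        }
    }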
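
And here's a toy T4 template in the same vein (the hard-coded table names are just for illustration; a real template would iterate over an actual schema):

    <#@ template language="C#" #>
    <#@ output extension=".cs" #>
    // Generated file -- regenerate it, don't edit it.
    <# foreach (var table in new[] { "Customer", "Order", "Invoice" }) { #>
    public partial class <#= table #>Repository
    {
        // ...generated members for the <#= table #> table...
    }
    <# } #>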

In addition to these "automation" tools, there are language features and frameworks available to us today that can serve the same productivity objectives without resulting in tons of repeated code:

  • Attributes.  Typically used with reflection to act on classes at run-time, attributes can make your code much more expressive by declaring behavior rather than implementing it over and over (see the sketch after this list).
  • MVC.  Since MVC is a framework, it doesn't really do anything at all to enforce DRY (or any other coding practice), but it encourages a declarative style of Model development that's very consistent with the ideas I've been discussing here, and most MVC examples use a very expressive, compact Model syntax.
  • Model-driven development.  Microsoft's data modeling bits (known at various times as "Oslo") consist of tools, modeling syntax, and extensions that create a very dynamic metadata-driven application environment.  We're still looking at the early stages of these tools, but the broad objective is to make object behavior completely declarative and dynamic.  There's a danger that we could trade unmaintainable code for unmaintainable configuration data, but I think that as this technology matures, it's going to move us in the right direction.
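
To make the attribute/reflection combination concrete, here's a small sketch (the AuditableAttribute and its behavior are hypothetical): the behavior is declared once on the class and discovered by reflection at run-time, with nothing repeated per class:

    using System;

    // A hypothetical attribute: classes *declare* that they're audited...
    [AttributeUsage(AttributeTargets.Class)]
    public class AuditableAttribute : Attribute
    {
        public string TableName { get; set; }
    }

    [Auditable(TableName = "CustomerAudit")]
    public class Customer { }

    public static class AuditRunner
    {
        // ...and reflection supplies the behavior, written exactly once.
        public static void Audit(object entity)
        {
            var attr = (AuditableAttribute)Attribute.GetCustomAttribute(
                entity.GetType(), typeof(AuditableAttribute));

            if (attr != null)
                Console.WriteLine("Auditing {0} to {1}",
                    entity.GetType().Name, attr.TableName);
        }
    }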

In addition to the options available to us today, we'll continue to see innovations in the future.  When you're reviewing and evaluating these options, though, remember to always add lightness, because less code is better.
