What, exactly, is wrong with “Private Clouds”?


I recently saw an interesting post by Gordon Haff that claims that cloud computing concepts can't really be applied to enterprises smaller in scale than the Googles, Microsofts, and Amazons of the world.

Humbug, I say.  I certainly didn't have that impression when I learned about Azure.

I'll concede that there aren't too many enterprises that have the scale of Google or Microsoft, but there are quite a lot of enterprises that would benefit from applying cloud computing concepts in their internal data centers.

You see, "infinite scalability" is only one of the benefits that cloud computing promises.  Most of the concepts we're seeing in cloud computing were already trends in big data centers before they turned into cloud features.


Sysinternals Tools


A number of years ago, Microsoft purchased an excellent body of tools called Sysinternals.  They're now (officially) called Windows Sysinternals.

Every once in a while, I am reminded just how useful these tools are - today, it was Process Explorer.  My system was running dog-slow, and I wasn't really sure why.  I tried using Task Manager, but I didn't really understand what I was seeing.  There was an instance of setup.exe taking up CPU time, but I hadn't launched any installs.

So I grabbed Process Explorer (I keep it on a USB stick) and fired it up, and I saw immediately (because of how the processes were linked on-screen) that Windows Update was upgrading SQL Server Express.  I still had a slow system, but now I knew why.

If these tools aren't already on your go-to list, you owe it to yourself to check them out and keep them with you at all times.  You'll be glad you did.


Here’s one I got right


I just happened to stumble across a post I wrote two years ago, almost to the day:

Ford: Demise of an industry

Normally, I gain some satisfaction from having been right way ahead of the pack.  Not so much in this case, though.

In this case, I look back and I see an industry's incredible collective capacity to see a problem coming, look it in the eye, and keep pouring the coals to the boilers.

Since I wrote that post two years ago, the automakers have hit the skids in a big way (though, paradoxically, Ford is faring the best of the US automakers), and we've also seen our banking industry implode.  It turns out that "too big to fail" is too big to steer, doesn't it?


Is your network ready for cloud computing?


One of the most impactful things I saw at CodeMash wasn't on the schedule.  I'd dropped in to check out Jeff Blankenburg's presentation on Azure & Windows Live Mesh, but it was clear that the demo gods weren't about to smile on Jeff that day.

The CodeMashers had sucked up the hotel's WiFi bandwidth like it was free beer, and Jeff's demo was sucking wind.  Now, truth be told, I'd seen a similar demo under more favorable conditions, and the demo rocks.  I'm not trying to indict the demo, or even the technology.

What I do question, though, is whether we're collectively thinking about all the architectural implications of the cloud computing movement.

There's a giddy excitement about cloud computing, because scalability in the cloud is super-simple and all but free.  You just twist a knob, and you've got more capacity.  Really -- that's what all the marketing folks tell us, isn't it?

But most of us won't build enterprises entirely in the cloud.  Most of us (or our clients) have systems running on internal networks, and we're going to move portions of our systems to the cloud, or we're going to expand into the cloud for new applications.  That means integration.  Not only are we going to drive more traffic through our Internet gateway because we're accessing applications running on the 'Net; we're also going to need to integrate our cloud apps with our enterprise and our enterprise apps with the cloud -- again, all passing through the Internet gateway, and all using relatively fat HTTP requests.

Is your network ready for this extra stress?


I've seen enterprises operating with chronically poor network performance.  I've seen internal apps running slowly, and I've seen the same apps slow to a crawl when accessed from the Internet.  I've seen the baffled looks on IT Managers' faces when they weren't sure how to attack the problem of network performance.  Networking problems are scary, because it takes a completely different skill set to identify and fix these problems than the skills we've developed as application developers.

Do you have a network engineer that you trust as completely as you trust your DBAs?

Consider what you're going to do the next time your office experiences a "slow Internet day".  You're no longer talking about slow Google searches and interruptions to webcasts; now, you've got enterprise applications that aren't working.  Is your enterprise ready to handle that call?


Death of the Walled Garden

I was listening to NPR's Marketplace last night, and I heard them talking about a settlement that had just been reached in New York.  In this settlement, UnitedHealth Group will close a proprietary database run by their Ingenix subsidiary, and spend $50m to start a new, open database in its place.

The database in question is a gigantic repository of claims and payments, and Ingenix made a tidy living over the years by selling subsets of this data to insurers for really large sums of money.  For the insurers, the data was crucial, and Ingenix was the only game in town.  The barriers to entry were just too prohibitive for anyone else to break into this business.

So what's the problem?

The problem is that insurers made many of the decisions about how they were going to reimburse for services based on this database, and they were the only ones who could check the numbers.  They were able to mine and manipulate the data to their advantage, and providers and insureds had no choice but to trust that the insurance companies were treating them fairly.

Thus, a lawsuit was born.

But what does this have to do with software?  The Ingenix database was one of the biggest remaining walled data gardens out there, and it's history.  One by one, businesses that were in business only because of proprietary information are going the way of the dodo.

This is a really encouraging development for the health care industry - what's the next walled garden to fall?


CodeMash notes


I'm heading to CodeMash tomorrow, and I hope to come back with all sorts of great news, notes, and ideas.  I'll be scribbling notes throughout the day on my Adesso tablet, and I'll sync them with Evernote to a CodeMash public folder.  You can see all of my scribblings there as I sync - apologies in advance for my chicken scratchings.  As I have time to organize my thoughts, I'll try to get some blog posts out of my trip, too.

When’s the last time you used dial-up?

As a favor to a friend, I recently did some tech support for a guy who was having trouble getting his dial-up Internet connection to work properly.  Yes, that's right - dial-up.

Let me tell you - if you haven't done this recently, it's a real eye-opener.  Not that I'd wish it on anyone else, mind you, but it helped remind me that there are people out there (in the United States, actually) who still connect to the Internet by dial-up, and that there are times when the Internet really bites for these folks.


Here's a gigantic for-instance: it turns out that the reason this guy couldn't connect to the Internet is that he'd just upgraded his computer (no, I don't know why -- maybe his old one died), so he had this brand-new Dell loaded with Vista, and he couldn't get it connected.  He'd called his dial-up provider (PeoplePC), and the tech-support guy tried to talk him through getting connected but just couldn't seem to finish the job.  So when I got there and got the poor thing online, the first thing we wanted to do was get email set up.

You know what the first thing the PC wanted to do was, though?  Windows updates.  Arrrgghhh.  So this poor guy is downloading 10 hours' worth of Windows updates while we're trying to get email configured, and predictably, network access is godawful slow because the updates are sucking up all our bandwidth.  Nice.


I tried to drop a couple of hints to the effect that he could probably get broadband for a couple bucks more than he's paying right now for dial-up, but honestly, wouldn't it be nice if Vista would be a little more considerate about how it shoves updates at dial-up users?  If you've got a guy who's dialing up every day to get his email, and then hanging up when he's done, he might literally never catch up with the updates, and I'd bet that all the really good stuff is queued up behind 30MB worth of Office clipart updates.

And in the meantime, I'm staring at bits dribbling down to the PC, thinking that I'd be getting more throughput on my cell phone.

The other big problem that occurred to me was that this guy didn't have a hardware firewall / router, which meant that we were relying only on Microsoft's firewall to protect us.  A little shiver ran up my spine at this thought.

I left before the 10-hour download completed, and I'm going to have to go back to see if everything's working OK, but I've really been thinking about how effectively the Web 2.0 movement is leaving these people behind.  Flash?  Silverlight?  Painful.  Big JavaScript files to support Ajax?  Doable, but not nearly as usable as a plain HTML web site.  Even Gmail crapped out while Windows Update was downloading.

So the next time you see one of those studies that talks about how the USA lags behind "X" developing nations in broadband adoption, stop and think about it for a minute.  These people live in your neighborhood, and there's a good chance that they're not buying your product or using your service because they can't use your site on dial-up.  Is this really the best we can do?  Really?


public class SubSonic3: IEnjoyable


A couple of years ago, I heard about a data-layer code generation tool called SubSonic.  At the time, there was a fair bit of confusion about how it should be categorized - it wasn't really an ORM, after all.  In fact, it was sort of a C# version of Ruby on Rails' ActiveRecord.  I played with it a bit, but I set it aside because I didn't have any projects at the time that called for something like this.

Fast forward a couple years, and some things have changed.  SubSonic's founder, Rob Conery, has been hired by Microsoft.  Somewhat surprisingly, this has not meant the end of SubSonic.  In fact, Rob's working on a new release, and it's shaping up to be really, really nice.  This is the first SubSonic release to come out with LINQ support, and they look like they were made for one another.

When Rob announced the SubSonic 3 Preview 2 release, I grabbed it and took it for a quick spin.  If you're going to try this release, Rob's done a short video that walks through installation and deployment, and I'd recommend watching it.  Installation couldn't be easier - the readme file lists four simple steps, and if you do everything right, you'll be up and running with scaffolding support for your database in less than five minutes (it takes less than one in the video).

What do you get for your effort?  SubSonic's scaffolding support builds out classes for each table in your database, as well as helpers for all of your stored procs.  The syntax has shifted since SubSonic's earlier releases, so SubSonic 3 isn't a true ActiveRecord implementation, but the syntax is very natural and (IMO) productive -- it allows you to express database requests in a really compact form (example adapted from Rob's post):

// Query the generated Northwind context with LINQ syntax
var db = new Northwind.DB();
var result = from p in db.Products
             where p.CategoryID == 5
             select p;

// Enumerating the result executes the query against the database
foreach (var item in result)
{
    Console.WriteLine(item.ProductName);
}

LINQ support runs deeper than the simple query above - you can batch queries so that multiple queries occur in one call to the database.  Even stored procs get an assist from SubSonic - procs are encapsulated into simple, type-safe function calls that return a reader for the results.  This is a big improvement over traditional data access layers, where you'll see tons of repeated code.
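To make the proc story concrete, here's a minimal sketch of what calling one of those generated wrappers might look like.  The proc name, parameter, and column are hypothetical, and the exact shape of the generated method is my assumption - the only thing the release notes promise is a type-safe call that hands back a reader:

// Hypothetical sketch: SubSonic generates one type-safe method per
// stored proc, named after the proc itself.  CustOrderHist stands
// in for one of your own procs.  (IDataReader needs System.Data.)
var db = new Northwind.DB();
using (IDataReader rdr = db.CustOrderHist("ALFKI"))
{
    while (rdr.Read())
    {
        Console.WriteLine(rdr["ProductName"]);
    }
}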

Part of the magic comes from the SubSonic assembly, which supplies database-related code that could cut a bunch of repeated code out of your projects all by itself.  That's only part of the picture, though - there's a set of generated files that gives you strongly-typed objects for each of your tables.  The code generation uses T4 templates - see this screencast on the Patterns & Practices site for more on T4.

While simple applications might make use of the generated partial classes as-is, more complex apps are going to want to extend and/or encapsulate these classes, and expose more complete business classes to callers.  There are a number of ways you could do this:

  • Edit the generated code.  This is probably not an approach you'd want to take for a production app, but for a quick prototype, it could work well.  The problem with this approach is that your customizations are subject to breakage as the generated code changes.  If you can limit your changes to extensions (via partial classes - see the sketch after this list), you might stand a chance here; otherwise, you'll want to explore other options.
  • Change the templates.  Since SubSonic is open-sourced, you have access to anything you might want to change.  Better yet, since the templates themselves are copied into your project, any changes you make there will be local to your project.  If your changes can be expressed in the T4 templates, this could be a really good option for you.
  • Encapsulate the generated classes.  I've come to really like working with Rocky Lhotka's CSLA Framework, but the data access parts of these classes frequently end up feeling a little more ponderous than I'd like.  SubSonic is just the thing to fix this - its syntax is crisp and compact, leaving me to focus on writing business code in my CSLA classes.  Personally, I like this model a lot - it builds on the strengths of both projects.
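Here's what the partial-class route might look like (the DisplayName rule is hypothetical, and I'm assuming the generated Customer class carries the usual Northwind CompanyName property).  Because the generated classes are declared partial, your additions live in a separate file that regeneration never touches:

// Customer.Custom.cs -- your own file, never touched by the generator
public partial class Customer
{
    // A business rule layered on top of the generated properties
    public string DisplayName
    {
        get { return CompanyName.Trim(); }
    }
}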

I'd also recommend checking out Rob's webcast on using SubSonic as a REST handler.  There are some other interesting options if you're trying to build a REST interface to your database (Astoria, WCF REST starter kit), but it's a great demonstration of SubSonic's ease-of-use.

Bear in mind that this is a preview release (before beta), so there are some rough edges.  I saw some code generation snafus where I had fields named the same as their tables, and there's a problem with LINQ support that makes it tough to construct a LINQ query using the value of a variable rather than a reference to that variable.  You're going to want to wait for a slightly more mature release before you start doing "real" work with 3.0, but this preview will give you a good indication of what's to come.  There's a lot here to like.
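If I understand the LINQ issue correctly, it comes down to predicates that capture a local variable instead of a literal.  Here's a sketch of the difference, reusing the Northwind context from the earlier example (my reading of the bug, so the exact failure mode may vary):

// Fine in the preview: the filter value is a constant baked
// directly into the expression tree
var byLiteral = from p in db.Products
                where p.CategoryID == 5
                select p;

// The trouble spot: the expression tree holds a reference to the
// local categoryId, and resolving its value is apparently where
// the preview stumbles
int categoryId = 5;
var byVariable = from p in db.Products
                 where p.CategoryID == categoryId
                 select p;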


GradSourcing


There was another great tech event in Columbus last week - the TechColumbus CIOhio 2008 conference. Unfortunately, I wasn't able to attend this event, but I followed along thanks to @bblanquera, @8101harris, and others. According to Dan, one of the topics that came up was GradSourcing. BMW Financial's Jeff Haskett spoke about a program he's put in place to bring new talent into his organization directly out of school.

When I heard about this program, I thought back to my own experience growing a team in Green Bay, WI a dozen or so years ago.  Green Bay was (is?) largely a blue-collar town, and it really didn't attract a ton of IT talent, so when I needed to build a software development team, I found that I had to relocate people to Green Bay or develop talent locally.  Relocating people to Wisconsin, it turned out, wasn't especially easy or cost-effective, so I had to find a way to grow good developers locally.


Great post on Generics


If you're not comfortable using C# Generics in your code, be sure to check out this great article by Karl Seguin.  It's an easy read, and it covers all the important ideas you'll need to get your head wrapped around these things.

I've mentioned Generics a couple times on this blog - they've been a great resource to me.  I'll admit - it took a little while for me to get comfortable with some of these ideas, especially classes with multiple generic type parameters, but since gaining an appreciation for their capabilities, I find myself grabbing this tool early and often.

For me, the key to understanding Generics was to use them - as intensively as possible.  I used them in code I was writing from scratch, and then, when I started working with Rocky Lhotka's CSLA Framework, I needed to work with them more.  His framework, in fact, makes very heavy use of Generics - to the extent that I could see someone being put off by the learning curve if they weren't already comfortable with Generics.
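To make that CSLA point concrete, here's a minimal sketch of the self-referencing generic pattern its base classes follow (simplified and hypothetical - this is not CSLA's actual API):

// A base class that knows the concrete type deriving from it, so
// operations like Save() can return that type with no casting
public abstract class EditableBase<T> where T : EditableBase<T>
{
    public T Save()
    {
        // ...persistence would happen here...
        return (T)this;
    }
}

public class Customer : EditableBase<Customer>
{
    public string Name { get; set; }
}

The payoff is at the call site: new Customer { Name = "Acme" }.Save() comes back strongly typed as a Customer, with no cast required.  It's exactly this sort of construct that makes the learning curve steep at first and the framework pleasant afterward.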
