Type Safety: It makes the C# world go ’round

I'll admit this may not be a subject on the tip of America's tongue, but there are some things that are bugging me about the state of the software development profession, and I thought that perhaps a good place to start would be with something that's working extraordinarily well.

Type safety, by the way, has nothing to do with an airbag on your keyboard, so this article will be a bit technical, but I'll try to keep my explanations simple enough for a wide audience. You may also recall a recent article about Generics -- these concepts are very closely related, as you'll see shortly.

The concept of type safety in a programming language simply means that the compiler and runtime environment know the type of each variable, and can use that information to protect you from a lot of errors you might otherwise see. A variable's type can be a primitive type, like a string, an integer, or a Date. It can also be a complex type, or an object: a Customer object has a type, too, and it's different from an Invoice object's type.
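
To make that concrete, here's what a few typed declarations look like in C#. Customer and Invoice here are hypothetical stand-ins for any classes you might write yourself:

    // Each variable's type is declared up front, and the compiler holds us to it.
    string name = "Acme Corp";           // a primitive string
    int quantity = 12;                   // a primitive integer
    DateTime ordered = DateTime.Now;     // a date

    Customer customer = new Customer();  // a complex (object) type
    Invoice invoice = new Invoice();     // a different complex type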

What good are types?

Types help us in two main ways. First, the compiler can produce much safer code. When the compiler can check the types of our variables, we aren't allowed to shove square pegs into round holes, and we're safe from many of the type conversion bugs that plague loosely-typed languages.
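
Using the same hypothetical Customer and Invoice classes from above, here's the sort of square-peg assignment the compiler flat-out refuses:

    Customer customer = new Customer();
    Invoice invoice = new Invoice();

    invoice = customer;   // compile error: cannot convert Customer to Invoice
    int quantity = "12";  // compile error: cannot convert string to int

Both mistakes are caught before the program ever runs.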

The second contribution we get is coding support from our development environment. In Visual Studio, for instance, IntelliSense features help with things like auto-complete and refactoring. None of these features is due strictly to how .Net handles types, but many of them are made possible because the environment knows our variables' types.

There are some downsides, too. There are times when you know full well that you're copying one variable to another, and they're supposed to be of the same type, but the compiler gives you an error. These problems are easier to take once you get the hang of explicit type declarations and conversions, and easier still once you see the kinds of errors these checks prevent.
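
Here's a sketch of what I mean; GetNextItem is a made-up method that returns a plain Object:

    object item = GetNextItem();         // we happen to know this is really a Customer

    // Customer customer = item;         // compile error: the compiler can't be sure
    Customer customer = (Customer)item;  // the explicit cast says "trust me" -- and it's
                                         // still checked at runtime, so a lie gets caught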

You may have made the connection to Generics by now. Generics simply let us extend all of this type-safe goodness to classes where a type will take multiple forms depending on where it's used.

The overall effect on the developer is to make it much easier to crank out lots of code in a hurry, with a reasonable expectation that it'll run right when it's compiled. The development environment actually predicts what I'm going to type in some cases, because it knows enough about my objects to understand the most typical coding patterns as I start them.

It's also possible for all that help to lead to developer dysfunction, but that's a topic for another day.

Windows Guy and Mac Guy – a grain of truth?

When Apple started running its latest version of the "Mac guy vs. PC guy" campaign, I was a little indignant. The most recent installment features the large, dark-suited fellow representing the firewall. "You're coming to a sad realization... accept or deny" -- you've probably seen it by now.

"It's not that bad," I felt. You click a few things to set up some policies, and then it doesn't bother you any more. Well, here's an interesting post on Channel 9 that shows that maybe Apple's got it right in this ad after all. This looks just like the Apple ad -- a whole bunch of dialogs to click through just to be able to install a Vista gadget.

But wait. Is it really as bad as it looks?

Let's start with the last dialog. This one indicates that the code in this example wasn't signed correctly. Developers see this all the time, because we're building unsigned code ourselves, and because we use a lot of beta or experimental programs that might not be signed. For a "normal" user, though, this is a legitimate red flag. They absolutely, positively, should question what's going on if they see this dialog.

The second window (the one about operating outside of protected mode) makes it look like this gadget is requesting resources or access above and beyond what would be normal for this type of program. I haven't seen this particular warning, but I've set up .Net security policies that are used to drive exactly this sort of behavior. The whole idea is that the application developer indicates the minimum set of resources and permissions needed to run the program.
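
For the curious, here's the declarative flavor of those policies. This is standard .Net Code Access Security from the Framework, not necessarily what the gadget platform does internally, and the folder path is just a placeholder:

    using System.Security.Permissions;

    // Ask for the minimum this assembly needs to run: read access to one folder...
    [assembly: FileIOPermission(SecurityAction.RequestMinimum, Read = @"C:\MyApp\Settings")]

    // ...and explicitly refuse a permission we know we'll never need.
    [assembly: RegistryPermission(SecurityAction.RequestRefuse, Unrestricted = true)]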

Gadgets are designed to work in a protected "sandbox" very much like the environment Java applets run in. There's a mechanism developers can employ to ask for additional resources, and it results in a dialog like this one. Again, this indicates either that there's really a heightened need to access stuff on your PC (meaning that you should trust the publisher before accepting) or that the developer goofed (meaning that you should punt this back to the developer until they fix it).

So when you look at what's happening in these dialogs, it makes sense.

The thing that doesn't make as much sense, and therefore, the thing that lets Apple go on making fun of PCs, is that the communication with the user is waaayy too technical and jargon-filled. Programmers are going to want to know about the information on these screens; very few users will.

Generics: Making life easier one class at a time

I'd been looking forward to .Net Generics ever since I learned they'd be included in .Net 2.0. Having to build an entirely new class to implement a strongly-typed collection was a drag, when Collection(of..) or Collection<> would have solved the problem in a fraction of the time.
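
To put that complaint in concrete terms, here's roughly the before and after in C# (Customer is a hypothetical class):

    // Before Generics: a whole class just to get a type-safe collection.
    public class CustomerCollection : System.Collections.CollectionBase
    {
        public void Add(Customer customer)
        {
            List.Add(customer);
        }

        public Customer this[int index]
        {
            get { return (Customer)List[index]; }
        }
    }

    // With Generics: one declaration, and it's type-safe out of the box.
    List<Customer> customers = new List<Customer>();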

So I've been using them for typed collections, and they're great. The part I didn't anticipate was what a lifesaver they'd be in a thousand other instances.

So far, besides using them for Collections, Lists, Dictionaries, and so on, I've used Generics in places like these:

  • Building a base class to handle properties that can track a Dirty flag. Since the property types vary, this would have meant typing them as Object before. Generics are much, much better syntactically, and especially with respect to type safety (there's a stripped-down sketch of this just after the list).
  • Abstracting me from Oracle hell. For reasons known only to the Royal Order of Oracle DBAs, local chapter #666, some of our lookup tables use Integer keys, while others use two-character "codes". They're all the same to me, thanks to Generics. I set the ID up as a Generic type, and I'm off and running (it's slightly more complicated than this, but not by much).
  • I used Generics to wrap some common functionality around a bunch of DataSets, where the DataSets had to remain strongly-typed in order for things to work right (so I couldn't just write against DataSet).
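
Here's that Dirty-tracking idea from the first bullet, boiled down to a sketch. The names are mine, and the real class carries more baggage:

    using System.Collections.Generic;

    // Wraps a value of any type and remembers whether it has been changed.
    public class TrackedProperty<T>
    {
        private T _value;
        private bool _isDirty;

        public bool IsDirty
        {
            get { return _isDirty; }
        }

        public T Value
        {
            get { return _value; }
            set
            {
                // Can't use != on a generic T; the default comparer works for any type.
                if (!EqualityComparer<T>.Default.Equals(_value, value))
                {
                    _value = value;
                    _isDirty = true;
                }
            }
        }
    }

A TrackedProperty<string> and a TrackedProperty<int> get exactly the same behavior, with no casting and no Object anywhere in sight.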

Bottom line: once you get used to thinking in Generics, you'll find all sorts of problems that cry out for them.

One of the things you'll have to get used to is the slightly weird meta-syntax you have to deal with from within the Genericized class. You can't just use == or !=, for instance, because those operators might not be defined for all generic types.

So crack open the help files and recall how .Compare works, and while you're in there, check out the where clause for use when declaring your Generic class -- it lets you name interfaces that any type you plug in must support.

For example, if you want to create a class that works with any IEnumerable object, you can say so with a where clause on your class declaration, and then the class can be used generically only with objects supporting IEnumerable. This isn't too difficult to get a handle on, and it's a small price to pay for real type safety.
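
Here's a small sketch putting both ideas together. I'll use IComparable instead of IEnumerable so the comparison point shows up too, and RangeChecker is a made-up name:

    using System;

    // The where clause promises that T implements IComparable<T>,
    // so CompareTo is guaranteed to exist inside the class.
    public class RangeChecker<T> where T : IComparable<T>
    {
        private readonly T _min;
        private readonly T _max;

        public RangeChecker(T min, T max)
        {
            _min = min;
            _max = max;
        }

        public bool Contains(T candidate)
        {
            // No < or > for a generic T; CompareTo stands in for both.
            return candidate.CompareTo(_min) >= 0
                && candidate.CompareTo(_max) <= 0;
        }
    }

RangeChecker<int> and RangeChecker<DateTime> both just work, because int and DateTime each implement IComparable of themselves.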

If you have any questions, or you'd like to see more examples, leave a comment -- I'd love to hear from you.

Berkeley Physics in Podcast

A while ago, I downloaded a podcast series from iTunes: a series of lectures by Richard Muller, a physics professor at Berkeley. This podcast, along with a few others, is available free on iTunes.

These lectures have been a real delight. Muller has a great knack for covering topics at a level that doesn't dive too far beyond the comprehension of mere mortals. I usually find each lecture to be a mix of things I already knew and things I knew nothing at all about. As a measure of Muller's ability to communicate, I find I'm equally interested in both, as he always finds a novel way to convey these topics.

The name of the course, incidentally, is "Physics for Future Presidents," so a central theme of the lectures is that the student should absorb the concepts in the course, but not be expected to churn out calculations. That suits me just fine.

Understanding these ideas as they apply to our lives, on the other hand, is fascinating. Muller explains such things as the physics behind alternative fuels and global warming. As you might expect, some of the common wisdom surrounding these and other topics has a basis in physics, and Muller explains that. What you might not expect is that there are places where the common wisdom has completely failed to grasp the reality of these matters, and Muller covers those areas well, too.

Why a physics-lite class? What's the point, if students aren't intended to go on to dedicate their lives to physics? Easy. Muller understands that a broad understanding of the real forces that shape the world around us will equip anyone to call "bullshit" when somebody starts blowing smoke or selling FUD, either intentionally or out of ignorance. A little understanding equals better decisions.

So go ahead - blow off the drive-time drivel, and listen to some physics on your commute!

Physics for Future Presidents
Physics 10: Physics for Future Presidents -- podcast index
Muller's Home Page

Disrespect for technology not appreciated

I've seen this a thousand times. Some self-proclaimed technology pundit starts going on about how the world would be a better place if only "______" didn't exist (or, didn't exist anymore). I saw it again this week, and it's still not very pleasant.

I'm sure it began as the first higher-level languages elbowed their way into the computer room next to assembly language, but it didn't become fashionable until computer "experts" began to predict the demise of COBOL. For the record, I think we're going on about 30 years of "COBOL's dead, haven't you heard?"

I didn't appreciate how ridiculous this really was the first time I heard it, but when I started to develop in Visual Basic, I was very aware that VB was considered an "inferior" language. Never mind that the former mainframe developers I'd trained and I were cranking out features a lot faster than the "smart" developers. While they were wading through inheritance stacks trying to find out which super-class was really responsible for setting the text of a window caption, I was building applications out of a bag of components I'd bought for a few hundred dollars. I'm not worthy!

So it was again with Java. I saw a lot of Java code and Java coders come and go. I saw some smart ones and some that weren't so smart, and I saw some fantastic applications and some that weren't worth a hill of beans. Through all of this, I'd dare say that the better the developer, the less likely he was to suggest that anybody who wasn't working with Java was a nobody loser has-been legacy programmer. The distinction wasn't lost on me.

Now, of course, it's becoming fashionable to dump on Java. Rails is far, far hipper, and .Net is rising fast. The more things change, the more they stay the same.

I'm working in C# now. I still hear the noise about those inferior languages and tools, and it still ruffles my feathers a bit. This week, I was in the middle of a design discussion in which the use of ADO.Net DataSets was proclaimed to be categorically evil.

This, of course, completely ceased to amuse me.

I can appreciate how the misuse of a technology can make a mess, but I just don't have a lot of patience for someone who suggests that a technology is without any redeeming qualities, let alone is destructive enough all by itself to ruin an application. I've seen far more applications ruined by developers who didn't understand how the technologies they were using really worked.

In case it's escaped you up to this point, ADO.Net DataSets are part of Microsoft's current technology stack. These classes ship with .Net and the Visual Studio development environment. They appear prominently in Microsoft's documentation and sample applications. They are not, contrary to popular opinion, the abandoned stepchildren of the modern Microsoft development stack.
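
For anybody who hasn't used one, this is the garden-variety pattern straight out of those docs; the query and connection string are placeholders:

    using System;
    using System.Data;
    using System.Data.SqlClient;

    // The adapter opens and closes the connection for us.
    SqlDataAdapter adapter = new SqlDataAdapter(
        "SELECT CustomerID, Name FROM Customers",
        "Server=myServer;Database=myDb;Integrated Security=true");

    DataSet ds = new DataSet();
    adapter.Fill(ds, "Customers");

    foreach (DataRow row in ds.Tables["Customers"].Rows)
    {
        Console.WriteLine(row["Name"]);
    }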

I'm just not buying the line about DataSets being the spawn of Satan, and I won't buy it next time, either, when I hear a generalization made without any real understanding of the problem or the solution.

Save that for someone who doesn't know better.