Friday, April 4, 2008

New Languages Considered Wonderful

Gustavo Duarte has a blog post that just showed up on proggit: New Languages Considered Harmful

His basic point, if I understand him correctly, is that we shouldn't bother learning new languages, because it's just a waste of time.  It's well thought out, and he does support his point.  But I completely disagree.

I'm a C++ programmer, which these days is apparently some horrible blight.  Someone, not too long ago (I want to say it was Steve Yegge, but I can't find it in his archives), said that he would not seriously consider any interview candidate who "still wrote C++."  Whatever.  I honestly don't care -- until someone can show me a language that's as good or better at the things C++ is good at, I'm going to keep using C++ (not to mention that I can't exactly just decide to switch our projects over).

I'll save all the details about why I like C++ for another post (and it is coming).  The most important aspect of C++ is that it is a multi-paradigm programming language.  You can use it as a better C, as an object-oriented language, as a generic programming language, to some extent even as a functional language (I think C++0x will actually be an acceptable functional language).  But how do you discover these paradigms, and learn how to use them effectively?
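A tiny sketch of what multi-paradigm means here, mixing generic and functional styles in one file (the function names are mine, and the lambda syntax is what C++0x eventually standardized):

```cpp
#include <algorithm>
#include <cassert>
#include <iterator>
#include <numeric>
#include <vector>

// Generic programming: one definition, works for any type with operator+.
template <typename T>
T sum(const std::vector<T>& v) {
    return std::accumulate(v.begin(), v.end(), T{});
}

// Functional style: behavior passed around as a value.
template <typename T, typename F>
std::vector<T> map_vec(const std::vector<T>& v, F f) {
    std::vector<T> out;
    out.reserve(v.size());
    std::transform(v.begin(), v.end(), std::back_inserter(out), f);
    return out;
}
```

Same language, two very different ways of thinking, and you can mix them freely in one program.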

Early in college, I didn't like to think of myself as a C++ programmer; it seemed too restrictive.  I wrote Perl, I wrote some Python and Ruby, I wrote C, I even wrote some Java.  Then I took a class called Programming Language Design at UIC, taught by the wonderful Professor Barbara Di Eugenio.  After the usuals of covering [static|dynamic] [typing|scoping], we were introduced to languages that otherwise were not generally taught in the CS curriculum at UIC.  The first of these was Common Lisp.  During my brief foray with emacs a few years back, I had tried to wrap my head around elisp using just the manual, but it didn't take, so I was excited to finally learn Lisp for real.  Something very fundamental happened -- I got it.  It was a completely different way of thinking about programming, down to my most basic assumptions -- Lisp was just math; functions were, well, functions.  From there, we learned Haskell, which did a similar thing -- Haskell is just math; type calculus can solve any problem.  We also learned Prolog, which I'll admit didn't click as well for me; I was never quite able to break through from "here's how you get the sum of the elements of a list" to "here's how you write programs."  But I imagine there's something there, just under the surface, that I haven't yet found, and that, if I ever do find it, will again change the way I think about programming.

I suppose this all goes back to Eric Raymond's choice quote about Lisp:
Lisp is worth learning for the profound enlightenment experience you will have when you finally get it; that experience will make you a better programmer for the rest of your days, even if you never actually use Lisp itself a lot.
It doesn't just apply to Lisp.  I don't really use Lisp on a regular basis, but there are patterns and techniques you learn in Lisp, because you have to, that can be applied to make your programs in other languages better.  I don't use Smalltalk on a regular basis either, but learning Smalltalk helped me to understand OOP on a much deeper level than I had before.  I don't much use Perl anymore, but it helped me understand regular expressions, scripting, weak typing, and much more.  I used C and C++ in high school, but until I worked with the Z80 assembly language for my graphing calculator, I didn't really understand the machine.  I've been using Python a lot recently, it's sort of my new de facto language when no other language is obviously better suited to the task.  I'm not sure exactly what Python has taught me, other than perhaps the value of enforced readability.  What I do know is that Python makes things that are hard in other languages easy, which is valuable in the pragmatic sense regardless of its implications outside the language.

There's a flip side, too.  I recently decided to play with C#, half so that I could write the little GUI app I wanted, and half because I've heard so many great things about it.  I honestly wasn't very impressed.  The .NET framework is pretty great, except for the parts that are still very Win32 (BeginInvoke() anyone?).  The only language feature I was really intrigued by was delegate/event, and even that left me unimpressed once I looked closer.  Here's a feature that adds two keywords to the language, and all of it could be done in library code in C++ (and, I imagine, in C# as well).  C#'s generics are also nice, and in some ways better than C++ templates, but with concepts, C++'s templates will be leaps and bounds better in just about every way.  Of course, it took me about six hours (including learning what I needed to know about C#, .NET, and the Windows API) to write my little app, and the next time I need a Windows GUI app, I'll do it in C# again.  But I can't see how the experience really changed the way I look at anything.
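To back up the "could be done in library code" claim, here's a minimal sketch of a delegate/event-style type in plain C++, built on std::function and the variadic templates C++0x proposed (the class and method names are mine, not from any real library):

```cpp
#include <cassert>
#include <functional>
#include <utility>
#include <vector>

// A minimal event: a list of callbacks you can subscribe to and fire.
// Roughly what C#'s delegate/event keyword pair gives you, as a library type.
template <typename... Args>
class Event {
public:
    void subscribe(std::function<void(Args...)> handler) {
        handlers_.push_back(std::move(handler));
    }
    void fire(Args... args) const {
        for (const auto& h : handlers_) h(args...);  // notify every subscriber
    }
private:
    std::vector<std::function<void(Args...)>> handlers_;
};
```

No new keywords required; subscribing a lambda and firing the event is all ordinary code.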

C++0x will be a very interesting language.  Since the last "real" C++ standard, generic programming and template metaprogramming have become favorites in the C++ world, and C++0x will add features to make these easier.  Concepts, along with auto and decltype, will make TMP nearly as powerful as Haskell's type-level programming.  And I'll know how to do it already.  The addition of lambdas and closures to the language proper will make C++ a real functional programming language (assuming you're not one of those purists who claims that the only functional language is Haskell).  And I'll know how to use them already.  Regular expressions, based on Boost.Regex, are being added to the standard library, and I'll know how to use them already.
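For a taste of what those additions look like, here's a small sketch using the syntax as it eventually landed (the function names and the version-string pattern are mine, for illustration):

```cpp
#include <algorithm>
#include <cassert>
#include <regex>
#include <string>
#include <vector>

// Lambdas: inline comparison logic, no hand-written functor class needed.
std::vector<int> sorted_desc(std::vector<int> v) {
    std::sort(v.begin(), v.end(), [](int a, int b) { return a > b; });
    return v;
}

// <regex>, modeled on Boost.Regex: check a dotted version string like "1.0.2".
bool looks_like_version(const std::string& s) {
    static const std::regex pattern(R"(\d+\.\d+(\.\d+)?)");
    return std::regex_match(s, pattern);
}
```

Anyone who already learned functional style elsewhere, or already used Boost.Regex, walks into this knowing exactly what to do.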

Duarte says in his blog post, "Scott Hanselman argues that learning a new language is sharpening your saw, but I see it as neglecting your half-sharpened saw while playing with the dull, new, shiny one."  I agree that this analogy is appropriate in a way.  But really, the point is that by learning a new language, you get a new saw which was built differently.  You get to see why the new saw is better in certain situations, and you get to apply that to your sawing from now on, regardless of what saw you use.  Languages are tools, but the best tool you have is up in that head of yours.  Learning new languages makes sure that tool learns new tricks.

Plus, it's fun as shit.

Tuesday, March 18, 2008


Crunch time (slang): a critical period of time during which it is necessary to work hard and fast.

Crunch is the mythical process from the old waterfall development days where software isn't done for a deadline and the team needs to work long hours, even weekends right at the end to get it done. The last known example of crunch occurring in the wild was in 1982 on the development of (ironically enough) an overtime tracking system in COBOL. Since then, we all work on two-week sprints with extreme agile pragmatic scrum efficiency and don't have any bugs that we didn't find with our 100% unit test coverage (which we wrote before the code, obviously).

Ok, so that's total bullshit. Crunch happens a lot in the game industry, enough that you plan for it. The quasi-whistleblowing ea_spouse article even referred to "pre-crunch" starting at the beginning of a game cycle.

So the questions are: why does crunch happen, and what can we do to eliminate (or at least minimize) it?

In the game industry, especially console games, one factor that amplifies crunch is the fact that once you ship a product, that's it. It's done. There's no v1.1. You can't release a beta to select customers to let them find bugs for you, and you can't respond to bug reports with "will be fixed in a later release." Of course, that's not necessarily true with the current generation of consoles -- on the Xbox 360 and PS3, you can patch the game after it ships. That isn't always as simple as you might think, though. First, patching costs money -- not just for the development of the patch itself, but the money you pay Microsoft or Sony to host the patch for you. Second, unless you release your patch on or before launch day, the first batch of users (and reviewers) plays the game bugs and all. Third, trying to get a bug waived by Microsoft or Sony saying "we'll patch it" doesn't generally go over very well. And lastly, it's not just a simple matter of rolling out a patch: the patch itself needs to be approved, and in certain cases your entire game gets retested.

But this isn't the entire reason why crunch happens. We crunch because the product isn't done, and it's supposed to be. Looking at it that way, there are two possible causes: is it that the product isn't done, or that it never realistically could have been done on that schedule?

Realistic schedules, deliverables, and deadlines are crucial, and lying to yourself doesn't end up helping anyone. If you're a year out from shipping with six months left in the schedule, you're still a year out. And if you're a year out with six months left, and you respond by adding 20 people to your team of 30 and telling them to work 72-hour weeks, you might save three months, at the high end. Managers need to be aware that the man-month is, in fact, mythical, and that no amount of extra bodies or extra overtime will change that.

Okay, managers aren't completely to blame, as much as we'd like them to be. Why do we have a million bugs that crop up right at the end of a project? Most software is not written from scratch, and games are (generally) no exception. Especially for sequels, the majority of the code base comes from a previous game. So when I see a comment that says:
// HACK: jsmith 12/12/02 workaround for TTP #1234
I know a few things. More than five years ago, there was a bug in a bug tracker that no longer exists, and someone named jsmith who doesn't work for the company anymore hacked in a workaround for it. What was the bug? What was the hack? I have no idea. This kind of code makes software more fragile and more difficult to maintain. I'm guilty of doing this too -- because it's 2 in the morning and we're submitting next week and I've got 14 A bugs and I want to get it the hell done so I can get my four hours of sleep. These things happen and (without getting rid of crunch) there's no way to avoid them. But the real problem is that they cause crunch in the project that inherits them. Hacks beget hacks, and hacks lead to crunch.
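For contrast, here's what that same hack looks like when the comment carries the context a future maintainer needs. The bug, the names, and the numbers are all invented for illustration:

```cpp
#include <algorithm>

// HACK(jsmith, 2002-12-12, TTP #1234): the streaming texture pool overflows
// when more than 8 textures load in a single frame, crashing the renderer.
// Clamping the request count here hides the crash; the real fix is making
// the pool resize itself. Remove this once that lands -- searching the code
// for "TTP #1234" will find this spot even after the tracker is gone.
int clamp_texture_requests(int requested) {
    return std::min(requested, 8);
}
```

Five years later, whoever inherits this knows what the bug was, what the workaround does, and when it's safe to delete.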

With a two-fold problem, we need a two-fold proposal. On the managerial side, we need people who aren't going to lie to themselves or anyone else about what's possible in a given timeline. A lot of being able to predict how long something is going to take comes from experience. If you're not experienced, ask someone who is, and -- this is the key point here -- TRUST THEM. It doesn't help anyone to have delusions about when you're going to ship. If you're not going to ship on time, either change the definition of "on time" or start cutting features until you will ship on time. This technique does not work during crunch, obviously, but the whole purpose here is to avoid or shorten that as much as possible.

And as for us: at the beginning of the project, start unhacking. Fix all those hacks the right way. Now that you know what problems there were with your initial design, refactor it to take them into account.

Of course, it would benefit everyone to be a little more agile, but when you're working on a project for two years in order to make one release, that's not always possible.


I started this blog as a place to make posts about software development, along with the folks from Cavern of Cobol on Something Awful. We set up an aggregator for all the different posters' blogs, though its address may change.

I’m a software engineer at Midway Games in Chicago. I work in the Advanced Technology Group, the idea being that I work on tech that gets shared with the whole company. Most recently I’ve been working on the online systems for John Woo Presents: Stranglehold and NBA Ballers: Chosen One. Lots of colons.

Also, spacepirate just showed up to blog on my blog, so everybody wave.

My First Falafel Blog Post

Well, I'm hoping to add some stuff about mixed-mode authentication with Active Directory, web programming using Python, and some mix-and-match C stuff.


bloggin is the state of craving to blog, blogging, or recovering from blogging.