A colleague of mine from the Britton Lee days (worth a post of its own) recently sent out a message that hit home. Repeated with permission:
From: Jim Bradford <(deleted)@(deleted).com>
Subject: programming isn't fun anymore
Date: August 23, 2011 4:19:31 PM -0600
I used to enjoy writing programs. I could write lines of code, compile them, link them and run them and they would do things. Useful things. They would solve problems. Or they could take input and produce output. Now all that is ancient history.
I don't write code. I learn tools. Or try to learn tools. Problem is, there are more tools than anyone can keep track of. Some people know some tools and so they can get work done. These people are called "architects" and they expect everyone will know the same tools they know. Or everyone should. If they don't, those people are stupid and should be shunned and ignored.

We programmers know a few tools but not the ones the architect knows. That's ok - we've learned lots of tools over the years so we keep quiet and think to ourselves "I don't know what this guy is talking about but I can learn these tools". So when the architect says "exclude slf4j from the library's build sequence or modify the pom file dependency list" we don't say "what the hell are you talking about". We don't say anything. We go to google and spend the next two weeks learning slf4j and Ivy and Maven, and RESTful WebServices and Grails, and the proper syntax for the BuildConfig file.

Then we reboot the computer three or four times for good measure; download security patches for the IDE; get the latest version of the JDK; clone a few repositories from GitHub; study the Artifactory website; look for new docs on the wiki; and hope to god someone has figured out why the WAR file doesn't deploy to the 3.2 version of the app server. In all this time no code has been written. No problems have been solved. No user interfaces have been created. But everyone has been terribly busy. And the architect has been studying newer, better versions of the three or four tools we have now almost learned, but which have become obsolete.

Eventually, the project is cancelled. Management decides to continue using the prototype version written in Objective-C, even tho the source code has been lost and it doesn't use any of the company-standard tools, because it does 80% of what the one customer wants, and no one else really needs the application anymore.
That's the story of my professional career. Trying to learn things fast enough to create programs to solve problems that go away by themselves or weren't worth spending time on in the first place. Sisyphus had more job satisfaction.
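For readers who have never met this particular chore: the "pom file" edit the architect is asking for is a Maven dependency exclusion. A rough sketch of what it looks like follows; the outer library coordinates (com.example / some-library) are hypothetical placeholders, while org.slf4j is the real SLF4J group id.

```xml
<!-- Sketch of a Maven pom.xml dependency with a transitive
     exclusion. The outer coordinates are illustrative only. -->
<dependency>
  <groupId>com.example</groupId>        <!-- hypothetical library -->
  <artifactId>some-library</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <!-- strip the slf4j binding this library drags in -->
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Two weeks of reading, condensed into eleven lines of XML that you will never be sure are right until the deploy succeeds.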
Now, I think Jim is a bit harsh. But only a bit. As a long-term architect, I can say that we (well, at least the ones I've known) don't sit around trying to figure out how to make programmers' lives more difficult. But as someone who, after a decade spent primarily in management, PR, marketing, finance, and nearly anything except actual coding, found that I was completely and utterly lost, I can affirm at least some of what Jim says.
When I was a coder, we worked on algorithms. Today, we memorize APIs for countless libraries — those libraries have the algorithms. Section 3 of the Unix manual used to fit comfortably in a small binder; today the BOOST library has hundreds (if not thousands) of APIs. It used to be that the tools were helpers (i.e., optional but useful) and relatively minimal and well-defined — the build tool was make, etc. Now I've worked on a project that used cmake, and a couple of others I never did identify, not to mention a series of scripts written in perl, etc., with UML used to generate interface descriptions. In fact, most projects I've seen in the last few years seem to resemble Dr. Frankenstein's monster. There is no way you can work with such a mess without using eclipse or some similar tool.
But let me go further. I used to be enamored of object-oriented programming. I'm now finding myself leaning toward believing that it is a plot designed to destroy joy. The methodology looks clean and elegant at first, but when you actually get into real programs they rapidly turn into horrid messes. I will assert that you simply can't program any non-trivial program in Java or C++ without an "Environment" to help you. You need this to figure out WTF is going on with all those library calls that differ in very subtle ways. Debugging has turned into an exercise in figuring out why the methods you are calling don't do precisely what you anticipated. Engineers spend long meetings debating whether to upgrade to the latest version of (fill in favorite library name) out of fear that it will break things in unexpected ways. DLL Hell has turned into CLASSPATH Hell.
And who managed to convince the world that a malloc call was cheap, so you should do arbitrary numbers of them at every procedure (method) invocation? Memory management is expensive. Even the best algorithm blows the cache. Get used to it, at least until memory becomes so fast that we don't need caches any more. Wait, I hear a voice in the back of the room — no, garbage collection does not fix the problem; in fact, it makes it worse.
We spent years working on software reusability. We have now succeeded, and have to pay for that. Be careful of what you wish for.
The good news is that programming will become easier for people with good memories but not-so-great analytical capabilities. Unfortunately, I fall into the opposite camp. Alas, if only I had memorized my times tables instead of thinking about Boolean logic, base-8 arithmetic, and the like. I am a product of the "New Math", and it served me very well.
I have some hopes for Go, which Rob Pike specifically said was partially designed to make programming fun again (see about three minutes in). Similarly, Lua is a rather interesting language. I haven't tried to code anything significant in either one, so I can't speak from experience, but both look promising.
I don't have a solution. In fact, I'm probably just getting old enough that I'm not adapting to the new reality. I grew up when it was important to understand NAND gates and flip-flops. Now it's a different world. I can take some satisfaction that I rode a wonderful wave.
My primary platform? VBA in Excel. Before that, C, Fortran, awk and shell scripts.
It really does irk, though, when 'serious' coders say they feel sorry for me because of what I use now. I'd much rather avoid getting bogged down in anything that prevents me from Getting Stuff Done and Making Stuff Work.
I seemed to get a lot done without lots of libraries. Now all the standardization and the sheer number of libraries make the APIs impossible to learn.
The only way to get something done is to google for example code using that API, as my memory isn't large enough, or willing enough, to learn these APIs — especially knowing that they change and get thrown out so quickly that whatever I learned would be useless in five years' time.
I'm working on ways to make it easier again.
Get a grip. Sure, things can be simplified. Do it! Stop whining and _show_ people the better way. But the reality is that things are always getting more and more complicated. And that takes tools. Very, very sophisticated tools.
Simplicity doesn't come from a programming language. It comes from architecture and design and that only comes from a hella lot of experience, practice and some natural talent. Go isn't going to be your own personal Jesus. Neither is Scala, Lisp or Haskell.
Again, stop this incessant whining and just do it. Or do us all a favor and just retire for "Bob's" sake.
Playing with canvas and CoffeeScript, for example.
I've been writing a fair bit of Java lately, including J2EE, and to me your opinion seems right on the money; it puts into words what I've been feeling while "programming" these last few years.
I started writing in BASIC when I was a teenager and then moved on to COBOL, FORTRAN, C, and yes even VB, and I'm fairly sure I had more fun with all of them but then the past is good at hiding the bad.
The API that I'm always thankful for these days though is the one that provides for creating the GUI and helping to handle the events it generates; I wouldn't want to still write that code like it was written in the early days of X11.
Thanks for the great article, it made my day.
For me, programming combined math, logic, problem solving, and creativity.
Fun for me was eventually becoming self-employed and having my software sold in a major vertical market; programming provided success, independence, and a very good income.
Alas, today the combination of endless, unplanned 'change', corporate nano-management, and compartmentalization of duties has sucked out much of the enjoyment.
How many artists continually use a different medium? How many musicians continually play a different instrument? They polish their skills and art form. All too many programmers have been thrown away, driven out, by the current environment. Many of these people had skills way above those who 'love to learn.'
Maybe the result of the power of our current environments is that we now have to put up with what you describe in your posting, which I agree with too. But, given the choice, I'd go with what we have now.
The "new" operator is not your friend if it's always needed, because memory allocation is still expensive. High performance systems are required to pre-allocate memory to avoid high runtime costs and increase locality. Essentially, "best practice" in OO languages is to defeat most of what they give you. This strikes me as essentially equivalent to the bad old days of limited, flaky hardware. And unlike those days, we software people did it to ourselves.
Now software developers must all be their own architects. This isn't bad in itself, but it's definitely a challenge to keep up. And worst of all is the fate of programmers in many large organizations: the real work of building non-trivial systems has been farmed out to vendors or the engineering department, while the IT department has become increasingly limited to liaising and Tier 2 tasks--and maybe a bit of configuration or trivial data transformations if they're lucky.