A colleague of mine from the Britton Lee days (worth a post of its own) recently sent out a message that hit home. Repeated with permission:
From: Jim Bradford <(deleted)@(deleted).com>
Subject: programming isn't fun anymore
Date: August 23, 2011 4:19:31 PM -0600
I used to enjoy writing programs. I could write lines of code, compile them, link them and run them and they would do things. Useful things. They would solve problems. Or they could take input and produce output. Now all that is ancient history.
I don't write code. I learn tools. Or try to learn tools. Problem is, there are more tools than anyone can keep track of. Some people know some tools and so they can get work done. These people are called "architects" and they expect everyone will know the same tools they know. Or everyone should. If they don't, those people are stupid and should be shunned and ignored. We programmers know a few tools but not the ones the architect knows. That's ok - we've learned lots of tools over the years so we keep quiet and think to ourselves "I don't know what this guy is talking about but I can learn these tools". So when the architect says "exclude slf4j from the library's build sequence or modify the pom file dependency list" we don't say "what the hell are you talking about". We don't say anything. We go to google and spend the next two weeks learning slf4j and Ivy and Maven, and RESTful WebServices and Grails, and the proper syntax for the BuildConfig file. Then we reboot the computer three or four times for good measure; download security patches for the IDE; get the latest version of the JDK; clone a few repositories from GitHub; study the Artifactory website; look for new docs on the wiki; and hope to god someone has figured out why the WAR file doesn't deploy to the 3.2 version of the app server. In all this time no code has been written. No problems have been solved. No user interfaces have been created. But everyone has been terribly busy. And the architect has been studying newer, better versions of the three or four tools we have now almost learned, but which have become obsolete. Eventually, the project is cancelled. Management decides to continue using the prototype version written in Objective-C, even tho the source code has been lost and it doesn't use any of the company-standard tools, because it does 80% of what the one customer wants, and no one else really needs the application anymore.
That's the story of my professional career. Trying to learn things fast enough to create programs to solve problems that go away by themselves or weren't worth spending time on in the first place. Sisyphus had more job satisfaction.
Now, I think Jim is a bit harsh. But only a bit. As a long-term architect, I can say that we (well, at least the ones that I've known) don't sit around trying to figure out how to make programmers' lives more difficult. But as someone who spent a decade primarily in management, PR, marketing, and finance (nearly anything except actual coding) and came back to find himself completely and utterly lost, I can affirm at least some of what Jim says.
When I was a coder, we worked on algorithms. Today, we memorize APIs for countless libraries; those libraries have the algorithms. Section 3 of the Unix manual used to fit comfortably in a small binder; today the Boost library alone has hundreds (if not thousands) of APIs. It used to be that the tools were helpers (i.e., optional but useful) and relatively minimal and well-defined: the build tool was make, and so on. Now I've worked on a project that used cmake, plus a couple of other build tools I never did identify, not to mention a series of scripts written in perl, with UML used to generate interface descriptions. In fact, most projects I've seen in the last few years seem to resemble Dr. Frankenstein's monster. There is no way you can work with such a mess without using eclipse or some similar tool.
But let me go further. I used to be enamored of object-oriented programming. I'm now finding myself leaning toward believing that it is a plot designed to destroy joy. The methodology looks clean and elegant at first, but when you actually get into real programs they rapidly turn into horrid messes. I will assert that you simply can't write any non-trivial program in Java or C++ without an "Environment" to help you. You need this to figure out WTF is going on with all those library calls that differ in very subtle ways. Debugging has turned into an exercise in figuring out why the methods you are calling don't do precisely what you anticipated. Engineers spend long meetings debating whether to upgrade to the latest version of (fill in favorite library name) out of fear that it will break things in unexpected ways. DLL Hell has turned into CLASSPATH Hell.
And who managed to convince the world that a malloc call was cheap, so you should do arbitrary numbers of them at every procedure (method) invocation? Memory management is expensive. Even the best algorithm blows the cache. Get used to it, at least until memory becomes so fast that we don't need caches any more. Wait, I hear a voice in the back of the room — no, garbage collection does not fix the problem; in fact, it makes it worse.
We spent years working on software reusability. We have now succeeded, and we have to pay for it. Be careful what you wish for.
The good news is that programming will become easier for people with good memories but not-so-great analytical capabilities. Unfortunately, I fall into the opposite camp. Alas, if only I had memorized my times tables instead of thinking about Boolean logic, base 8 arithmetic, and the like. I am a product of the "New Math", and it served me very well.
I have some hopes for Go, which Rob Pike specifically said was partially designed to make programming fun again (see about three minutes in). Similarly, Lua is a rather interesting language. I haven't tried to code anything significant in either language, so I can't speak from experience, but they do look interesting.
I don't have a solution. In fact, I'm probably just getting old enough that I'm not adapting to the new reality. I grew up when it was important to understand NAND gates and flip-flops. Now it's a different world. I can take some satisfaction that I rode a wonderful wave.