Thursday, December 31, 2009

Still More LaTeX On The Web










A good friend has brought to my attention yet another library for embedding LaTeX in an HTML page. It's a JavaScript library called MathJax. It looks like MathJax builds on jsMath, a 2004-vintage JavaScript library.

The results look beautiful to me. If you're a scientist, it's got to be great to have such wonderful tools so freely available. The typesetting is taken care of for you; all you have to do now is imagine great applications. And the proof is left for you! QED, baby.



New Year's Eve










2009 is almost in the books. Like everyone else, I usually look back on the year and ponder what I've done, resolve to fix what I've done poorly, and maintain or improve on what went well.

I'm a fortunate man, because I have some wonderful things that I've been lucky enough to be able to take for granted.

My mother is still with us - living in her own home, taking care of all her business. She still has all her wits about her and is as good looking as ever. Same for my in-laws. Seeing what people go through when they lose a parent, I'm grateful that the three that I still have left are all doing amazingly well.

I've been married now for 28 years and counting. In a time of 50% divorce, I'd say we're breaking the curve.

I have one daughter through her undergraduate degree and another whose degree is half paid for. My goal is to have two smart girls educated without any of us having to take on a mountain of debt.

I'm still working. When 10% of America is reported as out of work, and those who have stopped looking go uncounted, I'm humbled to have a job that doesn't appear to be shaky. I try my best to always make it worthwhile to my employer by pushing myself to continue to learn and stay sharp. That won't change in 2010.

My health was excellent last year. The problem I had with a bulging disk has not resurfaced. More on this topic to come.

The two things that I usually focus on are technical education and exercise. They're the two easiest things to quantify, especially that second one.

I think I had a pretty good technical year. I made a special effort to go against the grain of my architect title by trying to write code each and every day. It's important to practice, or the skill goes away. I didn't always succeed, but I think I've done a better job of it lately. That will ramp up in 2010. More languages (C#, Python), more math (Mathematica and LaTeX), and more coding will help. I've got a good list of projects to work through.

I signed up at Stack Overflow in Oct 2008, but I didn't answer my first question until Christmas Eve 2008. Since then, I've amassed over 29K points, rising as high as #47 in their point ranking. The negative side of that is that I spend far too much time on the site; the positive side is that I've learned a fair amount and (hopefully) helped a few people.

This blog still has a long way to go, but I've managed 27 posts this year, more than twice as many as all of 2008. I'm happy with this year's progress. I hope I'll find my voice in 2010.

I've got seven out of ten speeches under my belt towards a Competent Communicator rank in Toastmasters. I hope to finish that off early in 2010 and reach for the next rung. I'd love to try a competition and see how I do.

I'm going to start tracking my reading in the coming year, just to get a baseline for how much material I'm taking in.

I have something of an anal-retentive streak when it comes to exercise. I've been tracking my swimming yardage and attendance using an Excel spreadsheet since 1994. Last year was a blowout swimming year for me. I exceeded 524K yards for the year, averaging 10K yards per week for the first time ever. My previous best was 429K yards in 2006, the year before I hurt my neck. I managed 407K yards this year. I'm pleased to say that I've topped 400K yards three times, and all have come after I turned 50. I know I can't get back the youth and speed that I had when I was 20 or 30, but I'm still swimming pretty well. The younger me never had the courage (or the time) to attempt a Masters workout. Now it's a big part of my social life.

When I look at the difference between 2008 and 2009, it's obvious what happened. I swim in the morning before work and with a Masters group. My morning attendance and yardage were unchanged from 2008 to 2009, but my Masters yardage declined by 50%.

I have an easy scheme for improving on my 2009 totals: swim more. My standard was to swim a mile during my noon and morning workouts. One day I decided that 2000 yards per day would be my new minimum. I've maintained that for the last five years. Next year I plan to up the ante and shoot for 2200-2500 yards in the morning. If I can be more consistent with the Masters workouts I'll have a shot at matching my 2008 totals.

I rediscovered the joy of riding my bicycle to work this summer. I had my best year on a bike in about fifteen years. It was great to spend time with my best friend doing something I love. Next year I'll make the goal 1,000 miles for the summer. I'll start earlier in April and work my way to two rides per week.

I kept up with yoga all year. I averaged 90 minutes of yoga every week for the whole year. I've attended a weekly class faithfully and managed to do it fairly often on my own. I think it's helped my flexibility and core strength.

2009 was a pretty good year for me. I hope I can maintain it in 2010.



Sunday, November 29, 2009

More LaTeX On The Web












Sometimes I find myself in need of a GIF image of an equation or two. I want to be able to generate them quickly and easily, but I find that my MikTeX setup on Windows isn't helping.

Fortunately, a Google search turned up a utility at EquationSheet.com that converts LaTeX to an image. All I have to do is type in a snippet of LaTeX, hit the "convert" button, and I can see the equation rendered as an image. Even better, I can copy the URL and paste it into another HTML page as an image tag. Just what the doctor ordered!



Saturday, November 21, 2009

Coders At Work










I finished reading Peter Seibel's Coders At Work: Reflections on the Craft of Programming yesterday. I thought it was terrific and would recommend it highly to anyone who writes code for a living. It's an impressive follow-up to his Practical Common Lisp.

Peter drew his inspiration from the Paris Review Writers At Work. Great creativity is required to recognize an idea that's good in one context and reuse it in another.

The roster of interviewees is impressive, especially after having read the book. I only knew a minority of them when I opened it. Only Knuth and Ken Thompson were long-familiar names. I knew of JWZ by hearsay. I have a copy of Peter Norvig's "AI: A Modern Approach"; it's been over my head forever.

Donald Knuth is the biggest name; Peter holds him back for the last chapter. He seemed to me to be from another planet. I've written about my love of LaTeX; this is the man who spent ten years of his life coming up with TeX. I especially liked this bit on page 598:


With TeX I was interacting with hundreds of years of human history and I didn't want to throw out all of the things that book designers have learned over the centuries and start anew and say, "Well, forget that, guys; you know, we're going to be logical now."


The more I learn about Peter Norvig, the more impressed I am. The people who are exposed to his genius at Google are fortunate indeed.

Guy Steele didn't disappoint. He's brilliant, but he can't help working the fact that he went to MIT into the very first answer he gives. My informal survey tells me that this is true of each and every MIT graduate I've ever known personally - except for one. (My friend is one of the most brilliant, most accomplished people I know, but I didn't hear that his Ph.D. came from MIT until after I'd known him for a long time, until I heard it from a mutual friend.) Thank you for another data point, Guy.

Some of these folks are younger than I am, but most are my contemporaries. All are far more accomplished as programmers and computer scientists than I am. They were doing their best work in this field when I was a mechanical engineer. I'm doing my best to try and catch up, but I still have a very long way to go. I envy them their accomplishments.

It seems to me that Peter Seibel struck a very nice balance between scripting his questions and letting the conversation lead him. There was a thread of common questions that ran through all the interviews (e.g., "How do you debug?", "Do you read code?", "Do you consider yourself a scientist, engineer, craftsman, or artist?", "How much math is needed to be a good programmer?"), but each interviewee contributed special insights that would have been hard to anticipate.

The book brought home two things to me: how much computing has changed since these people were in their heydays and how much wonderful stuff is being lost to time. I haven't read "The Art of Computer Programming". I feel like I should, but I'm not sure that it'll provide a payoff for all that effort in the corporate enterprise computing world in which I earn my living today. Those with the talent and good fortune to be working for Google and firms that value deep knowledge are a minority.

Every one of the individuals interviewed by Peter Seibel is a giant in the field. Is that so because they did their best work early in the history of computer science? If 2% of people are wired to be programmers, and there are 6.67 billion people on earth, that means there are about 133,400,000 potential writers of C#, Java, Cobol, Lisp, or what have you. Will we find giants to compare to the early ones?

I've read that Shakespeare and Babe Ruth stood out in their fields because there were comparatively fewer contemporaries who were their equals. Now that we have everyone blogging on the Internet and full-time professional athletes, it's harder to stand out.

Programming is often compared to building buildings or manufacturing widgets or god knows what else. Like these other productive activities, it's often outsourced by American firms to be done by individuals in parts of the world that will do it for far less than their domestic counterparts. Will the next generation of great achievers come from their ranks? Will Peter Seibel have to travel further afield to hear about their exploits?

I enjoyed the book very much and recommend it highly.


Wednesday, November 11, 2009

Raking Leaves










The weather on the East Coast was unseasonably beautiful this weekend. It was sunny and warm on both Saturday and Sunday. It afforded me an opportunity to get the leaves off my yard and into a pile in the woods. My lot is 1.1 acres, with woods separating my house from the one behind me. There's a mountain of leaves taller than me in those woods tonight. It grew one tarp load at a time over the course of two days.

I always think of my father when I rake leaves. Some kids went to ball games with their dad; I raked leaves with mine.

He was an Irish immigrant who came to this country with the equivalent of a high school education. He got a job working for the water company in town. He was a member of the crew that worked to maintain the system of pipes and valves buried under the roads. They led from the treatment plants situated next to the reservoirs to the metered 1" diameter copper pipes that were the "last mile" into each and every house in town. He knew every pipe, both size and material, every valve in the system, because he had either put them in place or repaired them at one time or another. He had an unmatched encyclopedic knowledge of roads in town. He could associate each one with one job or another: "We had an 8 inch main break on Sinoway Road last night."

It was a good union job, but with six kids at home he didn't mind hustling for a few extra bucks. So on Saturdays he would go out and do yard work for people. He had regular customers that would have him cut the grass in the summer and rake the leaves in the fall. It meant getting up early on Saturday mornings and spending the day going from house to house. The last stop was always my grandmother's.

I was his eldest son. By my best recollection I was nine or ten years old the first time I went with him on a Saturday morning. I was less than useless. I didn't have the strength or stamina to help much in the beginning, but young legs can serve a purpose when you don't want to walk back to get a tool. He was easing me into the idea of helping.

As I became capable of contributing more, I remember going more frequently. I can't recall anymore how regular I was about the task. I'm sure that my faithfulness fell short. But I do remember going on more than a haphazard basis. I knew the names of all the customers, and they knew me. I can still point out the houses that haven't fallen victim to time and been knocked down to make way for McMansions.

These were quiet trips. We'd both get up early on Saturday mornings, load the appropriate tools into the car, and go on our appointed rounds. We didn't get coffee or chat a lot. He would thank me for helping, but whatever checks exchanged hands went into his pocket. I never questioned the arrangement. It was understood that it was my duty to help.

My father was very methodical and meticulous about raking leaves. He always had a large tarp that we'd spread out on a part of the lawn that was covered with leaves. "Don't rake onto a piece that you've already cleaned," he'd tell me. He would start in one corner and work in one direction, sweeping the area until there wasn't a leaf to be seen. He bought a gas-powered blower, the first one that I'd ever seen, that would speed the work and spare our hands the raking until we had a pile worthy of pushing into the tarp to be carried away.

After I went to college I didn't join him on Saturdays anymore. It's been too long - I can't pinpoint when he gave it up. Perhaps it was after my grandmother passed away. I never asked if the task fell to either of my younger brothers when I dropped the torch. I don't remember either of them joining us on Perryridge Road.

As I look back on it now, I like to think that I was chosen to go because I had the temperament for it. Maybe he liked doing it with me as much as I was proud to be chosen by him. It was something that I did with my father that no one else did.

Some kids went to ball games with their dad; I raked leaves with mine.

I still carry that experience with me to this day. I rake leaves the way he taught me. He would have been happy with my handiwork this weekend. At the end of a day of work - the sky red from the setting sun; the chilled air reminding me that it's the waning of another year; the dead silence at the end of a late autumn day - I think of my father.


Monday, November 9, 2009

When Everything Changed










I've finished reading Gail Collins' "When Everything Changed: The Amazing Journey of American Women From 1960 To The Present."

Amazing, indeed.

I've lived through the period described by the book, although I was awfully young for the earliest years. I remembered as I read, but having the changes spelled out so clearly was astonishing. Seeing how much has changed in such a short period of time, I had a feeling of disbelief as I read: "Did we really live like that? Is that what people thought back then?"

I can't fully identify because of my gender, but I can appreciate the difference as the father of two daughters. My sisters had to fight the good fight to go to college; my daughters grew up with the expectation that they'd go. Such a difference in the span of one or two generations.

Gail's writing style mirrors her columns on the editorial pages of the New York Times: part historian, part educator, part wry observer, all glued together by a dry wit. I happen to love it. I laughed out loud in places.

Of course I'll have my daughters read the book. It'll be a good lesson for them to see how their choices have expanded. I recommend it highly.


Sunday, November 8, 2009

jsMath: Typesetting Math In A Browser










I just became aware of jsMath, a JavaScript library from the Math Union for typesetting mathematics in a browser.

The content from the web site says it far better than I will, but it looks like jsMath was inspired by the slow adoption of MathML support in browsers on Windows, Mac, and *nix machines.

The examples that the web site offers look beautiful. It's based on TeX, so it's no surprise that the results look so good. I didn't get a chance to dive into it this weekend. Unseasonably nice weather on the East Coast made it possible for me to clean up all the leaves that were covering my yard, so time to program was hard to come by. But I'll be looking into this gem soon. It's a nice complement to my recent rediscovery of LaTeX.

It amazes me to see how smart people can come up with things like this. It's also another example of the increasing reach of JavaScript. Brendan Eich's language is becoming more important every day.


Thursday, October 29, 2009

A New Old Idea









This blog entry is different from all the others I've posted to date. Whenever I've uploaded a photo, I've gotten it by searching Flickr.com Creative Commons for something that fit the theme of the post.

I took the picture that accompanies today's entry. I wanted to try something new.

Writer's block is a problem for me. I've had trouble for a while with finding my voice here. I love to write, but I've had a hard time making up my mind what I should focus on. A purely technical blog, like Jeff Atwood's Coding Horror, would be worthwhile, but I have a hard time filtering out more personal and non-technical thoughts. I feel a little exposed putting too much personal information out on the Internet. The frequency of posts shows my problem: when I have long gaps, I'm having trouble coming up with a topic.

The funny thing is that inspiration is all around us. I see all kinds of small details that are interesting to me. But they're often forgotten in the bustle of getting through busy weeks.

I used to keep an electronic journal. I have entries dating back to 1994 that comprise a special personal history. Sadly, it's fallen into decay. I don't have the same inspiration for it.

At one time I thought that learning to draw from Betty Edwards' wonderful "Drawing On The Right Side Of The Brain" would be my inspiration. I did the exercises faithfully. I would draw random things while sitting in meetings, just to hone my skills, my eye, and the shift to right-brain mode. I loved it - until I hit the chapter on portrait drawing. My left brain was too critical. I couldn't find a way to quiet it. I would still love to find a way over the barrier, but to date I've been unsuccessful.

I have learned how to adopt ideas from people who are smarter than me. I have no problem emulating and following when I see someone doing something that I admire.

I went to Boston this past weekend to see the just-completed renovation job on my beloved second sister's house (it's spectacular). My beloved oldest sister was taking photos with a Fuji digital camera that had a big viewfinder, took great pictures, and fit into a shirt pocket. I've never been a photographer. I've tried to concentrate on experiencing the moment rather than preserving it. But watching her snap away unobtrusively made me think "I could do that, too."

So I went out the other night and picked up a Fuji FinePix J38 - same model as hers; same color, black. Did I say "no problem emulating"? That meant "slavishly copying." I grabbed a case sturdy enough to protect the viewfinder and still slide into a pocket without looking too bulky. ("Is that a Fuji FinePix J38 in your pocket or are you just glad to see me?")

My idea is that I'll try to have the camera on hand as much as possible. If I see something interesting, I'll snap it, upload it later, and perhaps write about it here. I'm not going to worry so much about what comes out; I'm going to just keep practicing and writing.

I failed at drawing; now this jewel of technology will be my cache for ideas. It's got to piss off every truly skilled photographer who has spent a lifetime mastering film, technology, light, developing techniques, and the all-important artist's eye. Digital cameras have become so cheap and so good that any fool can trick themselves into thinking that they're Ansel Adams.

I tried it this morning when I went to work. The light and the view when I came out of the pool before work were enticing, so I unselfconsciously took out my camera, stood on the sidewalk, and snapped away. I took a few on the way out as well. I liked this picture the best of all.

I'm going to emulate my beloved, beautiful, brilliant eldest daughter, too. She's been writing a blog on street art for almost a year. The amazing thing about it, besides the content, is that she posts an entry every single weekday, without fail.

How does she do it? By treating it like a job. She lines up her sources, writes the pieces, and queues them up for daily release. Her dedication, discipline, and work ethic are as impressive as the material she elicits from all over the world.

I'd like to try that, too. I need to be more focused on what I'm pumping out there. I shouldn't have only one or two posts per month. I can do better.

What does all this mean? Probably nothing. I'm just another guy on the Internet, taking pictures and blabbing about himself, putting it out because Blogger makes it easy, thinking that it's terribly interesting and world-changing. How self-indulgent and boring! Right?

I'll concede that point.

I like the mental stimulation of trying to do something that I've never done. I want to keep thinking and challenging myself. I prefer this to coming home and settling in front of a television every night. I don't bloody well care if anybody notices. I'm doing this solely for myself, for its own sake.

Saturday, October 24, 2009

Binary Tree Iterators










In my last post, I presented code (and unit tests) for a binary tree implementation in Java. I alluded to the fact that there was a depth-first iterator in one of the test classes. This time I'll discuss iterators and present more practice code.

Iterator is one of the behavioral patterns described by the Gang of Four in their classic "Design Patterns". It provides the means for walking through a data structure without having to expose the details of how it's done.

Timothy Budd provides an excellent explanation of binary trees and iterators in chapter 10 of his "Classic Data Structures In C++".

Binary trees are interesting data structures. If the n nodes of a binary tree could be visited in any order at all, there would be n! = n*(n-1)*...*2*1 different orderings in which to visit every node.

But a traversal that treats the value, the left subtree, and the right subtree as units, visited in some fixed order, has only 3! = 6 choices:

  1. Process value, then left child, then right child
  2. Process left child, then value, then right child
  3. Process left child, then right child, then value
  4. Process value, then right child, then left child
  5. Process right child, then value, then left child
  6. Process right child, then left child, then value


Subtrees are usually traversed from left to right, so the first three possibilities are the most common. Each is given a name that may be more familiar and memorable. The first is called preorder or depth-first traversal; the second in-order or symmetric traversal; and the third post-order traversal. There is also level order or breadth-first traversal, where all the nodes at one level are visited before proceeding to the next.
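To make those orderings concrete, here's a minimal recursive sketch of the three left-to-right traversals plus level order. The stripped-down Node and insert below are standalone stand-ins I wrote for illustration, not the Node&lt;T&gt; and BinaryTree&lt;T&gt; classes from my last post.

```java
import java.util.LinkedList;
import java.util.Queue;

public class Traversals
{
    // Bare-bones node for illustration only; not the Node<T> class from my tree posts.
    static class Node
    {
        int value;
        Node left, right;

        Node(int value) { this.value = value; }
    }

    // Standard binary search tree insert: smaller values go left, others go right.
    static Node insert(Node root, int value)
    {
        if (root == null) return new Node(value);
        if (value < root.value) root.left = insert(root.left, value);
        else root.right = insert(root.right, value);
        return root;
    }

    // 1. Value, then left subtree, then right subtree (preorder).
    static void preOrder(Node n, StringBuilder out)
    {
        if (n == null) return;
        out.append(n.value).append(',');
        preOrder(n.left, out);
        preOrder(n.right, out);
    }

    // 2. Left subtree, then value, then right subtree (in-order; sorted for a BST).
    static void inOrder(Node n, StringBuilder out)
    {
        if (n == null) return;
        inOrder(n.left, out);
        out.append(n.value).append(',');
        inOrder(n.right, out);
    }

    // 3. Left subtree, then right subtree, then value (post-order).
    static void postOrder(Node n, StringBuilder out)
    {
        if (n == null) return;
        postOrder(n.left, out);
        postOrder(n.right, out);
        out.append(n.value).append(',');
    }

    // Level order: drain a FIFO queue, enqueueing children as each node is visited.
    static void levelOrder(Node root, StringBuilder out)
    {
        Queue<Node> queue = new LinkedList<Node>();
        if (root != null) queue.add(root);
        while (!queue.isEmpty())
        {
            Node n = queue.remove();
            out.append(n.value).append(',');
            if (n.left != null) queue.add(n.left);
            if (n.right != null) queue.add(n.right);
        }
    }

    public static void main(String[] args)
    {
        Node root = null;
        for (int v : new int[]{5, 3, 9, 1, 4, 6}) root = insert(root, v);

        StringBuilder pre = new StringBuilder(), in = new StringBuilder();
        StringBuilder post = new StringBuilder(), level = new StringBuilder();
        preOrder(root, pre);
        inOrder(root, in);
        postOrder(root, post);
        levelOrder(root, level);

        System.out.println("pre-order:   " + pre);   // 5,3,1,4,9,6,
        System.out.println("in-order:    " + in);    // 1,3,4,5,6,9,
        System.out.println("post-order:  " + post);  // 1,4,3,6,9,5,
        System.out.println("level-order: " + level); // 5,3,9,1,4,6,
    }
}
```

Note how in-order visits a binary search tree's values in sorted order - a nice sanity check when testing.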

Java has an Iterator interface in its java.util package that defines the methods every implementing class must provide. Its next() method returns a generic value. For my binary tree iterator I knew there'd be times when I wanted the next Node instead of the value, so I extended the Iterator interface and added a nextNode() method that returns a Node<T>:

package tree;

import java.util.Iterator;

public interface BinaryTreeIterator<T extends Comparable<T>> extends Iterator<T>
{
    Node<T> nextNode();
}


Then I created an abstract class that provided default behavior for all methods except the ones that provided the next Node value and whether or not the walk through the tree was complete:

package tree;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

public abstract class AbstractBinaryTreeIterator<T extends Comparable<T>> implements BinaryTreeIterator<T>
{
    public static final Log LOGGER = LogFactory.getLog(AbstractBinaryTreeIterator.class);

    public void remove()
    {
        throw new UnsupportedOperationException("cannot remove from a binary tree");
    }

    public T next()
    {
        Node<T> nextNode = nextNode();

        return nextNode.getValue();
    }
}


The post-order traversal visits the left child, then the right child, then the value, recursively. My implementation precomputes the whole visit order at construction time: it pushes nodes onto a last-in, first-out (LIFO) stack so that the first node to be visited ends up on top. The iteration starts by pushing the root node onto the stack and recursing down the tree:

package tree;

import java.util.NoSuchElementException;
import java.util.Stack;

public class PostOrderIterator<T extends Comparable<T>> extends AbstractBinaryTreeIterator<T>
{
    protected Stack<Node<T>> stack = new Stack<Node<T>>();

    public PostOrderIterator(BinaryTree<T> tree)
    {
        init(tree.getRoot());
    }

    public void init(Node<T> root)
    {
        if (root != null)
        {
            stack.clear();
            stackChildren(root);
        }
    }

    private void stackChildren(Node<T> node)
    {
        stack.push(node);
        Node<T> next = node.getRight();
        if (next != null)
        {
            stackChildren(next);
        }
        next = node.getLeft();
        if (next != null)
        {
            stackChildren(next);
        }
    }

    public Node<T> nextNode()
    {
        if (!hasNext())
        {
            throw new NoSuchElementException();
        }

        Node<T> x = null;

        if (!stack.empty())
        {
            if (LOGGER.isDebugEnabled())
            {
                LOGGER.debug(stack);
            }

            x = stack.pop();
        }

        return x;
    }

    public boolean hasNext()
    {
        return !stack.isEmpty();
    }
}


Of course there are unit tests:

package tree;

import org.springframework.test.context.ContextConfiguration;
import static org.testng.Assert.assertFalse;
import static org.testng.AssertJUnit.assertEquals;
import org.testng.annotations.DataProvider;
import org.testng.annotations.Test;

import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;
import java.util.NoSuchElementException;

@Test
@ContextConfiguration(locations = "classpath:app-context.xml, classpath:app-context-test.xml")
public class BinaryTreeIteratorTest
{
    public void testHasNextEmptyTree()
    {
        BinaryTree<String> empty = new BinaryTree<String>();
        AbstractBinaryTreeIterator<String> iterator = new DepthFirstIterator<String>(empty);

        assertFalse(iterator.hasNext());
    }

    @Test(expectedExceptions = UnsupportedOperationException.class)
    public void testRemove()
    {
        BinaryTree<String> empty = new BinaryTree<String>();
        AbstractBinaryTreeIterator<String> iterator = new DepthFirstIterator<String>(empty);
        iterator.remove();
    }

    public void testNextDepthFirst()
    {
        Integer[] data = {5, 3, 9, 1, 4, 6,};
        BinaryTree<Integer> tree = new BinaryTree<Integer>();
        tree.insert(data);

        String expected = "5,3,1,4,9,6,";
        AbstractBinaryTreeIterator<Integer> iterator = new DepthFirstIterator<Integer>(tree);
        String actual = createCommaSeparatedString(iterator);

        assertEquals(actual, expected);
    }

    public void testNextPostOrder()
    {
        Integer[] data = {5, 3, 9, 1, 4, 6,};
        BinaryTree<Integer> tree = new BinaryTree<Integer>();
        tree.insert(data);

        String expected = "1,4,3,6,9,5,";
        AbstractBinaryTreeIterator<Integer> iterator = new PostOrderIterator<Integer>(tree);
        String actual = createCommaSeparatedString(iterator);

        assertEquals(actual, expected);
    }

    private static String createCommaSeparatedString(Iterator<?> iterator)
    {
        StringBuffer actual = new StringBuffer(1024);
        while (iterator.hasNext())
        {
            actual.append(iterator.next()).append(',');
        }

        return actual.toString();
    }

    @DataProvider(name = "emptyTreeIterators")
    public Object[][] createEmptyTreeIterators()
    {
        BinaryTree<String> tree = new BinaryTree<String>();

        return new Object[][]{
            {new BreadthFirstIterator<String>(tree)},
            {new DepthFirstIterator<String>(tree)},
            {new InOrderIterator<String>(tree)},
            {new PostOrderIterator<String>(tree)},
        };
    }

    @Test(expectedExceptions = NoSuchElementException.class, dataProvider = "emptyTreeIterators")
    public void testNextEmptyTree(AbstractBinaryTreeIterator<String> iterator)
    {
        iterator.next();
    }

    public void testNext()
    {
        String[] data = {"F", "B", "A", "D", "C", "E", "G", "I", "H",};
        BinaryTree<String> tree = new BinaryTree<String>();
        tree.insert(data);

        Map<String, String> expected = new HashMap<String, String>();
        expected.put("depth-first", "F,B,A,D,C,E,G,I,H,");
        expected.put("in-order", "A,B,C,D,E,F,G,H,I,");
        expected.put("post-order", "A,C,E,D,B,H,I,G,F,");
        expected.put("breadth-first", "F,B,G,A,D,I,C,E,H,");

        String name = "depth-first";
        AbstractBinaryTreeIterator<String> iterator = new DepthFirstIterator<String>(tree);
        AbstractBinaryTreeIterator.LOGGER.debug(name);
        assertEquals(name, expected.get(name), createCommaSeparatedString(iterator));

        name = "in-order";
        iterator = new InOrderIterator<String>(tree);
        AbstractBinaryTreeIterator.LOGGER.debug(name);
        assertEquals(name, expected.get(name), createCommaSeparatedString(iterator));

        name = "post-order";
        iterator = new PostOrderIterator<String>(tree);
        AbstractBinaryTreeIterator.LOGGER.debug(name);
        assertEquals(name, expected.get(name), createCommaSeparatedString(iterator));

        name = "breadth-first";
        iterator = new BreadthFirstIterator<String>(tree);
        AbstractBinaryTreeIterator.LOGGER.debug(name);
        assertEquals(name, expected.get(name), createCommaSeparatedString(iterator));
    }
}


Tests are great, but being a visual person I like to be able to see a picture. There's a terrific package called graphviz from AT&T that lets you generate graph plots in an elegant way. I wrote a quick program to walk the binary tree { F,B,A,D,C,E,G,I,H } using an iterator and spit out the dot representation, like so:

digraph simple_hierarchy {
F->B [label="L"]
F->G [label="R"]
B->A [label="L"]
B->D [label="R"]
D->C [label="L"]
D->E [label="R"]
G->I [label="R"]
I->H [label="L"]
}


I can use this as the input to dot.exe to render the tree as a .png or .svg file:
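The quick program itself isn't in the post, but the idea fits in a page: a pre-order walk that prints one labeled edge per non-null child, so parents always appear ahead of their descendants. This is a standalone sketch - the tiny Node and insert here are illustrative stand-ins I made up, not the Node&lt;T&gt; and BinaryTree&lt;T&gt; classes from my posts.

```java
public class DotWriter
{
    // Illustrative stand-in for the Node<T> class from my tree posts.
    static class Node
    {
        String value;
        Node left, right;

        Node(String value) { this.value = value; }
    }

    // Standard binary search tree insert for strings.
    static Node insert(Node root, String value)
    {
        if (root == null) return new Node(value);
        if (value.compareTo(root.value) < 0) root.left = insert(root.left, value);
        else root.right = insert(root.right, value);
        return root;
    }

    // Emit the edges to this node's children, then recurse into each subtree.
    static void edges(Node node, StringBuilder out)
    {
        if (node == null) return;
        if (node.left != null)
        {
            out.append(node.value).append("->").append(node.left.value).append(" [label=\"L\"]\n");
        }
        if (node.right != null)
        {
            out.append(node.value).append("->").append(node.right.value).append(" [label=\"R\"]\n");
        }
        edges(node.left, out);
        edges(node.right, out);
    }

    static String toDot(Node root)
    {
        StringBuilder out = new StringBuilder("digraph simple_hierarchy {\n");
        edges(root, out);
        return out.append("}\n").toString();
    }

    public static void main(String[] args)
    {
        Node root = null;
        for (String s : new String[]{"F", "B", "A", "D", "C", "E", "G", "I", "H"})
        {
            root = insert(root, s);
        }
        System.out.print(toDot(root)); // matches the dot text above
    }
}
```

Feeding the printed text to dot produces the rendered tree.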



I'll post the other iterators next time.

All this work just to answer a single interview question! This tells me that doing it justice would be hard in such a short period of time. I would flunk such an interview.

Thanks to my friend Steve Roach, who was the first person to bring graphviz (and so many other things) to my attention. He's one of the most talented developers I've had the pleasure of working with, but it's his teaching ability that's his greatest strength. He's one of those people who makes everyone around him better. Such intellectual generosity is rare indeed.

Sunday, October 18, 2009

Practice, Practice, Practice









I spent several hours last week interviewing Java developers. We needed to bring in some contractors. It was decided that we'd ask candidates to write some Java code as part of the interview process. We had a list of twenty quiz questions that ranged in difficulty. Since our time was limited, we'd restrict ourselves to one or two quiz questions per candidate.

I had mixed feelings about the quiz questions. I'm familiar with Joel Spolsky's Guerrilla Guide to Interviewing. I didn't want to be a Quiz Show Interviewer.

But there was something else nagging me. How would I fare if presented with this quiz?

One of the questions was: "Traverse a binary tree in depth-first order." I'd bet that anybody fresh out of a good data structures class would be able to whip this out quickly enough to be able to fit into the scope of a 30 minute phone conversation.

But what about a guy like me?

The only data structure I was taught as a mechanical engineer was a FORTRAN array. I was keenly aware of my ignorance when I started down the software path, so I did what I always do: sign up for a degree and start taking courses. I began a Master of Science degree in computer science. It included basics like data structures, but they taught it using Eiffel. And that was ten years ago. I'd have to dredge up those memories and express them in Java.

Even worse, I've been working as an architect for years now. The big firms that employ me tend to take a dim view of architects who write code. Programming is considered a low-level, commodity skill best left to the least expensive offshore developers you can find.

It made me nervous to think about how foolish I'd look taking my own interview.

So I started working through the interview questions, one by one. I spent some time this weekend on that binary tree implementation. I got through it using all the best practices I knew: Java, test-driven development with TestNG, refactoring in IntelliJ, and so on. It took more than thirty minutes to finish, but I'm pleased with the result. I am not fast, but I think I'm conscientious and thorough.

Practice will help with my speed. Part of the value of this exercise is to practice, practice, practice. People I respect a great deal advocate constant practice, calling it code kata.

I think it's especially important for someone with "architect" in their job title. How on earth can you stand up for "best practices" when you're so woefully out of practice yourself?

The other benefit? It's fun and satisfying. There's still a great sense of satisfaction, even of mathematical beauty, whenever I manage to pull myself through whatever rabbit hole I've fallen into. There's frustration, too, when I struggle and fall and fail. But when it works, there's nothing like it.

It's no different from any other profession. We all have to keep learning, struggling, adding new skills, re-sharpening old ones.

If you're not a programmer, you can stop reading here. (Thank you for coming at all and getting this far.)

If you are a programmer, and you're still interested, here's my solution to the first part of the problem: a binary tree in Java. I started with a Node class that encapsulated a value plus left and right child references:

package tree;

public class Node<T extends Comparable<T>>
    implements Comparable<Node<T>>
{
    private T value;
    private Node<T> left;
    private Node<T> right;

    public Node(T value)
    {
        this(value, null, null);
    }

    public Node(T value, Node<T> left, Node<T> right)
    {
        this.setValue(value);
        this.left = left;
        this.right = right;
    }

    public T getValue()
    {
        return value;
    }

    public void setValue(T value)
    {
        if (value == null)
            throw new IllegalArgumentException("node value cannot be null");

        this.value = value;
    }

    public Node<T> getLeft()
    {
        return left;
    }

    public void setLeft(Node<T> left)
    {
        this.left = left;
    }

    public Node<T> getRight()
    {
        return right;
    }

    public void setRight(Node<T> right)
    {
        this.right = right;
    }

    public boolean isLeaf()
    {
        return ((this.left == null) && (this.right == null));
    }

    public int compareTo(Node<T> other)
    {
        return this.getValue().compareTo(other.getValue());
    }

    @Override
    public boolean equals(Object o)
    {
        if (this == o)
        {
            return true;
        }
        if (o == null || getClass() != o.getClass())
        {
            return false;
        }

        Node<?> node = (Node<?>) o;

        return value.equals(node.value);
    }

    @Override
    public int hashCode()
    {
        return value.hashCode();
    }

    @Override
    public String toString()
    {
        return value.toString();
    }
}


Of course I wrote a TestNG class to unit test it:

package tree;

import static org.testng.Assert.*;
import org.testng.annotations.Test;

@Test
public class NodeTest
{
    /**
     * For any non-null reference value x, x.equals(null)
     * must return false.
     */
    public void testNotNull()
    {
        Node<String> x = new Node<String>("test");

        assertFalse(x.equals(null));
    }

    /**
     * It is reflexive: for any reference value x,
     * x.equals(x) must return true.
     */
    public void testReflexive()
    {
        Node<String> x = new Node<String>("test");

        assertTrue(x.equals(x));
        assertEquals(0, x.compareTo(x));
    }

    /**
     * It is symmetric: for any reference values x and y,
     * x.equals(y) must return true if and only if
     * y.equals(x) returns true.
     */
    public void testSymmetric()
    {
        Node<String> x = new Node<String>("test");
        Node<String> y = new Node<String>("test");
        Node<String> z = new Node<String>("something else");

        assertTrue(x.equals(y) && y.equals(x));
        assertTrue((x.compareTo(y) == 0) && (y.compareTo(x) == 0));
        assertFalse(x.equals(z));
        assertTrue(x.compareTo(z) > 0);
    }

    /**
     * It is transitive: for any reference values x, y,
     * and z, if x.equals(y) returns true and y.equals(z)
     * returns true, then x.equals(z) must return true.
     */
    public void testTransitive()
    {
        Node<String> x = new Node<String>("test");
        Node<String> y = new Node<String>("test");
        Node<String> z = new Node<String>("test");

        assertTrue(x.equals(y) && y.equals(z) && z.equals(x));
        assertTrue((x.compareTo(y) == 0) &&
                   (y.compareTo(z) == 0) &&
                   (z.compareTo(x) == 0));
    }

    public void testHashCode()
    {
        Node<String> x = new Node<String>("test");
        Node<String> y = new Node<String>("test");
        Node<String> z = new Node<String>("something else");

        assertTrue(x.hashCode() == y.hashCode());
        assertFalse(x.hashCode() == z.hashCode());
    }

    public void testToString()
    {
        String expected = "expected";

        Node<String> node = new Node<String>(expected);
        assertEquals(node.toString(), expected);
    }

    @Test(expectedExceptions = NullPointerException.class)
    public void testCompareToNull()
    {
        Node<String> x = new Node<String>("test");

        x.compareTo(null);
    }

    public void testCompareTo()
    {
        Node<String> x = new Node<String>("x");
        Node<String> y = new Node<String>("y");
        Node<String> z = new Node<String>("z");

        assertTrue((x.compareTo(x) == 0) &&
                   (x.compareTo(y) < 0) &&
                   (x.compareTo(z) < 0));
        assertTrue((y.compareTo(x) > 0) &&
                   (y.compareTo(y) == 0) &&
                   (y.compareTo(z) < 0));
        assertTrue((z.compareTo(x) > 0) &&
                   (z.compareTo(y) > 0) &&
                   (z.compareTo(z) == 0));
    }

    @Test(expectedExceptions = IllegalArgumentException.class)
    public void testNullValue()
    {
        Node<String> x = new Node<String>(null);
    }

    public void testIsLeaf()
    {
        Node<String> x = new Node<String>("test");

        assertTrue(x.isLeaf());

        x.setLeft(new Node<String>("left"));
        x.setRight(new Node<String>("right"));

        assertFalse(x.isLeaf());

        assertEquals("left", x.getLeft().getValue());
        assertEquals("right", x.getRight().getValue());
    }
}


Then I wrote a BinaryTree:

package tree;

import java.util.Arrays;
import java.util.List;

public class BinaryTree<T extends Comparable<T>>
{
    private Node<T> root;

    public BinaryTree()
    {
        this(null);
    }

    public BinaryTree(Node<T> root)
    {
        this.root = root;
    }

    public Node<T> getRoot()
    {
        return this.root;
    }

    public boolean contains(T value)
    {
        return contains(this.root, value);
    }

    private boolean contains(Node<T> node, T value)
    {
        // An empty tree can't contain the value; the value
        // cannot be null.
        if ((node == null) || (value == null))
        {
            return false;
        }

        if (value.equals(node.getValue()))
        {
            return true;
        }
        else if (value.compareTo(node.getValue()) < 0)
        {
            return contains(node.getLeft(), value);
        }
        else
        {
            return contains(node.getRight(), value);
        }
    }

    public void insert(T value)
    {
        this.root = insert(this.root, value);
    }

    public void insert(List<T> values)
    {
        if ((values != null) && (values.size() > 0))
        {
            for (T value : values)
            {
                insert(value);
            }
        }
    }

    public void insert(T [] values)
    {
        if ((values != null) && (values.length > 0))
        {
            insert(Arrays.asList(values));
        }
    }

    private Node<T> insert(Node<T> node, T value)
    {
        if (node == null)
        {
            return new Node<T>(value);
        }

        if (value.compareTo(node.getValue()) < 0)
        {
            node.setLeft(insert(node.getLeft(), value));
        }
        else
        {
            node.setRight(insert(node.getRight(), value));
        }

        return node;
    }

    public int size()
    {
        return size(root);
    }

    private int size(Node<T> node)
    {
        if (node == null)
        {
            return 0;
        }

        return (size(node.getLeft()) + 1 + size(node.getRight()));
    }

    public int height()
    {
        return height(root);
    }

    private int height(Node<T> node)
    {
        if (node == null)
        {
            return 0;
        }

        int leftHeight = height(node.getLeft());
        int rightHeight = height(node.getRight());

        return (Math.max(leftHeight, rightHeight) + 1);
    }

    public boolean isEmpty()
    {
        return (root == null);
    }
}


And a TestNG unit test:

package tree;

import static org.testng.Assert.assertFalse;
import static org.testng.Assert.assertTrue;
import static org.testng.AssertJUnit.assertEquals;
import org.testng.annotations.BeforeTest;
import org.testng.annotations.Test;

import java.util.Arrays;
import java.util.List;

@Test
public class BinaryTreeTest
{
    private BinaryTree<Integer> tree;
    private Integer [] data = { 5, 3, 9, 1, 4, 6 };

    @BeforeTest
    public void setUp()
    {
        tree = new BinaryTree<Integer>();
        tree.insert(data);
    }

    public void testIsEmpty()
    {
        assertFalse(tree.isEmpty());
        assertTrue(new BinaryTree<Integer>().isEmpty());
    }

    public void testSize()
    {
        assertEquals(tree.size(), data.length);
    }

    public void testHeight()
    {
        int expected = 3;
        assertEquals(tree.height(), expected);
    }

    public void testGetRoot()
    {
        Node<Integer> expected = new Node<Integer>(data[0]);
        assertEquals(tree.getRoot(), expected);
    }

    public void testContains()
    {
        for (int value : data)
        {
            assertTrue(tree.contains(value));
            assertFalse(tree.contains(value * 1000));
        }
    }

    public void testInsertList()
    {
        // When you insert a list, you get it back without
        // alteration using a pre-order, depth-first traversal.
        List<String> data
            = Arrays.asList("F","B","A","D","C","E","G","I","H");
        BinaryTree<String> tree = new BinaryTree<String>();
        tree.insert(data);

        // Check the size
        assertEquals(tree.size(), data.size());

        // Now check the values
        BinaryTreeIterator<String> iterator
            = new DepthFirstIterator<String>(tree);
        int i = 0;
        while (iterator.hasNext())
        {
            assertEquals("i = " + i, iterator.next(), data.get(i++));
        }
    }

    public void testInsertArray()
    {
        // When you insert an array, you get it back without
        // alteration using a pre-order, depth-first traversal.
        String [] data = {"F","B","A","D","C","E","G","I","H"};
        BinaryTree<String> tree = new BinaryTree<String>();
        tree.insert(data);

        assertEquals(tree.size(), data.length);

        BinaryTreeIterator<String> iterator
            = new DepthFirstIterator<String>(tree);
        int i = 0;
        while (iterator.hasNext())
        {
            assertEquals("i = " + i, iterator.next(), data[i++]);
        }
    }

    public void testInsertNullList()
    {
        List<String> data = null;
        BinaryTree<String> tree = new BinaryTree<String>();
        tree.insert(data);

        assertEquals(tree.size(), 0);
    }

    public void testInsertNullArray()
    {
        String [] data = null;
        BinaryTree<String> tree = new BinaryTree<String>();
        tree.insert(data);

        assertEquals(tree.size(), 0);
    }
}


That was the groundwork. The real solution meant writing iterators to traverse the BinaryTree. If you're reading closely, you'll see that I used a DepthFirstIterator in the unit test for BinaryTree.

I'll post those next time. In the meantime, I'll keep practicing.
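For the curious, here's a rough idea of the shape such an iterator can take. This is a simplified, self-contained sketch of a pre-order, depth-first walk using an explicit stack, not my actual DepthFirstIterator, which is generic and implements the BinaryTreeIterator interface used in the tests above:

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Iterator;
import java.util.NoSuchElementException;

// Sketch of a pre-order, depth-first iterator over a tree of strings.
// An explicit stack replaces the call stack of the recursive version.
class PreOrderIterator implements Iterator<String> {
    static class Node {
        final String value;
        Node left, right;
        Node(String value) { this.value = value; }
    }

    private final Deque<Node> stack = new ArrayDeque<Node>();

    PreOrderIterator(Node root) {
        if (root != null) stack.push(root);
    }

    public boolean hasNext() {
        return !stack.isEmpty();
    }

    public String next() {
        if (stack.isEmpty()) throw new NoSuchElementException();
        Node node = stack.pop();
        // Push the right child first so the left subtree is visited first.
        if (node.right != null) stack.push(node.right);
        if (node.left != null) stack.push(node.left);
        return node.value;
    }

    public void remove() {
        throw new UnsupportedOperationException();
    }
}
```

Visiting the node before pushing its children is what makes this pre-order; a post-order or in-order iterator needs a little more bookkeeping on the stack.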

Monday, October 12, 2009

Frightening Update




I recently wrote about how frightening the financial situation of the United States has become. How can an entity as complex as the United States economy get away with cash-based accounting, the same method used by hot dog stands?

After posting that blog I had breakfast with one of my oldest and best friends. I met him early in my engineering career. We branched in graduate school after we both completed master's degrees: I went on with mechanical engineering, while he pursued an MBA. We branched again when I abandoned engineering and ran down the software track.

We've managed to stay in contact in spite of the fact that we no longer work together. We'd meet for lunch until the noon hour stopped being considered a sacred "meeting free" time. Then we switched to monthly breakfasts before work. These are coffee-fueled discussions about all our favorite topics - engineering, politics, religion, movies, sports. I can't think of many people in my life with whom I could spend an hour this way. I look forward to them like nothing else.

My friend is a dyed-in-the-wool leftie who's against all things Republican. I tend to be left of center as well, but it's easier for me to slip into "a pox on both your houses" mode when I think that Democrats aren't living up to their ideals. It's a position that's easy to assume these days. Except for the change in tone I don't see much difference between the current administration and the Republican regimes of the last eight years, with two wars in progress and being funded by "off budget" expenditures, the Treasury taken over by Goldman Sachs, the Patriot Act in place, Glass-Steagall rolled back, and Guantanamo in operation.

During our last breakfast I relayed the essence of my distress: Why doesn't the US government use GAAP? I thought myself very clever when I said that cash based accounting made us no better than a hot dog vendor.

My friend drew on the wellspring of knowledge he accumulated during his MBA and shot me down: "Cash-based accounting is the most honest form there is. You look into the till and report on how much cash you see!" He pointed out that GAAP is loaded with loopholes and tricks. Depreciation makes all kinds of shenanigans possible.

My argument was shot down. I was an easy target, because I've never studied accounting.

I'd appreciate it if someone could explain to me where I went wrong. I realize that it won't be possible to relate or absorb such a vast subject in the space of a web comment, but I could use a nudge in the right direction.

Perhaps the real answer is that no accounting system is a guarantee of honesty and transparency. Scandals too numerous to list - Baan, Enron, and the rest - have taught us that outright lying can be done under any accounting system. GAAP won't stop you from booking loans as income, moving sales back from one quarter to another to make a bad quarter look better, shipping product to a warehouse you own and treating it as sales, or ignoring an IOU to Social Security and Medicare because you spent the cash accumulated in those funds back in the Vietnam days.

The real problem isn't the way the Congressional Budget Office is tallying the numbers. It's the way we're misled about what the numbers are in the first place.

One reason for the housing boom is people taking on debts beyond their means, assuming that the value of the house would always appreciate at a rate that would make it possible to flip the house at a profit before paying for it became a problem.

I think our current fiscal situation is based on an equally flawed assumption: that the United States will always be the world's pre-eminent economic power, that the rest of the world will never catch up, that our economy will always continue to grow fast enough to make it possible to pay off the obligations we're piling up.

I keep waiting for the Obama administration to start telling us the truth, but it's like waiting for Godot.






Thursday, October 8, 2009

Why Is UML So Hard?





I changed careers back in 1995, jumping from mechanical engineering to software development. I've worked hard to learn what object-oriented programming is all about and what advantages it brings to solving problems in software design and implementation.

First I learned C++, the great new language that Bjarne Stroustrup gave us. I thought that figuring out pointers when I moved from FORTRAN to C was hard; wrapping my brain around objects was much more difficult.

Then Java came along. I took a one-week class, but I didn't really get it.

Then I moved along to a company that wrote client-server accounting software using Visual C++. One day the CTO asked if I was willing to tackle an assignment that required Java. "Oh sure, I know that language," I said. I really had no business taking on that problem, but I muddled my way through it well enough to deliver something.

That company was struggling with the transition from client-server to a distributed, three-tier architecture. They had a long history with the Microsoft platform, but they liked Java's "write once, run anywhere" promise. Their clients were banks and businesses, not all of which ran on Windows. They also wanted to get away from the tight coupling between their user interface and the database tier. They had all their business logic tied up in stored procedures. This meant that they had to support Oracle, DB2, Microsoft SQL Server, Informix, Sybase - any flavor of stored procedure language that a client wished to run. They had a "can do" cowboy attitude that said hacking stored procedure code on site for a new customer was just good business, even if it meant that every installation was a custom job. Why let an out-of-sync source code repository stop you from saying "Yes, sir!" to the customer?

The CTO brought in a bunch of folks to try and help them move to a more object-oriented approach. He bought several licenses to the most well-known UML tool of the day. He hired a consulting firm from the Washington DC area to come up and give us a week's intensive training in the use of this UML tool. When the pressures of keeping the production version rolling out the door subsided, he took us all to a hotel conference room, away from the office, and had us spend two weeks locked away with our UML tool, flip charts, and markers. When we were done, we'd have an awe-inspiring object-oriented design for the 21st century accounting system.

As you can guess, the two weeks were a disaster. No object-oriented design came out of those sessions. The company didn't get their distributed accounting system.

What went wrong?

We lacked a strong leader with experience at object-oriented design. We were still learning the tools. Domain knowledge in accounting and experience with the product varied among the participants.

Each session would start with something like "Let's do one use case." We'd draw stuff on flip charts and quickly veer off the road. Every discussion would descend into a dizzying argument that was a roller coaster ride from the heights of abstraction to the stomach-churning drop into implementation details. I was trying to persuade them to list the steps for accounts payable when one old hand smirked and said "I can tell you what accounts payable is! Pay me!", holding out his hand with palm facing up.

The developers would scowl and listen quietly until one of them would stomp out of the room, tossing something like "If you don't make up your mind soon, I'm just going to start coding what I want" over their shoulder as they headed towards the soda machine.

We couldn't agree on what words meant. We'd have bike shed arguments for hours about what "customer" meant. We couldn't agree on how to drive from a meaningful description of the problem at hand to artifacts that a developer could use to produce working code. It's as if we'd get bored or frustrated doing that very hard work and give up before the payoff.

I left the company soon after those sessions ended. There was a layoff within six months. The CTO was forced out in a power struggle with the other two founding partners.

Fast forward eleven years. I'm working for another large company that is struggling with a transition from an older platform to a more modern one. UML has been championed as the cure for what ails us. Licenses to another UML tool have been procured. Training will commence. A large cross-disciplinary team has been convened to go through the UML design process. Consultants have been hired to shepherd us along the path of righteousness.

The funny thing is that it feels just like those sessions I sat through eleven years ago. Every discussion descends into a dizzying argument that's a roller coaster ride from the heights of abstraction to the stomach-churning drop into implementation details. We can't agree on what words mean. We have bike shed arguments for hours about design minutia. We can't agree on how to drive from a meaningful description of the problem at hand to artifacts that a developer can use to produce working code.

We'll see if we get bored or frustrated doing that very hard work and give up before any payoff comes through.

This might be the growing pains of a new team. But what if it's something wrong at the heart of UML? This object-oriented notation for rendering design decisions, codified and maintained by the Object Management Group, was born out of years of notation wars among the Three Amigos - Booch, Jacobson, and Rumbaugh. They created a notation (UML), a company (Rational), a software tool (Rational Rose), and a software development methodology (the Rational Unified Process) before selling out to IBM.

Agile approaches have largely discredited heavyweight practices like full-blown, up-front UML design.

Maybe somebody has found that this is a good way to develop large software systems in a team setting, but I haven't seen it yet. Things don't seem to have improved a bit in the past eleven years.




Saturday, October 3, 2009

Frightening





Articles like "Risk Free Is Not Without Risk" at Rude Awakening scare the living hell out of me. I've included their image of the US government's outstanding liabilities, calculated using cash accounting and GAAP, to lead off this entry.

One paragraph from that column says it all:


The fiscal condition of the United States has deteriorated dramatically during the last several years. On the basis of current obligations, U.S. indebtedness totals “only” about $12 trillion. But when utilizing traditional GAAP accounting – the kind of accounting that every public company in the United States MUST use – U.S. indebtedness soars to $74 trillion. This astounding sum is more than six times U.S. GDP. (GAAP accounting includes things like the present value of the Social Security liability and the Medicare liability – i.e. real liabilities.)


The $12T figure is scary enough. It's the basis for all the rationalizations that our politicians are using for their monetary policies: "We can afford this deficit, because it's still a small percentage of our GDP. America is still the mightiest economy in the world." Our overall debt is roughly one year's GDP now.

But with a true debt that's six times our GDP, it feels like we're in an airplane that's nosediving, firmly in gravity's grip. Pulling out will take all of our strength.

Does it bother anyone else that our government can hector industries about their poor practices - deservedly so - yet continue to use cash-based accounting itself? Isn't that the style one would use to run a small, cash-only business like a hot dog stand? I suppose that would be fine, as long as we were talking about the world's mightiest hot dog stand.

There's a fight going on between "fresh water" and "salt water" economists about the wisdom of continuing to run a large and growing deficit to stimulate the American economy. The Keynesian school says that massive deficits are the only way to restore our prosperity.

This argument is based on the kind of wishful thinking that brought about the recent collapse of the real estate market. It assumes that deficits are a temporary condition and that the economy will always grow. What politician has ever shut down a program once it was put in place? What if the economy collapses under the weight of all that debt?

The only time I can recall our government running a surplus was at the end of the Clinton years. Even that was suspect, because it was fueled by an unanticipated windfall from capital gains taxes born of "irrational exuberance". Remember Al Gore and his Social Security "lockbox"? How did that work out? How much of our debt was retired after that adventure in creative finance?

I fear that our situation has become so unmanageable that the economy will never be able to grow its way out of debt. Neither our people nor our government show any inclination to cut back on consumption and retire our debts. We continue to hear about new rights (e.g., good jobs, universal health care, retirement with dignity, etc.), but no one wants to pay for them or face up to the obligations we've already incurred.

The only alternative we have is inflation. If we continue to debase our currency, perhaps all those dollars that the Chinese and Japanese are sitting on will become worthless. That'll teach them a lesson! Too bad that the dollars in my 401(k) will be suffering the same fate. The joke will be on us.

I'm glad that the Republicans are out of office after an eight-year nightmare. I like Mr. Obama and wish him well.

But the Democrats have done nothing with their filibuster-proof majority to date. And neither political party is telling us the truth or doing anything significant about the real problems that we all face. Our news media would rather tell us about Jon and Kate and their unholy brood rather than let us know how dire our finances have become. We're encouraged to continue to spend, as if it was our patriotic duty.

I fret about how right-leaning religious nut cases who believe that our wealth and happiness are mandated by divine right will react if the tide turns.

We're all going to have to learn to live within our means; our means will be declining. If we can't do it ourselves, perhaps the Chinese and Japanese will force the lesson on us. If they decide to stop buying our bonds we're going to see interest rates rise whether we like it or not.

We won't be a great economy if we don't make things that the rest of the world wants to buy. Contrary to what a lot of people think, God didn't make the United States the greatest economic power of the 20th century.

I think it's misguided to think that a service-based economy can allow us to maintain our pre-eminent position in the world. If we continue to see industries collapse, without something new coming along to take their place, things could get very ugly here in the coming years.

Sunday, September 27, 2009

Synthetic Biology





I read a terrific article by Michael Specter, published in The New Yorker, entitled "A Life Of Its Own." It asks the question "Where will synthetic biology lead us?"

I'm fascinated by the question. It marries science and ethics in equal measure. I can sympathize with the enthusiastic scientists who envision great benefits - everything from improved health to a way out of our deadly embrace of fossil fuels.

I can't claim to be that kind of scientist. Engineers concern themselves with applying the knowledge that the practitioners of fundamental sciences - physicists, chemists, and mathematicians - unearth for us. We fashion these intellectual raw materials into useful things, and even contribute back what we learn about the fundamentals along the way, but I've been reminded many times that a mechanical engineer is not a physicist. There was a time when I immersed myself in reading biographies of the great physicists of the 20th century. Feynman became a hero of mine after reading his autobiographical short stories in "Surely You're Joking, Mr. Feynman!" and James Gleick's wonderful biography "Genius". I devoured his famous red books, fancying myself a budding physicist.

Then I got my hands on Veltman's "Diagrammatica", and the dream died. It was beyond me. I had neither the physical intuition nor the mathematical chops to see my way through it.

I'm in a worse position with biology. The last biology course I took was in high school. They taught us the rudiments of DNA, RNA, and the Krebs cycle, but that was well before the polymerase chain reaction came along. Chemistry is not my strong suit either, so the changes that are coming will leave me behind.

Neither of my parents went to college. I was alone when I went off to study mechanical engineering, because neither of them had experienced what I went through.

My youngest daughter is studying biology as an undergraduate now. In spite of all my education, I find myself in a position relative to my daughter similar to what my father had with me. I can relate my experiences as an undergraduate to hers, and tell her what graduate school was like for me. I know enough about fundamentals like thermodynamics, physics, etc. to keep the ball rolling when we talk. But she's already well beyond my capabilities in her chosen field. She's blazing that path alone. She's Lewis and Clark sending letters back to me, Thomas Jefferson, describing the wonders she's experiencing.

I found the New Yorker article particularly interesting, because a number of the phrases evoked things I'd read when the software industry was abandoning older procedural languages like FORTRAN and COBOL and embracing the newer idea of object oriented programming. The problem was complexity: it's impossible to manage all the details that go into developing software when the number of lines of code explodes into the hundreds of thousands or millions. Problem solving in general, and computer science in particular, depends on being able to decompose large, intractable problems into smaller, more manageable pieces.

Object oriented programming helps us to manage complexity by mapping software components onto real-world objects and encapsulating the details inside. If done correctly, users of a component need only concern themselves with what they need to provide and what they get back; all the messiness of how it's done is hidden inside.
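To make that concrete, here's a tiny, entirely hypothetical Java illustration (the names are mine, not from the article): callers depend only on the interface - the "pins" of the component - while the messy details stay hidden inside the implementation.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

// Hypothetical example of encapsulation: callers see only this
// interface - a well-defined input (a word) and output (a verdict).
interface Spellchecker {
    boolean isCorrect(String word);
}

// The messy details - here just a hard-coded word list, but it could
// be a trie, a database, or a web service - are hidden inside, and
// can change without affecting any caller.
class SimpleSpellchecker implements Spellchecker {
    private final Set<String> dictionary = new HashSet<String>(
            Arrays.asList("cat", "dog", "genome"));

    public boolean isCorrect(String word) {
        return dictionary.contains(word.toLowerCase());
    }
}
```

Swapping SimpleSpellchecker for a smarter implementation requires no change to any code written against the Spellchecker interface; that substitutability is the whole point.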

Brad Cox and others used to talk about "software integrated circuits": each component would have its own well-defined inputs and outputs, much like the pins on a hardware integrated circuit. There would be a marketplace of these software ICs, where you could search for a component that met your needs, plug it in, and off you'd go.

This phrase on page 5 of the article brought that vision back for me: "The BioBricks registry is a physical repository, but it is also an online catalogue. If you want to construct an organism, or engineer it in new ways, you can go to the site as you would one that sells lumber or industrial pipes."

It made me stop and think, because to a great degree the promise of software ICs has not been realized. Writing complex software systems is still a difficult, large scale problem. Object models claiming to model the industry I work in today have not lived up to their promise. The ideal presented by the hardware side of the problem has not translated over to software.

There's still something fundamentally different about software. It's not all science. The irony is that software was distinguished from hardware at the dawning of the computer age because it was believed to be more malleable stuff than the circuits it ran on. You could change it relatively quickly, far more easily than the machine that executed it.

But that's often precisely the problem. It's very easy to change, but the coupling and complexity make it difficult to predict what the effect of the change will be. Brittle software suffers from this problem. The effect of changes in one part of the code often ripple out, resulting in surprising, disappointing, sometimes catastrophic behavior.

Reading about the enthusiasm of biologists made me wonder about the brittleness, coupling, and unintended consequences that face them. Will they have better success than software engineers have to date? And if they do, what lessons can we learn to improve the lot of software development?



Sunday, September 20, 2009

Back From The Dead






"There are two kinds of hard drives: those that have failed, and those that will."

A week ago I fired up my home desktop machine, waited, and got nothing but a blank, black screen staring back at me. I didn't have another monitor on hand to test it, and no way to diagnose the problem. I assumed the worst: it was a failed hard drive. I unplugged all the peripherals, loaded the box into my car, and took it to a local computer shop to be triaged.

I had to wait a whole week just to hear what the problem was, because I had several machines in the queue ahead of me. I was able to sneak some time on my oldest daughter's Macbook during that dry season, but it was painful. All my development tools (e.g., IntelliJ, databases, Grails, etc.) were on my desktop.

I got my machine back yesterday. I had started to think that it wasn't a hard drive problem, and fortunately I was correct. Diagnostics showed that the memory and hard drive were both fine; my five-year-old monitor had given up the ghost. So I picked up my machine, turned in the dead monitor, and bought a 22" one for just $149. Not too bad.

I'm not that crazy about the place that repaired the machine. When I called to ask the prognosis, the kid told me that everything was fine, but my anti-virus software had found "a bunch of viruses". He said he could run the cleanup for a mere $118. I politely declined, but I almost blew my stack: "Are you kidding me? What are you going to do, stand there and watch Kaspersky do the clean-up? How can you quote me that figure with a straight face?" If you look at what Kaspersky is identifying, you see that it points out things like denial of service possibilities for the Java JDK that I'm using. That's not a "virus", and I wouldn't want the software to be removed.

The machine had dust in it when I picked it up. When I complained that the kid should have at least blown it out before servicing it, I was told that cleaning is another for-fee service. What a business.

So I brought the whole thing home, vacuumed it out, and set up the new monitor. I spent some time trying to finally sort out my backup and recovery problem. I've had scheduled Microsoft backups for a long time, but they aren't complete disk images. I bought a 250GB Passport external hard drive to replace the undersized 140GB one I had. Now I could copy my entire hard drive if I wanted. I found an inexpensive disk imaging suite called Acronis and set up a nightly backup. I had my entire hard drive on my Passport this morning, compressed into 15 neat 4GB files. I think I'll buy a license.

Just one last problem: How do you boot a PC without the hard drive when you don't have one of those ancient floppy drives? The answer is a USB key, of course. I started creating a bootable USB key, following the instructions from Greg Schultz at this link. I was almost done when I ran into another roadblock: I needed the Windows XP Professional installation CDs, but HP didn't give me any when I bought this machine five years ago. Where was I going to get a copy of Windows XP Professional, now that they don't sell or support it and we've moved through Vista to Windows 7? eBay, of course. I put in a bid last night and won at $70. It should arrive next week, and I'll be able to boot from a USB key next time I have a problem. I'll be able to diagnose any problems that I run into, and I'll be able to restore any failed hard drives from my backup. Brilliant!

I'm happy to spend some time thinking about this issue. I realize now how central the stuff I've got on my machine is to my life. I use a password generator to create passwords now. I can't possibly remember them, so I keep them in an encrypted vault called Keeper from Callpod. If I can't access it, I can't do on-line banking or pay my daughter's fee bill at university.

I have a drawer and albums filled with photos from our film camera days. If my hard drive crashes, I'll lose all those digital snaps I've got. Until I get into the habit of keeping them in the cloud, there's a significant memory loss if that disk drive head touches the spinning disk.

There's just one catch: How do you test this arrangement? I'd like to know that I can recover without any issue, but I don't know how to prove it. I don't want to wait until the next failure to find out if any of this is worthwhile.
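One partial answer to the testing question is a spot check: restore (or mount) the backup image somewhere it looks like an ordinary folder, then compare checksums of a random sample of files against the live disk. Here's a minimal sketch in Python; the directory paths are placeholders, and it assumes the backup has already been extracted or mounted as a plain file tree rather than a proprietary image:

```python
import hashlib
import os
import random

def sha256_of(path, chunk=1 << 20):
    """Hash a file in 1 MB chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while True:
            block = f.read(chunk)
            if not block:
                break
            h.update(block)
    return h.hexdigest()

def spot_check(source_root, backup_root, sample_size=25):
    """Compare a random sample of files between the live disk and a
    restored backup. Returns the relative paths that don't match."""
    candidates = []
    for dirpath, _, filenames in os.walk(source_root):
        for name in filenames:
            candidates.append(os.path.join(dirpath, name))
    mismatches = []
    for src in random.sample(candidates, min(sample_size, len(candidates))):
        rel = os.path.relpath(src, source_root)
        dup = os.path.join(backup_root, rel)
        if not os.path.exists(dup) or sha256_of(src) != sha256_of(dup):
            mismatches.append(rel)
    return mismatches
```

Even a 25-file sample run once a month gives far more confidence than never testing at all; a non-empty mismatch list means it's time to distrust the backup before you need it.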

We all need to think harder about our recovery plans.

My next thought will be about networked storage. I can buy a terabyte monstrosity to house my whole family's data, but I'll want it attached to my network so everyone can see it. And I suppose I'll need to buy two, so I can back up the backup. And I should take the backup of the backup to my safety deposit box once a month so it'll be there in case my house burns down. Geez, where does it end? Someone with a more apocalyptic vision would be building a hardened data center to go along with the fully-stocked, armed to the teeth underground bunker in the back yard.

Here's another interesting question: When a new desktop would cost me a mere $500-600, why would I not just toss this relic and buy a new one? It's easy to calculate the cross-over point when this becomes a fool's errand. What did I buy yesterday? New monitor, a larger external hard drive, a USB key devoted to booting, backup and recovery software, and Windows XP Pro CDs. The total is a significant fraction of the cost of a new machine.

Am I an eejit to keep this machine going? I've written about Moore's Law in our lives. This is the third desktop machine I've bought for home use, and it's the first one that wasn't hopelessly outdated by the time it passed its fifth anniversary. It's a dual core machine with 4GB of RAM and a hard drive that's still only half full.

While I'm mulling over my economic trade-offs, I'm glad to have all my data and my familiar development environment back.

Saturday, August 29, 2009

Back To Ireland





My wife and I recently returned from a week's visit to Ireland. It was her second trip and my third. We went to visit a dear old friend that we hadn't seen since 2003.

We weren't planning any big trips this summer. Our youngest daughter is still in the midst of undergraduate studies. The US dollar is weak compared to the euro. I thought our plans would be no more elaborate than perhaps a week in Ogunquit, ME.

But if the American economy isn't doing very well, the Irish situation might be worse. Their housing bubble has popped. They still depend on tourism to bring currency into the economy. One night my wife noticed that air fares on Aer Lingus were ridiculously low. Could we consider a trip, just the two of us? Erin was taking two organic chemistry courses this summer, so she couldn't join us. Meg was in New York City, looking for a job. We could leave the dog at home with Erin and take off without children for the first time since we had them. We called our friend to check his plans and booked the flight.

We flew out of Boston instead of New York. There's a shuttle to Logan Airport on the Mass Pike that's simply brilliant. You leave your car in a fenced lot for a week and don't have to fight the traffic in and out of the city.

The flight couldn't have been smoother. We took off at 6 PM on an Airbus A330 and arrived at 5:45 AM the next morning. There was little or no turbulence at 40,000 feet. The plane had individual screens for each passenger built into the back of the seat in front, with a nice choice of movies. The food was even good! I didn't sleep at all, but then I can never sleep on airplanes. When we arrived in Dublin the car rental counter wasn't even open. We had to wait until someone came at 6 AM. We were given a Ford Fiesta with a manual transmission. It's a small car - we couldn't fit two black bags in the rear, so one had to go into the back seat.

Driving on the left side was no problem; neither was shifting with my left hand. The gas, brake, and clutch pedals are arranged exactly as they are in American cars, so I didn't have to adjust too much. But I could never live in Ireland, because driving there all the time would kill me. The roads are too narrow; there are walls on either side; there are twists, turns, hills, and bushes that prevent you from seeing more than a few feet ahead. Every time a truck came at us I was gripping the steering wheel and holding my breath.

It's about a two-hour drive south and west of Dublin to get to our friend's house. Thank God we had precise, detailed directions, because his house is in the middle of a bog between two small towns named Mullingar and Delvin. We arrived without mishap at around 7:45 AM. No one was stirring, so we sat in the car, resting and reading. We got extra points for arriving without having to resort to a rescue call.

It was wonderful to see our friend. It was a relaxed, unscheduled visit that alternated quiet days spent hanging around with excursions out. On the quiet days we'd eat breakfast, read the paper or books, go for walks, sit in front of fires, and talk.

We went into Dublin one day and had a wonderful time. We were all dressed to the nines for a night at the Gate Theatre. We had lunch and did some shopping in the afternoon. We had high tea at the Merrion Hotel, which is a five-star establishment that lived up to its reputation. The service, food and atmosphere were impeccable. We saw a revival of Noel Coward's "Present Laughter" that was terrific.

Maureen and I went alone to Kilkenny. The 2.5-hour drive was stressful, but we managed. We had lunch, shopped, and toured Butler Castle. The Kilkenny Arts Festival had ended the day before, but we were still able to tour one of the galleries.

Our last day out was to a nearby abbey whose first buildings were erected around 650 AD. Houses in America with plaques indicating that they were built in the 1700s appear old, but that's nothing compared to stone houses in Europe.

Our flight back was even smoother than the one that brought us to Ireland. It was another Airbus A330 (sorry, Boeing). Logan Airport is a first-rate operation. We arrived early, got through customs in 20 minutes, waited not more than five minutes for our bags, and got right onto the shuttle to take us back to our car.

This vacation was a lesson in not putting things off forever. We all assume that we'll go "next year" when we think about opportunities like this one, but you never know if the chance will pass you by. I was so happy to reconnect with my friend. Phone calls and e-mail are nice, but there will never be a substitute for face-to-face contact.