
Saturday, February 11, 2012

Tactical Solutions




"It's a tactical solution."

The tone was so dismissive. When did "tactical" become such a dirty word?

Software is rife with dogmatic pronouncements these days. BDUF (big design up front) is always bad. Unit tests are always good. Everyone knows that <insert-language-or-technique-of-your-choice-here> is better than anything else. Developers and coding are commodities. Process wins out over people; a good process can produce great code even when executed by average or below-average developers.

I personally hate dogma of all kinds. I recognize that accumulated wisdom needs to be respected. Physical laws are inviolate; I don't need to relearn lessons to keep fingers away from hot stoves. Gravity works. The second law of thermodynamics says that the egg I dropped on the floor might spontaneously reassemble itself before my eyes, but the probability is so small that it's never been observed. (But give it ten billion years and evolution could reconstitute it into something else, like a Boeing 737.)

Always relying on what everyone knows becomes a problem when context is ignored. Rules of thumb usually have exceptions, but the dispensers of dogma fail to bring them up or acknowledge them only grudgingly when they are pointed out.

There's also the question of who gets to decide what "everybody knows" when it's a non-scientific issue that's not entirely settled. History is full of people who have used dogma to serve their own ends. If the winners write history, it's also true that they decide questions of dogma.

I agree that slapping things together and being sloppy should not be anyone's goal in life. But prototyping something quickly, getting it out into the field, providing short-term value, and learning lessons from that experience has its place.

The objection organizations harbor against tactical solutions is that they often become the de facto standard solution, never to exit the portfolio again. This kind of organic growth can cause problems over time. But banning all tactical solutions feels like throwing the baby out with the bath water. A better answer is to plan and budget for refactoring as needed. Retire those tactical solutions when the better, strategic thing comes along.

There's a tension between technology and economics. It doesn't pay to leap onto every new thing that comes along. The technology adoption curve says that there are always leaders and laggards, with the early and late majorities in between.


Organizations that are smaller, more nimble, and less risk averse get to be leaders and early adopters. If they play the game well they can get the jump on larger, slower, more conservative competitors. The larger competitors can leverage economies of scale and the damping that copious resources provide. They just need to be mindful of those tipping points that can shift the earth under their feet.

I wish that the timelines for payback were easier to quantify. Cost-benefit analysis can be made to support whatever conclusion is desired. Could Bayes and Monte Carlo be used to improve on what's done today? I'll have to think about that.
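To make that musing concrete, here's a minimal Monte Carlo sketch of a payback-period estimate. All the numbers are hypothetical placeholders: instead of a single-point cost-benefit figure, we treat both the upfront cost and the annual benefit as uncertain distributions and look at the spread of outcomes.

```python
import random

def payback_years(cost, annual_benefit):
    """Years until cumulative benefit covers the upfront cost."""
    return cost / annual_benefit

def simulate(trials=10_000, seed=42):
    rng = random.Random(seed)
    paybacks = []
    for _ in range(trials):
        # Hypothetical distributions; in practice these would come
        # from estimates or historical data.
        cost = rng.gauss(100_000, 15_000)     # upfront cost ($)
        benefit = rng.gauss(30_000, 10_000)   # annual benefit ($/yr)
        if benefit > 0:                       # discard nonsensical draws
            paybacks.append(payback_years(cost, benefit))
    paybacks.sort()
    return paybacks

paybacks = simulate()
median = paybacks[len(paybacks) // 2]
p90 = paybacks[int(len(paybacks) * 0.9)]
print(f"median payback: {median:.1f} years, 90th percentile: {p90:.1f} years")
```

The point isn't the particular numbers; it's that the gap between the median and the 90th percentile makes the risk visible, which a single-point estimate hides.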

The truth is that everything is tactical. Technology is changing at a breathtaking rate. Every solution is tactical; they just have different timelines. (Just like there are no "permanent employees". We're all contractors.)

One final, funny thought: When I searched Google Images for 'tactical solution', I was surprised to see all the photos of weapons of all kinds. I had to scroll down several pages to find an image that expressed what I had in mind. The word is overloaded with military implications. It brings to mind SEAL squads and surgical strikes. That sounds effective when applied to certain problems. Why can't tactical software solutions be granted the same consideration when they apply?






Thursday, October 8, 2009

Why Is UML So Hard?





I changed careers back in 1995. I jumped from mechanical engineering to software development. I've worked hard to learn what object-oriented programming is all about and what advantages it brings to solving problems in software design and implementation.

First I learned C++, the great new language that Bjarne Stroustrup gave us. I thought that figuring out pointers when I moved from FORTRAN to C was hard; wrapping my brain around objects was much more difficult.

Then Java came along. I took a one-week class, but I didn't really get it.

Then I moved along to a company that wrote client-server accounting software using Visual C++. One day the CTO asked if I was willing to tackle an assignment that required Java. "Oh sure, I know that language," I said. I really had no business taking on that problem, but I muddled my way through it well enough to deliver something.

That company was struggling with the transition from client-server to a distributed, three-tier architecture. They had a long history with the Microsoft platform, but they liked Java's "write once, run anywhere" promise. Their clients were banks and businesses, not all of which ran on Windows. They also wanted to get away from the tight coupling between their user interface and the database tier. They had all their business logic tied up in stored procedures. This meant that they had to support Oracle, DB2, Microsoft SQL Server, Informix, Sybase - any flavor of stored procedure language that a client wished to run. They had a "can do" cowboy attitude that said hacking stored procedure code on site for a new customer was just good business, even if it meant that every installation was a custom job. Why let an out-of-sync source code repository stop you from saying "Yes, sir!" to the customer?

The CTO brought in a bunch of folks to try and help them move to a more object-oriented approach. He bought several licenses to the most well-known UML tool of the day. He hired a consulting firm from the Washington DC area to come up and give us a week's intensive training in the use of this UML tool. When the pressures of keeping the production version rolling out the door subsided, he took us all to a hotel conference room, away from the office, and had us spend two weeks locked away with our UML tool, flip charts, and markers. When we were done, we'd have an awe-inspiring object-oriented design for the 21st century accounting system.

As you can guess, the two weeks were a disaster. No object-oriented design came out of those sessions. The company didn't get their distributed accounting system.

What went wrong?

We lacked a strong leader with experience at object-oriented design. We were still learning the tools. Domain knowledge in accounting and experience with the product varied among the participants.

Each session would start with something like "Let's do one use case." We'd draw stuff on flip charts and quickly veer off the road. Every discussion would descend into a dizzying argument that was a roller coaster ride from the heights of abstraction to the stomach-churning drop into implementation details. I was trying to persuade them to list the steps for accounts payable when one old hand smirked and said "I can tell you what accounts payable is! Pay me!", holding out his hand with palm facing up.

The developers would scowl and listen quietly until one of them would stomp out of the room, tossing something like "If you don't make up your mind soon, I'm just going to start coding what I want" over their shoulder as they headed towards the soda machine.

We couldn't agree on what words meant. We'd have bike shed arguments for hours about what "customer" meant. We couldn't agree on how to drive from a meaningful description of the problem at hand to artifacts that a developer could use to produce working code. It's as if we'd get bored or frustrated doing that very hard work and give up before the payoff.

I left the company soon after those sessions ended. There was a layoff within six months. The CTO was forced out in a power struggle with the other two founding partners.

Fast forward eleven years. I'm working for another large company that is struggling with a transition from an older platform to a more modern one. UML has been championed as the cure for what ails us. Licenses to another UML tool have been procured. Training will commence. A large cross-disciplinary team has been convened to go through the UML design process. Consultants have been hired to shepherd us along the path of righteousness.

The funny thing is that it feels just like those sessions I sat through eleven years ago. Every discussion descends into a dizzying argument that's a roller coaster ride from the heights of abstraction to the stomach-churning drop into implementation details. We can't agree on what words mean. We have bike shed arguments for hours about design minutia. We can't agree on how to drive from a meaningful description of the problem at hand to artifacts that a developer can use to produce working code.

We'll see if we get bored or frustrated doing that very hard work and give up before any payoff comes through.

This might be the growing pains of a new team. But what if it's something wrong at the heart of UML? This object-oriented notation for rendering design decisions, codified and maintained by the Object Management Group, was born out of years of notation wars among the Three Amigos - Booch, Jacobson, and Rumbaugh. They created a notation (UML), a company (Rational), a software tool (Rational Rose), and a software development methodology (Rational Unified Process) before selling out to IBM.

Agile approaches have largely discredited heavyweight practices like full-blown, up-front UML design.

Maybe somebody has found that this is a good way to develop large software systems in a team setting, but I haven't seen it yet. Things don't seem to have improved a bit in the past eleven years.




Sunday, September 27, 2009

Synthetic Biology





I read a terrific article written by Michael Specter, published in The New Yorker, entitled "A Life Of Its Own." It asks the question "Where will synthetic biology lead us?"

I'm fascinated by the question. It marries science and ethics in equal measure. I can sympathize with the enthusiastic scientists who envision great benefits - everything from improved health to a way out of our deadly embrace of fossil fuels.

I can't claim to be that kind of scientist. Engineers concern themselves with applying the knowledge that the practitioners of fundamental sciences - physicists, chemists, and mathematicians - unearth for us. We fashion these intellectual raw materials into useful things, and even contribute back what we learn about the fundamentals along the way, but I've been reminded many times that a mechanical engineer is not a physicist. There was a time when I immersed myself in reading biographies of the great physicists of the 20th century. Feynman became a hero of mine after reading his autobiographical short stories in "Surely You're Joking, Mr. Feynman!" and James Gleick's wonderful biography "Genius". I devoured his famous red books, fancying myself a budding physicist.

Then I got my hands on Veltman's "Diagrammatica", and the dream died. It was beyond me. I had neither the physical intuition nor the mathematical chops to see my way through it.

I'm in a worse position with biology. The last biology course that I took came in high school. They taught us the rudiments of DNA, RNA, and the Krebs cycle, but it was well before the polymerase chain reaction came along. Chemistry is not my strong suit either, so the changes that are coming will leave me behind.

Neither of my parents went to college. I was alone when I went off to study mechanical engineering, because neither of them had experienced what I went through.

My youngest daughter is studying biology as an undergraduate now. In spite of all my education, I find myself in a position relative to my daughter similar to what my father had with me. I can relate my experiences as an undergraduate to hers, and tell her what graduate school was like for me. I know enough about fundamentals like thermodynamics, physics, etc. to keep the ball rolling when we talk. But she's already well beyond my capabilities in her chosen field. She's blazing that path alone. She's Lewis and Clark sending letters back to me, Thomas Jefferson, describing the wonders she's experiencing.

I found the New Yorker article particularly interesting, because a number of the phrases evoked things I'd read when the software industry was abandoning older procedural languages like FORTRAN and COBOL and embracing the newer idea of object-oriented programming. The problem was complexity: it's impossible to manage all the details that go into developing software when the number of lines of code explodes into the hundreds of thousands or millions. Problem solving in general, and computer science in particular, depends on being able to decompose large, intractable problems into smaller, more manageable pieces.

Object oriented programming helps us to manage complexity by mapping software components onto real-world objects and encapsulating the details inside. If done correctly, users of a component need only concern themselves with what they need to provide and what they get back; all the messiness of how it's done is hidden inside.
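A toy sketch of that idea, using a hypothetical BankAccount class: callers see only deposit, withdraw, and balance, while the internal ledger stays hidden behind the interface.

```python
class BankAccount:
    """Users see deposit/withdraw/balance; the ledger inside is hidden."""

    def __init__(self):
        self._ledger = []  # internal detail, not part of the interface

    def deposit(self, amount):
        if amount <= 0:
            raise ValueError("deposit must be positive")
        self._ledger.append(amount)

    def withdraw(self, amount):
        if amount > self.balance():
            raise ValueError("insufficient funds")
        self._ledger.append(-amount)

    def balance(self):
        return sum(self._ledger)

acct = BankAccount()
acct.deposit(100)
acct.withdraw(30)
print(acct.balance())  # → 70
```

If the ledger were later replaced with a running total or a database row, callers wouldn't change at all; that's the messiness being hidden inside.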

Brad Cox and others used to talk about "software integrated circuits": each component would have its own well-defined inputs and outputs, much like the pins on a hardware integrated circuit. There would be a marketplace of these software ICs, where you could search for a component that met your needs, plug it in, and off you'd go.

This phrase on page 5 of the article brought that vision back for me: "The BioBricks registry is a physical repository, but it is also an online catalogue. If you want to construct an organism, or engineer it in new ways, you can go to the site as you would one that sells lumber or industrial pipes."

It made me stop and think, because to a great degree the promise of software ICs has not been realized. Writing complex software systems is still a difficult, large-scale problem. Object models claiming to model the industry I work in today have not lived up to their promise. The ideal presented by the hardware side of the problem has not translated over to software.

There's still something fundamentally different about software. It's not all science. The irony is that software was distinguished from hardware at the dawning of the computer age because it was believed to be more malleable stuff than the circuits it ran on. You could change it relatively quickly, far more easily than the machine that executed it.

But that's often precisely the problem. It's very easy to change, but the coupling and complexity make it difficult to predict what the effect of the change will be. Brittle software suffers from this problem. The effects of changes in one part of the code often ripple out, resulting in surprising, disappointing, sometimes catastrophic behavior.

Reading about the enthusiasm of biologists made me wonder about the brittleness, coupling, and unintended consequences that face them. Will they have better success than software engineers have to date? And if they do, what lessons can we learn to improve the lot of software development?