14 April 2017

The Master Equation

Elsewhere I've talked about metrics and delivery and mindset. There is a lot of talk about metrics being evil and wrong and misguided, but the reality is that we need to consider how to make software delivery as effective as possible. That is, how do we get the highest-quality, targeted, well-factored systems for the lowest cost, as fast as possible? Call it balancing the Iron Triangle if you will.

I don't know where this originally came from, but years ago in a water park/hotel conference room in Ohio I was presented with this equation:


Throughput = Work - (Rework + Waste)

I call this The Master Equation. From this we can derive everything else when we talk about effective software development. When we look at what we do for a living and how we are compensated, we have to consider that the only really valuable thing we can do is produce value, and more or less, the faster and more efficiently we create that value, the better we are compensated. So throughput is really a thing. 
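To make the arithmetic concrete, here is a minimal Python sketch of the equation applied to tracked effort. The category labels and hours are hypothetical illustration data, not a prescribed tracking scheme.

```python
# Hedged sketch of The Master Equation over tracked effort.
# Entries are (description, category, hours); all values are invented.
effort = [
    ("implement checkout flow", "work", 16.0),
    ("fix regression in checkout", "rework", 4.0),
    ("rebuild feature cut in re-planning", "rework", 6.0),
    ("status meetings with no decisions", "waste", 3.0),
]

def throughput(entries):
    # Throughput = Work - (Rework + Waste)
    total = lambda kind: sum(h for _, k, h in entries if k == kind)
    return total("work") - (total("rework") + total("waste"))

print(throughput(effort))  # 16 - (10 + 3) = 3.0
```

Even a rough bucketing like this makes the equation actionable: the two subtracted terms point straight at where hours are leaking.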

Let's set aside understanding what is valuable and what is not for a moment and focus on optimization. If we use The Master Equation, we can observe the product and processes we use to deliver software and identify what is Rework and what is Waste. With some care and precision we can even identify their source. Given that all of these things are possible, we can optimize a system for throughput.

How is this possible, you ask? Well, let's consider a few things.

One wasteful thing that we see on a regular basis is bad design. That is, something technically correct that is embarrassingly slow, unscalable, or unsuitable for the solution space. We can mitigate those issues in a number of ways.

One, we can do our homework. A little bit of research into problem spaces and published solutions goes a long way to ensure that we don't run into issues with a proposed design. We can learn from the mistakes of others.

Spiking. We can do a series of small experiments to ensure that something is feasible. When I do this, I create a decision tree of spikes. I then start at the top and work my way through the tree. If at any point the proposal becomes unsuitable, I discard it and start again. I generally do some research first and make sure I'm not traveling the well-worn road to failure.
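The walk through the spike tree can be sketched in a few lines of Python. For simplicity the tree is linearized into an ordered list of experiments, and the spike names and outcomes below are purely hypothetical.

```python
# Hedged sketch of working through a list of spikes in order,
# abandoning the proposal at the first experiment that fails.
def run_spikes(spikes):
    for name, experiment in spikes:
        if not experiment():
            return f"discard at '{name}', start a new proposal"
    return "proposal survives all spikes"

# Hypothetical spikes; each lambda stands in for a small real experiment.
spikes = [
    ("library handles our data volume", lambda: True),
    ("latency stays under budget", lambda: False),
    ("deploys to our platform", lambda: True),
]
print(run_spikes(spikes))  # discard at 'latency stays under budget', ...
```

The point of the early return is the budget: once one spike falls over, no further effort goes into that branch of the tree.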

Small steps make big strides in determining viability without blowing out your budget. If you can see a way to slice a solution into long end-to-end strips and deliver just those strips you can get a good sense of the effectiveness of the solution without building the whole thing. 

Parallel development is another option. If we have two unproven approaches and no other way to verify which is more correct, build both. Set up two teams, give each one the guidance they need on their solution, and then compare the results. You need to establish good measures before you start, and a GE (good enough) threshold too. If you understand what is good enough before you start and one team reaches that goal, you can cut off the other approach and move on.
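The GE cutoff above is easy to sketch; here is a hedged Python version where the metric, threshold, and scores are all hypothetical (in practice "good enough" would be whatever measures the teams agreed on up front).

```python
# Hedged sketch: decide between parallel prototypes using a
# "good enough" (GE) threshold agreed before work starts.
GOOD_ENOUGH = 0.90  # hypothetical benchmark score, higher is better

def first_to_clear(results, threshold=GOOD_ENOUGH):
    # Return the first approach that clears the bar, else None
    # (meaning: keep both tracks running and measure again later).
    for name, score in results.items():
        if score >= threshold:
            return name
    return None

weekly_scores = {"approach_a": 0.85, "approach_b": 0.93}
print(first_to_clear(weekly_scores))  # approach_b clears the GE bar
```

The design choice worth noting is that the threshold is fixed before any scores exist, so the cutoff decision can't be argued into mush after the fact.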

Another thing we see frequently is defects and cruft clogging up the development pipeline. You can solve these problems pretty effectively with good tooling and good communication. For one, use static analysis tools. If you can find them, use tools that automatically correct the little things like formatting, spelling, punctuation, etc. Then set a zero-tolerance policy for violations and keep the team on it. I like to cook the static analysis into the automated build process and reject PRs that can't pass these checks.
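As a sketch of cooking static analysis into the build, here is a hypothetical shell gate. The tools shown (`ruff` for linting, `black --check` for formatting) are just stand-ins for whatever analyzers fit your stack; the zero-tolerance part is the nonzero exit code, which a CI system uses to reject the PR.

```shell
#!/bin/sh
# Hedged sketch of a zero-tolerance static-analysis gate.
# Tool names below are examples; swap in your stack's linters.
fail=0

run_check() {
    echo "running: $*"
    if ! "$@"; then
        echo "VIOLATION: $*"
        fail=1
    fi
}

run_check ruff check .        # lint rules
run_check black --check .     # formatting check only, no rewrite

if [ "$fail" -ne 0 ]; then
    echo "static analysis failed: rejecting this build"
    exit 1
fi
echo "static analysis clean"
```

Running every check before exiting (rather than stopping at the first violation) gives the author the full list of problems in one pass.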

Additionally you can apply code reviews and campfires to improve quality. Code reviews are a good way to spot bad design. If commits are small (as they should be) you can usually crank out a review in fifteen minutes, and if you get good at it you can usually see the design taking shape and stop a bad one before it gets out of control.

Campfires are a great way to communicate with the team before trouble starts. Typically the tech-lead or product-owner leads a discussion on a topic with the rest of the team. Ideally you have a whiteboard present so you can draw pictures. Take 30 minutes and discuss what needs to be done. Talk through the options for how it can be done, and consider all the disagreements that might arise. This is both a great way to stay on track around design/development and for more junior players to learn from others in a group. 

Defects, the bane of everyone's existence. I could probably write a book on this topic. I'll start with: don't have any defects. That's a lot to ask, but it's a good objective to have. I'll also say that the best way to avoid defects is lots of communication. Start with asking plenty of questions and validating the answers, then follow up with clear explanation and demonstration. You really need to be able to show that the code you have built does what you understood it was supposed to do. If nothing else, at this point you can be told it is wrong and have a chance to fix it before it makes it into the wild.

Sometimes you don't know that you have created a defect. Often the Product Owner doesn't know it either. You just have to roll with those. You also have to accept that sometimes the business doesn't know the right answer either; they just know when they don't get what they want or expect. This is just part of the human experience and we have to tolerate it.

Lastly there are defects caused by inexperience. These are the gaps in the code that sneak up on the unwary and destroy hope. No, just kidding. When we are learning and doing something new, we don't know what we don't know, so we can create defects because we didn't realize something could happen. QA guys make a living off of thinking of those things, and they can be very creative. Code reviews and campfires can help a lot here. People with different experiences will think of things that maybe the others won't, and throwing those things into the mix can help to mitigate unintended defects like these. That said, people can't think of everything. So when these occur, write it down, try to remember it, and learn from the mistake. 

If we can strive toward having little or no rework and waste in our projects, we can deliver more quality software. I hope The Master Equation can provide a framework for thinking about software development overall and how we can make it better.