Automation - a means, not an end

I'm interested in the productivity and quality of development teams, but a team's output can't be analysed in isolation - its work must provide value to the business it serves. The communication mismatch between tech and non-tech folk is something I've written about previously.

As "automation" has been a buzzword for quite some time in the enterprise software world - and in most cases for very good reason - I thought it would make a good practical example of situations where tech teams lose sight of the most meaningful objectives. 

[Image: a Rube Goldberg contraption]

What can be automated?

In a typical software development/delivery workflow, there are a number of popular candidates for automation, such as:

  • Local development environment setup
  • Test automation
  • Continuous integration automation
  • Deployment automation
  • Infrastructure setup automation (e.g. IaC)

Automation is not the objective

As there are "industry standard" approaches to all of the above, it's almost certain that at least some level of automation for any of these areas will bring quality and productivity benefits to your development team. However, automation is not the "real" objective. Why automate?

  • Less error-prone and more repeatable processes
  • Computers are faster and more accurate at doing well-defined, repetitive tasks than humans
  • Processes are better documented "as code", which doesn't lie or go out of date (see the sketch after this list)
  • Done well, this should mean the business can devote more of its "human" resources to the "real" work of creating valuable software
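
To illustrate the "documented as code" point, here's a minimal sketch of a local environment check that doubles as documentation of what a developer needs installed. The tool list is purely illustrative - swap in whatever your own stack actually requires.

    #!/usr/bin/env python3
    """Minimal sketch: an environment check that doubles as documentation.

    The tools listed below are illustrative placeholders, not a recommendation
    for any particular stack.
    """
    import shutil
    import subprocess
    import sys

    # The dict itself documents the setup: anyone reading it can see exactly
    # which tools the team expects to have installed locally.
    REQUIRED_TOOLS = {
        "git": ["git", "--version"],
        "docker": ["docker", "--version"],
        "node": ["node", "--version"],
    }

    def check_tools() -> bool:
        all_present = True
        for name, version_cmd in REQUIRED_TOOLS.items():
            if shutil.which(name) is None:
                print(f"MISSING: {name}")
                all_present = False
                continue
            version = subprocess.run(
                version_cmd, capture_output=True, text=True, check=False
            ).stdout.strip()
            print(f"OK: {name} ({version})")
        return all_present

    if __name__ == "__main__":
        sys.exit(0 if check_tools() else 1)

Because the script *is* the documentation, a new joiner who runs it gets an up-to-date answer rather than a stale wiki page.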

These are the measurable objectives which determine whether any automation is a success. "100% automated" is often not required, or meaningless, or a downright lie, depending on your interpretation. Similarly, 0% automation isn't something you'll generally find either... you could argue that just the act of using a computer means that there's at least *some* level of automation!

If you measure the wrong outcomes, or don't measure any at all, automation work may not deliver enough value to justify the cost of implementing it. For example:

  • If a portion of your deployment workflow is incredibly time-consuming and complex to automate, but automating it would only save a tiny amount of time, is it even worth doing? Is there a risk of the automation being flaky or bug-ridden itself?
  • If automation takes longer than the manual processes it replaced, you're probably doing something wrong
  • If manual processes are not yet stable, automation may be premature - you'll be reworking the automation frequently whilst the processes establish themselves
  • If your automated tests are flaky (perhaps because they depend on things which change outside of your control) and don't increase the quality of delivered work, do they really have much value?
  • If the automation work costs a vast amount of time or money, the business case may be questionable (a rough break-even sketch follows this list)
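
To make the cost/benefit point concrete, here's a back-of-the-envelope break-even calculation. Every figure is a made-up assumption for illustration; plug in your own estimates.

    # Rough break-even sketch for an automation proposal.
    # Every figure below is a made-up assumption for illustration.

    hours_to_build_automation = 40      # estimated effort to build and stabilise it
    minutes_saved_per_run = 5           # manual time eliminated on each run
    runs_per_week = 10                  # how often the process actually happens
    maintenance_hours_per_month = 2     # flaky scripts are not free to keep alive

    weekly_saving_hours = (minutes_saved_per_run * runs_per_week) / 60
    weekly_maintenance_hours = maintenance_hours_per_month / 4

    net_weekly_saving = weekly_saving_hours - weekly_maintenance_hours

    if net_weekly_saving <= 0:
        print("This automation never pays for itself.")
    else:
        weeks_to_break_even = hours_to_build_automation / net_weekly_saving
        print(f"Break-even after roughly {weeks_to_break_even:.0f} weeks.")

With those particular numbers the break-even comes out at roughly 120 weeks - exactly the kind of result that should prompt a conversation about whether the automation is worth building at all.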

Again, this is not to say that "automation is bad"! Using automation as an example, the point I'm trying to make is not to lose sight of the "real" objectives (something I've definitely been guilty of as a developer). The same goes for "productivity" tools or the latest cool COTS product: beware of focusing on the tool itself rather than on how it can be usefully deployed - another rabbit hole I've watched developers disappear down.

What's the real end?

Of course, outcomes such as saving tech team time, improving software quality and increasing delivery speed are really just the means to another end themselves. The real objectives are to deliver value to customers and to make the business profitable. Development teams are less likely to be able to influence these more "ultimate" objectives directly, so the tech team's focus should be on the most meaningful objective (i.e. the one closest to the business) over which they do have influence, in order to keep tabs on the quality and productivity of their output.


By James at 26 Jun 2018, 10:00 AM