Once an organization has decided that DevOps is its new destination, individual developers have to find their place in the flow. Understanding how each part of the overall effort relates to the others will ultimately help crystallize how the developer sees their role.
The User Journey
The user experience is the primary theme of every part of a project. By making the user experience the focus of how a programming effort is viewed, the result gains simplicity and clarity. The programming effort has to carefully consider and understand what the user will experience once development completes. Respecting and improving the end-to-end user journey will make or break your creation.
With this comes the realization that performance testing is not some abstract project add-on. Instead, it's an opportunity for developers to put themselves in the users' shoes, acting just as a user would, and to observe system performance from the users' perspective. By replicating user behavior, developers get a clearer, more detailed picture of what needs improvement. Whatever irritates a developer while walking the journey a user will have to take is exactly what will annoy the user on that same journey. And nobody likes to be irritated.
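Walking the user journey can be scripted directly. Below is a minimal sketch of that idea: the step names and their timings are hypothetical stand-ins for real service calls (in practice these would be HTTP requests driven by a load-testing tool), but the shape is the same, namely executing each step a user would take and measuring what the user would feel.

```python
import time

# Hypothetical journey steps; each sleep stands in for a real request.
def browse_catalog():
    time.sleep(0.02)

def add_to_cart():
    time.sleep(0.05)

def check_out():
    time.sleep(0.10)

def run_user_journey():
    """Execute each step a user would take and record its latency."""
    timings = {}
    for step in (browse_catalog, add_to_cart, check_out):
        start = time.perf_counter()
        step()
        timings[step.__name__] = time.perf_counter() - start
    return timings

timings = run_user_journey()
slowest = max(timings, key=timings.get)
print(f"slowest step: {slowest} ({timings[slowest] * 1000:.0f} ms)")
```

The developer sees the journey exactly as the user does: step by step, with the slowest link exposed rather than averaged away.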
Delivering early and often lets an Agile team learn and adapt as it progresses through the cycle. Part of that learning is conducting performance tests to evaluate what their efforts are enabling the user to do.
Rather than approaching a project as one monolithic entity, Agile development is characterized by the continuous delivery of small segments. Each of these smaller services can be created faster (and perhaps better, since the focus is on bite-sized chunks) than if its direct effect were buried inside the sheer size of the overall development effort.
Microservices are pieces of the larger overall project: every functional aspect of the whole recast as an individual service. A microservice has boundaries; its effects are limited in scope. Within those boundaries it may be the primary actor, but it still has to interact with the other microservices it encounters.
Along with a different way of being created, microservices are also maintained differently than a larger, more imposing task would be. The rule of thumb: you build it, you manage it. This makes sense, especially when a team has enough expertise in an area to create the original microservice. That team will be the "best" team to change the microservice when required; and indeed, changes will be necessary.
They will be required because the performance testing done with every iteration of a microservice uncovers what needs to change. Continuous delivery of all these individual microservices means that their combination is continually evolving as well.
Knowing what those combinational changes are doing to the result is what performance testing gives back to the team. To truly test performance, though, what is asked of the microservice aggregation during a test must stay constant as each of the microservices matures.
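One way to keep the workload constant is to pin the test scenario in a single definition and fingerprint it, so any two runs can prove they asked the system the same questions. The scenario fields below are hypothetical; the point is that the same definition drives every run, and drift in the definition is detectable.

```python
import hashlib
import json

# Hypothetical workload definition, held constant across iterations.
SCENARIO = {
    "virtual_users": 50,
    "duration_s": 300,
    "journey": ["browse_catalog", "add_to_cart", "check_out"],
}

def scenario_fingerprint(scenario):
    """Hash a canonical form of the workload definition.

    Comparing fingerprints between runs confirms the microservices
    changed, not the test itself.
    """
    canonical = json.dumps(scenario, sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()[:12]

print("workload fingerprint:", scenario_fingerprint(SCENARIO))
```

If two test reports carry the same fingerprint, any difference in results reflects the evolving services, not an evolving test.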
Automation of Testing
Maintaining consistency between tests, even as they grow more complex, requires that the foundation of the test remain stable, which in turn implies a certain level of automation. Automating a test removes a portion of the variability that can creep in through the application development lifecycle as complexity rises.
Identifying which tests can be automated can be the hard part. However, the success rate will be higher when a test has specific functional characteristics, such as being API-based rather than command-line based. APIs reduce the complexity of enacting a particular scenario.
The point of incorporating automation is to increase the rate of the Agile team’s continuous delivery. It goes faster with routine automation, naturally. But automation is also more effective in obtaining repeatable results.
Changing from a monolithic to a microservice view of development also changes what a team must do to test whether the microservices it has taken responsibility for meet agreed-upon goals. Instead of waiting to test the vast monolith, teams can separately test each of the functions that will go into it, as well as how they function in aggregate. This approach lets the team achieve its goals with less friction and less wasted effort.
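Testing each function separately and then in aggregate can be sketched in a few lines. The two services below (`price_service` and `tax_service`) are hypothetical stand-ins for independently owned microservices; the aggregate test exercises the composition those services produce together.

```python
# Two hypothetical microservices, each testable on its own.
def price_service(item):
    """Return the price for an item (stand-in for a pricing microservice)."""
    return {"book": 10.0, "pen": 2.0}[item]

def tax_service(amount, rate=0.08):
    """Return tax owed on an amount (stand-in for a tax microservice)."""
    return round(amount * rate, 2)

def checkout_total(item):
    """Aggregate flow: compose the two services end to end."""
    price = price_service(item)
    return round(price + tax_service(price), 2)

# Test each service on its own ...
assert price_service("pen") == 2.0
assert tax_service(10.0) == 0.8
# ... and then the behavior they produce in aggregate.
assert checkout_total("book") == 10.8
print("unit and aggregate checks passed")
```

Each team can verify its own service against its boundary, while the aggregate check confirms the composition still serves the user journey.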
Learn More about DevOps
Larry Loeb has written for many of the last century's dominant "dead tree" computer magazines, including BYTE Magazine (Consulting Editor) and the launch of WebWeek (Senior Editor). Additional works to his credit include a seven-year engagement with IBM DeveloperWorks and a book on the Secure Electronic Transaction (SET) Internet protocol. His latest entry, "Hack Proofing XML," takes its name from what he felt was the commercially acceptable thing to do.
Larry’s been online since UUCP “bang” (where the world seemed to exist relative to DEC VAX) and has served as editor of the Macintosh Exchange on BIX and the VARBusiness Exchange. He lives in Lexington, Kentucky and can be found on LinkedIn here, or on Twitter at @larryloeb.