Many engineers, including myself, wonder what end-to-end ownership of a feature looks like at a software company. I’ve had the opportunity to work on several projects with end-to-end ownership, and I wanted to share my journey at Gusto building a feature to empower our product managers to launch experiments, and our data scientists to iterate on models that improve user engagement with our recommendations.
In this post, I hope to take you through that journey and put into perspective everything that is required when you're given end-to-end ownership of a feature release, aligned directly with the values that embody being a Gustie.
Understanding the Why?
One of the most important drivers for a software engineer in building software is having an understanding of why you are building the features.
- Why does the end user require this feature?
- What’s the pain point that we are trying to solve as engineers?
- How will the feature help alleviate this pain for our users?
As software engineers, we build software to solve a problem, and so understanding this is the first step in the journey. In the case of the feature that I was building, this would enable Data Science and Gusto Segment Leaders to continuously improve the targeting of recommendations to drive customer expansion, and eventually roll out recommendations to other segments.
This first part of the journey goes hand in hand with one of Gusto's key values, Embody a service mindset: never stop advocating for the needs of others.
Documentation and the power of white-boarding
As I was embarking on the task of thinking through the system and the use cases, my People Empowerer (PE for short, what we call managers) made sure to emphasize that I take my time thinking about the communication flow, the architecture, and the use cases before going into the implementation.
I must say, this was difficult, because I was fighting my urge to start writing code. But as we sat down and used a whiteboard to look at the system, the data flows, and the use cases, everything started to become clearer.
I’ve always been a visual learner and one tool that aided my understanding of the high-level view of the system was a Database Modeling tool that allowed me to visualize the data flows and changes that were necessary for the feature to come to fruition.
The following shows the final diagram for the feature with the different data models that would power the experimentation, models, placements, and recommendations to show within the application.
The time spent documenting this made the implementation of the system very straightforward, so I’m glad that my PE kept pushing me to focus on the specification.
This touches on another Gusto value of Be proud of the how, ensuring deep integrity in everything you do.
Keeping Stakeholders informed
While developing the feature, our engineering team held two weekly syncs:
- One with the cross-functional stakeholders to resolve any general questions about the project and discuss blockers, if any
- Another, a technical sync between engineers and data scientists for in-depth implementation discussions
It’s very important to leverage these meetings and make sure everyone is aware of any major blockers. I always find that people are more understanding when you are clear with them and raise issues early in the project. Every project has obstacles along the way; what’s important is how you manage expectations and are clear about your progress.
Shall we pair?
During the implementation phase of the project, I found pairing to be one of the most valuable practices. Gusto has a big culture of pair programming. I paired with Colin Harris, one of our talented engineers from the Growth Expansion team, who had deeper knowledge of the recommendation system. His input and breadth of knowledge during these pairing sessions greatly assisted my implementation of the feature. During this phase, we leveraged TDD (Test-Driven Development): describing all the end-user requirements as tests first, then writing the implementation to make those tests pass.
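To make the TDD flow above concrete, here is a minimal, illustrative sketch using Ruby's Minitest. The `RecommendationRanker` class and its behavior are hypothetical stand-ins, not Gusto's actual code; the point is that the test describing the end-user requirement is written first, and the implementation follows.

```ruby
require 'minitest/autorun'

# Hypothetical class under test; the name and behavior are illustrative only.
class RecommendationRanker
  # Returns recommendations sorted by descending model score,
  # so the highest-scored recommendation is shown first.
  def rank(recommendations)
    recommendations.sort_by { |rec| -rec[:score] }
  end
end

# In TDD this test is written first; it fails until #rank is implemented.
class RecommendationRankerTest < Minitest::Test
  def test_ranks_by_descending_score
    recs = [{ id: 1, score: 0.2 }, { id: 2, score: 0.9 }, { id: 3, score: 0.5 }]
    assert_equal [2, 3, 1], RecommendationRanker.new.rank(recs).map { |r| r[:id] }
  end
end
```

Starting from the failing test keeps the implementation anchored to the requirement rather than to how you guessed the code should look.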
Two brains are certainly better than one. Seek out your colleagues and pair with them. You may be surprised at the things you may have overlooked.
Building with humility, a Gusto value, is embodied in the pair programming culture, where engineers put collective success before individual achievements.
It takes a village
Implementing a feature involves many people who must be kept in close contact. In this particular case, here are some of the key participants who helped shape the feature:
- UX/Design ensures that whatever is being built makes sense from the end-user perspective.
- Data Engineers/Data Scientists provide the machine learning models that output scores for the recommendations and the infrastructure that will power them.
- Product Managers translate the requirements of the end-users into the technical realm for the engineers to implement.
Opening Pandora's Box... behind a feature flag
One of the things to take into consideration as you are building software is feature flagging. Any new addition that introduces major changes should sit behind a feature flag. This allows certain functionality to be turned on and off as needed, and lets you slowly roll the feature out to consumers.
If anything goes wrong, turning the feature flag off restores stability and reliability for our consumers.
To illustrate, let's say we have the following code:

```ruby
require 'singleton'

class FeatureFlagService
  include Singleton

  def is_feature_enabled?(feature_name)
    # Checks the persistence layer or talks to a feature flag
    # provider: Optimizely, LaunchDarkly, Split
  end
end

# consumer
def ml_scored_recommendations(company_id)
  if FeatureFlagService.instance.is_feature_enabled?('ml_system')
    # ...new logic around the machine learning/experimentation workflow
  else
    # ...old workflow
  end
end
```
Should there be any issues rolling out the ml_system feature, as developers we can just switch it off for our end-users.
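The "slowly roll out" part usually means percentage-based bucketing: each company is deterministically assigned to a bucket so it gets a consistent answer as the rollout percentage ramps up. The post doesn't describe Gusto's actual mechanism (providers like Optimizely or LaunchDarkly handle this for you), so the following is just a sketch of the idea; `RolloutCheck` and its details are hypothetical.

```ruby
require 'zlib'

# Illustrative percentage-based rollout (not Gusto's actual implementation).
# Hashing the feature name plus the company id gives each company a stable
# bucket in 0..99; a company is enabled when its bucket falls under the
# current rollout percentage, so ramping 10% -> 50% -> 100% only ever
# adds companies, never flip-flops them.
class RolloutCheck
  def initialize(feature_name, percentage)
    @feature_name = feature_name
    @percentage = percentage # 0..100
  end

  def enabled_for?(company_id)
    bucket = Zlib.crc32("#{@feature_name}:#{company_id}") % 100
    bucket < @percentage
  end
end

check = RolloutCheck.new('ml_system', 25)
check.enabled_for?(42) # the same company always gets the same answer
```

Because the bucketing is deterministic, a company that saw the new workflow yesterday still sees it today, which keeps the experiment data clean.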
Let’s hold a Test Fest. A WHAT?
As the release date approaches, one of the events that Gusto likes to hold is something called a Test Fest. Engineers, product managers, and other stakeholders test various use cases in the staging or preview environments and collect any bugs or potential improvements.
It helps the team align on their expectations vs. reality and ensure that we built what we were supposed to be building.
It’s release time
Because the feature was behind a feature flag, we could roll it out to users slowly, watch for issues, and confidently ramp the rollout up to 100% by the third day. At this point you may think: congratulations, you are done!
As a software engineer, it’s important to have monitoring and metrics to support the work that was done: tracking the response time of key operations, logging errors, and validating that the engineering effort was a wise investment for the company. At Gusto, we use:
- Amplitude to understand impressions and conversion rates for customers that are part of these experiments.
- Datadog to track response time, errors, logging, and metrics to track the stability of the feature.
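As a rough sketch of what "tracking response time" looks like in code, here is a timing wrapper in the spirit of what a StatsD/Datadog client records. The `Metrics` class and metric name are placeholders I made up for illustration; in production you would call a real client such as `statsd.timing`.

```ruby
require 'benchmark'

# Placeholder metrics sink (illustrative only). A real setup would send
# the duration to a StatsD/Datadog client instead of printing it.
class Metrics
  def self.timing(name, ms)
    puts format('%s: %.1fms', name, ms)
  end
end

# Times the block, emits the duration in milliseconds under the given
# metric name, and returns the block's result unchanged.
def with_timing(metric_name)
  result = nil
  elapsed = Benchmark.realtime { result = yield }
  Metrics.timing(metric_name, elapsed * 1000)
  result
end

# Hypothetical usage: time the scoring call for a dashboard/alert.
recs = with_timing('recommendations.ml_scored') { [1, 2, 3] }
```

Wrapping the key operations this way gives you the response-time series that dashboards and alerts are built on, so a performance regression shows up before users report it.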
Ensuring that performance has not degraded is an essential part of the end-to-end ownership, as you never want to create a feature that worsens the user experience within the application.
In conclusion, there are several takeaways from this experience.
- Interactions over processes and tools: Being able to have that face-to-face interaction and whiteboard the system to understand the data-flows helped immensely to unlock the rest of the project.
- Always start with a feature flag: Adding a feature flag allows software developers to roll out the feature with confidence, and provides a mechanism to turn off the feature if there are any issues.
- The power of pairing: Pairing with an engineer can speed up implementation, and teach you some valuable lessons about how others think and tackle problems.
- The work is not over once a feature is released: It’s important to have monitoring and analysis of the feature components to understand the performance and business impact. A feature that’s unusable because it’s slow doesn’t provide any value.
Onto the next feature. Ciao.