# general
a
👋 Hi everyone! I just had a conversation today with a few engineering leaders on building platform teams, and a recurring question that came up was: how do you measure success as a platform team, especially when you’re doing a great job and it feels like you’re not needed?
🙌 1
a
Of course it will depend on your context, but here I will share what we measured:
• Stream-Aligned Teams (SAT for short) Cognitive Load
• Lead Time
• Failure rate faced by clients
Why those 3? Because they were the main purpose when we started the team. I'm sure they will evolve over time. Luckily for us, we were able to measure a good impact on Lead Time and Failure rate in the first quarter. Measuring CL effectively is a little more challenging
🙏 2
Yet, we cannot say that Platform => better outcomes. I'm sure that multiple things played a role there, and it was not only because of the platform team
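For reference, a minimal Python sketch of how Lead Time and failure rate could be derived from deployment records (the field names and sample data here are hypothetical, not how we actually instrumented it):

```python
from datetime import datetime
from statistics import median

# Hypothetical deployment records: commit timestamp, deploy timestamp,
# and whether the deploy caused a failure visible to clients.
deploys = [
    {"committed": datetime(2024, 3, 1, 9, 0), "deployed": datetime(2024, 3, 1, 15, 0), "failed": False},
    {"committed": datetime(2024, 3, 2, 10, 0), "deployed": datetime(2024, 3, 3, 11, 0), "failed": True},
    {"committed": datetime(2024, 3, 4, 8, 0), "deployed": datetime(2024, 3, 4, 12, 0), "failed": False},
]

# Lead time for changes: median time from commit to deploy, in hours.
lead_times_h = [(d["deployed"] - d["committed"]).total_seconds() / 3600 for d in deploys]
print(f"median lead time: {median(lead_times_h):.1f}h")

# Change failure rate: share of deploys that caused a failure for clients.
failure_rate = sum(d["failed"] for d in deploys) / len(deploys)
print(f"change failure rate: {failure_rate:.0%}")
```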
l
Hi Alex, I'd say create some solid KPI trees for engineering productivity, reliability, and maybe cost efficiency. The productivity ones will for sure include the DORA metrics, but also more in-depth metrics such as context switching, cognitive load, build time, developer wait time... If you have multiple platform teams you may want to assign some focus areas, e.g. for the CI/CD team you can show how your build time or wait time improvements contribute to a reduction in lead time
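To make the idea of a KPI tree concrete, here's a minimal sketch; the breakdown and metric names are illustrative assumptions, not a recommended taxonomy:

```python
# Hypothetical KPI tree: top-level outcomes broken down into the
# lower-level metrics a platform team can actually influence.
kpi_tree = {
    "engineering_productivity": {
        "lead_time": ["build_time", "developer_wait_time", "review_time"],
        "deployment_frequency": ["pipeline_duration", "release_approval_time"],
        "cognitive_load": ["context_switching", "tickets_filed_by_devs"],
    },
    "reliability": {
        "change_failure_rate": ["test_coverage", "rollback_rate"],
        "time_to_restore": ["alert_noise", "on_call_load"],
    },
}

def leaf_metrics(tree):
    """Flatten the tree into the concrete metrics a team would instrument."""
    for branches in tree.values():
        for leaves in branches.values():
            yield from leaves

print(sorted(leaf_metrics(kpi_tree)))
```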
j
@Lambros Charissis would you be willing to share your KPI trees? I'm right in the middle of creating them for my teams, and I could use some inspiration 🙂
✅ 1
🧡 2
n
All of those are good metrics, but I’m also a massive fan of tracking reduction in tickets filed by developers. If your self-service platform is working well, you should see that dramatically reduce over time
🧡 2
a
So true @Nigel Kersten! We were measuring open tickets by customers as well.
👏 1
n
I’m also a fan of doing something that gauges sentiment and happiness amongst the users by using a really simple Net Promoter Score survey at regular intervals https://en.wikipedia.org/wiki/Net_promoter_score
🧡 1
Particularly when you’re getting started, surveying devs to find out if they’d promote the use of the platform to teams that aren’t using it is a really great way to work out if you’re not just making things better quantitatively, but also qualitatively
people are squishy and their feelings don’t always fit into engineering productivity metrics
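For anyone new to NPS, the scoring itself is simple: promoters (9–10) minus detractors (0–6) as a percentage of all responses. A small sketch with made-up survey data:

```python
# Responses are 0-10 answers to "would you recommend the platform
# to another team?" -- sample data is made up for illustration.
responses = [10, 9, 8, 7, 9, 3, 10, 6, 8, 9]

promoters = sum(1 for r in responses if r >= 9)
detractors = sum(1 for r in responses if r <= 6)
nps = (promoters - detractors) / len(responses) * 100

print(f"NPS: {nps:+.0f}")  # ranges from -100 to +100
```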
a
Hi all. Super interesting topic; I'm facing the same issue. In our case the main purpose is lowering the cognitive load of stream-aligned teams. But how do you measure that? Aside from NPS, which is completely subjective, do you have a good approach for this? Thanks a lot
a
I’d look at internal and external KPIs. One thing that is very interesting is to understand how well the platform teams understand the why behind what they are doing as a business objective. A lot of the time the root cause of misalignment isn’t the wrong external KPIs or interfaces built, it’s how well the internal team understands their purpose. I am a huge fan of NPS in reasonably sized orgs as part of the external suite.
One thing to think about and unpack is “is the problem not being able to tell a story to senior leadership?” vs “do I need a metric?”
👍🏾 1
They are different problems and can have different solutions
👍🏾 1
n
^^^^^ those last two points are so important
In a very traditional IT environment I’ve worked closely with that is doing the platform team thing, they’ve been measuring unplanned vs planned work each sprint in the platform team and most of the adjacent value stream teams. That’s been a really useful metric for working out whether the platform team is making a difference, but you have to do the work to investigate whether the changes are due to the platform or due to other changes in the stream teams
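A minimal sketch of what tracking that ratio per sprint could look like; the counts and sprint names below are invented for illustration:

```python
# Hypothetical per-sprint counts of planned vs unplanned work items
# for one stream-aligned team.
sprints = [
    {"sprint": "2024-01", "planned": 18, "unplanned": 12},
    {"sprint": "2024-02", "planned": 20, "unplanned": 9},
    {"sprint": "2024-03", "planned": 22, "unplanned": 5},
]

for s in sprints:
    total = s["planned"] + s["unplanned"]
    print(f'{s["sprint"]}: {s["unplanned"] / total:.0%} unplanned work')
```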
a
Thanks so much for all these insights! This is wonderful to read through. One metric that I’m trying to capture is part of the DORA metrics, specifically Deployment Frequency and Lead Time for Changes. Our first full and complete platform team is our Design System Team, and they have a direct impact on those metrics as they provide pre-built, ready-to-use components and modules for product teams. NPS is actually super interesting and I will surely try that with our small infra teams, specifically around CI/CD.
j
@Nigel Kersten in a recent initiative where we migrated from Concourse to GitHub Actions, we tracked the number of CI-related questions, and also how many questions we got for each product. Really cool to see our overall CI-related questions going down as more people moved over to GHA 🙂
🙌 1
n
As someone who worked on the State of DevOps Reports where we worked with the DORA folks to produce those metrics 🙂 I think they’re fantastic for broad-brush measurement of progress, but I’ve seen way too many teams get obsessed with making minor improvements to them and losing sight of the bigger picture. You should absolutely measure them and look to make big improvements to them, but if you start getting obsessed with fine-grained improvements, take a step back and think about what the bigger picture is with respect to extraneous cognitive load for the people in the system.
❤️ 2
l