# general
a
LD pricing has been a barrier
👍 2
r
@Alan Barr did you feel it would empower engineers to build the right thing or more a PM toy? Or had a different use case in mind?
a
The value LD provides is helping your devs step on each other less and coordinating feature releases a little better than trying to schedule updates
Depends on how many hands you have in the cookie jar, but the more hands, the more likely it can offer value in getting code out without coordination overhead
We've had people build their own versions of it in house and usually regret it because we can't keep it maintained
🙂 1
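A minimal sketch of the coordination benefit Alan describes: code for both paths is merged to main, and a flag decides what runs per environment. The `flags` dict and the `new-checkout` flag name are made-up stand-ins for a real flag service like LaunchDarkly, not its actual SDK.

```python
# Hypothetical in-house flag store; a real service would evaluate these remotely.
flags = {
    "new-checkout": {"prod": False, "staging": True},  # merged, but dark in prod
}

def is_enabled(flag: str, env: str) -> bool:
    """Return the flag state, defaulting to off for unknown flags/envs."""
    return flags.get(flag, {}).get(env, False)

def checkout(env: str) -> str:
    # Both code paths live on main; no release-branch scheduling needed.
    if is_enabled("new-checkout", env):
        return "new checkout flow"
    return "legacy checkout flow"

print(checkout("prod"))     # legacy checkout flow
print(checkout("staging"))  # new checkout flow
```

Flipping the flag releases the feature without a deploy, which is the "less coordination overhead" point.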
r
Yes, that makes sense. Especially with larger distributed groups.
Was originally thinking around A/B testing, but smaller features and less coordination is a good point.
s
What’s your org’s specific problem LD/Hotjar may solve, @Ralf Huuck?
r
Wondering if we can somehow measure that we are building the right product by connecting customer use/experience back to engineering activity (Epic/Release), @Schuyler Bishop
The question is slightly meta as we are building an engineering analytics platform. And while connecting to different data sources gives you the ability to see if you deliver fast/smoothly (velocity) or recover well (think DORA), it is not yet obvious if you are actually building the right thing.
a
Do you have any success metrics?
s
They’re a little trite these days, but my mind usually goes to the Accelerate four metrics: Change Fail Rate, MTTR, Deployment Frequency and Cycle time to production. Of course the meta version of those is the big three: cost, revenue and keeping executives out of jail.
😄 1
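The four Accelerate metrics Schuyler lists can all be derived from a deploy log plus incident durations. A sketch with entirely made-up records (the tuple layout and numbers are assumptions for illustration):

```python
from datetime import datetime

# Hypothetical deploy records: (deployed_at, failed, commit_to_prod_hours)
deploys = [
    (datetime(2023, 5, 1), False, 20.0),
    (datetime(2023, 5, 3), True,  48.0),
    (datetime(2023, 5, 3), False,  6.0),
    (datetime(2023, 5, 8), False, 12.0),
]
# Hypothetical incident durations in hours (input to MTTR).
incident_hours = [1.5, 4.0]

days = (deploys[-1][0] - deploys[0][0]).days + 1                # window length
deployment_frequency = len(deploys) / days                      # deploys per day
change_fail_rate = sum(d[1] for d in deploys) / len(deploys)    # failed share
mttr = sum(incident_hours) / len(incident_hours)                # hours to restore
cycle_time = sum(d[2] for d in deploys) / len(deploys)          # commit -> prod

print(deployment_frequency, change_fail_rate, mttr, cycle_time)
```

The interesting part is rarely the arithmetic; it's getting deploys, failures, and incidents tagged consistently in the first place.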
r
Alan: Usage/adoption in the first instance. Let’s say you add “widget A”. Is anyone ever using it, and can we feed this back into our planning boards (past JIRA Epics for argument’s sake)?
Some obvious caveats: works better for frontend, does not reflect rare but important cases (e.g. recover pwd) etc.
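The "widget A" question could be approximated by joining frontend usage events to the epic that delivered each feature. Everything here is hypothetical: the event tuples, the `feature_epic` mapping, and the epic keys are illustrative, not a real analytics schema.

```python
# Hypothetical product-analytics events: (user_id, feature_used)
events = [
    ("u1", "widget-a"), ("u2", "widget-a"),
    ("u1", "search"), ("u3", "search"), ("u2", "search"),
]
# Hypothetical mapping from feature to the Jira epic that delivered it.
feature_epic = {"widget-a": "PROJ-101", "search": "PROJ-102"}

# Distinct users per feature, rolled up to the delivering epic.
users_per_feature = {
    f: len({u for u, feat in events if feat == f}) for f in feature_epic
}
adoption_by_epic = {feature_epic[f]: n for f, n in users_per_feature.items()}
print(adoption_by_epic)  # {'PROJ-101': 2, 'PROJ-102': 3}
```

This only works as well as the feature-to-epic mapping, which echoes the frontend/rare-path caveats above.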
a
Just an idea, this isn't always applicable. Reforge had a good framework called TARS (Targeted, Acquired, Retained, Satisfied). It's a good way to track a feature's possible value and its adoption/satisfaction
👍 1
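The TARS framing boils down to a per-feature funnel, with each stage's rate computed against the previous stage. The counts below are invented for illustration; what qualifies a user as "retained" or "satisfied" is something each team defines.

```python
# Hypothetical per-stage user counts for one feature, per the TARS framing.
targeted = 1000   # users who could benefit / were shown the feature
acquired = 400    # tried it at least once
retained = 120    # still using it after some window
satisfied = 90    # rated it positively (survey proxy)

acquisition_rate = acquired / targeted    # 0.40
retention_rate = retained / acquired      # 0.30
satisfaction_rate = satisfied / retained  # 0.75

print(acquisition_rate, retention_rate, satisfaction_rate)
```

A low acquisition rate with high retention suggests a discoverability problem rather than a value problem, which is the kind of read this funnel is for.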
I think there are a lot of possible ideas here
r
Schuyler, I think the DORA metrics are good for the platform/operations side. What is DORA-like for engineering teams? We can do cycle time, throughput, capacity and risks, but need to be careful not to measure useless junk (aka LoC). We are promoting a “smooth flow”, but it seems more can be done.
a
I did a YouTube video on trying to apply it to my platform. https://www.alanmbarr.com/blog/reforge-growth-strategy/
👍 1
r
Thanks for the TARS link!
a
There's also a technique called the qualitative feature map that's good for sharing context on a feature and why / how broadly it could be used
r
Thanks, just looking at it. Next question: does it make sense to connect this back to past efforts, or is this more of a PM directional input?
I have seen in the past where we just built this 500k feature that nobody wanted/needed 😉
a
I think it can be valuable to connect to previous features and see if they hit the mark or not
That sounds strange and unexpected 😛
😆 1
engineers gold plating, never heard of that
r
Don’t think it was poor intention. Large org with a lot of distance and internal layers between customer and dev teams.
That’s why we were wondering if connecting the dots from “product analytics” back to dev data helps engineers to be more involved.
a
I think there's value there for sure
This is a filled-in example of the qualitative feature analysis, which might be helpful or not: https://docs.google.com/spreadsheets/d/1EYGJpNe-GVJvVgbPYDDb8HCArIBD8j4osTMJYvMCpHI/edit?usp=sharing
r
Cool!
Will digest this! Thanks for the great input so far.