# product-management
Hello @Madhur Jain. From a user research perspective, there is a great book called Testing Business Ideas by David J. Bland. I would look there for possible Desirability experiments to extend your interviews with more ways to get proper feedback. That could help with the design part. However, if it's mostly about making a technical decision on the platform architecture, then I'm not 100% sure it should be based purely on the users' voice.
I think the most important consideration here is whether or not you’re going to be breaking compatibility for your consumers. Are the changes mostly to UX, or do they represent a new paradigm that would require training and potentially create orphaned work if you retreat (i.e. A/B testing that doesn’t lead anywhere)? If it’s just UX changes, then I’d suggest looking into some third-party feature flag/rollout tools that will allow you to do large-scale testing on random samples of users. Either way, you’re going to need to develop some metrics with which to measure success against your objectives. What is the goal in making these changes in the first place? Before you change anything for anyone, know what your goal is and make sure you know how to measure success.
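To make the feature-flag/rollout idea concrete, here is a minimal sketch of how percentage rollouts to a random-but-stable sample of users typically work. The function name, feature key, and user IDs are illustrative assumptions, not any specific vendor's API; real tools (LaunchDarkly, Unleash, etc.) add targeting rules on top of the same basic idea.

```python
import hashlib

def in_rollout(user_id: str, feature: str, percent: float) -> bool:
    """Deterministically bucket a user into a percentage rollout.

    Hashing user_id together with the feature key gives each user a
    stable bucket in [0.0, 100.0), so the same user always sees the
    same variant for a given feature across sessions.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = (int(digest, 16) % 10000) / 100.0  # value in [0.0, 100.0)
    return bucket < percent

# Hypothetical usage: roll the new UX out to 20% of users.
enabled = in_rollout("user-42", "new-editor-ux", 20.0)
```

Keying the hash on both the user and the feature means different features get independent samples, which avoids always experimenting on the same unlucky cohort.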
To provide more context: we currently have a proprietary low-code user interface for writing code that fetches data from a proprietary DB, but there are concerns with the existing solution. The goal is to enable writing code much faster (specifically code that fetches data from the DB, i.e. improve productivity and satisfaction). The options are: 1. Enhance the existing user interface and make it simpler. 2. Enable a text-based solution, e.g. writing SQL directly. 3. A mix of the two approaches. Because of the complexity, each option could take 6-9 months to build and roll out. How does my team figure out which option will be best going forward? Can this be A/B tested to get some data, or should I just give the prototype to the users and get their qualitative feedback?
Solution validation techniques have very clearly defined purposes, as well as conditions under which they can be applied. Make sure the conditions and purpose of your validation are clear before using a specific technique, otherwise your results might be extremely misleading. A/B testing tests one variable at a time and requires feedback from a large group. It is a statistical method, which means you need a statistically significant sample to get good results, and 15 engineers is not one. Look here for more about the method: https://amplitude.com/blog/ab-testing Based on your description, qualitative feedback fits much better for the validation you need to run: there are many aspects of these solutions you need feedback on, and you need to understand exactly why a tester chooses one solution over another. Another aspect of running any validation is that you should try to define personas. When you build a solution for 1000+ engineers (with a wide spectrum of skillsets) but talk to 5 principals who don't code, that's a giant mismatch. You will eventually build a solution, but it will work like a 5-inch 4K screen for a person with sight issues.
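The point about 15 engineers not being a statistically significant sample can be checked with the standard normal-approximation formula for a two-proportion A/B test. This sketch hardcodes z-values for a two-sided α = 0.05 and 80% power; the 30% → 40% task-completion lift is a made-up illustrative effect size, not data from the thread.

```python
import math

def sample_size_per_variant(p1: float, p2: float,
                            z_alpha: float = 1.96,   # two-sided alpha = 0.05
                            z_beta: float = 0.84) -> int:  # power = 0.80
    """Approximate users needed per variant to detect a shift in a
    success rate from p1 to p2 (normal-approximation formula)."""
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Hypothetical example: detecting a lift from 30% to 40% task completion.
n = sample_size_per_variant(0.30, 0.40)
# n comes out in the hundreds per variant, far beyond 15 engineers total.
```

Smaller effects blow the requirement up quadratically, which is exactly why a 15-person internal user base points toward qualitative interviews rather than A/B tests.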
You might already have all this, but the first step for me would be something other than A/B testing: it would be finding out which personas I have on my platform.
1. Do I have actual coders? They will not be happy with any low-code solution that doesn’t provide an API for them to integrate with their handcrafted applications. How many are there (% and actual numbers)? Do they actually use the data behind the low-code solution? How? How should they, in an optimal future? How would they benefit from your options, or are you lacking an option?
2. Do I have citizen developers? How technologically apt are they? How many are there (% and hard numbers)? How would they benefit from each of your options?
3. … you will most probably have more.
If you have your personas, it will be easy to select a handful of each persona for user interviews, something you can do without any implementation, which would otherwise delay and cost you every time you need to go this route for a new feature. Narrow down the options as much as possible; you might be able to single something out here already. THEN proceed to A/B testing with your CURRENT and your NEW solution and measure the actual improvement and outcomes you achieved, with the least amount of prototyping possible.
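The "% and hard numbers" per persona mentioned above is a simple tally once users are labeled. This is a minimal sketch with entirely hypothetical user IDs and persona labels; in practice the labels would come from surveys or usage telemetry.

```python
from collections import Counter

# Hypothetical (user_id, persona) records; the labels are assumptions,
# not real data from the platform being discussed.
users = [
    ("u1", "coder"), ("u2", "citizen-dev"), ("u3", "coder"),
    ("u4", "citizen-dev"), ("u5", "citizen-dev"), ("u6", "analyst"),
]

counts = Counter(persona for _, persona in users)
total = len(users)
for persona, n in counts.most_common():
    # Report both the hard number and the percentage for each persona.
    print(f"{persona}: {n} users ({n / total:.0%})")
```

Even a rough breakdown like this tells you how many interviewees to recruit from each group so the sample mirrors the platform's actual population.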