Shifting the data measurement mindset from POC to MVP
As a common practice, many organizations build a proof of concept (POC) for data products before crystallizing the solution in a data-engineering-owned ETL framework. Whenever a new metric or a new product feature is launched, it is natural for product leaders to figure out a way to measure it.
If you look at this new product or feature through an analytics lens, everything associated with it is considered "brand new." So when releasing a new metric to measure a new product feature, or the business impact the feature is creating, the first step is typically to create a POC metric: build it on a temporary data pipeline using an in-house querying tool or manual SQL, build a dashboard on top of this unstable pipeline, and release it to stakeholders. Let us understand the rationale for why analytics leaders naturally gravitate to this POC approach.
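As a concrete illustration of the POC flow described above, a POC metric often starts life as a one-off script: rows pulled with a manual query from raw event data, aggregated with no tests, schema contract, or ownership. This is a minimal sketch; the table, event, and function names are hypothetical, not from the original post.

```python
# Minimal sketch of a throwaway "POC metric" script (hypothetical names).

# Stand-in for rows pulled with a manual SQL query from a raw events table.
raw_events = [
    {"user_id": "u1", "event": "feature_x_used"},
    {"user_id": "u2", "event": "feature_x_used"},
    {"user_id": "u1", "event": "feature_x_used"},
    {"user_id": "u3", "event": "page_view"},
]

def poc_adoption_metric(events):
    """Count distinct users who used the new feature -- a quick,
    unhardened calculation feeding an equally unhardened dashboard."""
    users = {e["user_id"] for e in events if e["event"] == "feature_x_used"}
    return len(users)

print(poc_adoption_metric(raw_events))  # -> 2
```

The fragility is the point: nothing here survives contact with a schema change, which is exactly the tech debt discussed later in the post.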
1. Speed to market
Organizations need to examine how "quick" this quick method really is. Typically, these quick methods take at least 8-10 weeks to deploy end-to-end and for consumers to see actual value. Most tech companies have mature data organizations with technically strong talent pools, and an engineering-built data mart with reporting tables would typically take about the same amount of time. So the speed-to-market advantage may not hold in many instances.
2. Uncertainty of "new" data generated by new features
This POC approach is often preferred because analytics teams are uncertain what the data will look like and how quickly data pipelines can be adjusted if the KPI definition evolves over time. Both are valid reasons for an analytics team to take the "safe route." As businesses evolve, KPI definitions are bound to evolve; this is inevitable. I will address the operational and change management aspects of KPI evolution in another post. Best practices and collaboration models can be established so that Product Development and Data teams coordinate to make these sample data sets available in advance.
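One way to soften the "KPI definition will evolve" concern without resorting to a throwaway pipeline is to keep the definition as versioned, declarative configuration rather than hard-coding it into pipeline SQL. The sketch below uses hypothetical KPI and event names, under the assumption that the pipeline reads the definition at run time.

```python
# Sketch: versioned, declarative KPI definitions (hypothetical names).
# Evolving the KPI means shipping a new config version, not rewriting
# the pipeline code that computes it.
KPI_DEFINITIONS = {
    "feature_x_adoption": {
        1: {"qualifying_events": {"feature_x_used"}, "min_events": 1},
        # v2 tightened the definition: a user must trigger 2+ events.
        2: {"qualifying_events": {"feature_x_used"}, "min_events": 2},
    }
}

def compute_kpi(events, name, version):
    """Generic computation driven entirely by the versioned definition."""
    spec = KPI_DEFINITIONS[name][version]
    counts = {}
    for e in events:
        if e["event"] in spec["qualifying_events"]:
            counts[e["user_id"]] = counts.get(e["user_id"], 0) + 1
    return sum(1 for c in counts.values() if c >= spec["min_events"])

events = [
    {"user_id": "u1", "event": "feature_x_used"},
    {"user_id": "u1", "event": "feature_x_used"},
    {"user_id": "u2", "event": "feature_x_used"},
]
print(compute_kpi(events, "feature_x_adoption", 1))  # -> 2
print(compute_kpi(events, "feature_x_adoption", 2))  # -> 1
```

Because old versions stay in the config, historical numbers remain reproducible even after the definition tightens.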
3. Utilize quality in-house talent
Most product companies have top-notch data talent, and many analysts and data analytics managers have been in their roles for years, long enough to build strong domain knowledge of the business and its products. With the right talent already available, analytics organizations can take calculated risks and avoid the POC route altogether. Compare this to a consulting engagement: in most cases the team is new to the client, the data domains are new, the KPIs are new, and the team is expected to figure it all out quickly and build durable, scalable analytics solutions with no room for experimentation or POCs; plenty of such projects are delivered successfully. Follow-on work depends on how well the first project lands, so consulting teams face tremendous pressure to be successful and correct on the first attempt. In tech organizations, with far more predictability, lower employee turnover, and familiarity with the data domains, there is definitely room for POCs to be eliminated entirely.
So is experimentation a complete "no-go"? Not at all. Many product features are launched experimentally with no plan to scale them up, and beta products are tried with sample users; product managers often need ways to measure their success. Such cases are definitely strong use cases for POC analytics.
POC analytics also cause organizations an enormous amount of tech debt: throwaway code, data pipelines, and dashboards that get discarded within a 3-6 month time frame. Managing this tech debt is extremely costly and adds no value for organizations.
To conclude, POCs need not be dissolved completely, but they should be undertaken on a strictly case-by-case basis, with organizations establishing guiding principles for when to take the POC route. Instead, organizations can think in terms of MVPs: analytics products that are experimental in nature but scalable enough to quickly turn into productionalized, governed data solutions.
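The MVP idea above can be sketched as one tested metric function reused by both the quick experimental run and the governed production pipeline, so nothing is thrown away when the product graduates. All names here are hypothetical, a sketch of the pattern rather than any specific implementation.

```python
# Sketch of the MVP pattern (hypothetical names): the metric logic lives
# in one tested function; the experimental run and the production job
# are just two thin callers of the same code.

def feature_engagement(events, feature_event="feature_x_used"):
    """Single source of truth for the metric definition."""
    return sum(1 for e in events if e["event"] == feature_event)

def experimental_run(sample_events):
    # Quick MVP read against a small sample -- same logic as production.
    return {"engagement": feature_engagement(sample_events), "governed": False}

def production_run(full_events):
    # Productionalized path: same function, plus governance metadata.
    return {"engagement": feature_engagement(full_events),
            "governed": True, "owner": "data-eng"}

sample = [{"event": "feature_x_used"}, {"event": "page_view"}]
# Both paths agree by construction, so "graduating" the MVP means
# promoting callers, not rewriting the metric.
assert experimental_run(sample)["engagement"] == production_run(sample)["engagement"]
```

The design choice is that scalability comes from sharing the definition, not from building the experiment twice.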