Reuben Ng
Data analytics is a scientific, data-driven approach to helping organizations solve problems, make better decisions, and increase productivity. Despite its business origins, analytics has been applied in government, hospitals, public policy, and even museums, spurring the growth of a $125 billion market. Yet a significant number of analytics projects fail, due in part to poor science (techniques), poor art (e.g., communication, implementation, change management), or both. I draw on my experience as an analytics consultant, civil servant, and academic to share four learning points for organizations and governments embarking on their analytics journey.
Successful Analytics Involves both Science and Art
Organizations spend an inordinate amount of time on the science of getting algorithms right, and much less on implementation and changing mindsets. The perils of not implementing well overshadow the promises of analytics, as illustrated in the following case studies.
In the first case study, for an agency managing inventory, my team achieved a saving of $250,000 per month with a model that optimized the buy-store-distribute process. Scientifically, it was phenomenally successful, yet our solution was not adopted. The reason: the project manager was worried about being punished for having, in retrospect, cost the agency $250,000 a month over the past 24 months on the job. I was astounded at how a tight organizational culture could turn a successful solution into an occasion for reprimand. Confronting the sobering reality that my team had gotten the science right but neglected the art, I immediately convinced the agency's CEO of this point of view. What ensued was remarkable: the fearful project manager was promoted two levels up to junior director, other managers started to initiate analytics projects of their own, and thereafter analytics blossomed in that organization.
In the second case study, my team was engaged by an Asian government client to design more proactive human resource (HR) practices. Its existing practices were reactive: when an employee left, it took months to find a replacement, increasing the load on remaining staff. The client wanted to distill the drivers of attrition to achieve both macro and micro insights. At the macro level, that meant adjusting HR policies to decrease attrition; at the micro level, predicting who might leave the organization and intervening with those it wanted to keep. Although large multinationals like Walmart, Credit Suisse, and eBay have attempted such models, this was the first known initiative by a government in Asia. As expected, the science was tedious but straightforward; the art of change was more complex, since we had to work through tough implementation questions: if an employee had a 40 percent chance of leaving, and it took $50,000 to keep them, could the immediate supervisor make that decision, and if not, to which level should it be escalated?
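As an illustration only, the prediction step of such a model can be sketched as a logistic regression over HR features. The two features below (tenure and weekly overtime) and the toy records are my assumptions for the sketch, not variables from the actual engagement:

```python
import math

def sigmoid(z):
    """Numerically safe logistic function."""
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def train_logistic(X, y, lr=0.05, epochs=2000):
    """Fit a logistic regression by per-sample gradient descent.
    X: feature vectors; y: labels (1 = employee left, 0 = stayed)."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            err = sigmoid(b + sum(wj * xj for wj, xj in zip(w, xi))) - yi
            b -= lr * err
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
    return w, b

def attrition_risk(w, b, x):
    """Estimated probability that this employee will leave."""
    return sigmoid(b + sum(wj * xj for wj, xj in zip(w, x)))

# Toy records: [tenure in years, overtime hours/week] -> left (1) or stayed (0)
X = [[1, 20], [2, 18], [1, 15], [8, 2], [10, 1], [7, 3]]
y = [1, 1, 1, 0, 0, 0]
w, b = train_logistic(X, y)
```

A supervisor-facing report would then rank employees by `attrition_risk` and surface only those above an agreed threshold, which is precisely where the escalation question arises.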
Both case studies underscore the importance of paying attention to the art of analytics by asking, “How will the insights and models be used?” and “How will processes change with this new capability?” Science distills the insights; art transforms them into strategy and implementation.
An Integrated-methods Approach is better than an Analytics-only Approach
When analytics is top-of-mind and investments—in the form of software and a new team—have been made, the bias towards analytics is inevitable, though it significantly narrows the pool of solutions. Instead, organizations should focus on understanding the problem deeply and coming up with the best combination of tools, which may or may not include analytics.
In a third case study, a mega-hospital in Asia was confronted with an increased incidence of falls resulting in serious injury. A design-thinking approach would track the patient journey, uncover the pain points, and brainstorm solutions. An analytics approach would link disparate datasets and analyze them for risk factors that predict the fall risk of new patients.
Integrating Design Thinking (DT), analytics, and Behavioral Insights (BI) along the solution value chain leverages the strength of each method to produce a better solution. Analytics can begin with patient segmentation: among 2,000 patients, how many distinct behavioral clusters emerge? Behavioral segmentation, informed by analytics, gives DT the distinct types of patients to track through patient journeys. Thereafter, hypotheses can be quantitatively tested through analytics, revealing a list of risk factors for patient falls. BI then picks up the baton to pilot-test interventions and roll out the most successful program.
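The segmentation step can be sketched with plain k-means clustering. Everything here is illustrative: the two features (a mobility score and a daily medication count) and the toy data are assumptions, not the hospital's actual variables.

```python
import random

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means: assign each point to its nearest centroid,
    then move each centroid to the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k), key=lambda i: dist2(p, centroids[i]))
            clusters[idx].append(p)
        # An empty cluster keeps its old centroid.
        centroids = [
            [sum(col) / len(c) for col in zip(*c)] if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

# Toy patients: [mobility score, daily medication count]
patients = [[1, 8], [2, 9], [1, 9], [8, 1], [9, 2], [9, 1]]
centroids, clusters = kmeans(patients, k=2)
```

Each resulting cluster becomes a patient archetype whose journey DT can map in depth, rather than interviewing a generic "average patient."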
Similarly, in policy formulation and evaluation this trio of techniques presents different means to acquire evidence, which will not be robust if only one method is used. Analytics, no matter how powerful, should be used with other techniques to build the best solution.
Structuring the Team for Innovation
Given the benefits of incorporating different techniques, the next important questions are how to build a multi-disciplinary team and who should manage it. Typically, analytics, BI, and DT sit in separate teams, with analytics reporting to the Chief Information Officer (CIO), and BI and DT, where they exist, to the Chief Marketing Officer (CMO).
I recommend bringing all into one team that reports to the Chief Operating Officer (COO) or the Chief Executive Officer (CEO). In doing so, the team is given the central mandate to tackle strategic whole-of-organization challenges. Problems can also be analyzed from different perspectives with designers co-innovating with data scientists to craft creative solutions. The Singapore Health Promotion Board is one of the few national agencies that successfully integrated the DT-Analytics-BI talent trio into an innovation team filled with intellectually curious and academically agile members. When the team is centralized, rather than fragmented across the organization, they deepen their skills by working on problems across domains.
In reality, not many (government) organizations have the luxury of an in-house analytics team. In Singapore, most government agencies draw on a central pool of talented software developers and data scientists at GovTech, the government's IT talent hub. There are “exchange programs” in which a data scientist dedicates a few months to working with one agency to gain deeper domain knowledge; conversely, a policy analyst may spend an extended period at GovTech to pick up computational thinking and coding skills for a more evidence-based approach to policy formulation. Such programs seek to close the analytics-domain and theory-practice divides.
Restructuring the Public Service to Enable Whole-of-government Analytics
Of broader significance, the public service is moving towards a whole-of-government approach to policy making. Aging, for example, is not only a health issue; it is also a transportation, environmental, social, and family issue. The rallying cry is to break down silos.
I prefer a more conservative approach. Having worked on a farm, I’m reminded of silos containing grains, cement, and sawdust. I can’t imagine the mess if they are broken! Instead of perceiving silos as negative, consider them to be cylinders of excellence, with the need to build bridges between them.
To create opportunities for silos to “talk” to each other, Singapore’s government organized forums for top leaders within different sectors to discuss and craft policies. For example, the social forum brings together senior leaders of all the social agencies to build consensus for social policies. It is also a great platform to commission important analytics projects that require data from multiple agencies—usually a difficult process. When these projects are debated and benefits delineated among senior leaders across agencies, they are more willing to share data that contributes to a common good.
Besides platforms for talking between silos, the Prime Minister's Office started an important team of “bridge builders,” the Strategy Group, to shepherd and coax whole-of-government policy and practice. The bridge builders staff the sectoral forums and pioneered the data-science commissioning platform to nudge line agencies toward a more whole-of-government approach to policy, practice, and data sharing. At the tactical level, Singapore formed the Municipal Services Office, whose app serves as a one-stop shop for public feedback, routing issues across government to ensure a coordinated response.
In sum, analytics must leap out of research to influence practice. Successful analytics in any organization, government or corporate, depends on several ingredients: science, art, team, and ecosystem. Together they nourish a strong data-driven, evidence-based culture in which analytics can thrive.