in great contrast to real-life experience. Additionally, most curricula emphasize topics such as computer vision, natural language processing, and reinforcement learning, fairly esoteric subjects that see little applied use in industry. Finally, and most importantly, effectively none have mandatory coursework on the strategic and operational elements of an advanced analytics and AI team. Without this understanding, typical graduates have a thorough mathematical grounding, much in the way of raw horsepower, but require a significant investment in training before they understand how to leverage their education and apply it to a real-life scenario.
With so many new data and analytics graduates competing with mid-career transitioners and a global talent pool, these graduates often seek ways to stand out as potential hires. With the exception of highly specialized roles in technology companies, the key development opportunity for these new hires is the formation of leadership abilities in an analytical context. Reframing analytical concepts in a business context is an immediate and powerful way to differentiate yourself in a new role or in an interview, especially as the profession moves away from long-horizon, highly technical solutions toward a focus on immediate value.
For the student, I hope that this book gives you the knowledge to stand out from your peers, to be seen as a strategic thinker, and to be able to add value to whatever organization you choose to work with.
For the Analytics Leader
Compared to other organizational functions, this exciting field has come about abruptly and without a blueprint for how to build or lead these new teams. Playbooks from other functions have often been borrowed, with little success, and the lessons that experienced analytics professionals have learned have been hard won. What further complicates the successful deployment of these teams is how sensitive they are to the state of the organization, its immediate goals, and the technical maturity of its industry. While finance and human resources are largely the same across industries and individual companies, the number of factors affecting analytics is staggering.
A generalized, systematic, and sequential approach, adapted to the needs of the individual organization, is the best way to stand up a new team or restructure an existing one. Once a base template has been established, careful reflection on organizational readiness and analytical maturity, combined with regulatory requirements and immediate needs, can inform a short-term roadmap developed in collaboration with executive sponsors. Though no approach will work in every situation, these best practices can hopefully help you see a little further over the horizon.
For the current analytics leader, I hope that some parts of this book will challenge your views, other parts will confirm your experience, and the book as a whole will ultimately help you to build out a successful and engaged team.
Structure of This Book
The main body of this book has been organized within three key pillars: strategy, process, and people.
Strategy: How to assess organizational readiness, identify gaps, establish an attainable roadmap, engage stakeholders, ensure sponsorship, and properly articulate a value proposition and case for change
Process: How to select and manage projects across their life cycle, including design thinking, risk assessment, governance, and operationalization
People: How to structure and engage a team, establish productive and parsimonious conventions, and lead a distinct practice with unique requirements
These pillars loosely follow the chronological and logical ordering of priorities when creating or inheriting an analytics team, with the understanding that this is an iterative and ongoing effort. The procedural requirements flow naturally from the strategy, and similarly, team structure and conventions must be based on the processes that have been created.
Though the book has been ordered to facilitate a front-to-back reading, subsections have been intentionally made self-sufficient for ease of reference, perhaps at the cost of occasional repetition.
Why Is This Book Needed?
It is my personal hope that this book will make creating and leading an analytics function easier and help in some small way to advance the profession. Having been involved with or privy to the rebooting of these teams in several organizations, I have seen well-intentioned missteps repeated regardless of the maturity and sophistication of the company.
There are several underlying reasons why organizations and individuals struggle to get their arms around analytics.
Communication Gap
The business will rarely, if ever, have the analytics knowledge and vernacular required to clearly articulate its needs and to formulate a problem statement that naturally lends itself to an analytical solution. Whereas Kaggle competitions, hackathons, boot camps, and university assignments present problems with a well-formed data set and a clear desired outcome, business problems are fuzzy, poorly defined, and often posited without a known objective. As practitioners, it is our responsibility to find the underlying issue and present the most situationally appropriate and practical solution.
Advanced analytics and AI practitioners often expect that their stakeholder group will provide a solution for them. Just as a doctor cannot expect a patient to diagnose their own health issues and simply present a treatment plan for the doctor's approval, an analytics team cannot expect a business unit to suggest an approach, provide a well-formed data set and an objective function, and request a model. What the business unit requests is very often not even what the analytics project lead hears.
Early in the project intake process, an analytics lead will meet with a business lead to discuss an opportunity. The business leader (actuarial, in this example) may say that they want a model that predicts the probability that a policyholder will lapse. The outcome the leader is hoping for is a way to reduce the lapse rate, but what the analyst hears is, “Ignoring all other considerations, how can I best predict the probability of an individual lapsing?” If the practitioner executes on this misapprehension, the deliverable will be of little use to the business; a prediction model of this sort has no operational value. Such a model would only work on a macro scale, and even if it could be disaggregated, the business would be left making expensive concessions in the face of perceived threats.
Empathizing with the underlying needs of the business, understanding what success looks like for the project, and leveraging the domain knowledge of the project sponsor would have highlighted that the value in the analysis was further upstream. The value to the business lay in the factors driving lapse behavior, where an operationalizable change in process was possible.
As with the doctor analogy, it is through deep questioning, structured thinking, and the expert application of professional experience that the ideal path forward is uncovered. That path requires collaboration and the union of deep domain knowledge with analytical expertise.
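To make the lapse example concrete, here is a minimal sketch, in Python with scikit-learn, of the two different questions the same data can answer. The column names, the synthetic data, and the logistic model are assumptions invented for illustration only; they are not drawn from the book or any real actuarial system.

```python
# Hypothetical illustration: the same policyholder data can answer two questions.
# All column names and coefficients below are invented for this sketch.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5_000
policies = pd.DataFrame({
    "premium_increase_pct": rng.normal(5, 3, n),
    "tenure_years": rng.exponential(4, n),
    "claim_count": rng.poisson(0.3, n),
})

# Synthetic lapse behavior, driven mostly by premium increases and short tenure
logit = 0.25 * policies["premium_increase_pct"] - 0.4 * policies["tenure_years"] - 1.0
policies["lapsed"] = rng.random(n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(
    policies.drop(columns="lapsed"), policies["lapsed"], random_state=0
)

model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)

# Question the analyst heard: "How well can I predict who will lapse?"
print("Holdout accuracy:", model.score(X_test, y_test))

# Question the business actually asked: "What is driving lapse, and what can we change?"
drivers = pd.Series(model.coef_[0], index=X_train.columns).sort_values(key=abs, ascending=False)
print("Estimated lapse drivers (log-odds per unit):")
print(drivers)
```

The first print answers the question the analyst heard; the second, which inspects the fitted coefficients, is closer to what the business actually needs, because it points to drivers that a change in process could act on.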
Troubles with Taylorism
For every decision to be made there is a perception that there must be one optimal choice: a single price point that will maximize profit, a single model that will best predict lapse, or a single classification algorithm that will identify opportunities for upselling. The fact is that in effectively all cases these optima can never be known with certainty and can only be assessed ex post facto against true data. In professional practice, as in university training, the results of a modeling project are typically evaluated against real-world data, giving a concrete measure of performance, whether AUC, R-squared, or another statistical metric.
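In practice, the single-number evaluation described above usually amounts to something like the following sketch, shown here in Python with scikit-learn on a synthetic dataset; the data and model are stand-ins chosen for illustration, not an example from the book.

```python
# Minimal sketch of holdout, single-score evaluation (here, AUC).
# Dataset and model are placeholders for whatever the project actually uses.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2_000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
scores = clf.predict_proba(X_test)[:, 1]

# The "single number" an analyst can point to -- necessary, but not by itself
# evidence that the work created business value.
print(f"Holdout AUC: {roc_auc_score(y_test, scores):.3f}")
```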
This has created a professional environment in which analysts can confidently point to a single score as an objective measure of their performance. They can cite this measurement with satisfaction as an indicator of their success and evidence of the value they bring to the organization. Certainly, performant algorithms are an expectation, but without viewing the work through a lens of true accretive value creation, these statistical