Maria Kliatchko, Principal, ZS
03.04.20
Machine learning and AI technologies are becoming mainstream. Many data sets can be harvested from all corners of the enterprise, and even more are available commercially. With new technologies, greater computing power, and better algorithms, we can now optimize our commercial engine much more precisely, wringing millions of dollars of inefficiency out of the process. And there is a mad rush toward these solutions: Diverse problems such as pricing optimization, sales forecasting, customer churn prediction, next-best-action recommendations, inventory optimization, and outcomes research can now be solved with far greater precision and predictability. As medtech companies race to conquer this new territory and compete on analytics, the old question of what to build and what to buy is emerging once again.
Since the outsourcing era began about 30 years ago, this question has been pondered by generations of leaders. Some companies use specialized partners for everything, hiring only for roles that can be filled with employees who have CEO potential. They believe every specialist should be contracted because requirements continuously change, and it’s always easier to find a new partner than to reskill, rehire, or lay off employees. Other companies err on the side of keeping this work in-house, believing it helps control costs and protects their IP. While keeping work in-house may indeed protect IP, the cost savings are often elusive.
As companies undertake new types of analytics projects that require new skills beyond simple sales reporting, it’s worth revisiting the “build vs. buy” question, as the costs and benefits of each approach may present new and different trade-offs in this new era of analytics. Such is the case with advanced analytics, which requires different skills for each sub-specialty, including data landscape knowledge, big data and cloud infrastructure design, functional and advanced data science modeling, and analytics storytelling.
As with other new capabilities, when considering either building or outsourcing advanced or big data analytics, there are pitfalls at the extremes. Companies that outsource too much, and for longer than necessary, may end up with poor institutional knowledge of their systems and the analytical models that drive their businesses. On the other hand, companies that do everything themselves may spend two or three times as much time and effort as specialists would, and may quickly fall behind their peers in the fast-changing analytics space.
So what analytics projects should companies do themselves, and which should they outsource? The following requirements can make do-it-yourself (DIY) projects particularly costly, lengthy, and risky.
New and highly specialized technology skills: Cloud architecture, big data, and advanced data science are all skills that take time and experience to develop. Moreover, people become more proficient at them when they can practice on multiple projects. While Google, Amazon, and many consulting firms can give their practitioners a variety of assignments and lots of support, a medtech company likely can’t. Even someone who has done a similar project once somewhere else may not have sufficient depth of expertise.
For example, about two to three years ago, we worked with a client that did not have a data lake. As the company was upgrading its analytics capabilities, it understood it needed a modern, cloud-based data lake to enable not only reporting, but analytical experimentation and advanced data modeling. A few experienced vendors proposed projects costing less than $500,000 and taking five to six months.
Despite the low price tag, the company decided to develop its own skills instead. In the process, it spent a year and a half finding qualified staff, and then even more time training the team to understand the requirements and data. Three years and more than $2 million later, the initial project scope has still not been delivered. And while the company may think it’s better off in the long run, the reality is that in two to three years, yet another critical skill will be required for data lakes, and the staffing challenges will continue. Meanwhile, the company is already many years behind the competition in its business capabilities, and it will be hard to catch up. Had it recognized that technology skills evolve faster than it could keep pace with, and found the right partner, the company could have saved years and millions of dollars.
Deep business domain skills needed periodically: Some heavy business domain skills—such as supply chain, sales deployment, pricing analytics, and incentive compensation—are in short supply in the medtech sector, and it doesn’t make sense to develop such expertise for a need that occurs every few years. Specialized vendors who tackle such projects for the whole industry or multiple industries can better train and retain such expertise and talent.
Most medtech companies are now establishing their advanced data science departments to tackle problems from pricing optimization to sales forecasting to next-best-action recommendations to supply chain optimization. Yet, data science requires not only technical skills, but also deep understanding of the business domain and available data. Data scientists without forecasting experience are unlikely to find the right data and make the right hypotheses to ensure the model will be highly predictive. We recently chatted with one data science manager who was tasked with pricing and contracting optimization. He had no one on his team who knew the business area deeply. Ultimately, he decided to get an experienced firm to build the initial model and then teach and transition it to his staff. In this way, he got both fast delivery for his customers and, over time, has built the required skills in his department.
Assets or accelerators: Many efforts would be cost prohibitive to start from scratch. Big data stacks for a given set of problems have been developed many times in the industry, and to develop another from the ground up would take months or years, even if the skills were available. Many vendors with such assets spread their costs over multiple projects, so each client can dramatically accelerate its implementation for a fraction of the full cost of the asset.
Renting an out-of-the-box data lake or analytics workbench could not only save companies years, but also provide lots of functionality that was built at someone else’s expense. One large life sciences company I’m familiar with spent a few years building its own set of analytics applications, and yet, when it issued an RFI to learn what’s available on the market, it recognized that several vendors could provide the same field reporting capability out of the box and in a relatively short time. Further, they could provide much more: launch analytics, forecasting analytics, and an entire “marketplace” of models and reports for various functions. In addition to these already developed capabilities, the client would be able to work with the vendor to guide product roadmaps, so it could get the features it needed while spending no more than the agreed-upon license and operations costs.
In many cases we’ve seen in medtech, insourcing costs significantly more in both time and money, especially at the outset, and particularly for analytics projects that require new skills and technologies as well as substantial specialized expertise.
Of course, for a company that already has the necessary technology and domain skills, as well as various required roles (from data scientists and engineers to analytics translators), it may make sense to keep analytics projects in-house. Furthermore, if building the analytics capability—including infrastructure, data, analytics sophistication, and other “softer” dimensions—is considered strategic and a source of competitive advantage, and the company is willing to make significant investments that are visible to the investment community, it may be worth it to suffer the upfront delays inherent in a DIY approach.
One good way to jumpstart a DIY project is to find a partner and hire them to start the program. In parallel, take the time to hire and train people who can gradually take over the most strategic parts of the work until the department has been built.
Many other co-sourcing options are available, and as the decision tree graphic illustrates, co-sourcing is really a spectrum. Companies can insource more strategic work (business problems), but outsource less strategic work (technology). They can also find and train business analysts from internal talent, but outsource data scientists and big data specialists who are harder to find. They can tackle the steady, ongoing workstreams and those requiring frequent interaction with internal or external customers, while hiring vendors to run more periodic processes requiring infrequent or fast scaling up and down.
Finally, the right partner matters. In the case of advanced analytics for medtech, an ideal partner would have deep healthcare data knowledge, solid business domain understanding, existing accelerators for whatever problem the company wants to solve, and an approach that blends business, technology, and analytics talent into high-performing cross-functional teams. The partner also needs to be able to support large-scale projects and offer global service options to serve global companies more cost-effectively. In addition, a good partner should be able to blend its teams with internal teams so they function as a unit. Over time, as strategic intent evolves, the partner should be able to bring in more staff or transition certain functions to internal teams. Vendors that insist on keeping their algorithms secret or obscuring how their accelerators work may be good short-term options, but not viable long-term partners.
It’s helpful to think of this decision as if you were a homeowner: It’s OK to mow your lawn or paint a room and, if you’re a carpenter, make your own furniture. Changing a furnace or rewiring the house, however, is best left to carefully selected specialists. If you ever want to do it yourself, spend time working with the right specialists before ripping that furnace out by yourself.
Maria Kliatchko is a principal and the leader of ZS’s medtech analytics practice. She has more than 20 years of experience working with pharmaceutical, medical device, and healthcare companies, consulting with clients on a diverse array of sales and technology strategy issues and implementations, including sales resource optimization, alignment, segmentation, and targeting, as well as business intelligence, CRM, and commercial data integration.