by Angela Guess
Dr. David Johnston, Lead Data Scientist at ThoughtWorks, recently wrote in Data-Informed, “A recent white paper and survey from IDG Research and Kapow Software states: ‘Big data projects are taking far too long, costing too much, and not delivering on anticipated ROI because it’s really difficult to pinpoint and surgically extract critical insights without hiring expensive consultants or data scientists in short supply.’”

Johnston opines, “As a data science consultant, I have recognized the key problem involved in most of these failures: the insufficient attention given to data science skills and the overemphasis on infrastructure. So many companies have the tools, but a deficit of ideas and the right kind of talent to use them. Product companies benefit from this problem, and actively contribute to it: It’s easier to sell a software license than to solve a real problem, and it’s easy to believe that your employees can already solve problems but just lack the tools that are being sold to you.”
He goes on, “If you think you are in this situation, the answers to these questions might help: Where have you seen concrete examples of your employees creating innovative data products in a laboratory setting that you just couldn’t implement due to a lack of technology? Have your employees created accurate predictive analytics solutions that just didn’t scale well enough to run in production? Has the strategy of ‘build it and they will come’ been successful at your company? With our clients, we rarely see situations in which a lack of tools and technology is holding back a data science team. Most often, it’s either a lack of skill or inexperience with integrating data science into full software applications.”
photo credit: Flickr/ ReillyButler