A well-executed big data strategy can reduce operational costs, shorten time to market and enable new products. But enterprises face a variety of big data challenges in moving initiatives from boardroom discussions to practices that work.
This abstract was originally published as "10 big data challenges and how to address them" by George Lawton in TechTarget.
“Oftentimes you start from one data model and expand out but quickly realize the model doesn’t fit your new data points and you suddenly have technical debt you need to resolve,” he said.
A generic data lake with the appropriate data structure can make it easier to reuse data efficiently and cost effectively. For example, Parquet files often provide a better performance-to-cost ratio than CSV dumps within a data lake.