Entrenched Data Culture Can Pose Challenge to New AI Systems 

A legacy company may have an entrenched data culture, with established procedures that have historically worked well, making a move to AI systems challenging. (Credit: Getty Images) 

By John P. Desmond, AI Trends Editor 

Companies established for decades or even a century or more, with thousands of employees in many business units globally and information systems built over many years on multiple platforms, have entrenched data cultures that may pose challenges for implementing AI systems.  

Data culture refers to the expectation that data will be used to make decisions and optimize the business, making a company data-driven. A data-driven company can be rolling along peacefully, with complex business processes and operations under control and doing the job. Users may have access to the data they need and be encouraged to present their analysis, even if the insights are unwelcome.   

Then someone asks if the company can do it like Netflix or Amazon, with AI algorithms in the background making recommendations and guiding users along, like a Silicon Valley startup. The company might not be able to get there from here.  

Tom O’Toole, professor, Kellogg School of Management

“These great companies may have built enormously successful and admirable businesses,” stated Tom O’Toole, professor at the Kellogg School of Management, writing recently in Forbes.  

However, many legacy companies have IT organization structures and systems that predate the use of data analytics and now AI. The data culture in place may be resistant to change. In many firms, culture is cited as a primary challenge to the successful implementation of AI.   

“Established organizations are too often fragmented, siloed, and parochial in their data use, with entrenched impediments to information sharing,” stated O’Toole, who before working in academia was chief marketing officer at United Airlines. Questions to established authority might not be welcome, especially if the top executive doesn’t like the answers. 

To replicate the Silicon Valley approach, the author had these suggestions:  

Get comfortable with transparency. Data that previously resided only within one department is likely to have to be shared more broadly across the leadership team. Business performance data needs to be transparent.  

Heighten accountability. Greater accountability follows increased transparency. Data needs to be provided to demonstrate that a particular strategy or product launch is effective.  

Embrace unwelcome answers. A data analysis can challenge conventional assumptions, for example by showing performance was less than had been believed, or that the conventional wisdom was not that smart.   

“Creating a data culture is an imperative for continuously advancing business performance and adopting AI and machine learning,” O’Toole stated. 

Survey Shows Concern that Data Quality Issues Will Cause AI to Fail 

Nearly 90% of respondents to a survey by Alation, a company that helps organizations form an effective data culture, are concerned that data quality issues can lead to AI failure.   

Aaron Kalb, cofounder and chief data and analytics officer, Alation

“AI fails when it’s fed bad data, resulting in inaccurate or unfair results,” stated Aaron Kalb, cofounder and chief data and analytics officer, in an account on the Alation blog. “Bad data, in turn, can stem from issues such as inconsistent data standards, data non-compliance, and a lack of data democratization, crowdsourcing, and cataloging.” Survey respondents cited these as the main reasons for AI failures. 

The company’s latest survey asked organizations how they are deploying AI and what challenges they are facing doing so. The results showed a correlation between having a top-tier data culture and being more successful at implementing AI systems.  

Data leaders who have deployed AI cite incomplete data as the top issue that leads to AI failures. “This is because when you go searching for data to create the models—be it for product innovation, operational efficiency, or customer experience—you uncover questions around the accuracy, quality, redundancy, and comprehensiveness of the data,” Kalb stated.  
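The incomplete-data problem Kalb describes can be caught before model-building with a simple completeness check. The sketch below is illustrative and not from the article; the field names and records are hypothetical assumptions.

```python
# Illustrative sketch: profile records for completeness before they feed
# a model. REQUIRED_FIELDS and the sample records are hypothetical.
REQUIRED_FIELDS = ["customer_id", "region", "revenue"]

def completeness_report(records):
    """Return, per required field, the fraction of records where it is present."""
    total = len(records)
    report = {}
    for field in REQUIRED_FIELDS:
        present = sum(1 for r in records if r.get(field) not in (None, ""))
        report[field] = present / total if total else 0.0
    return report

records = [
    {"customer_id": 1, "region": "EMEA", "revenue": 1200},
    {"customer_id": 2, "region": "", "revenue": None},   # incomplete record
    {"customer_id": 3, "region": "APAC", "revenue": 800},
]
print(completeness_report(records))
```

A report like this surfaces the accuracy and comprehensiveness questions Kalb mentions early, before an incomplete field silently degrades a model.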

Aretec, a data science-focused firm that works to bring efficiency and automation to federal agencies, helps clients deal with legacy data by leveraging AI services themselves to integrate and optimize huge and diverse datasets.   

In a post on the Aretec blog, the firm lists the issues it consistently sees impeding the implementation of AI systems:   

Data Fragmentation. Over time, the data needed to support operations winds up fragmented across multiple data silos. Some can be outside an agency or stored with private companies. Fragmented data eventually results in “islands” of duplicated and inconsistent data, incurring unnecessary infrastructure support costs. 

Data Inconsistencies. Many government agencies need to aggregate data records coming from a variety of sources, records not always in a consistent format or content. Even when rigid standards are applied, the standards are likely to evolve over time. The longer the records go back, the greater the chance for variance.  

Learning Curves. Many challenges arising from legacy data management are cultural, not technical. Highly skilled employees have spent years learning how to do their jobs efficiently and effectively. They may see any proposed change as compromising their position, with a negative impact on their productivity and morale.  
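The data-inconsistency issue above often shows up concretely as the same field recorded in different formats across eras of a system. This minimal sketch, not from the Aretec post, normalizes legacy date values; the list of formats is an assumption for illustration.

```python
# Illustrative sketch: normalize a date field that arrives in several
# inconsistent legacy formats. KNOWN_FORMATS is a hypothetical list.
from datetime import datetime

KNOWN_FORMATS = ["%Y-%m-%d", "%m/%d/%Y", "%d %b %Y"]

def normalize_date(raw):
    """Try each known legacy format; return ISO 8601, or None if unrecognized."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw, fmt).date().isoformat()
        except ValueError:
            continue
    return None  # flag for manual review rather than guessing

print(normalize_date("03/15/1998"))  # 1998-03-15
print(normalize_date("15 Mar 1998"))  # 1998-03-15
```

Returning None for unrecognized values, instead of guessing, keeps the variance visible so it can be resolved deliberately as standards evolve.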

NewVantage Survey Finds AI Investment Strong, Success Fleeting 

A newly released survey from NewVantage Partners found that Fortune 1000 companies are investing heavily in data and AI initiatives, with 99% of firms reporting investments. However, the ninth annual update of the survey finds that companies are having difficulty maintaining the momentum, according to a recent account in the Harvard Business Review.  

Two significant trends were found from the 85 companies surveyed. First, companies that have steadily invested in Big Data and AI initiatives report that the pace of investment in those projects is accelerating, with 62% of firms reporting investments of greater than $50 million.   

The second major finding was that even committed companies struggle to derive value from their Big Data and AI investments and from the effort to become data-driven. “Often saddled with legacy data environments, business processes, skill sets, and traditional cultures that can be reluctant to change, mainstream companies appear to be confronting greater challenges as demands increase, data volumes grow, and companies seek to mature their data capabilities,” stated the author, Randy Bean, the CEO and founder of NewVantage Partners, who originated the survey.  

Only 24% of responding firms said they thought their organization was data-driven in the past year, a decline from 37.8% the year before. And 92% of firms reported that they continue to struggle with cultural challenges related to organizational alignment, business processes, change management, communication, skill sets, resistance, and a lack of the understanding needed to enable change.   

“Becoming data-driven takes time, focus, commitment, and persistence. Too many organizations minimize the effort,” stated Bean. 

One recommendation by the study authors was for companies to focus data initiatives on clearly identified business problems or use cases with high impact.  

Read the source articles and information in Forbes, on the Alation blog, on the Aretec blog, and in the Harvard Business Review. 
