American Honda was quick to grasp the business potential of generative AI.
Early last year, as the large-language-model gold rush gathered momentum, the automotive company outlined a five-pronged strategy to bring the organization’s innovation engine up to speed, test-drive off-the-shelf models and identify ways to embed the technology enterprisewide.
“In January of 2023, when Microsoft made their huge announcement of investment in OpenAI, that was really an attention getter for me,” Bob Brizendine, VP of IT at American Honda, said during a CIO Dive panel in late March. “It made this extremely real in terms of the capability that’s there.”
Brizendine also recognized the manifold challenges ahead, from navigating model performance issues to upskilling the workforce throughout the company.
“The breadth of use cases has increased exponentially,” as has the number of users, Brizendine said.
The road to AI adoption leads directly through the heart of an organization’s data estate. To drive value and unlock the technology’s transformative potential, IT must first clear pathways to its data stores, erect robust guardrails and install reliability checkpoints.
Brizendine said he was particularly wary of the predictive degradation known as model drift and of users being caught off guard by the linguistic eloquence of LLM chatbots.
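Drift of this kind is typically caught by replaying a fixed evaluation set against a model on a schedule and flagging when quality slips below a baseline. The sketch below illustrates that checkpoint idea in generic terms only; the evaluation cases, the call_model stub and the tolerance are hypothetical and are not anything American Honda has described.

```python
# Generic sketch of a drift checkpoint: score a model against a fixed
# reference set on a schedule and alert when quality degrades.
# The eval cases, call_model() and the 5% tolerance are illustrative only.
from dataclasses import dataclass

@dataclass
class EvalCase:
    prompt: str
    expected_keyword: str  # a token a correct answer should contain

REFERENCE_SET = [
    EvalCase("What is the recommended tire pressure?", "psi"),
    EvalCase("How often should the oil be changed?", "miles"),
]

def call_model(prompt: str) -> str:
    """Placeholder for whatever LLM endpoint is actually in use."""
    raise NotImplementedError

def still_healthy(baseline_score: float, tolerance: float = 0.05) -> bool:
    """Return True if the model still meets the baseline within tolerance."""
    hits = 0
    for case in REFERENCE_SET:
        answer = call_model(case.prompt).lower()
        if case.expected_keyword in answer:
            hits += 1
    score = hits / len(REFERENCE_SET)
    if score < baseline_score - tolerance:
        print(f"Drift alert: score {score:.2f} vs. baseline {baseline_score:.2f}")
        return False
    return True
```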
With that in mind, American Honda made data central to its broader AI strategy.
“We've given data a promotion in the company,” said Brizendine. “We've raised it to a top-level position in the IT organization … putting it on the same plane as our chief information security officer or our chief privacy officer.”
Democratizing AI
Companies like American Honda expect big gains from the processing power and speed LLMs afford. But broad accessibility is one of the more immediate changes wrought by ChatGPT and its natural language processing compatriots.
Brizendine described the democratization of AI capabilities as “a watershed moment in almost every facet of both our AI and our data initiatives.”
While analytics were already baked into the business at American Honda, the tools were mainly in the hands of data scientists.
Now, the company is embracing an “AI for all” approach, according to Brizendine.
“The big initiative is going to be training our entire workforce on practical application of these capabilities,” Brizendine said. “We think that's going to yield the highest overall value for the company as we do prepare for the digitally enabled workforce.”
The goal has put additional strain on IT, which still has cloud deployments, application modernization and other technologies to attend to.
“There are a few areas where our previous approach of maturing data quality simply cannot keep pace with this AI-for-all mantra that we're promoting,” Brizendine said.
Brizendine estimated that his team spends roughly 60% of its time on routine work and, with luck, the other 40% on innovation.
“We are not a startup company,” Brizendine said. “We have a lot of legacy cost. We have a lot of legacy systems that are at play. So previously, the complexities of our data models, or even the inconsistencies in our data quality, these were largely unseen from the broader workforce.”
Unlocking unstructured data
Legacy companies like banks and insurers carry the burden of technical debt even as they migrate to the cloud. But they are also repositories for troves of largely untapped data locked away in manuals, contracts, memos and other unstructured formats.
Salesforce and Snowflake are two enterprise software vendors with solutions aimed at unlocking unstructured data. American Honda is one company that’s cognizant of the value.
“Some of the best sources of high-quality data for generative AI and specifically the large language models are not found necessarily in your structured data, but they're in what we would have historically considered to be our unstructured datasets,” Brizendine said.
American Honda has opened up its owner’s manuals, policy and procedure documents and other sources of unstructured but well-curated proprietary data to LLMs for ingestion. The company is also leaning on its technology and cloud providers to connect models to data sources.
“We're taking the AI to the data,” Brizendine said. “If the data is cloud-based data on Amazon, we're going to utilize a lot of AWS capabilities. If it's Azure-based we’re going to use a lot of the Microsoft-based capabilities for some of these more advanced models.”
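A common pattern for feeding documents like manuals and policy memos to an LLM is retrieval augmentation: chunk the documents, pull the chunks most relevant to a question, and include them in the prompt. The sketch below shows that pattern in the simplest possible terms, using word overlap in place of the embedding and vector-search services a cloud provider would supply; the file paths and scoring are hypothetical, not Honda’s implementation.

```python
# Minimal sketch of retrieval-augmented prompting over unstructured documents.
# Real deployments would lean on the search capabilities of the cloud where
# the data already lives (AWS or Azure, per the article); a simple
# word-overlap score stands in for that machinery here.
from pathlib import Path

def chunk(text: str, size: int = 500) -> list[str]:
    """Split a document into roughly fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def score(question: str, passage: str) -> int:
    """Count how many words from the question appear in the passage."""
    words = set(question.lower().split())
    return sum(1 for w in words if w in passage.lower())

def build_prompt(question: str, doc_paths: list[Path], top_k: int = 3) -> str:
    """Pick the most relevant chunks and prepend them to the question."""
    chunks = []
    for path in doc_paths:
        chunks.extend(chunk(path.read_text(errors="ignore")))
    best = sorted(chunks, key=lambda c: score(question, c), reverse=True)[:top_k]
    context = "\n---\n".join(best)
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )

# Hypothetical usage: point it at exported manuals and ask a question.
# prompt = build_prompt("How do I reset the tire pressure sensor?",
#                       list(Path("manuals").glob("*.txt")))
```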
As with cloud and enterprise modernization more broadly, preparing unending streams of data for model consumption is not a one-and-done endeavor — it’s part of the day-to-day management of AI operations.
“Our data foundation currently is, and it may forever be, a continuous work in process,” Brizendine said.