Lessons in AI from Pella Corp.

Pella’s head of data science speaks at Iowa Technology Summit

When Jacey Heuer joined Pella Corp. in 2020 as manager of advanced analytics and data science, he was tasked with leveraging data and AI to help the nearly 100-year-old company reach its goal of doubling in size by 2025.

In leading Pella’s team of data scientists over the last three years, he has seen that all components of a business need to work together with AI to drive economic value. Bringing on AI touches data, processes and systems but also company culture, people and customers.

At the recent Iowa Technology Summit held by the Technology Association of Iowa, Heuer shared how Pella has ushered AI into its organization, processes and culture. Here are some takeaways from the presentation.

Establish a foundation in data before implementing AI
Heuer said a company’s data is the foundation on which AI-driven analytics rests. The analytics and outcomes that AI can generate are only useful if businesses have a strong foundation in data. Building that bedrock includes understanding what functions you want AI to perform, the types of data already being captured and what additional data is needed.

When Heuer joined Pella, he said, one of his first tasks was “flipping that message” from analytics first to data first.

“We started with, ‘We’ve got a team of data scientists. Push your easy button; give me the output’ without really doing the diligence to build that foundation,” he said. “Like I mentioned, I’ve been at Pella for three years. The last two and a half years was really establishing a data foundation on which we’ve been able to now accelerate our AI journey on top of.”

Help shift culture with storytelling
A significant part of Heuer’s role at Pella is helping lead the cultural shift by telling the story of AI to colleagues and explaining its value to the business’s future. But it is a big change for employees with years of experience doing their work the same way.

“Culture, the way I define it in this sense, it’s the acceptance of interacting with these AI tools,” Heuer said. “Being OK as a user with having this AI element in my workflow, being OK adopting it, letting it influence the decisions I make and how I make decisions. It’s a cultural nuance that, especially if you’re a legacy organization, … you’re introducing this to people that have been doing the function for 25 years or 30 years. Influencing a purpose becomes essential to that [shift].”

Heuer said that in defining what AI is to the organization, he discusses how AI ranges from simple to complex applications to show that “not every solution has to be that flashy shiny object.”

Dedicate a role to AI ethics
Heuer said the most obvious approach to addressing AI ethics in the organization would be to assign it to the development team, which would apply statistical models, try to remove outliers and leverage each other’s backgrounds and perspectives to “cancel out bias and ethical concerns.”

“I personally believe that that’s not a solution. You really need someone [who is] more qualified, more dedicated and removed from the development cycle to monitor, evaluate and govern the ethical concerns of what’s being created.”

Keep regulatory implications in mind
As companies adopt more AI solutions, Heuer said, they need to consider how they would react to new regulation of the technology.

“How do we build these solutions with an eye on the future of what regulation may bring to the table?” he said. “If you deploy something and regulation catches up to it, are you ready to turn it off? To pivot? To react to it? To be proactive through it? These are all big questions that we’re wrestling with.”

The choice between building an AI solution and buying one also carries important considerations. Heuer said businesses buying from a third party will have limited control over how an AI solution arrives at its decisions but will still be responsible for the outcomes customers receive.