Unleashing Productivity Across the Enterprise With AI
- By Paul Mah
- October 31, 2023
Productivity has become a pressing concern in our hypercompetitive global economy, where businesses must constantly innovate and adapt to stay ahead of the curve. Today, companies across various industries are mulling how they can improve enterprise productivity with the latest AI technologies.
Business leaders acknowledge that the more advanced the generative AI, the greater their competitive advantage, says Jay Jagadeesan, Senior Partner, ASEAN Business Transformation Services Leader, IBM Consulting.
At an exclusive, invite-only "Unleashing Productivity Across the Enterprise with AI" session at IBM Think 2023, Jagadeesan cited an IBM study revealing that financial returns from traditional AI usage increased sixfold during the pandemic. It's no surprise that these businesses are keen to build on this success by embracing generative AI.
Traditional AI or Generative AI?
How does IBM utilize AI to boost productivity and achieve business outcomes within its workforce? To provide context, Jagadeesan started by chronicling the transition from data analytics to AI.
“We started with advanced analytics, which is basically analyzing datasets with speed, agility, and consistency. We moved to machine learning, which is a broad set of techniques to train a model to make predictions based on input data and other data sets,” he said.
“Deep learning is a technique for implementing machine learning using deep artificial neural networks to perform complex tasks, like image recognition and language processing,” he said, noting that ChatGPT utilizes a transformer-based deep learning architecture.
“When deciding between traditional AI and generative AI, consider your objectives. If you aim to summarize, engage in information search, or create new content or code, generative AI is the way to go. If your focus is on structured data analysis, predictions, forecasts, computer vision, or robotic process automation, traditional AI is typically the preferred choice.”
He noted that traditional AI and generative AI can exist in isolation or work together. "Many of the models that we are seeing being built combine the benefits of both these models."
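For illustration, the heuristic Jagadeesan describes can be expressed as a simple routing rule. The task categories and mappings in the sketch below are assumptions drawn from his quote, not an IBM framework or product.

```python
# Illustrative sketch only: routing a task to a model family using the
# heuristic described above. Categories are assumptions for demonstration.

GENERATIVE_TASKS = {"summarization", "information_search",
                    "content_creation", "code_generation"}
TRADITIONAL_TASKS = {"structured_data_analysis", "prediction",
                     "forecasting", "computer_vision", "rpa"}

def choose_ai_approach(task: str) -> str:
    """Return the model family suggested by the heuristic for a given task."""
    if task in GENERATIVE_TASKS:
        return "generative AI (e.g. a large language model)"
    if task in TRADITIONAL_TASKS:
        return "traditional AI (e.g. a supervised machine learning model)"
    return "unclear: consider combining both approaches"

if __name__ == "__main__":
    for task in ("summarization", "forecasting", "fraud_triage"):
        print(f"{task} -> {choose_ai_approach(task)}")
```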
AI starts with good data
For Ronald Castro, Vice President, IBM Supply Chain, the road to AI adoption started with a business challenge that IBM faced years ago.
"Our supplies and our processes were very fragmented in nature. This was the case within different aspects of the business, and even across different geographies," he said.
Resolving this began with efforts to clean up the organization's data and establish a single source of truth, says Castro. "We spent a lot of time consolidating the data, making sure that we can trust the data from the supply chain."
When the team eventually incorporated AI capabilities, the move increased revenue and reduced costs to the tune of USD 160 million: “AI is helping us to make decisions faster, and more importantly, to make better decisions. This leads us to get parts cheaper, reduce costs, and manage our inventory better.”
The result was particularly notable during the pandemic, as global supply chains were disrupted and strained under the pressure of lockdowns, border closures, and shifting consumer demands.
"At a time where others were struggling with getting supplies, IBM was able to ship absolutely every order for the last few years,” said Castro.
Deploying generative AI
How is IBM using generative AI today? Sena Periasamy, Partner, AI, Data Analytics, Automation and IoT Services, IBM Consulting, offered some examples.
“We have many systems today supporting our non-customer-facing employees. If they want information that somebody has already stored in some repositories, how do you get to it?”
He mentioned that a chatbot capable of searching relevant knowledge databases to find important information is under development.
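No implementation details were shared, but one common way to build such a chatbot is retrieval-augmented generation: search the repositories for the most relevant passages, then let a generative model phrase the answer from them. The sketch below illustrates only the retrieval step using a TF-IDF index; the sample documents and the answer_with_llm() placeholder are hypothetical.

```python
# Minimal retrieval sketch, assuming a TF-IDF index over internal documents.
# The documents and answer_with_llm() helper are illustrative placeholders.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Travel expenses must be filed within 30 days of the trip.",
    "New laptops are refreshed on a four-year cycle via the IT portal.",
    "Annual leave requests are approved by the direct manager.",
]

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)

def retrieve(query: str, top_k: int = 1) -> list[str]:
    """Return the top_k documents most similar to the query."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_matrix).ravel()
    ranked = scores.argsort()[::-1][:top_k]
    return [documents[i] for i in ranked]

def answer_with_llm(query: str, context: list[str]) -> str:
    """Placeholder for the generative step: a real system would prompt an
    LLM with the query plus the retrieved context."""
    return f"Based on the retrieved policy: {context[0]}"

query = "How do I claim travel expenses?"
print(answer_with_llm(query, retrieve(query)))
```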
In the meantime, Periasamy says IBM employees are already using an HR chatbot designed to help them with HR-related queries and tasks. This includes accessing salary information and benefits, and even applying for leave directly through the chatbot.
Naturally, organizations using generative AI should be familiar with the data sets that the models they employ are trained on. It is also crucial to understand the inherent weaknesses of large language models, such as their tendency to hallucinate.
Enterprises will need measures in place to address them, says Periasamy. “The first thing is understanding [hallucination] and putting guardrails around it. The second is around how you control bias. The third is explainability. Can we establish why a model gave a certain output?”
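As a concrete illustration of the first point, a guardrail can be as simple as checking that an answer is grounded in the retrieved source text before it is shown to the user. The word-overlap heuristic, threshold, and sample strings below are assumptions for demonstration, not a description of IBM's tooling.

```python
# A minimal sketch of one kind of hallucination guardrail: only release an
# answer when it appears grounded in the source passage it was drawn from.

def grounding_score(answer: str, source: str) -> float:
    """Fraction of answer words that also appear in the source passage."""
    answer_words = set(answer.lower().split())
    source_words = set(source.lower().split())
    if not answer_words:
        return 0.0
    return len(answer_words & source_words) / len(answer_words)

def guarded_answer(answer: str, source: str, threshold: float = 0.6) -> str:
    """Return the answer only if it looks grounded; otherwise defer."""
    if grounding_score(answer, source) >= threshold:
        return answer
    return "I can't verify that from the documents I have; please check with HR."

source = "Annual leave requests are approved by the direct manager."
print(guarded_answer("Annual leave is approved by the direct manager.", source))
print(guarded_answer("Leave is approved automatically after five days.", source))
```

In production, such checks are typically paired with retrieval confidence thresholds, bias evaluations, and logging that supports the explainability Periasamy mentions.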
"We predict that these foundation models which are built with your proprietary data, are going to be prized intellectual property in the next wave of AI adoption,” said Periasamy.
Paul Mah is the editor of DSAITrends. A former system administrator, programmer, and IT lecturer, he enjoys writing both code and prose. You can reach him at [email protected].
Image credit: iStockphoto/champpixs