It is no secret that companies must find ways to scale their data and AI initiatives. How do enterprises do so successfully? In today’s episode, Jeevan Duggempudi speaks to us about applying data fabric, artificial intelligence, and machine learning to make a step-change improvement in business operations. He identifies the common challenges enterprises face when scaling their insights initiatives, most notably around data availability and scalability. He also shares his observations on the impact of decentralized data and how this structure lets teams own their data in a far more manageable way and deliver insights to business stakeholders faster.
Jeevan is passionate about applying Data Fabric, Artificial Intelligence (AI), and Machine Learning (ML) to make a step-change improvement in business operations. He has a deep, fundamental belief that human-in-the-loop workflows, built by cross-functional teams and supported by hybrid cloud technology, are critical to the trusted adoption of AI within an enterprise. He is currently a Partner at IBM Consulting, leading the AI & Analytics practice for Healthcare, Life Sciences, and State & Local Government clients. Before IBM, Jeevan led AI/ML market development at Amazon Web Services (AWS), and before AWS, he spent over a decade in the data science consulting practice at Deloitte Consulting.
Connect with Jeevan Duggempudi
“One of the core fundamental beliefs that organizations are acting on right now is that data is the new currency within organizations, and the question is how you maximize the value that you get out of that data. Through that, data architecture is really where companies see the value that can be derived at scale within these enterprises.”
– Jeevan Duggempudi
Enterprises face several challenges when scaling their initiatives. These can be categorized into three distinct buckets: non-availability of quality data, AI scalability, and talent. Having “bilingual talent” within teams is fundamental to driving better insights: the goal is to combine technical and functional talent and provide people with the relevant resources to work on specific challenges.
Even during these recessionary times, companies are investing in digital initiatives around data and AI. AI is now used as a tool for pricing, marketing, sales, and much more to achieve a competitive advantage. Beyond this, companies are also investing in leadership roles to support these initiatives.
When organizations realize the value of analytical data, they also realize the value of decentralizing it. As a central data organization scales across the various functions of the enterprise, it becomes a bottleneck: too much time goes into managing the differences between the operational and analytical data planes, leaving analytical data teams less time to dive deep into the data and serve insights to business stakeholders quickly. To overcome these challenges of centralized data management, enterprises are moving away from the centralized data lake toward a data mesh architecture, in which function-specific data teams own their data and bring domain-specific knowledge to it, allowing for faster insights and better decision-making.
Decentralization of data is a growing trend within enterprises, and the approach has many benefits, including making data more accessible and trusted. One way to achieve this is to decentralize data sources within a mesh architecture, which allows different parts of the organization to access and use data productively. Another benefit is that domain knowledge can be brought into data processing, leading to more efficient and effective use of data.
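To make the data mesh idea concrete, here is a minimal sketch of a domain-owned “data product”: a published contract plus a serving interface that the owning team controls. The `DataProduct` class, the “claims” domain, and all field names are hypothetical illustrations, not something discussed in the episode.

```python
from dataclasses import dataclass
from typing import Callable

# In a data mesh, each domain's analytical data is treated as a "product"
# owned and served by the team that knows that data best.

@dataclass
class DataProduct:
    domain: str                      # owning business domain, e.g. "claims"
    name: str                        # product identifier
    schema: dict[str, str]           # published contract: column -> type name
    fetch: Callable[[], list[dict]]  # domain-owned access method

    def validate(self, rows: list[dict]) -> list[dict]:
        """Enforce the published contract before serving consumers."""
        for row in rows:
            missing = set(self.schema) - set(row)
            if missing:
                raise ValueError(f"{self.name}: missing fields {missing}")
        return rows

    def serve(self) -> list[dict]:
        """Consumers elsewhere in the enterprise read through this
        interface, never from the domain's internal operational store."""
        return self.validate(self.fetch())

# Hypothetical domain team publishing its data product.
claims = DataProduct(
    domain="claims",
    name="approved_claims_daily",
    schema={"claim_id": "str", "amount": "float", "approved_on": "str"},
    fetch=lambda: [
        {"claim_id": "C-1", "amount": 120.0, "approved_on": "2023-01-05"},
    ],
)

print(claims.serve())
```

The design point is that ownership and quality enforcement sit with the domain team, while the published schema gives the rest of the organization a stable, trusted way to consume the data.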
Advances in trustworthy and responsible AI practices are driving value by reducing bias and improving the fairness and transparency of AI models. Data science communities are working on ways to improve delivery models and identify value within enterprises. Virtual agents, for example, make quick operational improvements possible. In one case, a client’s call volume spiked by 60-70% in just a week, pushing average call times from about a minute to 10-15 minutes. IBM was able to deploy cognitive care tools rapidly, after which many calls were contained within five minutes or less. This demonstrates the value of cognitive care tools and shows how rapidly deployed solutions can improve the customer experience.
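Responsible AI work often begins with simple, auditable fairness metrics. Below is a minimal sketch of one common check, demographic parity difference (the gap in positive-prediction rates between groups); the function names, predictions, and group labels are all made up for illustration and do not come from the episode.

```python
# Demographic parity difference: how much more often a model predicts the
# positive class for one group than for another. Values near 0 suggest
# parity on this particular metric.

def positive_rate(predictions: list[int], groups: list[str], group: str) -> float:
    """Share of positive predictions (1s) among members of one group."""
    picked = [p for p, g in zip(predictions, groups) if g == group]
    return sum(picked) / len(picked)

def demographic_parity_diff(
    predictions: list[int], groups: list[str], group_a: str, group_b: str
) -> float:
    """Difference in positive-prediction rates between two groups."""
    return positive_rate(predictions, groups, group_a) - positive_rate(
        predictions, groups, group_b
    )

# Hypothetical model outputs and group labels.
preds = [1, 0, 1, 1, 0, 1, 0, 0]
grps  = ["a", "a", "a", "a", "b", "b", "b", "b"]

gap = demographic_parity_diff(preds, grps, "a", "b")
print(f"Demographic parity difference: {gap:.2f}")  # 0.75 - 0.25 = 0.50
```

A check like this is only one lens on fairness; in practice teams track several such metrics alongside transparency artifacts like model documentation.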
Sunny Side Up
B2B podcast for Smarter GTM™