Why Retailers Need Advanced Data and Analytics Services
Retailers need to invest in data analytics and advanced technologies such as AI, ML, NLP and deep learning.
Accumulate data capital and monetize it effectively.
Is your data infrastructure failing to deliver even after modernization? The limiting factor could be the processes and operations surrounding the systems. Learn how to improve your operating model to get the most from your infrastructure and technology investments.
Cognitive enterprise planning is a key area where generative AI is changing the game. We're seeing finance, supply chain and healthcare sectors lead the way by fusing GenAI, blockchain and IoT technologies to get real-time insights and make more informed, proactive decisions. Use cases include strategic forecasting for business expansion, workforce planning, inventory management, operational efficiency optimization and more. We help companies achieve the leapfrog benefits of a true cognitive enterprise.
Despite growing investments and sustained C-suite interest in generative AI, advanced analytics, BI and data infrastructure, businesses and their customers are often underwhelmed by the capabilities of their data systems. And the competitive pressure from nimbler competitors and new entrants is on the rise. We can also expect growing regulatory requirements in the data realm – from customer privacy, data sovereignty, compliance and cybersecurity to fairness and inclusivity.
To advance your data capabilities, you must grow your investments and talent. ISG helps you evaluate existing investments, identify opportunities and build a cognitive enterprise.
AI investment is accelerating, but results remain uneven. Only one in four initiatives is meeting revenue impact expectations, at an average spend of $1.3M per use case. Enterprises are no longer asking whether AI works. They are being asked to prove that it pays.
We help you identify where AI agents deliver the most value, restructure workflows around them and build the accountability models that keep autonomous execution auditable. The enterprises that win won't be the ones that reacted to this shift. They'll be the ones that designed for it first.
We give enterprises transparent, benchmarkable pricing models that tag each resource unit with the autonomy level used to deliver it. As AI capability advances, your pricing keeps pace. Both buyers and providers can quantify what that progress is worth.
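To make the idea of autonomy-tagged pricing concrete, here is a minimal sketch in Python. The tier names, multipliers and resource units are purely illustrative assumptions, not ISG's actual pricing model; the point is only that each unit carries the autonomy level used to deliver it, so price can track AI capability.

```python
from dataclasses import dataclass

# Hypothetical autonomy tiers and per-unit rate multipliers --
# illustrative values only, not an actual benchmark.
AUTONOMY_MULTIPLIER = {
    "human_executed": 1.00,  # fully manual delivery
    "ai_assisted": 0.80,     # human-led, AI-augmented
    "ai_supervised": 0.55,   # AI-led, human-reviewed
    "autonomous": 0.35,      # agentic execution, audit trail only
}

@dataclass
class ResourceUnit:
    name: str
    base_rate: float  # benchmark rate for fully human delivery
    autonomy: str     # tier actually used to deliver the unit

def unit_price(unit: ResourceUnit) -> float:
    """Price a resource unit by the autonomy tier tagged on it."""
    return round(unit.base_rate * AUTONOMY_MULTIPLIER[unit.autonomy], 2)

invoice = [
    ResourceUnit("ticket_resolution", 40.0, "ai_supervised"),
    ResourceUnit("code_review", 65.0, "ai_assisted"),
]
print([unit_price(u) for u in invoice])  # [22.0, 52.0]
```

Because the autonomy tag travels with each unit, both buyer and provider can see exactly how much of the delivered price reflects automation rather than negotiation.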
We bring analysis of more than $2.6 billion in tracked AI spend to every sourcing decision. Procurement, technology and finance leaders get the independent intelligence to rationalize vendor portfolios and hold providers accountable to measurable outcomes.
We embed controls at the point of data creation, define accountability for autonomous actions and build adaptive frameworks that keep pace with AI without impeding it. Enterprises that get this right don't just manage risk. They build the trust that lets them scale faster.
We ground strategy in research across 2,400 enterprise use cases, aligning investment to where impact is proven and designing the data, talent and governance foundations that move AI from pilots into the workflows that drive commercial results.
We benchmark your AI readiness against peers across 75 countries, identify the dimensions holding you back and give you a personalized roadmap to close the gap.
AI investment is shifting decisively toward revenue-generating functions. CRM automation, sales enablement and forecasting have replaced chatbots and IT productivity tools as the leading use case priorities, reflecting enterprise recognition that productivity gains alone do not satisfy board-level scrutiny. At the same time, use cases in production have doubled since 2024, and the portfolio is diversifying rapidly, with over 300 distinct function and industry-specific use cases now in active deployment.
ISG research across 2,400 enterprise use cases shows that the strongest AI returns are currently concentrated in compliance, risk management and quality control, not in the growth and cost outcomes most enterprises originally set out to achieve.
The gap between where enterprises are investing and where AI is actually delivering is the defining commercial tension of 2025. Organizations that close it by targeting functions with structured, revenue-attributable data and clear ROI measures will establish performance benchmarks that compress the window for competitors still cycling through pilots. The standard is being set now.
ISG is a leader in proprietary research, advisory consulting and executive event services focused on market trends and disruptive technologies.
Get the insight and guidance you need to accelerate growth and create more value.
ISG recently published the 2025 ISG Buyers Guides for DataOps, providing an assessment of 51 software providers offering products used by data engineers, data scientists, and data and AI professionals to facilitate the use of data for analytics and AI needs. The DataOps Buyers Guide research generated three reports and five quadrants assessing providers in relation to overall DataOps, Data Observability, Data Orchestration, Data Pipelines and Data Products. By providing an assessment of all software providers with tools in the portfolio of DataOps, the research offers a unique perspective on the extent to which emerging capabilities are being adopted by software providers. Given the amount of noise being made by providers about AI, it’s easy to assume that all providers have already delivered AI-driven capabilities that automate and accelerate DataOps use cases. However, the DataOps Buyers Guide research illustrates that, for many providers, support for AI functionality remains a work in progress.
The emergence of natural language analytics interfaces driven by generative artificial intelligence (GenAI) models has accelerated enterprise initiatives to enable data democratization—making data available to business decision-makers without the need to train them to use business intelligence (BI) tools. It has also heightened the need for agreed semantic models and business metrics, as well as technologies that facilitate the sharing and consumption of data as a product. As I previously discussed, data as a product is the process of applying product thinking to data initiatives to ensure the outcome—the data product—is designed to be shared and reused for multiple use cases across the business as it enables enterprises to streamline and accelerate the delivery of analytics and artificial intelligence (AI) initiatives. The market for software that enables the design and delivery of data products is evolving rapidly, especially among providers of data catalog-based data intelligence software.
The IT department of any enterprise is integral to implementing and managing the execution of its data objectives, just as the finance department is integral to implementing and managing financial objectives. Few enterprises would allow the finance department complete autonomy to define financial strategies; however, too many enterprises allow the IT department to define data strategies. Treating data as a business discipline—rather than a technical one—is a critical component of delivering competitive advantage through investment in data processing, analytics and artificial intelligence. This can be facilitated by adopting the most appropriate organizational approach, depending on the data activity.
I have previously described how data as a product was initially closely aligned with data mesh, a cultural and organizational approach to distributed data processing. As a result of data mesh’s association with distributed data, many assumed that the concept was diametrically opposed to the data lake, which offered a platform for combining large volumes of data from multiple data sources. That assumption was always misguided: There was never any reason why data lakes could not be used as a data persistence and processing platform within a data mesh environment. In recent years, data as a product has gained momentum outside the context of data mesh, while data lakes have evolved into data lakehouses. It has become increasingly clear that data lakehouses and data as a product are well matched, as the data intelligence cataloging capabilities of a lakehouse environment can serve as the foundation to enable the development, sharing and management of data as a product.
In an earlier Analyst Perspective, I discussed data democratization’s role in creating a data-driven enterprise agenda. Building a foundation of self-service data discovery, data-driven organizations provide more workers with the ability to analyze and use data. I’ve also examined how generative artificial intelligence (GenAI) could revolutionize business intelligence software by using natural language interfaces to lower the barriers to working with analytics software. Today, however, data democratization ensures that access is not limited to analytics software. Users in different roles should be able to use data through whatever applications or tools best align with their business workflow and objectives. This is the driver behind growing interest in a new category of products that enable headless BI.
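The headless BI idea above can be sketched briefly: metrics are defined once in a semantic layer and resolved to SQL for any consuming application, rather than being locked inside a single BI tool. The metric definitions and the `sales` table here are hypothetical, and SQLite stands in for a governed warehouse purely to keep the example self-contained.

```python
import sqlite3

# Semantic layer: one shared definition per metric (hypothetical names).
METRICS = {
    "net_sales": "SUM(amount - discount)",
    "order_count": "COUNT(*)",
}

def query_metric(conn: sqlite3.Connection, metric: str, by: str = None):
    """Resolve a named metric to SQL so every consumer -- dashboard,
    spreadsheet or notebook -- gets the same definition."""
    expr = METRICS[metric]
    if by:
        sql = f"SELECT {by}, {expr} FROM sales GROUP BY {by}"
        return conn.execute(sql).fetchall()
    return conn.execute(f"SELECT {expr} FROM sales").fetchone()[0]

# In-memory stand-in for the governed data platform
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL, discount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("east", 100.0, 10.0), ("west", 50.0, 0.0)],
)
print(query_metric(conn, "net_sales"))  # 140.0
```

Because the metric expression lives in one place, changing the definition of `net_sales` updates every workflow that consumes it, which is exactly the governance benefit headless BI promises.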