Forrester

How Banks Can Win At CX: Lessons From The Front Lines

One of our 2025 banking predictions was that banking customer experience (CX) would continue to decline, on average, across the globe. And indeed, the year hasn't started well: IT outages left customers of the UK's largest banks, Barclays and Lloyds, unable to access their money or make payments, much as happened at Bank of America last October. Dependability is one of the key levers of banking trust, but IT resilience and availability are not things that CX professionals control. There are, however, many other things they can do to improve customer experience and loyalty. To help banking executives identify what those things are, we interviewed the top scorers in Forrester's Customer Experience Index (CX Index™) to identify some best practices. Our two newly published reports highlight best practices from across both frontstage and backstage. So what are some of our key findings?

Focus On Creating Customer Value — Not Just Fixing Broken Experiences

There are a number of reasons for the drop in CX quality, but chief among them is the poor performance of banks on emotion — which influences customer loyalty more than ease or effectiveness. The top emotions that drive loyalty in certain regions include feeling confident, valued, and respected. Yet most CX teams are too tactical to shift the dial on emotion, opting instead to monitor customer feedback to identify broken experiences rather than tackling big challenges, such as the lack of a customer-obsessed culture, opaque products or customer communications, or complex cross-channel journeys. The shift of customers to digital channels, while convenient and cost-effective, is also failing to elicit a deep emotional connection between banks and their customers. Banking CX leaders look for opportunities to create new highs and connect with customers emotionally.
According to Caroline Tucker, VP of CX transformation at Navy Federal Credit Union, "It's not just about improving interactions. It's about creating new interactions that don't exist today to delight our members […] The art is thinking through the emotional arcs along that journey and inserting interactions that could be a nice delighter and make a lasting impression." This requires blending methodologies to drive divergent and convergent thinking.

Hire Exceptional People — And Keep Training Them

As banks look to reduce the cost to serve and continue to boost digital self-service chatbot capabilities, it's easy to forget that exceptional people are often behind exceptional experiences. Leading banks hire people with the right attitude, train them, and empower them to do what's right for the customer. This applies particularly, but not only, to customer-facing employees. For example, UK-based first direct focuses on recruiting exceptional people who aren't necessarily bankers but who demonstrate empathy, kindness, and the willingness to go the extra mile. The bank invests significantly in training, focusing on developing skills related to connecting with the customer, listening, and creating trust. In the words of Lucinda Scott, customer service director at first direct: "You do not need a bank when things are good. We train our customer service agents to solve problems. The agent and the team leader listen to the call together to reflect on what went well and how they can go the extra mile. Were they listening to catch the true customer need? For example, if the customer doesn't say they're struggling, the agent must have the skills to identify vulnerable customers who are going through a difficult time."

For other best practices, Forrester clients can read the frontstage and backstage reports or connect with me via inquiry or guidance session. We will also discuss these and other topics at our upcoming CX Summit EMEA this year, so please join us June 2–4 in London.


Ask, Don’t Interrogate: Best Practices For Collecting Zero-Party Data

Companies have been grappling for years with how to personalize customer interactions without being creepy, and data deprecation only adds more complexity. Though Google scrapped its plans to deprecate the third-party cookie, privacy regulations and consumer use of privacy-protecting browsers and tools motivate brands to focus on collecting data directly from consumers. Doing so helps drive personalization across both known and unknown site visitors, builds consent into the workflow, and creates a more transparent approach to personalization. But this presents a new challenge: how to ask consumers for data in a way that is user-friendly and encourages them to share their information.

Zero-Party Data Helps Brands Better Understand Consumers

Even first-party data has its limits when encountering new prospects or unknown site visitors. Zero-party data experiences, such as a quick poll, quiz, or website widget, provide high-quality, accurate data directly from consumers. Forrester defines zero-party data as:

Data that a customer intentionally and proactively shares with a brand. It can include preference center data, purchase intentions, personal context, and how the individual wants the brand to recognize them.

The most successful zero-party data experiences are short and simple and offer a clear value exchange. We just updated our report, An Illustrated Guide To Collecting Zero-Party Data, which showcases updated examples of asking for zero-party data for three use cases: product recommendations, consumer segmentation, and market research. MECCA Australia, a beauty retailer, draws consumers to a skincare quiz through a banner at the top of its website. The quiz asks seven questions about the shopper's skin type, skincare routine, goals, and more, with questions such as "Where are you in your skin journey?" and "How would you describe your skin?" This is data that MECCA can't infer, observe, or buy with certainty, which makes it highly valuable.
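To make the idea concrete, a zero-party data capture like a skincare quiz can be modeled as a simple, consent-first record. This is only a sketch; the field names and structure below are hypothetical and not based on MECCA's actual implementation:

```python
# Sketch: a consent-first zero-party data record from a skincare quiz.
# Field names are illustrative, not any retailer's actual schema.
from dataclasses import dataclass, field

@dataclass
class QuizResponse:
    shopper_id: str
    consented: bool  # consent is captured in the collection workflow itself
    answers: dict[str, str] = field(default_factory=dict)

    def usable_for_personalization(self) -> bool:
        """Only use the data if the shopper explicitly opted in and answered."""
        return self.consented and bool(self.answers)

response = QuizResponse(
    shopper_id="anon-123",
    consented=True,
    answers={
        "skin_journey": "just getting started",
        "skin_type": "combination",
        "goal": "hydration",
    },
)
print(response.usable_for_personalization())  # → True
```

Because consent travels with the record, downstream personalization logic can check a single flag instead of reconstructing how the data was obtained.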
And MECCA can use this data to understand how to communicate relevantly and personalize the customer journey. Shoppers, in turn, receive tailored product recommendations and communications based on their answers. For more examples and best practices, check out our new report and set up a guidance session for a deeper dive.


Top Recommendations For CISOs In 2025: Deal With Uncertainty … Again

The security landscape continues to evolve, as does global uncertainty, leaving CISOs to prepare for turbulence ahead. Our latest report, Top Recommendations For Your Security Program, 2025, provides timely guidance for security leaders as they navigate another precarious year for their roles, programs, and organizations. We've included four of our 12 recommendations in this blog as a starter pack for what CISOs will deal with in 2025 and, most importantly, what they should do about it. Our recommendations for 2025 fall into four main themes:

- The changing consequences of the CISO role
- Changing technology across the enterprise and in cybersecurity
- Ever-present yet changing threats
- Securing emerging tech

We design our insights to help technology leaders, chief information officers, and chief information security officers (CISOs) and their teams stay ahead of the curve and more effectively advocate for their programs.

Deal With Changing Consequences: Cover Stakeholders, Reduce Risk

For the past four years, we've been advising CISOs to link three groups of external stakeholders to their programs and budgets. Customers, cyberinsurance carriers, and regulators represent revenue won or lost, tie security to the cost of doing business, and should be an integral part of program planning in 2025 and beyond.

Recommendation: Conduct a materiality tabletop exercise. With the SEC's Item 1.05 of Form 8-K requiring companies to disclose the material impact of cybersecurity incidents, it's crucial for CISOs to prepare. Conducting a materiality tabletop exercise with senior executives and counsel helps form an understanding of the processes and decision points needed to determine incident materiality. This proactive approach ensures that your team is ready to disclose incidents appropriately, avoiding civil penalties.
Deal With Changing Technology: Make Plans For (Or Against) Platformization

As tools, technologies, products, and services consolidate and compete for the biggest share of your security tech stack, and the market hurtles toward behemoth proactive and reactive security platform players — in some cases, both — CISOs shouldn't necessarily match the frenetic pace of the market with platform adoption. Not all platforms make sense for your program and organization, but some may provide benefits exceeding those of point solutions.

Recommendation: Reduce your SIEM bill with data pipeline management. Data pipeline management (DPM) tools help reduce data ingest costs and facilitate easier migration to new platforms. By adopting DPM tools, security teams can manage data more efficiently, reducing costs and improving their overall data management strategy.

Deal With Changing Threats: Address Geopolitical Issues

The current geopolitical climate leaves CISOs with the duty and responsibility to protect their organizations or risk becoming collateral — or direct — damage as governments posture against one another. With trade breakdowns fraying already fragile supply chains and nations vying for AI dominance, focus your defensive efforts to stay nimble and ready to meet new demands placed on your program.

Recommendation: Prepare for cryptoagility as a prerequisite for post-quantum security. Quantum computing poses a significant threat to traditional cryptography. CISOs must start preparing for post-quantum security by assessing the impact of quantum computing and ensuring that their systems are cryptoagile. This involves discovering and prioritizing the data, keys, and algorithms that need to be updated to quantum-safe cryptography.

Deal With Emerging Technology: Keep Your Eyes On The Horizon

These technologies should be on the radar of your emerging technology team and security architects, because things will happen quickly once they arrive.
Prepare now for what happens as 2025 progresses and we move into 2026.

Recommendation: Grow machine identity governance. Machine identities are proliferating, and securing them is crucial. CISOs should build an inventory of machine identities and implement a purpose-built machine identity management solution. This will help prevent unauthorized access and reduce the risk of data breaches.

For a deeper dive into these insights and more, read the full report, Top Recommendations For Your Security Program, 2025, and register for our webinar on Wednesday, April 16, at 11 a.m. ET. Forrester clients can also schedule an inquiry or guidance session to discuss our recommendations and how they apply to your organization.


The Graphic Future Of IT Management

[Graphic created by prompting an LLM to create a graph model in Neo4j.]

As we approach the 2025 ServiceNow Knowledge and Atlassian Team conferences, IT management is entering a new era. The rapid adoption of AI-driven automation and the increasing use of graph-based models signal a fundamental shift in how organizations manage IT portfolios. IT management platforms (considered broadly) are evolving from forms- and workflow-driven systems into intelligent, interconnected knowledge graphs that provide a real-time, holistic view of enterprise IT. In 2024, we saw this shift take hold as vendors such as ServiceNow and Atlassian matured and promoted their graph-based approaches. ServiceNow continues to expand its Configuration Management Database (CMDB) with graph-based models to represent IT assets and dependencies more dynamically. Atlassian, for its part, has taken a system-of-work graph approach to model how teams collaborate and deliver value. These advancements mark a paradigm shift in IT management: the emergence of AI-powered, graph-based IT operating models.

What is in the graph? All the usual things: servers, clusters, containers, applications, software, technology products, service offerings, cloud resources, endpoints, APIs … and projects, products, epics, tickets, stories, requirements, work orders, source code, packages, pipelines … and events, alerts, incidents, metrics, logs, traces, policies … everything. It may be centralized but at scale is more likely to be federated. Essentially, it is a massive digital twin of the IT organization.*

Why is this possible now? AI and generative AI (genAI) are overcoming the discovery and quality issues that have bedeviled IT management data for years, and the graph database is a superior platform for data integration. IT leaders have wanted this kind of view since the days of the mainframe. We now have the technical infrastructure to create it and keep it current.
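To make the digital-twin idea concrete, here is a sketch of the kind of impact query such a graph enables, in plain Python with hypothetical node names. A real graph would live in a graph database and be populated by discovery tooling, not hand-built like this:

```python
# Sketch: a tiny IT knowledge graph and a dependency (impact) query.
# Node names are hypothetical; real data would come from automated discovery.
from collections import defaultdict, deque

# Edge A -> B means "A depends on B".
DEPENDS_ON = {
    "checkout-app": ["payments-api"],
    "payments-api": ["postgres-cluster", "auth-service"],
    "reporting-job": ["postgres-cluster"],
    "postgres-cluster": ["storage-array"],
}

# Invert the edges so we can walk from a changed node to its dependents.
DEPENDENTS = defaultdict(list)
for node, deps in DEPENDS_ON.items():
    for dep in deps:
        DEPENDENTS[dep].append(node)

def impacted_by(node: str) -> set[str]:
    """Everything that directly or transitively depends on `node`."""
    seen, queue = set(), deque([node])
    while queue:
        for dependent in DEPENDENTS[queue.popleft()]:
            if dependent not in seen:
                seen.add(dependent)
                queue.append(dependent)
    return seen

# "If I change postgres-cluster, what is affected?"
print(sorted(impacted_by("postgres-cluster")))
# → ['checkout-app', 'payments-api', 'reporting-job']
```

The same traversal answers both the CMDB question ("if I change X, what is affected?") and the EA question ("product A is approaching obsolescence; what depends on it?"), which is why a shared graph is so attractive.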
The fundamental question that will shape IT management in the coming years is: Who owns the graph? As organizations realize the power of interconnected IT knowledge, controlling and governing these knowledge graphs will be central to enterprise IT strategy. We are on the cusp of a struggle over the ownership, governance, and monetization of IT knowledge graphs.

Graphs Are Reshaping IT Management

Graph databases have long been used in adjacent domains such as fraud detection, social networks, and recommendation engines. Now, vendors are leveraging graphs to create more intelligent, dynamic representations of IT landscapes. The reasons for this shift are clear:

- AI requires structured knowledge. GenAI and large language models (LLMs) require structured and contextualized data. Graphs provide a foundational knowledge model that enhances AI-driven automation, reasoning, and prediction. If unstructured data and the LLMs and vector databases that make sense of it are like flesh, graphs are the skeleton: the bones that give it structure. You need both.
- Complexity requires relationships, not lists. IT service management (ITSM) tools were originally based on relational database technologies that struggle with the dependency-centric nature of IT management data. (Ever tried to write a recursive SQL query?) Graphs and their associated query languages are much more efficient for modeling and using such information.
- IT domains are converging. ITSM, DevOps, FinOps, SecOps, and AIOps are all converging, requiring a unified model of IT management. A graph-based control plane can interconnect these domains into a coherent system.

The Battle Over "Who Owns The Graph?"

The strategic importance of IT knowledge graphs raises a critical governance question: Who controls the enterprise's representation of IT knowledge? There are multiple interested stakeholders:

- Enterprise architecture (EA), strategic portfolio, and CMDB owners. Understanding dependencies has always been a core objective of CMDBs, from their earliest days: If I change X, what is affected? EA teams need similar data for strategic purposes: Product A is approaching obsolescence; what is dependent on it? The technology is finally supporting these dreams, and portfolio managers need to see how it all comes together in terms of the work, the artifacts, and the costs.
- ITSM vendors. ServiceNow and Atlassian are embedding graph capabilities into their platforms, positioning themselves as the central source of truth for IT knowledge.
- AIOps vendors. Dynatrace and its competitors build dependency graphs from the operational data they manage, including OpenTelemetry traces and other dependency data. Already, customers are integrating such dependency data bidirectionally with CMDBs.
- Cloud providers. AWS, Azure, and Google Cloud maintain extensive metadata about infrastructure, services, and security configurations. They have a vested interest in controlling enterprise IT graphs; certainly, they are the origin of much of the base data for the graph.
- Security and risk management teams. As security increasingly depends on understanding complex attack surfaces, security and risk teams will demand control over IT graphs and may choose to build their own.
- FinOps, value stream management, and other IT functional areas. These teams will need direct access to IT knowledge graphs to ensure that their models remain grounded and relevant, and they, too, may choose to build their own.

This governance question will define enterprise IT operating models in the coming years. Organizations that fail to take a proactive stance on graph ownership risk ceding control to external vendors, winding up with the technical debt of redundant, sprawling graphs, and/or losing strategic visibility over their IT landscapes.

AI + Graph: A New IT Operating Model

The fusion of AI and graph databases is not just a technical shift; it is reshaping IT operating models.
The next generation of IT management will center on real-time, interconnected knowledge graphs that allow AI-driven automation to replace traditional manual workflows. Key implications include:

- Automated IT decision-making. AI agents can detect issues and optimize performance.
- Proactive risk and incident management. Graph-based relationships enable AI to predict security issues or operational failures and recommend remediations before issues escalate.
- Enhanced developer productivity. Engineering teams will navigate IT landscapes more easily, improving DevOps velocity and reducing cognitive load.
- Dynamic IT governance. Policies can be linked directly into the IT graph, leveraging a single source of truth and increasing assurance.

The Road Ahead: Preparing For A Graph-Based IT Future

The transition to AI + graph-driven IT management is inevitable, but organizations must take deliberate steps to prepare:


Cybersecurity’s Latest Buzzword Has Arrived: What Agentic AI Is And Isn’t

Cybersecurity vendors have come out of the woodwork in the past few months to announce their "agentic AI" innovations. These include vendors such as Swimlane, ReliaQuest, Dropzone AI, Intezer, and others. Some are announcing legitimate agentic AI features, while others are renaming existing ML or generative AI features to catch the hype: The blob strikes again! This has become further complicated as the definition and understanding of agentic AI capabilities have been as in flux as the rest of the generative AI market. After significant research and careful consideration, Forrester released a report defining agentic AI: Agentic AI Is Rising And Will Reforge Businesses That Embrace It (client-only access). According to this research, agentic AI is:

Systems of foundation models, rules, architectures, and tools which enable software to flexibly plan and adapt to resolve goals by taking action in their environment, with increasing levels of autonomy.

Agentic AI Is A Subset Of AI Agents

To be explicitly clear: An AI agent is not the same thing as agentic AI. AI agents have been around for literal decades. Back when I was getting my computer engineering degree, I had to build an AI agent for an artificial intelligence class. The agent wasn't anything crazy (certainly not at the level of generative AI); it was a knowledge-based agent meant to understand and navigate the Wumpus world. The concept of an AI agent and its implementation are far from new. The challenge is that the majority of AI agents that have been developed don't operate without human intervention. AI agents such as Waymo, Tesla, Apple's Siri, and Amazon's Alexa all require some level of human input, with Waymo being the most advanced by far. In contrast to AI agents, agentic AI is one or more AI agents that operate without human intervention. They learn and adapt to feedback and inputs to achieve a certain mission.
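To make that distinction concrete, here is a minimal, illustrative agentic loop in plain Python: the agent plans, acts, observes the result, and adapts until an evaluation step accepts the outcome. All names here are hypothetical, and a real system would delegate planning and evaluation to foundation models rather than stub functions:

```python
# Sketch of an agentic loop: plan, act, observe, adapt — no human in the loop.
# All functions are illustrative stubs; real systems call foundation models.

def plan(goal: str, feedback: list[str]) -> str:
    """Pick the next action, adapting to feedback from earlier attempts."""
    return "retry with narrower scope" if feedback else f"first attempt at: {goal}"

def act(action: str) -> str:
    """Execute the action in the environment and return an observation."""
    return f"result of ({action})"

def goal_met(observation: str, attempts: int) -> bool:
    """An evaluation checkpoint; here we simply accept the second attempt."""
    return attempts >= 2

def run_agent(goal: str, max_steps: int = 5) -> list[str]:
    feedback: list[str] = []
    trace: list[str] = []
    for step in range(1, max_steps + 1):
        action = plan(goal, feedback)    # plan (adapts to prior feedback)
        observation = act(action)        # take action in the environment
        trace.append(observation)
        if goal_met(observation, step):  # evaluation checkpoint
            break
        feedback.append(observation)     # learn from the result and retry
    return trace

trace = run_agent("triage the phishing alert")
print(len(trace))  # → 2 (two iterations before the checkpoint accepts)
```

The point of the sketch is the loop itself: unlike a single prompt-and-response, the system keeps acting and adjusting toward its mission without a human approving each step.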
Judges and critics are components that allow the system to perform more effectively: The judge evaluates the output to ensure accuracy, and the critic evaluates the output for specific flaws, biases, or ethical risks. In that way, agentic AI is the subset of AI agents that operate autonomously.

Agentic AI Enables More Complex Use Cases Than RAG

Agentic AI requires a different architecture than we see with retrieval-augmented generation (RAG), which only orchestrates pulling relevant information, not a series of complex steps. Unlike agentic AI, RAG does not do significant planning or reasoning across information and cannot take adaptable action within enterprise environments. Agentic AI may, however, use RAG as part of its capabilities.

Security Tools Are Using Agentic AI To Automate Triage

Agentic AI is being used to automate alert triage and some aspects of investigation in security tools. It has been particularly useful in automating triage and, in some cases, closing alerts related to phishing, though other use cases are on the horizon. While this blog breaks down what agentic AI is, we aren't able to touch in depth on securing generative AI here. For advice on how to prepare for and secure agentic AI adoption in the enterprise, check out the report, Top Recommendations For Your Security Program, 2025. If you have more questions about how agentic AI is used in security tools, or if you want to talk about a particular vendor, book an inquiry or guidance session with me.


The Power Of Open Source: Cloud-Native Is Transforming As AI Takes The Limelight

Three years ago, Lee Sustar and I published the report, Navigate The Cloud-Native Ecosystem In 2022. In that report, we analyzed how the cloud-native ecosystem, driven by open-source software (OSS), has been powering architecture modernization across infrastructure and application development, enabling platform-driven innovation across a spectrum of technology domains such as data and AI. Since then, we have watched OSS play the same critical role amid the rise of generative AI and agentic AI, as well as the influential impact of DeepSeek. Many firms now look to the cloud-native ecosystem to accelerate their AI initiatives. OSS AI in the cloud-native ecosystem accelerates innovation and lowers the threshold for contributing to AI initiatives by providing access to a vast array of tools, frameworks, libraries, and models. While enterprises can use these resources to build and customize AI solutions tailored to their needs, the rapid development of open-source AI also introduces complexity and maturity challenges. As a result, we recently published two more reports: Navigate The Open-Source AI Ecosystem In The Cloud and The Key Challenges Of Open-Source Software In AI. In these reports, we not only outline the major open-source AI initiatives within the cloud-native ecosystem but also provide an overview of the major barriers to OSS AI adoption in terms of cost, governance, and complexity, with a deep dive into the specific openness complexity of AI foundation models. More importantly, we provide a holistic view of key areas of the open-source AI ecosystem in the cloud and representative offerings in the global market. Specifically:

- Open-source AI infrastructure powers scalable AI workloads in distributed cloud. In AI cloud infrastructure, open-source AI cluster orchestration enables firms to execute, schedule, orchestrate, and scale AI workloads.
- Open-source AI storage enables object, block, and file storage and supports virtualization for AI applications.
- Open-source AI data infrastructure supports AI models in various infrastructure segments, such as feature stores for AI models and databases such as relational, distributed cache, vector, and multimodel databases.
- Open-source AI models as a service enable ModelOps across the cloud model development lifecycle. Open-source AI data management covers data preparation, analytics, and visualization. Open-source AI model development spans AI models; machine-learning (ML) and deep-learning frameworks; AI model fine-tuning; and development collaboration. Open-source AI models target distributed model serving and inferencing in the cloud and on-premises with model compilers and MLOps support. Open-source AI observability provides insights into AI workloads and models.
- Open-source AI app-dev streamlines RAG and AI agent development. Open-source retrieval-augmented generation (RAG) plays a key role in enterprise adoption of AI applications. Open-source agentic AI platforms with AI agents at the core create agentic workflows and build multiagent systems to automate complex tasks and power applications. Open-source chatbots powered by large language models offer easy solutions for contextual chatbot support. Open-source AI software enables a range of segments for AI DevOps automation.
- Open-source AI governance offers cloud-native guardrails for the AI supply chain. Open-source AI security for AI models and supply chains evaluates ML models and apps and defends them against threats. Open-source AI privacy and ethics help firms assess ML models and data sets for fairness and bias and improve privacy and ethics. Open-source policy management allows cloud security experts to control access, enforce data privacy policies, and comply with security standards for AI and other applications.
- Open-source AI communities facilitate collaborative innovation.
Mainstream open-source AI communities such as the Cloud Native Computing Foundation and the Linux Foundation prioritize AI in community initiatives. Vendors are driving open-source collaboration via dedicated AI model communities. Open-source AI model benchmarking organizations are making substantial contributions, especially on open data sets for model evaluation.

Enterprise decision-makers should understand that embracing open source doesn't mean using open-source components directly to build your platform from scratch. Instead, in most cases, you should choose mature commercial offerings with an open architecture and support for mainstream open-source components from reliable partners. For more details, or if you would like to share your thoughts on this, please book an inquiry or guidance session with us.


Meet the Analyst: Covering ERP, FP&A, SaaS, and the Enterprise Software Market

I am thrilled to begin my journey as principal analyst on the enterprise software, IT services, and digital transformation-focused team at Forrester. For nearly two decades, I have led major business-technology initiatives, including the last eight years with Forrester Consulting, where I helped clients drive transformative results. Now, I am eager to deliver clear, actionable insights to technology and line-of-business leaders, software providers, and implementation partners. Together, we can tackle enterprise resource planning (ERP) modernization, software-as-a-service (SaaS) governance, and enterprise solution roadmaps with confidence, clarity, and measurable impact.

My Research Focus: Driving Results Where It Matters Most

Enterprises face growing pressure to modernize legacy systems, optimize technology investments, and unlock new avenues for growth. To address these demands, my research focuses on four key areas: We have been working hard at Forrester (view my full bio here) to be a valuable trusted advisor that is on your side and by your side. I will be delivering evaluative research and market intelligence across ERP, financial planning and analysis (FP&A), and SaaS marketplaces, equipping you with the insights needed to fuel innovation, sharpen your strategies, and stay ahead in an increasingly competitive market.

A View Into My Consulting Journey

I recently moved to Forrester's research team after eight years as a principal consultant in Forrester's strategy consulting practice. During that time, I worked closely with clients to solve pressing challenges. Notable highlights include:

- ERP transformations: led modernization projects spanning multiple industries, managed budgets of over $80 million, and boosted efficiency.
- Go-to-market strategies: teamed with global tech companies to unlock revenue streams and accelerate growth.
- Strategic sourcing: created frameworks that improved vendor selection and reduced costs.
- Business-case and ROI analysis: authored Forrester Total Economic Impact™ (TEI) studies to support smarter tech investments.

Before Forrester Consulting, I served as associate director in KPMG's CIO advisory practice. Earlier, I spent nearly a decade at Ernst & Young leading enterprisewide IT transformations.

Where Do I Find Inspiration?

Outside of Forrester, I can be found riding my motorcycle along the Pacific Coast and chasing my seven-year-old twins.

Ready To Start A Conversation?

Let's shape your enterprise software and business application strategies with fresh insights. Click here to schedule an inquiry or guidance session.


Dual-OMS TEI: Companies Actually Get Their Money’s Worth

We evaluated the Total Economic Impact™ (TEI) of companies that simultaneously use two order management systems (OMSes), and our research uncovered surprising findings. Our TEI focused on businesses that had a preexisting, primary OMS and then added modules of a newer, secondary OMS to fill functionality gaps. Our central question: Why — and is it worth it?

For our study, we applied the Forrester Total Economic Impact methodology, which allows us to calculate the ROI of a business decision. The TEI process includes interviewing representatives from companies that have made the business change in question. We then aggregate the experiences of interviewees into a composite organization. Finally, we create a financial framework from the material gathered in the interviews to prove the ROI, considering everything from hard, direct costs to labor.

What did we learn? Spoiler: Pursuing a dual-OMS strategy is worth it! The positive ROI has a very short break-even point (less than six months). But there are major caveats, and these results aren't guaranteed. What we expected: Based on conversations with Forrester clients, we expected to find that organizations plan to perpetually maintain both solutions. We believed we would prove such significant benefits that the costs of maintaining both would be worthwhile. We were wrong — at least partially. Two of the most unexpected takeaways:

- The dual-OMS approach has a big impact on topline revenue in the pre-purchase stages. In fact, businesses that gained revenue-increasing benefits from the secondary OMS saw the most significant results. Modules that add functionality such as enterprise inventory management had the biggest impact; the modern module additions gave organizations tools to lock in sales that they previously lost due to stock inaccuracies. The dual-OMS strategy allows brands to manage complicated inventory calculations and logic, such as managing "safety stock" more tightly.
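As a simple illustration of that kind of inventory logic (with made-up numbers, not figures from the study), available-to-promise stock can be computed by subtracting existing reservations and a safety-stock buffer from on-hand inventory before accepting an order:

```python
# Sketch: available-to-promise (ATP) with a safety-stock buffer.
# All figures are illustrative only, not drawn from the TEI study.

def available_to_promise(on_hand: int, reserved: int, safety_stock: int) -> int:
    """Units that can be sold without dipping into the safety-stock buffer."""
    return max(0, on_hand - reserved - safety_stock)

def can_accept_order(qty: int, on_hand: int, reserved: int, safety_stock: int) -> bool:
    """Accept the order only if it fits within available-to-promise stock."""
    return qty <= available_to_promise(on_hand, reserved, safety_stock)

# 120 units on hand, 30 already reserved, 15 held back as safety stock:
print(available_to_promise(120, 30, 15))  # → 75
print(can_accept_order(80, 120, 30, 15))  # → False (would eat into safety stock)
```

Tightening the safety-stock parameter is exactly the kind of calculation a modern inventory module makes tunable per SKU and channel rather than a single global setting.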
Organizations also served near-real-time inventory data into the shopping experience, which reduced order cancellations from overselling.
- Organizations unintentionally have begun a slow-motion "strangler" process. Most firms that used the dual-OMS strategy initially intended to maintain both OMSes indefinitely, but they saw that slowly adding new modules from the second solution was as effective as a replatforming initiative but at a nondisruptive pace. That is why three of the four interviewees said they ultimately intend to incrementally replace their primary OMS with the secondary one. In addition to the considerable benefits the secondary OMS brought, interviewees realized the add-on process had inadvertently jump-started their replacement. They won't move quickly, but with such a major step toward replacement complete, they now feel that the rest of the migration is possible.

The OMS market is currently in flux as longer-standing systems work to modernize their architecture. Meanwhile, vendors with open, modular architectures are developing functionality and enhanced experiences that push the market forward. In the full report, we dive into the details of how the organizations realized the ROI of their approach and how we calculated the economic benefits. We also note the risks of attempting similar strategies due to the varying needs of digital businesses. To learn more, read the full report here. Have questions or need support on how to embark on a dual-system strategy in OMS or commerce? Please book a guidance session with me!

Dual-OMS TEI: Companies Actually Get Their Money’s Worth

Apple Aims To Make Transparency The Core Of Its Trust Strategy

Last week, we made the call that Apple had created a trust issue in the UK, and possibly more widely, by breaking its commitment to protecting its users’ privacy: It withdrew Advanced Data Protection (ADP) for UK users to comply with the government’s demand to be able, in certain circumstances, to access all of a user’s data.

Is Apple Doing Its Business Out In Public To Try To Show Transparency?

But wait, there’s more:

On one hand … a transparently public vote. Apple took the decision of whether to remove or retain its DEI policies to its shareholders. Rather than stand firm and proclaim a commitment to diversity and inclusion, or capitulate and remove its public DEI commitment, Apple’s leadership put the vote to its shareholders, who overwhelmingly (97%) voted to keep DEI on Apple’s agenda.

On the other … an effort that might cause some to question its commitment to its values. After removing ADP in the UK, Apple took the case to the Investigatory Powers Tribunal, which in its own words is an “independent judicial body. We provide the right of redress to anyone who believes they have been the victim of unlawful action by a public authority using covert investigative techniques.”

In both instances, Apple had a choice: make a stand or comply. Was the former case weakness on the part of Tim Cook and his leadership team, unwilling to take a personal stand? Or a power play to demonstrate what they already knew shareholders would tell them? Was the latter a play to publicly demonstrate the stakes and amplify the conversation? Or a strategic misstep that Apple is now trying to rectify?

Transparency Matters

Since we’ve been invoking classical mythology in our recent blogs, let’s turn to the Greek historian Plutarch this week, who tells of a statue of the veiled Isis in the city of Sais in Egypt, symbolizing mystery and the unknowable. In both of Apple’s recent cases, the true motives are veiled and unknowable without a candid interview with Apple’s leaders.
As analysts, we look at multiple angles, but right or wrong, transparency matters — a lot. When we established our trust research in 2021 and 2022, we studied the impact of various levers of trust, such as transparency. Back then (and there’s no reason to think that this has significantly changed), we found that for a consumer technology company like Apple, customers who believed the company was transparent were more than twice as likely to buy additional products and services from it and almost four times as likely to forgive its mistakes, compared with those who didn’t believe the company was transparent.

Apple Has More To Lose In The UK Than In The US

In Forrester’s December 2024 Consumer Pulse Survey, we asked 540 UK and 551 US online adults, “Which of the following specific companies or organizations do you trust to keep your personal information and data secure?” We found that:

- UK consumers trust Apple more than the national government. Some 35% of UK consumers trust Apple with their personal data compared with 25% who trust the government, and high-income UK households lean significantly more toward trusting Apple.

- US consumers trust federal, state, and local government more than Apple. State (42%) and federal (40%) government garner more trust when it comes to protecting personal data than Apple does, with only 31% of US consumers saying they’d trust Apple to keep their data secure. Region matters in the US, with Apple’s trust level plummeting in the Midwest and drawing even with state and local government on the West Coast.

Transparency is a key driver of consumer trust, which in turn is a key driver of brand experience. Apple needs to step up its transparency game in the UK, because while UK consumers trust it for now, complacency is a killer. Learn more about the relationship between brand experience and customer experience at our CX Summit EMEA this year, June 2–4 in London.


Sycamore Partners’ $10 Billion Prescription For Walgreens

Walgreens Boots Alliance (WBA) announced that it will go private in a $10 billion deal with Sycamore Partners. The deal comes at a crucial time for Walgreens, which faces declining sales, profitability issues, and scrutiny for its role in the opioid crisis. The formation of WBA in 2014 was intended to deliver a wider range of products and services globally and make the company a leading player in the international healthcare retail industry. Signs of large strategic shifts in the retail healthcare space emerged late last year, which Forrester analyzed in a CX Cast episode. Recently, WBA announced, among other measures, the closure of more than 1,200 stores and a pivot to expand its specialty pharmacy business.

Reimagining The Corner Store: What Walgreens Looks Like Post-Deal

Sycamore plans to retain Walgreens’ core retail operations while selling off or taking public other parts of the company. WBA CEO Tim Wentworth notes that going private will enable the company to “realize [its] goal of being the first choice for pharmacy, retail, and health services.” After the split, the company will be divided into three distinct units:

- US Retail Pharmacy. This unit will focus on Walgreens’ core retail pharmacy operations in the US, which currently include approximately 8,500 stores. Because US consumers rank location as the top reason they choose a pharmacy, it will be imperative for Walgreens to strategically manage any future location closures.

- Shields Health Solutions. This unit will handle Walgreens’ specialty pharmacy operations, addressing the rapidly growing demand for specialty drugs for chronic conditions such as cystic fibrosis and hepatitis. In recent years, specialty drugs have accounted for more than half of the total prescription spending managed by health plans, employers, and government health programs.

- Boots. UK-based Boots includes 3,364 retail pharmacies and health and beauty stores in the UK, Mexico, Thailand, and the Republic of Ireland. As the UK has already seen some store closures, the company could close more locations as part of the broader strategy to streamline operations and focus on profitability.

In the US market, competitors Amazon and CVS Health continue to differentiate themselves. Amazon’s RxPass addresses price transparency, affordability, and convenience, offering Prime members eligible (“50+ commonly prescribed”) medications at home for a flat fee of $5 per month. CVS diversified through deeper vertical integration of insurance and pharmacy benefits and launched CVS CostVantage to lower prescription costs. A smaller footprint will require Walgreens to accelerate its digital transformation and improve CX if it is going to retain current customers and lure new ones away from these competitors. To win with customers, it will also have to enhance and build services that increase convenience and promote price transparency.

Let’s Connect

Forrester clients can continue the conversation by scheduling a guidance session with us. Not a Forrester client? The healthcare industry is facing rapid change. Reach out to learn more about what it means to have Forrester on your side and by your side. Now is the time to be bold.
