
Navigating the Top Drivers of Service Provider Switching

Why Do Organizations Switch Their Service Providers?

Companies today use service providers across all facets of the business. From modernizing technology and streamlining operational processes to strategic consulting and IT outsourcing, service providers deliver vast value, expertise, and increased efficiency to their customers, enabling them to spend more time and energy focused on their core functions.

These relationships between service providers and their clients are forged over time, typically years or even decades, and many providers promote the average length of time they retain clients as a performance indicator. However, what happens when customers start questioning the relationship’s value? Many customers are reluctant to switch providers because of the time it takes a new provider to understand their business, and because of the direct and indirect costs associated with switching. Sometimes, regardless of these drawbacks, customers still choose to move on. What developments usually lead customers to this decision?

IDC recently examined this question using its Services Path data, looking across all types of services, to understand why companies ultimately choose to make the switch. IDC’s Services Path program contains comprehensive data and guidance on the mindset and journey of services buyers, covering professional services, outsourcing services, managed services, and engineering services. The data set is based on a global survey of approximately 2,600 organizations, spanning all sizes and industries, and covers topics ranging from adoption, budgeting trends, and purchasing preferences to pricing and contract options and detailed customer satisfaction ratings for hundreds of vendors.

Services Path data shows that several factors consistently stand out as the primary reasons customers choose to switch service providers. The top three reasons, examined by role, are shared below (in rank order):

IT Respondents:
1. Enable more rapid digital transformation (DX) and modernization
2. Improve service quality and raise service levels
3. Lower cost

Line of Business Respondents:
1. Improve service quality and raise service levels
2. Enable more rapid digital transformation (DX) and modernization
3. Improve data analytics and decision support

Digital transformation, modernization, and improved SLAs/service levels are the top reasons driving companies to make replacements. However, there are some differences between IT and LOB personnel in what drives their decisions to switch. For example, IT places a much greater emphasis on lowering costs than line-of-business employees do. Conversely, LOB leaders view improving corporate sustainability as a significantly greater driver of replacements than IT does.

To help reduce customer switching risk and further bolster the stickiness of long-term relationships, services firms should make note of these top switching drivers and ensure they clearly demonstrate their ability to accelerate their clients’ DX initiatives and deliver high-quality, cost-effective services. Customers today are looking for strategic partners that can be leveraged broadly across the business, understand the nuances of their business, have deep expertise in their industry, and can provide fast proof of value. Firms that continually deliver on these needs will be well positioned to maintain long-term clients and minimize the risk of customer attrition.
To learn more about IDC’s Services Path program, watch this short video.

Contributing author: Jason Bremner – Research Vice President, Worldwide Services


Evolving Digital Marketing Strategies in the Era of AI Everywhere

IDC predicts that by 2025, the top 1000 organizations in Asia will allocate over 50% of their core IT spend to AI initiatives, leading to a double-digit increase in the rate of product and process innovations1. Moreover, according to a recent IDC survey, 35% of Asia/Pacific Japan (APJ) organizations will increase their IT budget to fund their AI Everywhere initiatives2. Both are proof of the ongoing AI revolution in the region.

As AI adoption continues to increase in the region, some organizations will find out the hard way that they cannot simply replicate what peers or competitors are doing. Worse, some will realize they cannot buy or build just any AI solution they fancy and hope for the best.

To help organizations assess their AI Everywhere readiness, IDC drafted a framework outlining the key elements of successful AI implementations. First, it is imperative for organizations to be strategic in deploying AI in preparation for industry disruptions. Second is the importance of creating a roadmap that prioritizes use cases based on an organization’s distinct requirements. These first two are about planning for success. Third is building the enterprise on a foundation of intelligent apps, data, and models to drive productivity, enhance operational efficiency, and create exceptional customer, partner, and employee experiences, among other benefits. Fourth is the importance of having a robust digital infrastructure able to support AI workloads at scale to fully harness AI’s potential. These two are about transforming people, processes, technology, business models, and data readiness. And finally, fifth is the necessity of a rigorous AI and data governance system to ensure data security and the safety and trust of users.

In short, organizations that are already on, or about to embark on, an AI journey will need a foundation of various digital innovations in place, and will need to engineer their AI adoption path, before they can ensure AI creates the most significant business impact for their organizations.

It goes without saying, then, that tech vendors must master their customers’ AI journeys and readiness. Tech vendors must guide tech buyers every step of the way in effectively aligning their business objectives and priorities to their AI Everywhere initiatives. At the same time, tech vendors must also understand that each customer has unique objectives that require unique solutions. To succeed, tech vendors must be equipped to help customers assess their markets better. Tech vendors should have the capabilities to demonstrate their AI solutions’ value, grow their business, create new opportunities, and elevate brand awareness.

IDC recently hosted a discussion on this topic with a group of leading global IT vendor CMOs in Singapore during a Sunrise Kopi briefing on AI Everywhere. At the event, IDC Research and IDC Custom Solutions had a lively discussion and advised them on how to improve their AI solutions’ go-to-market strategies while aligning these to their customers’ objectives and challenges. Below are some of the key points discussed during the session:

Generate demand in a crowded market. Almost every vendor is touting an AI solution. Tech vendors need to raise their brands above the noise of the over 200 AI use cases available in the market by demonstrating the business impact of their “real AI” solutions through effective content marketing.

Go beyond the CIO – convincing the CEO through other decision-influencer personas.
AI is more of a strategic initiative than an IT investment. The decision-making process involves multiple personas, with a need to convince the CEO and the board. Tech vendors need to understand the priorities of these different personas and position their solutions appropriately, not just focus on the CIO agenda. IDC shared some of its published research on the AI investment priorities and buyer behavior of different personas across industries, and on how to drive targeted marketing and enable sales teams to engage more effectively with these personas.

Communicate brand position clearly and change brand perception to that of an AI provider. There is a need to cut through the hype and demonstrate who has a “real AI” solution. We advised them on how to leverage thought leadership assets to show their understanding of the buyer journey and its corresponding issues, and to clearly demonstrate their solution’s business value and ability to generate tangible and significant business outcomes.

Shorten the decision-making process. Although the buying decision involves multiple personas, the CEO and CIO must engage and align on the strategy, and a lot of conversations need to happen between them. IDC discussed its AI Readiness Maturity Model, which brings all parties into a common assessment of where they are in their AI journey and aligns all stakeholders on a common decision pathway. This tool can also be used to measure partner/channel readiness to sell AI solutions and to open an engagement opportunity for sales.

Track the dark funnel. How do we track buyers who are clicking through and researching the topic but are not picked up by the sales/marketing funnel? IDC shared some of the TCO/ROI tools that can help tech vendors assess their customers’ environments and help draw them from the dark funnel into the sales/marketing funnel. In addition, IDC also talked about leveraging media amplification and lead-generation services to convert buyers in the dark funnel into marketing-qualified leads.

In summary, there is a lot of noise in the market that muddles tech buyers’ vision as they search for suitable, legitimate, and reliable AI solutions to address their unique business needs. With the right data, insights, and go-to-market strategies, tech vendors that plan, market, and sell their AI solutions well can clear tech buyers’ views and stand out as the preferred and trusted AI technology vendors of choice.

IDC Custom Solutions helps tech vendors enhance results with tailored insights and proven tools. Discover how IDC’s AI Use Case Discovery Tool can elevate your AI strategy—learn more here.

1 Source: IDC FutureScape: Worldwide Artificial Intelligence and Automation 2024 Predictions – Asia/Pacific (Excluding Japan)


The Rising Urgency for Telecom Innovation

High Expectations Collide with Market Realities

Telecom service providers are facing historic challenges amidst shifts in both enterprise and consumer demand, as well as challenges in transforming from “connectivity providers” into digital platform players. Historically, telecom service providers have championed connectivity at scale. In past decades, this proved a profitable strategy, with the value of connectivity garnering consistent year-over-year revenue growth and profits. However, recent years have seen telecom providers grapple with a host of challenges, including industry competition, commoditization of services, and inflexible IT systems that have made it hard for them to swiftly innovate and compete against new threats.

Further, while network traffic continues to rise, predominantly driven by video apps, service providers have been unable to effectively monetize this traffic growth. The disconnect between revenue growth and network traffic growth remains one of the top challenges globally for service providers as they hunt for ways to reinsert themselves and position connectivity not just as a commodity, but as a value-based service that can be delivered to support a range of use cases and verticals.

In response, many forward-thinking telecom providers have made a purposeful decision to focus their technology offerings and ecosystem partners on targeting digital engagement and new revenue opportunities, and to rearchitect their technology stacks to align with hyperscale cloud models as a means to simultaneously control costs and position for service agility longer term. Even so, third-party entities, including CPaaS, cloud, and other digital platform players, have moved in to largely siphon off these digital opportunities while curating vast developer ecosystems, once again relegating many telecom providers to a connectivity-only role.

New Tools in the Arsenal Create New Monetization Opportunities for Telcos

Amidst this push and pull of telecom service provider efforts, a new opportunity has emerged, driven by the promise of standalone (SA) 5G networks and API exposure capabilities, which empower telecom providers to reinsert themselves within the digital landscape by unlocking the ability to more easily sell and scale customized, programmable connectivity designed to be packaged and consumed by application developers. Unsurprisingly, hyperscale cloud providers, CPaaS companies, and systems integrators have also positioned themselves for this new market opportunity by aligning with industry consortia (e.g., CAMARA, GSMA Open Gateway) that are championing global standards; however, it remains to be seen where, how, and by whom value will ultimately be created and monetized.

Figure 1: Emerging Telecom and Network API Ecosystem
Source: IDC, 5G Exposure and Network APIs: How Will the Telecom Ecosystem Capture New Opportunities with Developers?

As part of these market developments, the worldwide IDC team has spent the past couple of years building a methodology to size this opportunity and define ways the telecom API ecosystem can work together to enhance this emerging market.
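To make the idea of programmable connectivity concrete, below is a minimal sketch of how an application developer might request a temporary quality-of-service boost through a CAMARA-style Quality-on-Demand API. The base URL, token handling, and exact field names are illustrative assumptions rather than any specific operator’s implementation:

```python
# A minimal sketch of consuming programmable connectivity through a
# CAMARA-style Quality-on-Demand (QoD) API. Endpoint, credentials, and
# field names below are illustrative assumptions, not a real operator API.
import requests

API_BASE = "https://api.example-telco.com/qod/v0"  # hypothetical operator endpoint
TOKEN = "replace-with-oauth-token"                 # obtained via the operator's auth flow

def request_qos_session(device_ip: str, app_server_ip: str,
                        profile: str = "QOS_L", duration_s: int = 3600) -> dict:
    """Ask the network for a temporary boosted-QoS session between a
    device and an application server, as described above."""
    payload = {
        "device": {"ipv4Address": {"publicAddress": device_ip}},
        "applicationServer": {"ipv4Address": app_server_ip},
        "qosProfile": profile,   # e.g., a low-latency or high-bandwidth tier
        "duration": duration_s,  # session lifetime in seconds
    }
    resp = requests.post(f"{API_BASE}/sessions", json=payload,
                         headers={"Authorization": f"Bearer {TOKEN}"}, timeout=10)
    resp.raise_for_status()
    return resp.json()  # includes the session id used to extend or delete the session

if __name__ == "__main__":
    session = request_qos_session("203.0.113.10", "198.51.100.20")
    print("QoD session created:", session.get("sessionId"))
```

The point of exposure APIs like this is that the developer never touches network internals; the operator packages the capability as an ordinary web API that can be metered and monetized.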
Telecom Service Providers Can Capitalize on AI and GenAI to Improve Business Results and Potentially Reshape Their Market Role

While APIs represent one way service providers can capture new monetization opportunities, artificial intelligence (AI) presents another avenue to drive business results. More specifically, AI can be inserted into the telecom technology stack to improve TCO, enhance service agility (e.g., AIOps), and improve the customer experience (CX) lifecycle. As telcos move toward future network architectures governed by cloud-native principles, this ushers in a much greater role for automation and orchestration across various physical, virtual, and containerized network functions, as well as AI-informed operations and monetization platforms. This in turn raises the importance of adopting AIOps within network operations; however, network-related AIOps brings its own unique set of challenges for telecom service providers, as well as a vendor community that overlaps with, but does not entirely match, the more generalized ITOps vendor roster.

Meanwhile, GenAI has emerged as a powerful tool that enables telcos to embrace some of the benefits of AI while simultaneously investing in the internal skill sets and capabilities required to embrace AI more broadly. The graphic below highlights some of the key use cases IDC envisions for GenAI across telco environments.

Figure 2: GenAI Telco Use Cases Across Telco Environments

While this graphic provides an optimistic outlook for the full scope of GenAI’s impact on telecom service providers, the reality is that it will take time, effort, and AI partners for telecom providers to realize gains from AI. Indeed, with AI curators racing to drive AI innovation across multiple environments (e.g., hybrid and multicloud), it is likely that multiple models will become prevalent, in which telecom service providers serve a dual purpose by becoming some of the strongest consumers and distributors of AI and GenAI going forward. Further, interest in AI applications is also prompting service providers to build near-term roadmaps clarifying how enterprise customers can leverage their core and edge assets to support emerging use cases (e.g., AI inferencing at the edge) while reinforcing connectivity as the foundation of AI-enabled applications and services. Indeed, while AI is being emphasized by many organizations, it will require a global distribution mechanism to help it scale. Hyperscale cloud providers are top of mind, but telecom service providers can also play a role in connecting AI apps.

Overall, it is a critical time for telecom providers, and their technology vendors, to synchronize on key priorities and investment strategies, particularly in light of historical struggles to optimally monetize telecom networks. Doing so can enable them to rearchitect a brighter future for telecom monetization and set them up for a key role in a digital, AI-centric world.

For a deeper dive into these topics, watch IDC’s July 10th webinar, “Revenue Enablers for the Future Telco: APIs, AI, and Emerging Tech”.


Transform Customer Experience with Customer Data Platform and Generative AI

With each positive interaction a customer has with a brand, they expect similar or higher levels of service in the future. Unfortunately for brands, there is no finish line, only continuous improvement to create better experiences. Brands realize that putting the customer at the center of their business is a way to deliver consistent, personalized, and timely engagement across digital and physical channels and across marketing, sales, service, and support functions.

According to IDC’s September 2023 Future Enterprise Resilience and Spending (FERS) survey, respondents ranked delivering great customer experience as their top focus area for deriving customer value. Brands have a clear mandate to augment personalized experiences and to acquire and retain customers through customer experience (CX) technology investments. To fulfill that mandate, brands should first prioritize continuous integration of dynamic data across touchpoints and deliver high-quality data using customer data platforms (CDPs). IDC’s July 2024 Future of Customer Experience (FoCX) Survey identified that over the next 12 months, 77.8% of respondents plan to increase technology spending on CDPs.

Second, using AI- and GenAI-driven processes and tasks will help brands better identify and segment audiences, uncover new levels of customer insight, and create effective engagements. IDC’s April 2024 FERS survey shows that spending on GenAI-related infrastructure, models, applications, and services is expected to increase by an average of 64% across all companies. The survey also shows that companies reporting an 80% success rate with their GenAI proof-of-concept efforts ranked “access to required high-quality data sets” as a top-five success factor.

Finally, acquiring the customer data that fuels personalization and engagement is back in the news with Google’s latest announcement that it won’t be deprecating third-party cookies. Google announced that it is introducing a new experience in Chrome that lets users make an informed choice across their browsing habits. While regulators decipher the plan and users weigh the choices they face, organizations should continue to investigate what zero-, first-, and second-party data they need to build segments and models with trust. Customers strongly prefer brands that are transparent and prioritize their data security and privacy, leading to a stronger, trust-based relationship.

Customer Data Platforms to Enhance Customer Experience

According to IDC’s 2024 CX Path Survey, the top business outcome organizations want to achieve from implementing CDPs is enabling customers to curate contextual experiences. CDPs provide high-quality data and analytics for this and other use cases involving growing revenue streams and delivering differentiated experiences with high-value business outcomes. CDPs must include the following key components (a minimal code sketch of the aggregation step appears at the end of this article):

Aggregation: Ingest, integrate, cleanse, resolve, and consolidate individual-level customer data from multiple sources and formats, and determine which attributes and dimensions to include in a profile or segment.

Engagement: Activate segments for campaigns, advertising, and messaging across different channels and audience groups defined by multiple attributes and dimensions. Includes next best action, recommendations, and similar capabilities based on end-user choices and preferences synchronized across channels.
Insights: Descriptive, diagnostic, and predictive analytics to understand the complexities of the customer journey, predict future behaviors, and tailor marketing efforts. Augment these with GenAI to drive automation, improve productivity for users engaging with CDPs, and improve self-service.

Orchestration: A shared set of services that delivers a common orchestration layer for workflows, event management, scheduling, and rules. A solid framework for data governance and AI governance will help balance personalization against privacy, trust, and transparency.

GenAI and CDPs to Drive Productivity and Personalization

While vendor roadmaps for AI are advancing, narrow down which GenAI use cases you want to pursue and what it takes to implement the prioritized ones in the context of CDPs. In parallel, define and develop the metrics and analysis required to justify investment in the selected use case or two. Organizations should use GenAI to improve productivity for CDP users and to deliver the personalization needed to meet rising customer expectations in the following ways:

Custom GenAI models trained on CDP data can generate personalized content such as product descriptions, custom messaging, landing pages, and email copy.

Combining retrieval-augmented generation with GenAI models provides grounded, trusted responses by extracting information from the CDP and other knowledge repositories.

Conversational AI assistants enable marketers to query and interact with data, or describe the customer journeys they want to create, using natural language, making the work more intuitive and efficient.

Dynamic segmentation allows real-time adjustments to customer segments based on behaviors and interactions analyzed by GenAI models alongside marketing campaigns.

Synthetic data generation helps augment datasets where actual data is sparse or limited, enhancing the robustness of AI models. This approach is particularly useful where data privacy concerns limit the availability of real data.

Prepare for the Next Phase of Customer Experience

According to IDC FutureScape 2024 Predictions, customer data platforms will deliver high-quality data for predictive AI and GenAI, activating 80% of real-time personalized customer interactions at scale for G2000 firms, with four times the engagement gains, by 2026. Organizations need to identify primary use cases that highlight the growing importance of unified customer data beyond marketing and across sales, customer service, and field service. They also need to quickly build a picture of the full journey and behaviors exhibited by customers by accessing intent data, service and support data, and customer interactions captured in unstructured sources in a secure and trusted manner. Finally, understand what is practical today with GenAI, how it will automate CDP tasks and workflows to make marketers more productive, and how to use it to build personalized content for activation in the channel the customer prefers.

Is your firm ready to take the next steps to meet rising customer experience expectations? Organizations need to prioritize investments in customer data platforms that deliver high-quality data for GenAI use cases that add to marketing productivity, enable CDP automation, and adopt trust- and governance-based marketing programs to drive personalization at scale and streamline customer experiences.

Learn what matters most to your customers with IDC’s AI Use Case Discovery Tool—find out more.
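As promised above, here is a minimal sketch of the CDP aggregation component: consolidating individual-level records from multiple sources into unified profiles. The field names and matching rule (an exact, normalized email match) are deliberate simplifications; production identity resolution is typically probabilistic and multi-key:

```python
# Toy illustration of CDP-style aggregation: ingest customer records from
# multiple sources, normalize them, and resolve them into unified profiles.
from collections import defaultdict

crm_records = [{"email": "Ana@Example.com", "name": "Ana Perez", "source": "crm"}]
web_events = [{"email": "ana@example.com", "last_page": "/pricing", "source": "web"}]
support_tickets = [{"email": "ana@example.com ", "open_tickets": 1, "source": "support"}]

def normalize(record: dict) -> dict:
    # Cleanse step: trim whitespace and lowercase the join key.
    rec = dict(record)
    rec["email"] = rec["email"].strip().lower()
    return rec

def resolve_profiles(*sources):
    # Consolidate step: merge every record sharing an identity key into one
    # profile, keeping track of which sources contributed to it.
    profiles = defaultdict(dict)
    for source in sources:
        for record in map(normalize, source):
            key = record.pop("email")
            profile = profiles[key]
            profile.setdefault("sources", set()).add(record.pop("source"))
            profile.update(record)  # later sources enrich the profile
    return profiles

if __name__ == "__main__":
    unified = resolve_profiles(crm_records, web_events, support_tickets)
    print(unified["ana@example.com"])
    # -> one profile combining CRM attributes, web behavior, and support data
```

A unified profile like this is what the engagement, insights, and orchestration layers then activate across channels.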


Why Supercloud Architectures Could Upend Cloud Computing

What Is Supercloud?

Supercloud is an approach to cloud computing that abstracts underlying cloud platforms from applications so completely that applications can move seamlessly between clouds – or even operate across multiple clouds at the same time. Thus, if you were to adopt a supercloud strategy, you’d build a cloud architecture that lets you migrate an application instantly from, say, AWS to Azure, without having to reconfigure the application or its environment in any way. You’d also be able to do things like host some of the application’s microservices on Azure and others on Google Cloud Platform (GCP) at the same time.

Supercloud could bring massive disruption to the cloud computing industry because it opens up a host of opportunities that aren’t viable under traditional multicloud architectures.

Supercloud Versus Multicloud

To explain why supercloud could turn out to be such a big deal, let’s first talk about how it’s different from traditional multicloud. As of 2024, multicloud architectures – which mean using multiple clouds at the same time – are commonplace. IDC’s March 2024 Cloud Pulse Survey (n = 1,350) shows that 74% of cloud buyers have multicloud strategies. It’s no longer a big deal to use multiple clouds.

However, traditional multicloud architectures simply involve using one cloud platform to host some workloads and other clouds for other workloads. They don’t deeply integrate cloud platforms with one another. As a result, with traditional multicloud, migrating an app from one cloud platform to another is typically a complicated process because you have to reconfigure the application to run in the new cloud. This entails tasks like rewriting identity and access management (IAM) rules, reconfiguring networking, and selecting and setting up new compute and storage services.

Likewise, the idea of hosting applications across clouds at the same time is virtually unheard of, even for organizations that have long used multiple clouds. It’s very rare to try to have an application frontend run in one cloud while its back-end components are hosted on a different cloud, for example. Network latency would present a big challenge if you tried this, and you’d also need to implement application logic that allows your internal application services to connect across clouds, which would significantly complicate application development and management.

But supercloud could change all of this. By making underlying cloud platforms irrelevant from an application’s perspective, supercloud has the potential to take multicloud to a whole new level.

Benefits of Supercloud

Specifically, supercloud architectures could deliver benefits like the following (see the placement sketch after this list):

Maximizing application reliability by hosting complete instances of an application on multiple clouds at once. Even if an entire cloud crashed, the app would keep running.

Optimizing cloud costs by making it possible to migrate to a different cloud instantly if better pricing becomes available there.

Eliminating the need for teams to learn the intricacies of multiple cloud platforms. With supercloud, cloud service vendors’ tooling and configurations would become less important because they’d be abstracted from IT operations.

Improving application performance by making it easy to distribute application instances across cloud platforms and regions. This would reduce latency and speed application responsiveness, resulting in a better user experience.
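To make the cost and performance benefits tangible, here is a minimal sketch of the placement logic a hypothetical supercloud control plane might run: pick whichever cloud currently offers the best price for an equivalent instance, subject to a latency budget. This assumes the workload is fully portable, which is exactly the property supercloud promises; the provider names and prices are invented inputs:

```python
# A toy "supercloud scheduler": because the application is assumed to be fully
# abstracted from each provider, placement becomes a pure optimization over
# current prices and observed latency.
from dataclasses import dataclass

@dataclass
class CloudOffer:
    provider: str
    hourly_price_usd: float   # price for an equivalent instance shape
    region_latency_ms: float  # latency to the app's main user base

def choose_placement(offers: list[CloudOffer], max_latency_ms: float = 50.0) -> CloudOffer:
    """Return the cheapest provider that still meets the latency budget."""
    viable = [o for o in offers if o.region_latency_ms <= max_latency_ms]
    if not viable:
        raise RuntimeError("no cloud meets the latency budget")
    return min(viable, key=lambda o: o.hourly_price_usd)

if __name__ == "__main__":
    offers = [
        CloudOffer("aws", 0.096, 18.0),   # illustrative prices, not real quotes
        CloudOffer("azure", 0.089, 22.0),
        CloudOffer("gcp", 0.092, 61.0),   # too far from users; filtered out
    ]
    best = choose_placement(offers)
    print(f"migrate to {best.provider} at ${best.hourly_price_usd}/hr")
```

Today this kind of decision is mostly theoretical, because, as the next section explains, nothing makes the migration itself instant.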
How Realistic Is Supercloud?

In theory, supercloud would open amazing new doors in the realm of cloud computing. But is it actually feasible in practice to build a supercloud architecture?

The answer remains unclear. Although the supercloud concept has generated a bit of chatter over the last year or two, no vendor has come close to developing solutions for actually creating a supercloud. There are, of course, plenty of cloud monitoring, management, and security tools that support multiple cloud platforms. To an extent, they smooth the process of operating applications across clouds. But they certainly don’t erase the barriers to instant cloud migration or cross-cloud operation. Being able to use the same tool to monitor applications that run in different clouds is quite different from having apps that work exactly the same no matter which cloud hosts them.

There are also some application hosting platforms that abstract applications from underlying infrastructure in ways that could, in theory, help to build superclouds. Kubernetes, the open source orchestration platform, is a prime example. Theoretically, you could build a Kubernetes cluster in which some nodes are virtual servers running in one cloud, while other nodes are servers hosted in a different cloud. But this is not what Kubernetes was designed for, and multicloud Kubernetes clusters are very rare. Building them requires grappling with complex technical issues, like the difficulty of keeping the various parts of a Kubernetes cluster in sync when they are distributed across multiple clouds and rely on the internet, instead of superfast local networks, to communicate.

So, while we do have some solutions that gesture toward a supercloud future, building a supercloud today would be a very fraught and clunky experience, at best.

What It Will Take to Make Supercloud a Reality

But the hurdles to supercloud don’t seem impossible to overcome. If cloud service providers were to collaborate on shared standards for configuring and using cloud infrastructure, building a supercloud would become quite easy. Imagine, for instance, that instead of having to write different IAM and networking rules for each cloud you use, or select different types of cloud server instances, you could write rules or select infrastructure that worked on any cloud. Technically speaking, this wouldn’t be too hard to do, if cloud providers got on board.

The challenge, of course, is that cloud providers currently have little incentive to make it easier for customers to use competitors’ platforms at the same time. Amazon doesn’t stand to gain anything by making it easy for its customers to migrate AWS-based apps instantly to Azure or GCP, for example.

Another possibility is for a single vendor to build a supercloud platform that abstracts underlying clouds from applications. A third-party solution could translate between different cloud service providers’ tooling and services in ways that enable


Hire for Potential, Not for Skills

Faced with the need to staff up quickly to carry out a generative AI initiative, Sanjay Srivastava did the logical thing. “We brought in kindergarten teachers to do prompt engineering,” says the chief digital strategist at Genpact, a professional services firm. Although the decision might appear unorthodox, it’s consistent with Srivastava’s fundamental outlook. He is a strong believer in hiring people from a variety of backgrounds who possess a vital trait: the ability to learn.

“My whole view of life is that we cannot know the skills we will need tomorrow. So we need curiosity, humility, and the desire to learn the answers, not to already know them,” he explains. “The people we hired from a very different walk of life worked out the best,” he adds.

Srivastava has discovered what many IT leaders are learning: in a time of rapid technological change, traditional hiring practices — posting job openings, sifting through reams of resumés for relevant experience, making offers, and waiting for counteroffers — are falling short.

“To stay competitive in the midst of widening IT skills shortages, enterprises must ensure a culture of continuous learning. All employees from entry- to C-level must have the drive and capability to keep learning, to keep stretching,” says Gina Smith, IDC research director, IT skills for digital business.

Melissa Swift, vice president for workforce and organizational change acceleration at Capgemini Invent, agrees. “You’re on a conveyor belt. Things move. If you take six months to hire someone, you might only have six months to use those treasured skills,” says Swift, who counsels clients on how to rebuild their workforces to reengineer transformation.

Like Srivastava, Swift asserts that the ability to learn is more valuable than what a person already knows. But finding a “learn-it-all” rather than a “know-it-all” is not a simple matter.

“You have to be willing to tolerate a bit of non-linearity. When you don’t understand how they went from there to here [in their work history], that might be an indicator of learning agility,” she suggests.

Trust Your Gut?

Where seeming intangibles, such as curiosity, are concerned, you might think that gut instinct would play a large role. But, according to Swift, that can lead to bias when hiring managers gravitate to an applicant because they appear outwardly similar to a previously successful hire. When traits are difficult to measure, objectivity becomes more important. Swift recommends carefully evaluating for learning agility. “Look for a test that has been psychologically and statistically validated. It needs to have research-driven rigor,” she advises.

Because the ability to solve business problems is the most desirable trait, Srivastava says interviewers should ask applicants direct questions like, “Tell me about three problems you ran into and how you solved them.” He says interviewers should seek affirmative answers to these questions: “Are you seeking insights from others? Are you interrogating data? Are you testing your own hypotheses or assumptions? Are you fundamentally reexamining your point of view?”

Hidden Passions, Overlooked Winners

Swift says unusual passions outside of work can be a tip-off to learning agility. “Are they into reading about Teddy Roosevelt? Do they like crochet or horseback riding?” She adds that some jobs, like sales and teaching, inculcate traits, such as the ability to think positively and communicate clearly, that pay off in many different fields.
She also advises looking closely at a company’s current employees, some of whom may have the requisite learning agility but remain undiscovered because of the penchant of hiring managers to look outside for talent. “It’s the shiny object syndrome. Internal talent pools are chronically neglected,” says Swift.

And internal employees who don’t call attention to themselves could be overlooked difference-makers. “Look for introverts; there is something in our culture that does not value introversion,” she says. In her experience, one woman was very introverted in meetings, but afterward would come up with ideas that were clearly “better and smarter” than what other team members offered.

It’s Not About the Money

There are cases in which hiring for potential can generate significant savings. Data science, for example, is an area in which experts are demanding, and getting, inflated salaries. “People are asking seven figures; however, you might be able to upskill people into those roles,” says Swift.

While some might think that hiring for potential would save money compared with hiring the person with the longest resumé, Srivastava says cost savings are beside the point. “What is the opportunity cost of having the wrong person on the job? If you’re not going to be part of the new economy, you have already lost the game. I would change the metrics of how we measure success from the cost of compensation to the opportunity cost of missing the next wave.”

Srivastava has learned the lessons of outside-the-box hiring from first-hand experience — his own. “I never went to school to become a CDO,” he says. Born in India, he studied aerospace engineering, then moved to the U.S. to take a sales job. However, he decided to become a technology entrepreneur, building several startups that were acquired.

GenAI is a perfect example of a technology that seemingly came out of nowhere, he says, a harbinger of future transformational waves that will make today’s expertise obsolete. “The skills we will need for the future, we don’t know what they are,” says Srivastava. “Look at prompt engineering. Who knew we would have to hire for it?”

Learn how three IT organizations are modernizing their skills and talent development practices.


From Discard to Demand: The Growing Popularity of Used Smartphones

State of the India Smartphone Market

India ships around 145-150 million new smartphones per year for the domestic market, ranking it second globally after China in annual shipment volume. There are approximately 650 million smartphone users in India, or about 46% smartphone penetration. No other market of this size has such huge untapped potential, making India a very attractive market for all smartphone ecosystem participants, from brands to component makers.

India’s smartphone market grew modestly in 2021, coming out of a challenging 2020 (due to pandemic-led shutdowns). This growth was driven by the need for a better device for remote learning/work and increasing media consumption on the go. However, in 2022 and 2023 the market faced challenges because of the rising average selling price (ASP) of devices (growing at a CAGR of 38% from 2020 to 2023), improving device quality, and continuing income stress, especially in the mass consumer segment. This in turn has elongated the average smartphone replacement cycle in India from 24 months to almost 36 months currently, further restricting the growth of the new smartphone market.

Why Are Consumers Choosing Used Smartphones?

All of the above factors have contributed to the increasing popularity of used smartphones in the past few years. As the quality of smartphone hardware improves, rising device prices are keeping new smartphone models out of reach of the mass segment. The aspiration to own a good device without paying much makes used smartphones a very attractive choice for consumers wanting to upgrade, and even for first-time smartphone users.

Another important factor in the popularity of used smartphones is the rising preference for 5G smartphones. As of now, only approximately a third of the 650 million Indian smartphone users have a 5G smartphone; the rest are still using 4G phones. However, the price differential between 4G and 5G smartphones and the lack of wide availability of 5G models under INR 10K (US$125) are restricting the upgrade to a 5G device, forcing many consumers to go for mid-priced used smartphones.

According to the latest IDC research (IDC Used Device Tracker), India ranks third globally in annual used smartphone volume, after China and the USA, and is one of the fastest-growing markets. In 2024, IDC forecasts that 20 million used smartphones will be traded in India, a YoY growth of 9.6%, outpacing the 5.5% YoY growth of new smartphone shipments (154 million units forecast for 2024).

Apple and Xiaomi Are the Top Choices!

The “premiumisation” of India’s smartphone market, or more aptly the rising aspiration of the Indian consumer to upgrade to a mid-premium or premium phone, is also contributing to the popularity of the used smartphone space. While Apple has seen healthy growth of new iPhone shipments in India in the past few years, it is also leading the used smartphone space, capturing a quarter of the market as per the IDC Quarterly Used Device Tracker. Everyone in India wants to buy an iPhone because of its premium brand positioning and status-signaling value, but not everyone can afford one. The used phone market comes to the rescue of many such aspirational consumers, who go for previous-generation models like the iPhone 11, 12, and 13 series.

Xiaomi led India’s new smartphone market for 20 straight quarters from 3Q17 to 3Q22. As a result, it has a huge user base, which is reflected in the used smartphone market as well. Xiaomi sits in second position, followed by Samsung.
These top three brands combined make up around two-thirds of the used smartphone market in India.

Who Are the Market Players?

IDC’s used smartphone research tracks both secondhand and refurbished smartphones traded via organized refurbished-device players in the market; it excludes peer-to-peer sales. In India, several startups in this space, like Cashify, Budlii, Instacash, and Yaantra, have tried to organize this hitherto largely unorganized market. With their efforts around marketing and an omnichannel presence across both online and offline counters, these players have been able to build confidence and trust among consumers regarding the quality of the used smartphones on their platforms. Cashify is one of the biggest platforms in this market, with over 200 stores in 100 cities, many in Tier 2 and 3 towns. Yaantra is owned by Indian e-commerce giant Flipkart and operates under the Flipkart Reset brand, focused mainly on its online portfolio. In the offline space, the company has partnered with Airtel to be available in the telco’s stores in only two Indian cities for now (Delhi and Hyderabad).

From Here, the Only Way Is Up!

IDC forecasts the used smartphone market in India to grow at an 8% CAGR over the next five years, reaching 26.5 million units per annum in 2028. It is evident that the used smartphone market in India is gradually taking shape, with interesting channel play by key trading players making smartphones more affordable to a larger audience. This is also reassuring for the ever-discerning Indian consumer, who can explore buying a used smartphone without worrying about the quality of the device or spending too much.

From an overall market perspective, growth in the used smartphone market in India can certainly be a factor in increasing smartphone adoption, create a parallel revenue stream for channel players, help vendors address e-waste concerns around discarded devices, and generate employment (skilled/unskilled). This market can certainly play a major role in achieving the goal of bringing a billion Indians into the smartphone fold in the next few years.


The Four Key Issues CrowdStrike Exposed: CIOs’ Next Steps

IDC’s Quick Take

The recent IT outage caused by silent updates pushed out by CrowdStrike to its Falcon agent exposes an issue that is at the heart of how the IT industry operates. It highlights the contrasting trust and attestation mechanisms taken by operating system vendors like Microsoft, Apple, and Red Hat in allowing their ecosystems of independent software vendors (ISVs) direct access to certain parts of the operating system stack, especially software that can potentially severely impact the system kernel. While this issue impacted Windows devices – network- and human-centric – managed by CrowdStrike, none of the iOS, macOS, or even Linux devices were affected. That is very telling, and it should compel vendors like Microsoft and Apple to take a long, hard look at what “openness” means in the wake of regulations like the EU’s Digital Markets Act (DMA). It should also compel the largely Windows-dependent customer base to redefine their long-term cyber-recovery strategy, including a shift to more modern operating system environments.

Event Highlights

On July 19, 2024, at 04:09 UTC, a sensor configuration update was released by CrowdStrike for Windows systems as part of the Falcon platform’s protection mechanisms. This update contained a logic error that led to a “blue screen of death” (BSOD), affecting certain systems. A remediation was implemented by 05:27 UTC on the same day. According to CrowdStrike, the impact of this event was specific to customers using Falcon sensor for Windows version 7.11 or higher.

It needs to be pointed out that to make their endpoint protection products effective, vendors like CrowdStrike require access to system files. Any configuration issues with these files can lead to unpredictable behavior at best and leave the system in an unrecoverable state at worst. The resulting outage caused disruptions to airlines, businesses, and emergency services, and it could be the largest IT outage in history. In time, we will know whether the scale and impact of the outage reach the level of the “NotPetya” cyberattack in 2017. At the time of writing, two days later, airlines – the biggest group of affected enterprises – were still reeling from the outage.

It is important to note that this incident was not caused by a cyberattack but rather by a routine update to configuration files, often referred to as “Channel Files.” In the context of the Falcon sensor, Channel Files are integral to the behavioral protection mechanisms that safeguard systems against cyber threats. These files are dynamically updated multiple times daily, and the Falcon sensor’s architecture, designed to incorporate these updates seamlessly, has been a foundational component. In Windows environments, Channel Files are typically located within the directory path C:\Windows\System32\drivers\CrowdStrike, identifiable by their “C-” prefix. Each file is uniquely numbered, serving as an identifier that aids in the management and deployment of updates. For instance, Channel File 291, denoted by the filename prefix “C-00000291-”, plays a crucial role in how Falcon assesses the execution of named pipes, a standard method for interprocess communication within Windows systems. The significance of Channel File 291 came to the forefront during an update aimed at neutralizing the threat posed by malicious named pipes associated with prevalent command-and-control (C2) frameworks. The update introduced a logic error, leading to a system crash.
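To make the naming convention concrete, here is a small sketch that enumerates Falcon Channel Files and extracts their numeric identifiers, based only on the directory path and “C-” prefix described above. It is an illustration for a Windows host running the Falcon sensor, not a CrowdStrike-supplied tool:

```python
# Illustrative only: list Falcon "Channel Files" using the documented
# location and "C-" naming convention (e.g., C-00000291-...), pulling
# out each file's numeric identifier.
import re
from pathlib import Path

CHANNEL_DIR = Path(r"C:\Windows\System32\drivers\CrowdStrike")
PATTERN = re.compile(r"^C-(\d+)")  # leading "C-" followed by the file number

def list_channel_files(directory: Path = CHANNEL_DIR) -> dict[int, str]:
    files = {}
    for path in directory.glob("C-*"):
        match = PATTERN.match(path.name)
        if match:
            files[int(match.group(1))] = path.name
    return files

if __name__ == "__main__":
    for number, name in sorted(list_channel_files().items()):
        flag = "  <- named-pipe logic (Channel File 291)" if number == 291 else ""
        print(f"{number:>9}  {name}{flag}")
```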
IDC’s Point of View

For historical context, this is not the first time something like this has happened. In 2010, McAfee had an issue with a “DAT” file: DAT file version 5958 caused a reboot loop and loss of network access on Windows XP SP3 systems due to a false positive that misidentified the Windows binary “svchost.exe” as the virus “W32/Wecorl.a”. In 2017, Webroot released an update that misidentified Windows system files as malware and Facebook as a phishing site; this update quarantined essential files, leading to instability in numerous computers. In 2021, a mass internet outage was caused by a bad software update at Fastly. There have been many others.

This situation – which is not unique to CrowdStrike – exposes four key issues that are fundamental to the IT industry and its complex ecosystem of ISVs. First, it exposes the fact that by giving its ecosystem ISVs direct access to the system kernel, the operating system vendor essentially removes itself from the trust value chain; the trust value chain then includes only the ISV and its customers. Second, the process of silent updates, in which customers implicitly rely on the QA process employed by the ISV, leaves them inadequately prepared for drastic and timely intervention in the case of mass outages that leave systems in an unrecoverable state. Third, this situation is a wake-up call for the industry on what a system of checks and balances means and what kind of accountability operating system vendors, ISVs, and customers must accept to avoid this kind of situation repeating itself. And fourth, this situation indirectly exposes the fragile, human-centric Windows stack, which, unlike modern network-centric Unix and Linux operating systems, cannot robustly manage exception errors, instead defaulting to a manually recoverable state.

The first point exposes the contrasting approaches taken by leading operating system vendors. On one side are vendors like Apple that take a very prescriptive and closed approach to endpoint protection, making it almost impossible for any ecosystem ISV/provider like CrowdStrike to push out configuration changes that could catastrophically impact the operating system (e.g., iOS or macOS) kernels. Apple has been a fierce advocate of a “walled garden” approach, implementing stringent attestation mechanisms to ensure that no one – and we mean no one – gets to modify the system kernel without express approval from Apple. This has made Apple run afoul of the European Commission and its hawkish regulatory push to open up operating systems under the premise of fair competition. On the other hand, Microsoft takes – or, more importantly, was forced to take – a more open approach, enabling at least a dozen ISVs in offering modern endpoint protection


Sales Planning: Uncovering Blind Spots and Eliminating the “Swivel Chair Effect”

When selecting a restaurant, you’d likely ask friends for recommendations, do an online search of reliable and trusted sites, and check out the menu before making a reservation. It’s this data that allows you to sort through all the choices, make an informed decision, and have a memorable meal.

If only it were that easy when it comes to sales planning, prospecting, and account segmentation strategies. Traditionally, these tasks have relied on a certain amount of subjective opinion, historical CRM data that may be outdated, and independent, third-party data that may — or may not — be reliable for your cut of the market. In an ever-evolving B2B market driven by outcomes, you don’t have time to waste on guesswork or unreliable data when you’re prioritizing sales efforts.

The Dizzying Effects of the “Swivel Chair Approach”

For many, the sales planning and market evaluation process involves gathering and reconciling historical data from internal and external sources. Without consistent and reliable data, you risk relying on inaccurate information — and that can lead to wasted effort, suboptimal sales resource allocation, missed opportunities, strategic misalignment, and low win rates. Bottom line? Sellers are not set up for success and, ultimately, your business pays the price.

The “swivel chair approach,” in which disparate data is gathered and then manually transferred between processes, can lead to inefficiencies and an incomplete — or incorrect — view of the opportunities that could have led to big revenue wins.

Inadequate data has very real consequences. Consider the customer service platform vendor forced to make assumptions about which segments to go after and which companies to target based only on its own limited historical data. Accurate information could uncover inherent blind spots in its planning and open new revenue streams for the business. Or consider a cybersecurity platform vendor that used a freelance consultant to build a propensity-to-buy model but now can’t update it with new data. The result? Wasted and inefficient resource allocation, missed opportunities, an inability to efficiently iterate on the original findings, and potential loss of revenue.

Sales and revenue operations teams consistently strive to streamline operations and improve accuracy through automation and better system integration. But don’t discount the critical importance of gaining access to data that is consistent, contextual, and reliable at a granular market level.

When Data Can’t Be Trusted, the Cost Can Be High

When it comes to data used to support planning and prospecting functions, trust can’t be overemphasized. Trusted data is informed, accurate, consistent, relevant, and vetted through the stages of the data value chain. It’s data that sales operations can reliably use to identify areas of growth, set goals and priorities, improve sales productivity, increase competitive advantage and, ultimately, increase win rates.

But gathering data and intel for sales planning can be challenging:

Multiple, independent data sources make it difficult to build cohesive and consistent plans.

Historical internal data often leaves out the current market dynamics and external factors needed to make key decisions.

Competitive analyses may be faulty or incomplete.

Force-fitting data to match growth projections leads to increased pressure and unrealistic expectations.

And defining ideal customer profiles (ICPs) and actual opportunities can be too much of a guessing game.
Fortunately, there’s a better option. By leveraging reliable data and strategic intelligence, you can break through complexities, drive operational excellence, and enhance sales performance.

Transforming Sales Operations Through Trusted Data

Trusted data is mission critical to formulating robust strategic plans and go-to-market (GTM) strategies. It is the foundation you need to optimize resource allocation, ensuring sales resources are invested where they will generate the highest returns. Planning that is strategically data-driven, and again built on trusted data, leads to improved sales outcomes and higher win rates.

To plan effectively, you need to define your addressable market, but all too often Total Addressable Market (TAM) calculations are too macro-level for actual sales planning and don’t reveal actual opportunity. This is why current, accurate, and trusted data is so important. To assess the situation more clearly, you need Serviceable Addressable Market (SAM) data based on your own ICP metrics, such as geography, technology markets, and targeted industries. And finally, you need Serviceable Obtainable Market (SOM) data, focusing on the market share you can realistically acquire, enabling you to set expectations and allocate resources where they have the best chance of reaping rewards. (A minimal sketch of this TAM-to-SOM narrowing appears at the end of this article.)

Data-driven TAM, SAM, and SOM calculations ease the process of market analysis with insights derived from high-quality data. This supports resource allocation by ensuring sales efforts are invested where they will generate the highest returns, and it assures that sales goals reflect market realities and are directly tied to the ICP.

Access to granular data can help assess the viability of your ICP and give insight into previously untapped sales opportunities, negating the limitations of relying on historical data and/or past experience alone. It identifies adjacent markets and potentially underserved or emerging markets within territories, which leads to increased competitive advantage and more productive sales outcomes.

In short, trusted data helps you address very real questions, including:

Which potential adjacent markets should be targeted for new growth opportunities?

How does our ICP definition translate to sales planning and account prioritization?

Is my competitor vulnerable in specific areas of the market, like certain revenue bands, geographic regions, or vertical markets?

Are my projections accurate? Will they help us reach our goals — or maximize our growth potential?

The IDC Velocity for Sales Advantage

IDC Velocity for Sales provides reliable, data-driven TAM, SAM, and, most importantly, SOM projections aligned to your areas of growth focus. So you can streamline — and fine-tune — sales planning and prioritization with insights driven from the highest-quality data. Your sales goals will reflect accurate market realities and will be directly tied to your ICP. This, in turn, ensures sales resources and efforts are invested where they will generate the highest returns. You’ll be empowered with strategic insight that can’t be gleaned from one-off purchase-intent data sources alone. IDC Velocity provides
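As referenced above, here is a minimal sketch of the TAM-to-SAM-to-SOM narrowing expressed as simple arithmetic over a list of accounts. The account data, ICP filters, and obtainable-share figure are invented inputs for illustration only:

```python
# Toy TAM/SAM/SOM funnel: start from the total market, filter down to the
# accounts matching an ICP, then apply a realistic obtainable share.
accounts = [
    # (annual spend in $M, region, industry) -- invented example data
    (4.0, "NA", "retail"),
    (2.5, "EMEA", "banking"),
    (6.0, "NA", "banking"),
    (1.0, "APAC", "retail"),
]

ICP_REGIONS = {"NA", "EMEA"}   # ICP filter: geography
ICP_INDUSTRIES = {"banking"}   # ICP filter: targeted industry
OBTAINABLE_SHARE = 0.15        # share of SAM realistically winnable

tam = sum(spend for spend, _, _ in accounts)
sam = sum(spend for spend, region, industry in accounts
          if region in ICP_REGIONS and industry in ICP_INDUSTRIES)
som = sam * OBTAINABLE_SHARE

print(f"TAM ${tam:.1f}M -> SAM ${sam:.1f}M -> SOM ${som:.2f}M")
# prints: TAM $13.5M -> SAM $8.5M -> SOM $1.27M
```

The value of trusted, granular data is precisely in the inputs: the account list, the ICP filters, and the obtainable-share assumption are only as good as the data behind them.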


Sustainable AI to AI for Sustainability

AI has come a long way, turning from a futuristic concept into a driving force behind some of today’s most exciting and impactful innovations, but that benefit comes at a cost. AI requires performance-intensive computing achieved with high-core-count CPUs, coprocessors such as GPUs, and high-speed networking, which can require up to 10 times the power for AI infrastructure compared with general-purpose computing. While organizations need to minimize the environmental impact of AI, the true sustainability promise of AI is how it can make all industries more sustainable.

Energy and Carbon Estimates

IDC estimates that AI datacenter energy consumption was 23.0 terawatt-hours (TWh) in 2022 and is growing at a CAGR of 44.7%, reaching 146.2 TWh by 2027. To put that into perspective, the forecasted total for 2027 exceeds the estimated 2021 country usage of Sweden, Argentina, or the United Arab Emirates1. In recent years, the datacenter industry has made significant strides in sustainability: despite being expected to account for 18.0% of datacenter energy consumption in 2027, AI is expected to account for 14.6% of all datacenter carbon emissions. Those sustainability efforts are apparent, but unfortunately, carbon emissions are still expected to grow through 2027.

Completing the Journey to Net Zero

Regardless of industry, most organizations see the business value of environmental goals, with many having set net-zero targets. In IDC’s Datacenter Operations and Sustainability Survey, datacenter operators indicated that improving sustainability was their second-highest priority. While sustainability goals can be a tapestry of many initiatives, three principles stand out: energy sourcing, efficiency, and circularity.

Energy Sourcing

Datacenters significantly lower their carbon footprints by leveraging renewable energy sources such as solar, wind, and hydroelectric power, whether through onsite generation via microgrids or through power purchase agreements: long-term contracts between an electricity generator and a buyer to purchase renewable energy at predetermined prices. Many sustainable datacenters also invest in energy storage solutions to effectively balance supply and demand. Technologies such as advanced battery systems and thermal storage help ensure a consistent energy flow, even when renewable sources are intermittent or the grid’s generation mix is unfavorable. In addition to traditional carbon-free renewables, the industry is starting to see investment in and implementation of hydrogen and nuclear power.

Efficiency

Simply put, efficiency is maximizing datacenter performance while minimizing resource usage, including energy, space, and hardware. Sustainable datacenters often implement energy-efficient infrastructure, such as advanced cooling systems, workload consolidation, and optimized server configurations, to minimize energy consumption. By prioritizing these green energy initiatives, sustainable datacenters also pave the way for more resilient and cost-effective data management solutions.

Efficiency is not limited to energy. Datacenter water efficiency focuses on minimizing water usage in cooling systems and other operational processes to reduce environmental impact. Techniques such as liquid cooling, evaporative cooling, and water reclamation systems help datacenters achieve this goal by optimizing water consumption and recycling. By implementing these strategies, datacenters can significantly lower their water footprint while maintaining optimal performance and cooling efficiency.
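The energy trajectory cited in the Energy and Carbon Estimates section above is straightforward compound growth. As a quick sanity check, assuming the stated CAGR applies over the five years from 2022 to 2027:

```python
# Sanity-check the AI datacenter energy forecast cited above:
# 23.0 TWh in 2022 compounding at 44.7% per year for five years.
base_twh = 23.0
cagr = 0.447
years = 2027 - 2022

forecast = base_twh * (1 + cagr) ** years
print(f"{forecast:.1f} TWh")  # ~145.9 TWh, in line with IDC's 146.2 TWh figure
```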
Circularity

AI is driving the need for IT asset refreshes. Datacenter capacity planning that includes the circularity, or resale, value of IT assets can open up investment capacity for GenAI budgets and new equipment. IDC forecasts the market for refurbished IT equipment and attached services to reach nearly $15 billion in 2028. Responsible processing of used datacenter assets, whether they get recertified for redeployment and resale, harvested for parts, or recycled, represents an opportunity not only to create investment capacity but also to help meet corporate sustainability targets. IDC research shows that while organizations increasingly value a broad set of sustainability factors in IT procurement, a sense of shared sustainability aspirations with suppliers and partners comes in as a top-two requirement.

Another example of circularity is waste-heat reuse. Liquid cooling, a staple in AI datacenters, allows the heat generated by servers to be captured and repurposed for other uses, such as heating nearby buildings or industrial processes. In addition to enabling heat reuse, liquid cooling enhances energy efficiency and reduces the datacenter’s environmental footprint.

AI for Sustainability

While AI will undeniably consume significant amounts of energy, making every effort to implement sustainable AI practices is crucial. This energy use should be viewed as an investment in a more sustainable world, as AI has the potential to drive substantial improvements across various industries. Despite the projected increase in energy consumption, IDC forecasts that datacenters in total, not just AI, will account for only 2.5% of global energy use by 2027, highlighting their relatively modest footprint. The true value of AI lies in its ability to enhance sustainability in sectors such as agriculture, manufacturing, and transportation by optimizing resource use, reducing waste, and improving efficiency. Thus, embracing AI responsibly can lead to a net positive impact on global sustainability efforts, outweighing its energy demands.

Learn what matters most to your customers with IDC’s AI Use Case Discovery Tool—find out more.
