Microsoft just dropped Drasi, and it could change how we handle big data

Microsoft has launched Drasi, a new open-source data processing system designed to simplify detecting and reacting to critical events in complex infrastructures. The release follows last year’s launch of Radius, an open application platform for the cloud, and further cements Microsoft’s commitment to open-source innovation in cloud computing.

Mark Russinovich, CTO and Technical Fellow at Microsoft Azure, described Drasi as “the birth of a new category of data processing system” in an interview with VentureBeat. He explained that Drasi emerged from recognizing the growing complexity of event-driven architectures, particularly in scenarios like IoT edge deployments and smart building management.

From complexity to clarity

“We saw massive simplification of the architecture, just incredible developer productivity,” Russinovich said, highlighting Drasi’s potential to reduce the complexity of reactive systems.

Drasi works by continuously monitoring data sources, evaluating incoming changes against predefined queries, and executing automated reactions when specific conditions are met. This approach eliminates the need for inefficient polling mechanisms or constant data source querying, which can lead to performance bottlenecks in large-scale systems.

The system’s key innovation lies in its use of continuous database queries to monitor state changes. “What Drasi does is takes that and says, I just have a database query… and when an event comes in… Drasi knows, ‘Hey, part of this query is satisfied,’” Russinovich explained.

Open-source synergy

Microsoft’s decision to release Drasi as an open-source project aligns with its broader strategy of contributing to the open-source community, particularly in cloud-native computing.
This strategy is evident in the recent launch of Radius, which addresses challenges in deploying and managing cloud-native applications across multiple environments. “We believe in contributing to the open-source community because… many enterprises are making strategies that are, especially around Cloud Native Computing, centered on open-source software and open governance,” Russinovich said.

The Azure Incubations team, responsible for both Drasi and Radius, has a track record of launching successful open-source projects, including Dapr, KEDA, and Copacetic. These projects are all available through the Cloud Native Computing Foundation (CNCF). While Radius focuses on application deployment and management, Drasi tackles the complexities of event-driven architectures. Together, these tools represent Microsoft’s holistic approach to the challenges faced by developers and operations teams in modern cloud environments.

Drasi’s continuous queries usher in a new era of reactive systems

Looking ahead, Russinovich hinted at the possible integration of Drasi into Microsoft’s data services. “It looks like it’ll probably slot into our data services, where you have Drasi integrated into Postgres database or Cosmos DB, or as a standalone service that integrates across these,” he said.

The introduction of Drasi could have significant implications for businesses grappling with the complexities of cloud-native development and event-driven architectures. By simplifying these processes, Microsoft aims to enable organizations to build more responsive and efficient applications, potentially leading to improved operational efficiency and faster time-to-market for new features. As with Radius, Microsoft is actively seeking feedback from partners and early adopters to refine Drasi and address any scaling, performance, or security concerns that may arise in production environments.
The true test for both tools will be their adoption and performance in real-world scenarios across various cloud providers and on-premises environments. As businesses increasingly rely on cloud-native applications and real-time data processing, tools like Drasi and Radius could play a crucial role in managing the growing complexity of modern software systems.

Whether Drasi will indeed establish itself as a new category of data processing system, as Russinovich suggests, remains to be seen, but its introduction marks another significant step in Microsoft’s ongoing efforts to shape the future of cloud computing through open-source innovation.
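The reactive pattern Russinovich describes — a standing query that is incrementally re-evaluated as each change arrives, instead of a polling loop — can be illustrated with a minimal sketch. This is not Drasi’s actual API (Drasi is configured with sources, continuous queries, and reactions); the class and names below are illustrative assumptions only.

```python
# Illustrative sketch (not Drasi's actual API): a change event arrives, the
# standing query is re-evaluated for that key, and a reaction fires only
# when the condition becomes newly satisfied -- no polling loop required.

class ContinuousQuery:
    def __init__(self, condition, reaction):
        self.condition = condition   # predicate evaluated on each change
        self.reaction = reaction     # callback fired on newly matching keys
        self.matched = set()         # keys currently satisfying the query

    def on_change(self, key, value):
        """Handle one incoming change event from a monitored source."""
        if self.condition(value):
            if key not in self.matched:      # newly satisfied: react once
                self.matched.add(key)
                self.reaction(key, value)
        else:
            self.matched.discard(key)        # no longer satisfied

alerts = []
q = ContinuousQuery(
    condition=lambda temp: temp > 30.0,            # e.g. smart-building sensor
    reaction=lambda key, temp: alerts.append((key, temp)),
)
q.on_change("room-101", 24.5)   # below threshold: no reaction
q.on_change("room-101", 31.2)   # crosses threshold: reaction fires once
q.on_change("room-101", 32.0)   # still satisfied: no duplicate reaction
```

The point of the sketch is the incremental evaluation: nothing queries the sensor on a timer; work happens only when a change arrives.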


Navigating the Top Drivers of Service Provider Switching

Why Do Organizations Switch Their Service Providers?

Companies today use service providers across all facets of the business. From modernizing technology and streamlining operational processes to strategic consulting and IT outsourcing, service providers deliver vast value, expertise, and increased efficiency to their customers, enabling them to spend more time and energy focused on their core functions.

These relationships between service providers and their clients are forged over time, typically years or even decades, and many providers promote the average length of time they retain clients as a performance indicator. But what happens when customers start questioning the relationship’s value? Many customers are reluctant to switch providers because of the time it takes a new provider to understand their business, and because of the direct and indirect costs of switching. Sometimes, regardless of these drawbacks, customers still choose to move on. What developments usually lead customers to this decision?

IDC recently examined this question using its Services Path data, looking across all types of services, to understand why companies ultimately choose to make the switch. IDC’s Services Path program contains comprehensive data and guidance on the mindset and journey of services buyers for professional services, outsourcing services, managed services, and engineering services. The data set is based on a global survey of approximately 2,600 organizations spanning all sizes and industries, and covers topics ranging from adoption, budgeting trends, and purchasing preferences to pricing and contract options and detailed customer satisfaction ratings for hundreds of vendors.

Services Path data shows that several factors consistently stand out as the primary reasons customers choose to switch service providers.
The top three reasons, examined by role, are shared below (in rank order):

IT respondents:
1. Enable more rapid digital transformation (DX) and modernization
2. Improve service quality and raise service levels
3. Lower cost

Line-of-business respondents:
1. Improve service quality and raise service levels
2. Enable more rapid digital transformation (DX) and modernization
3. Improve data analytics and decision support

Digital transformation, modernization, and improved SLAs/service levels are the top reasons driving companies to make replacements. However, there are some differences between IT and LOB personnel in what drives their decisions to switch. For example, IT places a much greater emphasis on lowering costs than line-of-business employees do. Conversely, LOB leaders view improving corporate sustainability as a significantly greater driver of replacements than their IT counterparts do.

To reduce customer switching risk and further bolster the stickiness of long-term relationships, services firms should take note of these top switching drivers and ensure they clearly demonstrate their ability to accelerate their clients’ DX initiatives and deliver high-quality, cost-effective services. Customers today are looking for strategic partners that can be leveraged broadly across the business, who understand the nuances of their business, have deep expertise in their industry, and can provide fast proof of value. Firms that continually deliver on these needs will be well positioned to retain long-term clients and minimize the risk of customer attrition.

To learn more about IDC’s Services Path program, watch this short video.

Contributing author: Jason Bremner – Research Vice President, Worldwide Services


How To Adapt Site Search To Holiday-Season Changes In Customer Behaviors And Needs

Site search provides one of the best customer engagement tools in a retailer’s toolbox. As shoppers shift to holiday buying, adapt to their needs to capitalize on the season. Start by tuning up your search engines for the holidays. You’ll need to adjust your solution now — and throughout the season — to keep pace with:

Seasonal changes in product relevance. Your shoppers will be hunting for different types of products this time of year compared with their historical shopping patterns. Optimize your product data to help them find what they need — and what they didn’t know they needed. Retailers and brands will shift from featuring cotton sheets to warm comforters, tank tops to long sleeves, grilling ingredients to hearty soups, SPF skincare to deep moisturizers, and vitamins to medicine.

Behavioral and mindset changes. Consumers start gift shopping early in the season: In 2023, 23% of US online adults started their shopping before November. In 2024, 71% of US shoppers plan to watch for deals and special offers online, but many are in the mood to splurge: 25% of US online adults plan to spend more this year than they did last year. Expect changes in even your most loyal customers’ behavior. Frugal shoppers might become generously open to pricier items for gifting. Savvy shoppers might need more information as they select gift items with which they’re not familiar. Tune your search to create a mixed assortment of pricier items and deals, offer paths to purchase with educational content, and recommend giftable goods to guide shoppers.

The shift from personal preference to recipient needs. Your customers are shopping for others, not themselves (as much). Don’t over-rely on your customers’ recent purchase history or established preferences — they may be looking for entirely different categories and items as gifts.
Even locations will change; the customer who usually prefers in-store pickup at a specific location may suddenly be ordering to multiple far-away shipping destinations. One-quarter of US online adults plan to use “buy online, pick up in store” offerings this holiday season, per Forrester’s July 2024 Consumer Pulse Survey. Ensure that consumers can filter their product search results not only by traditional sizing and colors but also by factors such as fulfillment options, allergens, or brand and product preferences. The more granularly shoppers can filter their results, the easier it is for them to find the items they are looking for.

Want to continue the conversation about optimizing product discovery and site search for the upcoming holiday season? Book an inquiry or guidance session here, and continue to gear up for the holiday season with Forrester’s 2024 Holiday Prep series.

(coauthored with Senior Research Associate Delilah Gonzalez)
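The granular filtering described above — by fulfillment option, allergens, or price — amounts to faceted filtering of search results. A minimal sketch follows; the product fields and facet names are hypothetical, not any particular retailer’s schema.

```python
# Minimal faceted-filter sketch: each selected facet maps to a predicate
# over a product record, and a result must satisfy every selected facet.
# Product fields and facet names here are hypothetical.

products = [
    {"name": "Warm comforter", "price": 89.0, "fulfillment": {"ship", "bopis"}, "brand": "Acme"},
    {"name": "Hearty soup kit", "price": 24.0, "fulfillment": {"ship"}, "brand": "Hearth"},
    {"name": "Deep moisturizer", "price": 32.0, "fulfillment": {"bopis"}, "allergens": {"nuts"}, "brand": "Glow"},
]

def filter_products(items, fulfillment=None, exclude_allergens=None, max_price=None):
    """Return items passing every selected facet (None means 'any')."""
    out = []
    for p in items:
        if fulfillment and fulfillment not in p.get("fulfillment", set()):
            continue                                    # wrong fulfillment option
        if exclude_allergens and p.get("allergens", set()) & exclude_allergens:
            continue                                    # contains an excluded allergen
        if max_price is not None and p["price"] > max_price:
            continue                                    # over budget
        out.append(p)
    return out

# A shopper picking up in store, avoiding nut allergens, on a budget:
picks = filter_products(products, fulfillment="bopis",
                        exclude_allergens={"nuts"}, max_price=100)
```

Each additional facet narrows the result set independently, which is what lets a gift shopper combine unfamiliar criteria (a recipient’s allergies, a pickup location) in one query.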


Cohere just made it way easier for companies to create their own AI language models

Artificial intelligence company Cohere unveiled significant updates to its fine-tuning service on Thursday, aiming to accelerate enterprise adoption of large language models. The enhancements support Cohere’s latest Command R 08-2024 model and give businesses greater control and visibility into the process of customizing AI models for specific tasks.

The updated offering introduces several new features designed to make fine-tuning more flexible and transparent for enterprise customers. Cohere now supports fine-tuning for its Command R 08-2024 model, which the company claims offers faster response times and higher throughput than larger models. This could translate to meaningful cost savings for high-volume enterprise deployments, as businesses may achieve better performance on specific tasks with fewer compute resources.

A comparison of AI model performance on financial question-answering tasks shows Cohere’s fine-tuned Command R model achieving competitive accuracy, highlighting the potential of customized language models for specialized applications. (Source: Cohere)

A key addition is the integration with Weights & Biases, a popular MLOps platform, providing real-time monitoring of training metrics. This feature allows developers to track the progress of their fine-tuning jobs and make data-driven decisions to optimize model performance. Cohere has also increased the maximum training context length to 16,384 tokens, enabling fine-tuning on longer sequences of text — a crucial feature for tasks involving complex documents or extended conversations.

The AI customization arms race: Cohere’s strategy in a competitive market

The company’s focus on customization tools reflects a growing trend in the AI industry.
As more businesses seek to leverage AI for specialized applications, the ability to efficiently tailor models to specific domains becomes increasingly valuable. Cohere’s approach of offering more granular control over hyperparameters and dataset management positions it as a potentially attractive option for enterprises looking to build customized AI applications.

However, the effectiveness of fine-tuning remains a topic of debate among AI researchers. While it can improve performance on targeted tasks, questions persist about how well fine-tuned models generalize beyond their training data. Enterprises will need to carefully evaluate model performance across a range of inputs to ensure robustness in real-world applications.

Cohere’s announcement comes at a time of intense competition in the AI platform market. Major players like OpenAI, Anthropic, and the cloud providers are all vying for enterprise customers. By emphasizing customization and efficiency, Cohere appears to be targeting businesses with specialized language processing needs that may not be adequately served by one-size-fits-all solutions.

Cohere’s Command R 08-2024 model outperforms competitors in both latency and throughput, suggesting potential cost savings for high-volume enterprise deployments. Lower latency indicates faster response times. (Source: Cohere / artificialanalysis.ai)

Industry impact: Fine-tuning’s potential to transform specialized AI applications

The updated fine-tuning capabilities could prove particularly valuable for industries with domain-specific jargon or unique data formats, such as healthcare, finance, or legal services. These sectors often require AI models that can understand and generate highly specialized language, making the ability to fine-tune models on proprietary datasets a significant advantage. As the AI landscape continues to evolve, tools that simplify the process of adapting models to specific domains are likely to play an increasingly important role.
Cohere’s latest updates suggest that fine-tuning capabilities will be a key differentiator in the competitive market for enterprise AI development platforms. The success of Cohere’s enhanced fine-tuning service will ultimately depend on its ability to deliver tangible improvements in model performance and efficiency for enterprise customers. As businesses continue to explore ways to leverage AI, the race to provide the most effective and user-friendly customization tools is heating up, with potentially far-reaching implications for the future of enterprise AI adoption.
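The 16,384-token training context limit mentioned above is the kind of constraint teams typically screen for while preparing fine-tuning data. The sketch below does a rough pre-flight check; the whitespace token count is a crude approximation (real tokenizers count differently), and the chat-style example schema is a generic assumption, not necessarily Cohere’s exact training format.

```python
# Rough pre-flight check of fine-tuning examples against a context limit.
# Whitespace splitting only approximates real tokenizer counts, and the
# chat-style schema is generic, not necessarily Cohere's exact format.

MAX_TRAINING_TOKENS = 16_384  # limit cited for Command R 08-2024 fine-tuning

def approx_tokens(example):
    """Crude token estimate: whitespace-split all message contents."""
    return sum(len(m["content"].split()) for m in example["messages"])

def split_by_limit(examples, limit=MAX_TRAINING_TOKENS):
    """Partition examples into (within limit, over limit)."""
    ok, too_long = [], []
    for ex in examples:
        (ok if approx_tokens(ex) <= limit else too_long).append(ex)
    return ok, too_long

examples = [
    {"messages": [
        {"role": "user", "content": "What is EBITDA?"},
        {"role": "assistant",
         "content": "Earnings before interest, taxes, depreciation, and amortization."},
    ]},
    {"messages": [{"role": "user", "content": "word " * 20_000}]},  # far past any limit
]
ok, too_long = split_by_limit(examples)
```

Examples flagged as too long would be truncated or split before submission, rather than discovered as failures mid-job.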


Evolving Digital Marketing Strategies in the Era of AI Everywhere

IDC predicts that by 2025, the top 1,000 organizations in Asia will allocate over 50% of their core IT spend to AI initiatives, leading to a double-digit increase in the rate of product and process innovations.[1] Moreover, according to a recent IDC survey, 35% of Asia/Pacific and Japan (APJ) organizations will increase their IT budget to fund their AI Everywhere initiatives.[2] Both are proof of the ongoing AI revolution in the region.

As AI adoption continues to increase in the region, some organizations will find out the hard way that they cannot simply replicate what peers or competitors are doing. Worse, some will realize they cannot buy or build just any AI solution they fancy and hope for the best. To help organizations assess their AI Everywhere readiness, IDC drafted a framework outlining the key elements of successful AI implementations:

1. Be strategic in deploying AI in preparation for industry disruptions.
2. Create a roadmap that prioritizes use cases based on the organization’s distinct requirements.
3. Build the enterprise on a foundation of intelligent apps, data, and models to drive productivity, enhance operational efficiency, and create exceptional customer, partner, and employee experiences, among other benefits.
4. Maintain a robust digital infrastructure able to support AI workloads at scale to fully harness AI’s potential.
5. Establish a rigorous AI and data governance system to ensure data security and the safety and trust of users.

The first two elements are about planning for success; the third and fourth are about transforming people, processes, technology, business models, and data readiness.
In short, organizations that are already on or about to embark on an AI journey will need a foundation of various digital innovations in place, and will need to engineer their AI adoption path, before they can ensure AI creates the most significant business impact for their organizations.

It goes without saying, then, that tech vendors must master their customers’ AI journeys and readiness. Tech vendors must guide tech buyers every step of the way in effectively aligning their business objectives and priorities to their AI Everywhere initiatives. But at the same time, tech vendors must also understand that each customer has unique objectives that require unique solutions. To succeed, tech vendors must be equipped to help them assess their markets better. Tech vendors should have the capabilities to demonstrate their AI solutions’ value, grow their business, create new opportunities, and elevate brand awareness.

IDC recently hosted a discussion on this topic with a group of leading global IT vendor CMOs in Singapore during a Sunrise Kopi briefing on AI Everywhere. At the event, IDC Research and IDC Custom Solutions had a lively discussion and advised them on how to improve their AI solutions’ go-to-market strategies while aligning these to their customers’ objectives and challenges. Below are some of the key points discussed during the session:

Generate demand in a crowded market. Almost every vendor is touting an AI solution. Tech vendors need to raise their brands above the noise of the over 200 AI use cases available in the market by demonstrating the business impact of their “real AI” solutions through effective content marketing.

Go beyond the CIO: convince the CEO through other decision-influencer personas. AI is more of a strategic initiative than an IT investment. The decision-making process involves multiple personas, with the need to convince the CEO and the board.
Tech vendors need to understand the priorities of these different personas and position their solutions appropriately, not just focus on the CIO agenda. IDC shared some of its published research on the AI investment priorities and buyer behavior of different personas across industries, and on how to drive targeted marketing and enable sales teams to engage more effectively with these personas.

Communicate brand position clearly and shift brand perception to that of an AI provider. There is a need to cut through the hype and demonstrate who has a “real AI” solution. We advised them on how to leverage thought leadership assets to show their understanding of the buyer journey and its corresponding issues, and to clearly demonstrate their solution’s business value and ability to generate tangible and significant business outcomes.

Shorten the decision-making process. Although the buying decision involves multiple personas, the CEO and CIO must engage and align on the strategy, and a lot of conversations need to happen between them. IDC discussed its AI Readiness Maturity Model, which brings all parties into a common assessment of where they are in their AI journey and aligns all stakeholders on a common decision pathway. This tool can also be used to measure partner/channel readiness to sell AI solutions and to open an engagement opportunity for sales.

Track the dark funnel. How do we track buyers who are clicking through and researching the topic but are not picked up by the sales/marketing funnel? IDC shared some of the TCO/ROI tools that can help tech vendors assess their customers’ environments and draw them from the dark funnel into the sales/marketing funnel. IDC also talked about leveraging media amplification and lead-generation services to convert buyers in the dark funnel into marketing-qualified leads.
In summary, there is a lot of noise in the market that muddles tech buyers’ vision as they search for suitable, legitimate, and reliable AI solutions to address their unique business needs. Given the right data, insights, and go-to-market strategies, tech vendors can cut through that noise if they plan, market, and sell their AI solutions well, and stand out as preferred and trusted AI technology vendors.

IDC Custom Solutions helps tech vendors enhance results with tailored insights and proven tools. Discover how IDC’s AI Use Case Discovery Tool can elevate your AI strategy—learn more here.

[1] Source: IDC FutureScape: Worldwide Artificial Intelligence and Automation 2024 Predictions – Asia/Pacific (Excluding Japan)


Don't Look Up, movie review: Smart, funny and depressing

Don’t Look Up • Written by Adam McKay and David Sirota • Netflix

Image: NIKO TAVERNISE / Netflix

I fear I will never appreciate the humour in Netflix’s new film Don’t Look Up (written by Adam McKay and David Sirota) until we all stop squabbling about facts that pose an existential threat. Yes, it’s terribly clever. Yes, it’s well-produced and well-acted, and meticulously casts Meryl Streep as a feckless Trumpian US president with an eye for the main chance. I know it’s funny, but I can’t laugh. It’s like reading Private Eye: the humour can’t squeak past its extremely depressing underlying reality.

The plot: Michigan State PhD candidate Kate Dibiasky (Jennifer Lawrence) finds a giant comet heading straight for Earth. She and her professorial supervisor, Randall Mindy (Leonardo DiCaprio), calculate that it will cause an extinction-level event. In hindsight (were it available to them), they might have done better to post the news and supporting data on Twitter, where the world’s astronomers, journalists, and activists would at least have applied some seriousness. But this movie’s target is the unsavoury industrial complex formed by the traditional media, politicians, and business.

President Orlean (Meryl Streep) chairs a White House meeting to discuss the imminent Armageddon. Image: NIKO TAVERNISE / Netflix

So instead, our heroes do the time-honoured thing of calling the authorities. In this case, that means NASA scientist Teddy Oglethorpe (Rob Morgan), who gets them an appointment at the White House with President Orlean and her chief-of-staff son Jason (Jonah Hill). While they wait just outside, bigger problems seize priority inside the Oval Office. “Does the president know why we’re here?” Randall asks. “They know,” Oglethorpe says wearily.

In frustration, they turn to the media — a newspaper, which insists on booking a TV appearance for publicity.
“Keep it light, fun…” the producer tells them as they’re being prepped for early-morning airtime. This is certainly the approach of TV hosts Jack (Tyler Perry) and Brie (Cate Blanchett), who after all are here every day and must keep their audience’s affection. Meanwhile, the head of NASA drives off serious media coverage by calling the comet “near-miss hysteria”.

In satirising modern America’s lack of qualification to tackle an existential crisis, Don’t Look Up ignores alternatives. No activists fire up campaigns. No bloc of governments convenes to find solutions. In this movie, it appears that only the US can save us. Hollywood is not ready for movies in which China rescues the world, even if the rest of us would be grateful.


Are You Aligning Your Tech Success With Metrics That Matter?

As the leader of your technology organization, you most likely face the perennial measurement issue of misalignment. This misalignment manifests in various ways, differing in scope and intensity, but one of the most critical areas where it shows up is metrics. Tech leaders often grapple with finding a measurement approach that drives alignment. Why does this matter? According to Forrester, companies with high alignment experience nearly twice as much revenue growth as those without it.

The most common misalignments we’ve identified include the following:

There’s a lack of linkage between technology metrics and organizational strategic goals. This stems from the traditional approach of measuring tech success with an inward focus instead of being business- and customer-led. Portfolio performance indicators such as being on schedule and on budget are important but are operationally oriented and don’t reflect how they link to the quantifiable customer and business outcomes that are crucial to achieving your organization’s strategic goals.

Your tech metrics and your peers’ metrics are out of sync. Your peers such as the chief marketing officer, chief data officer, or chief experience officer have their own metrics, and more often than not, your metrics look very different or speak a different language compared to theirs. This becomes problematic over time with value capture, leading to a struggle to articulate how the tech organization helps your peers achieve their goals and eventually drive business success. Worse still, misalignment can sometimes lead to conflicts; for instance, the tech team might have minimizing error rates as a metric, which could lead developers to create additional, sometimes onerous, verification steps, thereby increasing customer effort to complete tasks.

Your tech metrics are driving the wrong behaviors and actions, creating a disconnect between intended and actual outcomes.
It’s important to evaluate the potential behaviors and actions that your metrics could drive across your tech organization to avoid unintended consequences. For instance, to maintain cost effectiveness and efficiency, number of tickets closed and time to resolution are common metrics that tech organizations track. But an emphasis on these metrics could lead to bad outcomes: tech teams are likely to rush through issues and provide superficial fixes rather than holistic ones (which take longer) that would prevent issues from happening in the first place. The result is that issues continue to linger and possibly create bigger organizational inefficiencies.

Join me at my session, “Align Tech Success With Metrics That Matter,” at Forrester’s Technology & Innovation Summit APAC on October 29 in Sydney (and digitally) to:

Understand how your peers are measuring tech success and the key areas of misalignment.
Learn the five principles of effective measurement.
Uncover how you can drive alignment to ensure tech success.


The Rising Urgency for Telecom Innovation

High Expectations Collide with Market Realities

Telecom service providers face historic challenges amidst shifts in both enterprise and consumer demand, and in transforming from “connectivity providers” to digital platform players. Historically, telecom service providers have championed connectivity at scale. In past decades, this proved a profitable strategy, with the value of connectivity garnering consistent year-over-year revenue growth and profits. Recent years, however, have seen telecom providers grapple with a host of challenges, including industry competition, commoditization of services, and inflexible IT systems that have made it hard for them to swiftly innovate and compete against new threats.

Further, while network traffic continues to rise, driven predominantly by video apps, service providers have been unable to effectively monetize this traffic growth. The disconnect between revenue growth and network traffic growth remains one of the top challenges globally for service providers as they hunt for ways to reinsert themselves and justify connectivity not just as a commodity but as a value-based service that can support a range of use cases and verticals.

In response, many forward-thinking telecom providers have made a purposeful decision to focus their technology offerings and ecosystem partners on targeting digital engagement and new revenue opportunities, and to rearchitect their technology stacks to align with hyperscale cloud models as a means to simultaneously control costs and position for service agility over the longer term. Even so, third-party entities, including CPaaS, cloud, and other digital platform players, have moved in to largely siphon off these digital opportunities while curating vast developer ecosystems, once again relegating many telecom providers to a connectivity-only role.
New Tools in the Arsenal Create New Monetization Opportunities for Telcos

Amidst this push and pull of telecom service provider efforts, a new opportunity has emerged, driven by the promise of standalone (SA) 5G networks and API exposure capabilities. These empower telecom providers to reinsert themselves within the digital landscape by unlocking the ability to more easily sell and scale customized, programmable connectivity designed to be packaged and consumed by application developers. Unsurprisingly, hyperscale cloud providers, CPaaS companies, and systems integrators have also positioned themselves for this new market opportunity by aligning with industry consortia (e.g., CAMARA, GSMA Open Gateway) that are championing global standards; however, it remains to be seen where, how, and by whom value will ultimately be created and monetized.

Figure 1: Emerging Telecom and Network API Ecosystem
Source: IDC, 5G Exposure and Network APIs: How Will the Telecom Ecosystem Capture New Opportunities with Developers?

As part of these market developments, the worldwide IDC team has spent the past couple of years building a methodology to size this opportunity and define ways the telecom API ecosystem can work together to enhance this emerging market.

Telecom Service Providers Can Capitalize on AI and GenAI to Improve Business Results and Potentially Reshape Their Market Role

While APIs represent one way service providers can capture new monetization opportunities, artificial intelligence (AI) presents another avenue to drive business results. More specifically, AI can be inserted into the telecom technology stack to improve TCO, enhance service agility (e.g., AIOps), and improve the customer experience (CX) lifecycle.
As telcos move toward future network architectures governed by cloud-native principles, this ushers in a much greater role for automation and orchestration across various physical, virtual, and containerized network functions, as well as AI-informed operations and monetization platforms. This in turn raises the importance of adopting AIOps within network operations; however, network-related AIOps brings its own unique set of challenges for telecom service providers, as well as a vendor community that overlaps with, but does not entirely match, the more generalized ITOps vendor roster. Meanwhile, GenAI has emerged as a powerful tool that enables telcos to embrace some of the benefits of AI while simultaneously investing in the internal skill sets and capabilities required to embrace AI more broadly. The graphic below highlights some of the key use cases IDC envisions for GenAI across telco environments. Figure 2: GenAI Telco Use Cases Across Telco Environments While this graphic provides an optimistic outlook for the full range of GenAI’s impact on telecom service providers, the reality is that it will take time, effort, and AI partners for telecom providers to realize gains from AI. Indeed, with AI curators racing to drive AI innovation across multiple environments (e.g., hybrid and multi-cloud), it is likely that multiple models will become prevalent, with telecom service providers serving a dual purpose as some of the strongest consumers and distributors of AI and GenAI going forward. Further, interest in AI applications is also prompting service providers to build near-term roadmaps clarifying how enterprise customers can leverage their core and edge assets to support emerging use cases (e.g., AI inferencing at the edge) while reinforcing connectivity as the foundation of AI-enabled applications and services. Indeed, while AI is being emphasized by many organizations, it will require a global distribution mechanism to help it scale.
Hyperscale cloud providers are top of mind, but telecom service providers can also play a role in connecting AI applications. Overall, it is a critical time for telecom providers, and their technology vendors, to synchronize on key priorities and investment strategies, particularly in light of historical struggles to optimally monetize telecom networks. Doing so can enable them to architect a brighter future for telecom monetization and set themselves up for a key role in a digital, AI-centric world. For a deeper dive into these topics, watch IDC’s July 10th webinar, “Revenue Enablers for the Future Telco: APIs, AI, and Emerging Tech”.

The Rising Urgency for Telecom Innovation

Get Visibility Into Healthcare’s Biggest Blind Spot: Concentration Risk

It’s been a banner year for healthcare, and not in a good way. As a healthcare provider, if your patients had trouble filling a prescription, if your organization struggled to submit claims to generate much-needed revenue, or if your organization had to ask a patient to reschedule a non-essential medical procedure, then you are likely a casualty of healthcare concentration risk. Concentration risk is a type of systemic (external) risk that occurs when extreme dependencies within an organization’s business, operating, or commercial model create a single point of failure. When a systemic risk event occurs in healthcare, it sets off a chain of failures and disruptions with negative implications for healthcare organizations (HCOs) and dire consequences for patients. You don’t need to understand concentration risk for your organization to be impacted by it, but it won’t get better until you do. In our new report, Concentration Risk In Healthcare, we outline the necessary steps that HCOs must take to identify and mitigate healthcare concentration risk in five key areas. Avoid These Five Sources Of Concentration Risk We’ve previously written about how HCOs must take proactive action against concentration risk and how oligopolies in the pharmacy benefit manager market, for example, accelerate the spread of medical deserts. To be resilient in response to disruptions caused by natural events, market conditions, or other systemic risks, HCOs must identify and mitigate concentration risk in five common areas or suffer the consequences of lost revenue, reputational damage, and, at worst, putting lives at risk:

Labor. The existing labor supply and demand problem in healthcare will only intensify as the patient population grows and ages. The skills and knowledge gap left behind by retiring clinicians and changes in training practices further exacerbates this issue. Additionally, as reliance on technology increases, critical documentation skills are often missing during cybersecurity crises or routine downtimes. HCOs must prioritize flexible staffing solutions and knowledge transfer.

Technology. Overreliance on a single technology vendor can leave HCOs vulnerable to data breaches and service disruptions, especially when electronic medical records and telehealth services are indispensable. HCOs must diversify their technology partnerships, ensure interoperability between systems, and establish robust cybersecurity measures.

Artificial intelligence. Dependence on AI algorithms for critical decision-making processes, such as prior authorization, can lead to wrongful denial of care in favor of speed and cost-cutting. HCOs must balance AI innovation with proper precautions and guardrails.

Data. Relying on limited or biased datasets for decision-making, research, and AI training can introduce biases into patient care, thereby perpetuating existing inequities and limiting the effectiveness of emerging healthcare technologies. HCOs must aim to collect and utilize diverse datasets and implement rigorous data governance practices.

Monopolies and oligopolies. When only a few big players dominate a market, customers can suffer if a disruption causes major shortages. Hurricane Helene’s damage to a single North Carolina plant responsible for 60% of the nation’s IV fluid production has hospitals nationwide experiencing a shortage, which is likely to be exacerbated by Hurricane Milton. This concentration of power in the hands of a few large entities can reduce competition, increase prices, and hinder innovation, leading to a complacent focus on incremental improvements rather than resilience. HCOs must identify single points of failure resulting from monopolies and oligopolies and develop mitigation strategies at the enterprise level.
Don’t Wait For Disaster: Act Now To Mitigate Concentration Risk Read the full report to dive deeper into identification techniques and effective mitigation strategies. Forrester clients should schedule an inquiry or guidance session with Alla Valente and Arielle Trzcinski to discuss how you can protect your organization from the fallout of concentration risk. Tiffany Do contributed to this blog post.


Chip industry faces talent shortage as revenues head to $1 trillion

In 2022, Deloitte expected that the global semiconductor industry would need to add a million skilled workers by 2030, or more than 100,000 annually. Two years later, that forecast still holds. But key industry trends continue to compound the talent challenge as the industry races toward $1 trillion in revenue by 2030, according to a new report by Deloitte, the accounting and consulting giant. The company said that demand for generative AI (GenAI) is driving the need for advanced skills, meaning the talent required for these advancing technologies is often in high demand and can be difficult to attract and retain in a competitive talent market. The report’s timing is interesting, considering the U.S. is reportedly considering limiting sales of AMD and Nvidia AI chips abroad. Deloitte foresees a $1 trillion chip industry by 2030. The semiconductor industry is facing an aging workforce without a clear plan for succession, which may be further exacerbated by low industry appeal compared to the broader tech industry. I suppose this is because the chip industry isn’t as sexy as working for AI or social media companies. Global solutions needed for a global challenge Deloitte foresees a shortage of chip workers. Localization of manufacturing, as well as overall global demand trends, is contributing to a talent and skills shortage that spans the globe. Semiconductor companies are often left competing over the same insufficient pool of existing talent. And talent outcomes are tied to global chips laws. Both the U.S. and European chips legislation include specific objectives and grant application requirements regarding workforce development that companies should commit to in order to receive funding, remain in compliance, and achieve growth objectives.
Geopolitical concerns and supply chain fragility continue to contribute to the onshoring of manufacturing (advanced node, trailing node, memory) and back-end ATP (assembly, test, and packaging) processes. A history of cycles The cyclical chips industry experienced its seventh downturn since 1990, with revenues declining 9% to $520 billion for 2023. As a result, development of some new fabrication capacity has been extended, which has also likely delayed some of the immediate, short-term need for talent. This downturn is expected to be temporary, with revenue set to grow by 16% in 2024 to an all-time high of $611 billion. With the industry back on track to reach the $1 trillion figure for 2030, talent will be needed to fuel that growth. But now there’s more time to optimize talent forecasts, mix, pipeline, skills and capabilities, and development plans. A richer understanding of the challenges driving the semiconductor talent shortages can enable semiconductor leaders to deploy targeted strategies to help address their looming talent needs. Advanced skills being driven by demand for GenAI Lots of countries are focusing on domestic chip industries. According to Deloitte’s 2023 Smart Manufacturing: Generative AI for Semiconductors Survey, 72% of industry leaders surveyed predict that GenAI’s impact on the semiconductor industry will be “high to transformative.” Respondents saw high potential for GenAI’s use throughout their business, with heavier value realization expectations within core engineering, chip design and manufacturing, operations, and maintenance. Although GenAI may help alleviate some engineering talent shortages by addressing routine tasks and giving engineers more time to perform their core jobs better and faster, the GenAI skill set scarcity remains. The semiconductor workforce will need to grow its GenAI skill sets dramatically, given how scarce those skills are in the market.
And leaders in the field are often in high demand across most sectors of the economy. Semiconductor companies should consider offering more novel benefits beyond competitive compensation, such as having a seat at the table, to better attract AI talent and leadership. Having proficient GenAI talent is key in driving the industry’s ability to innovate and reap the benefits of this transformative technology. Looming talent cliff and low industry appeal An aging workforce, regulatory changes, newly required skill sets, and shifting employee expectations are changing the landscape of semiconductor talent. The lack of brand awareness and appeal in the semiconductor industry compared to better-known technology brands can make addressing these challenges more difficult for the industry. Semiconductor companies seem to recognize that attracting and retaining new and diverse talent is more important than ever, yet it continues to be a challenge for many organizations. Building diversity can be difficult; currently only one-third of U.S. semiconductor industry employees identify as female and less than 6% as Black or African American. The U.S. semiconductor workforce is also older than other technology industries: As of July 2024, 55% of the U.S. semiconductor workforce is 45 or older, with less than 25% under the age of 35. In Europe, 20% of the industry is 55 or older, with Germany expecting about 30% of its workforce to retire over the next decade. Inconsistent knowledge management, and the lack of new talent to absorb institutional knowledge, presents an additional workforce barrier for many semiconductor companies. Relative to other sectors of the technology industry, semiconductor organizations can offer a sense of trust, stability, and projected market growth—attractive qualities to the most recent college entrants.
While semiconductor companies may have struggled with brand recognition and a competitive employee value proposition, investing in recent high school graduates, who may be more attracted to stability and flexibility than to rapid advancement, could help reinvigorate talent pipelines. A global shortage The need for semiconductor talent is a global issue. Countries are not producing enough skilled talent to meet their workforce needs. And companies can’t continue to tussle over the same finite talent pool while still expecting to successfully grow the industry, launch new (and expand existing) fabs, and keep up with rapid technological advances. In the United States, where the majority of annual graduates with a master’s degree in semiconductor-related engineering fields are foreign students, 80% of those graduates do not stay in the United States post-graduation. According to Deloitte China and Asia Pacific’s most recent APAC Semiconductor Industry Trends Survey,
