CIO

Microsoft hit with more litigation accusing it of predatory pricing

“The likelihood of your largest enterprise customers being harmed is minimal,” Kimball said. “If there is a victim in this, it’s the small businesses. They are the ones who are being impacted.” Jim Mercer, the program VP for software development at IDC, said the lawsuit’s accusations are nothing new for the fiercely competitive cloud space.  “The competition among the big hyperscalers — major cloud service providers like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud — is intense due to the high stakes in dominating the cloud computing market,” Mercer said. “These companies are competing on multiple fronts, including pricing, innovation, scalability, infrastructure, performance, and service offerings, such as genAI capabilities. The hyperscalers are using whatever strengths or leverage they have to win market share.” source

Microsoft hit with more litigation accusing it of predatory pricing Read More »

How AI orchestration has become more important than the models themselves

Large language models (LLMs) just keep getting better. In just about two years since OpenAI jolted the news cycle with the introduction of ChatGPT, we’ve already seen the launch and subsequent upgrades of dozens of competing models. From Llama 3.1 to Gemini to Claude 3.5 to GPT-o1, the list keeps growing, along with a legion of new tools and platforms used for developing and customizing these models for specific use cases. In addition, we’ve seen the introduction of a wide variety of small language models (SLMs), industry-specific LLMs, and, most recently, agentic AI models. The sheer number of options and configurations, not to mention the costs associated with these underlying technologies, is multiplying so quickly that it creates very real challenges for businesses that have been investing heavily to incorporate AI-powered capabilities into their workflows.

In fact, business spending on AI rose to $13.8 billion this year, up some 500% from last year, as companies in every industry shift from experimentation to execution and embed AI at the core of their business strategies. For companies that want to keep pace with each new advancement in a world that’s moving this fast, simply buying the latest, greatest, most powerful LLM will not address their needs.

Computing costs rising

Raw technology acquisition costs are just a small part of the equation as businesses move from proof of concept to enterprise AI integration. As many companies that have already adopted off-the-shelf GenAI models have found, getting these generic LLMs to work for highly specialized workflows requires a great deal of customization and integration of company-specific data. Applying customization techniques like prompt engineering, retrieval augmented generation (RAG), and fine-tuning to LLMs involves substantial data processing and engineering costs that can quickly spiral out of control, depending on the level of specialization a task requires. In 2023 alone, Gartner found that companies deploying AI spent between $300,000 and $2.9 million on inference, grounding, and data integration for proof-of-concept projects alone. Those numbers are only growing as AI implementations get larger and more complex.

The rise of vertical AI

To address that issue, many enterprise AI applications have started to incorporate vertical AI models. These domain-specific LLMs, more focused and tailor-made for specific industries and use cases, are helping to deliver the precision and detail that specialized business functions require. Spending on vertical AI has increased 12x this year as more businesses recognize the improvements in data processing costs and accuracy that specialized LLMs can achieve. At EXL, we recently launched a specialized Insurance Large Language Model (LLM), leveraging NVIDIA AI Enterprise, to handle the nuances of insurance claims in the automobile, bodily injury, workers’ compensation, and general liability segments. Our LLM was built on EXL’s 25 years of experience in the insurance industry and was trained on more than a decade of proprietary claims-related data. We developed the model to address the challenges many of our insurance customers faced when trying to leverage off-the-shelf LLMs for highly specialized use cases. Because those foundational LLMs do not include private insurance data or a domain-specific understanding of insurance business processes, our clients found they were spending too much time and money trying to customize them.
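To make the customization techniques mentioned above more concrete, here is a minimal sketch of the retrieval augmented generation (RAG) pattern, in which company-specific documents are retrieved and prepended to the prompt before the model is called. It is illustrative only: the toy relevance score, the document store, and the call_llm placeholder are assumptions for the example, not any vendor's actual API.

```python
"""Minimal RAG sketch: retrieve company-specific context, then prompt an LLM.

`call_llm` is a stand-in for whatever model endpoint is used (hosted or
self-hosted); everything else runs locally as plain Python.
"""

from dataclasses import dataclass


@dataclass
class Document:
    doc_id: str
    text: str


def score(query: str, doc: Document) -> float:
    """Toy relevance score: fraction of query words that appear in the document."""
    query_words = set(query.lower().split())
    doc_words = set(doc.text.lower().split())
    return len(query_words & doc_words) / max(len(query_words), 1)


def retrieve(query: str, corpus: list[Document], k: int = 2) -> list[Document]:
    """Return the k documents most relevant to the query."""
    return sorted(corpus, key=lambda d: score(query, d), reverse=True)[:k]


def build_prompt(query: str, context: list[Document]) -> str:
    """Ground the model in retrieved, company-specific context."""
    context_block = "\n".join(f"[{d.doc_id}] {d.text}" for d in context)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context_block}\n\n"
        f"Question: {query}\n"
    )


def call_llm(prompt: str) -> str:
    """Placeholder for a real model call (an assumption, not a specific API)."""
    return f"<model answer based on prompt of {len(prompt)} characters>"


if __name__ == "__main__":
    corpus = [
        Document("claims-guide", "Bodily injury claims require a medical report and adjuster review."),
        Document("policy-faq", "General liability policies exclude intentional acts."),
    ]
    question = "What documents are needed for a bodily injury claim?"
    print(call_llm(build_prompt(question, retrieve(question, corpus))))
```

In a production system the toy scorer would be replaced by embedding search over a vector store, which is where much of the data processing cost described above comes from.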
Our EXL Insurance LLM is consistently achieving a 30% improvement in accuracy on insurance-related tasks over the top pre-trained models, such as GPT-4, Claude, and Gemini. As a result, our clients are seeing enhanced productivity and faster claim resolution with lower indemnity costs and claims leakage – all powerful value drivers.

Choreographing data, AI, and enterprise workflows

While vertical AI solves the accuracy, speed, and cost challenges associated with large-scale GenAI implementation, it does not build an end-to-end workflow on its own. To integrate AI into enterprise workflows, we must first do the foundation work to get our clients’ data estate optimized, structured, and migrated to the cloud. This data engineering step is critical because it sets up the formal process through which analytics tools will continue to be informed, even as the underlying models keep evolving over time. It requires the ability to break down silos between disparate data sets and keep data flowing in real time. Once the data foundation is in place, it is important to select and embed the best combination of AI models into the workflow to optimize for cost, latency, and accuracy.

At EXL, our repertoire of AI models includes advanced pre-trained language models, domain-specific fine-tuned models, and intelligent AI agents suited for targeted tasks. These models are then integrated into workflows along with human-in-the-loop guardrails. This process requires not only technical expertise in designing the most effective AI architecture, but also deep domain knowledge to provide context and increase adoption in order to deliver superior business outcomes.

The goal here is to make AI integration feel completely seamless to end users. They should not be jumping in and out of different tools to access AI; the technology needs to meet them where they are, in the existing applications they’re already using. For example, EXL is currently working on a project with a multinational insurance company designed to improve underwriting speed and accuracy with AI. At its core, that process involves extracting key information about the individual customer from unstructured medical records and financial data, and then analyzing that data to make an underwriting decision. We created a multi-agent solution where one agent used specialized LLMs for data extraction alongside our EXL Insurance LLM for data summarization and insight generation. Meanwhile, a separate AI agent used machine learning and analytics techniques to make underwriting and coverage decisions based on the outputs from the first agent. By orchestrating, engineering, and optimizing the best combination of AI models, we were able to seamlessly embed AI into underwriting workflows without adding excessive data processing costs. Similarly, we orchestrated and engineered another multi-agent solution for a leading bank in the U.S. to autonomously address lost card calls.

Strategic AI orchestration is the real
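As a rough illustration of the two-agent pattern in the underwriting example above, the sketch below chains an extraction/summarization agent into a separate decision agent, with a human-in-the-loop check on low-confidence cases. All class names, thresholds, and the rule-based "models" are hypothetical placeholders, not EXL's implementation.

```python
"""Illustrative two-agent orchestration for an underwriting-style workflow.

Agent 1 extracts and summarizes applicant data (standing in for specialized
LLM calls); Agent 2 applies a decision model to Agent 1's output. Cases the
decision agent is unsure about are routed to a human reviewer.
"""

from dataclasses import dataclass


@dataclass
class ApplicantSummary:
    name: str
    age: int
    risk_flags: list[str]


class ExtractionAgent:
    """Stand-in for an LLM-based extraction and summarization step."""

    def run(self, record: dict) -> ApplicantSummary:
        flags = [key for key, value in record.get("history", {}).items() if value]
        return ApplicantSummary(record["name"], record["age"], flags)


class DecisionAgent:
    """Stand-in for a machine-learning underwriting model (here, simple rules)."""

    def run(self, summary: ApplicantSummary) -> tuple[str, float]:
        risk = 0.1 * len(summary.risk_flags) + (0.2 if summary.age > 65 else 0.0)
        decision = "decline" if risk > 0.5 else "approve"
        confidence = abs(risk - 0.5) * 2  # farther from the threshold = more confident
        return decision, confidence


def orchestrate(record: dict, review_threshold: float = 0.3) -> str:
    """Run extraction, then decisioning, with a human-in-the-loop guardrail."""
    summary = ExtractionAgent().run(record)
    decision, confidence = DecisionAgent().run(summary)
    if confidence < review_threshold:
        return f"{summary.name}: referred to human underwriter (low confidence)"
    return f"{summary.name}: {decision} (confidence {confidence:.2f})"


if __name__ == "__main__":
    applicant = {"name": "A. Smith", "age": 52, "history": {"smoker": True, "chronic_condition": False}}
    print(orchestrate(applicant))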

How AI orchestration has become more important than the models themselves Read More »

5 tips for better business value from gen AI

Centralize and improve data quality around customer interactions to enhance the accuracy, completeness, and timeliness of data insights; Improve customer retention and prospect conversion rates by developing gen AI use cases aimed at personalizing marketing content campaigns; Facilitate change management in marketing and sales by gaining adoption of a few winning approaches and sharing best practices rather than serially experimenting with many capabilities.

Target call center and service operations

Call centers, customer service departments, IT service desks, and other support services have significant amounts of data in the form of service tickets, knowledge bases, and user profile information from CRM and HCMS platforms. Gen AI applied in these areas can have a force-multiplying impact by improving customer or employee satisfaction scores, reducing costs, and improving job satisfaction for service desk employees. “In support functions, gen AI expedites call center operations by generating rapid, context-aware responses to intelligently route queries, reduce average handling time, and improve resolution rate,” says Ram Ramamoorthy, director of AI research at ManageEngine. “In IT service management, AI-driven knowledge graphs provide issue diagnosis and proactive resolution, decreasing downtime.” Ashwin Rajeeva, co-founder and CTO of Acceldata, recommends CIOs collaborate with department leaders on gen AI use cases and “track Net Promoter Scores and resolution times in customer support to quantify AI’s impact on loyalty and efficiency. In HR, measure time-to-hire and candidate quality to ensure AI-driven recruitment aligns with business goals. Observability metrics such as data quality, freshness, and consistency provide essential insights that enhance the reliability and precision of these AI-driven outcomes.” source
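As a simple illustration of the measurement Rajeeva recommends above, the sketch below computes a Net Promoter Score and an average resolution time from support-ticket records. The field names and data shapes are assumptions made for the example.

```python
"""Toy metric tracking for gen AI in support: NPS and average resolution time."""

from statistics import mean


def net_promoter_score(survey_scores: list[int]) -> float:
    """NPS = % promoters (9-10) minus % detractors (0-6), on a -100..100 scale."""
    promoters = sum(1 for s in survey_scores if s >= 9)
    detractors = sum(1 for s in survey_scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(survey_scores)


def average_resolution_hours(tickets: list[dict]) -> float:
    """Mean time from ticket open to resolution, in hours (timestamps in seconds)."""
    return mean((t["resolved_at"] - t["opened_at"]) / 3600 for t in tickets)


if __name__ == "__main__":
    scores = [10, 9, 8, 7, 6, 10, 3, 9]
    tickets = [
        {"opened_at": 0, "resolved_at": 5_400},   # 1.5 hours
        {"opened_at": 0, "resolved_at": 10_800},  # 3.0 hours
    ]
    print(f"NPS: {net_promoter_score(scores):+.0f}")
    print(f"Avg resolution: {average_resolution_hours(tickets):.1f} h")
```

Tracking these numbers before and after a gen AI rollout is what turns "improved resolution rate" from a claim into something a CIO can report.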

5 tips for better business value from gen AI Read More »

Nearly 25% of SAP ECC customers unsure about their future

The future of SAP architectures is hybrid. But according to a survey conducted by the Financials subgroup of the German-speaking SAP User Group (DSAG), many organizations have not yet decided where exactly the journey will take them. Nearly half of SAP customers surveyed (47%) still work with on-prem SAP ERP Central Component (ECC), and many have not yet determined which path they want to take to S/4HANA once ECC hits end of maintenance. From Aug. 15 to Sept. 16, 2024, DSAG surveyed 267 representatives of member companies in its Financials subgroup, which spans areas such as financial services, energy supply, real estate, audit and risk management, data protection, and taxes. In addition to the 47% on ECC, 42% are using S/4HANA (Classic Edition) on-premises. Just 11% of SAP customers surveyed are currently on S/4HANA Cloud, with 8% on the Private Edition and 3% on the Public Edition. source

Nearly 25% of SAP ECC customers unsure about their future Read More »

Is Technical Debt a Barrier to AI App Deployment?

Nearly every IT leader today is in the midst of moving the next generation of AI apps from the design phase into deployment, and they are finding that they must grapple with the problems that arise when those apps depend on legacy data or infrastructure. And according to the attendees of our CIO Roundtables, there is no one-size-fits-all answer: each case of legacy dependence must be evaluated separately. Once IT leaders have evaluated how specific AI projects are affected by technical debt, they then check the organization’s existing plan for addressing that debt, possibly choosing to accelerate the parts of the plan that will best help meet the goals of the specific AI project. Particularly tricky are AI apps that depend on resources trapped by technical debt, usually because data is stuck in a system with substantial issues. There are two common problems. In some cases, it is not possible to extract the data from the legacy environment in a way that will support the goals and functionality of the AI app. That might require a rebuild or a total scrapping of the app, with a new design needed to replace it. That’s expensive and time-consuming, but there might be no other option. The second scenario occurs when the new app can get at the data, but the data cannot be delivered at the speed necessary to support a real-time or near-real-time AI app. Addressing this issue is possible, but the solution will depend on what the particular legacy system can technically support. Again, one size does not fit all. source
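One common mitigation for that second scenario, assuming the legacy system can at least tolerate slow individual reads, is to put a read-through cache with an explicit freshness window between the legacy store and the AI app. The sketch below is a generic illustration of that pattern, not a prescription for any particular system; the fetch_from_legacy function and the five-minute staleness budget are assumptions.

```python
"""Read-through cache in front of a slow legacy data source.

The AI app calls get_customer(); results are served from the cache while they
are fresher than MAX_AGE_SECONDS and refetched from the legacy system otherwise.
fetch_from_legacy is a placeholder for the real (slow) integration.
"""

import time

MAX_AGE_SECONDS = 300  # acceptable staleness for a "near-real-time" AI app
_cache: dict[str, tuple[float, dict]] = {}


def fetch_from_legacy(customer_id: str) -> dict:
    """Placeholder for a slow legacy call (e.g. a batch extract or API bridge)."""
    time.sleep(0.1)  # simulate legacy-system latency
    return {"id": customer_id, "segment": "retail", "balance": 1250.0}


def get_customer(customer_id: str) -> dict:
    """Serve cached data when fresh enough; otherwise refresh from the legacy system."""
    now = time.time()
    cached = _cache.get(customer_id)
    if cached and now - cached[0] < MAX_AGE_SECONDS:
        return cached[1]
    record = fetch_from_legacy(customer_id)
    _cache[customer_id] = (now, record)
    return record


if __name__ == "__main__":
    get_customer("C-42")          # slow: hits the legacy system
    print(get_customer("C-42"))   # fast: served from the cache
```

Whether this is enough depends, as the roundtable attendees noted, on what staleness the AI use case can actually tolerate.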

Is Technical Debt a Barrier to AI App Deployment? Read More »

Enter the next phase of Industry 4.0 with edge AI

The business world is changing at a rapid pace. Less than 10% of the FTSE 500 companies that existed fifty years ago are still around today, and less than half of the companies founded since 2000 are still operating. Company executives are well aware that their businesses need to adapt to keep up with the rapid transformation now taking place. Two things play an essential role in a firm’s ability to adapt successfully: its data and its applications. If these don’t have a modern foundation, the whole transformation project will be doomed to failure. That’s why the issue is so important today. To cope with future challenges, companies need to adapt quickly: slim down and become more agile, be more innovative, become more cost-effective, yet remain secure in IT terms. Which is why modernising applications is so important, especially for traditional businesses – they need to keep pace with the challenges facing trade and commerce today. The matter is particularly pressing in view of the stiff competition from tech-savvy companies working in the cloud, as it is much easier for them to be creative and agile. What is also important is preserving the “intellectual legacy” of an increasingly mobile but aging workforce and drawing on it in the new era.

All kinds of things can be automated

The question is, how should businesses go about modernising their own applications effectively? Generally speaking, a healthy application and data architecture is at the heart of successful modernisation. This requires understanding the current state of an organisation’s applications and data by conducting a thorough baseline analysis. Aligning modernisation with the firm’s business results and corporate vision is another key factor. The prioritisation and implementation of steps have to be adapted accordingly in order to achieve specific business objectives. A high-street bank in the UK shows just how necessary it is to tackle the challenges of modernisation systematically. At the time, only three employees were left to maintain the IT system and run the company’s core processes. They were no longer able to meet customers’ needs, and as a result, customer service at the bank suffered and its ranking dropped dramatically – a case where it would not have been enough simply to move the bank’s applications into the cloud. Stabilisation and extensive modernisation were called for to boost its business results.

AI can accelerate processes

What exactly is stopping companies from taking this kind of action, then (apart from the potential costs involved)? The thing that makes modernising applications so difficult is the complexity of the heterogeneous systems that companies have developed over the years. On top of that, there is a shortage of skilled workers capable of dealing with this degree of complexity. The good news is that, these days, modernising applications is a discipline that draws on a wealth of experience, and much of it can now be harnessed automatically. For example, IBM has developed hundreds of tools and approaches (or “journeys”) over the last 25 years which facilitate the modernisation process in organisations and meet a broad range of requirements. These have now been grouped together on a platform known as the IBM Consulting Cloud Accelerator. It can provide users with specific execution and transformation steps that accelerate the modernisation process significantly – by around forty per cent in terms of planning alone.
AI is another technology IBM employs to help speed up the process – and it fits into the existing framework as well. Take IBM watsonx Code Assistant for Z, for example. Among other things, this AI-based solution helps developers convert COBOL code to Java quickly and efficiently. This makes their work easier and reduces new applications’ time to market. IBM watsonx Code Assistant for Z is the first of a series of AI technologies that can help accelerate the modernisation process in the future.

Partnerships and co-creation

Business partnerships are another factor in accelerating application modernisation. After all, in many cases, modernisation is about creating the perfect interplay between secure core systems on a company’s premises and the capabilities that hyperscalers offer in the hybrid cloud. IBM and Amazon Web Services (AWS) have partnered to make this easier. The two companies offer a wide range of joint services, spanning the migration and modernisation of applications and databases, the overhaul of existing apps, the development of modern applications, and DevOps on AWS. This benefits customers in several ways: the partnership between the two tech giants means considerable industry know-how and technical capability can be combined to get their modernisation on track strategically – and quickly. “Collaboration” is the key word when it comes to getting started in application modernisation. IBM’s Garage method has proven its worth here, for example: with the help of design thinking, IBM consultants and their customers jointly develop pilot projects, use cases, and standards to begin the modernisation process. Combined with the use of templates and architectural guidelines, this collaborative approach can be followed successfully throughout the whole modernisation process. Learn more about NTT DATA and Edge AI source

Enter the next phase of Industry 4.0 with edge AI Read More »

ADIB-Egypt announces 1 billion EGP digital transformation plan

ADIB-Egypt has announced plans to invest 1 billion EGP in technological infrastructure and digital transformation by 2025. This ambitious initiative is poised to position ADIB-Egypt at the forefront of the digital banking revolution, transforming how customers interact with their financial services. In recent years, ADIB-Egypt has already made substantial strides in integrating technology into its operations. The bank has been dedicated to enhancing its digital platforms and improving customer experience. From the launch of its mobile banking app in 2020 to the enhancement of its internet banking services, ADIB-Egypt has consistently focused on providing convenient, secure, and user-friendly digital banking solutions. The investment in digital infrastructure is not just an extension of these efforts, but a strategic move to drive efficiency, innovation, and customer satisfaction to new heights. The EGP 1 billion investment will be used to bolster the bank’s technological capabilities, including the development of state-of-the-art data centers, the adoption of cloud technology, and the implementation of artificial intelligence (AI) and machine learning solutions. These technologies will allow ADIB-Egypt to better serve its growing customer base, offering more personalized products and services while ensuring that all transactions are secure, swift, and seamless. Additionally, the investment will help improve the bank’s internal operations, streamlining processes and reducing costs. source

ADIB-Egypt announces 1 billion EGP digital transformation plan Read More »

What CIOs are in for with the EU’s Data Act

How CIOs are working on the Data Act

As required by current regulations for private healthcare, elderly healthcare management company Karol Strutture Sanitarie collects patient data in medical records, which allows it to use the data even after hospitalization. The data is, in fact, recorded by medical devices, remains in the logs, and is shared with the suppliers or manufacturers of these devices. “The Data Act impacts data sharing,” says Massimo Anselmo, its director of information systems. “An important aspect is, for example, our ability to use patient data for research purposes after anonymizing it, in line with GDPR. The Data Act helps us because it defines more clearly how to use this data, and we’re currently trying to understand if, compared to the past, there’s more data we can make available to patients. So not only the results of a diagnostic test, but also the specifications of the machine used. Most of the medical machines are owned by us, but with the Data Act, we’ll always have a relationship with the manufacturer to analyze the logs and verify their correct functioning or schedule maintenance. I also foresee an intervention on contracts with suppliers, together with the legal office, and on rental machines, to control which data are shared and for how long.” The impact on Karol’s data governance won’t be a major upheaval either, adds Anselmo. “I’ll have to work, above all, on monitoring data traffic and protecting communications, while isolating some data and regulating access,” he says. source
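As a minimal illustration of preparing patient and device records for research use, the sketch below drops direct identifiers and replaces the patient ID with a salted hash. This is pseudonymization only, a simplified assumption for the example; genuine anonymization under GDPR requires a broader assessment of re-identification risk, and the field names and salt handling here are hypothetical.

```python
"""Minimal pseudonymization sketch for research extracts of patient/device records.

Direct identifiers are removed and the patient ID is replaced with a salted
hash so records can still be linked within the research dataset. This is a
simplified illustration, not a complete GDPR anonymization process.
"""

import hashlib

SALT = "rotate-and-store-this-secret-separately"  # assumption for the example
DIRECT_IDENTIFIERS = {"name", "address", "phone", "date_of_birth"}


def pseudonymize(record: dict) -> dict:
    """Strip direct identifiers and hash the patient ID."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    cleaned["patient_id"] = hashlib.sha256(
        (SALT + str(record["patient_id"])).encode()
    ).hexdigest()[:16]
    return cleaned


if __name__ == "__main__":
    raw = {
        "patient_id": "IT-000123",
        "name": "Mario Rossi",
        "date_of_birth": "1950-02-11",
        "device_model": "VentiCare X2",          # hypothetical device name
        "device_log_error_rate": 0.002,
    }
    print(pseudonymize(raw))
```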

What CIOs are in for with the EU’s Data Act Read More »

Dubai Police will have its first floating smart police station in 2026

Dubai Police has always been at the forefront of innovation, embracing technology to enhance the safety, security, and well-being of the city’s residents and visitors. As part of its ongoing digital transformation, the force is launching a series of initiatives that integrate smart technology, artificial intelligence (AI), and robotics into its operations. Among the most groundbreaking of these projects is the announcement of the Middle East’s first floating Smart Police Station (SPS), set to go live by the end of 2026. A vision for smarter policing This floating SPS is part of an ambitious AED 2 billion initiative announced by His Highness Sheikh Mohammed bin Rashid Al Maktoum, Vice President and Prime Minister of the UAE and Ruler of Dubai. The initiative, which includes specialized police training, improved employee well-being through housing projects, and enhanced security measures, is designed to elevate Dubai Police’s operations and ensure the safety of its citizens in a rapidly changing world. Lieutenant Colonel Faisal Al Tamimi, Director of the Assets and Facilities Department at Dubai Police, described the project as a transformative leap forward in police services. The floating SPS will offer a wide range of advanced services at sea, meeting the needs of yacht and boat owners, as well as water sports enthusiasts. The station is designed to ensure faster, easier access to police services, aligning with the broader vision of making Dubai the “World’s Smartest and Happiest City.” source

Dubai Police will have its first floating smart police station in 2026 Read More »