Discover’s hybrid cloud journey pays off

Discover’s implementation is unique in that it operates its OpenShift platform in AWS virtual private clouds (VPCs) on AWS multi-tenant public cloud infrastructure; this approach lets OpenShift abstract away the underlying cloud, explains Ed Calusinski, Discover’s VP of enterprise architecture and technology strategy.

For many years, the Riverwoods, Ill.-based finserv hosted workloads on a cloud platform within its own data centers. The OpenShift hybrid approach gives Discover the choice to run workloads on private or public clouds, enabling it to better manage and move workloads across multiple clouds and avoid vendor lock-in.

“More workloads were moved [to the cloud] in the first six months of this year than in all the years before, by far, orders of magnitude more,” Strle says. “Due to the elasticity of the environment, we were able to handle circumstances such as big surges, and that’s very important to us because of the way we do marketing and campaigns and different ways people interact with our rewards. That can lead to very spiky consumer behavior, and we can dynamically grow our capacity on public clouds.”

Process Intelligence: The CIO's secret weapon for unlocking value

CIOs are under intense pressure to deliver massive digital transformation initiatives with limited resources and under tight time constraints. Boards of directors are placing a high priority on deploying generative AI as fast as possible so their organizations don’t lose competitive advantage. Meanwhile, organizations running SAP ERP platforms have until 2027, when support ends, to upgrade from ECC and R/3 to S/4HANA. These are just two examples of the many challenges CIOs have on their plates.

Enter process intelligence, a data-driven approach that’s revolutionizing how CIOs navigate these challenging transformations. By providing a fact-based view of how systems and processes flow within organizations, it enables more informed decision-making at both strategic and tactical levels.

Here’s how it works. The platform uses process mining and augments it with business context to give companies a living digital twin of the way their business operates. It’s system-agnostic and unbiased, which means companies share a common language for understanding and improving how their business runs, connecting them to their processes, their teams to each other, and emerging technologies to their business. Employees and teams can then collaborate more effectively to optimize the business within and across processes. Process intelligence can be applied to every process in every industry, allowing improvement to scale to the level of your ambition and drive the results we all know are possible.

Consider a large system migration challenge. Process intelligence helps CIOs tackle the complexity by providing clear visibility into current operations. For instance, a major alcohol distributor uses process intelligence to create detailed heat maps of its requirements across regions and geographies, an analysis that would have been prohibitively expensive and time-consuming using traditional methods. Process intelligence provides a common language between stakeholders by objectively documenting how work flows through the organization, helping managers make data-driven decisions. The technology also bridges the often-challenging gap between business and IT teams. During an upgrade, when custom code often needs to be retired and bespoke processes need to be standardized, business units may resist change. With facts and data, this decision-making becomes simpler.

When it comes to generative AI initiatives, many organizations rush in without a proper understanding of their processes and risk implementing a large language model that doesn’t produce the ROI the business expects. Deployments are often extremely complex, involving specialized, high-performance hardware, rollout of use cases, change management, and lengthy training cycles to help people adjust to new ways of working. Process intelligence identifies where slowdowns and bottlenecks occur so managers can speed up and, where appropriate, simplify the deployment process.

Real-world success stories demonstrate the technology’s impact. HARMAN, a wholly owned subsidiary of Samsung Electronics, leveraged process intelligence for business case planning during its transformation journey and currently uses it for fit-gap analysis, custom code analysis, and master data cleanup. As a result, it is accelerating progress toward completing its system migration.
Another large consumer products company employed process intelligence to monitor user adoption during the hypercare phases of its implementation, quickly identifying and resolving challenges in order execution and fulfillment. The end result? Happier customers.

The benefits of process intelligence extend beyond technical considerations. Project management offices (PMOs) find that process intelligence helps define clearer program scope, reducing the risk of scope creep and budget overruns. Systems integrators can bid more accurately on projects and complete them faster when they have detailed process insights at their disposal.

Celonis is the global leader in process mining and process intelligence. Well-known brands such as PepsiCo, Uber, ExxonMobil, Diageo, Mars, Calor Gas, Pfizer, and many more employ its platform to transform systems and execute initiatives faster. To find out how Celonis can help your organization, visit here.
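To make the process mining idea above concrete, here is a minimal sketch of its core step: reconstructing a directly-follows graph from an event log. This is a toy illustration, not Celonis’s implementation; the event log, activity names, and helper function are invented for the example.

```python
# Count how often activity A is directly followed by activity B within a
# case. Commercial process intelligence platforms build on this kind of
# graph, layering business context, conformance checks, and visualization.
from collections import Counter, defaultdict

# Hypothetical event log: (case id, activity), rows already in time order.
event_log = [
    ("order-1", "Create Order"), ("order-1", "Check Credit"),
    ("order-1", "Ship"),
    ("order-2", "Create Order"), ("order-2", "Check Credit"),
    ("order-2", "Manual Review"), ("order-2", "Ship"),
]

def directly_follows(log):
    traces = defaultdict(list)
    for case, activity in log:
        traces[case].append(activity)          # group events by case
    edges = Counter()
    for activities in traces.values():
        edges.update(zip(activities, activities[1:]))  # successive pairs
    return edges

for (a, b), count in sorted(directly_follows(event_log).items()):
    print(f"{a} -> {b}: {count}")
```

Detours such as the extra “Manual Review” step surface as additional edges in the graph, which is how nonstandard paths and bottlenecks become visible to managers.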

No support or updates for Windows 11 on machines not meeting minimum hardware requirements, says Microsoft

Microsoft also doesn’t elaborate on what it means by Windows 11 “compatibility issues,” so this is a matter of guesswork. However, it’s plausible that new features that assume a TPM is available could cause instability on a machine lacking one. Compatibility could also suffer through drivers for older hardware no longer supported in Windows 11, although this is more likely to become an issue over the longer term.

Meet the TPM

Microsoft’s minimum requirements for Windows 11 cover several hardware components, including having enough RAM and a powerful enough microprocessor. But the most contentious issue is whether a PC contains or supports a Trusted Platform Module (TPM), specifically version 2.0, released in 2014. A TPM is a secure enclave for storing data such as cryptographic keys, certificates, and biometric information fundamental to the security of a PC, including those required for low-level checks such as Secure Boot, or for the use of Microsoft’s BitLocker in its more secure mode. Having one is somewhere between a good idea and essential, as more and more software going forward assumes one will be there as the root of trust.

For a summary of the arguments in favor of upgrading to a system with TPM 2.0, Hosking’s blog is a good place to start.
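For readers who want to know where their own machine stands, here is a small sketch that shells out to tpmtool, the TPM utility that ships with Windows 10 and 11, and prints what it reports; a line such as “TPM Version: 2.0” indicates the machine meets this particular requirement. This assumes a Windows machine and makes no attempt to parse the output.

```python
# Query the local TPM via Windows' built-in tpmtool.exe and print the
# report. On a machine without a TPM, tpmtool states that none is present.
import subprocess
import sys

def tpm_device_information() -> str:
    result = subprocess.run(
        ["tpmtool", "getdeviceinformation"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    if sys.platform != "win32":
        sys.exit("This check only applies to Windows machines.")
    # Look for "TPM Version: 2.0" in the printed report.
    print(tpm_device_information())
```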

Learning from the AI leaders

Unsurprisingly, lack of skills is cited as the biggest challenge. Issues around data governance and the absence of clear metrics follow close behind. All of these relate to a lack of experience with AI: as organisations embark on their journeys, they have to learn what is needed to ensure a successful project.

When it comes to failure, leaders contend with issues such as privacy and compliance, whereas for followers the biggest cause of failure is the inability to access data due to infrastructure restrictions.

Having guardrails in place is key. “Two critical foundations for AI integration at a policy and governance level are that you have trust in your data and that the data is ethically managed,” says Deepak Ramanathan, Vice President of Global Technology Practice at SAS. He continues: “This demonstrates to your team and stakeholders that you are taking the appropriate actions to mitigate risk and liability. When it comes to Responsible AI, that includes not just those potential risks, but also the need to ensure that your models are driving accurate and actionable insights. At its core, Responsible AI begins with good policy and that flows onto rigorous technical execution, ensuring good governance is embedded at the heart of ‘AI Leaders’ systems.”

Solar company leverages digital transformation for a brighter tomorrow

Humans have long been fascinated by the power of the sun. As early as the 7th century B.C., humans used magnifying glasses to spark fires with sunlight. Later, we used mirrors to light torches. But if you get too close to the power of the sun while continuing to use outdated methods and tools, you’re apt to crash and burn. Just ask Icarus.

Our fascination with the sun has led us to work continuously to develop solar power and make harnessing the sun’s power a reality. Nowadays, you can use solar power to run your entire home. It can be used to clean water or power electric vehicles, trains, and even spacecraft. It is even being used to power planes that can fly around the world.

A sun-powered future is bright and sustainable

Today, solar energy is one of the world’s cleanest, most sustainable energy resources. And decentralized solar power is currently the most cost-effective option for energy generation, compared to wind, gas, and coal. That’s why SMA Solar Technology, a global energy solutions provider, is on a mission to help make this sustainable, secure, and cost-effective energy supply accessible to everyone worldwide. Achieving this mission requires solar storage systems and digital solutions that balance energy generation and consumption so customers can share their electricity with other consumers. And rapid innovation is necessary for building these systems in the future.

Blinded by the light as a result of aging and complex systems

SMA’s manufacturing execution system (MES) was initially built in-house over 40 years ago. As the system continued to develop over the years, it became extremely complex, difficult to maintain, and inefficient. Production downtime resulted in frustrated employees, high logistics costs, unsatisfactory inventory management, and a lack of process transparency. The aging system posed a major challenge to the delivery of the latest solar technology innovations. The forward-thinking technology company knew that if it was going to be successful in getting solar power into the hands and homes of more customers, it needed its business systems to be more agile and resilient.

Brilliant innovation requires digital transformation

PwC supported SMA in the selection of SAP solutions to achieve the client’s challenging goals. As an implementation and service partner, the SAP Cloud Success Services team supported SMA in its ambitious digital transformation and move to the cloud. Together, they built a comprehensive and scalable global digital business platform. With a cloud foundation that includes SAP S/4HANA Cloud Private Edition integrated with the SAP Digital Manufacturing solution, operations were overhauled and modernized. With the support of SAP’s Cloud Success Services team and PwC, SMA has adopted a flexible cloud roadmap to create a modern framework for the future. This secure framework will help SMA pave the way for the future of energy supply.

Digital transformation lights up new horizons

With its digital transformation fully underway, SMA is seeing impressive results: boosting supply chain productivity by 15%, cutting supply chain planning costs by 10%, and reducing inventory carrying costs by 10%. The initiative will also help streamline transportation planning, curb freight spending, and establish a collaborative partner portal. Additionally, the platform’s real-time visibility can help improve order status transparency while supporting connected devices.
These outcomes highlight an enhancement in operational efficiency across the organization.

Here comes the sun: Sustainable energy for all

SMA now has the digital foundation it needs to support the global energy transition and the decarbonization of the planet. It has the right software, processes, services, and data to enable customers to participate in the energy system of today and tomorrow. And it is doing it all in the most sustainable way possible. SMA Solar Technology is an SAP Innovation Awards 2024 finalist. To learn more about SAP S/4HANA Cloud and SAP Digital Manufacturing, download the pitch deck.

Unlocking AI potential: The essential role of hybrid, multi-cloud strategies

At NetApp, we recognize that AI is not merely a technological tool; it’s a transformative mindset that can reshape organizations and industries. To harness its full potential, it is essential to cultivate a data-driven culture that permeates every level of your company. Our company is not alone in adopting an AI mindset. Notably, hyperscale companies are making substantial investments in AI and predictive analytics. Their role is crucial in helping businesses improve customer experiences and create new revenue streams through AI-driven innovations. However, each cloud provider offers distinct advantages for AI workloads, making a multi-cloud strategy vital:

AWS provides diverse pre-trained models for various generative tasks, including image, text, and music creation.
Google is making strides in developing specialized AI models, such as those tailored for healthcare applications like ultrasound image interpretation.
Azure’s generative AI solutions integrate seamlessly with Microsoft’s ecosystem, offering a cohesive experience for organizations heavily invested in its products.

NetApp’s first-party, cloud-native storage solutions enable our customers to quickly benefit from these AI investments. For example, NetApp BlueXP workload factory for AWS integrates data from Amazon FSx for NetApp ONTAP with Amazon Bedrock’s foundational models, enabling the creation of customized retrieval-augmented generation (RAG) chatbots. This integration allows organizations to leverage their proprietary data in generative AI applications, enhancing the relevance and accuracy of AI-generated responses. By using a multi-cloud approach, businesses can take advantage of each cloud provider’s unique strengths and choose the best platform for each GenAI RAG-based project, without being limited to just one provider’s ecosystem.

Moreover, multi-cloud data solutions are essential for complying with regulatory frameworks like the European Union’s Digital Operational Resilience Act (DORA), which goes into effect this January. DORA’s security requirements apply to a wide range of financial institutions, including banks, investment firms, payment service providers, asset managers, and crypto-asset service providers. Additionally, it encompasses third-party information and communications technology (ICT) service providers who deliver critical services to these financial organizations, such as data analytics platforms, software vendors, and cloud service providers. DORA requires financial firms to have strategies in place to manage risk related to their third-party service providers, such as AWS and Microsoft Azure, whether through a managed process like an exit strategy or in response to an unexpected event like a cyberattack.

By using intelligent data infrastructure from NetApp, financial institutions can securely end contracts with third-party providers and seamlessly transfer training and inferencing data to a new cloud platform. This ensures uninterrupted business operations during the transition, maintains service quality for clients, and adheres to regulatory requirements. In addition, they can actively monitor and safeguard the data, enabling rapid recovery in the event of an attack. NetApp believes that, even though many businesses will choose public cloud services for AI, there are compelling reasons why specific organizations may decide to run AI workloads in their private data centers or use a hybrid cloud model.
For particular industries, such as healthcare, defense contracting, government, and finance, the sensitivity of their business data makes cloud-based data preparation, model training and fine-tuning, and inferencing unsuitable. Our data solutions support companies that opt for a do-it-yourself (DIY) approach with proprietary or open-source models, leverage a turnkey converged AI solution like NetApp AIPod with Lenovo or FlexPod for AI, or adopt a hybrid model that combines data center resources with cloud-based services.

NetApp data solutions support a hybrid, multi-cloud strategy

AI has advanced rapidly, with models increasing in complexity, data sets expanding in size, and the demand for real-time insights becoming more crucial. Organizations can use hybrid, multi-cloud strategies to distribute their AI workloads across on-premises and different cloud environments, optimizing performance, cost, and resource allocation. NetApp has the tools necessary to make your hybrid, multi-cloud AI deployments a success:

Unified data management: It is no secret that data silos slow down AI projects. NetApp’s intelligent data infrastructure unifies access to file, block, and object storage, offering configurations ranging from high-performance flash to cost-efficient hybrid flash storage. It is available in data centers, colocation facilities, and through our public cloud partners. As the only provider offering first-party, cloud-native storage solutions on all three major public clouds (Amazon FSx for NetApp ONTAP, Microsoft Azure NetApp Files, and Google Cloud NetApp Volumes), NetApp enables organizations to easily move, manage, and protect their data across various cloud platforms, reducing the strain of moving data and minimizing data silos.

Integrated AI service capabilities: To leverage the unique strengths of each cloud platform with industry-specific knowledge or business-specific information, organizations need to integrate proprietary enterprise data with custom task-based models. This can be a challenging task. NetApp has developed a variety of integrated toolkits that are helping to solve this problem. Our AWS customers can deploy and manage RAG pipelines with the launch of the GenAI capability in BlueXP workload factory. With this new capability, customers can securely connect data in ONTAP with Amazon Bedrock to develop GenAI applications without having to copy that data to Amazon S3. The GenAI toolkit, which supports Google Cloud NetApp Volumes, speeds up the implementation of RAG operations while enabling secure and automated workflows that connect data stored in NetApp Volumes with Google’s Vertex AI platform. NetApp’s GenAI toolkit is also in preview for Azure NetApp Files in Microsoft Azure. To build GenAI RAG-based applications that provide the most relevant results, companies need the ability to seamlessly connect custom models from any cloud partner to their business data.

Data governance: Data classification involves categorizing data based on its sensitivity, value, and regulatory requirements across multiple clouds. Our comprehensive set of features goes beyond basic data cataloging. Leveraging AI, machine learning, and natural language processing technologies, we categorize and classify data by type, redundancy, and sensitivity, highlighting potential compliance exposures.
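As a rough illustration of the classification idea just described, here is a minimal rule-based sketch. Real platforms, NetApp’s included, combine machine learning and NLP with far richer detectors; the regex patterns, category labels, and sample text below are illustrative assumptions, not the product’s actual logic.

```python
# Flag potentially sensitive data in a document using simple patterns.
# A toy stand-in for ML/NLP-driven classification by type and sensitivity.
import re

SENSITIVE_PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(document: str) -> dict:
    """Return each sensitivity category found and the matching strings."""
    hits = {}
    for label, pattern in SENSITIVE_PATTERNS.items():
        matches = pattern.findall(document)
        if matches:
            hits[label] = matches
    return hits

sample = "Contact jane.doe@example.com, SSN 123-45-6789."
print(classify(sample))  # {'email': [...], 'us_ssn': [...]}
```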
NetApp offers a range of data classification strategies tailored to the unique challenges posed by generative AI:

Data estate visibility: Improve the cleanliness of your data and gain knowledge about sensitive information with complete visibility of your entire NetApp data estate, both on premises and in the public cloud.
Discover personal and sensitive data: Our classification capabilities can
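To ground the RAG pattern referenced throughout this piece, here is a minimal, self-contained sketch of the retrieve-then-augment flow. The bag-of-words “embedding”, toy corpus, and helper names are stand-ins; a production pipeline would use a real embedding model and send the assembled prompt to a foundation model, for example via Amazon Bedrock as described above.

```python
# Retrieval-augmented generation in miniature: score documents against the
# query, then prepend the best matches to the prompt so the model answers
# from enterprise data rather than from memory alone.
from collections import Counter
import math

def embed(text):
    return Counter(text.lower().split())  # toy bag-of-words "embedding"

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus, k=2):
    q = embed(query)
    return sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query, corpus):
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

docs = [  # invented proprietary documents
    "Order 1042 shipped from the Frankfurt warehouse on 2024-11-02.",
    "Return policy: hardware may be returned within 30 days.",
    "The Frankfurt warehouse handles all EU fulfillment.",
]
print(build_prompt("Where did order 1042 ship from?", docs))
```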

2025 is the year of quantum science; what that’s all about

In celebration of the 100th anniversary of the discovery of quantum mechanics, the United Nations has declared 2025 the International Year of Quantum Science and Technology. If that milestone catches you off guard, you’re not alone: much of the activity and focus around quantum computing happens with little fanfare. Progress is steadily being made in quantum research, but the day when somebody can truly “master” quantum mechanics and release its true potential has remained elusive. When that does happen, the good news is that quantum computers will potentially be able to solve in seconds extremely complex tasks that currently take years. The bad news is that quantum computers could also solve the data “puzzles” that are at the heart of encryption protection, leaving all systems and data immediately vulnerable.

“The advent of quantum computing is a double-edged sword, offering unparalleled compute power while posing unprecedented cybersecurity challenges. The transition to post-quantum cryptography may seem daunting, but with the right resources, strategic planning, and trusted partnerships, enterprises can ensure the protection of sensitive data against future quantum cyberattacks,” says Heather West, Ph.D., research manager and quantum computing research lead at IDC.

Where quantum development is, and where it’s heading

In the meantime, the United Nations designation recognizes that quantum science has reached the point where the promise of quantum technology is moving out of the experimental phase and into the realm of practical applications. “Quantum computing is at an exciting stage, where active collaboration between academia and industry is leading to rapid innovation,” explains Mohit Pandey, a quantum computing scientist who has extensive experience using quantum computing for drug discovery in the biotechnology industry.

“Before the 2010s, quantum computers were an exotic concept [and] were confined to isolated academic discussions. But in the last 10 years, we have seen an accelerated pace of the use of quantum technology in business optimization problems, drug discovery, communication, and encryption,” Pandey explains. These advances have been made possible by the extensive availability of quantum computers to the public, Pandey says. There are now scores of quantum hardware companies, including large technology companies such as Google, Microsoft, and IBM and startups such as QuEra, IonQ, and Kipu Quantum. In addition, many organizations can now access quantum computers through cloud platforms such as Amazon Web Services’ Amazon Braket, Pandey explains. This widespread access opens the opportunity for organizations to investigate the ways in which quantum computing can solve challenging problems.

A campaign to make quantum better understood

Rather than commemorate a particular event or development in quantum research, the United Nations designation is intended to make quantum science more approachable and understandable, explaining what quantum computing means for everyday life in simple terms, says Arthur Tisi, founder of Hunova and co-founder of BaseForge, an advanced technology advisory that deals with quantum computing, artificial intelligence, and machine learning strategies. “‘Quantum for everyone’ might be a tagline,” Tisi says.
“The campaign will promote how advancements in quantum require international cooperation — similar to climate change — showing that breakthroughs in this field could benefit the entire world, impacting entire industries, not just a few nations or corporations.”

The UN messaging needs to convey that quantum technology still comes with plenty of challenges, but that it can be a catalyst for solving previously insurmountable problems, Tisi says. Global challenges come with hard work and alignment around internet protocols, national security, and regulation, especially around ethics. But it could have a huge impact on such things as drug discovery, setting aside the bureaucracy within government agencies, he says.

Quantum computers will potentially be able to solve problems involving enormous databases and complex algorithmic challenges, and do it with lightning speed, says Thomas Vartanian, executive director of the Financial Technology & Cybersecurity Center. What is not clear is whether they will function any better in many of the areas where computers serve today.

Quantum benefits aren’t for every organization

What will the full benefit be for organizations when the quantum challenge is resolved? For smaller organizations, probably not much. For large organizations, probably plenty, says Vartanian. Taking into account just the sector that he tracks, financial services, the impacts would be game-changing.

“The G7 Cyber Expert Group [an intergovernmental group chaired by the U.S. Department of the Treasury and the Bank of England] just put out a report basically highlighting the good, the bad, and the ugly of quantum computing,” Vartanian says. “The good is it will exponentially change the speed at which financial institutions can do market trading. It will facilitate enormously alternative investment strategies. It will enhance risk management. It will provide them greater capabilities to make more reliable predictions. It will change payment processing throughout the world in very dynamic ways, and it will make communications different and more secure.”

These promised benefits have led to an arms race of sorts, as organizations, and even nations, try to be the first to break the quantum barrier. “Supercomputing facilities and government entities worldwide are experimenting with the installation of a variety of quantum processing units to better identify the specifications needed for the integration of a gate-based quantum computer with a classical supercomputer. IBM specifically expects to deliver a quantum-centric supercomputer capable of executing 1 billion gates across 2,000 qubits by 2033,” IDC’s West says in her research report IBM Quantum: 156-Qubit Heron, Qiskit Functions, and the Future of Quantum Development.

Also driving the race is the potential to control the bad and the ugly of quantum. Data encryption relies on mathematical problems so complex that a standard computer would need years to solve them and get past the encrypted “firewall.” Being able to solve them quickly could be a technological tipping point. “That depends on who gets quantum computing first and how they use it,” Vartanian explains. “Everything will either be more secure, or less secure.” Toward that eventuality, Vartanian
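As a back-of-the-envelope illustration of the encryption point above: public-key schemes such as RSA rest on the fact that multiplying two primes is instant while recovering them by classical search grows exponentially harder with the key’s digit count, exactly the asymmetry a large quantum computer running Shor’s algorithm would erase. The primes below are tiny stand-ins for real 2048-bit moduli.

```python
# Multiplying p and q is instant; undoing it by trial division is the slow
# direction that encryption relies on. Doubling the digit count of the
# primes squares the work of this search, while Shor's algorithm on a
# sufficiently large quantum computer would factor in polynomial time.
import time

def factor(n):
    """Recover p, q from n = p * q by classical trial division."""
    if n % 2 == 0:
        return 2, n // 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 2
    raise ValueError("n is prime")

p, q = 999_983, 1_000_003   # toy primes; real RSA uses ~300-digit primes
n = p * q                   # the "public" value: computed instantly

start = time.perf_counter()
print(factor(n), f"found in {time.perf_counter() - start:.2f}s")
```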

The power of unified data storage with generative AI

The promise of generative AI (genAI) is undeniable, but the volume and complexity of the data involved pose significant challenges. Unlike traditional AI models that rely on predefined rules and datasets, genAI algorithms, such as generative adversarial networks (GANs) and transformers, can learn and generate new data from scratch. Training these models requires high-quality, diverse data to produce accurate, coherent, and contextually relevant output. The more comprehensive the training data, the better the model will perform in producing realistic and useful responses. Organizations can find it overwhelming to manage this vast amount of data while also providing accessibility, security, and performance.

For AI innovation to flourish, an intelligent data infrastructure is essential. This infrastructure must support data preparation, model training and tuning, retrieval-augmented generation (RAG), and inferencing. Additionally, it should meet the requirements for responsible AI, including model and data versioning, data governance, and privacy.

The data dilemma: Breaking down data silos with intelligent data infrastructure

In most organizations, storage silos and data fragmentation are common problems, caused by application requirements, mergers and acquisitions, data ownership issues, rapid tech adoption, and organizational structure. This fragmentation includes:

Different media types: high-performance flash, high-capacity flash, hybrid flash, hard disks
Multiple protocols: block, file, object
Various deployment models: storage appliances, software-defined storage, storage as a service, public cloud storage

Data fragmentation makes it difficult for data scientists and AI engineers to access necessary datasets. This is the primary reason why AI initiatives fail, according to IDC’s new survey, Scaling AI Initiatives Responsibly, commissioned by NetApp.

Unified data storage resembles a well-organized library. In a modern library, every book, magazine, DVD, and digital media item is stored in one place and accessible from any section without hassle. Everything is categorized and readily available through a single system, regardless of whether you’re searching for a classic novel, a research journal, a documentary film, an ebook, or an encyclopedia (do they even produce those anymore?). In the same way, intelligent data infrastructure brings together diverse data types under one cohesive umbrella. By combining access to file, block, and object-based storage from a single storage OS across corporate data centers, colocation facilities, and public clouds, unified data storage streamlines data access, enhances data management, and provides consistent data governance, delivering silo-free infrastructure.

In genAI, this capability means providing structured, semi-structured, and unstructured data seamlessly to your data scientists. Whether you’re using RAG or fine-tuning a large language model (LLM), you can work with a rich and diverse dataset, regardless of location, to capture nuanced language patterns, cultural references, and proprietary knowledge, making your AI more effective at producing accurate and domain-specific answers. With intelligent data infrastructure from NetApp, you can feel confident in data preparation, data security, and data mobility. You can select cloud-based AI services for compute-intensive training, a colocation facility to help with internal power constraints, or data center infrastructure to secure sensitive information.
Our unified data storage solutions are designed to scale dynamically, making it easier to expand your storage performance and capacity as your genAI initiatives grow. This is the same NetApp® technology leveraged by the top three public cloud providers and available to you as a first-party, cloud-native storage service.

Empowering innovation

As genAI continues to reshape industries and drive innovation, the importance of unified data storage cannot be overstated. NetApp’s comprehensive suite of unified storage solutions provides the scalability, performance, and security needed to unlock the full potential of genAI. By streamlining data management workflows and maintaining the availability of critical resources, NetApp empowers organizations to accelerate their genAI initiatives and stay ahead in an increasingly competitive landscape. Intelligent data infrastructure is more than just a storage solution; it plays a strategic role in genAI innovation. With our industry-leading expertise and cutting-edge technologies, organizations can harness the power of genAI with confidence, driving transformative outcomes and unlocking new opportunities for growth. We make data infrastructure intelligent: any data, any workload, any environment.

Explore more

To explore further, visit the NetApp AI solutions page. Read more about NetApp AI thought leadership perspectives. If you missed our webinar discussing the survey results from IDC’s AI maturity model white paper, you can watch it on demand.

How India is set to redefine AI maturity and data leadership in 2025

It’s easy to forget that the new AI revolution heralded by ChatGPT and OpenAI kicked off just two years ago and has been quickly embraced by both businesses and consumers. But unknown to many is India’s meteoric rise to become a global leader in AI adoption, and it is one to watch: what happens in the Indian market in 2025 will set the scene for the rest to follow. Atlassian’s AI Collaboration Index found that nearly half (46%) of Indian knowledge workers are advanced AI users, significantly more than in other nations. In the US (34%) and Germany (32%), only about one-third of workers have advanced knowledge of AI usage, and Australia (23%) is even further behind.

India leading AI adoption thanks to vast data reserves

The Indian market has several qualities that have helped advance AI, as well as its adoption and use. Firstly, India is home to the world’s largest pool of mobile data and is the second-fastest-growing data market globally. In 2023, India was ranked 15th in the list of top 25 AI nations but was considered to have the greatest potential thanks to its data pools. With initiatives like Digital India further encouraging digital inclusion across its massive population, the country generates an unparalleled wealth of data, providing fertile ground for AI applications.

India also has the skills and infrastructure needed to succeed with AI. Like everywhere else, there is still a skills shortage in India, but with some of the world’s best engineering and IT institutes, the country has a better capacity to build the skills base it needs to design and implement cutting-edge AI solutions. Finally, India’s thriving start-up ecosystem, coupled with government initiatives such as Startup India, is increasingly focused on AI innovation across sectors like healthcare, agriculture, education, and fintech, and that investment infrastructure will directly result in further acceleration of both AI creation and adoption. Thanks to these drivers, India is poised to be a leader in AI in 2025, but fully capitalising on that opportunity relies on data governance, ethical considerations, and operational challenges being addressed.

The role of governance in data and AI maturity

AI needs the structures and guardrails in place to ensure the technology retains the confidence of both business and individual users on a long-term, sustainable growth trajectory. When asked about the governance priorities India should focus on in 2025, Dilip George, Quest’s Managing Director in India, points to recent findings from Quest’s The State of Data Intelligence report. “The first is responsible AI development. With AI playing a central role in decision-making across industries, ensuring transparency, fairness, and accountability is essential to build trust and mitigate risks,” explains George. “Data privacy and security follow closely behind. With the increasing flow of sensitive data, frameworks like India’s proposed Personal Data Protection Bill and the upcoming National Data Governance Framework are essential to ensuring compliance and safeguarding user data.”

Given the increasingly sophisticated threat landscape, it’s no surprise cybersecurity makes the list. The Indian Computer Emergency Response Team (CERT-In) guidelines and the Cybersecurity Policy aim to bolster resilience against cyber threats, creating a safer environment for AI-driven applications.
Looking beyond governance, George shares the five strategic priorities business leaders should keep in mind to capitalise on the AI opportunity:

Risk management: Organisations should prioritise building governance frameworks to align AI initiatives with legal, ethical, and operational standards, ensuring risk is managed proactively.
AI-driven ROI: Businesses must grow their focus on demonstrating tangible returns from AI investments, integrating advanced analytics to measure performance, optimise operations, and drive decision-making.
AI and sustainability: Sustainability goals should be strongly considered, with AI being used to monitor environmental impact, reduce waste, and optimise resource usage.
Operationalising AI at scale: Scaling AI beyond pilot projects will be key. This requires investing in infrastructure, breaking down data silos, and fostering cross-functional collaboration. Gartner predicts that one in three AI projects will be abandoned after the proof-of-concept stage, and organisations should focus on scalability early on to avoid this risk.
Real-time decision-making: Leveraging AI to enable instant, data-driven decisions will become a critical differentiator, especially in sectors like finance, healthcare, and supply chain.

Governance at the state level

Looking more broadly from a national lens, India’s push towards digital transformation further highlights the growing importance and focus being placed on data governance. Key drivers include:

Regulatory frameworks: The development of the Personal Data Protection Bill, data localisation norms, and sector-specific guidelines ensures data use aligns with national priorities and international standards.
Economic drivers: Data is now seen as a critical economic asset. With the rise of digital payments platforms like UPI and innovations in fintech, businesses process vast amounts of personal and financial data, requiring stringent governance mechanisms.
Digital inclusion initiatives: The Digital India programme aims to empower citizens and businesses, creating a framework for managing and leveraging data responsibly across sectors such as healthcare, education, and agriculture.

Data readiness is key to de-risking AI adoption

In a nutshell, effectively embracing AI requires companies to prioritise data readiness by investing in systems and processes to clean, manage, and ensure data accessibility for AI applications. “Establishing strong governance frameworks is equally essential as it fosters compliance, accountability, and trustworthiness in AI implementations,” adds George. “Organisations should also democratise data access, allowing broader access across teams to encourage innovation and facilitate faster decision-making, ultimately enabling the scalable application of AI.” Additionally, businesses must focus on agility and adaptability, remaining prepared to swiftly embrace new tools, techniques, and trends as AI technology continues to evolve.

India has every opportunity to turn 2025 into a milestone year for enterprise-wide AI implementation. “The rigours being applied to AI are going to be more substantial than in previous years, but those businesses that can harness this potential within the regulatory framework will not just keep pace but set the standards for AI success worldwide,” concludes George. For the latest insights on current data intelligence initiatives and planned investments by some of the largest organisations in the world, visit The

New US CIO appointments, November 2024

Nick Harness joins Fringe Benefit Group as CIO

New CIO appointments, October 2024

Daikin Applied welcomes Ashish Srivastava as CDIO
Navistar announces Robert Oh as CDIO
Jacqui Nevils joins Fresenius Medical Care as CIO
MDU Resources promotes Dyke Boese to CIO
Gagan Sinha named CIO of Bloomin’ Brands, Inc.
K. Hovnanian Companies announces Sri Arumugam as CIO
Matt Belanger joins Republic Airways as CIO
NW Natural Holdings announces Brian Fellon as CIO
Gemological Institute of America names Neil Jadhav CITO
Paul Roche named CIO for Transdev US
University of Oklahoma welcomes Nishanth Rodrigues as CIO
DocGo names Eiwe Lingefors CIO
Brigham Young University appoints Brian Radford CIO

New CIO appointments, September 2024

Truist names Steve Hagerman CIO
Mike Guhl appointed CIO at PulteGroup Inc.
Norfolk Southern welcomes Anil Bhatt as CIDO
Nelly Jefferson named CIO of Avangrid
Marjorie Hutchings joins Berkshire Hathaway Homestate Companies Workers Compensation Division as CIO
Itamar Albek joins American Jewish Committee as CIO
HEICO welcomes William Velez as CIO
First Watch names Rob Conti as CIO
Winston & Strawn appoints Robert Kerr as CIO
Jovan Marconi joins Inland as CIO
CoolSys appoints Danny Rodriguez as CIO
Justin McWhirter named CIO for CCC Intelligent Solutions
TAB Bank appoints Tami Fisher as CIO
Queens University of Charlotte names Kenitra Horsley CIO

New CIO appointments, August 2024

Nike appoints Cheryan Jacob CIO
Mondelēz International names Filippo Catalano CIDO
Keith Credendino named CIO of Macy’s, Inc.
Kyndryl appoints Kim Basile as CIO
Neeru Arora named CIO for Mazda North American Operations
Dan Shull joins Hasbro as CDIO
REI Co-op promotes Guillaume Ledieu to CTO
Marco Deutsch named CIO at Baker McKenzie
Adolfo Rodriguez appointed CTIO at Guitar Center
Yeman Collier to join UChicago Medicine as CIO
Subaru of America, Inc. appoints Aurelian Sin as CIO
Alliant Credit Union welcomes Jamie Warder as CIO
Howard Hughes Holdings Inc. appoints Bhupesh Arora as CTO
Jamie Head joins Parts Town as CIO
Lloyd Boyd named CIO for Park Lawn Corporation
Ryan McEnroe named CIO at Reed Smith
Unison Risk Advisors appoints Tony Martinez as CIO
Martina Schubert joins Van Meter Inc. as CIO
Atlas Van Lines promotes Ryan Parmenter to CIO

New CIO appointments, July 2024

BNY names Leigh-Ann Russell as CIO
Charles (Rusty) Patel joins Baxter as CIO
Clayco names Jeff Miller as CIO
Kevin Ruggiero named CIO of Parsons Corporation
Urmila Menon joins Tri Pointe Homes as CIO
Signature Aviation appoints Al Lettera as CDIO
Boart Longyear names Niel Nickolaisen as CIO
Jordan Ruch named CIO at AtlantiCare
Holley Performance Brands names Charan Mann CIO
Heath Tuttle named CIO at the University of Buffalo
MEC appoints Daniel Bourquin as CIO
George Haddad joins North American Partners in Anesthesia as CIO

New CIO appointments, June 2024

Madhu Narasimhan named CIO of DaVita
Cleveland Clinic announces Sarah Hatchett as CIO
Quest Diagnostics appoints Murali Balakumar as CIDO
Guncha Mehta joins Beyond as CDIO
Gulfstream Aerospace appoints Anthony Newlin as CIO
David Ugan named CTO for AAA, Inc.
Mieko Shibata named new R&T Deposit Solutions CIO
Keith Jones promoted to CIO at Cone Health
Lendmark Financial Services announces Zaheer Khan as CIO
Keith Correia named CIO at Portillo’s
Freshworks appoints Ashwin Ballal as CIO
Bill Shuler joins Planet Home Lending as CIO
