CIO

Global Tech Tales: What Buyers Want

Global Tech Tales: What Buyers Want | Episode 6: Analytics challenges in the age of AI

In this episode of Global Tech Tales, host Keith Shaw is joined by global editorial leaders Matt Egan (U.K.), Chris Holmes (APAC), and Qiraat Attar (India) to explore a pressing question for modern enterprises: Is your data ready for AI? From analytics transformation to the readiness of IT infrastructures, our panel dives deep into:

* Why clean, high-quality data is critical for AI success
* How different regions are handling AI and analytics integration
* The impact of generative AI on enterprise data strategies
* Real-world examples from healthcare, manufacturing, and even wildlife parks!
* The growing importance of ROI, trust, and explainability in AI initiatives

Featuring insights from IDC, Gartner, and real IT buyers, this global conversation breaks down the future of data, analytics, and AI leadership. Don’t miss our rapid-fire “Yes/No” round on whether it’s too late to hop on the AI train!

Global Tech Tales: What Buyers Want Read More »

Has CISO become the least desirable role in business?

For those reasons, Basu is helping to establish a professional association for CISOs modeled on entities like the Bar Association or the Society of CPAs. It is called The Professional Association of CISOs (PAC), and the goal is to formalize the profession through standardized accreditation, advocate for legal protections, and foster a strong, supportive peer network. “It’s time this critical leadership role is afforded the structure and recognition it rightfully deserves,” Basu says. The CISO role is not becoming undesirable because it lacks relevance, he says. “On the contrary, it is vital to the future of enterprise trust and resilience. It is undesirable only when we fail to match responsibility with protection. If we want to attract and retain top talent in these roles, we must build the guardrails that allow CISOs to operate with authority, integrity, and confidence.”

The silver lining

These issues aren’t insurmountable, says WatchGuard’s Nachreiner. “Realizing that the CISO role is more human-centric and political than technical is key. It’s not just about wizardry with network defenses; it’s convincing the board to greenlight projects that don’t immediately boost profits, rallying department heads to embrace security measures, and nudging employees to tweak their everyday habits,” he says. If you thrive in an environment where your curiosity is never satisfied, you’re always thinking a step ahead, and every day is different, the CISO role remains ideal, says Abnormal AI’s Titus. “Every day you’re learning new things about the field and every day there’s constant innovation happening, making my job better and faster,” she says. “At the end of the day, while the CISO role is demanding, with the right mindset and approach, it remains a critical and rewarding position filled with potential to drive meaningful change,” agrees Nachreiner.

Has CISO become the least desirable role in business? Read More »

Modernization without disruption: How hybrid cloud empowers evolution

When IT leaders raise the idea of modernization, there’s often an instinctual recoil – it’s a project that sounds time-consuming, expensive, and risky. One that feels easier to put off for another day. But that hesitation is often rooted in a common misconception: modernization is a one-time project that requires a costly, full-scale overhaul. For enterprises that rely on mainframe data or don’t have time to rewrite applications from scratch, especially those in regulated or high-risk industries, the idea of starting over isn’t just expensive. It’s also disruptive and ultimately impractical. So, is there a way to modernize more strategically without necessarily ripping out and replacing the IT solutions you’ve already invested in? Let’s talk about it.

Motivation to modernize

Most businesses pursue modernization to achieve three main outcomes: improved IT reliability and resilience, enhanced capabilities that offer a better stakeholder experience, and better access to insights through more streamlined data practices. However, acting on these goals tends to be hindered by a web of intermingled challenges, including a lack of talent, overly complex systems, and prior failed attempts to integrate newer solutions. For executives frustrated with their existing IT solutions and any failed attempts to update them, the rip-and-replace method sounds like a simple fix. But while replacement is a good choice for some, it is far from the only option. A fresh start may feel like the quickest way to overcome obstacles, step into value, and realize those outcomes. In practice, however, especially for those in regulated industries, that kind of overhaul often creates more problems than it solves.

The problem with rip-and-replace

One of the first and most obvious concerns with a rip-and-replace strategy is cost. Total replacement involves multiple layers of expense – some predictable, others less so. There’s the risk of downtime, which can be unacceptable in industries where availability and reliability are paramount, like finance and other meticulously regulated sectors. Then come the infrastructure costs, including new hardware and software or cloud service costs that are often hard to estimate in advance. But one of the most underestimated costs is retraining. Your teams need time to learn the new environment, but so does the system itself. Years of work configuring permissions, hardening security protocols, defining integrations and workflows, and fine-tuning operational parameters must all be reconfigured from the ground up.

Strategic modernization

Modernization is a transformative journey from your organization’s current state toward a future that sets your business up for success. There is no set course to follow, as it looks different for every organization based on its individual needs. It is about enhancing efficiency, improving customer experience, and driving innovation that sets you apart. Whether you’re making strategic upgrades to your on-prem system, replacing systems entirely, or adopting a hybrid-cloud environment, modernization is not just an upgrade of technology. It’s a strategic evolution of your entire operation. This is why it’s so important to have an experienced partner who not only understands its nuances, but also has the solutions, expertise, and humanity to meet you exactly where you are on your journey. Put simply, there’s no reason to start over if you don’t want to.
In fact, modernization is more powerful when used as a tool to drive evolution instead of as a crutch that forces a complete stop and restart. Using a more strategic plan that allows for an incremental transition gives you time to adjust, train staff, find bugs, and master new components without risk to day-to-day operations. That could be a modernization-in-place approach, maintaining transactions on the mainframe while leveraging data elsewhere, or replatforming the apps to refactor them more incrementally later. Regardless of the path you choose, this more intentional approach allows you to preserve processes, prevent skill gaps, reduce risk, and keep momentum – all while stepping confidently toward the future.

Putting it together

The path to modernization doesn’t have to be dramatic or sudden. It can be intentional, sustainable, and designed around your needs. The key is to start from where you are and build forward with confidence. This approach offers:

* Faster time to value than starting a new build from scratch
* Cost efficiency by leveraging your existing infrastructure as a framework for future-proofing
* Reduced risks to business continuity through incremental change and avoided potential downtime
* Greater control over your IT evolution without overwhelming your teams or disrupting existing operations

Enterprise modernization doesn’t have to be disruptive or expensive. You can modernize on your own terms using the systems and permissions you already have in place, while helping your team and customers stay confident and aligned through every step of the journey. Learn more about how Rocket Software can meet your organization where it is on its modernization journey and guide you through custom, scalable, and strategic decisions.

Modernization without disruption: How hybrid cloud empowers evolution Read More »

The road to an AI revolution

Just as technology designed for high-performance race cars is put to work in the vehicles we drive every day, Dell Technologies is bringing the perspectives it gains designing data centers for the world’s most advanced AI companies to customers of all industries and sizes.

Dell Technologies Chairman and CEO Michael Dell brought a packed house to Dell Technologies World in Las Vegas in May, embarking on a virtual road trip down “Dell Technologies Way” to connect the efforts of the world’s largest companies and its biggest thinkers through AI. “We just love customers who push our engineering and innovation to the edge,” Michael said, highlighting a large project that went “from a blank piece of paper to operational in weeks.” That project is a direct liquid-cooled data center design that includes 110,000 GPUs, 2,800 racks, and 27,000 miles of network cable, not to mention miles of water and cooling lines. Systems are being planned, Michael said, that will scale to 1 million GPUs.

Like high-performance cars, massive “intelligence factories” for customers like xAI, CoreWeave, and ServiceNow are at the very leading edge. “These companies are in the business of pure intelligence,” Michael said. This isn’t the case for most customers. They’re solving problems in fields as different as disease prevention and manufacturing. They’re focused on making the customer and employee experience better, faster, and more secure. And they’d like to grow while they’re at it. “AI isn’t your product, but AI can power your purpose,” Michael said. “You don’t need your own (xAI) Colossus, but you do need AI. And we’re taking all the learnings from these massive systems to make AI easier for you.”

Dell is certainly making AI easy for financial services giant JPMorganChase, whose Head of Global Technology Strategy Innovation and Partnerships, Larry Feinsmith, joined Michael on the main stage to detail the company’s AI journey and its impact on customers and employees. Dell makes all the components of modern architecture available in one place. The company is creating a future where intelligence amplifies human potential on a massive scale. Dell Technologies is pioneering the edge AI revolution, bringing real-time intelligence to wherever data lives. From AI PCs to small, domain-specific models running on the edge, to planetary-scale AI data centers, Dell has customers covered.

It all starts with the PC. The cornerstone of personal productivity is being reinvented by AI, and as the installed base of about 1.5 billion PCs ages, it is being replaced with AI innovation. Michael said Dell is distancing itself from the competition through a simplified portfolio and the ability to choose the latest technology from NVIDIA, Intel, AMD, and Qualcomm.

Powerful PCs push AI to the edge, where data is processed instantly, securely, and on-site, Michael said. This requires modern, disaggregated infrastructure that combines the flexibility of the three-tiered model with the simplicity of hyper-converged. These open, automated pools of compute, networking, and storage run any workload anywhere, Michael said. “In many ways, modern architecture is not a destination,” Michael said. “It’s a street of continuous innovation, and we’ve been walking this path for a long time.”

Longtime customers like home improvement retailer Lowe’s know they can count on Dell as a trusted advisor, vital partner, and strategic collaborator. As customers rethink their business for AI, Dell is helping them with more than technology. It’s helping them reimagine how they create and capture value, Michael said.

“We are entering the age of ubiquitous intelligence,” Michael said, “but it’s not here to replace humans. AI is a collaborator that frees your teams to do what they do best: to innovate, to imagine, and to solve the world’s toughest problems. And Dell infrastructure is the backbone, enabling enterprises to think faster, act smarter, and dream bigger.”

Now, the company’s job is to make AI more accessible to the broadest range of customers possible, Michael said. More than 2,500 customers are running Dell AI Factories already, and Michael estimates the number of Dell AI Factory customers can grow to millions in the years ahead. “We are on the cusp of an intelligence explosion,” he said. About 85% of enterprises plan to move GenAI workloads on-premises in the next 24 months, he said. Soon, global investment in AI will exceed $1 trillion, and AI will add an estimated $15 trillion to global GDP by 2030.

In the spirit of continuous innovation, Michael detailed the Dell AI Factory with NVIDIA 2.0, as well as Dell Managed Services for the Dell AI Factory with NVIDIA. He also introduced a recorded conversation between himself and NVIDIA CEO Jensen Huang. “This is a once-in-a-lifetime opportunity,” Huang said, “the biggest reinvention we’ve seen. You want to be an early adopter. You don’t want to be second. This is the time you want to be first.”

Indeed, the world may be entering what Michael called “a golden hour of progress.” To make it as golden as it can be, though, requires AI to be developed with a strong commitment to the planet and future generations, he said. “For us to realize the possibilities of AI, we need to do it responsibly.” The infrastructure needed to run all this AI will burn a lot of energy, making renewable energy sources very important, Michael said. It’ll also mean Dell will focus on hyper-efficient data centers that take advantage of smart power management, optimized thermals, and liquid cooling. Dell software tracks and forecasts energy and emissions, and uses telemetry to automate power and thermal management. The company’s leading asset recovery programs make it easy for customers to retire older systems, an important consideration as 1.5 billion PCs are refreshed.

“The superpower of AI is not just for the enterprise,” Michael said. “AI is for all of us. AI is for human progress, and it’s powerful whether it’s in the hands of a Fortune 50 CIO, or a dad.” Michael introduced a video featuring Adrian Mullan, an Australian dad whose dinner table conversation led to the development of Norby, an AI robot powered by Dell workstations and

The road to an AI revolution Read More »

The tech team leading blockbuster transformation at Hoyts

On the mobile experience: It plays a very significant part of the user journey. More than 80% of our customers are engaging with us on a mobile device. So whenever we’re looking at streamlining the user journey, or how something is going to work, it’s all built from a mobile-first point of view, because that’s where and how our customers want to engage with us. We’ve seen instances where all of Australia wakes up and decides to go to the movies, and the first thing they do is they pick up their mobile phones and type in the website. That creates a tsunami of demand on our web environment, but they just expect it to work. Not only have we increased performance and reliability of our digital channels, but we’ve significantly reduced our Azure hosting costs as well because of the efficiencies we’ve achieved behind our new website and mobile app.

On digital retail: If I reflect on the last few years of the digital strategy, one of the key things we did was to say that trying to host our own digital gift stores is really not the way to go. So as part of the greater strategy where we’ve redeveloped our website, ticketing, and primary consumer-facing websites and apps, we decided to move our gift stores to the online platform Shopify, which runs ecommerce stores globally. What that gave us was the reliability and scalability to give users familiarity with their user journey to buy merchandise. In our case, it’s our gift cards. It also has a lot of built-in promotional functionality that we’re able to leverage. It ended up being a great decision that enabled us to sell on social channels and in a wide range of other ways that we didn’t have the ability to do before.

On change management: Company culture is hugely important. I don’t know whether it’s our industry or the culture I’ve helped build within the technology department, but we have very low turnover. I think all leaders will say your success comes from surrounding yourself with great people, and I feel like I’ve done that. I’ve got a head of development and architecture who had the forethought to think about where we needed that strategy to be, what we needed to achieve, and giving people the autonomy and support to do what they’re good at. This is what breeds great outcomes and a great culture for success. It’s all built around trust. Of course, it also comes with challenging those people to go beyond whatever it is you set out to do. If you stretch those goals, then ultimately you achieve more than what you want to achieve. I think that’s what people also enjoy. They want to know not only that they have clear objectives of what to achieve, but how their achievements are going to impact the bigger picture.

The tech team leading blockbuster transformation at Hoyts Read More »

How Capital One drives returns on its AI investments

That speed has caught many IT executives off guard as techniques that have always worked for them stop working, Andersen adds. “With this absolute velocity, you are seeing the old norms of trying to figure out how much to invest, those are no longer useful tools,” he says. “If you use traditional methods, you just don’t get it.” Although Andersen agrees that inference pricing has gone down significantly, “the reality is that we are asking for more sophisticated tasks, queries that are perhaps 1,000 times more complicated” today as compared to two years ago, he says.

Capitalizing on cloud and data

When Natarajan joined Capital One in March 2023, ChatGPT was barely four months old. Despite having been used for about 15 years at that point, generative AI didn’t take off in terms of C-suite and board mindshare until OpenAI introduced ChatGPT.

How Capital One drives returns on its AI investments Read More »

AI open-source projects that should be on your radar

Artificial intelligence is reshaping everything from application development pipelines to enterprise customer engagement strategies. But beyond the excitement surrounding large language models (LLMs) and generative AI, a foundational transformation is also underway, one grounded in infrastructure, developer ecosystems, and operational control. At the center of this transformation is open source. Open-source innovation is at the heart of today’s most transformative AI breakthroughs. This isn’t just about free tools for developers. From performance tuning and model optimization to workload portability across heterogeneous environments, open-source projects are addressing complex enterprise requirements. Equally important, they’re doing so guided by principles that emphasize transparency, modularity, and vendor neutrality. These principles form the backbone of enterprise-ready AI: efficient, resilient, and free from vendor lock-in. In a world where technologies now evolve quarterly, not yearly, architectural flexibility isn’t a luxury; it’s a necessity. And as the open-source AI ecosystem accelerates at breakneck speed, some of the most transformative innovations are being shaped by the power of community. So, what’s driving this momentum? Let’s take a closer look.

Why open source is central to AI’s future

Thanks to open source, the pace of AI innovation now surpasses anything closed models could have achieved. One big reason is community-led development. When innovations come from diverse contributors united by shared goals, the pace of progress increases dramatically. Open source allows researchers, practitioners, and enterprises to collaborate in real time, iterate quickly, share findings, and refine models and tools without the friction of proprietary boundaries.

We’ve already seen this play out in real-world examples. Meta’s LLaMA models have put additional pressure on proprietary alternatives, giving developers access to high-performing models with fewer restrictions. These advances are not anomalies. Rather, they’re the product of open collaboration. In addition, over many years there have been steady upstream contributions to a variety of open-source projects from the major enterprise AI accelerator vendors, including NVIDIA, AMD, and Intel.

Open source also delivers something enterprises increasingly demand: flexibility. As AI becomes a cornerstone of strategic decision-making, businesses don’t want to be tied to a single vendor’s roadmap or pricing model. With open source, they can adopt modular, interoperable solutions that evolve as their needs change. They gain autonomy, deploying models where their data lives, choosing hardware that fits their workloads, and avoiding buyer’s remorse from prematurely locked-in investments.

This autonomy, paired with the collective strength of vibrant communities, is what makes open source such a powerful force in today’s AI ecosystem. As a result, we’re witnessing an unmatched pace of innovation, with community-driven projects pushing the boundaries of AI, often rivaling proprietary offerings from the biggest vendors.

Open-source AI projects to track

It’s worth remembering that open source doesn’t mean zero oversight. Responsible adoption still requires thoughtful planning and clear guardrails. Security and governance remain top concerns. Many open-source models—available via Hugging Face, for example—don’t fully disclose their training data.
That’s not inherently problematic, but it does mean enterprises must enforce strict controls around model outputs. The model itself is just a file; it’s the output that impacts business operations.

Support is another key consideration. When you run into issues, who do you call? With mature projects like PyTorch, vendor support is widely available. But for newer or more niche projects, you’ll want to evaluate community engagement, contributor activity, and whether commercial support options exist.

When assessing a project, look at the ecosystem. Who’s contributing? Is there a clear governance model? Are vendors building around it? These questions help determine whether a project is viable in a production environment.

Here are some of the most promising open-source AI projects and communities shaping the enterprise landscape:

1. Hugging Face

Hugging Face has become the GitHub of open-source models. It’s a central hub for discovery, collaboration, and benchmarking. Organizations can explore model rankings, understand popularity trends, and quickly bring new models into their workflows. What GitHub did for application communities, Hugging Face has done for AI, offering a single, centralized space where developers can coalesce around open-source models, evaluate performance, and collaborate across teams and organizations. The strength of the platform lies not just in the volume of models available, but in the quality of metadata and tooling that surrounds them. From performance benchmarks to user reviews, Hugging Face provides visibility into what’s working, what’s trending, and where innovation is happening.

2. vLLM

vLLM, which originated at UC Berkeley’s Sky Computing Lab, is one of the most widely used inference engines in open source. It supports multi-accelerator deployments—across NVIDIA, AMD, and others—providing a level of portability that’s invaluable for hybrid and multi-cloud environments. It’s become foundational infrastructure in AI inference because of that very flexibility. Whether you’re deploying on an NVIDIA A100 today or moving to an AMD MI300X tomorrow, vLLM’s multi-hardware support ensures seamless portability. That kind of modularity is key as organizations scale and diversify their AI workloads. Its popularity also reflects a growing expectation: inference engines should not lock you into a single hardware vendor. vLLM delivers on that promise and is a core component in VMware’s Private AI Foundation stack, where flexibility and performance go hand in hand. (A minimal usage sketch appears after this list.)

3. NVIDIA Dynamo

Dynamo is an AI inferencing framework that supports reasoning models that draw from multiple expert models to handle complex requests. It streamlines parallel processing at scale while maintaining modularity—a smart architectural decision from NVIDIA that reflects what enterprise customers need. The rise of reasoning models—where an AI system consults dozens or even hundreds of smaller expert models—introduces a new layer of infrastructure complexity. Dynamo addresses this challenge head-on by enabling the distribution, scaling, and orchestration of these models across high-demand environments. Importantly, NVIDIA chose not to build Dynamo as a tightly controlled framework, maintaining modularity.

4. Ray

Originally developed at UC Berkeley, Ray enables distributed training and inference across clusters. It’s already a backbone technology for OpenAI and other hyperscalers, which speaks volumes about its scalability. Ray was specifically designed to support parallel processing at scale—making it ideal for training and inference across multi-node environments where performance and speed are paramount.

5. SkyPilot

SkyPilot simplifies hybrid AI
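To ground the vLLM item above, here is a minimal offline-inference sketch in Python. It is illustrative only, assuming vLLM is installed (pip install vllm) and a supported accelerator is available; facebook/opt-125m is just a small example model, not a recommendation.

```python
# Minimal vLLM offline-inference sketch (illustrative; the model choice is a placeholder).
from vllm import LLM, SamplingParams

# vLLM detects the installed backend (NVIDIA, AMD, and others), which is what makes
# the same script portable across accelerator vendors.
llm = LLM(model="facebook/opt-125m")

params = SamplingParams(temperature=0.7, top_p=0.95, max_tokens=128)

prompts = ["Explain in two sentences why data quality matters for enterprise AI."]

# generate() batches the prompts and returns one RequestOutput per prompt.
for output in llm.generate(prompts, params):
    print(output.outputs[0].text)
```

For Ray, the core pattern described above, fanning work out across a cluster as remote tasks, looks roughly like the sketch below; it assumes pip install ray and falls back to a single-machine cluster if none is configured.

```python
# Minimal Ray sketch: distribute a batch-processing task across available workers.
import ray

ray.init()  # connects to an existing cluster if one is configured, otherwise starts a local one

@ray.remote
def preprocess(batch):
    # Stand-in for real feature extraction, scoring, or data-prep work.
    return [len(record) for record in batch]

batches = [["alpha", "beta"], ["gamma"], ["delta", "epsilon", "zeta"]]

# Each .remote() call schedules a task; ray.get() gathers the results when they finish.
futures = [preprocess.remote(b) for b in batches]
print(ray.get(futures))  # [[5, 4], [5], [5, 7, 4]]
```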

AI open-source projects that should be on your radar Read More »

From fixed frameworks to strategic enablers: Architecting AI transformation

Traditional architectural approaches have become unsustainable for technology leaders navigating today’s AI-driven landscape. Architecture is no longer a checkpoint at the end of development but must be woven throughout the entire AI transformation lifecycle. As organizations demand more tangible evidence of AI value and competitive advantage, enterprises must fundamentally transform how they approach architecture, shifting from rigid frameworks to strategic enablement.

Key takeaways: Architects as strategic business enablers

* Shift from rigid control to distributed enablement: Move from centralized architectural governance to distributed frameworks that empower innovation while maintaining necessary guardrails.
* Embrace the product mindset: Transform architectural thinking from project-centric deliverables to product-oriented capabilities that continuously deliver business value.
* Develop new skills and competencies: Invest in architectural talent that combines technical expertise with strategic business acumen to lead AI transformation.
* Implement outcome-based metrics: Measure architectural success through business outcomes rather than technical compliance.
* Create self-sustainable systems: Design architectural frameworks that adapt and evolve without constant manual intervention, just as well-planned cities grow organically.

“As the tech function shifts from leading digital transformation to leading AI transformation, forward-thinking leaders are using this as an opportunity to redefine the future of IT.” — Deloitte Tech Trends 2025

Breaking free from the order-taking trap

Many IT organizations have devolved into sophisticated order-taking operations, where architecture teams simply implement strategies handed down from business units without meaningful input into their formation. This execution-only mindset has created several critical dysfunctions.

The feature factory syndrome

When IT operates purely as a feature delivery engine, architecture becomes reactive rather than proactive. Teams rush to implement disconnected capabilities without considering the broader ecosystem impact. This creates a devastating cycle: business requests lead to feature development, which accumulates technical debt, increases integration complexity, creates maintenance burden, reduces innovation capacity and ultimately generates more feature requests.

From fixed frameworks to strategic enablers: Architecting AI transformation Read More »

A C-suite agenda for scaling AI value

As enterprise leaders confront operational volatility and economic pressures in 2025, AI has reached a strategic crossroads. While generative AI is being discussed in executive conversations, its value remains unclear or unrealised for many organizations. For example, research by McKinsey suggests that only 1% of enterprise leaders believe that AI integration across multiple core processes in their organizations has been achieved. This highlights significant strategic gaps in the implementation of AI, where initiatives are often driven by hype cycles or isolated vendor offerings, rather than a unified enterprise strategy. This tends to produce disjointed pilots, low ROI and cultural resistance to adoption.

This disconnect is based on a common misframing: AI is often used as a cost-cutting instrument instead of as a platform to develop strategic organizational capability. Viewing AI as a labor substitution or task automation tool artificially limits its potential. However, when focused on improving decision-making, organizational adaptability and innovation velocity, AI can generate compounding returns. For CIOs, CTOs and other C-suite members, this shift in thinking is essential. Competitive advantage will increasingly rely on embedding AI as a dynamic infrastructure, on par with ERP or cloud architecture, rather than just as a tactical tool.

From labor arbitrage to learning systems: The evolution of AI’s enterprise role

Historically, organizations have treated AI primarily as an automation or outsourcing tool, prioritizing immediate cost savings through predictive analytics, robotic process automation (RPA) and generative content tools. This mirrors earlier waves of IT adoption, where executives focused on quick efficiency wins rather than enduring capability-building. AI implementation research at Berkeley Haas (University of California, Berkeley) shows that only a small minority of firms realise significant financial benefits from AI despite extensive experimentation. This is supported by multiple substantial studies (Reutlingen University, Deloitte, LTIMindtree) that consistently identify common AI project failures due to governance deficiencies, organizational culture mismatches and strategic misalignment. This gap is not technological, but strategic. AI is fundamentally different from existing digital solutions:

A C-suite agenda for scaling AI value Read More »