
Why Sovereign Cloud is Essential for Today’s Businesses

So, why does everyone suddenly care about “where the data lives”? The truth is, we’ve always cared. Policymakers interested in leveraging data sovereignty requirements to advance economic and investment goals certainly continue to care. As a candidate for European Commission President in 2019, Ursula von der Leyen highlighted the economic importance of data sovereignty, and in his September 2024 report on European competitiveness, former European Central Bank President Mario Draghi noted that a “minimum level of technological sovereignty” is needed “to increase the long-term ‘bankability’ of new investments in Europe.”

In today’s cloud-driven world, data residency and compliance have become top priorities due to government regulations and increasing public concern over data privacy. What has changed are the critical design factors in the interconnected world of cloud computing: the physical location of the data, the metadata, and the governance surrounding both. Sovereign cloud solutions address these needs by ensuring data security and jurisdictional control while giving cloud service providers the tools they need to be more innovative and competitive for their customers.

The growing web of regulations worldwide, including Europe’s GDPR and Data Governance Act (DGA) and the U.S. CLOUD Act, highlights the need for organizations to manage data within specific jurisdictions to ensure compliance and security. Public awareness of data privacy issues further fuels this trend, compelling enterprises to seek cloud solutions that ensure compliance and security. All of this occurs against a backdrop of significant geopolitical factors, ranging from development and investment policies to national security concerns, which influence both government regulatory frameworks and enterprise cloud strategies and prompt public and private organizations to rethink their data residency approaches.

Sopra Steria (Nordics): “The demand for sovereign cloud solutions that can help ensure national autonomy is on the rise in the Nordic countries. As a Pinnacle Tier VCSP partner, we are well-positioned to meet the data residency and security requirements of our customers. Our Sopra Steria SolidCloud platform is built on VMware Cloud Foundation, which enables us to deliver a flexible and scalable sovereign cloud infrastructure. VMware Cloud Foundation enables us to innovate and deliver sovereign cloud services quickly by providing built-in compliance management, operations, and security capabilities.” — Roger Samdal, Agency Director Hybrid Cloud, Sopra Steria

With global data creation projected to surpass 180 zettabytes by 2025, and with as much as 92% of business data previously projected to be stored by U.S.-based hyperscale cloud providers, non-U.S. countries are reevaluating data residency requirements to strengthen jurisdictional control and protect their citizens’ data privacy. While U.S. public cloud vendors have worked hard to mitigate these risks for their customers by gaining regulatory certifications around the world, some of the key factors that drive data governance and security controls to protect data from unauthorized third-party access are beyond their ability to natively address. The U.S. government has a number of legal frameworks in place to address national security concerns that extend beyond its geographical borders. For instance, the U.S. CLOUD Act of 2018 allows U.S. courts to instruct U.S. companies to hand over data from systems they manage, not just on U.S. soil but, in theory, anywhere in the world.
As a result of this and other legislation enacted in multiple countries, the landscape is changing. Much of the global population is now protected by data privacy regulations covering both personal and business-related data, much of it in line with the EU’s GDPR framework. Furthermore, the definition of personal information is ever-changing, and people’s attitudes evolve, as we saw during the Covid-19 pandemic. One thing is certain: with our online presence ever increasing, this landscape will only grow more complex over time.

VMware’s approach to Sovereign Cloud typically involves defining three types of solution architecture, which can align with the requirements of different organizations and different regulatory regimes. These might be defined as:

Provider Sovereign Cloud: A VMware Cloud Provider that offers services delivering on customers’ requirements for data and application sovereignty. VMware partners providing such a service would be part of the VMware Sovereign Cloud Program for CSP Partners, outlined below.

Enterprise Sovereign Cloud: A commercial business entity with regulatory requirements (in some cases, industry-specific) for data/digital sovereignty. These include heavily regulated industries such as aerospace, nuclear energy, and military/defense suppliers. This solution is hosted and managed by the enterprise or in conjunction with a partner, with strict oversight from regulators.

Government Sovereign Cloud: A government entity that is required, for legal or governance reasons, to hold data and application processes on sovereign soil. This solution is hosted and managed by the government entity itself or in conjunction with a partner, with strict oversight from regulators.

Broadcom: Sovereign Cloud Focused

Broadcom’s Sovereign Cloud solution is designed for VMware Cloud Service Partners to deliver sovereign cloud services that comply with a specific jurisdiction’s digital sovereignty requirements, as defined in our 10-point self-attested Sovereign Cloud criteria (listed below).

Data residency relates to where the data is physically and geographically stored and processed. Thanks to their extreme scale, the large public cloud providers are usually able to provide this for primary data, but artifacts like metadata (data about the data) can often leak into other regions, typically the U.S. In some cases, data residency alone is not sufficient to ensure compliance with data privacy laws.

Data sovereignty relates to law: specifically, data being subject to the governance structure and, more importantly, the jurisdiction of the nation where the data is processed and stored. Even so, the data still needs to be accessible by an authorized entity, and this is a critically important aspect of sovereignty. A sovereign cloud solution needs to not only protect critical data but also unlock its value: data can be extracted in a meaningful way for both private and public sector organizations while providing transparency around architecture and operations.

STC (Middle East): “STC offers Sovereign Cloud services built on VMware Cloud Foundation that meet the specific needs of businesses and government entities in Saudi Arabia. Our services aim to foster trust, security, and compliance in cloud computing while supporting the broader digital transformation


Navigating data governance and classification in generative AI with NetApp

In today’s data-driven world, the proliferation of artificial intelligence (AI) technologies has ushered in a new era of possibilities and challenges. One of the foremost challenges that organizations face in employing AI, particularly generative AI (genAI), is ensuring robust data governance and classification practices.

In the realm of genAI, the quality and breadth of the data set directly affect the performance and creativity of AI models. Just as an artist draws inspiration from a wide array of experiences and observations, genAI relies on a rich tapestry of data to craft meaningful and innovative outputs. The right data fuels the learning process, enabling AI to understand the patterns, nuances, and complexities inherent in the task at hand. Without high-quality, organization-specific context, genAI may produce outputs that lack coherence, relevance, or diversity.

However, even data that is specific to an organization is seldom timeless; it is simply a snapshot in time that can become outdated, resulting in information that loses context. Your organization may initiate a new product launch, introduce new features and capabilities, combine products into a new solution set, or discontinue a product that is no longer relevant to the market. Incorporating these changes into your data repository is crucial to achieving a high degree of retrieval accuracy.

Another factor to consider is that these massive data sets may contain sensitive information, such as personally identifiable details, confidential medical histories, and financial records. Even seemingly innocuous data, like customer purchasing trends or upcoming product strategies, could prove detrimental to the organization if disclosed to competitors. It’s vital for companies to consolidate, categorize, evaluate, and share data while preventing unauthorized access and adhering to regulatory standards.

How can you make sure that you’re responsibly and securely maximizing the potential of your company’s data assets, including data that may have been dormant for years or even decades? At NetApp, we understand the need for a comprehensive approach to data governance and classification. Our tools help you to unlock the value of your business’s most precious asset: its data.

The role of data governance in generative AI

Data governance refers to the framework of policies, procedures, and controls implemented to ensure the quality, integrity, and security of data throughout its lifecycle. In the context of generative AI, robust data governance practices are essential for the following:

Protect sensitive information. By classifying data based on its sensitivity and implementing access controls, organizations can prevent unauthorized access to confidential data, mitigating the risk of breaches or misuse in genAI applications.

Ensure ethical use. Establishing clear guidelines and ethical standards for data usage helps organizations navigate the ethical complexities associated with genAI, such as generating synthetic data responsibly and avoiding biases or discriminatory outcomes.

Maintain regulatory compliance. Compliance with data protection regulations, such as GDPR and CCPA, is of utmost importance.

Data classification strategies for generative AI

Data classification involves categorizing data based on its sensitivity, value, and regulatory requirements. NetApp’s comprehensive set of features goes beyond basic data cataloging.
Leveraging AI, machine learning, and natural language processing technologies, we categorize and classify data by type, redundancy, and sensitive information, constantly highlighting potential compliance exposures. NetApp offers a range of data classification strategies tailored to the unique challenges posed by genAI:

Data estate visibility. Improve the cleanliness of your data and gain knowledge about sensitive information with complete visibility of your entire NetApp® data estate, both on-premises and in the public cloud. Your data scientists, AI engineers, IT administrators, and compliance teams can harness the power of all your data sets, optimize costs, and reduce risk.

Discover personal and sensitive data. Our classification capabilities can identify personally identifiable information (PII) such as credit card numbers, social security numbers, and bank account numbers, as well as sensitive personal data like health details, ethnic background, or sexual orientation. This ability facilitates compliance with regulatory requirements across jurisdictions, so you can feel confident that your most sensitive information is safe.

Data optimization. To reduce overhead and ensure that AI models receive the most current context, you need to eliminate duplicate, stale, and nonbusiness data that can distort results. The NetApp data intelligence platform helps you discover, map, and classify your data to prepare it for genAI and retrieval-augmented generation (RAG), so that your chatbot provides the most accurate answers.

Let NetApp be your strategic partner in generative AI

As organizations increasingly harness the power of genAI to drive innovation and competitive advantage, the importance of robust data governance and classification practices cannot be overstated. NetApp’s expertise in data management and storage solutions, coupled with our deep understanding of the challenges posed by genAI, positions us as a trusted partner for organizations seeking to navigate this rapidly evolving landscape responsibly. By implementing comprehensive data governance frameworks and employing advanced data classification strategies, organizations can unlock the full potential of genAI while safeguarding against risks and ensuring ethical and compliant use of data. In collaboration with NetApp, organizations can harness the transformative power of genAI while upholding the highest standards of data governance and classification.

To explore further, visit our NetApp AI Solutions page. If you missed our webinar discussing the survey results of IDC’s AI maturity model white paper, you can watch it here. Learn more about how NetApp BlueXP classification can help simplify data governance and provide actionable insights. Start your free test drive of BlueXP classification in a completely isolated environment.
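To make the classification idea concrete, here is a minimal, rule-based sketch in Python. It is only an illustration of the concept: the regex patterns and category labels are assumptions invented for this example, and products like NetApp BlueXP classification layer AI, ML, and NLP on top of far more robust detectors.

```python
import re

# Illustrative regex rules only; real classifiers combine many detectors
# with ML/NLP validation to reduce false positives.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def classify(text: str) -> dict[str, list[str]]:
    """Return every match for each sensitive-data category found in text."""
    hits = {}
    for label, pattern in PII_PATTERNS.items():
        found = pattern.findall(text)
        if found:
            hits[label] = found
    return hits

record = "Contact jane@example.com, SSN 123-45-6789, card 4111 1111 1111 1111."
print(classify(record))
# {'ssn': ['123-45-6789'],
#  'credit_card': ['4111 1111 1111 1111'],
#  'email': ['jane@example.com']}
```

In practice, the output of a scan like this feeds the access controls and compliance reporting described above: records flagged with a sensitive category are fenced off from genAI pipelines rather than indexed into them.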


Agentic AI: Why this emerging technology will revolutionise multiple sectors

New advancements in GenAI technology are set to create more transformative opportunities for tech-savvy enterprises and organisations. These developments come as data shows that while the GenAI boom is real and optimism is high, not every organisation is generating tangible value so far. According to NTT DATA’s Global GenAI Report, 97% of CEOs expect a material impact from the technology and 99% of respondents are planning more GenAI investment, even though only 43% of the C-suite strongly agree that their existing solutions are meeting expectations.[1] However, with the rise of agentic AI, this situation will soon change, with powerful use cases emerging that will not just drive profits but also change how organisations operate and how their people work.

Agentic AI shifts the dial

NTT DATA’s report finds that 95% of organisations agree that the technology is driving a new level of creativity and innovation, and agentic AI is a major leap forward in the evolution of GenAI.[2] Extending well beyond the simple question-and-reply functionality of today’s solutions, agentic AI can execute tasks within workflows. The technology can operate autonomously, make decisions based on real-time analysis and, critically, act on those decisions. It’s this ability to “think” and act autonomously that will enable the complete transformation of business workflows and unlock value.

Likely use cases for agentic AI

In practical applications, agentic AI is emerging in fields such as autonomous vehicles, automated trading systems, healthcare and the natural sciences, where agents will be programmed to perform tasks, make choices and interact with their environment in a way that mimics human agency. Hospitals and healthcare providers, for example, will increasingly use AI-powered diagnostic tools to assist in the analysis of medical images and the detection of diseases. In insurance, we can soon expect to see AI agents manage the end-to-end workflow for customer engagements. For example, an agent could update customer data with relevant information and complete complex tasks based on a customer inquiry (a minimal sketch of such an agent loop appears at the end of this piece).

Five recommendations for success

As organisations look to integrate agentic AI into their operations, how can they ensure a smooth deployment and maximise value creation? As experts in agentic AI, NTT DATA has identified five primary success factors:

Integration. Smart agents are part of a full stack of technologies and services. To realise the transformative benefits that organisations seek, agentic AI must be integrated with other technologies and activities. Alone, it is insufficient to respond effectively to interactions and deliver meaningful outcomes.

Alignment. Organisations need to find the right balance of technologies to meet their specific use-case and business requirements. GenAI must be seen as a core element of the business strategy itself; for now, 51% say this strategic alignment has not been fully achieved, according to NTT DATA’s study.[3]

Preparation. Data readiness and governance are critical to success and must be addressed in tandem with business process transformation. NTT DATA recommends a staggered data transformation framework so that organisations can realise value incrementally.

Operations. Mature AIOps will be critical for all organisations to maintain and improve agentic models, safeguard against model drift, establish guardrails and create a robust user-feedback loop to manage agent workflows. Human-in-the-loop oversight will be a key consideration as organisations weigh how much autonomy to grant certain decisions.

Partnerships. Look for differentiated third-party services that help to build mid- and long-term competitive advantages, and partners that support smart-agent ecosystems. Being able to tap into such ecosystems will help accelerate and derisk agentic AI innovation with access to the latest technologies.

Agentic AI is still nascent, but adoption will rapidly gather pace. Now is the time to start defining use cases, aligning preparatory resources and identifying partners who can make your smart-agent deployments a success.

Read NTT DATA’s Global GenAI Report now.

[1] NTT DATA, Global GenAI Report, 2024 https://services.global.ntt/en-us/campaigns/global-genai-report
[2] Ibid
[3] Ibid
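As an illustrative aside, the toy Python sketch below shows the “observe, decide, act” loop that separates agentic AI from question-and-reply GenAI, using the insurance example above. The tool names and the hard-coded decide() policy are assumptions for demonstration, not NTT DATA’s implementation; in a real system the decision step is an LLM call wrapped in the guardrails and human-in-the-loop oversight described under “Operations”.

```python
# A toy "observe -> decide -> act" agent loop. In a real system decide()
# is an LLM call and each tool is an audited, permissioned integration.

def lookup_policy(customer_id: str) -> dict:
    # Hypothetical CRM/policy lookup, stubbed for illustration.
    return {"customer_id": customer_id, "address_on_file": "old address"}

def update_address(customer_id: str, new_address: str) -> str:
    return f"address for {customer_id} updated to {new_address!r}"

TOOLS = {"lookup_policy": lookup_policy, "update_address": update_address}

def decide(goal: str, history: list) -> tuple[str, dict] | None:
    """Toy policy: look the customer up, then update, then stop.
    An agentic system would ask an LLM to choose the next tool call."""
    if not history:
        return "lookup_policy", {"customer_id": "C-42"}
    if len(history) == 1:
        return "update_address", {"customer_id": "C-42",
                                  "new_address": "1 New Street"}
    return None  # goal reached

def run_agent(goal: str) -> list:
    history = []
    while (step := decide(goal, history)) is not None:
        name, args = step
        result = TOOLS[name](**args)           # act
        history.append((name, args, result))   # observe
    return history

for step in run_agent("update the customer's mailing address"):
    print(step)
```

The point of the sketch is the loop itself: the agent chooses and executes successive tool calls until the goal is met, rather than returning a single reply, which is exactly the workflow-execution capability the report highlights.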


AI’s data problem: How to build the right foundation

2024 was undoubtedly “the year of AI,” with businesses across the globe attempting to fast-track implementations. In fact, EY’s 2024 Work Reimagined Survey found that generative AI (GenAI) adoption skyrocketed from 22% in 2023 to 75% in 2024. Meanwhile, Forrester found that 67% of AI decision-makers plan to ramp up their GenAI investments in the coming year. The potential value GenAI could deliver is undeniable, particularly in expediting content generation or powering smarter chatbots. With the right systems in place, businesses could exponentially increase their productivity. Yet, as transformative as GenAI can be, unlocking its full potential requires more than enthusiasm: it demands a strong foundation in data management, infrastructure flexibility, and governance. Without these critical elements in place, organizations risk stumbling over hurdles that could derail their AI ambitions.

Trusted, Governed Data

The output of any GenAI tool is entirely reliant on the data it’s given. The better the data, the stronger the results. It sounds simple enough, but organizations are struggling to find the most trusted, accurate data sources. According to a recent Cloudera survey, just over a quarter (26%) of IT decision-makers said trusted data was a challenge to implementing AI within their organizations. On top of that, 73% of respondents said their company’s data exists in silos and is disconnected, and 40% believe they are the sole person who knows where data exists in the organization. With data existing in a variety of architectures and forms, it can be impossible to discern which resources are the best for fueling GenAI.

Not only that, but giving GenAI access to arbitrary data sources also opens up serious governance risks. If you don’t know where your data exists or which data your LLMs have access to, how can you ensure you’re being compliant? As AI solutions process more data and move it across environments, organizations must closely monitor data flows to safeguard sensitive information and meet both internal governance guidelines and external regulatory requirements. Enterprises that fail to adapt risk severe consequences, including hefty legal penalties and irreparable reputational damage.

The Right Foundation

Having trustworthy, governed data starts with modern, effective data management and storage practices. This means having an environment capable of handling data in all its forms, structured and unstructured, which is increasingly complex to manage as data volumes grow. A hybrid approach often offers the best solution, allowing organizations to store and process sensitive information securely on-premises while leveraging the scalability and flexibility of the cloud for less critical workloads. It means managing and storing data where it will bring the most value to the enterprise, without having to move it. With the right hybrid data architecture, you can bring AI models to your data instead of the other way around, ensuring safer, more governed deployments. While only one-third of survey respondents currently deploy multi-cloud or hybrid data architectures, an overwhelming 93% agreed that these capabilities are essential for adapting to change. The infrastructure flexibility afforded by a hybrid approach ensures your company is ready to integrate tomorrow’s innovations, rather than being constrained by the limitations of yesterday’s solutions.
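To make “knowing which data your LLMs have access to” concrete, here is a minimal, hypothetical Python sketch of governance-aware source selection for a GenAI pipeline. The catalog entries, sensitivity labels, and policy are invented for illustration; they are not Cloudera’s implementation.

```python
# Every source carries a governance tag; the retrieval layer refuses
# anything not cleared for the AI use case. Labels are assumptions.
from dataclasses import dataclass

@dataclass
class DataSource:
    name: str
    residency: str     # e.g. "eu-west", "on-prem"
    sensitivity: str   # "public" | "internal" | "restricted"

CATALOG = [
    DataSource("product_docs", "eu-west", "public"),
    DataSource("support_tickets", "on-prem", "internal"),
    DataSource("payroll", "on-prem", "restricted"),
]

ALLOWED_FOR_GENAI = {"public", "internal"}

def sources_for_rag(catalog: list[DataSource]) -> list[DataSource]:
    """Return only the sources a GenAI pipeline is governed to read."""
    return [s for s in catalog if s.sensitivity in ALLOWED_FOR_GENAI]

for s in sources_for_rag(CATALOG):
    print(f"indexing {s.name} ({s.residency})")
# payroll is excluded: restricted data never reaches the model.
```

The design point is that the allow/deny decision lives in the data platform’s catalog, not in each AI application, so the same policy is enforced no matter which model or environment consumes the data.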
Ensuring Data is Secure and Compliant

GenAI has the potential to revolutionize productivity, but to harness its power, organizations need a strong foundation. By adopting the right data management practices, enterprises gain the ability to track, secure, and govern their data seamlessly from end to end, giving them confidence that the data powering their AI initiatives will deliver the most trustworthy, valuable insights.

To learn more about GenAI and how Cloudera can help you maximize your investments, click here.


5 dead-end IT skills — and how to avoid becoming obsolete

“That’s why soft skills are such a vital part of your professional armory as you aim to add longevity to your career,” he says. “Coupled with a better understanding of business strategy and the part that digital infrastructure plays in it, you’re better equipped to handle the shifts in the tech industry.”

Future-proofing an IT career

Talk to a manager about what options are available for training and what skills will offer the best chance of success within the organization, suggests Chris Campbell, CIO at DeVry University. “If you are currently in one of the roles mentioned, work to enhance and develop your durable skills such as problem solving, communication, lifelong learning, collaboration, and critical thinking,” Campbell says. “Along the way, learn and leverage tools that allow you to activate those skills in the pursuit of your work. These skills, along with the knowledge of how to use emerging technology, will empower you regardless of industry, role, or company.”

Campbell suggests that working in technology will involve virtual peers on a day-to-day basis sooner than some might expect.


Tapping into the benefits of an open data lakehouse for enterprise AI

What does it take to implement and maximize the value of AI within an organization? In short, it takes data, and a lot of it. As it stands, many large organizations find themselves relying on a mix of solutions, platforms, and architectures to handle the volume of structured and unstructured data that has been created as their operations have expanded. But that approach can lead to complications in terms of data access, visibility, manageability, and ease of use for analytics and AI. Instead, organizations should look toward solutions like an open data lakehouse that can provide a unified architecture for data storage, processing, and analytics.

Challenges to Effective Data Management

Among the issues that surround data management and effective AI, one of the most common comes in the form of data silos that result from sprawling systems and operations. A large enterprise typically has a vast pool of data to store and manage. Having that data spread across the organization and a variety of environments can lead to pockets of data that are difficult to access, misused, or even unaccounted for in broader data management and unification efforts.

There can also be serious issues surrounding data governance, security, and trustworthiness. Data can fall into the wrong hands either purposefully or accidentally, whether via a malicious hacker or an unknowing employee who transferred data to a personal device that was later stolen. The scenarios are numerous. If an organization is going to achieve truly impactful, real-time outputs from analytics and AI, it needs to ensure that all data, structured and unstructured alike, is properly governed and managed even as the scale of that data grows rapidly.

The Value of an Open Data Lakehouse

An open data lakehouse provides a number of advantages that go beyond what other data architectures are capable of. As the name suggests, an open data lakehouse takes the flexibility of a data lake and merges it with the performance of a data warehouse, enabling better, quicker analytics to run on all relevant data. An open data lakehouse like Cloudera’s also brings the ability to operate both on-premises and in the public cloud: it can be built once and run anywhere, saving time and ensuring portability across clouds.

Acting as a unified data platform means an open data lakehouse can do away with the silos that are hampering analytics and AI initiatives. That level of unification also means organizations can encourage data democratization and self-service analytics that allow users enterprise-wide to generate impactful insights from their data.

We’ve talked a lot about the open data lakehouse to this point, but what do we mean when we say “open”? In this case, it means interoperability and compatibility with a variety of data processing frameworks, programming languages, and analytics tools. By establishing this openness, the data lakehouse enables deep collaboration and innovation, allowing data teams to leverage their preferred tools and methodologies.

Both business needs and technological capabilities are constantly evolving. As those changes take hold, data platforms need to be flexible and future-proofed enough to keep up. An open data lakehouse architecture makes it easier to support modern analytics workloads, SQL-based querying, and advanced analytics frameworks. Whatever form the workload or analytics use case takes, an open data lakehouse offers a robust platform to adapt and scale.
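To make “open” concrete, the hedged sketch below creates and queries a table in Apache Iceberg, an open table format commonly used by lakehouse platforms, from PySpark; the same table could then be read by any other Iceberg-compatible engine. The package coordinates, catalog name, and warehouse path are assumptions for a local demo, not a Cloudera-specific configuration.

```python
# Minimal local PySpark + Apache Iceberg sketch. Assumes the Iceberg
# Spark runtime is fetchable; names and versions are illustrative.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("lakehouse-demo")
    .config("spark.jars.packages",
            "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.5.0")
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "/tmp/lakehouse")
    .getOrCreate()
)

spark.sql("CREATE NAMESPACE IF NOT EXISTS demo.sales")

# One open-format table serves Spark jobs and SQL analysts alike.
spark.sql("CREATE TABLE IF NOT EXISTS demo.sales.orders "
          "(id BIGINT, region STRING, amount DOUBLE) USING iceberg")
spark.sql("INSERT INTO demo.sales.orders VALUES "
          "(1, 'EMEA', 120.0), (2, 'APAC', 80.5)")

# SQL-based querying against the same governed copy of the data.
spark.sql("SELECT region, SUM(amount) AS total "
          "FROM demo.sales.orders GROUP BY region").show()
```

Because the table format, not the engine, owns the data, teams can bring their preferred tools to a single copy of the data, which is the interoperability the paragraph above describes.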
Learn more about how Cloudera’s open data lakehouse can support your AI and analytics initiatives.


How to talk to your board about tech debt

Also, beware the proof-of-concept trap. Don’t get bogged down in testing multiple solutions that never see the light of day. Instead of focusing on single use cases, think holistically about how your organization can use AI to drive topline growth and reduce costs. What part of the enterprise architecture do you need to support this, and what part of your IT is creating tech debt and limiting your action on these ambitions?

Present a balanced solution

Here’s where many CIOs stumble: presenting technical debt as a problem that needs to be eliminated. Instead, show how leading companies manage it strategically. Our research reveals that top performers allocate around 15% of their IT budget to debt remediation. This balances debt reduction with future strategic innovations, which means committing to continuous updates, upgrades, and management of end-user software, hardware, and associated services. And it translates into an organization that’s both stable and innovative.

We also found that throwing too much money at tech debt can be counterproductive. Our analysis found a distinct relationship between a company’s digital core maturity and technical debt remediation. Using more of the IT budget to pay down tech debt improves digital core maturity only up to a certain point. Beyond this peak, it indicates that a company is over-indexing investments in technical debt and not building its digital core capability effectively and efficiently.

Close with clear choices


CIOs’ top 2025 goal? Turning around IT’s sagging reputation

Moreover, CIOs must migrate from “meeting expectations” to “exceeding expectations.” The consensus among my sample set of technology leaders indicates that less than 10% of IT organizations are “exceeding expectations” today.

During his seven years as Assistant Secretary of Commerce for Communications and Information (NTIA), where he was a principal advisor to the President, Vice President, and Secretary of Commerce, Larry Irving coined the phrase “digital divide,” advocating for domestic and international policies to increase equitable access to the Internet and related technologies.

Putting technology in the hands of end users is not enough; end users need to understand how to use the technology productively. Many CIOs recount misunderstandings among staff about when IT’s job is done. During the rollout of a collaboration platform, a project manager asked her team, “When are we finished?” The team responded unanimously, “When we have installed the icon [for the new app] on everyone’s desktop.”


Uber branches out into AI data labeling

Uber no longer offers just rides and deliveries: It’s created a new division hiring out gig workers to help enterprises with some of their AI model development work. Scaled Solutions grew out of the company’s own needs for data annotation, testing, and localization, and is now ready to offer those services to enterprises in retail, automotive and autonomous vehicles, social media, consumer apps, generative AI, manufacturing, and customer support. Its first customers include Aurora Innovation, which makes self-driving software for commercial trucks, and game developer Niantic, which is building a 3D map of the world. The skills Uber requires its new gig workers to have vary, Chris Brummitt, the company’s senior director for communications in Asia-Pacific, said via email. “It depends on the task, but some might need language skills, some programming, some no specific skills.”
