CIOs and CISOs grapple with DORA: Key challenges, compliance complexities

The delay in the arrival of the Regulatory Technical Standards (RTS) does not help. “The legislator has not completed the regulatory process,” says Giancarlo Butti, an auditor and expert in privacy and security. “To date, only some of the delegated regulations have been officially released, so financial entities that are, for example, redefining contracts with suppliers will subsequently have to — once the other delegated regulations arrive — add the part relating to the management of relationships with subcontractors. It is very important, in fact, that financial entities carefully consider the risk of the entire supply chain. An aspect that is not considered enough is that the impact of DORA does not only involve financial entities but, indirectly, the entire ICT supply chain.” The complexity of DORA, therefore, lies not in the text itself, substantial though it is, but in the work compliance entails. As Davide Baldini, lawyer and partner of the ICT Legal Consulting firm, points out, “DORA is a very clear law, as it is a regulation, which is applied equally in all EU countries and contains very detailed provisions. By comparison, NIS2 is based on principles and is a directive, so each member country has room to maneuver in its implementation. However, DORA is very prescriptive, and this makes compliance complex in terms of time and the human and financial resources that need to be deployed.”

CIOs and CISOs grapple with DORA: Key challenges, compliance complexities Read More »

Researchers find you don’t need a ton of data to train LLMs for reasoning tasks

Large language models (LLMs) can learn complex reasoning tasks without relying on large datasets, according to a new study by researchers at Shanghai Jiao Tong University. Their findings show that with just a small batch of well-curated examples, you can train an LLM for tasks that were thought to require tens of thousands of training instances. This efficiency is due to the inherent knowledge that modern LLMs obtain during the pre-training phase. With new training methods becoming more data- and compute-efficient, enterprises might be able to create customized models without requiring access to the resources of large AI labs.

Less is more (LIMO)

In their study, the researchers challenge the assumption that you need large amounts of data to train LLMs for reasoning tasks. They introduce the concept of “less is more” (LIMO). Their work builds on top of previous research that showed LLMs could be aligned with human preferences with a few examples.

[Figure: Less is More (LIMO) for reasoning (source: arXiv)]

In their experiments, they demonstrated that they could create a LIMO dataset for complex mathematical reasoning tasks with a few hundred training examples. An LLM fine-tuned on the dataset was able to produce complex chain-of-thought (CoT) reasoning that enabled it to accomplish the tasks at a very high success rate. For example, a Qwen2.5-32B-Instruct model fine-tuned on 817 training examples chosen based on LIMO reached 57.1% accuracy on the highly challenging AIME benchmark and 94.8% on MATH, outperforming models that were trained on a hundred times more examples. It also scored higher on the benchmarks than reasoning models such as QwQ-32B-Preview (a version of the Qwen model that has been trained for reasoning) and OpenAI o1-preview, both of which have been trained with larger data and compute resources.
Moreover, LIMO-trained models generalize to examples drastically different from their training data. For example, on the OlympiadBench scientific benchmark, the LIMO model outperformed QwQ-32B-Preview, and on the challenging GPQA benchmark, it achieved 66.7% accuracy, close to OpenAI-o1-preview’s leading score of 73.3%.

What does it mean for enterprise AI?

Customizing LLMs is an attractive use case for enterprise applications. Thanks to techniques such as retrieval-augmented generation (RAG) and in-context learning, LLMs can be customized to use bespoke data or perform new tasks without the need for expensive fine-tuning. However, reasoning tasks often require training and fine-tuning LLMs. The widely held belief has been that such tasks require large volumes of training examples with highly detailed reasoning chains and solutions. Creating such datasets is slow and impractical for many applications and companies. More recently, researchers have shown that pure reinforcement learning approaches can enable models to train themselves for reasoning tasks by generating many solutions and choosing the ones that work best. While this approach requires less manual effort, it still demands expensive compute resources that are beyond the reach of many enterprises. On the other hand, crafting a few hundred examples is an endeavor that many companies can tackle, bringing specialized reasoning models within the reach of a wider range of organizations. “This discovery has profound implications for artificial intelligence research: It suggests that even competition-level complex reasoning abilities can be effectively elicited through minimal but curated training samples,” the researchers write.

Why LIMO works

In their experiments, the researchers identify two key reasons why LLMs can learn complex reasoning tasks with fewer examples. First, state-of-the-art foundation models have been trained on a very large amount of mathematical content and code during pre-training.
This means that these LLMs already possess rich reasoning knowledge in their parameters that can be activated through carefully crafted examples. Second, new post-training techniques have shown that allowing models to generate extended reasoning chains significantly improves their reasoning ability. In essence, giving the models more time to “think” allows them to unpack and apply their pre-trained knowledge more effectively. “We hypothesize that successful reasoning emerges from the synergy of these two factors: rich pre-trained knowledge and sufficient computational resources at inference time,” the researchers write. “These developments collectively suggest a striking possibility: If models possess rich reasoning knowledge and are given adequate computational space, then activating their reasoning capabilities may require only a small number of high-quality training samples that encourage extended deliberation, rather than massive fine-tuning datasets.”

[Figure: Choosing more complex problems to include in the training dataset can have a significant effect on the trained model’s accuracy in reasoning tasks (source: arXiv)]

According to the researchers’ findings, creating useful LIMO datasets hinges on choosing the right problems and solutions. Data curators should prioritize challenging problems that require complex reasoning chains, diverse thought processes and knowledge integration. The problems should also deviate from the model’s training distribution to encourage new reasoning approaches and force it toward generalization. Accordingly, solutions should be clearly written and well organized, with the reasoning steps adapted to the complexity of the problem. High-quality solutions should also provide strategic educational support by gradually building understanding through carefully structured explanations.
“By focusing on a minimal yet meticulously curated set of reasoning chains, we embody the core principle of LIMO: High-quality demonstrations, rather than sheer data volume, are key to unlocking complex reasoning capabilities,” the researchers write. The researchers have released the code and data used to train the LIMO models in their experiments. In the future, they plan to expand the concept to other domains and applications.
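The curation principle in the passage above (a few hundred hard, diverse problems rather than tens of thousands of routine ones) can be sketched as a simple selection filter. This is an illustrative toy, not the paper's actual pipeline; the field names and the "longest reasoning chain" difficulty heuristic are assumptions standing in for the human and model-assisted quality judgments real curation requires:

```python
# Toy LIMO-style curation: pick a small, hard, diverse subset of problems.
# The scoring heuristic below is illustrative only.

def curate(problems, k=3):
    """Select the k problems with the longest reasoning chains,
    keeping at most one problem per topic to preserve diversity."""
    ranked = sorted(problems, key=lambda p: p["chain_steps"], reverse=True)
    chosen, seen_topics = [], set()
    for p in ranked:
        if p["topic"] in seen_topics:
            continue  # skip near-duplicates of an already-chosen topic
        chosen.append(p)
        seen_topics.add(p["topic"])
        if len(chosen) == k:
            break
    return chosen

candidates = [
    {"id": 1, "topic": "algebra",   "chain_steps": 4},
    {"id": 2, "topic": "geometry",  "chain_steps": 12},
    {"id": 3, "topic": "algebra",   "chain_steps": 9},
    {"id": 4, "topic": "number-th", "chain_steps": 7},
    {"id": 5, "topic": "geometry",  "chain_steps": 11},
]

subset = curate(candidates, k=3)
print([p["id"] for p in subset])  # → [2, 3, 4]: hardest problem per topic
```

In practice the hard part is the scoring itself; the paper's point is that a few hundred examples chosen this carefully can replace tens of thousands chosen at random.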

Researchers find you don’t need a ton of data to train LLMs for reasoning tasks Read More »

Digital Mindset: The Secret to Bottom-Up GenAI Productivity

As organizations look to increase business performance through generative AI, traditional methods for increasing adoption of new technologies are unlikely to be effective for several reasons.

First, unlike most enterprise systems, which are designed to automate specific tasks, GenAI tools are general purpose. While standard use cases can be developed and shared, sustainable productivity gains will result from employees innovating and finding novel ways to use GenAI tools in real-time as conditions change.

Second, many GenAI tools are enabled rather than implemented, thus bypassing the user engagement opportunities a formal implementation project affords. For example, many organizations are using GenAI for text generation in word processors and notetaking in video conference software. No implementation project was needed to make this leap; the new functionality was simply activated.

Third, GenAI tools are probabilistic rather than deterministic. Having employees attend structured training makes sense for a deterministic system, one that will always generate predictable outputs from a given set of inputs. Conversely, GenAI tools rely on statistical methods and have inherent variability in their outputs. Enter the same prompt in your favorite large language model (LLM) twice and you will get two different responses.

The final key difference between prior technologies and GenAI is the level of technical knowledge required. Unlike previous technologies, many GenAI tools are designed to be low code or no code. Users tell the technology what to do via natural language or simple graphical interfaces. Because there is no need to translate desired functions into computer code, employees can innovate automations independently, breaking the reliance on IT and specialized coding skills.
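The probabilistic difference noted above (same prompt, two different responses) comes from how LLMs pick each output token: by sampling from a probability distribution rather than always taking the single best option. A minimal sketch of temperature-scaled softmax sampling, using a toy three-token vocabulary rather than a real model:

```python
import math, random

def sample_next_token(logits, temperature=1.0, rng=random):
    """Temperature-scaled softmax sampling: the mechanism behind
    'same prompt, different answers'. Toy example, not a real LLM."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                      # subtract max for numeric stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()                     # draw from the distribution
    cum = 0.0
    for token, p in enumerate(probs):
        cum += p
        if r <= cum:
            return token
    return len(probs) - 1

logits = [2.0, 1.5, 0.3]  # model scores for three candidate tokens
rng = random.Random(0)
draws = [sample_next_token(logits, temperature=1.0, rng=rng) for _ in range(10)]
print(draws)  # repeated sampling over the same input varies draw to draw
```

Lowering the temperature concentrates probability on the top-scoring token, which is why deployments that need repeatable answers typically run at low temperature.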
Culture at the Core of GenAI Adoption

The challenge for business leaders will be to increase the type of GenAI adoption that continually taps new pools of business value through independent, real-time use case innovation on pace with changing business demands. This will require an important cultural component that I call “digital mindset.” Digital mindset entails a functional understanding of data and systems, enabling innovation in daily work activities across multiple domains. Digital mindset is a productivity accelerant, insufficient by itself, and most impactful when paired with domain expertise and other soft skills, like problem-solving and communications.

Leaders Can Drive Bottom-Up GenAI Adoption

Cultural changes require a strong leadership push to be successful. There are several practical steps leaders can take to begin building or reinforcing digital mindset and driving value-add GenAI adoption:

Role model the behavior. Leaders should be embodiments of digital mindset, role modeling the desired behaviors and consistently walking the walk. To do this, leaders should gain hands-on experience using GenAI tools.

Create the right conditions. Encouragement for employees to use GenAI must be matched with a positive user experience, especially for first-time users. Leaders should establish an infrastructure that makes GenAI both safe and easy to use.

Communicate clearly and transparently. GenAI adoption should be enhanced through a multi-pronged communication plan, with messaging that evolves over time and, at a minimum, accomplishes a few critical objectives: provides clear guidance, demystifies the organization’s approach to GenAI, builds excitement, sets expectations, and celebrates specific examples of success.

Embrace the culture shift.
For organizations that are resistant or lagging, leaders need to use cultural interventions to treat the root causes — the underlying employee beliefs and values — rather than the symptoms. Limiting beliefs like “AI is going to replace me” or “I need to wait for training before I can start” must be overcome to build momentum toward sustained success.

Effective cultural interventions create positive changes in employee attitudes that drive new behaviors that generate artifacts that create business value. Because the change unfolds through these layers sequentially, it’s important to have benchmarks for each layer that help indicate a strong culture (“digital mindset”) versus a weak one (“analog mindset”). Some examples of good and bad at each layer include:

Layer 1: Culture — Beliefs and Values
Digital mindset examples – Technology can make my role more valuable; using new technologies will create skills that transfer to other systems; using new technology is a way to learn
Analog mindset examples – Technology will replace my job; by the time I learn this new technology, it will change again; I need to wait for training before I start

Layer 2: Attitudes
Digital mindset examples – Enthusiastic view of technology
Analog mindset examples – Cynical view of technology

Layer 3: Behaviors
Digital mindset examples – Seek out resources and training; experiment with new technologies on daily tasks; spread knowledge to colleagues
Analog mindset examples – Disparage and resist new technology; subvert implementation efforts; encourage complexity to reduce automation potential

Layer 4: Artifacts — Outcomes that Deliver Business Value
Digital mindset examples – Process innovation; productivity gains; analytics enablement
Analog mindset examples – Manual processes; unreliable data; stale skillsets

Measuring Progress

Levels of GenAI adoption can be measured across a continuum ranging from “resistant”
to “champion adoption,” with several steps in between.

GenAI Adoption Levels (Worst to Best)

0 Resistant – Actively resists or avoids using GenAI tools, either due to fear, mistrust or a perception that they threaten job security.
1 Forced adoption – Engages minimally with GenAI, using only the basic features necessary to meet mandatory requirements or appease supervisors.
2 Cautious adoption – Begins to explore GenAI’s capabilities beyond the bare minimum, often through limited, low-stakes experimentation.
3 Enthusiastic adoption – Shows genuine interest in integrating GenAI tools into their workflow, actively participating in use cases provided by supervisors or team leaders.
4 Creative adoption – Develops novel use cases for GenAI independently, often designing solutions tailored to specific departmental needs or even contributing to larger strategic goals.
5 Champion adoption – Fully embraces GenAI as a core part of their work and actively promotes its use across departmental boundaries. Champions are adept at identifying new opportunities for GenAI, both operationally and strategically,

Digital Mindset: The Secret to Bottom-Up GenAI Productivity Read More »

Black IT leaders weigh in on what DEI means for IT

Jackson recounts the time he gave a talk on inclusivity, equality, belonging, parity, and justice at a manufacturer, where he sought to break down assumptions. “I said, ‘At the end of the day it doesn’t mean if you’re white, you’re wealthy, or if you’re Black, you’re a Democrat, or that if you have friends who are part of the LGBTQ community that you’re exploring. It’s just people are people,’” Jackson explains. He spoke to the uniqueness of all individuals and to inclusivity as a culture where everyone belongs. He adds, “Diversity for me just means people.” Jackson says a man approached him afterward, first confiding that he only attended the talk because it was mandatory and then saying, “I didn’t think you and I could have anything in common. But you opened my mind.” “He seemed to take away the ability to be more collaborative with people who didn’t share his identity,” he says.

‘Inclusive excellence’ advances careers, organizational success

Robert Scott, ITSMF vice president and dean of its Global Institute for Professional Development, defines “diversity” as “having a robust mix of people from all different backgrounds that brings diversity of thought and capabilities.” He acknowledges that diversity “does include things that are flashpoints today such as gender, race, and orientation, but it’s also diversity of thought and educational diversity.” He adds, “Diversity is just having a mix. Equity is all about treating those folks on a level playing field. And inclusion is about making people feel like they belong and that they’re treated equitably.” As for the benefits produced by those concepts, he points to research from University of Michigan professor and social scientist Scott Page that shows diverse teams operating in an equitable and inclusive environment routinely outperform homogenous teams in terms of business results, creativity, and innovation.
Scott knows it takes work to create equitable and inclusive environments where diverse individuals can succeed. He mentored a Black woman who, he says, thought hard work alone would get her ahead. Like others who have historically been shut out of the executive ranks, she didn’t realize that she needed more to advance. Scott coached her on the fact that image and exposure are also needed to progress into management and executive tiers; hard work, how you do that hard work, and who sees it all matter. Scott advocated for her, and she eventually became a CIO. But image and exposure involve mentoring and sponsorship, which is where many people — particularly first-generation professionals and those from underrepresented groups — can be shut out. If you don’t have access to potential mentors and sponsors “because you’re not included and you don’t feel like you belong, if people don’t engage you, if you’re not invited to the weekend golf outing or out with the group because you’re not welcome there, then you won’t advance,” Scott explains. ITSMF programs such as corporate workshops help show companies those obstacles and give them strategies for removing them, he says. “That’s the power of DEI,” he says, noting he uses the term “inclusive excellence” to describe such work.

Black IT leaders weigh in on what DEI means for IT Read More »

Want to Use ChatGPT Like a Pro? These Courses Can Help

For instance, you might not realize it can simplify your workflow by summarizing lengthy business reports or drafting professional emails in seconds. With ChatGPT-5 on the horizon, now is an excellent time to work on going from a casual user to a ChatGPT expert. This 2025 ChatGPT Skills and Creativity training bundle won’t just show you ways to save time in your personal life but at work, too. You can get lifetime access for $29.99 (reg. $249.95). These ChatGPT training courses focus on two key ideas: the technical and creative sides of ChatGPT. On the technical side, they’ll teach you how to communicate with the AI chatbot — the most vital part of getting good results. Right now, you might just be typing something into the textbox like “Summarize this for me…” and pasting your text, but you may be getting subpar responses. The Creative Writing & Content Creation with ChatGPT course covers prompt engineering and teaches you other keywords and phrases to add to get a better output. As for the creative side of ChatGPT, you’ll learn how to think outside the box for ways to use it. Sure, you can use ChatGPT to help plan a vacation, but an entire course will teach you how to use AI to draft a business proposal or generate market insights. How? Think researching destinations, planning activities, budgeting your trip, and calculating expenses. And for that business proposal, think structuring outlines and gathering key data. You can use similar techniques to automate meeting notes or generate quick client follow-ups. Learn how to use AI the right way with the 2025 ChatGPT Skills and Creativity training bundle at $29.99 for lifetime access. Prices and availability are subject to change.

Want to Use ChatGPT Like a Pro? These Courses Can Help Read More »

How CIOs Can Prepare for Generative AI in Network Operations

AI networking has been a hot topic over the past few years and is a subset of AIOps. Generative AI (GenAI), which is part of AI networking, has taken this hype to a new level with the potential to transform network operations. However, with its conversational interface and ongoing learning capabilities, GenAI will likely be met with both favor and distrust. But what can enterprises really gain by using GenAI as part of network operations? CIOs must be aware of new GenAI capabilities for network operations, business case considerations and ways to build trust to minimize adoption risk. GenAI promises great potential to enable improvements to long-standing traditional networking operations practices across Day 0, Day 1, and Day 2. With GenAI, network operations can accelerate initial configurations, improve the ability to change vendors, drive more efficient troubleshooting and simplify documentation access.

Day 0

For Day 0, for example, an engineer could use an iterative process and ask the GenAI network tool via a natural language interface to design a leaf-spine network to support 400 physical servers using Vendor X. Additional information like SLA requirements (such as availability and throughput) can also be included via natural language to deliver the desired performance level and a design that includes cost implications. Another example is in the area of capacity planning: as new users, applications, and architectures are adopted, network planning becomes more complicated. GenAI can be used to help size network infrastructure and optimize costs based on the number and types of applications hosted on-premises, in the cloud and at end-user locations (in the office, at home or other locations).

Day 1

The GenAI network tool can then help generate, validate and optimize all the required Day 1 configurations based on desired criteria (for example, by price or performance).
It may not be 100% accurate, which is why it may require an iterative process to refine GenAI tool outputs to accelerate and optimize network setup. Even if it requires several iterations, the use of GenAI would represent a substantial improvement over current rigid processes and tools, reducing time and errors by up to 25%. We envision that this will be leveraged in all networking domains (WAN, data center, cloud, and campus) to assist in the design and setup of networks.

Day 2

AI networking enhances Day 2 network operational support by correlating multiple data inputs, identifying problems faster, yielding quicker resolution and, where applicable, spotting problems proactively before a user is aware. GenAI will bring additional capabilities, including a conversational interface and the ability to learn over time. It can also enhance user experience with specific outputs such as text, audio, video, or graphics. For example, to help isolate problems, CIOs can ask GenAI to build a dynamic graphic of networking performance issues over time based on packet loss, latency and jitter. It can also focus on specific questions such as “Is the CEO having network performance issues?” GenAI can create detailed configurations and troubleshooting procedures based on natural language inputs without explicit templates. GenAI tools can drive network operational support time savings of up to 25% compared with the status quo by driving efficiencies that can’t reasonably be achieved by scaling manual resources. It removes manual processes to identify issues more quickly, resulting in faster problem resolution.

Calculate the Value Before Investing

CIOs must ask pertinent questions to gain a complete understanding of the inherent value of GenAI networking, its use cases and common tools. A key facet in the process of GenAI adoption involves building the business case and calculating the value to the organization.
Asking pertinent questions can offer more insights while creating a business case to determine the value of GenAI functionality. Specifically, determine if aligning network operations with GenAI can help build scale, control or reduce costs, drive resource efficiency, foster agility to keep up with the digital business and deliver a better end-user experience.

Prove the Concept First

In addition to the immaturity of GenAI networking functionality and the need to quantify the value, another key limitation that needs to be overcome to achieve wider adoption by network operations is a lack of trust. Network teams have been burned many times by vendor claims of automation or a “single pane of glass” to solve existing issues. This, in part, is why network operations teams have been slow to adopt network automation and are skeptical about GenAI. On top of this, GenAI networking tools may yield inconsistent responses, which introduces risk and fosters mistrust. However, network operations teams need to include GenAI functionality in their RFPs/RFIs to determine the scope, value and capabilities of the solutions in the market as they mature. Running a proof of concept (POC) is key for network operations personnel to determine the accuracy of the GenAI solution, alongside its maturity, level of trust and degree of comfort. This is really about quantifying the accuracy of the GenAI networking solution across a wide range of scenarios. Even in production, we expect network operations personnel to have to validate some or many GenAI outputs, but baselining the capability gives context to the accuracy and the level of unsupervised trust (if any) that should be given. When running the POC, begin by testing in a lab environment before moving to a real-life production environment. Test the solution over several weeks and months to stress it as much as possible.
Have multiple personnel leverage the tool to capture multiple opinions/perspectives. Validate the GenAI networking tool outputs for accuracy by testing against alternative sources. Measure the time to perform tasks with the GenAI networking tool and with the previous/current method. In short, the goal is to compare process efficiency and accuracy of the current approach versus the intended GenAI approach. As part of this POC, both the level of trust and value (business case) can be determined to help inform a sourcing decision and simplify
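The measurement steps above (time each task both ways, validate outputs against alternative sources) reduce to a simple scorecard. The tasks and numbers below are illustrative placeholders, not benchmarks from any real POC:

```python
# Toy POC scorecard: compare a GenAI-assisted workflow against the current
# manual process on the same tasks. All numbers are illustrative.

def poc_summary(trials):
    """Each trial: (task, manual_minutes, genai_minutes, genai_output_correct)."""
    n = len(trials)
    manual = sum(t[1] for t in trials)
    genai = sum(t[2] for t in trials)
    accuracy = sum(1 for t in trials if t[3]) / n
    return {
        "time_saved_pct": round(100 * (manual - genai) / manual, 1),
        "genai_accuracy_pct": round(100 * accuracy, 1),
    }

trials = [
    ("vlan_config",      30, 12, True),
    ("acl_update",       45, 20, True),
    ("bgp_troubleshoot", 90, 70, False),  # output needed manual correction
    ("firmware_plan",    60, 25, True),
]

print(poc_summary(trials))
# → {'time_saved_pct': 43.6, 'genai_accuracy_pct': 75.0}
```

A summary like this, collected from multiple engineers over several weeks, gives the business case its time-savings number and the trust discussion its accuracy baseline.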

How CIOs Can Prepare for Generative AI in Network Operations Read More »

How Advanced Analytics Can Transform Your CX Practice

Advanced Analytics: The Future Of CX

Despite the recent challenges in overall experience quality seen in Forrester’s Customer Experience Index (CX Index™) benchmarks, customer experience (CX) remains a priority for many organizations. Unfortunately, these organizations have struggled to realize tangible benefits from their CX programs. In new research, we discuss key challenges CX programs face when relying on customer feedback as their primary capability and why they need to leverage more advanced quantitative analytics to drive action, increase financial impact, and prepare for a more analytics-driven future.

The Challenges Of Survey-Only CX

CX measurement programs report that their most common challenges are driving action to improve experience quality and proving the financial importance of CX. One primary cause is their reliance on soliciting customer feedback, usually through surveys. Surveys don’t often provide definitive root causes that compel business functions to make changes, and the relationship between survey scores and financial performance remains theoretical in most organizations. While a survey-reliant CX strategy is holding CX programs back, we are not advocating that they stop surveying customers. Instead, they should reduce their reliance on surveys and use that feedback data as part of a more comprehensive quantitative approach.

The Quantitative Future Of CX

Integrating advanced quantitative analytics into their strategy helps CX programs drive action and prove value. This involves shifting from treating survey score metrics as their primary output to using feedback data as an input to more advanced techniques. When CX programs combine customer feedback data with other metrics like operational interaction data, financial outcome data, and additional non-survey perception data, these inputs to advanced analytics can produce more actionable and financially connected insights than survey feedback alone.
Executing On The Promise Of Advanced Analytics In CX

After discussions with dozens of CX leaders, top vendors, and service providers in CX analytics, we found a consensus on several steps organizations must take to implement advanced CX analytics successfully. Among the five key components presented in our research, two demand considerable attention:

Enabling a comprehensive experience dataset. This includes ensuring the availability, quality, and validity of a comprehensive dataset of customer perceptions, interactions, and the financial outcomes of their behavior. Most experts agree that this is the most critical and challenging aspect of implementing an advanced CX analytics strategy.

Operationalizing insights from advanced analytics. While insights from advanced analytic techniques can prove fascinating in many organizations, acting on the outputs is crucial. This means using advanced analytic insights to take a proactive approach to CX, where organizations use diagnostics, predictions, and prescriptions to manage the experiences of all customers rather than reacting to feedback from a small percentage who respond to surveys.

Avoiding Missteps In The Adoption Of Advanced Analytics In CX

For our recent research, we have defined advanced CX analytics as “advanced analytic techniques — including diagnostic, predictive, and prescriptive machine learning — that identify how customers’ experiences affect their behaviors.” The terms “advanced analytics” and “predictive analytics” are used somewhat loosely in the CX ecosystem. While useful, language analytics, conversational and digital intelligence, and sentiment analysis differ from advanced diagnostic, predictive, and prescriptive methods. CX leaders should ensure they understand these differences when pursuing quantitative CX strategies. Another variation in CX analytics is leveraging machine learning models to predict common CX survey metrics like Net Promoter Score℠ (NPS) or customer satisfaction (CSAT).
While novel, most organizations would be better served predicting the actual outcomes of customer behavior with direct financial impacts, rather than making the effort to develop these capabilities only to reinforce the challenges associated with relying on customer perceptions to manage experiences.

Final Advice

While advanced analytic techniques are uncommon in CX practices today, CX programs and leaders should challenge themselves and find a path to facilitate, collaborate, or expand the CX mandate to pursue a more quantitative approach that will prepare them for the future of CX. If you’re ready to advance your CX program’s analytics strategy, Forrester can help. Forrester clients can access our two new reports, Why You Need Advanced Analytics To Transform CX and How To Transform CX With Advanced Analytics, where we provide a practical approach for pursuing advanced CX analytics in your organization. They can also schedule a guidance session on the topic. CX leaders can join us in Nashville this summer for Forrester’s North America CX Summit, where I will be available for live, in-person guidance.
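As a concrete illustration of predicting behavioral outcomes rather than survey scores, here is a toy churn-risk model that combines survey feedback with operational interaction data. The features and weights are invented for illustration; a real program would fit coefficients on historical churn outcomes, for example with logistic regression:

```python
import math

# Toy churn-risk model combining survey feedback with operational data.
# The weights are illustrative stand-ins for coefficients a fitted model
# (e.g. logistic regression on historical churn labels) would learn.

def churn_risk(nps_score, support_tickets_90d, logins_per_week):
    """Return a 0-1 churn probability estimate via a logistic function."""
    z = (-0.4 * nps_score              # happier customers churn less
         + 0.6 * support_tickets_90d   # service friction predicts churn
         - 0.5 * logins_per_week       # engagement predicts retention
         + 1.0)                        # intercept
    return 1.0 / (1.0 + math.exp(-z))

happy_engaged = churn_risk(nps_score=9, support_tickets_90d=1, logins_per_week=5)
unhappy_idle = churn_risk(nps_score=2, support_tickets_90d=6, logins_per_week=0)
print(round(happy_engaged, 3), round(unhappy_idle, 3))
```

The design point is that survey sentiment is one input among several: the ticket and login features carry signal that a survey-only program never sees, and the predicted outcome (churn) ties directly to financial impact.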

How Advanced Analytics Can Transform Your CX Practice Read More »

ASUSTeK Hit With $10.5M Verdict In Chip Patent Case

By Andrew Karpan (February 13, 2025, 10:27 PM EST) — A jury in the Eastern District of Texas on Thursday found that Taipei-based laptop maker ASUSTeK infringed electronic component patents held by a fellow Taiwanese rival and owed $10.5 million…. source

ASUSTeK Hit With $10.5M Verdict In Chip Patent Case Read More »

Create the future with AI: Join Microsoft at NVIDIA GTC

Presented by Microsoft and NVIDIA

AI is producing tangible results for business at an astonishing rate and scale, so the new question becomes: how do we tap into that potential? But building a robust AI strategy is about more than adopting new technology. It means fostering a culture that prioritizes innovation, ensuring security at scale, and equipping developers with the tools to succeed, balancing cutting-edge innovation with secure deployment and developer empowerment. By leveraging a broad selection of models, ensuring high-quality deployment, and harnessing the power of strategic partnerships, organizations can create AI solutions that drive real business value.

Microsoft is an elite sponsor at this year’s NVIDIA GTC AI Conference, March 17-21, where company leaders will showcase the power of Microsoft Azure AI, an end-to-end AI platform that lets businesses of every size innovate quickly, securely, and responsibly.

The NBA chose Azure OpenAI Service, accelerated by NVIDIA, to easily incorporate OpenAI models into its applications, speeding up time to market for new, innovative features. By helping fans connect with the league the way they want, with personalized, localized insights, the NBA is staying at the forefront of the fan experience.

BMW created a mobile data recorder (MDR) solution, placing an IoT device in each development car to transmit data over a cellular connection to an Azure cloud platform, where Azure AI solutions facilitate efficient data analysis. The vehicle data covered by the system has doubled, and data delivery and analysis happen 10 times faster.

New York City–based software developer OriGen is revolutionizing the energy industry with proprietary AI models supported by Microsoft Azure AI infrastructure. Using Azure AI infrastructure, OriGen has fast, easy access to the compute resources required to drive its NVIDIA GPU-based solutions and the means to deploy its powerful offering as a software-as-a-service platform.
Microsoft and NVIDIA’s powerful technology partnership elevates the performance and scale of Azure AI services in a way other cloud providers can’t match — and it’s available to every Azure customer. Developers can leverage the latest AI models from Azure OpenAI Service, NVIDIA NIM, and NVIDIA Foundation Models, all accessible via simple, up-to-the-minute APIs. Come see Azure AI and NVIDIA AI in action and get hands-on with the latest technologies. Microsoft is a proud elite sponsor of NVIDIA GTC 2025, the premier developer conference at the heart of AI.

What’s happening at NVIDIA GTC, March 17-21, San Jose

Whether you’re a developer, researcher, or business leader, the NVIDIA GTC AI Conference is the best opportunity to explore the future of AI, experience Azure and NVIDIA AI solutions, and interact with 25,000 of the brightest minds in AI and accelerated computing. Taking place March 17-21 at the San Jose Convention Center, the conference offers over 900 sessions, 300+ exhibits, unique networking events, and free two-hour workshops and training labs sponsored by Microsoft Azure.

Visit Microsoft at booth #514 and experience the latest in AI services and infrastructure. Join live discussion sessions, connect with Microsoft and partner AI experts, and try out the latest AI technology and hardware. Plus, attend Microsoft talks and panels to learn about Azure’s end-to-end AI platform and how to accelerate the development and delivery of your AI innovations. To see a full list of conference sessions and add them to your calendar, visit the Microsoft Azure blog.

Talks and panels

S71145: Wired for AI: Lessons from Networking 100K+ GPU AI Data Centers and Clouds
S73232: Physical AI for the Next Frontier of Industrial Digitalization
S72355: Harnessing AI Agents for Enterprise Success: Insights from AI Experts
S72436: Building a 3D Image-Based Search System for Medical Images: How Foundational Models Can Help
S71521: Build Secure and Scalable GenAI Applications with Databases and NVIDIA AI
S71676: Accelerating AI Pipelines: How NVIDIA Tools Boost Bing Visual Search Efficiency
S72905: Accelerating DiskANN Vector Search on GPUs
S72435: Explore AI-Assisted Developer Tools for Accelerated Computing Application Development

Registration is open now! Join Microsoft at GTC to discover what’s next in AI and accelerated computing. Visit the Microsoft blog for more details and register today.

Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. For more information, contact [email protected]. source
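For readers curious what "accessible via simple APIs" looks like in practice, here is a minimal, standard-library-only sketch of assembling an Azure OpenAI chat-completions request. The resource endpoint, deployment name, API key, and API version are all placeholders, and the URL shape follows the publicly documented Azure OpenAI REST pattern — treat the details as assumptions to verify against current Microsoft documentation, not as the only or official way to call the service.

```python
import json
import urllib.request

def build_chat_request(endpoint: str, deployment: str, api_key: str,
                       messages: list, api_version: str = "2024-06-01") -> urllib.request.Request:
    """Assemble a POST request for an Azure OpenAI chat completions call.

    Azure routes requests by *deployment name* (chosen when you deploy a
    model in your resource), not by the base model name, and authenticates
    with an `api-key` header.
    """
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={api_version}")
    body = json.dumps({"messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json", "api-key": api_key},
        method="POST",
    )

if __name__ == "__main__":
    req = build_chat_request(
        endpoint="https://my-resource.openai.azure.com",  # placeholder resource
        deployment="my-gpt-deployment",                    # placeholder deployment
        api_key="<API_KEY>",                               # placeholder key
        messages=[{"role": "user", "content": "Say hello."}],
    )
    # urllib.request.urlopen(req) would send it; omitted here because the
    # endpoint and key above are placeholders.
    print(req.full_url)
```

In production, most teams would use the official `openai` SDK with its Azure client instead of hand-building requests; the sketch only makes the request shape concrete.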

Create the future with AI: Join Microsoft at NVIDIA GTC Read More »

Software-Focused PSG Secures $8B Across 2 New Funds

By Jade Martinez-Pogue (February 12, 2025, 2:25 PM EST) — Software-focused growth equity firm PSG, advised by Davis Polk & Wardwell LLP and Weil Gotshal & Manges LLP, on Wednesday revealed that it clinched its two latest funds after securing a combined $8 billion in capital commitments…. source

Software-Focused PSG Secures $8B Across 2 New Funds Read More »