InformationWeek

Next Steps to Secure Open Banking Beyond Regulatory Compliance

The concept of open banking, the ability for customers to share their financial information easily with third parties, is gaining momentum in the United States, though in a piecemeal way. The Consumer Financial Protection Bureau recently finalized rules for financial institutions to offer open banking securely, one of the latest steps to define how banks, credit card issuers, and other financial institutions should proceed in this space. Open banking already has footing in Europe. Meanwhile, countries such as Canada, Japan, and Singapore have yet to formally adopt it, though their policymakers are exploring open banking frameworks.

Though there is no single cohesive regulatory policy in the US yet, securing financial information will be paramount as open banking is made available. How do you balance making financial information available to authorized parties against keeping financial data secure? For this episode of DOS Won’t Hunt, Ben Shorten (upper left in video), Accenture’s finance, risk and compliance lead for banking and capital markets in North America; Adam Preis (lower right), director of product and solution marketing with Ping Identity; and Fernando Luege (upper right), CTO with Fresh Consulting, came together to discuss security hurdles and the way ahead for open banking.

Listen to the full podcast here.


Talking About a Revolution: Making the Unsolvable Solvable in the SOC Infographic

A Look into the Past in Order to Move Forward with a Machine-Led, Human-Empowered Security Platform. Elevate your SOC with automation and AI capabilities designed for the modern threat landscape.

In the last few years, the needs of the security operations center (SOC) have changed, but the designs of the SIEM and SOC have not. The security information and event management (SIEM) category has served security operations for years with significant manual overhead and slow, incremental improvement in security outcomes. Most other key pieces of the security architecture have been modernized: The endpoint moved from antivirus (AV) to endpoint detection and response (EDR) to extended detection and response (XDR); the network moved from a “hard shell” perimeter to Zero Trust and SASE; runtime moved from the data center to the cloud. In contrast, the SOC still operates on a SIEM model designed 20 years ago.

Explore the future of cybersecurity with Cortex XSIAM®, Palo Alto Networks’ AI-driven security operations platform. Discover how this innovative approach leverages AI to enhance, not replace, your security teams. Dive into the infographic to learn more.

Offered free by Palo Alto Networks.


What Military Wargames Can Teach Us About Cybersecurity

Cyberattacks in the first half of 2024 have been relentless, with organizations facing a surge in ransomware and data breaches aimed at theft and extortion. Unlike previous years, 2024 has seen major disruptions across industries, with consumers feeling the burn. Unless you’ve been living under a rock, you already know that today’s ransomware operators are highly sophisticated and target businesses of all sizes across different industries. You’ve likely already deployed technology aimed at protecting against and recovering from a ransomware attack. However, even with these technologies in place, many organizations find themselves unprepared when an actual attack happens.

Wargaming, a strategic military tool, has found its place in the world of cybersecurity through tabletop exercises designed to simulate these high-stakes cyberattacks, such as ransomware. Cyber wargames equip corporate leaders with the skills needed to make swift, informed decisions in the critical first 24-48 hours of a crisis. Beyond backups, these exercises stress-test incident response plans, offering an essential, hands-on approach to disaster recovery. Here’s what you need to know and how to approach it.

What Is a Tabletop Exercise and Why Does It Matter?

A ransomware tabletop exercise is a simulation of a ransomware attack aimed at identifying vulnerabilities in your ransomware protection and recovery plan. Conducting a tabletop exercise is one of the best ways to increase your organization’s cyber resilience and prepare for recovery scenarios you have not yet encountered in the wild. Ransomware tabletop exercises have other benefits, too. For example, a tabletop exercise could identify areas where you are out of compliance with security frameworks and/or demonstrate to regulators that you have taken steps to address these issues. Exercises can also help shape employee training efforts and technology investments. There’s no “right” way to conduct a tabletop exercise.
However, many exercises include some or all of the following:

- A realistic scenario. All tabletop exercises should start with a realistic scenario, designed to challenge both technical and non-technical aspects of the organization’s incident response plan.
- Key stakeholders. Key personnel from IT, cybersecurity, legal, communications, and executive teams should be involved to ensure all critical functions are covered.
- Well-defined responsibilities. Stakeholders should be assigned a specific role that mirrors their real-world responsibilities during an actual ransomware incident (e.g., IT, executives, public relations).
- Ransomware response testing. Technical and non-technical response activities should be tested. This might include IT activities like detection, containment, eradication, and disaster recovery operations. Internal and external communications should be tested as well. We’ll look at testing in more detail below.
- A post-incident report. A review of the gaps, successes, and areas for improvement in the organization’s response strategy is critical. This review should be properly documented, both for future reference and to satisfy any regulatory or compliance requirements.

Ransomware Response Testing: Food for Thought

Obviously, all aspects of your security stack should be considered in your IT testing. Preventing an attack before it happens is the goal, so testing should be designed to identify gaps in access controls, vulnerability management, employee security awareness training, and more. Since attacks have the potential to cause prolonged IT downtime, a tabletop exercise should also reveal how long it could take to restore normal business operations following an attack. The exercise should account for the wide variety of restore scenarios IT might face (e.g., restoring a few desktops vs. a server hosting numerous virtual machines) and the recovery time associated with each.
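The stakeholder and responsibility checklist above can be sketched as a simple pre-exercise planning aid. This is an illustrative sketch only; the function names and the specific roles and critical functions listed here are assumptions, not a prescribed template.

```python
# Minimal tabletop-exercise planning aid: before the exercise starts,
# verify that every critical function has a named owner.
CRITICAL_FUNCTIONS = [
    "detection", "containment", "recovery",
    "legal_notification", "external_comms", "insurance_claim",
]

def unassigned_functions(assignments):
    """Return the critical functions that have no named owner."""
    return [f for f in CRITICAL_FUNCTIONS if not assignments.get(f)]

assignments = {
    "detection": "SOC lead",
    "containment": "IT operations",
    "recovery": "Backup administrator",
    "legal_notification": "General counsel",
    "external_comms": "PR director",
    # "insurance_claim" deliberately left unassigned
}

print(unassigned_functions(assignments))  # → ['insurance_claim']
```

Running a check like this before the scenario kicks off surfaces the gaps (here, nobody owns the insurance claim) that the exercise itself is meant to expose.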
Legal, HR, PR, and executive teams may have important responsibilities during and immediately following a ransomware attack. For example, do customers and/or vendors need to be notified? What about law enforcement? Who is responsible for these communications? Who is responsible for filing a cyber insurance claim? What specifically is required to file a claim?

Tabletop exercises require a good deal of coordination and can be time-consuming. However, they are highly effective and should be considered an essential piece of your security and disaster recovery efforts.

Conclusion

Ransomware tabletop exercises are invaluable for organizations looking to strengthen their defenses against one of the most serious cyber threats today. These exercises help businesses identify vulnerabilities, improve response strategies, and build long-term cyber resilience. By involving leadership, focusing on realistic scenarios, and emphasizing secure recovery methods, ransomware tabletop exercises offer a practical and insightful way to ensure that your organization is prepared to handle a real ransomware attack.


The Impact of the Presidential Election on Networks

America is speeding toward an unusually high-stakes election. The winner in the clash of polar-opposite views will oversee drastically different government policies. Here’s a look at what either a Republican or Democratic outcome is likely to mean in terms of tech directives affecting networks.

First, it’s important to acknowledge that tensions are likely to spill over after the election and potentially change the policies touted by either candidate in pre-election pledges. Public pressure, both for and against any given policy, would likely have some impact on the final version. Meanwhile, the unease across the nation is palpable and nearly universal, while fear and anger creep insidiously into several segments of the population. Those tensions are unlikely to dissipate entirely after the election. Indeed, they are expected to become further entrenched, no matter which candidate wins.

Come election day or sometime thereafter, one or the other of the top two candidates will be declared the victor. There’s been lots of ink, pixels, and TV and video streams dedicated to what either outcome means in terms of the most obvious social issues and abrasive policy contentions. But few have dug deep to see beyond these to other policy differences that can also impact America in important ways. “Telecom infrastructure remains a crucial yet under-discussed policy issue, and Trump and Harris are proposing distinct approaches,” says Richard Brandon, vice president of strategy at RtBrick, a provider of multi-service edge routing software for telcos. Let’s take a look at how networks are likely to be impacted by whichever candidate wins this election.

Overall Candidate Tech Positions

“In sum, I think we can expect divergent approaches to regulations, broadband funding, blockchain innovation, and digital assets based on recent policies and public statements made by each candidate,” says Dr. Tonya M. Evans, Esq., professor of law and digital money expert at Penn State Dickinson Law.

As to the specific points made or alluded to in the candidates’ public statements and other research, Deltek’s key technology findings to apprise government contractors of the two candidates’ positions are as follows:

Kamala Harris:
- Prioritizes winning the global competition in space, AI, quantum computing, and emerging technologies.
- Advocates continued export controls to prevent Chinese companies from acquiring advanced semiconductor and computing technologies.
- Supports enforcement of the Biden AI executive order, requiring cross-agency collaboration, procurement decisions tied to risk and performance management, and investment in AI data centers.
- Promotes investment in Regional Technology and Innovation Hubs and the National Artificial Intelligence Research Resource to integrate AI and machine learning into healthcare technology.
- A strong proponent of telehealth and further cloud computing modernization.

Donald Trump:
- Plans to invest in AI to compete with China while scaling back elements of the Biden AI executive order that he views as restrictive.
- Supports continued export controls similar to Harris but would pull back on regulations concerning data bias.
- Advocates expanding U.S. federal cyber capabilities, including offensive cybersecurity operations and workforce development.
- Likely to continue supporting cybersecurity investments while revisiting antitrust actions targeting large U.S. tech companies.

What these positions will lead to in the way of policy actions remains to be seen. Even so, a recent EY report finds that the outcome of the US election will most impact the following areas of regulation: cybersecurity/data protections, artificial intelligence and machine learning, and user data and content oversight.
The report found that 74% of tech leaders believe the results of the upcoming US election will have a major impact on the US tech sector’s ability to stay ahead of global competitors.

Overall Candidate Positions on Broadband, Networks

“All the presidential candidates have come up with encouraging policies regarding broadband access, but they can be implemented in different ways,” says Chris Dukich, owner of Display Now, a SaaS company that provides digital signage for screens. As Dukich alludes to, the devil is in the details, so here are a few of the key takeaways for each of the leading candidates.

Kamala Harris

Harris’s platform “emphasizes her Opportunity Economy approach, which seeks to address the digital divide and expand broadband infrastructure as a means of economic empowerment for marginalized communities. This mirrors previous Democratic-led initiatives like the Infrastructure Investment and Jobs Act, which allocated $65 billion for broadband expansion under the Biden administration,” says Dr. Evans.

And how will that likely play out in terms of government funding? “The future of this approach will probably be more government grants and public-private partnerships, which will enable moving toward the direction of having high-speed internet access for all the communities within the country,” Dukich said. Other industry players and watchers tend to agree. “A Harris administration would continue investing in cost-efficient network expansion, extending Biden’s initiative of increasing broadband access and bringing digital equity to rural areas through programs like BEAD. Billions of dollars are already earmarked and distributed to states to support the development of next-generation telecom networks,” says Brandon.
As to the impact on regulations, Dukich expects Harris to continue Biden’s “push for more restraints over tech corporations with a strong emphasis on data privacy, rivalry, and consumer welfare.” Merrill agrees and points to the FTC efforts currently underway as continuing under Harris. “Future regulations in a Harris administration will be the same in support of net neutrality and increase oversight against ISPs,” Merrill adds.

But not everyone agrees with the assessment that Harris will follow in Biden’s footsteps. “Harris has signaled a shift from the strict regulatory stance of the Biden administration toward a more innovation-friendly framework. However, her policies are likely to continue emphasizing net neutrality and consumer protections, requiring network managers and architects to adapt to stricter compliance and reporting standards, which could increase opportunities for firms involved in public contracts and broadband development,” says Evans.

Cloud infrastructure and distributed networking will also see significant policy impact. Harris has “indicated a focus on cloud security and resilience, integrating support for cloud computing technologies within her broader economic agenda. By


5 Ways to Overcome Digital Transformation Culture Shock

As organizations strive to meet their goals, integrating digital technology into analytics, artificial intelligence and machine learning, and cloud migration has become essential. The end game is to transform businesses’ operations, share information, and deliver customer value. While digital transformation promises increased efficiency, productivity, and reduced costs, its success fundamentally depends on people. Neglecting the human aspect of transformation is a recipe for failure from the outset.

A BCG study on digital transformation found that 90% of companies focusing on culture during their transformation journey experienced solid financial performance, compared to 17% of those that didn’t. Despite projections that global spending on digital transformation will reach $3.4 trillion by 2026, there’s a high failure rate — around 70%, according to McKinsey. Much of this failure can be attributed to organizational culture shock, where employees react negatively to sudden changes.

In 1955, Sverre Lysgaard developed a model describing how individuals adapt to a new culture, beginning with a honeymoon phase, followed by culture shock, then adjustment, and finally adaptation. This process mirrors what happens to employees during digital transformation. Companies must invest in addressing culture shock to ensure the success of their digital initiatives.

We recently embarked on a significant digital transformation with the introduction of our solution enablement platform. This platform unites various data and analytic assets built for risk management, marketing, and fraud prevention into one unified environment. This transformation enhances our ability to provide a more accurate picture of consumers across various use cases.
From my experience rolling out this platform, I’ve identified five key strategies companies can use to navigate digital transformation successfully and avoid employee culture shock.

1. Foundation setting

It’s essential to communicate your vision and strategy. A well-defined roadmap that outlines the steps to achieve transformation goals is crucial. McKinsey reports that organizations with a clear change management strategy are six times more likely to succeed. Personalizing the vision for each employee ensures they believe in the transformation and actively participate in it.

2. Employee training and education

Training is vital for engaging employees and advancing their careers. Yet only 56% of organizations report expanding training on digital tools and new processes, according to PwC. At our company, we incentivize employees to complete training programs that enhance their skills, which leads to a more engaged workforce. Employees are encouraged to think about the skills they want to develop for their future, ensuring that our digital transformation also benefits their personal career growth. A significant focus of our training has been on our solution enablement platform. We’ve curated specific training for employees, including certifications, across the organization. This approach encourages long-term career development while promoting a deeper understanding of new technologies.

3. Be transparent and share progress

Frequent updates on successes and challenges foster trust and authenticity. Organizations should openly communicate any changes to the roadmap or strategy. At my company, we hold regular meetings where we showcase both the progress and the hurdles we face during our technology evolution. Integration is a crucial theme; we highlight how different teams benefit from the work.

4. Embrace learning and failures

Encouraging a culture that views failure as a learning opportunity fosters innovation.
Open lines of communication allow employees to share issues and contribute to continuous improvement. This helps employees feel secure enough to try new things and become active participants in the transformation. At our company, we conduct regular retrospectives of our planned releases. When things don’t go as expected, we focus on what can be learned, not the failure itself. This feedback loop is shared transparently, providing valuable insights for the entire team and fostering a culture of continuous improvement.

5. Find champions

Too often, change management is reduced to sending out emails or presentations. While these methods are helpful, true transformation requires more personal involvement. Identifying champions within the organization can significantly boost morale and support. These champions don’t need to be formal leaders but are individuals who believe in transformation and help guide their peers through the process. Recently, our enterprise capabilities marketing and investor relations teams met with our engineers to better understand the benefits of our solution enablement platform. They became champions of the transformation and shared their enthusiasm with key stakeholders, which in turn had a positive impact on investors.

Conclusion

Digital transformation offers tremendous potential, but it comes with inherent challenges. To succeed, organizations must place people at the heart of the process through training, transparent communication, and fostering a culture that embraces learning from failures. Companies can mitigate culture shock and achieve their transformation goals by following these five strategies.


ThreatLocker CEO Talks Supply Chain Risk, AI’s Cybersecurity Role, and Fear

It’s no secret that cybersecurity concerns are growing. This past year has seen massive breaches, such as the breach of National Public Data (with 2.7 billion records stolen), and several large breaches of Snowflake customers such as Ticketmaster, Advance Auto Parts, and AT&T. More than 165 companies were impacted by the Snowflake-linked breaches alone, according to a Mandiant investigation. According to Check Point research, global cyberattacks increased by 30% in the second quarter of 2024, to 1,636 weekly attacks per organization. An IBM report says the average cost of a data breach globally rose 10% in 2024, to $4.8 million. So, it’s probably not that surprising that Orlando, Fla.-based cybersecurity firm ThreatLocker has ballooned to 450 employees since its 2017 launch. InformationWeek caught up with ThreatLocker CEO Danny Jenkins at the Gartner IT Symposium/Xpo in Orlando last month. (Editor’s note: The following interview is edited for clarity and brevity.)

Can you give us a little overview of what you were talking about at the event?

What we’re talking about is that when you’re installing software on your computer, that software has access to everything you have access to, and people often don’t realize that if they download a game, and there was a back door in that game, or some vulnerability in that game, it could potentially steal their files, grant someone access to their computer, grab the internet connection, and send data. So, what we were really talking about was supply chain risk. The biggest thing is vulnerabilities: the things a vendor didn’t intend to do, but that accidentally granted someone access to your data. You can really enhance your security through sensible controls and limiting access to those applications rather than trying to find every bad thing in the world.

AI has been the major recurring theme throughout the symposium.
Can you talk a little about the way we approach these threats and how that is going to change as more businesses adopt emerging technologies like GenAI?

What’s interesting is that we’re actually doing a session on how to create successful malware, and we’re going to talk about how we’re able to use AI to create undetectable malware versus the old way. If you think about AI two years ago, if you wanted to create malware, there were a limited number of people in the world who could do that — you’d have to be a developer, you’d have to have some experience, you’d have to be smart enough to avoid protections. That pool of people was quite small. Today, you can just ask ChatGPT to create a program to do whatever you want, and it will spit out the code instantly. The number of people who have the ability to create malware has now drastically increased … the way to defend against that is to change the way you think about security. The way most companies think about security now is they’re looking for threats in their environment — but that’s not effective. The better way of approaching security is really to say, “I’m just going to block what I don’t need, and I don’t care if it’s good and I don’t care if it’s bad. If it’s not needed in my business, I’m going to block it from happening.”

As someone working in security, is the pace of AI adoption in the enterprise a concern?

I think the concern is the pace and the fear. AI has been around for a long time. What we’ve seen in the last two years is generative AI, and that’s what’s scaring people. If you think about self-driving cars, you think about the ability of machine learning to see data and manipulate and learn from that data. What’s scary is that the consumer is now seeing AI that produces, whereas before it was always stuff in the background that you never really thought about.
You never really thought about how your car is able to determine if something’s a trash can or if it’s a person. Now this thing can draw pictures, write documents better than I do, and create code. Am I worried about AI taking over the world from that perspective? No. But I am concerned about the tool set that we’ve now given people who may not be ethical. Before, if you were smart enough to write successful malware, at least in the Western Hemisphere, you were smart enough to get a job, and you weren’t going to risk going to jail. The people who were creating successful malware before, or successful cyberattacks, were people in countries where there were not opportunities, like Russia. Now, you don’t need to be smart enough to create successful cyberattacks, and that’s what concerns me. If you give someone who doesn’t have the capacity to earn a living access to tools that can allow them to steal data, the path they are going to follow is cybercrime. Just like other crime, when the economy is down and people don’t have jobs, people steal and crime goes up. Cybercrime before was limited to people who had an understanding of technology. Now, the whole world will have access, and that’s what scares me — and GenAI has facilitated that.

How do you see your business changing in the next 5-10 years because of AI adoption?

Ultimately, it changes the way people think about security, to where they have to start adopting more zero-trust approaches and more restrictive controls in their environment. That’s how it has to go — there is no alternative. Before, there was a 10% chance you were going to get damaged by an attack; now it’s an 80% chance.

If you’re the CIO of an enterprise, how should you be looking at building out these new technologies and building on these new
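The default-deny idea Jenkins describes, block everything that isn’t explicitly needed rather than hunting for known-bad software, can be sketched in a few lines. This is an illustrative sketch only, not ThreatLocker’s actual product logic; the application names and function are assumptions for the example.

```python
# Sketch of a default-deny application policy: anything not on the
# allowlist is blocked, whether or not it is known to be malicious.
ALLOWED_APPS = {"winword.exe", "excel.exe", "sap_client.exe"}

def is_permitted(executable: str) -> bool:
    """Default-deny: permit only what the business explicitly needs."""
    return executable.lower() in ALLOWED_APPS

print(is_permitted("excel.exe"))     # → True
print(is_permitted("fun_game.exe"))  # → False: unknown, so blocked
```

The design point is that the policy never has to recognize the game as malicious; not being on the needed-for-business list is reason enough to block it.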


How Quantum Machine Learning Works

As quantum computing continues to advance, so too are the algorithms used for quantum machine learning, or QML. Over the past few years, practitioners have been using variational noisy intermediate-scale quantum (NISQ) algorithms designed to compensate for noisy computing environments.

“There’s a lot of machine learning algorithms in that vein that run in that kind of way. You treat your quantum program as if it was a neural network,” says Joe Fitzsimons, founder and CEO of Horizon Quantum Computing, a company building quantum software development tools. “You write a program that has a lot of parameters in it that you don’t set beforehand, and then you try to tune those parameters. People call these ‘quantum neural networks.’ You also have variational classifiers and things like that that fall into that category.”

One can also take an existing classical machine learning model and try to accelerate its computation using a quantum computer. Noise is a challenge, however, so error correction is necessary. Another requirement is quantum random access memory (QRAM, the quantum equivalent of RAM).

“If we can get lower-noise quantum computers, if we can start building the QRAM, then there’s really enormous potential for quantum computers to accelerate a classical model or a quantum-native model,” says Fitzsimons. “You can play with the variational algorithms today, absolutely, but achieving the more structured algorithms and getting to error-corrected quantum random access memory is five years and several Nvidia hardware generations away.”

QML Needs to Mature

While quantum computing is not the most imminent trend data scientists need to worry about today, its effect on machine learning is likely to be transformative. “The really obvious advantage of quantum computing is the ability to deal with really enormous amounts of data that we can’t really deal with any other way,” says Fitzsimons.
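The “treat your quantum program as a neural network” pattern Fitzsimons describes can be illustrated with a toy variational loop. As an assumption for illustration, the one-qubit circuit is simulated classically with NumPy rather than run on quantum hardware or a quantum SDK: a single rotation angle is the free parameter, a measured expectation value is the loss, and ordinary gradient descent tunes it.

```python
import numpy as np

def ry(theta):
    """Single-qubit Y-rotation gate: the circuit's trainable layer."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def expectation_z(theta):
    """Apply the parameterized circuit to |0> and measure <Z>."""
    state = ry(theta) @ np.array([1.0, 0.0])
    return state[0] ** 2 - state[1] ** 2  # analytically equals cos(theta)

# Variational loop: tune the parameter to minimize <Z>, the same
# train-by-gradient pattern used for classical neural networks.
theta, lr = 0.1, 0.5
for _ in range(200):
    # finite-difference gradient of the measured expectation value
    grad = (expectation_z(theta + 1e-5) - expectation_z(theta - 1e-5)) / 2e-5
    theta -= lr * grad

print(round(expectation_z(theta), 3))  # → -1.0 (optimum at theta = pi)
```

Real variational classifiers and quantum neural networks use many qubits and many parameters, but the structure is the same: a parameterized circuit, a measured objective, and a classical optimizer in the loop.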
“We’ve seen the power of conventional computers has doubled effectively every 18 months with Moore’s Law. With quantum computing, the number of qubits is doubling about every eight to nine months. Every time you add a single qubit to a system, you double its computational capacity for machine learning problems and things like this, so the computational capacity of these systems is growing double exponentially.”

Quantum machines will allow organizations to model and understand complex systems in a computational way, and the potential use cases are many, ranging from automotive and aerospace to energy, life sciences, insurance, and financial services, to name a few. As the number of qubits rises, quantum computers can handle increasingly complex models.

“With classical machine learning, you take your model and you test it against real-world data, and that’s what you benchmark off,” says Fitzsimons. “Quantum computing is only starting to get towards that. It’s not really there yet, and that’s what’s needed for quantum machine learning to really take off, to really become a viable technology. We need to [benchmark] in the same way that the classical community has done, and not just single shots on very small data sets. A lot of quantum computing is reinventing what has already been done in the classical world. Machine learning in the quantum world has a long way to go before we really know what its limits and capabilities are.”

What’s Happening With Hybrid ML?

Classical ML isn’t practical for everything, and neither is QML.
Classical ML is based on classical AI models and GPUs, while quantum machine learning (QML) uses entirely different algorithms and hardware that take advantage of properties like superposition and entanglement to boost efficiency exponentially, says Román Orús, Ikerbasque research professor at DIPC and chief scientific officer of quantum AI company Multiverse Computing.

“Classical systems represent data as binary bits: 0 or 1. With QML, data is represented in quantum states. Quantum computers can also produce atypical patterns that classical systems can’t produce efficiently, a key task in machine learning,” says Orús.

Classical ML techniques can be used to optimize quantum circuits, improve error-correcting codes, analyze the properties of quantum systems, and design new quantum algorithms. Classical ML methods are also used to preprocess and analyze data that will be used in quantum experiments or simulations. In hybrid experiments, today’s NISQ devices work on the parts of the problem most suited to the strengths of quantum computing while classical ML handles the remaining parts.

Quantum-inspired software techniques can also be used to improve classical ML, such as tensor networks that can describe machine learning structures and improve computational bottlenecks to increase the efficiency of LLMs like ChatGPT.

“It’s a different paradigm, entirely based on the rules of quantum mechanics. It’s a new way of processing information, and new operations are allowed that contradict common intuition from traditional data science,” says Orús. “Because of the efficient way quantum systems handle information processing, they are also capable of manipulating complex data to represent complex data structures and their correlations.
This could improve generative AI by reducing energy and compute costs as well as increasing the speed of the drug discovery process and other data-intensive research. QML also could be used to develop new types of neural networks that use quantum properties to significantly improve inference, explainability, and training efficiency.” There’s a lot of innovation happening at many levels to solve the various pieces of all things quantum, including system design, environmental optimization, and new hardware and software. “In addition to developing better quantum hardware to run QML, people are also exploring how to implement hybrid systems that combine generative AI modules, such as transformers, with quantum capabilities,” says Orús. Like classical ML, QML isn’t a single thing. “As with other aspects of quantum computing, there are different versions of quantum machine learning. These days, what most people mean by quantum machine learning is otherwise known as a

How Quantum Machine Learning Works Read More »

Is the CHIPS Act in Jeopardy? What the US Election Could Mean for Semiconductor Industry

Today’s election, which pollsters say is neck and neck between the Republican candidate, former US President Donald J. Trump, and the Democratic candidate, US Vice President Kamala Harris, could determine the future of the $52.7 billion CHIPS and Science Act. The CHIPS Act, signed into law two years ago, is already doling out some of the $39 billion aimed at semiconductor manufacturing, with another $13.2 billion earmarked for R&D and workforce development. The Biden Administration has touted the effort as one of its major accomplishments. Trump recently took to the Joe Rogan podcast to declare: “That chip deal is so bad.” Trump says the US should instead impose tariffs that he says would force more chips to be produced in the US. Others say tariffs, which are charged to the importing company and not the exporting country, would have the opposite effect. House Speaker Mike Johnson, in remarks that he recently walked back, suggested that the GOP would “probably” try to repeal the legislation. He later said that he misunderstood the question after pushback from GOP Rep. Brandon Williams, a New York congressman locked in a tough race with Democratic candidate state Sen. John Mannion. Johnson told reporters that a repeal is not in the works, but “there could be legislation to further streamline and improve the primary purpose of the bill — to eliminate its costly regulations and Green New Deal requirements.”

Billions at Stake

According to the US Commerce Department, the CHIPS Act is expected to boost US chip manufacturing from zero to 30% of the world’s leading-edge chip supply by 2032. Chip companies like Intel, Micron, Samsung, and TSMC have announced massive US manufacturing upgrades and new construction. Last year, the Commerce Department said more than 460 companies had signaled interest in winning subsidies through the bill.
The US has chosen 31 “underdog tech hubs” as potential hotspots that would funnel CHIPS funding into areas outside of traditional tech corridors. Earlier this week, the Albany NanoTech Complex was selected as the first CHIPS Act R&D flagship facility, winning $825 million in subsidies to fund a new Extreme Ultraviolet (EUV) Accelerator. US Sen. Mark Kelly (D-Ariz.) was a key sponsor of the CHIPS Act. Since 2020, Arizona has netted more than 40 semiconductor deals, with $102 billion in capital investment and the potential for 15,700 jobs. TSMC’s investment in Arizona stands at more than $65 billion. Intel is investing more than $32 billion in two new Arizona foundries (chip factories) and the modernization of an existing fab. Republicans are staking their political fortunes on the CHIPS Act as well. Sen. John Cornyn (R-Texas) also co-authored the bill, and Sen. Marco Rubio (R-Fla.) and Sen. Tom Cotton (R-Ark.) have been vocal about China-US competition. The CHIPS Act could shore up a domestic supply chain that gives North America a real advantage in the chip wars. While the CHIPS Act itself is not on any ballot measures this election cycle, economic policies that impact power consumption and other key tech issues may affect the industry as well. In Arkansas, one ballot measure concerning lottery funds could help create more skilled tech workers, for instance. In Maine, a ballot measure proposes issuing $25 million in bonds to fund research for IT industries. Bob O’Donnell, president and chief analyst at TECHnalysis Research, says the future of US semiconductor manufacturing should not be a partisan issue. “It’s clear to me that the CHIPS Act is incredibly important and hopefully it will cross party lines,” he says in a phone interview with InformationWeek. “There’s no doubt there will be demand down the road. And there’s no question that the geographical diversity of semiconductor manufacturing is way out of whack.
This is a US necessity.”

A Question of Workforce Readiness and R&D

John Dallesasse, a professor of electrical and computer engineering at the University of Illinois Grainger College of Engineering, says funding from the CHIPS Act will be crucial to workforce and educational needs. “It would be unfortunate if the US government were to backpedal on the investments in semiconductor technology enabled by the CHIPS and Science Act,” he tells InformationWeek in an email interview. “While the [act] provides incentives for manufacturing, there’s also a significant emphasis on new technology R&D and workforce development — both of which will be needed to restore US competitiveness in semiconductors.” He adds, “Without the combination of new technology development and incentives to bring manufacturing back to the US, we will continue on the downward spiral which has brought us from a dominant force in semiconductor manufacturing to a country which only makes 12% of the world’s chips.” source

Is the CHIPS Act in Jeopardy? What the US Election Could Mean for Semiconductor Industry Read More »

6 Strategies for Maximizing Cloud Storage ROI

Enterprise IT leaders face a daunting challenge: delivering innovative solutions through new applications, data services, and AI investments while adhering to tight budgets. Cloud computing, often at the heart of these initiatives, presents a particularly uncertain landscape, especially regarding storage costs, which can significantly impact IT budgets. Rising cloud data storage expenses have prompted many organizations to reconsider their strategies, leading to a trend of repatriation as enterprises seek more control during unpredictable economic times. A February 2024 Citrix poll revealed that 94% of organizations had shifted some workloads back to on-premises systems, driven by concerns over security, performance, costs, and compatibility. In response, senior business and finance leaders might consider a swift transition back from the cloud to curb expenses. However, cloud repatriation carries its own set of risks, including potential egress fees, the need for new hardware, security investments, and other infrastructure costs. Additionally, companies may face the challenge of re-hiring staff previously laid off. Furthermore, there is a significant opportunity cost in missing out on enhanced collaboration, innovation, agility, and access to advanced cloud-native tools and services, including AI and machine learning.

Optimize Your Cloud Strategy Before You Repatriate

Deloitte analyzed anonymized data from several FinOps engagements to assess optimization efforts, finding that businesses can save up to 45% (15% on average) on cloud costs by optimizing across the waste management, consumption management, and purchasing best-practices levers. Common tactics such as re-architecting applications, managing cloud sprawl, and monitoring spend using the tools each cloud provides are a great first step. However, these methods are not the full picture. Storage optimization is an integral piece.
Focusing on cloud storage costs first is a smart strategy, since storage constitutes a large chunk of overall spend. More than half of IT organizations (55%) will spend more than 30% of their IT budget on data storage and backup technology, according to our recent State of Unstructured Data Management report. The reality is that most organizations don’t have a clear picture of current and predicted storage costs. They do not know how to economize, how much data they have, or where it resides. By gaining a thorough understanding of data and its needs, IT can place high-priority data on top-performing storage while moving older, less important data to cheaper storage. If you don’t efficiently manage data over its lifecycle, both on-premises and cloud storage will be expensive.

Six Ways to Cut Storage Costs and Optimize Cloud Investments

1. Get holistic visibility into data to make the best cloud decisions. Understanding the characteristics of enterprise data (which is primarily file or object data not sitting in a database) is critical to optimizing cloud investments and right-placing data. Top metrics include top data owners, most common file type, most common file size, total data, data growth rate, and data by time of last access, which distinguishes active (hot) data from inactive (cold) data. Metadata searches can also highlight files containing PII, IP, or other sensitive data that have unique storage and security requirements.

2. Calculate current storage costs across all storage technologies in your data centers and/or the cloud. Since most organizations take a hybrid cloud approach, you need to calculate the cost of on-premises storage and backups as well as cloud storage. Doing this across various accounts, buckets, and storage silos can be time-consuming and laborious, so look for automated ways to deliver these costs, such as through a data management solution.

3. Predict future storage costs based on data growth rates. Unstructured data typically grows at 20% or more each year, so when looking at how much you can save, consider both current costs and future projections. An ongoing data management strategy is needed to keep saving costs as data continues to pile up.

4. Include backup and disaster recovery costs in your analysis. Even in the cloud, most organizations create additional data copies for backups, snapshots, and multi-site redundancy. Be sure to include these costs to get a full understanding of your true costs and potential savings.

5. Model new storage plans for savings opportunities. Your data management analysis should detail how much you can save by leveraging the various cloud storage tiers and right-placing cold data on the appropriate lower tier. In most clouds, the cheaper storage tiers are often 20x less expensive than the performance tiers.

6. Create an ongoing data lifecycle management plan. Rather than moving data to the cloud in a “set and forget” fashion, long-term savings require continuous refinement to accommodate data as it ages or as other conditions materialize, such as the need to move data under compliance rules to secure archival storage. With more than 12 classes of storage on some of the popular clouds, you’ll want to leverage them all at the right time. Don’t keep data in top-tier file storage once it is no longer in active use, such as at the completion of an analytics project or marketing campaign. Ensure that users can access tiered data from the lower storage tier without having to bring it back to a more expensive tier, so that you don’t lose the savings.

A Final Word on Cloud Storage

As organizations look to reduce cloud waste this year, attaining a data-centric perspective has a multitude of benefits. Analysis can indicate data growth rates, hot versus cold data, compliant data, and more, so that IT can make the best decisions balancing data requirements, business needs, and budget.
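As a rough illustration of the cost calculation, growth projection, and tier modeling described above, here is a minimal sketch. Every number in it is a hypothetical placeholder (per-TB prices, the growth rate, and the cold-data fraction vary widely by provider and workload), not a figure from any actual cloud price list:

```python
# Hypothetical sketch: project storage spend with annual data growth and
# estimate the savings from right-placing cold data on a cheaper tier.

def project_costs(tb_now, growth_rate, price_hot, price_cold,
                  cold_fraction, years):
    """Return per-year (all-hot cost, tiered cost) pairs in dollars,
    assuming `growth_rate` annual data growth and that `cold_fraction`
    of the data can live on the cheaper tier. Prices are per TB-month."""
    results = []
    tb = tb_now
    for _ in range(years):
        all_hot = tb * price_hot * 12                      # monthly price x 12
        tiered = (tb * (1 - cold_fraction) * price_hot
                  + tb * cold_fraction * price_cold) * 12  # hot + cold split
        results.append((round(all_hot, 2), round(tiered, 2)))
        tb *= 1 + growth_rate                              # data keeps growing
    return results

# 500 TB today, 20% yearly growth, a $23/TB-month performance tier versus
# a tier ~20x cheaper, with 70% of the data cold enough to tier down.
for year, (hot, tiered) in enumerate(
        project_costs(500, 0.20, 23.0, 23.0 / 20, 0.70, 3), start=1):
    print(f"Year {year}: all-hot ${hot:,.0f} vs tiered ${tiered:,.0f}")
```

Even this toy model shows the point of the exercise: with a 20x price gap between tiers, moving the cold majority of data down cuts the bill by well over half, and the absolute savings compound as the data grows.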
This way, you can continue to embrace the cloud for digital business initiatives without setting off alarm bells in the CFO’s office. source

6 Strategies for Maximizing Cloud Storage ROI Read More »