InformationWeek

ThreatLocker CEO Talks Supply Chain Risk, AI’s Cybersecurity Role, and Fear

It’s no secret that cybersecurity concerns are growing. This past year has seen massive breaches, such as the breach of National Public Data (with 2.7 billion records stolen) and several large breaches of Snowflake customers, including Ticketmaster, Advance Auto Parts, and AT&T. More than 165 companies were impacted by the Snowflake-linked breaches alone, according to a Mandiant investigation. According to Check Point Research, global cyberattacks increased by 30% in the second quarter of 2024, to 1,636 weekly attacks per organization. An IBM report says the average cost of a data breach globally rose 10% in 2024, to $4.8 million. So it’s probably not surprising that Orlando, Fla.-based cybersecurity firm ThreatLocker has ballooned to 450 employees since its 2017 launch. InformationWeek caught up with ThreatLocker CEO Danny Jenkins at the Gartner IT Symposium/Xpo in Orlando last month. (Editor’s note: The following interview is edited for clarity and brevity.)

Can you give us a little overview of what you were talking about at the event?

What we’re talking about is that when you’re installing software on your computer, that software has access to everything you have access to. People often don’t realize that if they download a game, and there was a back door in that game, or some vulnerability in it, it could potentially steal their files, grant someone access to their computer, or connect to the internet and send out data. So what we were really talking about was supply chain risk. The biggest thing is vulnerabilities: the things a vendor didn’t intend to do, but that accidentally granted someone access to your data. You can really enhance your security through sensible controls and limiting access to those applications rather than trying to find every bad thing in the world.

AI has been the major recurring theme throughout the symposium.
Can you talk a little about the way we approach these threats and how that is going to change as more businesses adopt emerging technologies like GenAI?

What’s interesting is that we’re actually doing a session on how to create successful malware, and we’re going to talk about how we’re able to use AI to create undetectable malware versus the old way. If you think about AI two years ago, if you wanted to create malware, there was a limited number of people in the world who could do that: you’d have to be a developer, you’d have to have some experience, you’d have to be smart enough to avoid protections. That pool of people was quite small. Today, you can just ask ChatGPT to create a program to do whatever you want, and it will spit out the code instantly. The number of people with the ability to create malware has now drastically increased … the way to defend against that is to change the way you think about security. The way most companies think about security now is that they’re looking for threats in their environment, but that’s not effective. The better way of approaching security is really to say, “I’m just going to block what I don’t need, and I don’t care if it’s good and I don’t care if it’s bad. If it’s not needed in my business, I’m going to block it from happening.”

As someone working in security, is the pace of AI adoption in the enterprise a concern?

I think the concern is the pace and the fear. AI has been around for a long time. What we’ve seen in the last two years is generative AI, and that’s what’s scaring people. If you think about self-driving cars, you think about the ability of machine learning to see data, manipulate it, and learn from it. What’s scary is that the consumer is now seeing AI that produces; before, it was always stuff in the background that you never really thought about.
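The default-deny posture Jenkins describes can be sketched in a few lines. This is an illustrative model of the policy only, not how ThreatLocker's product works, and the application names are hypothetical:

```python
# Default-deny application control: anything not explicitly approved is
# blocked, whether or not it is "known bad". Contrast with the traditional
# approach of hunting for known threats and allowing everything else.
ALLOWLIST = {"winword.exe", "excel.exe", "chrome.exe"}  # hypothetical approved apps

def should_run(executable_name: str) -> bool:
    """Permit execution only if the application is explicitly on the allowlist."""
    return executable_name.lower() in ALLOWLIST

print(should_run("chrome.exe"))            # approved: on the list
print(should_run("free_game_mod.exe"))     # blocked: never evaluated for "badness"
```

The point of the sketch is the inversion: the unknown game is blocked not because anything flagged it as malicious, but because nothing approved it.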
You never really thought about how your car is able to determine whether something is a trash can or a person. Now this thing can draw pictures, write documents better than I do, and create code. Am I worried about AI taking over the world from that perspective? No. But I am concerned about the tool set that we’ve now given people who may not be ethical.

Before, if you were smart enough to write successful malware, at least in the Western Hemisphere, you were smart enough to get a job, and you were not going to risk going to jail. The people who were creating successful malware, or successful cyberattacks, were people in countries where there were not opportunities, like Russia. Now, you don’t need to be smart to create successful cyberattacks, and that’s what concerns me. If you give someone who doesn’t have the capacity to earn a living access to tools that can allow them to steal data, the path they are going to follow is cybercrime. Just like other crime, when the economy is down and people don’t have jobs, people steal and crime goes up. Cybercrime before was limited to people who had an understanding of technology. Now the whole world will have access, and that’s what scares me; GenAI has facilitated that.

How do you see your business changing in the next 5-10 years because of AI adoption?

Ultimately, it changes the way people think about security: they have to start adopting more zero-trust approaches and more restrictive controls in their environment. That’s how it has to go; there is no alternative. Before, there was a 10% chance you were going to get damaged by an attack; now it’s an 80% chance.

If you’re the CIO of an enterprise, how should you be looking at building out these new technologies and building on these new


How Quantum Machine Learning Works

As quantum computing continues to advance, so too do the algorithms used for quantum machine learning, or QML. Over the past few years, practitioners have been using variational noisy intermediate-scale quantum (NISQ) algorithms designed to compensate for noisy computing environments.

“There’s a lot of machine learning algorithms in that vein that run in that kind of way. You treat your quantum program as if it was a neural network,” says Joe Fitzsimons, founder and CEO of Horizon Quantum Computing, a company building quantum software development tools. “You write a program that has a lot of parameters in it that you don’t set beforehand, and then you try to tune those parameters. People call these ‘quantum neural networks.’ You also have variational classifiers and things like that that fall into that category.”

One can also take an existing classical machine learning model and try to accelerate its computation using a quantum computer. Noise is a challenge, however, so error correction is necessary. Another requirement is quantum random access memory (QRAM), the quantum equivalent of RAM.

“If we can get lower noise quantum computers, if we can start building the RAM, then there’s really enormous potential for quantum computers to accelerate a classical model or a quantum native model,” says Fitzsimons. “You can play with the variational algorithms today, absolutely, but achieving the more structured algorithms and getting to error-corrected quantum random access memory is five years and several Nvidia hardware generations away.”

QML Needs to Mature

While quantum computing is not the most imminent trend data scientists need to worry about today, its effect on machine learning is likely to be transformative.

“The really obvious advantage of quantum computing is the ability to deal with really enormous amounts of data that we can’t really deal with any other way,” says Fitzsimons.
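The variational workflow Fitzsimons describes, a parameterized "circuit" whose free parameters are tuned against training data, can be sketched with a classically simulated one-qubit model. Everything here (the cosine encoding, the toy dataset, the grid search standing in for a real optimizer) is an illustrative assumption, not an actual quantum stack:

```python
import numpy as np

# Toy "variational quantum classifier", simulated classically: encode input
# x as a Y-rotation, apply a trainable Y-rotation theta, and read out the
# Pauli-Z expectation value, which for this one-qubit circuit is cos(x + theta).
def expectation_z(x, theta):
    return np.cos(x + theta)

# Tiny dataset: inputs near 0 are class +1, inputs near pi are class -1.
xs = np.array([0.1, 0.2, 3.0, 3.1])
ys = np.array([1, 1, -1, -1])

def loss(theta):
    # Mean squared error between measured expectations and target labels.
    return np.mean((expectation_z(xs, theta) - ys) ** 2)

# "Tune the parameters you didn't set beforehand": a crude grid search
# standing in for the gradient-based optimizers used in practice.
thetas = np.linspace(-np.pi, np.pi, 361)
best = thetas[np.argmin([loss(t) for t in thetas])]

preds = np.sign(expectation_z(xs, best))
print(best, preds)  # tuned parameter and the resulting class predictions
```

Real variational classifiers do the same loop with a multi-qubit parameterized circuit evaluated on quantum hardware or a simulator; only the expectation-value function changes.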
“We’ve seen the power of conventional computers double effectively every 18 months with Moore’s Law. With quantum computing, the number of qubits is doubling about every eight to nine months. Every time you add a single qubit to a system, you double its computational capacity for machine learning problems and things like this, so the computational capacity of these systems is growing double exponentially.”

Quantum machines will allow organizations to model and understand complex systems in a computational way, and the potential use cases are many, ranging from automotive and aerospace to energy, life sciences, insurance, and financial services, to name a few. As the number of qubits rises, quantum computers can handle increasingly complex models.

Joe Fitzsimons, Horizon Quantum Computing

“With classical machine learning, you take your model and you test it against real-world data, and that’s what you benchmark off,” says Fitzsimons. “Quantum computing is only starting to get towards that. It’s not really there yet, and that’s what’s needed for quantum machine learning to really take off, you know, to really become a viable technology. We need to [benchmark] in the same way that the classical community has done, and not just single shots on very small data sets. A lot of quantum computing is reinventing what has already been done in the classical world. Machine learning in the quantum world has a long way to go before we really know what its limits and capabilities are.”

What’s Happening With Hybrid ML?

Classical ML isn’t practical for everything, and neither is QML.
Classical ML is based on classical AI models and GPUs, while quantum machine learning (QML) uses entirely different algorithms and hardware that take advantage of properties like superposition and entanglement to boost efficiency exponentially, says Román Orús, Ikerbasque research professor at DIPC and chief scientific officer of quantum AI company Multiverse Computing.

“Classical systems represent data as binary bits: 0 or 1. With QML, data is represented in quantum states. Quantum computers can also produce atypical patterns that classical systems can’t produce efficiently, a key task in machine learning,” says Orús.

Classical ML techniques can be used to optimize quantum circuits, improve error-correcting codes, analyze the properties of quantum systems, and design new quantum algorithms. Classical ML methods are also used to preprocess and analyze data that will be used in quantum experiments or simulations. In hybrid experiments, today’s NISQ devices work on the parts of the problem most suited to the strengths of quantum computing, while classical ML handles the remaining parts.

Quantum-inspired software techniques can also be used to improve classical ML, such as tensor networks that can describe machine learning structures and ease computational bottlenecks to increase the efficiency of LLMs like ChatGPT.

“It’s a different paradigm, entirely based on the rules of quantum mechanics. It’s a new way of processing information, and new operations are allowed that contradict common intuition from traditional data science,” says Orús. “Because of the efficient way quantum systems handle information processing, they are also capable of manipulating complex data to represent complex data structures and their correlations.
This could improve generative AI by reducing energy and compute costs, as well as increasing the speed of the drug discovery process and other data-intensive research. QML could also be used to develop new types of neural networks that use quantum properties to significantly improve inference, explainability, and training efficiency.”

There’s a lot of innovation happening at various levels to solve various pieces of all things quantum, including system design, environmental optimization, and new hardware and software.

Román Orús, Multiverse Computing

“In addition to developing better quantum hardware to run QML, people are also exploring how to implement hybrid systems that combine generative AI modules, such as transformers, with quantum capabilities,” says Orús.

Like classical ML, QML isn’t a single thing.

“As with other aspects of quantum computing, there are different versions of quantum machine learning. These days, what most people mean by quantum machine learning is otherwise known as a


Is the CHIPS Act in Jeopardy? What the US Election Could Mean for the Semiconductor Industry

Today’s election, which pollsters say is neck and neck in the presidential race between Republican candidate and former US President Donald J. Trump and Democratic candidate, US Vice President Kamala Harris, could determine the future of the $52.7 billion CHIPS and Science Act. The CHIPS Act, signed into law two years ago, is already doling out some of the $39 billion aimed at semiconductor manufacturing, with another $13.2 billion earmarked for R&D and workforce development. The Biden administration has touted the effort as one of its major accomplishments.

Trump recently took to the Joe Rogan podcast to declare: “That chip deal is so bad.” Trump says the US should instead impose tariffs he claims would force more chips to be produced in the US. Others say tariffs, which are charged to the importing company and not the exporting country, would have the opposite effect. House Speaker Mike Johnson, in remarks that he recently walked back, suggested that the GOP would “probably” try to repeal the legislation. He later said that he misunderstood the question after pushback from GOP Rep. Brandon Williams, a New York congressman locked in a tough race with Democratic candidate state Sen. John Mannion. Johnson told reporters that a repeal is not in the works, but “there could be legislation to further streamline and improve the primary purpose of the bill — to eliminate its costly regulations and Green New Deal requirements.”

Billions at Stake

According to the US Commerce Department, the CHIPS Act is expected to boost US chip manufacturing from zero to 30% of the world’s leading-edge chip supply by 2032. Chip companies including Intel, Micron, Samsung, and TSMC have announced massive US manufacturing upgrades and new construction. Last year, the Commerce Department said more than 460 companies had signaled interest in winning subsidies through the bill.
The US has chosen 31 “underdog tech hubs” as potential hotspots that would funnel CHIPS funding into areas outside of traditional tech corridors. Earlier this week, the Albany NanoTech Complex was selected as the first CHIPS Act R&D flagship facility, winning $825 million in subsidies to fund a new Extreme Ultraviolet (EUV) Accelerator.

US Sen. Mark Kelly (D-Ariz.) was a key sponsor of the CHIPS Act. Since 2020, Arizona has netted more than 40 semiconductor deals, with $102 billion in capital investment and the potential for 15,700 jobs. TSMC’s investment in Arizona stands at more than $65 billion. Intel is investing more than $32 billion in two new Arizona foundries (chip factories) and in modernizing an existing fab.

Republicans are staking their political fortunes on the CHIPS Act as well. Sen. John Cornyn (R-Texas) co-authored the bill, and Sen. Marco Rubio (R-Fla.) and Sen. Tom Cotton (R-Ark.) have been vocal about China-US competition. The CHIPS Act could shore up a domestic supply chain that gives North America a real advantage in the chip wars.

While the CHIPS Act itself is not on any ballot measures this election cycle, economic policies that affect power consumption and other key tech issues may affect the industry as well. In Arkansas, one ballot measure concerning lottery funds could help create more skilled tech workers, for instance. In Maine, a ballot measure proposes issuing $25 million in bonds to fund research for IT industries.

Bob O’Donnell, president and chief analyst at TECHnalysis Research, says the future of US semiconductor manufacturing should not be a partisan issue. “It’s clear to me that the CHIPS Act is incredibly important, and hopefully it will cross party lines,” he says in a phone interview with InformationWeek. “There’s no doubt there will be demand down the road. And there’s no question that the geographical diversity of semiconductor manufacturing is way out of whack.
This is a US necessity.”

A Question of Workforce Readiness and R&D

John Dallesasse, a professor of electrical and computer engineering at the University of Illinois Grainger College of Engineering, says funding from the CHIPS Act will be crucial to workforce and educational needs. “It would be unfortunate if the US government were to backpedal on the investments in semiconductor technology enabled by the CHIPS and Science Act,” he tells InformationWeek in an e-mail interview. “While the [act] provides incentives for manufacturing, there’s also a significant emphasis on new technology R&D and workforce development — both of which will be needed to restore US competitiveness in semiconductors.”

He adds, “Without the combination of new technology development and incentives to bring manufacturing back to the US, we will continue on the downward spiral which has brought us from a dominant force in semiconductor manufacturing to a country which only makes 12% of the world’s chips.”


6 Strategies for Maximizing Cloud Storage ROI

Enterprise IT leaders face a daunting challenge: delivering innovative solutions through new applications, data services, and AI investments while adhering to tight budgets. Cloud computing, often at the heart of these initiatives, presents a particularly uncertain landscape, especially regarding storage costs, which can significantly impact IT budgets. Rising expenses in cloud data storage have prompted many organizations to reconsider their strategies, leading to a trend of repatriation as enterprises seek more control during these unpredictable economic times. A February 2024 Citrix poll revealed that 94% of organizations had shifted some workloads back to on-premises systems, driven by concerns over security, performance, costs, and compatibility.

In response, senior business and finance leaders might consider a swift transition back from the cloud to curb expenses. However, cloud repatriation carries its own set of risks, including potential egress fees, the need for new hardware, security investments, and other infrastructure costs. Additionally, companies may face the challenge of re-hiring staff previously laid off. Furthermore, there’s a significant opportunity cost associated with missing out on enhanced collaboration, innovation, agility, and access to advanced cloud-native tools and services, including AI and machine learning.

Optimize Your Cloud Strategy Before You Repatriate

Deloitte analyzed anonymized data from several FinOps engagements to assess optimization efforts, finding that businesses can save up to 45% (15% on average) on cloud costs by optimizing across waste management, consumption management, and purchasing best practices. Common tactics, such as re-architecting applications, managing cloud sprawl, and monitoring spend using the tools each cloud provides, are a good first step. However, these methods are not the full picture. Storage optimization is an integral piece.
Focusing on cloud storage costs first is a smart strategy, since storage constitutes a large chunk of overall spend. More than half of IT organizations (55%) will spend more than 30% of their IT budget on data storage and backup technology, according to our recent State of Unstructured Data Management report. The reality is that most organizations don’t have a clear idea of current and predicted storage costs. They do not know how to economize, how much data they have, or where it resides. By gaining a thorough understanding of data and its needs, IT can place high-priority data on top-performing storage while moving older, less important data to cheaper storage. The point is that if you don’t efficiently manage data over its lifecycle, both options will be expensive.

Six Ways to Cut Storage Costs and Optimize Cloud Investments

1. Get holistic visibility into your data to make the best cloud decisions. Understanding the characteristics of enterprise data, which is primarily file or object data not sitting in a database, is critical to optimizing cloud investments and right-placing data. Top metrics include top data owners, most common file type, most common file size, total data, data growth rate, and data by time of last access (which distinguishes active, or hot, data from inactive, or cold, data). Metadata searches can also highlight files containing PII, IP, or other sensitive data that have unique storage and security requirements.

2. Calculate current storage costs across all storage technologies in your data centers and/or the cloud. Since most organizations have a hybrid cloud approach, you need to calculate the cost of both on-premises storage and backups as well as cloud storage. Calculating this across various accounts, buckets, and storage silos can be time-consuming and laborious. Look for automated ways to deliver these costs, such as through a data management solution.

3. Predict future storage costs based on data growth rates. Unstructured data typically grows at 20% or more each year, so when looking at how much you can save, consider current costs and future projections. An ongoing data management strategy is needed to save costs as data continues to pile up.

4. Include backup and disaster recovery costs in your analysis. Even in the cloud, most organizations create additional data copies for backups, snapshots, and multi-site redundancy. Be sure to include these costs in your analysis to get a full understanding of your true costs and potential savings.

5. Model new storage plans for savings opportunities. Your data management analysis should detail how much you can save by leveraging the various cloud storage tiers and right-placing cold data at the appropriate lower tier. In most clouds, the cheaper storage tiers are often 20x less expensive than the performance tiers.

6. Create an ongoing data lifecycle management plan. Rather than moving data to the cloud in a “set and forget” fashion, long-term savings require continuous refinement to accommodate data as it ages or when other conditions materialize, such as the need to move data under compliance rules to secure archival storage. With more than 12 classes of storage on some of the popular clouds, you’ll want to leverage each of them at the right time. Don’t keep data in top-tier file storage once it is no longer in active use, such as at the completion of an analytics project or marketing campaign. Ensure that users can access tiered data from the lower storage tier without having to bring it back to a more expensive tier, so that you don’t lose the savings.

A Final Word on Cloud Storage

As organizations look to reduce cloud waste this year, attaining a data-centric perspective has a multitude of benefits. Analysis can indicate data growth rates, hot versus cold data, compliant data, and more, so that IT can make the best decisions balancing data requirements, business needs, and budget.
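To make the growth projection and tier modeling above concrete, here is a minimal cost sketch. The per-GB prices, the 20% growth rate, and the share of data assumed cold are illustrative assumptions, not quotes from any provider:

```python
# Hypothetical storage cost model: project spend under 20% annual data
# growth, comparing keeping everything in a hot tier versus tiering 70%
# of data to an archive tier assumed to be 20x cheaper.
HOT_PRICE = 0.023              # assumed $/GB-month for the hot tier
ARCHIVE_PRICE = HOT_PRICE / 20  # archive tier, 20x cheaper
GROWTH = 0.20                  # 20% annual unstructured data growth
COLD_FRACTION = 0.70           # assumed share of data that is cold

def annual_cost(tb: float, hot_fraction: float = 1.0) -> float:
    """Yearly storage cost in dollars for `tb` terabytes, split across tiers."""
    gb = tb * 1000
    hot = gb * hot_fraction * HOT_PRICE * 12
    cold = gb * (1 - hot_fraction) * ARCHIVE_PRICE * 12
    return hot + cold

data_tb = 500.0  # hypothetical starting footprint
for year in range(1, 4):
    data_tb *= 1 + GROWTH
    all_hot = annual_cost(data_tb)
    tiered = annual_cost(data_tb, hot_fraction=1 - COLD_FRACTION)
    print(f"year {year}: all-hot ${all_hot:,.0f}, tiered ${tiered:,.0f}, "
          f"{1 - tiered / all_hot:.0%} saved")
```

Under these assumptions the tiered plan costs 33.5% of the all-hot plan every year (30% of the data at full price plus 70% at a twentieth of it), which is why modeling tiers before repatriating can change the conclusion entirely.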
This way, you can continue to embrace the cloud for digital business initiatives without setting off alarm bells in the CFO’s office.


Iranian Threat Actors Ramp Up Ransomware, Cyber Activity

This summer, the Federal Bureau of Investigation (FBI), the Cybersecurity and Infrastructure Security Agency (CISA), and the Department of Defense Cyber Crime Center (DC3) released a joint advisory on Iran-based threat actors and their role in ransomware attacks on organizations in the US and other countries around the globe.

With the US presidential election drawing near, nation-state activity from Iran could escalate. In August, Iranian hackers compromised Donald Trump’s presidential campaign. They leaked compromised information and sent stolen documents to people involved in Joe Biden’s campaign, CNN reports.

What are some of the major threat groups associated with Iran, and what do cybersecurity stakeholders need to know about them as they continue to target US organizations and politics?

Threat Groups

A number of advanced persistent threat (APT) groups are affiliated with the Islamic Revolutionary Guard Corps (IRGC), a branch of the Iranian armed forces. “[Other] relatively skilled cyber threat actor groups … maintain arm’s length distance from the Iranian government,” says Scott Small, director of cyber threat intelligence at Tidal Cyber, a threat-informed defense company. “But they’re … operating pretty clearly on behalf [of] or aligned with the objectives of the Iranian government.”

These objectives could be espionage and information collection or simply disruption. Hack-and-leak campaigns, as well as wiper campaigns, can be the result of Iranian threat actor activity. And as the recent joint advisory warns, these groups can leverage relationships with major ransomware groups to achieve their ends.

“Look at the relationships [of] a group like Pioneer Kitten/Fox Kitten. They’re partnering and collaborating with some of the world’s leading ransomware groups,” says Small.
“These are extremely destructive malware that have been extremely successful in recent years at disrupting systems.”

The joint advisory highlights Pioneer Kitten, which is also known by such names as Fox Kitten, Lemon Sandstorm, Parisite, RUBIDIUM, and UNC757, among others. The FBI has observed these Iranian cyber actors coordinating with groups like ALPHV (also known as BlackCat), RansomHouse, and NoEscape. “The FBI assesses these actors do not disclose their Iran-based location to their ransomware affiliate contacts and are intentionally vague as to their nationality and origin,” according to the joint advisory.

Many other threat groups affiliated with Iran have caught the attention of the cybersecurity community. In 2023, Microsoft observed Peach Sandstorm (also tracked as APT33, Elfin, Holmium, and Refined Kitten) attempting to deliver backdoors to organizations in the military-industrial sector. MuddyWater, operating as part of Iran’s Ministry of Intelligence and Security (MOIS), has targeted government and private sector organizations in the oil, defense, and telecommunications sectors.

TTPs

The tactics, techniques, and procedures (TTPs) leveraged by Iranian threat actor groups are diverse. Tidal Cyber tracks many of the major threat actors and maintains an Iran Cyber Threat Resource Center. Small found that the top 10 groups his company tracks were associated with approximately 200 MITRE ATT&CK techniques.

“Certainly, this is just one data set of known TTPs, but just 10 groups being associated with about a third of well-known TTPs, it just demonstrates … the breadth of techniques and methods used by these groups,” he says.

The two main avenues of compromise are social engineering and exploitation of unpatched vulnerabilities, according to Mark Bowling, chief information, security, and risk officer at ExtraHop, a cloud-native cybersecurity solutions company.
Social engineering conducted via tactics like phishing and smishing can lead to compromised credentials that grant threat actors system access, which can be leveraged for espionage and ransomware attacks. Charming Kitten (aka CharmingCypress, Mint Sandstorm, and APT42), for example, leveraged a fake webinar to ensnare its victims: policy experts in the US, Europe, and the Middle East.

Unpatched vulnerabilities, whether directly within an organization’s systems or its larger supply chain, can also be a useful tool for threat actors. “They find that vulnerability, and if that vulnerability has not been patched quickly, probably within a week, an exploit will be created,” says Bowling.

The joint advisory listed several CVEs that Iranian cyber actors leverage to gain initial access. Patches are available, but the advisory warns those will not be enough to mitigate the threat if actors have already gained access to vulnerable systems.

Potential Victims

Who are the potential targets of ongoing cyber campaigns by Iran-based threat actors? The joint advisory highlighted defense, education, finance, health care, and government as sectors targeted by Iran-based cyber actors.

“[As] is … the case with a lot of nation-state-sponsored threat activity right now, it’s … targeting a little bit of anyone and everyone,” says Small.

As the countdown to the presidential election grows shorter, threat actors could be actively carrying out influence campaigns. This kind of activity is not novel. In 2020, two Iranian nationals posed as members of the far-right militant group the Proud Boys as part of a voter intimidation and influence campaign. Leading up to the 2024 election, we have already seen the hack-and-leak attack on the Trump campaign. Other entities could also fall prey to Iranian threat actor groups looking to spread misinformation or simply create confusion.
“It’s possible that they may target government facilities, state or local government, just to add more chaos to this already divided general election,” says JP Castellanos, director of threat intelligence at Binary Defense, a managed detection and response company.

Vulnerable operational technology (OT) devices have also been in the crosshairs of IRGC-sponsored actors. At the end of 2023, CISA, along with several other government agencies, released an advisory warning of cyber activity targeting OT devices commonly used in water and wastewater systems facilities. In 2023, CyberAv3ngers, an IRGC-affiliated group, hacked an Israeli-made Unitronics system at a municipal water authority in Pennsylvania. In the wake of the attack, screens at the facility read: “You Have Been Hacked. Down With Israel, Every Equipment ‘Made In Israel’ Is CyberAv3ngers Legal Target.” The water authority booster station


How Enterprises Use Cloud to Innovate

Cloud utilization patterns continue to evolve as cloud providers introduce new capabilities and the competitive landscape shifts. Over time, businesses have been building a foundation for their future as they house more data, develop more cloud apps, and take advantage of services.

“We are going beyond the cloud adoption for cost benefits and cloud adoption for velocity benefits arguments. We are well into cloud adoption/XaaS adoption for innovation,” says Shriram Natarajan, director of digital business transformation at technology research and advisory firm ISG. “Enterprises can realize a super-return on their digitization and automation investments. By layering in AI to learn from the metadata of previously digitized processes, they can further squeeze efficiencies and [advance] human augmentation.”

Cloud-delivered services enable companies to experiment more freely and execute new ideas faster and more efficiently than if they had to invest the capital expenses and time to build out on-prem IT infrastructure from scratch.

“There is a wide range of cutting-edge services being delivered from the cloud that are transforming the competitive landscape,” says David Boland, VP of cloud strategies at hot cloud storage company Wasabi Technologies, in an email interview. “These include generative AI services, AI classification and recommendation, edge computing and IoT, quantum computing, advanced cloud storage solutions, and cloud-based data analytics.”

However, many organizations struggle to fully realize their vision due to a variety of strategic, operational, and technical challenges.

David Boland, Wasabi

“One of the most common challenges is managing cloud costs. Organizations often underestimate how quickly costs can escalate due to hidden costs that result in a lack of visibility into cloud spending.
Without proper monitoring, cloud budgets can spiral out of control, reducing the cost-effectiveness of cloud services,” says Boland. “Additionally, many organizations fall victim to vendor lock-in, where they become too dependent on a single cloud provider’s proprietary technologies and tools. This limits flexibility and makes it difficult to switch providers or use a multi-cloud strategy, hindering innovation and negotiation power.”

AI Service Adoption is Rampant

Companies are increasingly becoming cloud-first, with everything from innovation to collaboration happening over a public or hybrid cloud. When companies harness the cloud, they can save on costly on-prem infrastructure, opening the door to investing in more strategic objectives like product innovation and global growth.

“Without the power of the cloud, companies would have difficulty taking advantage of technologies like AI. Many of those services are cloud-based, which opens the door to advanced insights, automation, and more creative ways to engage customers,” says Jean-Philippe Avelange, CIO of intelligent internet platform Expereo, in an email interview. “This is only possible when companies are doubling down on their cloud strategy.”

The key to using cloud effectively is developing clear objectives before adoption, mastering issues like privacy and security, and clearly understanding the impact on the workforce.

“Once those issues are resolved, AI can be used in many ways to increase productivity and develop never-before-realized insights into customers and the competitive landscape,” says Avelange. “However, we’re only at the beginning of this journey.
Business and IT leaders and employees need to understand many facets of AI before they can consistently and effectively harness this technology.”

According to John Pettit, CTO at Google solution provider Promevo, AI and data analytics are critical to driving greater productivity across the stack.

“We’ve seen industry-tech startups challenging traditional business models by being more efficient and data-driven. These highly optimized business models require a lot of data and platforms that can scale with them,” says Pettit in an email interview.

According to Alex Perritaz, chief architect at high-availability infrastructure provider InFlux Technologies, leading organizations use cloud computing to train the latest models, and innovation is mostly driven by AI.

“Using cloud solutions for these businesses makes sense as they don’t need to commit to setting up the infrastructure but can use, as they go, the latest GPUs to train the latest models with as many parameters as they can fit in, allowing the companies to [stay] flexible and agile in their workflow and remain at the cutting edge with their offerings,” says Perritaz in an email interview. “Many people [were] caught up in the high demand for computing, so many purchased and set up large infrastructures and the latest hardware. As NVIDIA rolls out new generations, they must refresh their hardware to keep up with the latest models. The obvious answer to being the most competitive in the market regarding service offerings is the price and the capacity of the infrastructure to run the largest AI models.”

John Samuel, global CIO and EVP at CGS (Computer Generated Solutions), says innovation is predominantly driven by cutting-edge technologies such as augmented and mixed reality (AR/XR), AI, and now GenAI.
“Without the power of cloud computing, the cost of adopting these technologies to drive innovation can become prohibitive,” says Samuel in an email interview. “Cloud computing also allows companies to be more agile and benefit from the innovations offered by SaaS providers, who use the cloud to deliver their services to clients. The cloud’s consumption-based cost model enables companies to pilot and test innovations without making significant investments in hardware, software, and the associated build costs of creating technology for innovation from scratch.”

Companies are lowering costs and improving competitiveness using self-service generative AI services and agent-assist tools.

“These technologies can also rapidly surface insights from data, giving companies a competitive edge by enabling data-driven, agile decision-making,” says Samuel.

Driving the Most Value

Today’s companies are using the cloud to become more agile, efficient, and secure. Cloud is capable of many things, from increasing data accessibility to scaling based on demand. Migrating to the cloud enables companies to adjust to changing dynamics within their operations. It also helps ensure everyone has the resources they need to do their jobs effectively.

“When employees are equipped with the necessary tools, they can focus on strategic

How Enterprises Use Cloud to Innovate Read More »

Broadband Is On the Ballot

The next president will have a great say in a variety of issues related to broadband availability and services. He or she will establish funding priorities for broadband expansion to un- and under-served regions, direct (through various agencies) the allocation of spectrum for 5G and new satellite services, and more. Another factor to consider: with recent presidents increasingly governing by executive order, the next president will likely have a huge thumb on the scale with respect to broadband regulatory issues.

What’s on the Line?

Support for broadband expansion is something on which both candidates are remarkably aligned. Broadband is considered essential for the U.S. government and companies to innovate and compete in global markets. However, Vice President Harris and former President Trump will likely take vastly different approaches to supporting broadband efforts. As we reported last week: Harris’s efforts will likely center on providing more government grants and public-private partnerships, and her administration would likely continue Biden’s drive to increase broadband access in rural areas through programs like BEAD. Trump’s approach to broadband expansion and funding will likely embrace the principles of Project 2025 and other conservative policy efforts that limit federal influence, support private deployment, and reduce regulation. For example, a Trump administration might seek tax incentives and private-sector partnerships to drive broadband infrastructure construction.

Divvying Up Spectrum

Spectrum allocation is another area where both candidates will probably undertake initiatives to expand broadband access. One area where spectrum availability helps is with 5G services.
During the past Trump administration, FCC Chairman Ajit Pai promoted plans to push more spectrum into the marketplace, promote 5G wireless infrastructure, and modernize outdated regulations in the field. Harris might follow President Biden’s efforts, such as having the FCC explore ways to open up different spectral bands (e.g., the 42 GHz band) to support 5G fixed wireless access (FWA).

New spectrum also helps emerging satellite broadband services. In September, the FCC opened 1,300 megahertz of spectrum for non-geostationary orbit (NGSO) fixed-satellite service operations in the 17.3-17.8 GHz band. Satellite operators will use the extra spectrum to deploy advanced services, including high-speed internet access to unserved and underserved areas. In March, the FCC said it would allow SpaceX to use E-band frequencies between second-generation Starlink satellites and gateways on the ground. The move will allow SpaceX to improve the capacity of its Starlink broadband services. Given the close relationship between Trump and Elon Musk, a second Trump presidency would likely also focus on such efforts.

Checks and Balances Still Exist

Even though the next president will wield great power, judicial and Congressional factors will determine what actually gets done. For instance, presidential influence may be diminished by the recent Supreme Court ruling that shifts power over federal regulations from agencies to judges. The party that controls Congress will have the ability to prioritize, direct, and fund legislative actions and confirm agency appointees.

All Politics is Local

No matter what is done at the federal level, local governments increasingly are getting their two cents in, too. For example, Florida’s Miami-Dade County has a straw-poll measure on this year’s ballot related to the availability of free public Wi-Fi. A yes vote would expand free public Wi-Fi access countywide.
In past years, placement of 5G cell towers got the attention of state governments. In 2023, the New York State Senate took up a bill that “prohibits the placement of 5G telecommunications towers within 250 feet of a business or residence in cities with a population of one million or more without the owner’s consent; requires community board approval and the completion of a city environmental quality review before the placement of any 5G tower can be approved.” (Senate Bill S5123 is still in committee.) And in 2021, then-Pennsylvania Governor Tom Wolf signed House Bill 1621 — the Small Wireless Facilities Deployment Act — into law. The law included new regulations for the deployment of small cells.

A second area getting increased attention from the states is how to make up for funding cuts due to the lapsed Affordable Connectivity Program (ACP). The program provided monthly subsidies to 23 million households nationwide so they could afford high-speed internet connections. It lapsed in May 2024. Since then, some state broadband offices and legislatures have been looking into ways to address the lack of funds.

A Final Word

It is safe to say that broadband will be a high priority in either presidential candidate’s administration. The two will take extremely different approaches, with Harris concentrating efforts on federal programs while Trump looks to private partnerships and fewer regulations. Control of the House of Representatives and the Senate, both of which are in play, will have its own implications. The House will have the power to fund programs through revenue bills. The Senate will get to confirm presidential agency appointees.

Broadband Is On the Ballot Read More »

IT Service Management Vendor Rankings, 2024 Edition

“IT Service Management Vendor Rankings, 2024 Edition” Brought to you by TeamDynamix

Leveraging insights from the comprehensive ITSM Data Quadrants, this report highlights the leading ITSM vendors in the current market. It offers an in-depth look at vendors’ performance based on various criteria, providing you with a well-rounded perspective of your options. Key points include:

- Ease of ESM expansion: Our report emphasizes the importance of scalability, ensuring that your ESM can grow seamlessly with your business.
- Distinctive functionalities: Discover unique and innovative features that set top-performing ITSM vendors apart from the rest.
- Tangible business value delivered by our platform: Understand the substantial benefits and ROI that these ITSM solutions can bring to your organization.
- Shopping for a new ITSM platform for the future? This report outlines essential factors to consider when evaluating ITSM vendors, helping you find the right ITSM tool tailored to your business needs.

By examining these elements thoroughly, you can make a well-informed decision that supports both your current objectives and long-term goals.

Offered free by TeamDynamix.

IT Service Management Vendor Rankings, 2024 Edition Read More »

The Current Top AI Employers

While the unemployment rate for IT professionals rose to 6% in August, up from 5.6% the prior month, the situation is far brighter for AI experts.

The AI job market has shown resilience and growth, especially in the first half of 2024, says Antti Karjalainen, an analyst with WilsonHCG, a global executive search and talent consulting firm. “Despite some fluctuations, the demand for AI professionals remains robust, driven by increased investments in AI technologies and projects,” he observes in an online interview.

Amazon currently leads the pack with 1,525 AI-related employees, primarily operating in the e-commerce and cloud computing sectors, according to data from WilsonHCG’s talent intelligence and labor market analytics platform. Meta follows closely with 1,401 employees, while Microsoft is next with 1,253 employees in AI-related roles. “As expected, Apple and Alphabet also have significant numbers, with 1,204 and 970 employees, respectively,” Karjalainen notes.

TalentNeuron, a global labor market analytics provider, breaks down the market somewhat differently. “Globally, the top five AI employers are Google, Capital One, Amazon, ByteDance, and TikTok,” says David Wilkins, the firm’s chief product and marketing officer. “Of note, Amazon saw a 519% increase in AI job postings year-over-year, and Google saw a 367% increase,” he observes in an online interview. “Out of the top 20 AI employers, Reddit saw the largest year-over-year increase at 1,579%.”

While the US is a strong market for AI talent, there’s a significant shortage of AI specialists relative to the growing demand, Wilkins says.
“So, companies, Google among them, have expanded overseas for talent.” TalentNeuron’s latest report on tech talent hubs found that demand growth is highest in emerging, lower-cost markets, such as the Indian cities of Pune and Hyderabad, as organizations seek to strategically place AI capabilities.

Sought-After Skills

The most sought-after skills in AI job postings, according to WilsonHCG data, include deep learning, machine learning model development, computer vision, generative AI, and natural language processing (NLP), Karjalainen says. “These skills are crucial for developing advanced AI systems and applications.” He adds that advanced algorithm development, model deployment and productionization (the process of turning a prototype into something that can run in production), and AI frameworks such as TensorFlow, PyTorch, and Keras are also highly valued by employers.

Many employers also value proficiency in programming languages such as Python, MATLAB, C++, and Java, as well as data analysis and statistical modeling skills. “These skills are foundational for any AI-related role and are necessary for developing, testing, and deploying AI models,” Karjalainen says. The ability to work with large datasets, perform data mining, and apply statistical techniques is also crucial, he notes. “Employers are looking for candidates who can not only build AI models but also interpret and analyze the results to drive business decisions.”

Top Fields

WilsonHCG finds that the computer software industry leads with 4,135 AI professionals, indicating strong demand for AI talent in software development and related services. Following closely is the IT and services sector, which employs 3,304 AI professionals. “This sector includes companies that provide IT consulting, system integration, and managed services, all of which are increasingly incorporating AI into their offerings,” Karjalainen says.
With 2,176 individuals working in the area, research organizations also employ a significant number of AI professionals. This sector includes academic institutions, research labs, and private research firms focused on advancing AI technologies, Karjalainen says. Financial services, with 819 AI professionals, is another key sector, as banks, insurance companies, and investment firms leverage AI for risk management, fraud detection, and customer service. Meanwhile, the internet industry, which includes companies providing online services and platforms, employs 635 AI professionals, reflecting the importance of AI in enhancing user experiences and optimizing operations.

Karjalainen says other fields with significant AI employment include higher education (444 professionals), biotechnology (384), and mechanical or industrial engineering (378). The hospital and health care sector employs 324 AI professionals, highlighting the growing use of AI in medical diagnostics, treatment planning, and patient care. The automotive industry, with 320 AI professionals, is also a key player, particularly in the development of autonomous vehicles and advanced driver-assistance systems. Other important fields employing AI professionals include management consulting, electrical/electronic manufacturing, and semiconductors.

Salary Trends

WilsonHCG data shows that AI job postings consistently offer higher salaries than non-AI IT postings. In July 2024, for instance, the average advertised salary for AI jobs was $166,584, while the average for non-AI IT jobs was $110,005, a difference of $56,579, or 51.4%. Looking at the annual median salary, AI jobs offer $150,018 compared to $108,377 for non-AI IT jobs, a difference of $41,641, or 38.4%, Karjalainen says.
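The dollar gaps and percentages quoted from the WilsonHCG data can be reproduced with a quick calculation; here is a minimal sketch (the variable names are ours, the figures are from the text):

```python
# Check the AI-vs-non-AI salary gaps quoted from WilsonHCG (July 2024 data).
ai_avg, it_avg = 166_584, 110_005    # average advertised salaries, AI vs. non-AI IT
ai_med, it_med = 150_018, 108_377    # annual median salaries

avg_gap = ai_avg - it_avg                    # dollar gap in averages
avg_pct = round(100 * avg_gap / it_avg, 1)   # premium relative to non-AI pay

med_gap = ai_med - it_med
med_pct = round(100 * med_gap / it_med, 1)

print(f"average: ${avg_gap:,} ({avg_pct}%)")  # average: $56,579 (51.4%)
print(f"median:  ${med_gap:,} ({med_pct}%)")  # median:  $41,641 (38.4%)
```

Note that the premium is expressed relative to the non-AI salary, which is why the 51.4% average gap exceeds the 38.4% median gap even though the dollar amounts are closer.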
“This trend is consistent across various months, with AI job salaries consistently outpacing those of non-AI IT jobs by a substantial margin.”

Wilkins reports that top US AI employers offer a median base salary of $183,250, according to TalentNeuron salary data. The median base salary for US AI jobs overall is $143,000. In comparison, the US Bureau of Labor Statistics in May 2023 reported a median annual wage of $104,420 for computer and information technology occupations.

Overall, the data suggests that top AI employers generally pay more than other employers, particularly in the IT sector, Karjalainen says. “This higher compensation reflects the specialized skills and expertise required for AI roles, as well as the high demand for AI talent in the job market.”

Talent Hubs

According to WilsonHCG statistics, California’s San Francisco-Oakland-Hayward metro area has 10,740 AI professionals, making it the leading AI talent hub. In second place, with 5,422 AI professionals, is the New York-Newark-Jersey City NY-NJ-PA region. “This area is a significant center for finance, media, and technology, attracting a diverse range of AI talent,” Karjalainen notes. The Seattle-Tacoma-Bellevue, Washington metro area, with 3,139 AI professionals, is another key location, driven by the

The Current Top AI Employers Read More »

5 Tips for Balancing Cost and Security in Cloud Adoption

In today’s fast-paced digital landscape, cloud services have become essential for organizations looking to accelerate business innovation and limit downtime. With these opportunities, however, businesses face the challenge of balancing cost savings with security — two priorities often seen as opposing forces.

While cutting costs is tempting, especially in times of economic uncertainty, the risks of inadequate security can far outweigh the immediate savings. A single breach can lead to financial losses, reputational damage, and hefty regulatory penalties, making security investment a strategic imperative rather than an optional expense.

Navigating Cost and Security

In Q2 2024, global spending on cloud infrastructure services grew 19% year over year to reach $78.2 billion, according to Canalys. This expansion reflects a growing reliance on cloud services as organizations seek flexibility, scalability, and operational efficiency. While the market continues to offer significant opportunities for cost optimization, it also introduces new security challenges that businesses must confront.

Emerging trends like serverless computing and containerization drive cost savings by reducing infrastructure overhead and improving the efficiency of cloud environments. Serverless architectures, for example, allow businesses to operate without managing physical servers, reducing the total cost of ownership. Containerization, similarly, enhances application portability and deployment speed, allowing businesses to optimize resources and scale more effectively.

However, these benefits come with potential vulnerabilities. While it eliminates the need to manage infrastructure, serverless computing can expose organizations to security risks if the environment is not properly configured. Misconfigured serverless environments can lead to data breaches, unauthorized access, or service disruptions.
Such issues will likely negate initial cost savings. Similarly, while offering agility, containerization introduces risks related to container isolation and management, as vulnerabilities in one container could potentially compromise others.

In addition to the technical security challenges, organizations must navigate an increasingly complex regulatory environment when adopting cloud solutions. Data protection laws such as the General Data Protection Regulation (GDPR) in Europe and the California Consumer Privacy Act (CCPA) in the United States impose strict requirements on how businesses handle and secure personal data. Non-compliance with these regulations can result in substantial fines and penalties, making robust security measures non-negotiable for companies operating in regulated industries.

Balancing Priorities

In reality, businesses should not view cost savings and security as opposing forces. By adopting a thoughtful approach, organizations can create a cloud strategy that achieves both. To navigate this balance effectively, consider the following five strategies.

1. Conduct comprehensive risk assessments

Before selecting a cloud provider, organizations should assess their specific security risks and compliance requirements. This evaluation will help identify areas where cost savings can be safely realized without compromising critical security measures. A thorough risk assessment ensures that organizations allocate resources appropriately, investing in security where it is needed most.

2. Leverage managed services

For organizations lacking the resources or in-house expertise to manage complex cloud environments, partnering with managed service providers (MSPs) can be a cost-effective solution.
MSPs specializing in cloud infrastructure can offer targeted services like cloud migration support, security management, and optimization of cloud-native tools, all of which help secure the environment while minimizing operational costs.

3. Implement continuous monitoring

To balance cost and security, organizations must maintain vigilant oversight of their cloud services. Continuous monitoring allows businesses to detect vulnerabilities early, optimize resource usage, and ensure cost efficiency. Regularly reviewing cloud resource usage also allows businesses to optimize spending on storage and computing, combining security with cost efficiency.

4. Optimize cloud security configurations

Cloud misconfigurations can lead to vulnerabilities, such as leaving sensitive data in unprotected storage buckets. Regular reviews and automated tools designed for cloud environments can help ensure security settings, such as access control lists and encryption policies, are properly configured and kept up to date. By ensuring configurations are correct and aligned with best practices, businesses can prevent incidents that may incur hefty fines or recovery costs.

5. Invest in employee training

Training should focus on the unique security challenges of cloud environments, such as identity and access management, shared-responsibility models, and how to manage cloud resources securely. Ensuring employees understand these cloud-centric security aspects reduces the human errors that could expose vulnerabilities. Furthermore, a well-trained workforce can leverage cloud resources more effectively, maximizing the return on cloud investments.

Looking Ahead

The tension between cost savings and security is not just a technical issue; it is a strategic imperative for organizations to navigate in the digital era.
As cloud adoption continues to accelerate, businesses must carefully maintain this delicate balance to ensure their bottom line and security posture remain strong. Organizations can achieve the best of both worlds by adopting a cloud strategy that incorporates risk assessments, continuous education, and effective resource allocation.
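As a toy illustration of the configuration review described in tip 4, the sketch below audits a set of storage-bucket settings against a security baseline. The bucket records, setting names, and baseline values are all invented for illustration; a real implementation would pull this data from a cloud provider's API or a posture-management tool rather than a hardcoded list.

```python
# Toy configuration audit: flag storage buckets whose settings drift from a
# baseline. All names and settings here are hypothetical examples.
BASELINE = {"public_access": False, "encryption": "AES256"}

def audit_buckets(buckets):
    """Return (bucket_name, setting) pairs that violate the baseline."""
    findings = []
    for bucket in buckets:
        for setting, expected in BASELINE.items():
            if bucket.get(setting) != expected:
                findings.append((bucket["name"], setting))
    return findings

buckets = [
    {"name": "invoices", "public_access": False, "encryption": "AES256"},
    {"name": "marketing-assets", "public_access": True, "encryption": "AES256"},
    {"name": "legacy-backups", "public_access": False, "encryption": None},
]
print(audit_buckets(buckets))
# [('marketing-assets', 'public_access'), ('legacy-backups', 'encryption')]
```

Running a check like this on a schedule, and alerting on any findings, is one lightweight way to combine the continuous monitoring of tip 3 with the configuration hygiene of tip 4.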

5 Tips for Balancing Cost and Security in Cloud Adoption Read More »