Information Week

DOS Won’t Hunt: Breaking Bread — New Tech for Legacy Ops

After an organization settles into its operations, with a solid understanding of its tasks and needs, there might be a mindset of “If it ain’t broke, don’t fix it” when it comes to technology. At least, this might be the case until a new platform or solution emerges that promises to revolutionize an aspect of the business. What happens if an organization feels the need to move to new tech that will require significant investment and other internal changes? How do the operational and tech sides of the enterprise assess their needs and concerns, then share those perspectives over the fence? In this “Breaking Bread” session, Josh Mason, CTO of RecordPoint, and Katie Klein, vice president of marketing for Comcast Business, discuss the pain points of introducing new technology to legacy operational needs.


Data Management recent news

May 15, 2025: Generative AI is already empowering creators and terrifying anyone who ever watched a Matrix movie. While generative AI’s role in business is only beginning to take shape, it’s crucial that IT thought leaders decide exactly how and what they’re going to do to stay ahead of the competition, before it’s too late. In this event, we’ll discuss uses of quantum computing and generative AI development opportunities, and a panel of experts will share their views on potential use cases, models, and machine learning infrastructures. You will learn how to stay ahead of the competition, and much more! Register Now


A Primer for CTOs: Taming Technical Debt

Like a hangover, technical debt is a headache that plagues many IT organizations. Technical debt accumulates when software development decisions aren’t up to recommended or necessary standards when moved into production. Like financial debt, technical debt may not be a bad thing when used to drive a critical project forward, particularly if the initiative promises some type of immediate value. Unfortunately, technical debt is frequently misused as a compromise that places speed above good practices.

Technical debt is a collection of design or implementation constructs that are expedient in the short term but create a context that can make future changes more costly or impossible, says Ipek Ozkaya, technical director of engineering, intelligent software systems, at the Carnegie Mellon University Software Engineering Institute, in an online interview.

Technical debt is often created by well-intended and sometimes justified trade-offs, such as looming deadlines, uncoordinated teams unintentionally developing competing solutions, or even patterns and solutions that were at one time elegant but haven’t aged well, says Deloitte CTO Bill Briggs. “There’s usually a commitment to come back and fix it in the next release or the next budgeting cycle, but priorities shift while the interest from technical debt grows,” he notes in an email interview.

Facing Costs and Delays

For many public and private sector enterprises, paying down technical debt represents a large percentage of their annual technology investment, Briggs says. “As a result, new projects that depend on aging tech have a high probability of delays and ballooning costs.”

Perhaps most ominously, by siphoning funds away from critical cybersecurity updates and initiatives, technical debt can play a significant negative role in breaches and service outages, potentially leading to financial, operational, and reputational risks, Briggs says.
Technical debt can also make it hard, sometimes even impossible, to harness and scale promising new technologies. “Transformational impact typically requires emerging tech to be embedded in business process systems, where technical debt is likely to run rampant.”

Regaining Control in Software Ecosystems

There’s no one-size-fits-all approach to controlling technical debt, since the priority and the impact of short-term gains and long-term system, resource, and quality impacts are often context specific, says Ozkaya, co-author of the book “Managing Technical Debt: Reducing Friction in Software Development.” However, teams can get ahead of unintentional technical debt by incorporating modern software development practices and investing in automated quality analysis, unit and regression testing, and continuous integration and deployment tools and practices, she notes.

Technical debt is a reality in today’s software ecosystems, Ozkaya states. “They evolve fast, have to adjust to changing technology, new requirements need to be incorporated, and competition is rough,” she observes. Virtually all organizations have some level of technical debt. “The right question to ask is not whether it’s useful or not, but how it can be continuously and intentionally managed.”

Still, organizations don’t want to find themselves drowning in unintentional technical debt. “Instead, they want to make the right tradeoffs and strategically decide when to accept technical debt and when to resolve it,” Ozkaya says.

A Strategy for Debt Taming

Taking a head-on approach is the most effective way to address technical debt, since it gets to the core of the problem instead of slapping a new coat of paint over it, Briggs says. The first step is for leaders to work with their engineering teams to determine the current state of data management.
“From there, they can create a realistic plan of action that factors in their unique strengths and weaknesses, and leaders can then make more strategic decisions around core modernization and preventative measures.”

Managing technical debt requires a long-term view. Leaders must avoid the temptation of thinking that technical debt only applies to legacy or decades-old investments, Briggs warns. “Every single technology project has the potential to add to or remove technical debt.” He advises leaders to take a cue from medicine’s Hippocratic Oath: “Do no harm.” In other words, stop piling new debt on top of the old.

Technical debt can also be reduced or eliminated by outsourcing, says Nigel Gibbons, a director and senior advisor at cybersecurity advisory firm NCC Group. Focus on what you do best and outsource the rest, he recommends in an email interview. “Cloud computing and managed security services are the panacea for most organizations, offering a freedom from the ball and chain of IT infrastructure.”

Coming to Terms with Tech Debt

Technical debt can be useful when it’s a conscious, short-term trade-off that serves a larger strategic purpose, such as speed, education, or market/first-mover advantage, Gibbons says. “The crucial part is recognizing it as debt, monitoring it, and paying it down before it becomes a more serious liability,” he notes.

Many organizations treat technical debt as something they’re resigned to live with, as inevitable as the laws of physics, Briggs observes. Some leaders vilify technical debt by blaming predecessors for allowing debt to pile up on their watch. Such attitudes are useless, however. “Leaders should be driving conversations to shine a light on the impact, implications, and potential path forward,” he advises.


Principal Financial Group CIO on Being a Technologist and Business Leader

Early on in her career, Kathleen Kay never pictured herself in the C-suite. She went to a public high school in Detroit and initially started on a pre-med track in her undergraduate studies. But she discovered computer science instead. She started her career at General Motors, where a mentor recognized her potential. Today, she is executive vice president and CIO of Principal Financial Group, an investment management and insurance company.

In a conversation with InformationWeek, Kay traces her career trajectory, sharing how she grew into CTO and CIO leadership roles at multiple companies.

The Path to the C-Suite

Kay has come a long way since her high school days. “It would never have occurred to me I would be in a position like this,” she shares.

She wasn’t even sure she was going to go to college, but an organization that worked with kids at her high school helped her recognize that opportunity. She got a scholarship and initially decided to pursue a pre-med track, only to discover she hated biology.

“In Detroit … we were always looking for good, stable, well-paying careers. Computer science was paying well. So I thought, ‘I’m going to take a computer science class and see if I like it,’” she shares. “And I ended up loving it.”

Like many of her peers in Detroit at the time, Kay got her start in the automotive industry. Her first job was in a research laboratory at GM. She worked alongside researchers in specialties like social sciences, anthropology, psychology, and operations science. The role afforded her a great deal of flexibility to explore her interests, and there she was drawn to learning systems.

Kay then worked with the Chevrolet division of the company, where she caught the eye of an internal leadership program that selected high-potential employees.

“For me, just getting through college and into a company like General Motors was the dream,” Kay recalls.
“I became a part of that group and that’s when I started realizing I could be in leadership positions, that I could do bigger things.”

GM hired Ralph Szygenda, its first CIO, in the 1990s.

“That’s when I thought, ‘Wow, I would love to be in a role like that,’ because it’s a combination of having to understand business as well as technology. And it was what I really liked doing,” says Kay.

Leadership in Different Industries

Kay worked with Maryann Goebel, the CIO at GM North America between 2003 and 2006. Goebel became a lifetime mentor and the first person to recommend Kay for roles outside of GM.

Kay’s first role outside of GM and the automotive industry was as senior vice president and CTO at Comerica Bank. The next steppingstone in her career was as enterprise CTO at SunTrust Bank. From there, she moved to Pacific Gas and Electric Company, where she worked her way up to senior vice president and CIO.

In each successive role, Kay’s responsibilities grew and broadened. As her career unfolded, she learned how to be both a good technologist and a business leader.

“I think [for] many of us, and I did it early in my career, we get enamored with a technology and then try and hunt for a place to put it,” says Kay. “What would often happen is you’d have this mismatch of technology that wouldn’t really solve the business problem at hand.”

In each of her different roles, Kay found that she needed the ability to empower people and remove blockers with technology, but she needed a keen understanding of the business and its specific industry to accomplish that. “Being a good technologist is really understanding the business problem at hand,” she explains.

The CIO Role at Principal Financial Group

Principal Financial Group approached Kay during its hunt for a new CIO.
“This company at the time … had recognized that what got them here to 140 years wasn’t going to be what gets them to another 140 years. So, having this humility to recognize that and willingness to address it and pivot was really appealing to me,” Kay shares.

She accepted the job and started in May 2020, just a few months into the global COVID pandemic. Kay aspires to be an accessible leader, a management style challenged by a 100% remote workforce. Stopping by someone’s desk or leaving her door open wasn’t an option. Even informal meetings had to be scheduled via a virtual meeting platform.

She started hosting a weekly, open virtual meeting, Coffee with Kathy, as an experiment. Anybody in her organization could join for an unstructured conversation about work or anything else happening in the wider world. The meeting drew hundreds of participants, and it has outlived the pandemic. Today, it is a monthly meeting that remains virtual so team members working remotely and in other offices can still join.

“It’s really broken down barriers. I have so many people on my team who say, ‘I would have never thought I would be talking directly with the CIO of a company. I never thought I would feel comfortable sending a message or getting time on the calendar,’” says Kay.

Today, this meeting is one among many in Kay’s busy schedule.

“In the CIO role, I am front and center facing off with my business colleagues, really understanding strategy, helping define it, challenging ways of doing things,” she says.

On any given day, she could be having financial discussions, checking on the progress of different initiatives, and spending time mentoring people on her team.

Kay notes her pride in how well-aligned technology and strategy have become at Principal Financial Group. When she first started, each of the company’s lines of business had a strategy and related technology plans.
But there were a lot of shared technologies that weren’t necessarily coordinated across these different lines of business.


Preparing Your Tech Business for a Possible Recession

Over the past weeks, the odds of a recession have increased significantly. While financial experts are divided on the likelihood of a recession happening anytime soon, prudent C-suite leaders are already taking tentative steps designed to help their enterprises survive, and perhaps even prosper, during an economic downturn.

To prepare for a recession, tech companies should focus on cutting unnecessary costs without compromising innovation, advises Trevor Young, chief product officer at cybersecurity firm Security Compass. “This means automating processes, streamlining operations, and using resources like cloud services more efficiently,” he says in an online interview. Young also suggests diversifying revenue streams. “Don’t put all of your eggs in one basket.”

Max Shak, CEO of team services firm Zapiy.com, stresses the importance of driving innovation during economic downturns. “Companies that maintain or even accelerate their innovation efforts during a recession can emerge stronger when the economy recovers,” he says in an online discussion. “The key is to focus on products or services that meet the evolving needs of customers, whether that’s in response to changing market conditions or advancements in technology.”

Most Vulnerable

Some tech businesses are more vulnerable to a recession than others, Shak says. Startups and smaller tech firms that are reliant on venture capital funding, or have limited cash reserves, are often at greater risk, he observes. “These businesses may struggle to secure necessary funding during a recession, and their growth could stall if their investors tighten their belts.”

Tech firms that depend heavily on consumer discretionary spending are also likely to suffer during an economic downturn, says Rose Jimenez, CFO at culture.org, a cultural news platform.
“Think e-commerce platforms selling non-essential goods or subscription services that people tend to cancel first when tightening their budgets,” she states in an email interview.

Also endangered are ad-supported tech companies, Jimenez says. She notes that smaller social media or content platforms are particularly vulnerable, since advertising is often one of the first areas cut during economic uncertainty. “Startups that are still pre-revenue or burning through capital fast without a clear path to profitability are also at risk, especially if they’re relying on new funding rounds,” Jimenez adds. “When capital tightens, investors tend to get cautious, and that can put a strain on early-stage companies without strong fundamentals.”

Firms that market non-essential products or services, such as luxury tech or expensive software solutions, are likely to suffer, Young says. Startups that aren’t well-funded or companies that depend heavily on outside investment can also be at risk, as investors might pull back in tough times, he states.

Especially vulnerable during a downturn are businesses that rely heavily on enterprise clients with long sales cycles, particularly those in sectors such as B2B SaaS or enterprise software. When a recession hits, or even when economic uncertainty rises, large corporations tend to slow down their purchasing decisions, says Wes Lewins, CFO at financial advisory firm Networth. “Budget approvals take longer, IT investments get delayed, and non-essential upgrades are put on hold,” he explains. “For tech companies whose revenue depends on landing big-ticket clients or closing long, complex deals, that slowdown can significantly impact cash flow and forecasting.”

Warning Signs

Young sees warning signs that a recession could be on the horizon. “Things like inflation, rising interest rates, and global instability often point to tough economic times,” he notes.
“However, the beauty of the tech industry is its ability to innovate and pivot, so companies that stay agile and forward-thinking can actually find opportunities even when the economy is struggling.”

While there’s no recession yet, there are signals suggesting a higher risk of an economic slowdown in the near future, Jimenez says. “The next few quarters will be key, especially as businesses react to global trade tensions and consumer confidence shifts.”

Final Thoughts

Recessions, although challenging, can also be a time to rethink and innovate, Young says. He notes that a recession can provide an opportunity to focus on digital transformation, explore new markets, or refine products. “For example, businesses in cybersecurity may see even more growth, as security threats often spike during downturns,” Young explains. “The key is to be proactive, flexible, and to always stay connected to what your customers need.”

Shak says that preparing for a recession involves maintaining financial flexibility, focusing on customer value, and staying agile in the face of changing market conditions. “Tech companies that are proactive, innovative, and resilient are more likely to weather the storm and come out stronger on the other side.”

Young agrees. “If you embrace change and stay ahead of the curve, you can not only survive a recession but come out stronger on the other side.”


Franciscan Health’s Pursuit of Observability and Automation

The layers of tech and data used by an institution such as Franciscan Health, a 12-hospital system in Indiana that also has a presence in suburban Chicago, can need a bit of decluttering for the sake of efficiency. The path to sorting out data and other aspects of observability led the health system to the observability platform Pantomath.

Sarang Deshpande, vice president of data and analytics for Franciscan Health, says that when he joined three years ago, he saw that, much like other healthcare providers, the organization operated with a combination of tools and technologies stacked together. That approach may have served in the moment, choosing the best tools available at the time, he says. It also piled up a bit of confusion.

Diagnosing the Problem

As with other types of long-running institutions, hospitals might not move swiftly when it comes to technology adoption. “The maturity typically you’ll see on the provider side around technology … digital adoption is lower than you would find in manufacturing or even on the healthcare side if you think of pharmaceuticals or medical devices,” Deshpande says. On the nonprofit side, he says, the main focus is patient care, with most capital investments going into buildings, hospitals, and clinics rather than new tech. At least, that may have been the case until the pandemic put the world on different footing. “Technology tends to lag a little bit, but after COVID that has changed quite a bit,” Deshpande says.

Prior to COVID, Franciscan Health tended to purchase technology tools based on what was needed at the time, he says, and largely on-premises. Compounding the complexity, Deshpande says there is a plethora of ways data is collected and ingested in the hospital system. “Our electronic medical record system is the biggest of all where most of our patient data comes from,” he says.
On top of that, he says, there are also billing and ERP systems, ITSM ticketing systems, and time-keeping systems to account for. Further, there are regulatory requirements around the hospital system’s reporting, he says.

Assessing the Tech Ailment

Information that Franciscan’s system ingests, Deshpande says, includes flat file datasets, as well as data from a CMS, third-party payers, or ancillary third parties. With so many formats and inputs, he says, there was not a very clear-cut way to access data. Franciscan Health must also be accountable for sending information out, Deshpande says.

The varied tech tools Franciscan Health collected over the years meant there was no standardized data pipeline. “That problem was very obvious to me from the get-go,” Deshpande says. “We have tried to solve it through people and process to a large extent, but there’s only so much you can do when there are siloed teams that are accountable for one piece of the data flow.”

With so many pieces and layers in play, tech challenges were inevitable. “Whenever there was a failure or a data quality issue or a job didn’t run on time or got delayed, the downstream impact of that was very localized,” Deshpande says. Because his team is accountable for the accuracy and timeliness of the data, the issues became apparent to him. “That’s where we realized we had a big problem where the non-standardized set of tools, processes, and people in their jobs were making it very difficult for us to have any level of accuracy that our leadership demands of us,” he says.

In the digital transformation era, with migrations to the cloud and more automation, Deshpande says post-COVID resources were extremely limited, and nearly every health system seeks to do more with less. “Labor costs are off the charts,” he says.
“I think that’s where most people are realizing that we need to leverage not just technology at the frontlines for our patients, but also for optimal work internally.”

Prescribing a Strategy

That’s where the observability platform Pantomath came into play to help transform Franciscan Health’s data operations. Deshpande says use of the platform introduced automation with the intent to reduce human error and dependency in the equation. “We will always need eyeballs on things to validate, verify, and to fix,” he says, “but basic monitoring, observation, alerting and things of that nature should be very easy to automate. Things are never as easy as they seem.”

Use of the platform let Franciscan Health repurpose its labor force to work smarter through AI and LLMs, Deshpande says. “We wanted a more consistent way of monitoring and solving the problem of data accuracy, data currency, and data validation.”

Franciscan Health’s system comprises five regions, he says, that historically were separate entities brought together through mergers and acquisitions. They still operate relatively independently from a daily workflow perspective, says Deshpande. That includes management of staff and patient population.

Deshpande says one measurement for success of the observability effort is whether his team can conduct business, grow, and transform at the same time without additional labor, and still deliver. He says the work continues, with at least two years to go in migrating all on-prem infrastructure while also building new use cases on the data platform. “The next couple of years will be all about migration, consolidation, and how can we get to a point where this modern data platform in the cloud will be up and running and we can reduce our footprint in the data center and the cost that comes with it,” Deshpande says.


Confidential Computing: CIOs Move to Secure Data in Use

As cyber threats grow more sophisticated and data privacy regulations grow sharper teeth, CIOs are under increasing pressure to secure enterprise data at every stage: at rest, in motion, and now, increasingly, in use.

Confidential computing, a technology that protects data while it is being processed, is becoming an essential component of enterprise security strategies. While the promise is clear, the path to implementation is complex and demands strategic coordination across business, IT, and compliance teams.

Itai Schwartz, co-founder and CTO at Mind, explains that confidential computing enables secure data processing even in decentralized environments, which is particularly important for AI workloads and collaborative applications.

“Remote attestation capabilities further support a zero-trust approach by allowing systems to verify the integrity of workloads before granting access,” he says via email.

CIOs Turning to Confidential Computing

At its core, confidential computing uses trusted execution environments (TEEs) to isolate sensitive workloads from the broader computing environment. This ensures that sensitive data remains encrypted even while in use, something traditional security methods cannot fully achieve.

“CIOs should treat confidential computing as an augmentation of their existing security stack, not a replacement,” says Heath Renfrow, CISO and co-founder at Fenix24.

He says a balanced approach enables CIOs to enhance security posture while meeting regulatory requirements, without sacrificing business continuity.

The technology is especially valuable in sectors like finance, healthcare, and the public sector, where regulatory compliance and secure multi-party data collaboration are top priorities. Confidential computing is particularly valuable in industries handling highly sensitive data, explains Anant Adya, executive vice president and head of Americas at Infosys.
“It enables secure collaboration without exposing raw data, helping banks detect fraud across institutions while preserving privacy,” he explains via email.

Implementation Without Disruption

Despite its potential, implementing confidential computing can be disruptive if not handled carefully. This means CIOs must start with a phased and layered strategy.

“Begin by identifying the most sensitive workloads, such as those involving regulated data or cross-border collaboration, and isolate them within TEEs,” Renfrow says. “Then integrate confidential computing with existing IAM, DLP, and encryption frameworks to reduce operational friction.”

Adya echoes that sentiment, noting organizations can integrate confidential computing by adopting a phased approach that aligns with their existing security architecture. He recommends starting with high-risk workloads like financial transactions or health data before expanding deployment.

Schwartz emphasizes the importance of setting long-term expectations for deployment.

“Introducing confidential computing is a big change for organizations,” he says. “A common approach is to define a policy where every new data-sensitive component will be created using confidential computing, and existing components will be migrated over time.”

Jason Soroko, senior fellow at Sectigo, stresses the importance of integrating confidential computing into the broader enterprise architecture. “CIOs should consider the value of separating ‘user space’ from a ‘secure space,’” he says. Enclaves are ideal for storing secrets like PKI key pairs and digital certificates, allowing sensitive workloads to be isolated from their authentication functions.

Addressing Performance and Scalability

One of the main challenges CIOs face when deploying confidential computing is performance overhead. TEEs can introduce latency and may not scale easily without optimization.
“To address performance and scalability while maintaining business value, CIOs can prioritize high-impact workloads,” Renfrow says. “Focus TEEs on workloads with the highest confidentiality requirements, like financial modeling or AI/ML pipelines that rely on sensitive data.”

Adya suggests keeping less sensitive computations outside TEEs to reduce the load. “Offload only the most sensitive computations, and leverage hardware acceleration and cloud-managed confidential computing services to improve efficiency,” he recommends.

Soroko adds that hardware selection is critical, suggesting CIOs choose TEE hardware with an appropriate level of acceleration. “Combine TEEs with hybrid cryptographic techniques like homomorphic encryption to reduce overhead while maintaining data security,” he says.

For scalability, Renfrow recommends infrastructure automation, for example adopting infrastructure-as-code and DevSecOps pipelines to dynamically provision TEE resources as needed. “This improves scalability while maintaining security controls,” he says.

Aligning with Zero Trust and Compliance

Confidential computing also supports zero-trust architecture by enforcing the principle of “never trust, always verify.” TEEs and remote attestation create a secure foundation for workload verification, especially in decentralized or cloud-native environments.

“Confidential computing extends zero-trust into the data application layer,” Schwartz says. “This is a powerful way to ensure that sensitive operations are only performed under verified conditions.”

Compliance is another major driver for adoption, with regulations such as GDPR, HIPAA, and CPRA increasingly demanding data protection throughout the entire lifecycle, including while data is in use. The growing list of regulations and compliance issues will require CIOs to demonstrate stronger safeguards during audits.
“Map confidential computing capabilities directly to emerging data privacy regulations,” Renfrow says. “This approach can reduce audit complexity and strengthen the enterprise’s overall compliance posture.”

Adya stresses the value of collaboration across internal teams, pointing out that successful deployment requires coordination between IT security, cloud architects, data governance leaders, and compliance officers.

As confidential computing matures, CIOs will play a pivotal role in shaping how enterprises adopt and scale the technology. For organizations handling large volumes of sensitive data or operating under stringent regulatory environments, confidential computing is no longer a fringe solution; it’s becoming foundational. Success will depend on CIOs guiding adoption through a focus on integration, continuous collaboration across their enterprise, and alignment of security strategies with business objectives.

“By aligning confidential computing with measurable outcomes, like reduced risk exposure, faster partner onboarding, or simplified audit readiness, CIOs can clearly demonstrate its business value,” Renfrow says.

Confidential Computing: CIOs Move to Secure Data in Use Read More »

Common Pitfalls and New Challenges in IT Automation

Automation is moving from a routine IT task to a race to cross an ill-defined finish line. AI tends to be the bug smearing the windshield, making it hard to see where you’re headed. Road hazards are further complicating the drive to increased efficiency.

“For some, automation is a buzzword and an uphill battle, but for most technical folks out there, it’s as simple as ABC. However, many technical leads and CIOs find themselves in trouble at the starting line,” says Muhammad Nabeel, chief technology officer at Begin, an entertainment streaming service in Pakistan.

At issue from the start are the usual company politics and AI — which can be more difficult to negotiate than bean counters and C-suite heavyweights combined.

“Nowadays, AI has a drastic influence on every walk of life, especially technology. Therefore, any CIO or head of technology must incorporate the AI factor,” Nabeel adds.

Although AI is a dominant force, it isn’t the only play in automation. Some established tools and rules still apply. Unfortunately, so do the previous pitfalls and challenges. Heaped on top of that are all the AI problems, too.

“This year, hidden costs and regulatory curveballs will bite if ignored. Beyond licensing fees, watch for integration spaghetti — systems that don’t ‘talk’ smoothly — and training gaps that stall adoption. New data privacy regulations, like evolving GDPR [the European Union’s General Data Protection Regulation] and AI transparency laws, mean CIOs must vet tools for compliance and ethical design,” says Dawson Whitfield, CEO and co-founder of Looka, an AI platform for designing logos.

All told, there’s a lot for IT to manage at once. For the sake of sanity and strategy, it may be best to consider the pitfalls and challenges before trying to map out a strategy.
Pitfall 1: Running into obstacles you can’t see

In the process of implementing automation and getting all the moving parts right, people sometimes forget to first evaluate the process they are automating.

“You don’t know what you don’t know and can’t improve what you can’t see. Without process visibility, automation efforts may lead to automating flawed processes, in effect accelerating problems while wasting both time and resources and leading to diminished goodwill by skeptics,” says Kerry Brown, transformation evangelist at Celonis, a process mining and process intelligence provider.

The aim of automating processes is to improve how the business performs. That means drawing a direct line from the automation effort to a well-defined ROI.

“When evaluating AI and automation opportunities for the organization, there are often gaps in understanding the business implications beyond just the technology. CIOs need to ensure that they can translate AI capabilities into concrete business strategies to demonstrate strong ROI potential for stakeholders,” says Eric Johnson, CIO at PagerDuty, an AI-first operations platform.

Pitfall 2: Underestimating data quality issues

Data is arguably the most boring issue on IT’s plate. It requires a ton of effort to update, label, manage, and store massive amounts of data, and the job is never quite done. It may be boring work, but it is essential and can be fatal if left for later.

“One of the most significant mistakes CIOs make when approaching automation is underestimating the importance of data quality. Automation tools are designed to process and analyze data at scale, but they rely entirely on the quality of the input data,” says Shuai Guan, co-founder and CEO at Thunderbit, an AI web scraper tool.
“If the data is incomplete, inconsistent, or inaccurate, automation will not only fail to deliver meaningful results but may also exacerbate existing issues. For example, flawed customer data fed into an automated marketing system could lead to incorrect targeting, wasted resources, and even reputational damage,” Guan adds.

Pitfall 3: Mistaking the task for the purpose

A typical approach is to automate the easy, repetitive processes without giving thought to a problem that lurks beneath. Ignoring or overlooking the cause now may prove highly damaging in the end.

“CIOs often fall into the trap of thinking automation is just about suppressing noise and reducing ticket volumes. While that’s one fairly common use case, automation can offer much more value when done strategically,” says Erik Gaston, CIO of Tanium, an autonomous endpoint management and security platform.

“If CIOs focus solely on suppressing low-level tickets without addressing the root causes or understanding the broader patterns, they risk allowing those issues to snowball into more severe problems that can eventually lead to bigger risks down the road. It is often the suppressed Severity 3-4 issue that, when left unattended, becomes the S1 or S2 over time!” Gaston says.

Remember also that business goals and technologies change over time, and so too must processes.

“Focus on high-impact areas, leverage the power of open-source tools initially, and monitor the outcome. Change when and where necessary. Do not adopt the ‘fire and forget’ principle,” says Nabeel.

Pitfall 4: Failing to plan for integration

Integration becomes a necessity at some point. With AI, integrating with human overseers is an immediate need. Often it must be integrated with other software as well.

“One mistake is assuming AI-driven automation can run without human oversight. AI is a powerful tool, but it still requires human checks to catch errors, bias, or security risks,” says Mason Goshorn, senior security solutions engineer at Blink Ops, an AI-powered cybersecurity automation platform.

However, even traditional automation tools require integration. Most in IT are aware of this, but that doesn’t mean planning for it made it into the final strategy.

“Another challenge is failing to plan for integration, which can lead to vendor lock-in and disconnected systems. CIOs should choose automation tools that work with existing infrastructure and support open standards to avoid being trapped in a single provider’s ecosystem,” says Goshorn.

Pitfall 5: Not allowing the data to drive decisions in what to automate
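Guan’s data-quality warning lends itself to a lightweight guardrail: a validation gate that refuses to run automation when too many input records are incomplete. A minimal sketch, assuming hypothetical field names and thresholds:

```python
# Sketch: a data-quality gate run before an automated pipeline, illustrating
# the point that automation amplifies bad input. Field names ("customer_id",
# "email") and the 5% threshold are illustrative assumptions.

def quality_report(records, required=("customer_id", "email")):
    bad = [r for r in records if any(not r.get(f) for f in required)]
    return {"total": len(records), "invalid": len(bad),
            "invalid_ratio": len(bad) / len(records) if records else 0.0}

def gate(records, max_invalid_ratio=0.05):
    """Return (ok, report); ok is False when too many records are incomplete."""
    report = quality_report(records)
    return report["invalid_ratio"] <= max_invalid_ratio, report

records = [
    {"customer_id": "c1", "email": "a@example.com"},
    {"customer_id": "c2", "email": ""},  # incomplete: would be mistargeted
    {"customer_id": "c3", "email": "b@example.com"},
]
ok, report = gate(records)
print(ok, report)
```

Failing closed like this turns flawed-input incidents into a visible, fixable report instead of mistargeted downstream actions.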

Common Pitfalls and New Challenges in IT Automation Read More »

Babak Hodjat Talks Groundbreaking Work on Natural Language Tech

A desire for consistency in how AI performs, and in the results it delivers, is shared among many companies and their customers, says Babak Hodjat, CTO of AI for Cognizant. Delivering such consistency may require internal and external effort to mature the technology. Hodjat’s experience with AI runs from the early days of his career, through the development of the natural language technology found in Apple’s Siri digital assistant, to his current role at Cognizant. In addition to discussing some of his pioneering work, he shared his perspective on the need for CTOs to research innovations in development outside their organizations, ways CTOs can set the future they want to see in motion, and how a book, of all things, can be his go-to “device” for inspiration in his role. source

Babak Hodjat Talks Groundbreaking Work on Natural Language Tech Read More »

The Kraft Group CIO Talks Updating Gillette Stadium and More

The gridiron action of the New England Patriots naturally takes center stage in the public eye, but when the team’s owner, holding company The Kraft Group, wanted to update certain tech resources, the plan encompassed its extensive operations. Michael Israel, CIO for The Kraft Group, discussed with InformationWeek the plan for networking upgrades — facilitated through NWN — at Gillette Stadium, home field for the Patriots, as well as the holding company’s other business lines, which include paper and packaging, real estate development, and the New England Revolution Major League Soccer club.

Talk us through not only the update effort for the stadium, but the initial thoughts, plans, and pain points that got the process started for the company.

The roots of the business are in the paper manufacturing side. We have a paper and cardboard recycling mill in Montville, Conn. I have 10 cardboard box manufacturing plants from Red Lion, Pa., up through Dover, N.H., in the Northeast. International Forest Products is a large commodities business that moves paper-based products all over the world. When we talk about our network, we have a standardized platform across all of our enterprise businesses, and my team is responsible for maintaining and securing all of the businesses.

We have a life cycle attached to everything that we buy, and when we look at what the next five years brings us, we were saying we have a host of networking projects coming up. It will be the largest set of networking upgrades that we do, from a strategic standpoint, over that period. The first of these, which NWN is currently working on, is a migration to a new voice over IP platform. Our existing platform was end-of-life, and we are moving to a new cloud-based Cisco platform. They are managing that transition for us, and that again covers our entire enterprise.
[We’re] building a new facility for the New England Patriots, their practice facility, which will be ready next April. Behind that, we have the FIFA World Cup coming next June-July [in 2026], and we have essentially seven matches here. It’s the equivalent of seven Super Bowls over a six-week period. Behind that comes a refresh of our Wi-Fi environment and a refresh of our overall core networking environment. Then it’s time for a refresh of our firewalls. I have over 80 firewalls in my environment, whether virtual or physical. And to add insult to injury, on top of all of that, we may have a new stadium that we’re building up in Everett for our soccer team, which is potentially scheduled to open in 2029 or 2030.

So as we were looking at all of this, the goal here is to create one strategic focus for all of these projects and not think about them individually. We sat down with NWN, saying, “Hey, typically I will be managing two to three years in advance. We need to take a look at what we’re going to do over the next five years to make sure that we’re planning for growth. We’re planning to manage all of this from standards and from a central location.” That meant putting together what that strategic plan looks like over that period of time, and building a relationship with NWN to be able to support it and augment the staff that I have. I don’t have enough resources internally to handle all of this myself. That’s a large endeavor, and that’s where this partnership started to form.

Can you describe the scale of your operations further? You mentioned hosting the equivalent of several Super Bowls in terms of operations at the stadium.

If you take the stadium as a whole and we focus there for a second, for a Taylor Swift concert or a FIFA event coming in — for Taylor Swift, we had 62,000 unique visitors on our Wi-Fi network at one time. There are 1,800 WAPs (wireless access points) supporting the stadium and our campus here now.
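As a back-of-the-envelope check on the numbers Israel cites, 62,000 concurrent Wi-Fi clients across roughly 1,800 access points averages around 34 clients per AP, with thousands of clients beyond the seated bowl. A quick sketch (the framing is illustrative arithmetic, not The Kraft Group’s actual planning model):

```python
# Back-of-the-envelope Wi-Fi load check using the figures from the interview.
concurrent_clients = 62_000   # unique clients at one time (Taylor Swift concert)
access_points = 1_800         # WAPs across stadium and campus
seated_capacity = 52_000      # guests actually in the bowl

clients_per_ap = concurrent_clients / access_points
overflow = concurrent_clients - seated_capacity  # clients outside the bowl

print(f"~{clients_per_ap:.0f} clients per AP on average")
print(f"{overflow:,} clients beyond seated capacity")
```

The overflow figure is exactly the surprise Israel describes next: demand from the parking lots and grounds, not just ticketed seats.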
I got a note on my radio during one of the evenings saying there’s 62,000 people. I said, “How can that be? There’s only 52,000 guests.” Well, it turns out there was a TikTok challenge in one of our parking lots, and there were 10,000 teenagers on the network doing TikTok. These are the things that we don’t plan for, and FIFA is going to be a similar situation, where typically we’re planning for how many people are physically sitting in the stadium for a FIFA event. Our parking lots are becoming activation zones, so we’re going to have to plan to support not just who’s physically entering, scanning tickets, and sitting in the bowl, but who’s on the grounds as a whole. That’s something that we haven’t had to do in the past. Some of the warmer stadiums down in the South or on the West Coast that host Super Bowls are used to that type of scenario, but there are 16 venues throughout North America supporting FIFA, and many of them, like us, aren’t used to having that large a crowd, and planning to support that is critical for us as we start to do this. We are now 15 months away, 14 months away. We’re in high gear right now.

What led the push to make changes? The interests of the guests to the stadium? The team’s needs? Or was it to meet the latest standards and expectations in technology and networking?

If you think about the networks, and it’s kind of irrelevant whether it’s here at the stadium or in our manufacturing plants, the networks have physically been — if it’s plugged in, if it’s a Wi-Fi attachment, etcetera — you can track what is going on and what your average bandwidth utilization is. What we were seeing over the last year with

The Kraft Group CIO Talks Updating Gillette Stadium and More Read More »