Information Week

Bridging the Tech Gap: Fostering Cross-Generational Adoption

As organizations navigate the challenges of technology adoption across generational lines, they are increasingly focusing on strategies to bridge the gap between tech-savvy younger employees and older workers with varied levels of proficiency. Contrary to stereotypes, older employees often possess a wealth of tech experience from earlier computing eras, such as troubleshooting first-generation PCs or early networking systems.

Graham Glass, founder and CEO of Cypher Learning, says assessing the “technology history” of all employees can provide valuable insights into their skills and comfort levels. “By treating training needs as individual as a medical history, organizations can design programs that respect experience while addressing gaps, avoiding one-size-fits-all approaches that may alienate staff,” he says.

He adds that cultural and ethnic differences are just as important as generational ones. “When you’re driving more tech proficiency in the workplace, consider every user’s point of view,” Glass says.

Ryan Downing, vice president and CIO of enterprise business solutions at Principal Financial Group, points to cloud transformation as a prime example of generational collaboration, with engineers across age groups elevating their skills together. “What I find most impressive is how newer employees bring fresh perspectives and energy, while more experienced team members contribute wisdom and expertise,” he says. “This dynamic levels the playing field, strengthens team cohesion, and ensures every voice is heard.”

Cross-Generational AI Adoption

To foster inclusive AI adoption, Downing says organizations should shift the conversation from merely adopting new tools to transforming ways of working. “At Principal, we are beginning to pilot coaching programs that focus on setting clear outcomes that can help teams improve efficiency and quality,” Downing explains.
The programs guide teams to explore how AI tools can drive value creation in ways that differ from traditional approaches. “This outcome-driven mindset encourages exploration and reduces apprehension around AI tools,” he says.

Tailored training is also playing a role in bridging the generational tech gap. Downing says at Principal, training is balanced with a mix of formal learning opportunities, coaching and mentoring, and meaningful assignments. “This approach allows team members to apply new skills in real-world scenarios,” Downing says. “By offering varied learning methods, we can accommodate different working styles and readiness levels, ensuring all team members can effectively engage with new technologies.”

Downing explains the primary challenge typically isn’t a lack of tools or willingness to learn, but rather the tendency to treat AI tools as mere add-ons rather than enablers of transformation. “It’s so important not to underestimate the human element of implementing these new tools to help team members reimagine their workflows,” Downing says. “Emphasizing transformation over tools helps to ensure meaningful adoption across these generational lines.”

Glass explains that while younger workers may adapt quickly to AI tools, older employees often need reassurance about their role in the workplace and the utility of AI as a tool to enhance, not replace, their contributions. “Personalized learning platforms powered by AI allow employees to learn at their own pace, ensuring proficiency without wasting time or risking embarrassment,” he says. Peer-to-peer mentoring and collaboration further bridge the gap, allowing younger workers to share their digital fluency while benefiting from the problem-solving resilience of older colleagues.

“The less exposed to AI people are, the more qualms they have,” Glass says.
“Our recent research shows younger men are less worried about AI than, say, older workers or women.”

That “comfort gap” is a function of time spent experimenting with the technology, or lack thereof, so it’s a good idea for businesses to encourage experimentation.

“Two more big issues that recur are privacy and fear of AI taking over people’s jobs,” Glass adds. Managers can address the first by framing house rules governing AI use — defining tasks it shouldn’t be exposed to, for example. As for the second, reassure employees, especially older ones, that AI is a tool meant to take rote chores off their plates and elevate their roles. “The more you underline how essential your people are, the less they’re apt to fret about job security,” Glass says.

Measuring Success

Organizations implementing multigenerational technology training programs often measure success through a combination of immediate feedback and long-term metrics, Gartner analyst Autumn Stanish explains. “Retention and attraction rates are key indicators, such as tracking whether employees are staying longer or if the company is drawing new talent due to its reputation for inclusivity,” she says.

Stanish points to Broadridge’s reverse mentoring program, where younger employees mentored older colleagues on work-related topics. After the program, both mentors and mentees completed surveys using a 10-point scale to evaluate outcomes like increased belonging, broadened perspectives, and willingness to recommend the experience. Stanish says short-term insights from surveys help guide improvements, while broader goals, such as enhanced employee satisfaction and retention, require time to fully materialize. Combining these methods allows organizations to fine-tune their programs and foster inclusivity effectively.
“The little surveys and moments where we gather feedback help us connect with employees directly, and over time, those qualitative insights drive the bigger quantitative outcomes,” she says.


The Top Habits of High-Performing IT Development Teams

Transforming a lackluster IT development team into a top performer isn’t particularly difficult. It does, however, require a commitment to excellence that’s achieved by practicing several essential traits.

A focus on results is a key differentiator in high-performing IT teams that continue to be successful over the long term, says Shriram Natarajan, a director with technology research and advisory firm ISG. “For traditional IT teams, a customer focus would be sufficient, but development teams are one step removed from customer feedback,” he observes in an email interview. “The team should be focused on the results they have set for themselves as measured by metrics, such as velocity, predictability, quality and acceptance.”

A high-performing development team consistently focuses on eliminating toil, says Matthew Sharp, CISO at Xactly, a provider of enterprise cloud-based sales performance management solutions. “This involves reducing or removing repetitive, manual, and tedious processes by building automated pipelines, tackling technical debt, and streamlining workflows,” he explains in an online interview. “By minimizing unnecessary overhead, teams can focus their energy on impactful, innovative work, rather than on routine maintenance.”

Achieving Full Productivity

Eliminating toil not only enhances productivity, but also directly contributes to a better developer experience, Sharp says. “When teams feel empowered to focus on meaningful, creative tasks rather than repetitive ones, they tend to be happier, more engaged, and better equipped to deliver high-quality results.”

The most crucial habit of any high-performing IT development team is consistent, transparent communication coupled with systematic knowledge sharing, observes Harmeet Bhatia, a technical account manager at Amazon Web Services.
“This habit goes beyond routine stand-ups or documentation — it encompasses a culture where information flows freely and deliberately across all team members.”

In an email interview, Bhatia notes that open communication can be “extraordinarily effective,” since it simultaneously addresses multiple critical development aspects. “When teams maintain open communication channels and actively share knowledge, they reduce bottlenecks, eliminate single points of failure, accelerate the onboarding of new members, and foster innovative problem-solving.” A team member struggling with a complex bug, for instance, can benefit from a colleague’s past experience with similar issues, potentially saving hours or days of troubleshooting.

High-performing teams need cooldown periods to stay at their best, observes Ludovic Dehon, CTO of Kestra, which offers an open-source orchestration and application scheduling platform. “No one can keep up high-intensity work all of the time without burning out,” he states via email. “We move through different seasons — times of intense productivity that bring results, followed by rest seasons where we can take a breather, work on creative projects, or tackle lighter tasks.”

Knowledge Sharing

Successful teams treat knowledge sharing as a core part of their development process, not an optional add-on, Bhatia says. “They recognize that the time invested in communication and documentation pays dividends through improved code quality, faster problem resolution, and more resilient team structures,” he explains. “This approach creates a positive feedback loop in which better communication leads to better outcomes, which in turn motivates more sharing.” The key, Bhatia notes, is finding the sweet spot where communication enhances rather than impedes development work.

Leaders should foster a culture of commitment, Natarajan recommends. Development work involves experimentation.
“Leaders should focus on overall learning and progress rather than metrics like immediate velocity,” he says. “This enables the development team to be creative in their approach and find power boosts — like AI tools — along the way.”

Sharp suggests maintaining a high tolerance for experimentation and learning from failures. “By accepting mistakes as part of the journey, teams can develop creative solutions and innovate more freely.” He also recommends ensuring that every team member understands the “why” behind the project’s goals. “When the whole team buys into the mission, they’re more likely to take ownership of the processes that drive success.”

As a release deadline approaches, work grows increasingly intense as teams and their leaders race toward the finish line. “After we wrap up the release, we get a breather — two weeks to slow down, address technical debt, and think creatively about solving some of our toughest problems,” Dehon says. “This natural cycle has helped our team bring fresh, creative ideas to some of our most challenging issues.”

Final Thoughts

Sharp stresses the importance of aligning technical improvements with strategic business goals. “When IT and security teams understand the business impact of their work, they’re motivated to innovate and reduce toil, knowing it benefits not only their efficiency but also the organization’s overall success.”

“We all look forward to celebrating each release, knowing that once it’s done, no one expects us to dive headfirst into the next cycle,” Dehon says. “Instead, we get time to pause, appreciate what we’ve accomplished, and think deeply about what we want to tackle next.”


Risk Management for the IT Supply Chain

One positive development from the COVID-19 pandemic was that it forced companies to take hard looks at external supply chains to ensure they were reliable, secure and trustworthy, and that should one vendor fail, another could step in. There were numerous supply chain misfires during the pandemic, and companies and consumers suffered and learned from the experience.

That brings us to IT. The IT supply chain comes with its own set of risks, but it faces the same vulnerabilities corporate production supply chains encounter. One key difference is that organizations don’t regularly focus on those IT supply chains. While IT departments have active disaster recovery and failover plans, few regularly vet vendors or audit their tech supply chains for resiliency.

Moody’s tells us, “Disruption in one part of the supply chain can have significant ripple effects, impacting businesses and economies across sectors and regions,” and the IT supply chain is no exception when it comes to risk. I have seen these things firsthand:

A trustworthy vendor gets acquired by another vendor with which IT has had a poor experience in the past. How easy is it to migrate to a new vendor?

A company suddenly and unexpectedly sunsets its technology and, with it, the tech support. Can IT find a third party that will step in to support the old tech if the IT department had relied on the original vendor for its know-how and doesn’t have the budget to move to another tech option?

There is a component shortage at the vendor, so IT is unable to upgrade routers on its network. Is there an alternative vendor?

IT has contracted with a service company to provide technical and user support for a multi-national application, but now the provider ceases operations in one of the countries where the company has a facility. What do you do now?

All are real-world examples that I’ve personally seen.
They call into question the IT supply chain’s resiliency. When these incidents occurred, there was no ready route for IT to cure a supply chain conundrum, and the IT departments involved found themselves in difficult positions, having to “tough it out” with unsupported technologies, pause certain technologies, and/or create workarounds for processes that no longer functioned.

No one likes to be in that position. So, are there tried and true supply chain methodologies that can be applied to the IT supply chain, too? Yes, there are proven supply chain strategies and methods out there. Here are four of them:

Assess your supply chain.

Who are your mission-critical vendors? Do they present significant risks (for example, risk of a merger, or going out of business)? Where are your IT supply chain “weak links” (such as vendors whose products and services repeatedly fail)? Are they impairing your ability to provide top-grade IT to the business? What countries do you operate in? Are there technology and support issues that could emerge in those locations? Do you annually send questionnaires to vendors so you can ascertain that they are strong, reliable and trustworthy suppliers? Do you ask your auditors to periodically review IT supply chain vendors for resiliency, compliance and security?

Those are a few questions that IT departments should ask when reviewing tech supply chains, but when I mention these to IT leaders, few tell me that they do them.

Mitigate the supply chain’s weak links.

If you have a mission-critical supplier and you find there are no alternative suppliers, you’re exposed to risk if that supplier gets acquired, goes out of business, or has a component shortfall and can’t deliver. For any mission-critical sole-source supplier, it’s incumbent on IT to locate alternate suppliers that can step in, and to be ready to use them if an emergency warrants it.
One key area is internet service providers (ISPs). Companies should always have more than one ISP so internet service will remain uninterrupted.

Audit your suppliers.

Most enterprises include security and compliance checkpoints in their initial dealings with vendors, but few check back with the vendors on a regular basis after the contracts are signed. Security and governance guidelines change from year to year. Have your IT vendors kept up? When was the last time you requested their latest security and governance audit reports? Verifying that vendors stay in step with your company’s security and governance requirements should be done annually.

Include the IT supply chain in the corporate risk management plan.

Although companies include their production supply chains in their corporate risk management plans, they don’t consistently consider the IT supply chain and its risks. Today’s digital companies won’t function if the IT isn’t working, so CIOs must push for the IT supply chain to be part of overall corporate risk management if it isn’t already.
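The assessment and mitigation questions above lend themselves to a simple vendor risk register. The following Python sketch is purely illustrative: the weights, fields, and vendor names are assumptions for the sake of example, not an industry standard.

```python
from dataclasses import dataclass

@dataclass
class Vendor:
    name: str
    mission_critical: bool   # would its loss halt key IT services?
    sole_source: bool        # no alternative supplier identified
    years_since_audit: int   # security/governance audits should be annual
    repeated_failures: bool  # products or services that repeatedly fail

def risk_score(v: Vendor) -> int:
    """Score a vendor against the assessment questions; higher = riskier.
    The weights here are illustrative assumptions only."""
    score = 0
    if v.mission_critical:
        score += 3
    if v.sole_source:
        score += 3
    if v.years_since_audit > 1:  # overdue for its annual audit
        score += 2
    if v.repeated_failures:
        score += 2
    return score

# Hypothetical vendors; review the highest-scoring "weak links" first.
vendors = [
    Vendor("NetRouterCo", True, True, 2, False),
    Vendor("HelpDeskInc", False, False, 0, True),
]
for v in sorted(vendors, key=risk_score, reverse=True):
    print(f"{v.name}: {risk_score(v)}")
```

Even a toy register like this makes the sole-source, mission-critical suppliers surface at the top of the review list, which is where the article suggests attention belongs.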


How CIOs Can Navigate Their Jobs in the AI Era

As tech leaders, we know that AI is not a new concept. The tireless workhorse has been quietly operating in applications ranging from automation and data analysis to gaming and search engines for decades. So, we can be forgiven if its sudden and explosive popularity among the public surprised many of us, including chief information officers.

It’s not just the extreme pace at which AI has evolved or the ever-growing array of generative AI applications that has the C-suite rethinking its tech priorities. A substantial percentage of executives are scratching their collective heads about who will be tasked with tapping AI’s potential, who will keep AI expectations realistic, and, perhaps most importantly, where we will find workers with the requisite skills to keep pace with technological demands.

Searching for Skills

According to research from our company and IDC, businesses in every significant sector are adopting increasingly complex AI technologies as they seek to automate repetitive tasks, drive innovation, and increase productivity. Surprisingly, they’re not replacing their workers with AI robots, as many anticipated and feared; instead, enterprises everywhere are struggling to find workers with the skills required to meet growing and more complicated AI needs.

Over one-third of the 650 companies surveyed contended a skills-based worker shortage could cause delays or worse for their 2025 AI initiatives. Workers with experience in cybersecurity, networking, data, and automation are in high demand and are also the most challenging to come by, according to Enterprise Horizons 2024.

Recruitment alone won’t be enough to advance AI programs in 2025. To meet their goals, organizations must also implement reskilling and training programs. In the interim, external tech partners will be critical to keeping AI initiatives moving forward as companies recruit and retrain.
CIOs: Agents of Change

CIOs are poised to serve as agents of change, helping usher in and implement new AI apps and services. As this happens, it’s never been more critical for the CIO to align technology with business strategy and deliver consistent communications to board members and the rest of the C-suite; this will be pivotal to advancing the company’s AI programs in 2025. With 47% of Enterprise Horizons 2024 respondents reporting that their board has unrealistic demands regarding the impact of AI, it will be crucial for the CIO to act as a conduit between executive management and the rest of the organization, not just to help temper board expectations about AI but also to act as a translator who relays the company’s technological needs in simplified, streamlined language.

This is when CIOs should ask themselves, “What problems are we trying to solve?” The answer to that question will help guide the organization’s AI strategy and fill skills gaps. Amidst all the hype, it is incumbent upon the CIO to take a breath and remember that they are still solving problems — they just have access to different tools to solve them.

As the CIO role shifts toward change agent, translator, and strategist in 2025, savvy businesses will seek a leader who oversees their organization’s AI strategy and implementation. This is where the new, somewhat nebulous role of chief AI officer (CAIO) enters. Although 40% of technology leaders say a CAIO role will take over much of the CIO’s responsibilities within two years, and 38% are worried that AI could replace their or their team’s role, we need to consider the possibility that the CAIO role can complement the CIO role, not replace it. The CIO and CAIO can accomplish more as a team than either role can achieve alone.
With 64% of surveyed business leaders reporting that they find it challenging to meet their business’s technology demands, splitting the duties and allowing each leader to focus on a specific aspect of those demands increases efficiency and gives tech leaders time to innovate. Together, the two roles oversee the company’s technology strategy, including growth, infrastructure, risk management, and AI innovation and implementation, among many other aspects of the business.

Optimism Abounds

The year ahead will undoubtedly include its fair share of unforeseen challenges and continued stress and anxiety as organizations around the globe determine how to make the most of AI. While it’s a stressful time to be a tech leader, CIOs and other tech leaders are excited about the current pace of innovation and the seemingly endless possibilities.

As we look forward to 2025, it’s safe to say AI will continue to excite, confound, and ignite the creative fire in business leaders everywhere. Enterprises, and in particular CIOs, would do well to prepare now by starting to retrain their current workforce, launching recruitment campaigns, and preparing the C-suite for the inevitable changes ahead.


How DOGE May Impact Tech

Government inefficiency has been a long-term gripe in America. Seemingly to the rescue is the Department of Government Efficiency (DOGE), headed by Elon Musk. DOGE is moving rapidly to cut costs, eliminating some organizations, like USAID, entirely and slashing the workforce of others. Though many lawsuits have been filed to stop the slashing and burning, DOGE continues, setting its sights on “optimizing” federal agencies. These moves will likely affect tech sales and contracts, as well as the relationships between federal agencies and tech vendors. DOGE is also siphoning highly sensitive information from various federal organizations that could compromise individuals and, potentially, the entire country.

“The speed with which Musk and his team are moving to extract information from sensitive government databases and affect longstanding government programs is unprecedented and potentially illegal. We still don’t know the policy implications,” says Rita McGrath, Columbia University Business School professor and C-suite strategist, in an email interview. “We can expect disruptions in what would otherwise be ‘normal’ procurement, contracting and payment processes. Bills will not get paid in a timely way and with dismissals and employee buyouts on the horizon, a government already hard-pressed to provide essential services is going to function even less well.”

Relative to the size of the US population and economy, the federal workforce has been in decline since the Reagan administration basically declared war on it, she says. “The doctrine that high-functioning, effective government is worth making investments in, that followed the post-World War II ‘golden age,’ has largely disappeared and the workforce remaining is often overworked and frustrated at accomplishing even basic tasks in a timely way,” says McGrath.
“Government procurement processes are already glacially slow and complex, which means that larger firms with staffs that can cope tend to have an advantage in government procurement.”

Whether Musk’s team will be able to override traditional protections against theft of government resources in the name of ‘efficiency’ is an open question. “It really comes down to whether institutional rules will intervene,” says McGrath. “There is, however, likely to be a permanent shift in how services, such as consulting, are being managed. We’ve already seen a shift in demand for management consulting and legal services to depart from a time-based model to more of a value-based model, which would have huge implications for the billing structures and organizational forms of these [companies] – from a leverage model to some kind of outcome-focused approach. We’re watching it as a potential inflection point very closely.”

Ben Walker, CEO at transcription services provider Ditto, says DOGE budget cuts will probably cause federal agencies to be more cautious about tech purchases and consulting contracts, which could lead to longer procurement cycles and stricter requirements for vendors. “To adapt, my company is looking at offering flexible contract structures and more ways to provide long-term value. Strong performance and trust will matter more than ever as agencies justify their spending,” says Walker in an email interview. “One way to think of it is that the federal government is becoming more like a private-sector company. Agencies may demand clearer return-on-investment metrics and push for performance-based contracts. Vendors that can clearly articulate the necessity and efficiency of their services will have an edge.”

Tips to Weather the Storm

Columbia University’s McGrath believes enterprises should do several things to protect themselves. For example, it’s important to have cash on hand.
“You don’t want to be caught in a cash flow crunch if suddenly your invoices aren’t getting paid. So, work proactively with your bank or other customers to make sure you have some buffers in the bank,” says McGrath.

Also, lean into existing relationships to better understand the potential impacts. “[T]his will be a scary time for those under the microscope by DOGE and it could be an opportunity to both learn what’s going on and deepen trust in existing relationships,” says McGrath. “Connect with peers — perhaps lightning check-ins — to keep on top of what is really going on. Something like a daily call where you share information about what you’re learning and what they are learning could be helpful.”

And finally, don’t assume that business as usual will return any time soon. Following the USAID purge, agencies that have existed for many decades are facing instant changes that will disrupt their operations. Processes, however efficient or inefficient, will change, and tech companies need to be prepared for the financial impacts.

Bottom line: proceed with eyes wide open and caution, and embrace scenario planning. Such efforts helped organizations weather the whipsaw effects of the pandemic, and scenario planning is even more important at a time when virtually anything can happen, especially under a cloak of opacity. It’s also important to nurture current relationships and to monitor how procurement processes and requirements are changing. Once again, agility will be critical to navigating the changes.


Medallion Architecture: A Layered Data Optimization Model

Along with the emergence of generative artificial intelligence (GenAI) has come surging demand for data and data center capacity to host growing AI workloads, and more and more organizations find themselves in a race to build the infrastructure and data center capacity capable of supporting the current and future use of AI and machine learning (ML).

For finance functions, high-quality, well-organized, and trustworthy data is essential to developing effective AI-driven operating models. And while speed is a big factor, trust and safety are even greater concerns in a technology environment with few guardrails for AI risk management. Just think of the internet with no rules around e-commerce, privacy, or business and personal safety.

So where does a management team get a handle on the critical issues around an AI approach that is both highly efficient from an operations standpoint and optimized for risk management? We believe in this case that the past can be prologue: consider a principle known as the “medallion architecture” — a commonly used industry framework for managing large-scale data processing in cloud environments. For many of the same reasons it works so well there, we find it also applies well to data engineering. It’s particularly well suited for tax and finance operations, where data is one of the most valuable assets and where flexible, scalable, and reliable data management is essential for regulatory compliance, speed and accuracy.

A Layered Approach

The reality is that data and AI are essentially inseparable in our new digital era. While data has existed for a long time without AI, AI does not exist without data. By extension, a solid data strategy is required to achieve meaningful returns on AI value, and medallion architecture is a highly effective data management tool that helps get the most out of an organization’s AI investment.
As a data engineering model, it organizes information into three distinct tiers of bronze, silver and gold “medals.” Each layer has a specific role in the data pipeline, designed to facilitate clean, accurate and optimized dataflows for downstream processes:

Bronze: This is the raw data layer. Data is ingested from various sources, including structured, semi-structured and unstructured formats. At this stage, the data is stored in its original form without any significant transformation. This serves as a robust foundation, providing a full audit trail and allowing businesses to revisit the raw data for future needs.

Silver: In this intermediate stage, data from the bronze layer is cleaned, filtered and structured into a more usable format. This involves applying necessary transformations, removing duplicates, filling in missing data and applying quality checks. The silver layer acts as a reliable data set that can be used for analysis, but it’s still not fully optimized.

Gold: This is the final stage of the data pipeline, where the silver data is further refined, aggregated and structured for direct consumption by analytics tools, dashboards and decision-making systems. The gold layer delivers highly curated, trusted data that’s ready for use in real-time reporting and advanced analytics.

Applying the Benefits of Medallion Architecture in the Finance Sector

For financial institutions, data management needs are highly complex. Banks, trading firms and FinTech companies process enormous amounts of data daily, with requirements for accuracy, speed and regulatory compliance. Medallion architecture addresses the following needs.

1. Improved data quality and governance. Financial institutions must ensure data accuracy and completeness in alignment with strict regulatory requirements, such as Basel III, the Sarbanes-Oxley Act (SOX) and MiFID II.
The multilayered features of medallion architecture support data quality checks that can be applied at each stage. By moving from the bronze to the gold layer, data undergoes multiple transformations and validations, improving accuracy and reducing errors. It also facilitates better data governance and traceability, allowing for easier auditing and compliance reporting.

2. Scalability for large data volumes. The financial sector often deals with massive data sets — from transaction histories and market feeds to customer data. The layered approach makes it easier to scale these data pipelines. Since the raw data in the bronze layer is stored in its original form, it can handle the ingestion of high volumes of data without requiring immediate transformations. As data moves to the silver and gold layers, the architecture supports scalable processing frameworks that enable financial institutions to efficiently process large data sets.

3. Faster time to insights. In fast-paced financial markets, speed is essential. Trading firms, for example, need real-time data to make decisions on market movements. The medallion structure allows financial institutions to separate raw data ingestion from data analytics. Analysts can start working on silver and gold layers for immediate insights, while engineers refine and clean the data in the background. This results in quicker access to actionable insights, essential for high-frequency trading or real-time fraud detection.

4. Flexibility and agility. Medallion architecture offers flexibility in handling diverse data sources and types — an essential feature in the financial industry, where data comes from numerous channels.
The bronze layer’s ability to store raw data in its native form makes it easy to adapt to new data types or sources without needing immediate transformations, while the silver and gold layers can be adjusted to reflect new business requirements, market conditions or regulatory changes.

5. Cost efficiency. Processing large volumes of financial data is expensive. Separating the raw data from the processed data helps reduce unnecessary data transformations and storage costs. Financial institutions can optimize their compute resources by running complex transformations only when needed, thus lowering operational costs.

6. Enhanced security and risk management. Raw data in the bronze layer can be heavily restricted, with only authorized personnel able to access it, while the curated gold layer can be more widely available for analysis. This segmentation of data access allows for tighter security controls and reduces the attack surface.

7. Advanced analytics and machine learning.
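The bronze-to-gold flow described above can be sketched in a few lines of plain Python. The records, field names, and cleanup rules here are hypothetical, invented for illustration; a production pipeline would typically run these steps on a scalable framework such as Spark rather than in-memory lists.

```python
# Bronze: raw transaction records ingested as-is, preserving an audit trail.
# (Hypothetical data: note the duplicate feed entry and the missing amount.)
bronze = [
    {"txn_id": 1, "amount": "100.50", "currency": "USD"},
    {"txn_id": 1, "amount": "100.50", "currency": "USD"},  # duplicate feed entry
    {"txn_id": 2, "amount": None,     "currency": "usd"},  # missing amount
    {"txn_id": 3, "amount": "250.00", "currency": "EUR"},
]

# Silver: deduplicate, enforce types, normalize codes, drop unusable rows.
seen, silver = set(), []
for rec in bronze:
    if rec["txn_id"] in seen or rec["amount"] is None:
        continue
    seen.add(rec["txn_id"])
    silver.append({"txn_id": rec["txn_id"],
                   "amount": float(rec["amount"]),
                   "currency": rec["currency"].upper()})

# Gold: aggregate the cleaned data into a curated, analysis-ready view.
gold = {}
for rec in silver:
    gold[rec["currency"]] = gold.get(rec["currency"], 0.0) + rec["amount"]

print(gold)  # {'USD': 100.5, 'EUR': 250.0}
```

Because the bronze list is never mutated, the raw feed can always be revisited and re-processed under new rules — the audit-trail property the architecture promises.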

Medallion Architecture: A Layered Data Optimization Model

The CEO/CIO Dynamic: Navigating GenAI Implementation

As companies seek to realize the promise of generative AI, effective collaboration between CEOs and CIOs has become an unheralded but critical driver of technological transformation.

Artificial intelligence is revolutionizing work across almost every industry. Leaders are facing immense pressure to substantiate the value of GenAI and effectively measure outcomes that demonstrate its impact to their people, boards and shareholders. To do so, they are aggressively moving from pilots to transformational programs to unlock new revenue streams, maximize ROI and cement their competitive advantage.

KPMG’s latest AI & Digital Innovation Quarterly pulse survey found that 79% of business leaders are prioritizing productivity gains, and that more than half are exploring the use of AI agents. Those tools can work independently to perform tasks and adapt in real time. To successfully implement GenAI or AI agents within an organization, it is crucial for the CEO and the CIO to work together to establish a shared vision and strategy to meet business objectives and maximize the return on their investment.

The CEO should have a clear understanding of the potential benefits that GenAI can bring and how it aligns with the long-term goals of the organization. The CEO should set the agenda and drive a culture of collaboration, cross-functional strategy and integration, because siloed efforts are unlikely to yield the game-changing transformation needed for a sustainable competitive advantage.

Responsibilities for GenAI strategies are increasingly being shared across the C-suite as organizations adopt longer-term strategies. CIOs are also deeply involved in developing the strategy, and they have great influence over technology investments. They also bring technical expertise to the table and provide insights into the feasibility of implementing the technology and the potential risks associated with it.
The CIO should have an ecosystem strategy for their AI program that evaluates the compatibility of GenAI solutions with existing systems and considers the potential impact on the organization’s IT infrastructure. CIOs must also spearhead efforts to break down functional silos that can hinder enterprise-wide transformation. It is common for companies to prioritize digital transformation in certain functions over others, resulting in a capability gap that can be noticeable to customers, stakeholders and employees.

Effective communication between the CEO and the CIO is crucial for a successful GenAI implementation at scale. Regular touchpoints and open dialogue allow both parties to exchange ideas and address concerns while aligning their expectations and initiatives with the organization’s goals.

When I speak with executives, many of their experiences are similar: they are looking to close the gap between their aspirations and the everyday habits of their workforce. They want to understand how to put AI into production beyond writing drafts so they can see transformational change. They’ve invested their own time in experimenting with GenAI and believe in its transformative potential for their organization. Looking at clients across industries, what is most effective is when they prioritize modernizing their data strategies and systems to ensure the quality and integrity of their AI tools.

Successful organizations are also reimagining their workforces and considering how to future-proof their talent strategy. They are excited about the early successes they’ve seen with GenAI across different corporate functions like finance, sales and IT, and they want to translate those lessons into bigger opportunities.

It’s also no secret that GenAI implementation comes with inherent risks, such as data privacy, cyber and ethical concerns.
Collaboration between the CEO and CIO helps mitigate these risks by ensuring a robust risk management framework is in place, including data encryption, access controls and compliance with the evolving regulatory landscape. Other areas ripe for collaboration include the establishment of ethical guidelines and changes to workflows and job roles.

Many organizations also may not have fully considered the long-term costs, which is crucial for informed decision-making. Understanding and managing the costs of AI adoption is vital, and organizations must establish comprehensive total cost of ownership models, with cross-functional governance and standard procedures to track and manage success.

Working together, the CEO and CIO can push forward these changes faster and more effectively. Findings from the 2024 KPMG Global Technology survey showed that 80% of C-suite technology leaders say senior leadership’s risk aversion makes them slower than competitors to embrace new technology. This makes it crucial to have a strong change-management strategy in place so that adoption is not only smooth but also faster and more efficient. The CEO should communicate the benefits of GenAI to employees, emphasizing how it can enhance productivity and create new opportunities, while the CIO can work with other leaders to embed the technology into existing workflows and reduce barriers to adoption.

The CEO/CIO dynamic plays a pivotal role in the successful implementation of GenAI in organizations. Collaboration, shared vision, effective decision-making, risk and change management, and measuring success need to be top of mind. By working together, CEOs and CIOs can fully leverage the benefits of GenAI to drive innovation, improve efficiency, and stay competitive in the market.


IT Hiring in 2025: Cloudy With a Chance of High Salaries

A wave of IT job losses doesn’t make for bleak reading in 2025 — quite the contrary, as the reality of AI kicks in and accelerates demand for IT pros.

In 2024, the number of unemployed IT workers reached its highest point since the dot-com collapse of the early 2000s. Layoffs coincided with peak pessimism that generative AI would take developers’ jobs, as enthusiasm for LLMs surged and the C-suite bought into automation. But don’t misread the signs: Many tech layoffs hit business staff rather than frontline tech staff at companies repositioning for AI, cloud, and cybersecurity.

While 2025 promises uncertainty, and businesses should expect the unexpected, two things will remain constant: demand for digital and the tech skills shortage. As a result, professionals with the right skills will command large and growing salaries. The only question is, which skills?

Foundational Approach

Recruitment specialist Harvey Nash capped a gloomy 2024 with some chilling data, forecasting that growth in recruitment will be at its lowest level since 2011. But there’s a silver lining to these findings: recruitment is still happening. Analyst firm IDC predicts hiring will vary by sector, and that recruitment in the UK will bounce back, with nearly half of IT and tech hiring managers planning to increase headcounts.

Digitalization is real, and CIOs and CTOs need skills and experience in AI, cloud and cybersecurity to deliver. After experimenting with AI, the focus for 2025 will be delivery: large-scale, day-to-day production to win and retain customers. And IT pros can expect sharp growing pains, as AI has proved difficult to deliver outside pilots or limited deployments. For all the recent AI successes, just as many systems failed to deliver as expected, produced inaccurate and unreliable returns, or introduced risk.
This opens up opportunities for those with the skills and experience to design, build, and train models, and for those capable of taking systems from pilot to production.

But AI has broad applications, so what skills should IT pros be homing in on? According to one industry-backed report, “foundational” skills in AI literacy, especially in data analysis and prompt engineering, will be key. The hiring data and trends from Andela’s talent marketplace align with this report, showing that “generalists” are in huge demand. As such, it is important to build a solid grounding before plunging into the AI jobs market. A good example is Python.

Getting ahead in a foundational technology such as Python means technologists can apply the mechanics of the possible to solve problems at a practical level. Python predates the current AI boom, so it has something akin to universal applicability in the world of programming. But AI and data mining have taken it to a whole new level, with libraries like PyTorch, TensorFlow and LangChain building the foundation for AI work. Its ease of use and the growing set of libraries have seen the language rated most popular during the past year.

Six of the Best

But tech skill demand isn’t limited to AI. Hidden in Andela’s marketplace data were revealing insights on the roles being sought by hiring managers — and the salaries on offer. Our survey of more than 150,000 individuals identified the six highest-paid technology posts, among them:

Principal software engineer

Lead software engineer

Senior back-end engineer (Java)

We have seen clients pay $144,000 for a technical architect, making this the highest-paid position going into 2025. Interest in digital will mean salaries for those with foundational skills will remain steady or even increase. Behind these skills, however, lies a set of deeper capabilities sought by teams hiring global talent.
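As a taste of the “foundational” data-analysis skills the report highlights, a few lines of standard-library Python already cover basic summary statistics. The salary figures below are made up for the example, not drawn from the survey.

```python
from statistics import mean, median

# Hypothetical annual salaries (USD) for a handful of offers.
salaries = [144_000, 132_500, 128_000, 121_000, 118_500]

print(f"mean:   {mean(salaries):,.0f}")    # mean:   128,800
print(f"median: {median(salaries):,.0f}")  # median: 128,000
```

The same ease of expression is what libraries like PyTorch and TensorFlow build on at much larger scale.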
Take a principal software engineer. Recruiters are looking for experience with Java, Ruby on Rails, Python or Golang, knowledge of the three main cloud providers’ platforms, a firm grasp of containerization, and expertise in microservices and CI/CD. Senior back-end engineers should have these skills plus expertise in design patterns, data structures and algorithms, and unit testing.

And technical architect — one of the best-paid jobs this year? Familiarity with cloud computing technologies and providers’ platforms, an understanding of how CRM systems operate in the cloud, and a solid understanding of cybersecurity principles are prerequisites.

AI and business uncertainty are influencing hiring — just not for the worse. For anyone changing jobs in 2025, the advice is simple: Stay up to speed on new technologies while remaining well-grounded in foundational skills, so employers can build the talent needed to get ahead on digital — and you can land the salary you want.


Apple’s $500 Billion AI Investment to Create 20,000 Tech Jobs

Apple on Monday announced a plan to spend $500 billion to bolster its artificial intelligence ambitions that will add 20,000 research and development jobs in the US over the next four years. The plan will include the expansion of data center facilities in Michigan, Texas, California, Arizona, Nevada, Iowa, Oregon, North Carolina, and Washington. The company, with the help of Taiwan’s Foxconn, will build a 250,000-square-foot facility in Houston, Texas, to manufacture AI servers to support Apple Intelligence.

US President Donald Trump sought to claim the announcement as a boost to his administration, which in recent days saw falling approval ratings after a whirlwind start to his second term that included thousands of federal government firings. Trump met with Apple CEO Tim Cook last week and in social media posts touted Monday’s announcement as a vote of confidence in his administration. During an event with state governors in Washington, D.C., last week, Trump said Apple’s investment was proof that his tariff efforts are paying off. Apple manufactures many of its products in China and faces new 10% tariffs on those goods.

“[Apple] stopped two plants in Mexico that were starting construction,” Trump said. “They just stopped them — they’re going to build them here instead, because they don’t want to pay the tariffs. Tariffs are amazing.”

Despite Trump’s assertions, Apple did not state whether the proposed tariffs factored into its plans. It’s also unclear what “plants” Trump was referring to, as Apple has not announced specific plans to build in Mexico. Reports say Foxconn, which produces iPhones for Apple in China and India, is planning to build a factory in Mexico in partnership with Nvidia.

In 2021, during the Biden administration, Apple made a $430 billion commitment to creating 20,000 new jobs across the country over five years.
But its plan to build a new campus in Research Triangle Park in North Carolina was paused in 2024. And during the first Trump administration, Apple announced a $350 billion, five-year spending plan. Apple has not publicly disclosed how much of those previous commitments were fulfilled.

Cook said Apple is committed to boosting domestic manufacturing. “We are bullish on the future of American innovation and we’re proud to build on our long-standing US investments with this $500 billion commitment to our country’s future,” Cook said in a statement. He said the company would double its Advanced Manufacturing Fund, which invests in training for high-skilled manufacturing.

Apple said the 20,000 jobs will add to the 2.9 million jobs the company already supports throughout the country through direct employment, work with US-based suppliers and manufacturers, and developer jobs. The new positions will focus on research and development, software development, silicon engineering, and AI and machine learning advancements.

“This is a welcome sign as Apple steps up to design its manufacturing infrastructure for the intelligent age,” Boston University Questrom School of Business professor emeritus Venkat Venkatraman wrote in a post on LinkedIn. “Could this help Apple get into a broader set of digital products? Possibly. It also signals a major geographical realignment of its global footprint (and political realities)!”

Plan Details

The announced Texas manufacturing facility is slated to open in 2026 and will produce AI servers previously manufactured outside of the US. The company’s US Advanced Manufacturing Fund, which was created in 2017 to spur high-skilled manufacturing training and support innovation, will increase from $5 billion to $10 billion. The expanded effort includes a multibillion-dollar commitment from Apple to produce advanced silicon at TSMC’s Arizona plant.
In Detroit, the company will launch the Apple Manufacturing Academy to offer free in-person and online courses teaching project management, manufacturing process optimization, and other smart manufacturing techniques.


Is a Small Language Model Better Than an LLM for You?

While it’s tempting to brush aside seemingly minimal AI model token costs, that’s only one line item in the total cost of ownership (TCO) calculation. Still, managing model costs is the right place to start in getting control over the end sum. Choosing the right-sized model for a given task is the imperative first step. But it’s also important to remember that when it comes to AI models, bigger is not always better and smaller is not always smarter.

“Small language models (SLMs) and large language models (LLMs) are both AI-based models, but they serve different purposes,” says Atalia Horenshtien, head of the data and AI practice in North America at Customertimes, a digital consultancy firm. “SLMs are compact models, efficient, and tailored for specific tasks and domains. LLMs are massive models, require significant resources, shine in more complex scenarios and fit general and versatile cases,” Horenshtien adds.

While it makes sense in terms of performance to choose the right size model for the job, some would argue model size isn’t much of a cost argument, even though large models cost more than smaller ones. “Focusing on the price of using an LLM seems a bit misguided. If it is for internal use within a company, the cost usually is less than 1% of what you pay your employees. OpenAI, for example, charges $60 per month for an Enterprise GPT license for an employee if you sign up for a few hundred. Most white-collar employees are paid more than 100x that, and even more as fully loaded costs,” says Kaj van de Loo, CPTO, CTO, and chief innovation officer at UserTesting.

Instead, this argument goes, the cost should be viewed in a different light. “Do you think using an LLM will make the employee more than 1% more productive? I do, in every case I have come across. It [focusing on the price] is like trying to make a business case for using email or video conferencing.
It is not worth the time,” van de Loo adds.

Size Matters, but Maybe Not as You Expect

On the surface, arguing about model sizes seems a bit like splitting hairs. After all, a small language model is still typically large. An SLM is generally defined as having fewer than 10 billion parameters, but that leaves a lot of leeway: sometimes an SLM can have only a few thousand parameters, although most people will define an SLM as having between 1 billion and 10 billion parameters. As a point of reference, medium language models (MLMs) are generally defined as having between 10 billion and 100 billion parameters, while large language models have more than 100 billion parameters. Sometimes MLMs are lumped into the LLM category too, because what’s a few extra billion parameters, really? Suffice it to say, they’re all big, with some being bigger than others.

In case you’re wondering, parameters are internal variables or learning control settings. They enable models to learn, but adding more of them adds more complexity too.

“Borrowing from hardware terminology, an LLM is like a system’s general-purpose CPU, while SLMs often resemble ASICs — application-specific chips optimized for specific tasks,” says Eran Yahav, an associate professor in the computer science department at the Technion – Israel Institute of Technology and a distinguished expert in AI and software development. Yahav has a research background in static program analysis, program synthesis, and program verification from his roles at IBM Research and Technion. Currently, he is CTO and co-founder of Tabnine, an AI coding assistant for software developers.

To reduce issues and level up the advantages of both large and small models, many companies do not choose one size over the other. “In practice, systems leverage both: SLMs excel in cost, latency, and accuracy for specific tasks, while LLMs ensure versatility and adaptability,” adds Yahav.
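The “use both” pattern can be illustrated with a toy router that sends narrow, well-defined tasks to a cheap SLM and everything else to an LLM. The model sizes, per-token prices, and task categories below are invented for the sketch, not drawn from any vendor’s actual pricing.

```python
# Hypothetical model catalog: a 7B-parameter SLM and a 175B-parameter LLM.
MODELS = {
    "slm": {"params_b": 7,   "cost_per_1k_tokens": 0.0002},
    "llm": {"params_b": 175, "cost_per_1k_tokens": 0.0100},
}

# Tasks narrow enough (in this sketch) to trust to the small model.
NARROW_TASKS = {"classification", "extraction", "routing", "summarization"}

def pick_model(task: str) -> str:
    """Prefer the SLM for well-defined tasks; fall back to the LLM."""
    return "slm" if task in NARROW_TASKS else "llm"

def estimated_cost(task: str, tokens: int) -> float:
    """Estimated dollar cost of running `tokens` through the chosen model."""
    return tokens / 1000 * MODELS[pick_model(task)]["cost_per_1k_tokens"]

print(pick_model("extraction"))             # slm
print(pick_model("open_ended_reasoning"))   # llm
```

At these made-up rates, routing a 10,000-token extraction job to the SLM costs $0.002 versus $0.10 on the LLM — a 50x difference, which is why production systems often put such a dispatcher in front of their models.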
As a general rule, the main differences between model sizes pertain to performance, use cases, and resource consumption levels. But creative use of any sized model can easily smudge the line between them. “SLMs are faster and cheaper, making them appealing for specific, well-defined use cases. They can, however, be fine-tuned to outperform LLMs and used to build an agentic workflow, which brings together several different ‘agents’ — each of which is a model — to accomplish a task. Each model has a narrow task, but collectively they can outperform an LLM,” explains Mark Lawyer, RWS‘ president of regulated industries and linguistic AI.

There’s a caveat in defining SLMs versus LLMs in terms of task-specific performance, too. “The distinction between large and small models isn’t clearly defined yet,” says Roman Eloshvili, founder and CEO of XData Group, a B2B software development company that exclusively serves banks. “You could say that many SLMs from major players are essentially simplified versions of LLMs, just less powerful due to having fewer parameters. And they are not always designed exclusively for narrow tasks, either.”

The ongoing evolution of generative AI is also muddying the issue. “Advancements in generative AI have been so rapid that models classified as SLMs today were considered LLMs just a year ago. Interestingly, many modern LLMs leverage a mixture-of-experts architecture, where smaller specialized language models handle specific tasks or domains. This means that behind the scenes, SLMs often play a critical role in powering the functionality of LLMs,” says Rogers Jeffrey Leo John, co-founder and CTO of DataChat, a no-code generative AI platform for instant analytics.

In for a Penny, in for a Pound

SLMs are the clear favorite when the bottom line is the top consideration. They are also the only choice when a small form factor comes into play.
“Since SLMs are smaller, their inference cycle is faster. They also require less compute, and they’re likely your only option if you need to run the model on an
