DataPelago has created a new engine called Nucleus that dramatically speeds up data processing for AI and analytics. It outperforms Nvidia's cuDF library by large margins while working across different types of hardware. Today's GPUs are powerful, but older software often wastes their potential, making faster tools like Nucleus especially valuable. This shift could have dramatic implications for Nvidia.

For years, enterprises have leaned on GPUs (graphics processing units) to handle ever-growing mountains of data, leveraging their ability to run thousands of calculations in parallel for AI and analytics workloads. Every generative AI model, recommendation engine, and analytics dashboard depends on data libraries to prepare, join, and transform massive datasets. Yet the industry faces a quiet challenge: Despite advances in hardware, performance often stalls at scaling limits because the software stack struggles to fully exploit the hardware's capabilities.

Many legacy data libraries were optimized for CPUs, not GPUs. As a result, memory bandwidth and compute throughput often go underutilized, and every time data moves between CPU and GPU, much of the performance advantage evaporates.

To address this, Nvidia launched cuDF in 2018 as part of its open-source RAPIDS suite, a GPU-accelerated DataFrame library that quickly became the gold standard for data operations. It delivered speedups over CPU-based libraries and better utilization of GPU hardware. But cuDF also has limits. It requires an Nvidia GPU with ample memory and CUDA support, ruling out environments without compatible hardware. In many ways, cuDF became the industry's ceiling: powerful enough to accelerate AI and analytics pipelines, yet constrained by the quirks of GPU architecture itself.

Now, California-based data startup DataPelago says it has surpassed those limits with its universal data processing engine, Nucleus. Built atop Nvidia hardware, Nucleus reportedly delivers performance gains so steep they could reset the economics of GPU acceleration. In a benchmark test, Nucleus outpaced cuDF by 38.6 times on hash joins, eight times on sorts, and 10 times on filters and projections.

"To fully realize the benefits of GPUs, data processing engines need to fully leverage the hardware's strengths while compensating for its limitations," says DataPelago CEO Rajan Goyal, something he argues demands fresh algorithms built for data workloads.

The implications go far beyond engineering bragging rights. Cloud GPUs are expensive, and enterprises face pressure to maximize every compute cycle. Faster data processing means lower cloud bills and quicker time-to-insight.

Goyal says Nucleus is designed to run on any hardware and handle any type of data, while integrating with existing frameworks without requiring changes to customer applications. "We slot into the existing environments that developers are already working in," he adds.

Empowering Enterprise AI with a Hardware-Neutral Approach

DataPelago's benchmark test ran on standard public-cloud servers with both entry-level Tesla T4 and high-end H100 GPUs. The test mimicked real-world tasks: moving data from CPU to GPU, processing it, and returning results to the host. Using the same dataset and harness, Nucleus was compared head-to-head with cuDF on core AI and analytics operations.

"We wanted to improve the performance ceiling for GPUs, and the only way to do that credibly was to compare ourselves directly to cuDF," says Goyal.
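DataPelago has not published its benchmark code, but the cuDF side of such a harness can be sketched with the library's public Python API. The sketch below is a minimal, illustrative micro-benchmark of a hash join that, like the test described above, times the host-to-GPU transfer, the GPU processing step, and the copy back to the host. The row counts, key range, and column names are assumptions made for the example, not the company's test parameters.

```python
# Illustrative cuDF hash-join micro-benchmark (not DataPelago's harness).
# It measures end-to-end time: CPU -> GPU transfer, GPU join, GPU -> CPU copy-back.
import time

import numpy as np
import pandas as pd
import cudf  # NVIDIA RAPIDS GPU DataFrame library

rows = 5_000_000  # illustrative size
left_host = pd.DataFrame({
    "key": np.random.randint(0, 10_000_000, rows),
    "value_a": np.random.rand(rows),
})
right_host = pd.DataFrame({
    "key": np.random.randint(0, 10_000_000, rows),
    "value_b": np.random.rand(rows),
})

start = time.perf_counter()
left_gpu = cudf.from_pandas(left_host)                     # CPU -> GPU transfer
right_gpu = cudf.from_pandas(right_host)
joined = left_gpu.merge(right_gpu, on="key", how="inner")  # GPU hash join
result_host = joined.to_pandas()                           # GPU -> CPU copy-back
elapsed = time.perf_counter() - start

print(f"End-to-end hash join over {rows:,} rows: {elapsed:.3f}s, "
      f"{len(result_host):,} output rows")
```

A head-to-head comparison like the one DataPelago describes would swap a different engine into the same harness while keeping the dataset, the transfers, and the timing boundaries identical.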
He notes that accelerating data prep by an order of magnitude gives businesses the capacity to process exponentially more information for AI training and retrieval tasks, keeping systems up-to-date.

The engine achieved these results by redesigning its execution layer to handle complex workloads, including kernel fusion, native multi-column support, and optimized handling of variable-length data such as strings (a simplified sketch of kernel fusion appears at the end of this article). Interestingly, while Nucleus runs on Nvidia's CUDA framework and GPUs, it delivers higher performance, essentially out-engineering Nvidia on its own tech stack.

DataPelago president JG Chirapurath says Nucleus delivers far greater performance from existing hardware investments and stresses that enterprises prefer solutions that build on what they already have rather than forcing a rip-and-replace.

Goyal argues that cuDF is tightly coupled to Nvidia's GPU ecosystem, creating vendor lock-in and limiting hardware flexibility. This dependence restricts open innovation and ties enterprises to Nvidia's roadmap. Nucleus, by contrast, is designed to work across any hardware (not only GPUs), while also handling any type of data and supporting any query engine. "It is designed to lift the performance ceiling of any hardware," claims Goyal.

The engine also includes built-in intelligence that automatically maps data operations to the most suitable hardware and dynamically reconfigures tasks to maximize performance.

Software Over Silicon: The Emerging Battle for Enterprise AI Efficiency

If DataPelago's approach takes hold, enterprises may begin prioritizing universality and efficiency over single-vendor ecosystems when building AI infrastructure. Still, analysts caution that benchmark results often look stronger in controlled tests than in real-world production, and risks remain if the hardware landscape evolves quickly.

"The offering will appeal to those looking to avoid vendor lock-in," says Alvin Nguyen, senior analyst at Forrester. "But with tools like AMD's CUDA translation for its data center GPUs, the real advantage is if you're also targeting CPUs and field-programmable gate arrays (FPGAs). There is a large population of developers experienced with Nvidia's ecosystem, so moving away from Nvidia now means a bigger short-term investment in other options."

Nguyen also notes that progress on transformer-based workloads, such as training large foundational models, is slowing compared with prior years. As inferencing begins to outpace training, raw GPU horsepower is no longer the main driver, and a more balanced view that includes the software layer is a smart way to look at things.

Still, investors are buying in. DataPelago has raised $47 million in seed and Series A funding from Eclipse, Qualcomm Ventures, and Taiwania Capital, and recently hired industry veteran JG Chirapurath as president. CEO Goyal himself worked at Cisco and Oracle before founding the company.

For years, the AI industry has fixated on chip shortages and the race for ever-more-powerful GPUs. Nucleus points instead to a different kind of competition. If the biggest performance gains now come from software rather than hardware, the battleground could shift from chip foundries to algorithmic innovation. The future of AI infrastructure may depend less on building bigger chips and more on rethinking how we harness the ones we already have.

"Hardware neutrality is strategic differentiation, not just technical capability. Enterprises want infrastructure investments that remain valuable as technology evolves," says Chirapurath.
"My long-term vision is positioning DataPelago as the universal data processing foundation that accelerates the next decade of AI and analytics innovation. We're making previously impossible applications economically feasible."
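Kernel fusion, one of the techniques the article credits for these gains, is described only at a high level. As a generic illustration of the idea, and not DataPelago's implementation, the sketch below uses CuPy's fuse decorator to compile several elementwise GPU operations into a single kernel, avoiding the temporary arrays and extra kernel launches of the unfused version. The function names and constants are invented for the example.

```python
# Generic illustration of kernel fusion on a GPU using CuPy's @fuse decorator.
# This is not DataPelago's code; it only shows why fusing helps: the fused
# version runs one kernel and allocates no intermediate arrays.
import cupy as cp


def normalize_unfused(x, mean, std, scale):
    # Each arithmetic step launches its own kernel and materializes a temporary.
    centered = x - mean
    standardized = centered / std
    return standardized * scale


@cp.fuse()
def normalize_fused(x, mean, std, scale):
    # The same expression compiled into a single elementwise kernel.
    return (x - mean) / std * scale


x = cp.random.rand(50_000_000, dtype=cp.float32)
out_a = normalize_unfused(x, 0.5, 0.29, 100.0)
out_b = normalize_fused(x, 0.5, 0.29, 100.0)
assert cp.allclose(out_a, out_b)  # identical results, fewer kernel launches
```

In a data engine the same principle applies at larger scale: fusing filters, projections, and expression evaluation into a single pass over the data cuts memory traffic, which is often the real bottleneck on a GPU.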
Let's be honest: Fear is everywhere inside organizations right now. You can feel it in how people talk about headcount, or don't. You see it in 10-slide decks justifying one decision. You sense it when smart teams stop raising bold ideas and start hedging every word.

I've been in innovation and design for nearly 20 years, working with Fortune 100s through recessions, crises, and COVID-19. I've never seen fear take up as much space as it does now.

It's no wonder we are here, though. Headlines are dominated by mass layoffs, inflation, AI threats, climate disasters, and geopolitical instability. The pressure to do more with less only seems to grow inside organizations. Where we once saw enthusiastic collaboration and time for ideation, we now hear repeated reminders that "this project is critical" and "the product must launch ahead of schedule." Remember camaraderie, laughter, and the room to innovate? They have all been squeezed out in the name of efficiency.

It's not that managers are clueless about the stressors facing their direct reports. Yet pressure from the board, the market, and the team itself creates a vicious cycle of fear that ultimately leads to a scarcity mindset. Time, money, and opportunity suddenly feel limited. As a result, people stop thinking strategically, long-term visions are discarded, and innovation grinds to a halt.

And yet, I know this from the dozens of companies I've worked with on innovation: Teams can still move forward, even amid great fear. To succeed, they need leaders who can stay grounded, know when to shift gears, and create the conditions for progress despite immense uncertainty. Here are five ways to guide your team out of fear and toward innovation and long-term business growth.

1. Use empathy to shift your culture

Leaders set the tone for their teams, and how they show empathy is a huge determinant of a team's success. Prioritizing psychological safety and addressing issues transparently can cultivate a thriving and resilient team.

I recently led a team through a project with fear at the center: a looming launch date, mandated cross-team collaboration (that wasn't always so collaborative), and constant reminders of how critical the work was. Our team used simple tactics to make sure voices were heard and to bring hope. We set up onboarding calls with every team and created a survey to understand what was on their minds and how they were approaching their work. Providing a space for their voices to be heard shifted their thinking from fear to opportunity.

Another way you can help teams shift their culture is by creating metrics around empathy. Pulse-check your teams regularly. Build in skip-level 1:1s. Prioritize open communication over perfect communication. In times of turmoil, you cannot over-communicate.

Anchoring decision-making in long-term aspirations, not just short-term survival, also helps teams remember that every situation is temporary. Painting a picture of the vision can reshape the entire trajectory of a project: it acknowledges what feels challenging now while showing a future the team can build toward.

2. Build an environment where truth-telling is rewarded

Fear loves to shut down expansive thinking. Studies as far back as the 1950s, including Solomon Asch's conformity experiments and Elisabeth Noelle-Neumann's spiral of silence theory, illustrate how fear can stifle expansive thinking and encourage conformity. Yet the last thing we need right now is conformity.
Executives need truth-tellers: people who will stand up, say no, and give the real talk that's needed. One executive always asked me to visit when I was in town. It wasn't because I'd say everything was perfect. Instead, I would tell the truth, ask hard questions, and challenge their opinion if it wasn't rooted in sound insight. This kind of relationship goes beyond fear and centers on mutual respect and trust.

As a leader, you don't necessarily need to empower the loudest naysayer in the room, but it's time to find and cultivate truth-tellers. While debate may seem risky in fearful times, leaders need ideas shared widely, not hoarded. For example, you might require teams to bring three hard questions to every share-out.

If truth-telling feels impossible with current team members, consider expanding the office of the CEO into roles such as a chief of staff, a position that saw a nearly 1,934% increase from 2019 to 2022 alone. This role exists to help the CEO uncover what's really happening. There are experts who can evaluate and support the creation of these roles, such as Nova Chief of Staff or Ask a COS.

3. Encourage deep failure

In tech innovation, you've heard Mark Zuckerberg's "Move fast and break things" or read The Lean Startup, which popularized the MVP. Those approaches are right, in theory. However, in corporate America, I rarely see them play out as intended because failure still feels off-limits. Yet deep failure, done right, can be a golden ticket.

Start by running small, agile experiments to gather insights, keeping the end user at the center every step of the way. When failure is reframed as data, not defeat, teams become more creative and solutions improve faster. The Spotify Model treats failure as fuel by being grounded in autonomy, innovation, and continuous learning.

This mindset shift also applies to AI. Too many teams are still blocking generative tools out of fear. Instead, start mandating the use of AI to power rapid experimentation. While enterprise tools like Microsoft Copilot offer a safe starting point, the real value comes from going further. Right-sized, rapid AI tool creation within teams allows them to customize solutions for their own challenges.

4. Stop glamorizing a good story

The best final presentations focus less on telling a good story and more on strategy, substance, and outcomes. Teams shouldn't spend hours editing out words the CEO doesn't like or redesigning an infographic 10 times because it just doesn't feel right. That's just fear in disguise, and it pulls focus away from what matters: whether the idea is right, not whether it's perfectly packaged.

One simple fix? Standardize the format. Require every team to deliver their final presentation as a one-page memo. Everyone knows Amazon's approach; they've been onto something for years. When everyone operates from the same format, it levels the playing field and puts the thinking, not the theatrics, at the center.

We also must let go of outdated ideas about what a good leader looks or sounds like. I've been told throughout my career that I have "phenomenal executive presence." For a while, I wore that as a badge of honor. I now realize we should call it what it is: a coded way of valuing performance over substance. It's time to stop prioritizing polish and instead focus on outcomes. What matters isn't how someone says it but what they're actually saying and whether it moves the business forward.
5. Consider your leadership legacy

The phrase often attributed to Maya Angelou, "People won't remember what you accomplished, but how you made them feel," is usually reserved for personal relationships. But why should it be any different for an executive?

A retired Fortune 50 CEO once shared my article on LinkedIn. I reached out to thank her and asked to meet. Ahead of our conversation, I watched her interviews and saw someone who was consistently real, honest, and unafraid to talk about the personal and professional challenges of leadership. When we met, she was even more impressive in person, grounded in her values, and focused on creating an impact far beyond the boardroom. Her approach inspired me to think more intentionally about the legacy I want to leave behind as a leader.

Legacy starts with shifting the focus from profit alone to impact. As a leader, you can create large-scale, positive change by championing equity, advancing sustainability, or investing in social causes. But figuring out what kind of legacy you want to leave requires reflection. Leadership legacies don't build themselves.

As Peter Drucker famously said, "What gets measured gets managed." The same applies here. Set goals around the kind of legacy you want to leave, and check in with yourself regularly. Ask: What do I want to be remembered for? What will people truly take away from working with me? While these metrics might feel far removed from those set by the board during times of intense pressure, they are just as worthy of time and focus.

Perhaps these leadership tips feel basic, but when fear is driving the agenda, the basics are exactly what we need. Leaders need to remember that people power companies, and people need their leaders to model how to lead through turmoil. Fear divides and creates short-term thinking. Yet empathy, truth-telling, encouraging failure, clear strategy, and legacy thinking can unite teams and drive innovation.
Want more housing market stories from Lance Lambert's ResiClub in your inbox? Subscribe to the ResiClub newsletter.

National home prices rose 0.2% year over year from July 2024 to July 2025, according to the Zillow Home Value Index reading published August 18, a decelerated rate from the +2.8% year-over-year rate from July 2023 to July 2024. And more metro-area housing markets are seeing declines:

> 31 of the nation's 300 largest housing markets (10% of markets) had a falling year-over-year reading from January 2024 to January 2025.
> 42 of the nation's 300 largest housing markets (14%) had a falling year-over-year reading from February 2024 to February 2025.
> 60 of the nation's 300 largest housing markets (20%) had a falling year-over-year reading from March 2024 to March 2025.
> 80 of the nation's 300 largest housing markets (27%) had a falling year-over-year reading from April 2024 to April 2025.
> 96 of the nation's 300 largest housing markets (32%) had a falling year-over-year reading from May 2024 to May 2025.
> 110 of the nation's 300 largest housing markets (36%) had a falling year-over-year reading from June 2024 to June 2025.
> 105 of the nation's 300 largest housing markets (35%) had a falling year-over-year reading from July 2024 to July 2025.

All year, more housing markets have slipped into year-over-year price declines as the supply-demand balance gradually tilts toward buyers in today's affordability-constrained, post-boom environment. But this month, that list of declining markets actually got a little shorter.

Home prices are still climbing in many regions where active inventory remains well below pre-pandemic 2019 levels, such as pockets of the Northeast and Midwest. In contrast, some pockets of states like Arizona, Texas, Florida, Colorado, and Louisiana, where active inventory exceeds pre-pandemic 2019 levels, are seeing modest home price corrections.

Year-over-year home value declines, using the Zillow Home Value Index, are evident in major metros such as Tampa (-6.2%); Austin (-6.0%); Miami (-4.6%); Orlando (-4.3%); Dallas (-3.9%); San Francisco (-3.8%); Phoenix (-3.5%); Jacksonville, Florida (-3.4%); San Antonio (-3.1%); Atlanta (-3.1%); Denver (-2.9%); San Diego (-2.7%); Raleigh, North Carolina (-2.3%); Sacramento (-2.2%); Riverside, California (-2.1%); Houston (-1.9%); San Jose (-1.6%); New Orleans (-1.0%); Charlotte, North Carolina (-0.9%); Los Angeles (-0.8%); Portland, Oregon (-0.8%); Seattle (-0.8%); Memphis (-0.8%); Nashville (-0.2%); and Las Vegas (-0.0%). Click here for an interactive version of the chart below.

Many of the housing markets seeing the most softness, where homebuyers have gained the most leverage, are primarily located in Sunbelt regions, particularly the Gulf Coast and Mountain West. Many of these areas saw major price surges during the Pandemic Housing Boom, with home price growth outpacing local income levels. As pandemic-driven domestic migration slowed and mortgage rates rose, markets like Tampa and Austin faced challenges, relying on local income levels to support frothy home prices.

This softening trend is further compounded by an abundance of new home supply in the Sunbelt. Builders are often willing to lower prices or offer affordability incentives to maintain sales, which also has a cooling effect on the resale market. Some buyers, who previously would have considered existing homes, are now opting for new homes with more favorable homebuilder deals.
!function(){"use strict";window.addEventListener("message",function(a){if(void 0!==a.data["datawrapper-height"]){var e=document.querySelectorAll("iframe");for(var t in a.data["datawrapper-height"])for(var r,i=0;r=e[i];i++)if(r.contentWindow===a.source){var d=a.data["datawrapper-height"][t]+"px";r.style.height=d}}})}(); Given the shift in active housing inventory and months of supply, this softening and regional variation should not surprise ResiClub PRO membersweve been closely documenting it. (ResiClub PRO members can view our latest analysis of home prices across 800-plus metros and more than 3,000 counties here.) Of course, while 105 of the nations 300 largest metro-area housing markets are seeing home price declines, another 195 are still seeing year-over-year home price increases. Where are home prices still up on a year-over-year basis? See the map below: !function(){"use strict";window.addEventListenr("message",function(a){if(void 0!==a.data["datawrapper-height"]){var e=document.querySelectorAll("iframe");for(var t in a.data["datawrapper-height"])for(var r,i=0;r=e[i];i++)if(r.contentWindow===a.source){var d=a.data["datawrapper-height"][t]+"px";r.style.height=d}}})}();