Dell Technologies World 2022 Recap: Liqid Demonstrated Composable Memory!

Posted on May 17, 2022

More than 9,000 tech-hungry individuals, including Team Liqid, braved the Vegas heat to attend Dell Technologies World 2022 in what felt like the first “back-to-normal” event since 2019. Attendees were ready to engage, and Liqid had a "ThinkTank" filled with news to share!

At the event, Liqid announced that our company is the first CDI provider to demonstrate composable memory via the emerging Compute Express Link™ (CXL™) 2.0 protocol, in collaboration with Samsung and Tanzanite Silicon Solutions.

The final element left to compose is DRAM. Leveraging the PCIe Gen 5.0 physical layer, CXL decouples DRAM from the CPU, delivering high-speed connections between CPUs and external memory for the first time. With native support for CXL, Liqid Matrix™ composable software can compose memory alongside storage and accelerators such as GPUs. This is a very big deal for organizations that want to address memory bottlenecks and improve the utilization of these expensive resources.
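One concrete way to see what "decoupling DRAM from the CPU" means in practice: on Linux, CXL-attached memory is typically exposed to the operating system as a CPU-less (memory-only) NUMA node. The following is a minimal sketch, assuming a Linux host; whether any memory-only nodes appear depends entirely on the platform, and no CXL hardware is implied by the code itself:

```shell
# Sketch: scan the NUMA topology for memory-only nodes.
# On CXL-capable systems, a CXL memory expander usually shows up
# as a NUMA node with an empty cpulist (no CPUs attached).
for node in /sys/devices/system/node/node*; do
    [ -e "$node" ] || continue                      # no NUMA sysfs entries at all
    if [ -z "$(cat "$node/cpulist" 2>/dev/null)" ]; then
        echo "memory-only node (candidate CXL memory): $(basename "$node")"
    fi
done
```

Once such a node is identified, standard tooling like `numactl --membind=<node>` can steer a workload's allocations onto it, without the application needing any CXL-specific changes.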

Liqid CEO Sumit Puri packs the house for his Dell World Chat session.

Tanzanite Chief Product Officer Chandra Joshi was on hand at our booth to demonstrate composable memory and educate attendees on its benefits. Liqid’s own product leader, Ben Bolles, articulated the benefits of composable memory via CDI: “With the breakthrough performance provided by CXL, the industry will be better positioned to support and make sense of the massive wave of AI innovation predicted over just the next few years."

“We’re excited to collaborate with Samsung and Tanzanite to illustrate the power of this new protocol,” Ben said. “As our demonstration illustrates, by decoupling DRAM from the CPU, CXL enables us to achieve these milestone results in performance, infrastructure flexibility, and more sustainable resource efficiency, preparing organizations to rise to the architectural challenges that industries face as AI evolves at the speed of data.”

So, just how important are these breakthroughs for the ever-growing use of AI? Well, Gartner has provided some interesting insight.

Over the past two years, AI use has practically tripled, and by 2025 AI will be the top driver of infrastructure decision-making as the market matures. This will create an even larger strain on data centers, resulting in 10x growth in compute requirements over that same period.[1]

Considering this increased strain, the adoption of CXL technology will help meet the exploding demand for compute performance, efficiency, and sustainability that traditional architectures cannot satisfy. It allows previously static DRAM resources to be shared, delivering significantly higher performance, reduced software-stack complexity, and lower overall system cost.

Liqid is also a contributing member of the Compute Express Link™ Consortium.

In addition to the well-received demo, Liqid also exclusively showcased Liqid ThinkTank, the first fully integrated CDI system for the world’s most challenging GPU-centric workloads. For those needing to simplify their organizations’ adoption of AI/ML, HPC, edge computing, and other performance-intensive applications, the GPU-centric solution offers turnkey simplicity, industry-leading performance, impressive TCO savings, and meaningful reductions in carbon emissions.

Luke Quick, Vice President of OEM Sales at Liqid, discussed with attendees why Liqid ThinkTank is the right solution for these new workloads. “As organizations seek to incorporate AI+ML as part of their overall digital strategy and operationalize it day-to-day, IT teams have been quick to realize that traditional infrastructures often lack the flexibility to consistently support proof-of-concept deployments, much less scale to align with an organization’s vision for AI in the enterprise,” said Luke.

“Fortunately, organizations that wish to accelerate time-to-value for their AI+ML programs can now hit the ground running with Liqid ThinkTank. Our newest, adaptive CDI solution is built for GPU-intensive computing to significantly accelerate time-to-value for AI-driven business solutions, while serving as a powerful backbone in a greater AI ecosystem that is well positioned to organically adapt to tomorrow’s challenges.”

In addition, with 2-3x improvements in data center utilization rates compared to traditional static architectures, Liqid ThinkTank and other solutions based on Liqid Matrix CDI software can serve as a fundamental element of a more sustainable data center ecosystem, with the potential to significantly reduce energy consumption and save millions of gallons of water and tons of global CO2 emissions per year.

Thank you once again to everyone who was able to attend Dell Technologies World 2022! We already cannot wait to attend next year’s conference!

[1] Source: Gartner, Predicts 2021: Operational AI and Enabling AI Orchestration Platforms, December 2020

