- Innovation: New full suite of leading-edge AI solutions based on the NVIDIA GB200 Grace Blackwell Superchip, through a strengthened partnership with Wistron and ecosystem partners
- Sustainability and energy efficiency: Embracing dielectric direct-to-chip liquid cooling with up to 2.5 kW TDP
SAN JOSE, Calif. and TAIPEI, Oct. 14, 2024 /PRNewswire/ — Wiwynn (TWSE:6669), a leading cloud IT infrastructure provider for hyperscale data centers, is unveiling a full suite of AI data center solutions and state-of-the-art liquid cooling technologies at the upcoming Open Compute Project (OCP) Global Summit 2024, October 15-17 at the San Jose Convention Center (Booth #B11). For more information, visit the Wiwynn OCP 2024 page.
“The high power demands of AI are pushing the limits of data centers, and as the technology grows more complex, it’s essential to enhance both performance and sustainability,” noted William Lin, President of Wiwynn. “We’re excited to showcase Wiwynn’s complete AI and advanced cooling solutions at OCP Global Summit 2024. This year, we’re demonstrating how our technology creates unprecedented performance and efficiency to empower data centers worldwide and unlock new possibilities in the AI era.”
AI solutions harnessing NVIDIA accelerated computing
To drive innovation in data centers, Wiwynn has strengthened its cooperation with Wistron Corporation, delivering complete AI acceleration platforms leveraging state-of-the-art chips — including the NVIDIA GB200 Grace Blackwell Superchip.
NVIDIA GB200 NVL72 Rack Solution:
- One of the first NVIDIA GB200 NVL72 platform-based solutions available on the market.
- Supercharges training and inference for trillion-parameter-scale AI models while providing 25x lower total cost of ownership (TCO) compared with the previous generation of GPUs.
- The liquid-cooled rack connects 72 NVIDIA Blackwell GPUs and 36 NVIDIA Grace CPUs through fifth-generation NVIDIA NVLink™ and NVLink Switch technologies, enabling powerful AI acceleration in a single rack.
Wiwynn is also offering updated AI solutions based on the NVIDIA HGX™ and MGX™ platforms, including:
- GS1400A: A 4U NVIDIA MGX™ server that leverages eight NVIDIA H200 Tensor Core GPUs interconnected with NVLink and NVSwitch, bringing accelerated computing to any data center through modular server designs.
- GS1300N: A compact 3U server that can be equipped with eight NVIDIA Hopper or Blackwell architecture GPUs and achieves over 90% heat dissipation with Wiwynn's direct liquid cooling (DLC) technology.
State-of-the-art liquid cooling
At the Summit, Wiwynn will introduce:
- Leading-edge two-phase liquid cooling technologies that push chip thermal limits:
- Through a partnership with ZutaCore®, a leading provider of DLC and waterless liquid cooling solutions, Wiwynn is pushing the single-chip power limit with support for up to 2.5 kW thermal design power (TDP). A deep dive will be presented at the OCP Future Technologies Symposium.
- Demonstrating a validated two-phase immersion cooling solution that achieves 2.8 kW TDP, meeting future demands.
- Open IP SuperFluid Cooling Technology: Partnering with Intel, Wiwynn is addressing a major roadblock for DLC by replacing water with a novel dielectric fluid, protecting electrical circuits from damage and loss caused by water leakage and reducing the risk of data center outages, while achieving over 1.5 kW TDP cooling capacity with a single-phase solution.
For rapidly diversifying data centers, Wiwynn is focused on developing the cooling technologies of the future and delivering optimized solutions.
Get more information on Wiwynn’s Facebook, LinkedIn, and website www.wiwynn.com.
About Wiwynn
Wiwynn is an innovative cloud IT infrastructure provider of high-quality computing and storage products, plus rack solutions, for leading data centers. We are committed to the vision of "unleash the power of digitalization; ignite the innovation of sustainability." The Company aggressively invests in next-generation technologies to provide the best TCO and workload- and energy-optimized IT solutions from cloud to edge.
SOURCE Wiwynn