KIOXIA Announces Next Gen XD8 E1.S Data Center SSDs with up to 15.5GB/s Speeds


KIOXIA this morning announced its newest XD8 Series data center PCIe 5.0 (32GT/s x4) SSDs, which deliver up to 73% higher performance than the previous generation along with greater scalability and enhanced efficiency. The XD8 is available in 1.92, 3.84, and 7.68TB capacities, and its EDSFF E1.S form factor supports 9.5, 15, and 25mm heat sink sizes.


The XD8 carries performance specifications of up to 12.5GB/s sequential read, 5.8GB/s sequential write, and up to 2,300K IOPS. It is KIOXIA's third-generation drive in this line, is intended for cloud and hyperscale data center environments, is NVMe 2.0 compliant, and supports the Open Compute Project Data Center SSD 2.5 specifications.


“The KIOXIA XD8 Series is engineered to deliver superior PCIe® 5.0 performance over previous generation SSDs and optimize thermal management, addressing the needs of OCP hyperscale environments. As an active member of the OCP community, KIOXIA is committed to collaborating with leading server and storage system developers to harness the full potential of flash memory, NVMe™, and PCIe technologies,” states Neville Ichhaporia, Senior Vice President and General Manager of the SSD Business Unit, KIOXIA America, Inc.


The KIOXIA XD8 is built with a KIOXIA-designed SSD controller, firmware, and BiCS FLASH 3D flash memory, and features full end-to-end encryption and power-loss protection, with non-SED and TCG Opal SSC SED options. Available in capacities of 1.92, 3.84, and 7.68 terabytes (TB), KIOXIA XD8 Series evaluation drives are now sampling to select customers. For more information, please visit www.kioxia.com.

