Cambridge-based infrastructure provider DC Intelligence (DCI) plans to use Nebulon smartEdge storage in a range of liquid-cooled rack-mount servers it will offer for high-performance 5G-enabled edge deployments, including manufacturing and rural use cases.
Nebulon storage enables DCI to deliver something that is effectively hyper-converged infrastructure (HCI), but on Supermicro hardware in the form of a liquid-cooled 19-inch rack-mount blade. The liquid cooling allows the server to be deployed at sites that lack the power and cooling equipment of a typical data center, making it suitable for edge requirements, especially in remote and hot locations.
Nebulon provides its Nebulon ON storage control plane, which performs analytics from the Amazon Web Services or Google Cloud Platform public clouds. Onsite media, meanwhile, takes the form of storage processing units (SPUs) that reside in PCIe slots. These offload storage input/output (I/O) processing and management from the server to the Nebulon hardware.
This allows DCI to offer DataQube, effectively a modular data center compute-and-storage node for hyper-converged infrastructure. Its target market is customers who want to be able to expand to edge locations. The company aims its 5G-enabled modular data center hardware at agricultural technology and manufacturing environments.
Nebulon supports specialized applications that depend on speed and performance. The company also says it targets customers who need ease of use, and that its technology allows application owners to provision storage without the intervention of a storage administrator.
The SPU replaces the RAID cards and storage host bus adapters found in servers. Each SPU has two 25 Gigabit Ethernet (GbE) ports that form the data plane for each application cluster, plus one GbE cloud connection.
The SPU connects to the flash storage on each node and emulates the functionality of a local storage controller to the host. This local PCIe device handles only host I/O – application, server and storage metrics are offloaded to the Nebulon ON cloud for prescriptive analytics. Enterprises can build Nebulon clusters, called nPods, that can scale to 32 servers. Data services such as mirroring, snapshots and volumes are configured in the Nebulon ON cloud portal.
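The nPod layout described above can be sketched as a simple data model. This is purely illustrative – the class and attribute names are assumptions for the sake of the example, not part of Nebulon's actual software – but it captures the constraints the article states: one SPU per server, two 25GbE data-plane ports and one GbE cloud uplink per SPU, and a cluster limit of 32 servers.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

MAX_NPOD_SERVERS = 32  # per the article: an nPod scales to 32 servers

@dataclass
class SPU:
    """One storage processing unit: a PCIe card installed in a server."""
    server: str
    data_ports_gbe: Tuple[int, int] = (25, 25)  # two 25GbE data-plane ports
    cloud_port_gbe: int = 1                     # one GbE uplink to Nebulon ON

@dataclass
class NPod:
    """A cluster of SPU-equipped servers (hypothetical model)."""
    name: str
    spus: List[SPU] = field(default_factory=list)

    def add_server(self, server: str) -> SPU:
        if len(self.spus) >= MAX_NPOD_SERVERS:
            raise ValueError("an nPod is limited to 32 servers")
        spu = SPU(server=server)
        self.spus.append(spu)
        return spu

    def data_plane_bandwidth_gbe(self) -> int:
        # Aggregate raw data-plane bandwidth across the cluster.
        return sum(sum(s.data_ports_gbe) for s in self.spus)

# Example: a four-server edge cluster has 4 x 2 x 25GbE = 200GbE
# of raw data-plane bandwidth.
pod = NPod("edge-pod-1")
for i in range(4):
    pod.add_server(f"server-{i}")
print(pod.data_plane_bandwidth_gbe())  # 200
```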
DCI previously used separate compute and storage nodes, but was considering hyper-converged infrastructure as a way to deliver DataQube, said Chris Ward-Jones, the company's chief technology officer.
“The problem with HCI is giving up chunks of CPU and RAM for storage workloads,” says Ward-Jones. “Or do you build separate compute and storage nodes? Then I discovered Nebulon and found I could get the benefits of hyper-converged but offload the storage workloads.”
Nebulon provides the storage control plane from the public cloud while the media resides in the server. “I think of it like a RAID card,” says Ward-Jones. “You can build a SAN between nodes.
“The obvious advantage is that you can get more compute and memory within the physical footprint of the rack. It’s cheaper than software-defined storage and hits a sweet spot between cost and compute/memory density.”
DCI is a Pure Storage customer for its wider storage needs and recently bought around 2PB of capacity from that supplier. Unfortunately, “it doesn’t work well in an immersed [ie, liquid-cooled] solution,” says Ward-Jones.
DCI’s chief technology officer summed up the benefits of Nebulon as offering “a lot of value for the money. Because it runs storage on the server, it looks like hyper-converged, but you get compute that is independent of the storage stack.”