Grizzlyware — Elfcam OS2 fibre optic datacenter infrastructure
Case study · grizzlyware.com

Connecting datacenter servers with a long-lasting fibre solution

Datacenter · Microsoft S2D · Armoured OS2 fibre · Published in 2023

120 — Virtual machines
600 — Connected employees
50 TB — Total S2D traffic
16 — 10G fibre links

Grizzlyware — Digital transformation and critical infrastructure

Grizzlyware is a company specialising in digital transformation services, custom software and tailored product features. It builds unique bridges between technology platforms to modernise digital systems, allowing its customers to streamline their workflows and improve the user experience.

As part of a Microsoft Storage Spaces Direct (S2D) converged network project, Grizzlyware operates a central data network comprising 120 virtual machines — mainly domain controllers, file servers and terminal servers — used by 600 employees. Virtual machines are live-migrated between Hyper-V hosts. The infrastructure requires robust fibre optic cables suited to the demanding environment of a datacenter.

Four critical constraints to solve

1

Storage Spaces Direct (S2D) traffic reaches 50 TB in total across the four virtualisation hosts. Each host requires 4 links to the switch — the switch must deliver the throughput, switching capacity and latency suited to this load.

2

Traffic originates from 4 virtual hosts with a mix of S2D flows, management traffic, backups and inter-VM networking. High availability is essential: each host must be connected to each of the two redundant switches.

3

Nightly backups generate additional traffic peaks. The inter-switch connection must be planned carefully to absorb these loads without affecting the availability of production services.

4

The datacenter environment is particularly demanding for cables: mechanical stress, temperatures, installation density. Long-term durability is a non-negotiable criterion.

Armoured OS2 fibre architecture with MLAG for high availability

The solution is built on a switch with 1.44 Tbps of switching capacity, a forwarding rate of 1,071 Mpps and a latency of 612 ns — enough headroom to absorb the full 50 TB of S2D traffic.
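As a back-of-envelope check, the aggregate demand of the sixteen 10G access links can be set against the quoted 1.44 Tbps fabric (the 9× headroom figure is derived here, not quoted in the case study):

```python
# Headroom check: 16 x 10G access links vs. the switch's 1.44 Tbps fabric.
LINKS = 16
LINK_GBPS = 10
SWITCH_CAPACITY_GBPS = 1440  # 1.44 Tbps

aggregate_gbps = LINKS * LINK_GBPS                    # 160 Gbps of access bandwidth
headroom = SWITCH_CAPACITY_GBPS / aggregate_gbps      # 9.0x

print(aggregate_gbps, "Gbps aggregate,", headroom, "x headroom")
```

Even with every access link saturated, the fabric retains an order of magnitude of spare capacity for inter-switch and backup traffic.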

Sixteen 10G links in armoured LC/UPC to LC/UPC OS2 fibre optic cable (four per server) are set up between the switches and the servers fitted with 10G network cards. Since each host is connected to both redundant switches, MLAG (Multi-chassis Link Aggregation) is deployed — the two independent physical switches operate as a single logical switch, ensuring service continuity in the event of a switch failure.
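The redundancy scheme described above can be modelled in a few lines: four hosts, each with its four links split across the two MLAG peers, so the loss of either switch leaves every host reachable (a minimal sketch — the host and switch names are hypothetical):

```python
# Minimal redundancy model: 4 Hyper-V hosts, 4 links each, split 2+2
# across the two MLAG peer switches (names are illustrative, not real).
hosts = [f"hyperv-{i}" for i in range(1, 5)]
links = {h: {"sw-a": 2, "sw-b": 2} for h in hosts}

def reachable(host, failed_switch=None):
    """A host stays reachable if at least one of its links survives."""
    return any(n > 0 for sw, n in links[host].items() if sw != failed_switch)

total_links = sum(sum(per_switch.values()) for per_switch in links.values())
assert total_links == 16                                # matches the deployment
assert all(reachable(h, "sw-a") for h in hosts)         # survives loss of sw-a
assert all(reachable(h, "sw-b") for h in hosts)         # survives loss of sw-b
```

Because MLAG presents the two physical switches as one logical switch, the hosts treat the 2+2 split as a single aggregated bundle rather than two separate paths.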

S2D architecture — 16 10G fibre links between 4 Hyper-V hosts and 2 MLAG switches
Cabling diagram — redundant switches with armoured OS2 fibre connectivity

A reliable and long-lasting datacenter infrastructure

1.44 Tbps — Switching capacity
612 ns — Switching latency
Zero — Service interruptions recorded

The deployment of Elfcam armoured fibre optic cables allowed Grizzlyware to meet every technical constraint — throughput, redundancy, mechanical durability — while ensuring lossless transmission across the 16 10G links in the dense datacenter environment. The MLAG setup delivers the high availability required for the 600 connected users.

Armoured fibre optic cables used in this project

The selected cables combine steel armour for mechanical strength in the datacenter, an LSZH jacket for fire safety compliance, and UPC-polished LC connectors to minimise reflection losses.
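For a short in-row run like this one, the optical loss budget can be sketched as follows; the per-km and per-connector figures are typical datasheet values rather than numbers from this case study, and the 50 m run length is an assumption:

```python
# Hedged optical loss-budget sketch for a short in-row OS2 run.
# All figures below are typical/assumed, not from the case study.
FIBRE_LOSS_DB_PER_KM = 0.4   # typical max for OS2 singlemode at 1310 nm
CONNECTOR_LOSS_DB = 0.3      # typical per mated LC/UPC pair
run_km = 0.05                # assumed 50 m cable run
mated_pairs = 2              # one connector pair at each end

total_db = run_km * FIBRE_LOSS_DB_PER_KM + mated_pairs * CONNECTOR_LOSS_DB
print(f"{total_db:.2f} dB")  # small fraction of a typical 10GBASE-LR budget
```

At these distances connector quality dominates the budget, which is why low-reflection UPC polishing matters more than fibre attenuation.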

Elfcam technical team

Experts in fibre optic infrastructure and networks since 2018. More than 40,000 installations supported in France and Europe, from home networks to multi-site datacenters.
