Friday 17 October 2008

Article on Cluster Interconnects from HPCCommunity's Kusu toolkit site


HPCCommunity.org is a technical community for High Performance Computing (HPC), sponsored by Platform Computing Corporation and dedicated to supporting both Platform Computing and non-Platform technologies. Within HPCCommunity you'll find Kusu, an open source cluster toolkit that is the foundation of Platform Computing's Open Cluster Stack (OCS5) and the Red Hat HPC Solution. The site also has an interesting section of Technical Articles, Tips and Tricks. The following extracts come from "KUSU 103: Beowulf Cluster Best Practices", which appeared yesterday:

3.3 Cluster Interconnects.

Most small clusters have only one network, typically Gigabit Ethernet.

For larger clusters it is common to see: i) an administration network, typically GE; ii) an HPC network, usually one of Infiniband, Quadrics or Myrinet; and iii) an out-of-band management network.

The focus of this section is on the HPC Network.

First, determine whether your code will benefit from an HPC network. For bandwidth-sensitive or I/O-intensive codes, Infiniband, Quadrics or Myrinet can help tremendously.

Which HPC network is best? To answer that, you have to know the characteristics of your code. It is best to run your applications on the network you intend to buy: beg and borrow resources from friends and colleagues in the community, or from the vendor, to test.
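Before running full applications, a simple MPI ping-pong microbenchmark is a common way to get a first feel for a candidate fabric. The sketch below is my own illustration, not code from the article; the message size and iteration count are arbitrary choices. It measures average round-trip time and bandwidth between two ranks:

/* Minimal MPI ping-pong microbenchmark: a sketch for gauging
 * point-to-point latency and bandwidth on a candidate interconnect.
 * Build with: mpicc pingpong.c -o pingpong */
#include <mpi.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    const int iters = 1000;
    const int nbytes = 1 << 20;          /* 1 MiB messages (illustrative) */
    char *buf = malloc(nbytes);
    int rank;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    MPI_Barrier(MPI_COMM_WORLD);
    double t0 = MPI_Wtime();
    for (int i = 0; i < iters; i++) {
        if (rank == 0) {
            MPI_Send(buf, nbytes, MPI_CHAR, 1, 0, MPI_COMM_WORLD);
            MPI_Recv(buf, nbytes, MPI_CHAR, 1, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
        } else if (rank == 1) {
            MPI_Recv(buf, nbytes, MPI_CHAR, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            MPI_Send(buf, nbytes, MPI_CHAR, 0, 0, MPI_COMM_WORLD);
        }
    }
    double t1 = MPI_Wtime();

    if (rank == 0) {
        double rtt = (t1 - t0) / iters;          /* avg round-trip time */
        double bw  = 2.0 * nbytes / rtt / 1e6;   /* MB/s, both directions */
        printf("avg round trip: %.1f us, bandwidth: %.1f MB/s\n",
               rtt * 1e6, bw);
    }
    free(buf);
    MPI_Finalize();
    return 0;
}

Run with one rank on each of two nodes on the fabric under test (e.g. mpirun -np 2 ./pingpong), and sweep the message size to see where latency stops dominating and bandwidth takes over.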

In real-world MPI codes, Quadrics is typically the fastest due to its SHMEM capabilities. Quadrics also has excellent MPI management software in the form of RMS.
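To illustrate what SHMEM-style communication looks like, here is a minimal one-sided put. This is a sketch in the later OpenSHMEM API, not Quadrics' original SHMEM library, though the programming model is the same: one PE writes directly into another PE's memory with no matching receive call.

/* One-sided put in the OpenSHMEM style -- the model that Quadrics
 * hardware accelerated. A hypothetical sketch, not code from the
 * article. Build with an OpenSHMEM wrapper, e.g. oshcc put.c -o put */
#include <shmem.h>
#include <stdio.h>

int main(void)
{
    shmem_init();
    int me   = shmem_my_pe();
    int npes = shmem_n_pes();

    /* Symmetric allocation: every PE gets a remotely accessible buffer */
    long *dest = shmem_malloc(sizeof(long));
    *dest = -1;
    shmem_barrier_all();

    /* PE 0 writes directly into PE 1's memory; PE 1 does nothing */
    if (me == 0 && npes > 1) {
        long val = 42;
        shmem_long_put(dest, &val, 1, 1);
    }
    shmem_barrier_all();

    if (me == 1)
        printf("PE 1 received %ld from PE 0\n", *dest);

    shmem_free(dest);
    shmem_finalize();
    return 0;
}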

People in the know are willing to pay the premium for Quadrics.

Read the article in full at this link.
