InfiniBand Debuts Mac-based #3 Supercomputer on the Planet

Virginia Tech Stuns Computing Industry With 10 TeraFlop Computer Built in
Record Time for Only $5.2 Million

SANTA CLARA, CA and YOKNEAM, ISRAEL — (MARKET WIRE) — 11/16/2003 —
Mellanox® Technologies Ltd., the leader in InfiniBand solutions, has
announced, in conjunction with Virginia Tech, the deployment, performance
results, and ranking of Virginia Tech’s 1105-node InfiniBand computing
cluster. The result earns the #3 position on the new Top500 Supercomputer
list (www.top500.org) released today. Leveraging the industry-standard
InfiniBand interconnect and Apple Power Mac G5 computers, Virginia Tech
built the system in less than four months for only $5.2 million, less than
one tenth the average cost of comparable systems.

The InfiniBand cluster delivers 10.28 TeraFlops, ranking #3 on the latest
Top500 list and making it the most powerful computer at any educational
institution in the world. Previous systems in the top 5 have typically cost
well over $50M, and the #1 and #2 systems on this list each cost well over
$100M. Mellanox and Virginia Tech have clearly demonstrated that
industry-standard clusters provide huge compute capability at a fraction of
the cost.
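
Taking only the figures quoted in this release, the price/performance works
out to roughly half a million dollars per sustained TeraFlop. A minimal
sketch of that arithmetic (illustrative C, using no numbers beyond those
stated above):

    /* Price/performance derived from the figures quoted in this release. */
    #include <stdio.h>

    int main(void)
    {
        const double cost_millions = 5.2;    /* system cost in millions of USD */
        const double tflops        = 10.28;  /* sustained TeraFlops reported   */

        printf("~$%.2fM per sustained TeraFlop\n", cost_millions / tflops);
        return 0;
    }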

“The decision to use industry-standard 10 Gb/sec InfiniBand technology for
the high-performance interconnect was absolutely the right choice for this
record-breaking cluster,” says Dr. Srinidhi Varadarajan of Virginia Tech’s
College of Engineering. “The InfiniBand interconnect has performed
flawlessly for Virginia Tech. InfiniBand has scaled beyond our expectations,
enabling us to deploy this cluster in record time and at a record low cost
while setting a new performance standard for the world to follow.”

The cluster is built from Power Mac G5 computers with 64-bit PowerPC
processors running at 2 GHz, complemented by 4.4 TeraBytes of total system
memory. All 1105 cluster nodes are interconnected with 10 Gb/sec
ultra-low-latency InfiniBand links through twenty-four 96-port InfiniBand
switches, using copper cables from Amphenol.
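
For a sense of scale, the sketch below (illustrative C, derived only from
the node, memory, and switch counts quoted above; the per-node figure is
simple division, not an official specification) works out the average
memory per node and the total number of switch ports in the fabric:

    /* Back-of-the-envelope figures derived from the cluster description
     * above. The per-node memory is an average, not a stated spec, and
     * the surplus switch ports beyond one per node are what carry
     * inter-switch links in a multi-switch fabric. */
    #include <stdio.h>

    int main(void)
    {
        const int    nodes        = 1105;
        const double total_mem_tb = 4.4;   /* total system memory, TeraBytes */
        const int    switches     = 24;
        const int    ports        = 96;    /* ports per switch */

        printf("Average memory per node: ~%.1f GB\n",
               total_mem_tb * 1024.0 / nodes);
        printf("Total switch ports: %d for %d nodes\n",
               switches * ports, nodes);
        return 0;
    }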

The performance result of 10.28 TeraFlops was submitted to Top500.org and
received an official ranking on the new list published at the SC2003
(SuperComputing) Conference in Phoenix. The cluster runs Mac OS X, Virginia
Tech utilities, and MVAPICH (MPI for InfiniBand), developed by the
Department of Computer and Information Science at Ohio State University.
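
MVAPICH exposes the standard MPI programming interface over InfiniBand, so
applications written against MPI run on the cluster without
interconnect-specific code. As a rough illustration (a generic MPI example
in C, not code from the Virginia Tech system), a minimal program that
MVAPICH could carry over the fabric looks like this:

    /* Minimal MPI example -- an illustrative sketch, not code from the
     * Virginia Tech cluster. Each rank contributes a partial value and
     * MPI_Reduce combines them; on such a cluster this collective traffic
     * travels over the InfiniBand links. */
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int    rank, size;
        double local, total;

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* this process's id         */
        MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of processes */

        local = 1.0 / size;                    /* each rank's partial share */
        MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            printf("%d ranks, reduced sum = %f\n", size, total);

        MPI_Finalize();
        return 0;
    }

Such a program would typically be built with the compiler wrapper supplied
by the MPI implementation (commonly mpicc) and launched with one rank per
processor across the nodes.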

“The scalability and performance of InfiniBand has enabled Virginia Tech to
quickly build one of the most powerful computing facilities in the world
from inexpensive industry-standard servers. This cluster is truly a
breakthrough event in computing,” said Eyal Waldman, CEO of Mellanox.
“Virginia Tech and Mellanox have achieved a feat that proves it is possible
to achieve massive computational power quickly and at a fraction of the
cost.”

About InfiniBand

InfiniBand is the only 10 Gb/sec ultra-low-latency clustering,
communication, storage, and embedded interconnect on the market today.
Based on an industry standard, InfiniBand provides the most robust data
center interconnect solution available, with reliability, availability,
serviceability, and manageability features designed in from the ground up.
These features greatly reduce total cost of ownership for the data center.
Low-cost InfiniBand silicon supporting 10 Gb/sec RDMA transfers is shipping
today, providing eight times the bandwidth of Ethernet and three times the
bandwidth of proprietary clustering interconnects. With an approved
specification for 30 Gb/sec, InfiniBand is at least a generation ahead of
competing fabric technologies today and for the foreseeable future.
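
The bandwidth multiples above follow directly from the link rates involved.
The sketch below (illustrative C) makes the assumptions explicit: the
Ethernet baseline is taken to be 1 Gb/sec Gigabit Ethernet, and the
10 Gb/sec signaling rate is reduced by standard 8b/10b encoding to 8 Gb/sec
of data bandwidth; neither assumption is stated in the release itself.

    /* Link-rate arithmetic behind the "eight times Ethernet" comparison.
     * Assumptions not stated in the release: a 1 Gb/sec Gigabit Ethernet
     * baseline, and 8b/10b encoding on the 10 Gb/sec InfiniBand link. */
    #include <stdio.h>

    int main(void)
    {
        const double ib_signal_gbps = 10.0;                        /* 4X signaling rate */
        const double ib_data_gbps   = ib_signal_gbps * 8.0 / 10.0; /* after 8b/10b      */
        const double gige_gbps      = 1.0;                         /* assumed baseline  */

        printf("InfiniBand data rate: %.1f Gb/s\n", ib_data_gbps);
        printf("Ratio vs. Gigabit Ethernet: %.0fx\n", ib_data_gbps / gige_gbps);
        return 0;
    }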

About Mellanox

Mellanox is the leading supplier of InfiniBand semiconductors, providing
complete solutions including switches, host channel adapters, and target
channel adapters to the server, communications, data storage, and embedded
markets. Mellanox Technologies has delivered more than 100,000 InfiniBand
ports over two generations of 10 Gb/sec InfiniBand devices including the
InfiniBridge, InfiniScale, InfiniHost and InfiniScale III devices. Mellanox
InfiniBand interconnect solutions today provide over eight times the
performance of Ethernet, and over three times the performance of
proprietary interconnects. The company has strong backing from corporate
investors including Dell, IBM, Intel Capital, Quanta Computers, Sun
Microsystems, and Vitesse as well as, strong venture backing from Bessemer
Venture Partners, Raza Venture Management, Sequoia Capital, US Venture
Partners, and others. The company has major offices located in Santa Clara,
CA, Yokneam and Tel Aviv Israel. For more information visit
www.mellanox.com .
