Mellanox Support for v2.1
-
The main idea behind this is very simple: Mellanox cards are cheap compared with 10G network cards, and InfiniBand switches are getting very affordable as well.
Originally they were meant for HPC (High Performance Computing), but they do support IPoIB (IP over InfiniBand), which allows 10 Gb-class IP networking between servers and some enterprise-level switches.
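To illustrate what that buys you: once an IPoIB driver is loaded, the InfiniBand port shows up as an ordinary network interface. A minimal sketch, assuming the FreeBSD OFED stack where the port typically appears as ib0 (the interface name and addresses are illustrative assumptions, not something I've tested):

```
# Sketch: an IPoIB port configured like any other NIC, assuming the
# OFED stack exposes it as ib0 (name/addresses are examples only).
ifconfig ib0 inet 10.0.0.10 netmask 255.255.255.0 up
ping 10.0.0.11    # reach another IPoIB host on the same IB fabric
```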
Capabilities are tested in this thread:
http://forums.servethehome.com/showthread.php?680-Mellanox-Infiniband-%2820-Gbps%29-HBA-for-90-bucks

A quick eBay search reveals the following:
http://www.ebay.com/sch/i.html?_from=R40&_trksid=p3984.m570.l1313&_nkw=MHET2x-1TC&_sacat=See-All-Categories

Mellanox cards start at $40 and up, which is a steal compared with $300+ 10G network cards. Decent switches that support this can be had for around $200-350.
A good example is the TopSpin 120 / Cisco SFS7000p / HP (I do believe they are the same hardware, since they take the same software update from Cisco).

IB drivers for BSD have been around for a while and are in the FreeBSD 9.0+ releases. Someone is already working on it on the FreeBSD forums, but sadly there is no ETA:
http://forums.freebsd.org/showthread.php?t=17774

Firmware - http://www.mellanox.com/content/pages.php?pg=custom_firmware_table
Drivers - http://www.mellanox.com/content/pages.php?pg=software_overview_ib&menu_section=34
More Info - http://lists.freebsd.org/pipermail/freebsd-current/2011-March/023554.html
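For anyone curious what enabling the in-tree support roughly involves, here is a sketch of the kernel configuration lines the FreeBSD OFED stack uses; treat the exact option and device names as assumptions to verify against the links above:

```
# Sketch: FreeBSD 9.x kernel config for the OFED/InfiniBand stack.
# Verify exact names against the FreeBSD OFED documentation.
options  OFED       # base OpenFabrics (InfiniBand) stack
options  IPOIB_CM   # connected-mode IPoIB for higher throughput
device   ipoib      # IP-over-InfiniBand network interface
device   mlx4ib     # Mellanox ConnectX HCAs
device   mthca      # older Mellanox InfiniHost HCAs (e.g. the MHET2x cards above)
```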
I don't know how to code or compile in BSD, nor do I have the time to learn right now, as I'm working 12-14 hour night shifts at the airport again (5 on, 1 off, and sometimes I don't even get my day off due to being short-staffed at the hangar). I would put at least $100 towards this.
As interest is growing (I'm seeing a large number of threads with questions regarding Mellanox support), I hope more people will be willing to pitch in instead of waiting for future releases to support it.