Search results

  1. Dell M1000e - Wikipedia

    en.wikipedia.org/wiki/Dell_M1000e

    Other options are the Mellanox SwitchX M4001F and M4001Q [53] and the Mellanox M2401G 20 Gb InfiniBand switch for the M1000e enclosure. [54] The M4001 switches offer either 40 Gbit/s (M4001Q) or 56 Gbit/s (M4001F) connectivity and have 16 external interfaces using QSFP ports and 16 internal connections to the InfiniBand mezzanine card on the ... (the lane-rate arithmetic behind the 40 and 56 Gbit/s figures is sketched after the results list).

  2. InfiniBand - Wikipedia

    en.wikipedia.org/wiki/InfiniBand

    Mellanox (acquired by Nvidia) manufactures InfiniBand host bus adapters and network switches, which are used by large computer system and database vendors in their product lines. [2] As a computer cluster interconnect, IB competes with Ethernet, Fibre Channel, and Intel Omni-Path. The technology is promoted by the InfiniBand Trade Association.

  3. Mellanox Technologies - Wikipedia

    en.wikipedia.org/wiki/Mellanox_Technologies

    Mellanox network adapters and switches supported remote direct memory access (RDMA) and RDMA over Converged Ethernet. Product names included the ConnectX product family of multi-protocol ASICs and adapters, which supports virtual protocol interconnect (VPI), enabling both Ethernet and InfiniBand traffic at speeds up to 200 Gbit/s (a minimal RDMA device-discovery sketch follows the results list).

  4. Mellanox Expands Line of FDR 56Gb/s InfiniBand Switch ... - AOL

    www.aol.com/news/2012-11-12-mellanox-expands...

    Mellanox Expands Line of FDR 56Gb/s InfiniBand Switch Solutions with 12-port Switch System for Small Scale, Storage and Embedded Applications New switch enables cost-effective, low-scale ...

  5. Mellanox Introduces MetroX - Long Haul InfiniBand Switch ...

    www.aol.com/news/2012-11-12-mellanox-introduces...

    Mellanox Introduces MetroX - Long Haul InfiniBand Switch Solutions Mellanox MetroX™ TX6000 series of non-blocking long haul switches extends InfiniBand and RDMA connectivity reach to campus wide ...

  6. Mellanox FDR InfiniBand Demonstrates 2X Growth over a 6 Month ...

    www.aol.com/news/2013-06-18-mellanox-fdr...

    Mellanox FDR InfiniBand Demonstrates 2X Growth over a 6 Month Period for Petascale-Capable Systems on the TOP500 Overall FDR InfiniBand-based systems on the TOP500 grew 3.3X year-over-year, ...

  7. Mellanox's FDR InfiniBand Solution with NVIDIA GPUDirect RDMA ...

    www.aol.com/2013/06/17/mellanoxs-fdr-infiniband...

    Mellanox's FDR InfiniBand Solution with NVIDIA GPUDirect RDMA Technology Provides Superior GPU-based Cluster Performance Triples small message throughput and reduces MPI latency by 69 percent ...

  8. RDMA over Converged Ethernet - Wikipedia

    en.wikipedia.org/wiki/RDMA_over_Converged_Ethernet

    RDMA over Converged Ethernet (RoCE) [1] is a network protocol that allows remote direct memory access (RDMA) over an Ethernet network. There are multiple RoCE versions. RoCE v1 is an Ethernet link-layer protocol and hence allows communication between any two hosts in the same Ethernet broadcast domain (a sketch of how a RoCE v1 frame is identified follows below).
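
For context on the 40 and 56 Gbit/s figures quoted in the Dell M1000e entry (standard InfiniBand lane rates, not taken from the cited page): QDR runs four lanes at 10 Gbit/s signaling with 8b/10b encoding, and FDR runs four lanes at 14.0625 Gbit/s signaling with 64b/66b encoding, so the marketed numbers are signaling rates rather than data rates:

$$\text{QDR: } 4 \times 10\ \text{Gbit/s} = 40\ \text{Gbit/s signaling}, \qquad 40 \times \tfrac{8}{10} = 32\ \text{Gbit/s data}$$

$$\text{FDR: } 4 \times 14.0625\ \text{Gbit/s} = 56.25\ \text{Gbit/s signaling}, \qquad 56.25 \times \tfrac{64}{66} \approx 54.55\ \text{Gbit/s data}$$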
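
The RDMA capability mentioned in the Mellanox Technologies entry is exposed to applications through the verbs API. Below is a minimal sketch, not taken from any of the cited pages, of how user space discovers an RDMA-capable adapter (ConnectX devices typically appear as mlx4_0 or mlx5_0) with libibverbs; it assumes rdma-core is installed and links with -libverbs.

```c
/* Minimal sketch: enumerate RDMA-capable devices via libibverbs.
 * Build (assumes rdma-core is installed): gcc list_rdma.c -o list_rdma -libverbs
 */
#include <stdio.h>
#include <infiniband/verbs.h>

int main(void)
{
    int num_devices = 0;
    struct ibv_device **devices = ibv_get_device_list(&num_devices);
    if (!devices) {
        perror("ibv_get_device_list");
        return 1;
    }
    for (int i = 0; i < num_devices; i++) {
        /* ConnectX adapters typically show up as mlx4_0, mlx5_0, ...
         * The GUID is reported in network byte order. */
        printf("device %d: %s (node GUID 0x%016llx)\n", i,
               ibv_get_device_name(devices[i]),
               (unsigned long long)ibv_get_device_guid(devices[i]));
    }
    ibv_free_device_list(devices);
    return 0;
}
```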
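
The broadcast-domain limitation in the RoCE entry follows from the framing: RoCE v1 carries RDMA payloads directly in Ethernet frames under the IBTA-assigned EtherType 0x8915, with no IP header to route on. A minimal sketch of recognizing such a frame (the struct and helper are illustrative, not from any standard header):

```c
/* Minimal sketch: recognize a RoCE v1 frame by its EtherType (0x8915).
 * RoCE v1 has no IP layer, so frames cannot be routed beyond the local
 * Ethernet broadcast domain; RoCE v2 addresses this by running over UDP/IP. */
#include <stdint.h>
#include <stdio.h>
#include <arpa/inet.h>

#define ETHERTYPE_ROCE_V1 0x8915  /* IBTA-assigned EtherType for RoCE v1 */

struct eth_header {               /* illustrative 14-byte Ethernet header */
    uint8_t  dst[6];
    uint8_t  src[6];
    uint16_t ethertype;           /* network byte order on the wire */
};

static int is_rocev1_frame(const struct eth_header *eth)
{
    return ntohs(eth->ethertype) == ETHERTYPE_ROCE_V1;
}

int main(void)
{
    struct eth_header frame = { .ethertype = htons(ETHERTYPE_ROCE_V1) };
    printf("RoCE v1 frame? %s\n", is_rocev1_frame(&frame) ? "yes" : "no");
    return 0;
}
```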