BioTeam’s Ari Berman was recently interviewed by HPCwire’s John Russell about how customers in the life sciences are using High Performance Computing (HPC) and how that usage might evolve over the course of 2017. There’s a lot of insight and background in the full article on HPCwire; we’ve pulled out some of the high points below.
HPC Usage Growing
HPC usage continues to grow, driven by the data-intensive demands of established technologies such as Next Generation Sequencing and, increasingly, by newer technologies emerging from the imaging community (e.g., Light Sheet Fluorescence Microscopy and Cryo-Electron Microscopy). Science gateways and similar web portals are also making HPC more accessible to those less comfortable with command-line interfaces and HPC environments, which should help expand HPC’s reach to a wider audience.
‘Denser’ compute, but storage remains a huge issue
At the infrastructure level, compute is trending towards increased system density (more processing power packed into a smaller physical footprint); GPUs are emerging as key players in imaging and simulation; and Intel Xeon continues as the dominant processor architecture in conventional HPC/cluster environments. Storage remains a huge challenge as organizations balance the desire to never throw any data away against the cost of storage and the dizzying array of storage platforms available. RENCI’s iRODS platform is emerging as a viable tool for providing a consolidated interface to diverse storage systems and for using metadata to better manage the files those systems contain.
New players in high performance networking
Underpinning the whole system, networking is seeing some significant changes. In the cluster market, viable alternatives to Mellanox’s InfiniBand are emerging from vendors such as Arista and Juniper, which offer cost-effective 100Gb Ethernet for cluster networks (though InfiniBand still has plenty of advantages, depending on the use case). Similarly, Cisco is seeing challenges from Arista, Ciena and Juniper in enterprise environments as researchers look to design truly high-performance networks (such as Science DMZs) for scientific data exchange. The scope of the networking challenge is also growing as organizations wrestle with how to move petabyte quantities of data nationally and internationally.
Overall, the future for HPC in the life sciences looks exciting and innovative, if conservative in its adoption of new technologies. To find out more, read the full article on HPCwire.