Why Cloud?

Advantages for NIH & Biological Data sets

There are a number of key benefits that the Cloud provides in support of biomedical research:

  • It enables collaboration (within labs, across labs, within consortia, and even between scientific disciplines) by providing equal access to data and compute resources, regardless of the capabilities of a researcher’s institutional environment.
  • It can reduce wasteful (expensive and error-prone) duplication of data.
  • It facilitates reproducible science by allowing data and associated analysis pipelines to be shared with others.
  • It can be configured as a secure environment where all data access is tracked, authenticated, and authorized.

There are also some very practical realities of modern biomedical informatics that the cloud is uniquely suited to address:

  • Data sets are growing too large for local infrastructure, inhibiting researchers’ ability to maintain this data.
  • Compute requirements to process these large datasets are exceeding local capacity, inhibiting researchers’ ability to analyze this data.
  • As a result, it often makes more sense to move the compute to the data than vice versa (i.e. the old model of downloading data to a local computer for analysis).
  • Data is currently scattered, hard to find, and not ‘FAIR’ [Findable, Accessible, Interoperable, Reusable]. The cloud has the capacity to provide a single environment where these datasets can be stored for wider use.

AWS Resources

AWS has a research and technical computing page that outlines various AWS solutions in this space. AWS also has an eBook entitled Personalized Medicine and Genomic Research: Profiles in Cloud-Enabled Scientific Discovery covering four real-world genomic use cases (registration required to access the eBook).


Google Resources

Google has a Healthcare and Life Sciences Solutions section that outlines a variety of uses of GCP in this area, including their Google Genomics offering.

Potential Downsides for NIH & Biological Data sets

The Cloud does have many benefits; however, there are also some caveats, particularly because the cloud can be perceived as a panacea for the wide variety of issues the biomedical community is now facing.

  • Data uploaded into the cloud are not automatically FAIR, and are therefore simply not useful unless effort goes into making them FAIR (with all that entails).
  • Many scientific tools are currently not configured to work effectively in the cloud and must be reconfigured.
  • Resources (data, tools, etc.) in the cloud are not guaranteed to persist, so the cloud should not be seen as a replacement for depositing data in a proper data repository.
  • Data download costs (‘egress fees’) from the cloud can become very expensive for large data volumes; this is a vital consideration if you are putting data into the cloud with a view to allowing unfettered access by interested parties.
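
To illustrate how quickly egress fees can add up, here is a minimal cost sketch. The tiered per-GB rates below are hypothetical placeholders, not any CSP’s actual price list:

```python
# Rough egress-cost estimator. The tiered rates are hypothetical
# placeholders -- check your CSP's current price list for real numbers.
TIERS = [
    (10_240, 0.09),        # first ~10 TB (in GB) at $0.09/GB (assumed)
    (40_960, 0.085),       # next ~40 TB at $0.085/GB (assumed)
    (float("inf"), 0.07),  # everything beyond that (assumed)
]

def egress_cost(gb: float) -> float:
    """Return the estimated egress bill for `gb` gigabytes downloaded."""
    cost, remaining = 0.0, gb
    for tier_size, rate in TIERS:
        used = min(remaining, tier_size)
        cost += used * rate
        remaining -= used
        if remaining <= 0:
            break
    return round(cost, 2)

# A "popular dataset" scenario: 500 users each downloading a 100 GB dataset.
print(egress_cost(500 * 100))  # 50,000 GB of egress -- thousands of dollars
```

Even at these illustrative rates, a single popular dataset can generate a four-figure bill in one month, which is why unfettered public download access needs to be planned for explicitly.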

Situations where the Cloud may not be the best option

There are also certain situations where the cloud may not be an appropriate solution to the computing problem at hand.

Very high performance HPC

Due to the physical design and architecture of cloud hardware, the cloud may not be appropriate for certain supercomputing applications, such as computational chemistry (e.g. quantum mechanics/molecular mechanics simulations) or physics (e.g. computational fluid dynamics), where node-to-node latency needs to be low. Cloud networks can be high-latency or oversubscribed, resulting in lower performance compared to a real “supercomputer”.

This situation may be improving with recent announcements from AWS; however, purists would still argue the Cloud is slow compared to the latest and greatest big iron.

When it’s cheaper to use a non-cloud approach

There are also situations where the cloud would work fine but may not be your best option (especially in terms of total cost of ownership, TCO). For example, if you have a local system that performs regular analyses on local data and the resource is utilized 100% of the time, you essentially have a steady-state workload, and your TCO can often be lower than using the Cloud, since you effectively amortize the hardware cost over time.
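
The break-even intuition can be sketched numerically. In this minimal comparison, the hardware price, lifetime, overhead multiplier, and hourly cloud rate are all assumed figures for illustration only:

```python
# Toy total-cost-of-ownership comparison for a steady-state workload.
# All figures are assumptions for illustration, not real quotes.

def on_prem_hourly(hardware_cost: float, lifetime_years: float,
                   overhead: float = 1.5) -> float:
    """Amortized hourly cost of local hardware. `overhead` multiplies the
    purchase price to account for power, cooling, and admin effort."""
    hours = lifetime_years * 365 * 24
    return hardware_cost * overhead / hours

# Assumed: a $20,000 server amortized over 4 years vs. a $1.00/hour cloud instance.
local = on_prem_hourly(20_000, 4)   # roughly $0.86/hour
cloud_rate = 1.00

for utilization in (1.0, 0.25):
    # At partial utilization you still pay for idle local hardware,
    # whereas cloud cost scales with the hours you actually use.
    cloud = cloud_rate * utilization
    cheaper = "on-prem" if local < cloud else "cloud"
    print(f"utilization {utilization:.0%}: on-prem ${local:.2f}/h "
          f"vs cloud ${cloud:.2f}/h -> {cheaper}")
```

Under these assumed numbers, on-prem wins at 100% utilization while the cloud wins at 25%, which is exactly the steady-state argument above.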

Another example is when cloud cost metering is not to your benefit or is not well understood. Imagine you set up a “data sharing” website that suddenly becomes very popular: you could end up paying significant data egress charges that you could have negotiated ahead of time or avoided by hosting the data in a different way.

All these examples speak to the importance of understanding the IT challenges you are trying to solve and the associated infrastructure requirements a system needs in order to address them. They also highlight the need to understand how CSPs charge for their services, to do some upfront planning and cost estimation, and to monitor usage on an ongoing basis to ensure there are no unpleasant surprises when the monthly bill appears.

General advantages of the Cloud

  • Pricing
    • You benefit from the economies of scale available to the CSPs and from competition among cloud vendors
    • Spot pricing - reduced rates offered to encourage use of a CSP’s idle compute resources
    • Reserved compute instance pricing - compute costs can be lower if you reserve your compute resources for longer periods of time
  • Pay for what you use - most CSP pricing is on a ‘per hour’ basis.
  • Constantly evolving toolset and features
    • Cloud providers continually add tools and features, particularly around compute capacity and storage, and more recently in the areas of AI, Machine Learning, and networking.
  • No need to maintain local infrastructure
    • Little or no capital expenditure to establish a compute infrastructure (though potentially a large operating expense instead).
  • Expandable capacity, no long term commitments
    • Easy to expand your resources on a temporary basis, e.g. to do a specific analysis with many more compute nodes, or increased data storage for a new dataset. The resources can be brought on line very quickly and then turned off once they are no longer needed.
  • Very amenable to DevOps approaches to allow ‘scriptable infrastructure’ or ‘Infrastructure as Code’
    • Tools like Ansible, Chef, Puppet, etc. can be used to describe the cloud infrastructure and bring it into being with little to no manual intervention.
    • This allows hardware to be quickly created, torn down, and recreated as needed, which is handy when you are paying by the hour.
    • It also produces documentation of the system configuration as a convenient side effect.
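
The ‘Infrastructure as Code’ idea can be illustrated with a toy declarative sketch. Real tools such as Ansible or Terraform use their own configuration languages; the resource dictionaries and the `plan` helper below are invented purely for illustration:

```python
# Toy Infrastructure-as-Code sketch: infrastructure is described as data,
# and a planner computes which resources to create or tear down.
# Illustrative only -- real tools (Terraform, Ansible, etc.) have far
# richer models, but the declarative diff-and-apply idea is the same.

desired = {
    "head-node": {"type": "m5.large",  "count": 1},
    "workers":   {"type": "c5.xlarge", "count": 10},
    "results":   {"type": "bucket",    "count": 1},
}

current = {
    "head-node": {"type": "m5.large", "count": 1},
}

def plan(desired: dict, current: dict) -> dict:
    """Diff desired state against what exists (akin to `terraform plan`)."""
    create = {k: v for k, v in desired.items() if k not in current}
    destroy = [k for k in current if k not in desired]
    return {"create": create, "destroy": destroy}

actions = plan(desired, current)
print(sorted(actions["create"]))  # resources to bring up
print(actions["destroy"])         # resources to tear down
```

Because the same description can be re-applied at any time, the infrastructure can be recreated or torn down on demand, and the description file doubles as documentation of the system configuration.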

Other things to consider when using the Cloud

Just as there are many good reasons for using a Cloud solution, there are also quite a few things to be aware of as you start down this road.

System Design

  • There is often a need to reevaluate how solutions are architected for the cloud; this allows a system to take advantage of the options the cloud offers, particularly its elastic capabilities, and to manage the very different service cost model.
  • Avoid over provisioning compute and storage (use what you need when you need, but then remove it when you are done).
    • Using the elastic and dynamic capabilities of cloud services is highly recommended.
    • Implementing system automation through the use of DevOps tools supports a more dynamic system provisioning approach.
  • Egress charges for data to the internet, or to different zones/regions of the CSP, may lead to very significant service bills (users could potentially download large volumes of data unless you configure your cloud suitably).
  • You should plan on implementing a comprehensive monitoring solution for the utilized services and your application’s performance in order to periodically re-optimize the cloud solution. (Cloud providers typically monitor the services you consume but not necessarily the performance of your applications, which you will need to do)
  • Sharing data between clouds and/or on-prem data centers can present challenges.
    • Viable options include creating an appropriate set of user roles, bucket policies, or signed URLs.
  • Integrating your institution’s access control and authentication with those of the CSP can be a major effort. This will involve your IT Infosec and the CSP working together.
  • Limited technical portability between CSPs (You may find yourself locked in to a particular CSP if you use a solution which relies on their proprietary technology)
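
To make the ‘signed URLs’ option above concrete, here is a toy sketch of the idea: the server signs an object path plus an expiry time with a secret key, producing a time-limited link. This shows the general concept only, not any CSP’s actual signing algorithm (each provider has its own scheme, e.g. AWS S3 presigned URLs use Signature Version 4):

```python
# Toy signed-URL sketch: grant time-limited access to one object by
# signing path + expiry with a server-side secret. Concept only --
# real CSPs use their own, more elaborate signing schemes.
import hashlib
import hmac
import time

SECRET = b"server-side-secret"  # never shipped to clients

def sign_url(path: str, expires_at: int) -> str:
    """Return a URL carrying an expiry timestamp and an HMAC signature."""
    msg = f"{path}?expires={expires_at}".encode()
    sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
    return f"{path}?expires={expires_at}&sig={sig}"

def verify(path: str, expires_at: int, sig: str, now: int) -> bool:
    """Reject expired links and links whose signature doesn't match."""
    if now > expires_at:
        return False
    expected = hmac.new(SECRET, f"{path}?expires={expires_at}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

# Issue a link valid for one hour, then check it.
exp = int(time.time()) + 3600
url = sign_url("/datasets/cohort1.vcf.gz", exp)
sig = url.split("&sig=")[1]
print(verify("/datasets/cohort1.vcf.gz", exp, sig, now=int(time.time())))  # True
```

The appeal of this pattern for data sharing is that no cloud account is needed by the recipient: possession of the link is the credential, and the link self-expires.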

Security

  • Increased security vulnerabilities due to multi-tenancy (infrastructure such as network storage, hosting servers, networks, etc. is shared on public clouds)
  • Reduced operational governance over the CSP (you will have little control beyond what is spelled out in the terms and conditions you signed with the CSP)
  • Multiregional compliance and legal issues (for example, cloud providers may not store your data in the Continental United States (CONUS) unless special agreements are put in place)
  • Legal and HIPAA requirements for handling certain types of PII or PHI data may make it necessary to have a Business Associate Agreement (BAA) with the CSP. Working through a CSP reseller may require additional agreements to be put in place. Getting these agreements in place can take longer than you may anticipate!

Staffing & Support

  • You will need cloud technical support (you may not have personnel who are familiar with your CSP’s services, and support from the CSP can be expensive!)