Research Computing Resources
Data Usage Agreements
When acquiring data sets from external agencies, a data security plan frequently must be signed off on by an IT department contact in addition to the contractual process. H&S IT acts as that contact for researchers in the school, with CRC verifying that systems have been set up according to Stanford's and the external agency's standards. Please contact Sean Brandt, H&S CIO, to start that process.
Sherlock Resources for H&S Researchers
Sherlock is Stanford's main shared high-performance computing cluster, with over 1,600 servers. Sherlock is a great solution for computing needs above and beyond what your laptop or desktop can handle in terms of CPUs, RAM, storage, and I/O. The School of Humanities and Sciences has resources in Sherlock and has made them available to any researcher (faculty, postdoc, research staff, or student) on the research team of an H&S faculty member. Sherlock can be used for departmental or sponsored research; it is not intended for classwork.
The school's 'hns' partition has 89 compute nodes, including one GPU node and six large-memory nodes. H&S researchers have access to those, along with the centrally provided 'normal' and 'gpu' partitions. If you need more computing resources or faster access to them, consider purchasing systems on the cluster and becoming a Sherlock "owner".
Getting Started with Sherlock
First, set up an account. Faculty members can do this by writing to email@example.com and specifying the SUNet IDs of the research team members to be added to their account.
Sherlock is usually accessed via SSH. Most Unix-like operating systems provide an SSH client by default, invoked with the ssh command in a terminal window. Connect using the ssh command with your SUNet ID and the Sherlock login host.
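For example, a typical connection from a terminal looks like the following sketch ("sunetid" is a placeholder for your own SUNet ID; confirm the login hostname on the getting connected page):

```shell
# Connect to Sherlock over SSH; replace "sunetid" with your own SUNet ID.
ssh sunetid@login.sherlock.stanford.edu
```

You will be prompted to authenticate with your SUNet credentials and two-step authentication.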
You can also connect with your web browser using Sherlock OnDemand. To connect to Sherlock OnDemand, point your browser to https://login.sherlock.stanford.edu. From there, users can connect to several GUI-based applications, including Jupyter notebooks, RStudio, and TensorFlow. For more information, see the OnDemand documentation page.
More information on how to connect to Sherlock is available on the getting connected page.
Using H&S Resources on Sherlock
Once you've connected, as an H&S researcher you can use the compute resources provided by the school to run your jobs. To access these resources, include the school's partition (for example, -p hns) in the sbatch script when submitting your job.
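A minimal batch script targeting the school's partition might look like the following sketch (the job name, time limit, and memory request are placeholders to adjust for your workload):

```shell
#!/bin/bash
#SBATCH --job-name=hns-example     # hypothetical job name
#SBATCH --partition=hns            # run on the school's 'hns' partition
#SBATCH --time=00:10:00            # wall-clock limit; adjust for your job
#SBATCH --mem=4G                   # memory request; adjust for your job

# Replace this line with your actual computation.
echo "Job running on partition: ${SLURM_JOB_PARTITION:-hns}"
```

Submit it with sbatch followed by the script name; outside of Slurm the #SBATCH lines are ordinary comments, so the script can also be run directly for a quick sanity check.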
The H&S IT team collaborates with the Stanford Research Computing Center (SRCC) to provide training and onboarding to departments, programs, and labs. We can come to your faculty meeting, graduate student event, or other gathering to give an overview of Sherlock and how to access the H&S resources. For more information or to schedule an event, please fill out and submit the Research Computing Training Interest Form.
SRCC also holds office hours for going more in-depth on your particular needs. Please note that due to the current COVID-19 situation, all office hours are held remotely via Zoom. Office hours are first come, first served, but users can also reserve a time by signing up for an office hours appointment.
Other Research Computing Resources
If the Sherlock resources that H&S provides are not sufficient for your research needs, faculty members can purchase additional dedicated resources on Sherlock. More information on pricing and capabilities is available on the Sherlock Server Order Page.
Sherlock is approved for low and medium risk data as defined by the University. If you use High Risk Data for research, H&S IT recommends using Nero, a computing cluster specifically designed to compute on and store High Risk Data. To get started with Nero, please contact firstname.lastname@example.org and specify that you are interested in using Nero. You will also need to complete a Data Risk Assessment with the University's Information Security and Privacy offices for each Nero use case.
If you have large data storage requirements, Oak storage is available for purchase. Oak is high I/O storage that can be easily mounted on the Sherlock compute cluster and can be purchased in 10TB allocations at relatively low cost (currently $50/TB/year). Oak storage is not backed up by default.
If you have use cases that are not well served by a shared computing cluster, please schedule an IT Consultation with us.