Dear Grex / HPC Users,
Thank you all for using our HPC resources! Another year has passed since our last call. Our local HPC system, Grex, received a major hardware and datacentre renewal in 2024.
It is time to renew the local resource allocations for 2025.
Please find the Local Resource Allocation Call 2025 forms and conditions attached.
Your HPC Team.
====
Local HPC (Grex) resource allocation call description
Overview of the HPC resources
Grex is the High-Performance Computing (HPC) system at the University of Manitoba. In addition to its compute capacity, which includes Intel Skylake compute nodes purchased in 2020 and 2021, over 5,000 new AMD Genoa compute cores were added as part of the HPCC SISF 2023 project. There is also a large, partially community-funded project storage on Grex.
This call is for renewing and updating the allocation of resources on the Grex HPC system, specifically CPU time (in Core Years, CY) and storage (in TBs). The generally available hardware is as follows:
* 30 AMD Genoa Zen4 (EPYC 9654) nodes, each with 192 cores and 4 GB or 8 GB of memory per core.
* 12 nodes with 40-core Intel Cascade Lake CPUs and 42 nodes with 52-core Intel Cascade Lake CPUs, each with 3 GB or 8 GB of memory per core.
* Two nodes with 4x NVIDIA V100 GPUs, Intel 5218 Cascade Lake CPUs, and 6 GB of memory per core.
The total number of allocatable CPU cores is 8,580.
* General GPU nodes are available on a first-come, first-served basis. There are no dedicated GPU allocations, as the number of GPUs (eight V100s) is limited. Please indicate in your proposal if you anticipate needing GPUs.
* Contributed resources: A significant number of contributed GPUs (34 total, comprising A30 and V100 models) and CPU cores (1,800 AMD Genoa) are available. These resources are owned by contributors and are not allocatable, but they can be used opportunistically when not in use by their owners.
The Project storage, allocatable per research group, has a total capacity of 2 PB, with 1.7 PB designated for the contributing Faculties of Agriculture, Engineering, and Science. Default /project allocations range from 5 to 40 TB per group. Larger storage allocations are possible through this call, especially for research groups from the contributing Faculties, which will be given priority for /project storage according to their contributions.
There is no charge for using Grex for University of Manitoba academic PIs.
The request for ARC resources must come from a UManitoba PI, and the unit/owner of a resource allocation is the PI’s research group.
Categories of Resource Requests
There are two categories of resource requests:
1.
Default allocations, or “Rapid Access Service” (RAS), are limited to under 60 core-years, 4 GB of memory per core, and 5 to 40 TB of Project storage per research group. RAS requests do not require a full proposal. For information and access, please email arc@umanitoba.ca.
2.
Resource Allocation Call (RAC): Users who need more core-years or storage than the RAS limits above for 2025 must request resources via an RAC proposal as part of this call. Proposals must be submitted in the format described below.
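To help size a request against the RAS limit, here is a rough illustration of the core-year (CY) unit. This is a back-of-the-envelope sketch, assuming the common convention that 1 CY equals one CPU core running continuously for one 365-day year (8,760 core-hours); it is not part of the official call.

```python
# Illustrative helper for estimating usage in core-years (CY).
# Assumption (not official policy): 1 CY = 1 core busy for 8,760 hours.
HOURS_PER_YEAR = 365 * 24  # 8,760

def core_years(cores: int, hours: float) -> float:
    """Core-years consumed by `cores` CPUs busy for `hours` wall-clock hours."""
    return cores * hours / HOURS_PER_YEAR

# Example: full 192-core nodes running for about 2,700 hours over the year
# consume roughly 59 CY, just under the 60 CY Rapid Access limit.
print(round(core_years(192, 2700), 1))  # ~59.2
```

Under this convention, a group expecting to exceed about 60 CY of annual usage should submit a full RAC proposal rather than rely on Rapid Access.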
====
Proposal Format and Submission Timelines
Please use the attached 2024 RAC Request Template for your proposal. You may remove all explanations (text in italics) if desired.
This template is intended as a guide. Expected lengths for each section are included in the template, but your proposal may use more or less space, depending on the size and complexity of the request.
Proposals for the use of Grex resources will be reviewed by the Advanced Research Computing (ARC) Committee and Grex technical staff to ensure resources are used efficiently. Based on available Grex resources and the total request volume, requested allocations may be scaled to fit within the system’s capacity.
Proposals must be sent by email to ARC@umanitoba.ca by 4:30 pm on November 28, 2024.