forked from OSC/bc_osc_matlab
form.yml
92 lines (92 loc) · 2.53 KB
---
cluster: "arcc2"
form:
  - auto_accounts
  - custom_queue
  - bc_num_hours
  - bc_num_slots
  - num_cpus
  - num_gpus
  - node_type
  - custom_element
attributes:
  custom_queue:
    widget: "select"
    label: "Partition"
    options:
      - ["Public", "public"]
      - ["Short", "short"]
      - ["Large Memory", "large-mem"]
      - ["GPU V100", "gpu-v100"]
      - ["GPU A100", "gpu-a100"]
      - ["GPU H100", "gpu-h100"]
  bc_num_hours:
    label: "Walltime (Hours)"
    help: |
      - Partition durations:
        - Short: 4 hours maximum
        - All others: 480 hours maximum
      - Jobs that exceed their walltime are stopped automatically.
    cacheable: false
    widget: number_field
    max: 480
    min: 1
    step: 1
    value: 1
  bc_num_slots: 1
  num_cpus:
    label: "CPUs (Cores)"
    help: |
      - CPU partitions: allocate up to 64 cores
      - GPU partitions:
        - V100 and A100: allocate up to 32 CPU cores per GPU
        - H100: allocate up to 12 CPU cores per GPU
    cacheable: false
    widget: number_field
    max: 96
    min: 1
    step: 1
    value: 1
  num_gpus:
    label: "GPUs"
    help: |
      - GPUs available:
        - V100 and A100: 2 GPUs per node
        - H100: 8 GPUs per node
      - If requesting GPUs, you must select a GPU-enabled partition above.
    cacheable: false
    widget: number_field
    max: 8
    min: 0
    step: 1
    value: 0
  node_type: null
  custom_element:
    class: "d-none"
    skip_label: true
    help: |
      <div class="card mb-3">
        <div class="card-header">
          Memory per CPU
        </div>
        <div class="card-body">
          <p class="mb-2">
            Total memory available ≈ <strong>CPUs × Mem/CPU</strong>.
          </p>
          <div class="table-responsive">
            <table class="table table-sm mb-0">
              <thead>
                <tr><th>Partition</th><th>Mem / CPU</th></tr>
              </thead>
              <tbody>
                <tr><td>Public</td><td>3800 MB (≈ 3.71 GiB)</td></tr>
                <tr><td>Short</td><td>3800 MB (≈ 3.71 GiB)</td></tr>
                <tr><td>Large Memory</td><td>30400 MB (≈ 29.7 GiB)</td></tr>
                <tr><td>GPU-V100</td><td>3800 MB (≈ 3.71 GiB)</td></tr>
                <tr><td>GPU-A100</td><td>15200 MB (≈ 14.8 GiB)</td></tr>
                <tr><td>GPU-H100</td><td>10125 MB (≈ 9.89 GiB)</td></tr>
              </tbody>
            </table>
          </div>
        </div>
      </div>
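
For context: in Open OnDemand batch-connect apps, the values collected by `form.yml` are consumed by a companion `submit.yml.erb` template, which maps them to scheduler options. A minimal sketch of what that companion file might look like for this form, assuming a Slurm cluster (the file below is hypothetical and not part of this repository; flag names follow standard `sbatch` options):

```yaml
# submit.yml.erb (hypothetical companion file, not part of this repo)
---
script:
  native:
    # custom_queue, num_cpus, and num_gpus are the attribute names
    # defined in form.yml above; ERB substitutes the submitted values.
    - "--partition=<%= custom_queue %>"
    - "--cpus-per-task=<%= num_cpus %>"
    <%- if num_gpus.to_i > 0 -%>
    - "--gres=gpu:<%= num_gpus %>"
    <%- end -%>
```

The GPU flag is emitted only when a nonzero GPU count is requested, which matches the form's guidance that GPUs require a GPU-enabled partition.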