Resource Shapes
When you launch a workload such as a pod or a job, you can specify the resource shape to use. This is done in the Resource section of the create workload page.

Available Resource Shapes
The available resource shapes are listed on the pricing page. For enterprise plan users, we can offer dedicated or tailored resource shapes; contact us to discuss more options.

GPU availability
Sometimes you may experience a longer wait time when requesting a GPU in the on-demand node group due to high demand. If you need a consistent number of GPUs for your workload, please contact us to discuss options.
FAQ
Is my code running in a GPU environment?
Inside a photon deployment, you may want to find out whether the deployment comes with GPUs. Lepton provides the environment variable LEPTON_RESOURCE_ACCELERATOR_TYPE
to help you with that. You can use it as follows:
import os

# LEPTON_RESOURCE_ACCELERATOR_TYPE is only set when the deployment has an accelerator.
if os.environ.get("LEPTON_RESOURCE_ACCELERATOR_TYPE") is not None:
    print("I have a GPU!")
else:
    print("I don't have a GPU!")
Or, if you are using PyTorch, an easy way is to use torch's built-in function:
import torch

# torch.cuda.is_available() reports whether a CUDA-capable GPU is usable.
if torch.cuda.is_available():
    print("I have a GPU!")
else:
    print("I don't have a GPU!")