Resource Shapes
When you launch photons, by default we use the cpu.small resource shape, which consists of 1 CPU core and 4 GB of memory. You may choose a different resource shape depending on your use case by specifying the --resource-shape
flag when launching photons, for example:
lep photon run -n myphoton --resource-shape gpu.t4
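If you omit the flag, the launch falls back to the default cpu.small shape, so (assuming cpu.small is still the default for your workspace) the following two commands are equivalent:

lep photon run -n myphoton
lep photon run -n myphoton --resource-shape cpu.small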
Available Resource Shapes
The available resource shapes can be found on the pricing page. For enterprise plans, we can offer dedicated or tailored resource shapes. Contact us to discuss options.
Is my code running in a GPU environment?
Inside a photon deployment, you may want to find out whether the deployment comes with GPUs. Lepton provides the environment variable LEPTON_RESOURCE_ACCELERATOR_TYPE
to help you with that. You can use it as follows:
import os

if os.environ.get("LEPTON_RESOURCE_ACCELERATOR_TYPE") is not None:
    print("I have a GPU!")
else:
    print("I don't have a GPU!")
Or, if you are using PyTorch, an easy way is to use torch's built-in function:
import torch

if torch.cuda.is_available():
    print("I have a GPU!")
else:
    print("I don't have a GPU!")
GPU availability
Keep in mind that GPUs are currently in high demand and may not be as readily available as CPUs, so you may experience a longer wait time when requesting one. If you need a consistent number of GPUs for your workload, please contact us to discuss options.