KubePACS is a Kubernetes-native spot instance provisioning system for building node pools that balance performance, availability, and cost. It uses cloud market signals such as spot price, benchmark performance, and availability scores to choose instance types, then integrates the decision into Karpenter so Kubernetes can provision those nodes through the normal autoscaling workflow.
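The instance-selection idea can be sketched as a weighted ranking over candidate instance types. This is an illustrative sketch only: the candidate data, weights, normalization, and function names below are assumptions for exposition, not the actual KubePACS optimizer.

```python
# Illustrative sketch: rank spot instance types by a weighted combination of
# spot price, benchmark performance, and availability. All numbers and weights
# here are hypothetical.

candidates = [
    # (instance type, spot price $/h, benchmark score, availability score 0-1)
    ("m5.large", 0.035, 23000, 0.92),
    ("c5.large", 0.033, 25000, 0.80),
    ("r5.large", 0.041, 22000, 0.97),
]

def rank(candidates, w_price=0.4, w_perf=0.4, w_avail=0.2):
    """Return candidates sorted best-first by a normalized weighted score."""
    max_price = max(c[1] for c in candidates)
    max_perf = max(c[2] for c in candidates)

    def score(c):
        _, price, perf, avail = c
        # Lower price is better, so invert its normalized value.
        return (w_price * (1 - price / max_price)
                + w_perf * (perf / max_perf)
                + w_avail * avail)

    return sorted(candidates, key=score, reverse=True)

for name, *_ in rank(candidates):
    print(name)
```

In KubePACS the winning instance types are then handed to Karpenter, which provisions the corresponding nodes through its normal autoscaling workflow.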
The project is described in the paper *KubePACS: Kubernetes Cluster Using Performant, Highly Available, and Cost Efficient Spot Instances*. A hosted project page is available at kubepacs.ddps.cloud.
- `KubePACS_with_Karpenter/`: Karpenter fork with the KubePACS scheduler path and Helm chart.
- `charts/`: static Helm repository output, including `index.yaml` and packaged chart archives.
- `IaC/IaC_karpenter_kubepacs/`: Terraform for deploying the EKS/Karpenter/KubePACS environment.
- `figures/`: scripts and data used to regenerate paper figures.
- `api/`: API wrapper for the KubePACS optimizer.
- `frontend/`: project website frontend.
This repository is intended to be publishable as a Helm repository. Once GitHub Pages is enabled for this repo, users can install the chart with:
```shell
helm repo add kubepacs https://ddps-lab.github.io/KubePACS/charts
helm repo update
helm upgrade --install karpenter kubepacs/karpenter \
  --namespace karpenter \
  --create-namespace \
  -f KubePACS_with_Karpenter/karpenter-provider-aws/charts/karpenter/examples/kubepacs-values.yaml
```

The chart installs Kubernetes resources into an existing EKS cluster. The AWS prerequisites must already exist: an EKS cluster, a controller IAM role, a node IAM role, tagged subnets and security groups, and an interruption queue if one is configured.
The default controller image is:

```
ghcr.io/ddps-lab/kubepacs-karpenter-controller:1.8.1-kubepacs
```
That image is built from `KubePACS_with_Karpenter/karpenter-provider-aws/Dockerfile` and includes:

- the KubePACS-enabled Karpenter controller binary
- `kubepacs_cli.py`
- `aws_coremark_singlecore.csv`
- the Python runtime dependencies used by the optimizer
If `settings.kubepacs.enabled=true`, the chart refuses to render with the upstream Karpenter image, because that image does not contain KubePACS.
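The guard behaves roughly like the following check. This is a Python sketch of the idea only: the real logic lives in the chart's Helm templates, and the values layout and image names below are assumptions, not the chart's actual schema.

```python
# Sketch: refuse to proceed when KubePACS is enabled but the configured
# controller image is not the KubePACS controller image (the upstream
# Karpenter image lacks the KubePACS components). The values structure and
# image names here are illustrative assumptions.

KUBEPACS_IMAGE = "ghcr.io/ddps-lab/kubepacs-karpenter-controller"
UPSTREAM_IMAGE = "example.registry/karpenter/controller"  # hypothetical upstream repo

def validate(values: dict) -> None:
    """Raise if KubePACS is enabled but a non-KubePACS image is configured."""
    enabled = values.get("settings", {}).get("kubepacs", {}).get("enabled", False)
    repo = (values.get("controller", {})
                  .get("image", {})
                  .get("repository", UPSTREAM_IMAGE))
    if enabled and repo != KUBEPACS_IMAGE:
        raise ValueError(
            "settings.kubepacs.enabled=true requires the KubePACS controller image"
        )
```

With the default KubePACS image the check passes silently; with any other repository it fails the render, mirroring the chart's refusal described above.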
Publishing is handled by `.github/workflows/publish-kubepacs-helm.yaml`.
For public installs, configure GitHub like this:
- Enable GitHub Pages with deployment from GitHub Actions.
- Run the **Publish KubePACS Helm Chart** workflow.
- Make the GHCR package `kubepacs-karpenter-controller` public.
To update the packaged Helm repository locally:
```shell
helm package KubePACS_with_Karpenter/karpenter-provider-aws/charts/karpenter --destination charts
helm repo index charts --url https://ddps-lab.github.io/KubePACS/charts
```

The Terraform environment under `IaC/IaC_karpenter_kubepacs/` creates the supporting AWS/EKS pieces and installs the local Helm chart.
```shell
cd IaC/IaC_karpenter_kubepacs
terraform init
terraform apply
```

The IaC defaults to the public KubePACS controller image, but the image can be overridden with:
```hcl
controller_image_repository = "ghcr.io/ddps-lab/kubepacs-karpenter-controller"
controller_image_tag        = "1.8.1-kubepacs"
controller_image_digest     = ""
```

Figure scripts live in `figures/`. They require Python 3.11+ and uv.
Install dependencies:
```shell
cd figures
uv sync
```

Run every figure script to regenerate all figures:
```shell
cd figures
find . -name 'figure_*.py' -print0 | sort -z | while IFS= read -r -d '' script; do
  dir=$(dirname "$script")
  file=$(basename "$script")
  (cd "$dir" && uv run python "$file")
done
```

Regenerate one figure:
```shell
cd figures/figure10_exp_k8s_karpenter
uv run python figure_10.py
```

The scripts overwrite the generated PDFs in each figure directory. Most figures use checked-in result data.
See `figures/README.md` for the full list of figure commands.
- Project page: https://kubepacs.ddps.cloud/
- Paper: https://arxiv.org/abs/2604.24027
- DOI: https://doi.org/10.1145/3801927.3810468
Please cite KubePACS as:
Taeyoon Kim, Kyumin Kim, Enrique Molina-Giménez, Pedro García-López,
and Kyungyong Lee. 2026. KubePACS: Kubernetes Cluster Using Performant,
Highly Available, and Cost Efficient Spot Instances. In 27th International
Middleware Conference (Middleware ’26), December 14–18, 2026, Tarragona,
Spain. ACM, New York, NY, USA, 14 pages.
https://doi.org/10.1145/3801927.3810468