AI(2)
-
LLM setup: vLLM + TheBloke/SOLAR-10.7B-Instruct-v1.0-AWQ
llm.yml

```yaml
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: models-pvc
  namespace: llm
spec:
  accessModes: [ "ReadWriteOnce" ]
  resources:
    requests:
      storage: 100Gi
  storageClassName: ssd-local
  volumeName: models-pv-3080  # share the same local disk (create a separate PV if you don't want that)
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: vllm-solar
  namespace: llm
spec:
  replicas: 1
  selector:
    matchLabels: { app..
```
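The manifest above is truncated, but once the vLLM Deployment is running it serves an OpenAI-compatible HTTP API (port 8000 by default). A minimal client sketch, assuming a ClusterIP Service named `vllm-solar` in the `llm` namespace — the address and port are assumptions, adjust to the actual manifest:

```python
import json
import urllib.request

# Assumed in-cluster address; vLLM's OpenAI-compatible server
# listens on port 8000 unless --port overrides it.
BASE_URL = "http://vllm-solar.llm.svc.cluster.local:8000"


def build_chat_request(prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload for the AWQ model."""
    return {
        "model": "TheBloke/SOLAR-10.7B-Instruct-v1.0-AWQ",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 256,
        "temperature": 0.7,
    }


def chat(prompt: str) -> str:
    """POST to /v1/chat/completions and return the first choice's text."""
    req = urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Print the payload only; calling chat() requires the server to be up.
    print(json.dumps(build_chat_request("Say hello in Korean."), indent=2))
```

The same payload works with any OpenAI-compatible client library by pointing its base URL at the Service.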
2025.08.28 -
LLM setup: llama.cpp (server-cuda) + solar-10.7b-instruct-v1.0.Q4_K_M.gguf
llm.yml

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: llama-solar-gguf
  namespace: llm
spec:
  replicas: 1
  selector:
    matchLabels: { app: llama-solar-gguf }
  template:
    metadata:
      labels: { app: llama-solar-gguf }
    spec:
      nodeSelector:
        kubernetes.io/hostname: "3080"
      runtimeClassName: nvidia
      containers:
        - name: server
          image: ghcr.io/ggerganov/llama.c..
```
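The llama.cpp server image also exposes a built-in HTTP API, including a native `/completion` endpoint. A sketch of a local client, assuming the pod is reached via `kubectl port-forward deploy/llama-solar-gguf 8080:8080 -n llm` — the port and field defaults here are assumptions, adjust to the container args:

```python
import json
import urllib.request

# Assumed local address after kubectl port-forward; 8080 is a common
# --port value for the llama.cpp server, not guaranteed by the manifest.
BASE_URL = "http://127.0.0.1:8080"


def build_completion_request(prompt: str, n_predict: int = 128) -> dict:
    """Payload for the llama.cpp server's native /completion endpoint."""
    return {
        "prompt": prompt,
        "n_predict": n_predict,  # maximum number of tokens to generate
        "temperature": 0.7,
        "stream": False,         # return one JSON body instead of SSE chunks
    }


def complete(prompt: str) -> str:
    """POST to /completion and return the generated text."""
    req = urllib.request.Request(
        f"{BASE_URL}/completion",
        data=json.dumps(build_completion_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["content"]
```

Recent llama.cpp server builds also serve an OpenAI-compatible `/v1/chat/completions`, so the same client code as the vLLM setup can be reused against this Deployment.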
2025.08.27