ComfyUI docker images for use in GPU cloud and local environments. Includes AI-Dock base for authentication and improved user experience.
RunPod serverless worker for Fooocus-API. Standalone or with network volume
The Big List of Protests - An AI-assisted Protest Flyer parser and event aggregator
Runpod-LLM provides ready-to-use container scripts for running large language models (LLMs) easily on RunPod.
RunPod Serverless Worker for the Stable Diffusion WebUI Forge API
Headless threejs using Puppeteer
RunPod serverless worker for vLLM text-generation inference. Simple, optimized, and customizable.
A Chrome extension that helps improve reading comprehension by generating an interactive, multiple-choice quiz for any website
MLOps library for LLM deployment with the vLLM engine on RunPod's infrastructure.
Python client script for sending prompts to A1111 serverless worker endpoints and saving the results
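As a rough illustration of what such a client looks like, the sketch below posts a txt2img prompt to a RunPod serverless endpoint and writes the returned images to disk. The endpoint ID, API key, and payload/response fields are placeholders; the response shape follows common RunPod A1111 worker conventions rather than this specific repository.

```python
# Minimal client sketch, assuming the common RunPod /runsync API shape.
# ENDPOINT_ID, API_KEY, and the payload/response keys are hypothetical.
import base64
import requests

ENDPOINT_ID = "your-endpoint-id"   # placeholder
API_KEY = "your-runpod-api-key"    # placeholder

url = f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync"
payload = {
    "input": {
        "prompt": "a watercolor painting of a lighthouse at dawn",
        "steps": 20,
    }
}

resp = requests.post(
    url,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=300,
)
resp.raise_for_status()
result = resp.json()

# Many A1111-style workers return base64-encoded images; the exact
# key ("images" here) depends on the worker implementation.
images = result.get("output", {}).get("images", [])
for i, img_b64 in enumerate(images):
    with open(f"output_{i}.png", "wb") as f:
        f.write(base64.b64decode(img_b64))
```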
This repository contains the RunPod serverless component of the SDGP project "quizzifyme".
This project hosts the LLaMA 3.1 CPP model on RunPod's serverless platform using Docker. It features a Python 3.11 environment with CUDA 12.2, enabling scalable AI request processing through configurable payload options and GPU support.
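For context on how such a worker handles configurable payload options, here is a minimal sketch of the handler pattern used by the RunPod Python SDK. The payload keys and the generate() function are hypothetical stand-ins, not this project's actual code; a real worker would load its model once at startup and reuse it across jobs.

```python
# Worker-side sketch using the RunPod Python SDK's serverless handler pattern.
import runpod

def generate(prompt: str, max_tokens: int) -> str:
    # Placeholder for the actual model call (e.g. a llama.cpp binding).
    return f"[{max_tokens}-token completion for: {prompt}]"

def handler(job):
    # RunPod passes the request body's "input" object to the handler.
    job_input = job.get("input", {})
    prompt = job_input.get("prompt", "")
    max_tokens = int(job_input.get("max_tokens", 128))
    return {"text": generate(prompt, max_tokens)}

# Register the handler with the serverless runtime.
runpod.serverless.start({"handler": handler})
```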
RunPod serverless function for voice conversion using RVC-v2 (Retrieval-based Voice Conversion)