RAZER AIKIT

START LOCAL, SCALE TO CLUSTERS. BUILD ADVANCED AI END-TO-END.

Experience the full power of Razer’s GPU-powered devices with Razer AIKit, the open-source AI development toolkit built for engineers and researchers. Designed for easy, out-of-the-box setup, Razer AIKit delivers cloud-grade GPU acceleration and scalability, so AI developers can run LLMs locally on a single GPU or across a cluster of GPUs.

Features

Run and fine-tune more than 280,000 LLMs locally with familiar frameworks, keeping full performance and data privacy while avoiding cloud costs.

When local power isn’t enough, connect your GPUs into a single cluster for seamless scaling and optimal performance with minimal setup.

Fully open-source, Razer AIKit invites developers to customize and contribute, advancing high-performance local AI development together.

FAQ

What GPU do I need for model inferencing and fine-tuning?

Razer AIKit supports all NVIDIA GPUs (Workstation/Consumer, Data Center, and Jetson) with compute capability 7.0 or higher. See the full list on NVIDIA’s GPU page.
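As a quick check (a minimal sketch, assuming a CUDA-enabled PyTorch build is installed), you can query each GPU’s compute capability before installing Razer AIKit:

    import torch

    # Prints the name and compute capability (major, minor) of every visible GPU.
    for i in range(torch.cuda.device_count()):
        major, minor = torch.cuda.get_device_capability(i)
        print(f"{torch.cuda.get_device_name(i)}: compute capability {major}.{minor}")

    # Razer AIKit requires compute capability 7.0 or higher (major >= 7).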

What is Razer AIKit built on?

Razer AIKit is built on vLLM, LlamaFactory, and Ray, combining fast inference, efficient fine-tuning, and multi-GPU scaling.
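To illustrate the underlying stack, here is a minimal sketch of plain vLLM offline inference (generic vLLM usage, not Razer AIKit’s own API; the model ID is only an example):

    from vllm import LLM, SamplingParams

    # Plain vLLM offline inference; the Hugging Face model ID below is illustrative.
    llm = LLM(model="Qwen/Qwen2.5-7B-Instruct")
    params = SamplingParams(temperature=0.7, max_tokens=128)
    outputs = llm.generate(["Explain tensor parallelism in one paragraph."], params)
    print(outputs[0].outputs[0].text)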

Which models can I run with Razer AIKit?

Any vLLM-compatible model from Hugging Face Hub – over 280,000 LLMs available and ready to run locally.

Does Razer AIKit only run on Razer devices?

Razer AIKit is optimized and tested on Razer devices for peak stability and performance. However, it’s not limited to Razer hardware – Razer AIKit runs on any system equipped with a compatible GPU.

Which applications can I connect to Razer AIKit?

Razer AIKit comes with OpenAI API integration, enabling connections to tools like Open WebUI, Continue Coding Assistant, AnythingLLM, and AI Dev Gallery.

Use them for tasks such as code generation, QA companions, research assistance, and local AI experiments.
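Since Razer AIKit exposes OpenAI API integration, a standard OpenAI client can be pointed at a locally served model. A minimal sketch, assuming the official openai Python package; the base URL and model name below are placeholders, so substitute the endpoint and model your local server actually exposes:

    from openai import OpenAI

    # Placeholder endpoint and model ID; point these at your local Razer AIKit server.
    client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

    response = client.chat.completions.create(
        model="Qwen/Qwen2.5-7B-Instruct",
        messages=[{"role": "user", "content": "Write a unit test for a FizzBuzz function."}],
    )
    print(response.choices[0].message.content)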