Best learning resources for AI
The vLLM project’s control plane for orchestrating cost-efficient, plug-and-play LLM inference infrastructure.
Tabby is an open-source, self-hosted code-completion engine that runs on your GPU or CPU.
The best introduction to using LLMs like ChatGPT. It covers the basics of how LLMs work, including concepts like "tokens" and "context windows", then demonstrates practical applications such as knowledge-based queries and more advanced features like "thinking models" for complex reasoning. It also explores how LLMs can use external tools for internet search and deep research. Finally, the video delves into the multimodal capabilities of LLMs, including their use of voice, images, and video.
Universal database gateway MCP server that lets agents explore MySQL, Postgres, SQL Server, MariaDB and more.
Model Context Protocol (MCP) bridge that lets Claude AI inspect, create and manipulate Blender scenes programmatically.
NVIDIA Dynamo is an open-source, high-throughput, low-latency inference framework that scales generative-AI and reasoning models across large, multi-node GPU clusters.
Continue is an open-source IDE extension and hub for creating custom AI coding assistants.
Cross-platform desktop automation MCP server that lets AI run terminal commands, manage processes and edit local files.
An MCP server for large-scale mobile automation and scraping on iOS & Android emulators, simulators and real devices.
Connects Supabase projects to Cursor, Claude and other MCP-aware assistants for schema introspection and data queries.
Open-source AI query engine that lets you ask questions and build AI agents over data spread across databases, warehouses, and SaaS apps.
A remote Model Context Protocol (MCP) server that turns any GitHub project into an always-up-to-date documentation hub for AI assistants, helping prevent code hallucinations.
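Several of the entries above are Model Context Protocol (MCP) servers. MCP messages are JSON-RPC 2.0 requests and responses; the sketch below builds the two core requests, `tools/list` (discover a server's tools) and `tools/call` (invoke one), using only the standard library. The tool name `run_query` and its arguments are hypothetical placeholders, not taken from any server listed here.

```python
import json

def jsonrpc_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request, the wire format MCP messages use."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# Ask an MCP server which tools it exposes.
list_tools = jsonrpc_request(1, "tools/list")

# Invoke one of those tools by name; the tool name and arguments here
# are hypothetical -- real names come from the tools/list response.
call_tool = jsonrpc_request(2, "tools/call", {
    "name": "run_query",
    "arguments": {"sql": "SELECT 1"},
})

print(list_tools)
print(call_tool)
```

How the messages are transported (stdio for local servers, HTTP for remote ones) varies by server, but the request shape above is the same in every case.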