LogoAIAny

Tag

Explore by tags

  • All
  • 30u30
  • ASR
  • ChatGPT
  • GNN
  • IDE
  • RAG
  • ai-agent
  • ai-api
  • ai-api-management
  • ai-client
  • ai-coding
  • ai-demos
  • ai-development
  • ai-framework
  • ai-image
  • ai-image-demos
  • ai-inference
  • ai-leaderboard
  • ai-library
  • ai-rank
  • ai-serving
  • ai-tools
  • ai-train
  • ai-video
  • ai-workflow
  • AIGC
  • alibaba
  • amazon
  • anthropic
  • audio
  • blog
  • book
  • bytedance
  • chatbot
  • chemistry
  • claude
  • course
  • deepmind
  • deepseek
  • engineering
  • foundation
  • foundation-model
  • gemini
  • github
  • google
  • gradient-booting
  • grok
  • huggingface
  • LLM
  • llm
  • math
  • mcp
  • mcp-client
  • mcp-server
  • meta-ai
  • microsoft
  • mlops
  • NLP
  • nvidia
  • ocr
  • ollama
  • openai
  • paper
  • physics
  • plugin
  • pytorch
  • RL
  • science
  • sora
  • translation
  • tutorial
  • vibe-coding
  • video
  • vision
  • xAI
  • xai


LLMs-from-scratch

2023
Sebastian Raschka

An open-source GitHub repository by Sebastian Raschka that contains the official code for the book "Build A Large Language Model (From Scratch)". It provides step-by-step PyTorch implementations to build, pretrain, and finetune a GPT-like LLM for educational purposes, along with exercises, bonus material, and companion video content.

Tags: pytorch, book, LLM, llm, ai-coding, +1

Machine Learning Systems (MLSysBook)

2023
Harvard EDGE / MLSysBook community; lead author Vijay Janapa Reddi

MLSysBook (Machine Learning Systems) is an open, community-driven textbook and learning stack for AI systems engineering led by the Harvard EDGE / MLSysBook community. The repository houses the textbook source, TinyTorch (a small educational DL framework), hardware lab kits, and supporting materials to teach how to design, build, benchmark, and deploy real-world machine learning systems.

Tags: github, book, ai-development, ai-framework, mlops, +4

Hands-On Large Language Models

2024
Jay Alammar, Maarten Grootendorst +1

Official code repository for the O'Reilly book "Hands-On Large Language Models" by Jay Alammar and Maarten Grootendorst. It provides runnable notebooks, visual explanations, and practical examples across chapters covering tokens and embeddings, transformer internals, text classification, semantic search, fine-tuning, multimodal models, and more. Recommended to run in Google Colab for easy setup.

Tags: book, llm, LLM, github, tutorial, +5

Foundations of LLMs (大模型基础)

2024
ZJU-LLMs

Foundations of LLMs is an open-source book by the ZJU-LLMs team that teaches the fundamentals and advanced topics of large language models. It covers language model basics, the evolution of LLM architectures, prompt engineering, parameter-efficient fine-tuning, model editing, and retrieval-augmented generation. The repo provides chapter PDFs and paper lists, and is updated monthly.

Tags: book, foundation, LLM, llm, NLP, +3

Pattern Recognition and Machine Learning

2006
Christopher M. Bishop

The book covers probabilistic approaches to machine learning, including Bayesian networks, graphical models, kernel methods, and EM algorithms. It emphasizes a statistical perspective over purely algorithmic approaches, helping formalize machine learning as a probabilistic inference problem. Its clear mathematical treatment and broad coverage have made it a standard reference for researchers and graduate students. The book’s impact lies in shaping the modern probabilistic framework widely used in fields like computer vision, speech recognition, and bioinformatics, deeply influencing the development of Bayesian machine learning methods.

Tags: foundation, book

The Elements of Statistical Learning

2009
Trevor Hastie, Robert Tibshirani +1

The book unifies key machine learning and statistical methods — from linear models and decision trees to boosting, support vector machines, and unsupervised learning. Its clear explanations, mathematical rigor, and practical examples have made it a cornerstone for researchers and practitioners alike. The book has deeply influenced both statistics and computer science, shaping how modern data science integrates theory with application, and remains a must-read reference for anyone serious about statistical learning and machine learning.

Tags: foundation, book

Machine Super Intelligence

2011
Shane Legg

This book develops a formal theory of intelligence, defining it as an agent’s capacity to achieve goals across computable environments and grounding the concept in Kolmogorov complexity, Solomonoff induction, and Hutter’s AIXI framework. It shows how these idealised constructs unify prediction, compression, and reinforcement learning, yielding a universal intelligence measure while exposing the impracticality of truly optimal agents due to incomputable demands. Finally, it explores how approximate implementations could trigger an intelligence explosion and stresses the profound ethical and existential stakes posed by machines that surpass human capability.

Tags: foundation, 30u30, book
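
The "universal intelligence measure" mentioned above has a compact standard form in Legg and Hutter's work: an agent $\pi$ is scored by its expected reward $V^{\pi}_{\mu}$ in each computable environment $\mu$, weighted by the environment's simplicity $2^{-K(\mu)}$, where $K$ is Kolmogorov complexity. A sketch of that definition:

```latex
% Legg–Hutter universal intelligence of an agent \pi over the set E
% of computable environments; simpler environments (small K(\mu))
% receive exponentially larger weight.
\Upsilon(\pi) \;=\; \sum_{\mu \in E} 2^{-K(\mu)} \, V^{\pi}_{\mu}
```

Because $K$ is incomputable, $\Upsilon$ can only be approximated, which is exactly the impracticality of truly optimal agents that the book emphasises.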

Machine Learning: A Probabilistic Perspective

2012
Kevin P. Murphy

The book offers a comprehensive, mathematically rigorous introduction to machine learning through the lens of probability and statistics. Covering topics from Bayesian networks to graphical models and deep learning, it emphasizes probabilistic reasoning and model uncertainty. The book has become a cornerstone text in academia and industry, influencing how researchers and practitioners think about probabilistic modeling. It’s widely used in graduate courses and cited in numerous research papers, shaping a generation of machine learning experts with a solid foundation in probabilistic approaches.

Tags: foundation, book

Deep Learning

2016
Ian Goodfellow, Yoshua Bengio +1

The book provides a comprehensive introduction to deep learning, covering foundational concepts like neural networks, optimization, convolutional and recurrent architectures, and probabilistic approaches. It bridges theory and practice, making it essential for both researchers and practitioners. Its impact has been profound, shaping modern AI research and education, inspiring breakthroughs in computer vision, natural language processing, and reinforcement learning, and serving as the go-to reference for anyone entering the deep learning field.

Tags: foundation, book

Probabilistic Machine Learning: An Introduction

2022
Kevin Patrick Murphy

The book provides a comprehensive yet accessible introduction to probabilistic modeling and inference, covering topics like graphical models, Bayesian methods, and approximate inference. It balances theory with practical examples, making complex probabilistic concepts understandable for newcomers and useful for practitioners. Its impact lies in shaping how students and researchers approach uncertainty in machine learning, offering a unifying probabilistic perspective that has influenced research, teaching, and real-world applications across fields such as AI, robotics, and data science.

Tags: foundation, book

Kolmogorov Complexity and Algorithmic Randomness

2022
A. Shen, V. A. Uspensky +1

This book offers a comprehensive introduction to algorithmic information theory: it defines plain and prefix Kolmogorov complexity, explains the incompressibility method, relates complexity to Shannon information, and develops tests of randomness culminating in Martin-Löf randomness and Chaitin’s Ω. It surveys links to computability theory, mutual information, algorithmic statistics, Hausdorff dimension, ergodic theory, and data compression, providing numerous exercises and historical notes. By unifying complexity and randomness, it supplies rigorous tools for measuring information content, proving combinatorial lower bounds, and formalizing the notion of random infinite sequences, thus shaping modern theoretical computer science.

Tags: foundation, 30u30, book, math
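
The plain Kolmogorov complexity the book starts from has a one-line definition: the length of the shortest program that makes a fixed universal machine $U$ output the string $x$:

```latex
% Plain Kolmogorov complexity of a string x with respect to a
% fixed universal machine U; |p| denotes the length of program p.
C_U(x) \;=\; \min \{\, |p| \;:\; U(p) = x \,\}
% Invariance theorem: for any other machine V there is a constant
% c_V with C_U(x) \le C_V(x) + c_V, so the choice of U matters
% only up to an additive constant.
```

A simple counting argument shows most strings are incompressible: fewer than $2^{n-k}$ strings of length $n$ have complexity below $n-k$, which is the fact behind the incompressibility method the blurb mentions.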

Deep Learning: Foundations and Concepts

2023
Chris Bishop, Hugh Bishop

The book introduces core principles and theoretical foundations behind deep learning, bridging the gap between classical machine learning and modern neural networks. It explains key architectures, optimization techniques, and mathematical frameworks that underpin today’s AI systems. By combining rigorous treatment with accessible explanations, it empowers researchers and practitioners to understand not just how deep models work, but why. Its impact lies in deepening the academic rigor of the field, shaping curricula, and guiding both industry innovation and the next generation of AI breakthroughs.

Tags: foundation, book