Poolside
Switzerland

Starred repositories
Perplexity's open-source garden for inference technology
DeepEP: an efficient expert-parallel communication library
[TMLR] A curated list of language modeling research for code (and other software engineering activities), plus related datasets.
Static analyzer for C/C++ based on the theory of Abstract Interpretation.
Official Implementation of "ADOPT: Modified Adam Can Converge with Any β2 with the Optimal Rate"
Dear ImGui: Bloat-free Graphical User interface for C++ with minimal dependencies
Open source implementation of AlphaFold3
A minimal Python framework for building custom AI inference servers with full control over logic, batching, and scaling.
Micro Llama is a small Llama-based model with 300M parameters, trained from scratch on a $500 budget
Schedule-Free Optimization in PyTorch
PyTorch compiler that accelerates training and inference. Get built-in optimizations for performance, memory, parallelism, and easily write your own.
Website for hosting the Open Foundation Models Cheat Sheet.
The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
A Data Streaming Library for Efficient Neural Network Training
Graph Neural Network Library for PyTorch
A PyTorch Lightning extension that accelerates and enhances foundation model experimentation with flexible fine-tuning schedules.
20+ high-performance LLMs with recipes to pretrain, finetune and deploy at scale.
The OpenTF Manifesto expresses concern over HashiCorp's switch of the Terraform license from open-source to the Business Source License (BSL) and calls for the tool's return to a truly open-source …
Machine Learning Engineering Open Book
Refine high-quality datasets and visual AI models
Minimalistic boilerplate code for training PyTorch models
Databricks’ Dolly, a large language model trained on the Databricks Machine Learning Platform
Fine-tune Segment-Anything Model with Lightning Fabric.
Implementation of the LLaMA language model based on nanoGPT. Supports flash attention, Int8 and GPTQ 4bit quantization, LoRA and LLaMA-Adapter fine-tuning, pre-training. Apache 2.0-licensed.
Analysis, comparison, trends, and rankings of open-source software; you can also get insights from more than 7 billion with natural language (powered by OpenAI). Follow us on Twitter: https://twitter.co…
Visualize Your Ideas With Code
This dataset contains over 110 hours of motion, eye-tracking and physiological data from 71 players of the virtual reality game “Half-Life: Alyx”. Each player played the game on two separate days f…
A library that contains a rich collection of performant PyTorch model metrics, a simple interface to create new metrics, a toolkit to facilitate metric computation in distributed training and tools…