Stars
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal tasks, for both inference and training.
Langchain-Chatchat (formerly Langchain-ChatGLM): local-knowledge-based RAG and Agent applications built on Langchain and LLMs such as ChatGLM, Qwen, and Llama.
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
Best Practices on Recommendation Systems
An open-source AutoML toolkit that automates the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyperparameter tuning.
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
A flexible PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments (see the KD-loss sketch after this list).
NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego
Source code and dataset for ACL 2019 paper "ERNIE: Enhanced Language Representation with Informative Entities"
CNNs for Sentence Classification in PyTorch
A framework for the CTR prediction problem. The project contains FNN, PNN, DeepFM, NFM, etc.
SIGKDD'2019: DeepGBM: A Deep Learning Framework Distilled by GBDT for Online Prediction Tasks
A Benchmark of Text Classification in PyTorch
Best CIFAR-10, CIFAR-100 results with wide-residual networks using PyTorch
A Python implementation of GBDT+LR for CTR prediction (see the GBDT+LR sketch after this list).
Implements quantized distillation. Code for our paper "Model compression via distillation and quantization"
Source code for "Efficient Training of BERT by Progressively Stacking"
SIGKDD'2022: Mixture of Virtual-Kernel Experts for Multi-Objective User Profile Modeling
Continuous relaxation of Random Regression Forests
Neural Ensemble Trees (NET): a neural model based on RF or GBDT.
Research Benchmarks for Online Prediction Tasks.
AAAI'2020: Light Multi-segment Activation for model compression
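For reference, here is a minimal sketch of the standard knowledge-distillation loss (Hinton et al.) behind KD experiments like those above; `student_logits`, `teacher_logits`, and the hyperparameter values are illustrative assumptions, not code taken from any listed repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft targets: KL divergence between temperature-softened distributions,
    # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```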
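And a minimal sketch of the GBDT+LR pattern for CTR prediction, using scikit-learn on synthetic data; every name and parameter here is an illustrative assumption, not code from the listed repository. The idea is to use GBDT leaf indices as learned categorical features for a linear model.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import OneHotEncoder

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 1) Fit a GBDT; apply() maps each sample to the leaf it lands in per tree.
gbdt = GradientBoostingClassifier(n_estimators=50, max_depth=3, random_state=0)
gbdt.fit(X_train, y_train)
leaves_train = gbdt.apply(X_train)[:, :, 0]  # shape: (n_samples, n_trees)
leaves_test = gbdt.apply(X_test)[:, :, 0]

# 2) One-hot encode the leaf indices and fit a logistic regression on them.
enc = OneHotEncoder(handle_unknown="ignore")
lr = LogisticRegression(max_iter=1000)
lr.fit(enc.fit_transform(leaves_train), y_train)

# Predicted CTR-style probabilities for the first few test samples.
print(lr.predict_proba(enc.transform(leaves_test))[:5, 1])
```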
