# distillation-model

Here are 26 public repositories matching this topic...

A minimalist SOTA LaTeX OCR model with only 20M parameters that runs in the browser. The full training pipeline is open-sourced for self-reproduction (a sketch of the distillation loss typically used to train such compact models follows this entry).

  • Updated Nov 8, 2025
  • Python
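The entry above does not spell out its training objective, so the following is only a minimal sketch of the classic knowledge-distillation loss (Hinton et al., 2015) that the distillation-model topic refers to: a temperature-softened KL term against a teacher's logits blended with ordinary cross-entropy on hard labels. The function name, `temperature`, and `alpha` weighting are illustrative assumptions, not the repository's actual code.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets,
                      temperature: float = 2.0, alpha: float = 0.5):
    """Blend soft-target KL against a teacher with hard-label cross-entropy.

    Hypothetical hyperparameters; not taken from any listed repository.
    """
    # Soften both distributions with the temperature, then match them.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2  # rescales soft-target gradients (Hinton et al., 2015)
    # Ordinary supervised loss on the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1 - alpha) * hard
```

Multiplying the KL term by the squared temperature keeps the soft-target gradients on the same scale as the hard-label term, which is why the factor appears in the original formulation.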

A transformer-based masked language model for learning amino acid sequence representations. The model uses self-attention with custom gating and incorporates protein features for richer sequence understanding; it is trained with BERT-style masking on peptide sequences to learn contextual amino acid embeddings (a minimal sketch of that masking setup follows this entry).

  • Updated Mar 7, 2025
  • Python
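To make the BERT-style masking concrete, here is a minimal sketch under assumptions not taken from the repository: a 20-letter amino acid vocabulary, hypothetical `[PAD]`/`[MASK]` token ids, the standard 15% masking rate, and BERT's 80/10/10 replacement split.

```python
import torch

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
PAD_ID, MASK_ID = 0, 1                      # hypothetical special-token ids
TOKEN_TO_ID = {aa: i + 2 for i, aa in enumerate(AMINO_ACIDS)}
VOCAB_SIZE = len(TOKEN_TO_ID) + 2

def encode(peptide: str) -> torch.Tensor:
    """Map a peptide string to a 1-D tensor of token ids."""
    return torch.tensor([TOKEN_TO_ID[aa] for aa in peptide])

def mask_tokens(ids: torch.Tensor, mask_rate: float = 0.15):
    """Return (inputs, labels) for masked-language-model training.

    Labels are -100 everywhere except the masked positions, where they
    hold the original token id.
    """
    inputs = ids.clone()
    labels = torch.full_like(ids, -100)

    # Choose roughly 15% of non-pad positions as prediction targets.
    candidates = (ids != PAD_ID) & (torch.rand(ids.shape) < mask_rate)
    labels[candidates] = ids[candidates]

    # Of the chosen positions: 80% become [MASK], 10% become a random
    # amino acid, and the remaining 10% are left unchanged.
    replace = candidates & (torch.rand(ids.shape) < 0.8)
    inputs[replace] = MASK_ID
    randomize = candidates & ~replace & (torch.rand(ids.shape) < 0.5)
    random_aas = torch.randint(2, VOCAB_SIZE, ids.shape)
    inputs[randomize] = random_aas[randomize]
    return inputs, labels

inputs, labels = mask_tokens(encode("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"))
```

Positions labeled -100 are ignored by PyTorch's cross-entropy loss, so the model is scored only on the masked residues.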
