Attention Mechanisms in Transformers: Comparing MHA, MQA, and GQA

Background: The Transformer (Vaswani et al., 2017) is an encoder-decoder model that has demonstrated outstanding performance in natural language processing (NLP), leading to a series of optimized models built on it: BERT (Devlin et al., 2018), which uses only the encoder; the GPT series (Radford et al., 2018), which uses only the decoder; and subsequent large language models (LLMs) such as LLaMA (Touvron et al., 2023) and GPT-4 (OpenAI et al., 2024), most of which adopt a decoder-only architecture. ...

2025-01-16 · 29 min · 6139 words · Yue Shui

Building Domain-Specific LLMs

Background: With the widespread application of large language models (LLMs) across industries, enterprises and research teams urgently need to adapt general-purpose models to specific domains. Foundation LLMs often fail to meet deep domain-specific requirements on specialized tasks. For example, when applied to closed-source programming languages, existing open-source models lack sufficient understanding of their syntax and semantics, leading to poor performance on tasks such as code generation and error correction. Injecting domain knowledge and training dedicated LLMs has therefore become a key step toward improving development efficiency and code quality. ...

2025-01-05 · 21 min · 4340 words · Yue Shui

Building a Home Deep Learning Rig with Dual RTX 4090 GPUs

Rent a GPU or Buy Your Own? Before setting up a deep learning environment, consider usage duration, budget, data privacy, and maintenance overhead. If you have long-term needs (e.g., more than a year) and require strict data security, building your own GPU server usually costs less overall and gives you a more controllable environment. On the other hand, for short-term projects or when data privacy is not critical, renting cloud GPUs (e.g., Azure, AWS, GCP) or using free platforms (Colab, Kaggle) offers greater flexibility. ...

2024-12-21 · 10 min · 1988 words · Yue Shui