
# Mastering LLMs: Training, Fine-Tuning, and Best Practices

A comprehensive full-day workshop covering the fundamentals and advanced concepts of Large Language Models (LLMs), from theoretical foundations to practical implementation and deployment strategies.

> [!NOTE]
> Visit the workshop website to navigate the materials with ease.

## Workshop Details

- 📅 **Date:** August 23rd, 2025
- 🎟️ **Registration:** Workshop Link

## Modules

- Fundamentals of text representation
- Contextual embeddings using transformers
- Internals of the transformer architecture: the attention mechanism, embeddings, and the other core components that make up large language models
- Hugging Face pipelines for the different tasks a language model can handle: classification, text generation, etc.
- Fine-tuning a pretrained GPT-2 for code generation
- LLM optimizations:
  - PEFT (parameter-efficient fine-tuning)
  - Quantization and LoRA
- Instruction tuning
- LLM alignment (performance tuning) using RLHF/PPO
- Retrieval-Augmented Generation (RAG)
- LangChain
- DSPy
- Tool/function calling
- Model Context Protocol (MCP)
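As a taste of the architecture module, here is a minimal, framework-free sketch of scaled dot-product attention for a single query, the core operation covered under the attention mechanism. The toy vectors are illustrative only and are not taken from the workshop notebooks:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Scaled dot-product attention for a single query vector.

    query: list[float] of dimension d
    keys, values: lists of vectors (list[list[float]])
    Returns the attention-weighted sum of the value vectors.
    """
    d = len(query)
    # Similarity of the query to each key, scaled by sqrt(d).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    # Weighted combination of the value vectors.
    return [sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))]

# Toy example: the query aligns with the first key,
# so the output leans toward the first value vector.
q = [1.0, 0.0]
ks = [[1.0, 0.0], [0.0, 1.0]]
vs = [[10.0, 0.0], [0.0, 10.0]]
out = attention(q, ks, vs)
```

Real transformer layers batch this over matrices of queries, keys, and values per head, but the weighting logic is the same.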
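To motivate the LoRA item in the optimizations module, the sketch below shows the arithmetic behind a LoRA-style update: the frozen weight matrix `W` is adjusted by a scaled low-rank product `B @ A`, where only `A` and `B` are trained. The matrices and hyperparameters here are toy values for illustration, not the workshop's actual configuration:

```python
def matmul(a, b):
    # Naive matrix multiply for small toy matrices.
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def lora_update(W, A, B, alpha, r):
    """Apply a LoRA-style low-rank update: W' = W + (alpha / r) * (B @ A).

    W: frozen weight matrix (d_out x d_in)
    B: d_out x r, A: r x d_in -- the only trained parameters.
    """
    delta = matmul(B, A)
    scale = alpha / r
    return [[W[i][j] + scale * delta[i][j] for j in range(len(W[0]))]
            for i in range(len(W))]

# Toy example: rank-1 update of a 2x2 frozen weight matrix.
W = [[1.0, 0.0], [0.0, 1.0]]
B = [[1.0], [2.0]]
A = [[0.5, 0.5]]
W_new = lora_update(W, A, B, alpha=2.0, r=1)
```

The point of the trick: for a d x d layer, full fine-tuning trains d² parameters, while a rank-r adapter trains only 2·d·r, which is why LoRA pairs so well with quantized base weights in low-resource setups.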
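The retrieval half of RAG boils down to ranking document embeddings by similarity to a query embedding. Here is a minimal cosine-similarity retriever in plain Python; in practice the embeddings come from a model and a vector store such as Chroma handles the search, but the toy vectors below make the ranking step concrete:

```python
import math

def cosine(a, b):
    # Cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, doc_vecs, k=2):
    """Return indices of the k document vectors most similar to the query."""
    ranked = sorted(range(len(doc_vecs)),
                    key=lambda i: cosine(query_vec, doc_vecs[i]),
                    reverse=True)
    return ranked[:k]

# Toy embeddings: documents 0 and 2 point roughly in the query's direction.
q = [1.0, 0.0]
docs = [[0.9, 0.1], [0.0, 1.0], [0.7, 0.7]]
top = retrieve(q, docs, k=2)
```

The retrieved documents are then stuffed into the LLM's prompt as context, which is the "augmented generation" half of the pattern.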
## Setup Instructions

### Prerequisites

Before attending the workshop, please ensure you have the following:

#### Access/Services

```shell
# Clone the repository
git clone https://github.com/raghavbali/mastering_llms_workshop_dhs2025.git
cd mastering_llms_workshop_dhs2025
```

### Environment Setup

- Notebooks are self-contained for quick setup
- Modules are aimed at low-resource setups and are Colab compatible
### Background Knowledge

- Familiarity with Python, PyTorch, and the Python ecosystem
- Understanding of neural networks and deep learning concepts

## Previous Workshops

## Thanks

A huge round of thanks to the amazing teams at:

PyTorch, Hugging Face, Ollama, Unsloth, Chroma, and Model Context Protocol.

## Citation

If you use materials from this workshop in your research or projects, please cite:

```bibtex
@misc{mastering_llms_workshop_2025,
  title={Mastering LLMs: Training, Fine-Tuning, and Best Practices},
  author={Raghav Bali},
  year={2025},
  url={https://github.com/raghavbali/mastering_llms_workshop_dhs2025},
  note={Workshop materials for DHS 2025}
}
```

## Contact

For questions about the workshop content or materials, please open an issue in this repository.

**Author:**