Quick Start Guide to Large Language Models
$49.99
Quantity | Discount
---|---
5+ | $37.49
Description
Foreword to the First Edition
Preface
Acknowledgments
About the Author
Part I: Introduction to Large Language Models
Chapter 1: Overview of Large Language Models
Chapter 2: Semantic Search with LLMs
Chapter 3: First Steps with Prompt Engineering
Chapter 4: The AI Ecosystem: Putting the Pieces Together
Part II: Getting the Most Out of LLMs
Chapter 5: Optimizing LLMs with Customized Fine-Tuning
Chapter 6: Advanced Prompt Engineering
Chapter 7: Customizing Embeddings and Model Architectures
Chapter 8: AI Alignment: First Principles
Part III: Advanced LLM Usage
Chapter 9: Moving Beyond Foundation Models
Chapter 10: Advanced Open-Source LLM Fine-Tuning
Chapter 11: Moving LLMs into Production
Chapter 12: Evaluating LLMs
Part IV: Appendixes
Appendix A: LLM FAQs
Appendix B: LLM Glossary
Appendix C: LLM Application Archetypes
Index
Sinan Ozdemir is currently the founder and CTO of LoopGenius and an advisor to several AI companies. Sinan is a former lecturer of Data Science at Johns Hopkins University and the author of multiple textbooks on data science and machine learning. Additionally, he is the founder of the recently acquired Kylie.ai, an enterprise-grade conversational AI platform with RPA capabilities. He holds a master’s degree in Pure Mathematics from Johns Hopkins University and is based in San Francisco, CA.
The Practical, Step-by-Step Guide to Using LLMs at Scale in Projects and Products
Large Language Models (LLMs) like Llama 3, Claude 3, and the GPT family are demonstrating breathtaking capabilities, but their size and complexity have deterred many practitioners from applying them. In Quick Start Guide to Large Language Models, Second Edition, pioneering data scientist and AI entrepreneur Sinan Ozdemir clears away those obstacles and provides a guide to working with, integrating, and deploying LLMs to solve practical problems.
Ozdemir brings together all you need to get started, even if you have no direct experience with LLMs: step-by-step instructions, best practices, real-world case studies, hands-on exercises, and more. Along the way, he shares insights into LLMs’ inner workings to help you optimize model choice, data formats, prompting, fine-tuning, performance, and much more. The resources on the companion website include sample datasets and up-to-date code for working with open- and closed-source LLMs such as those from OpenAI (GPT-4 and GPT-3.5), Google (BERT, T5, and Gemini), X (Grok), Anthropic (the Claude family), Cohere (the Command family), Meta (BART and the LLaMA family), and more.
- Learn key concepts: pre-training, transfer learning, fine-tuning, attention, embeddings, tokenization, and more
- Use APIs and Python to fine-tune and customize LLMs for your requirements
- Build a complete neural/semantic information retrieval system and attach it to conversational LLMs to build retrieval-augmented generation (RAG) chatbots and AI agents
- Master advanced prompt engineering techniques like output structuring, chain-of-thought prompting, and semantic few-shot prompting
- Customize LLM embeddings to build a complete recommendation engine from scratch with user data that outperforms out-of-the-box embeddings from OpenAI
- Construct and fine-tune multimodal Transformer architectures from scratch using open-source LLMs and large visual datasets
- Align LLMs using Reinforcement Learning from Human and AI Feedback (RLHF/RLAIF) to build conversational agents from open models like Llama 3 and FLAN-T5
- Deploy prompts and custom fine-tuned LLMs to the cloud with scalability and evaluation pipelines in mind
- Diagnose and optimize LLMs for speed, memory, and performance with quantization, probing, benchmarking, and evaluation frameworks
“A refreshing and inspiring resource. Jam-packed with practical guidance and clear explanations that leave you smarter about this incredible new field.”
—Pete Huang, author of The Neuron
Additional information
Field | Value
---|---
Subjects | professional, higher education, COM025000, COM051360, COM042000, Employability, IT Professional, Y-AM DATABASES, COM016000