LLMs in Production
Available for immediate download
49,61 €

Other offers sold and shipped by our sellers

Seller: ibs
Price and shipping: 49,61 €, free shipping

Description


This practical book goes beyond academic discussion and deep into the applications layer of foundation models. It offers clear, example-rich explanations of how LLMs work, how you can interact with them, and how to integrate LLMs into your own applications. Find out what makes LLMs so different from traditional software and ML, discover best practices for working with them outside the lab, and dodge common pitfalls with experienced advice.

In LLMs in Production you will:
• Grasp the fundamentals of LLMs and the technology behind them
• Evaluate when to use a premade LLM and when to build your own
• Efficiently scale up an ML platform to handle the needs of LLMs
• Train LLM foundation models and fine-tune an existing LLM
• Deploy LLMs to the cloud and edge devices using techniques like PEFT and LoRA
• Build applications that leverage the strengths of LLMs while mitigating their weaknesses

LLMs in Production delivers vital insights into MLOps for LLMs so you can guide a model easily and seamlessly into production use. Inside, you'll find practical guidance on everything from acquiring an LLM-suitable training dataset to building a platform and compensating for these models' immense size, plus tips and tricks for prompt engineering, retraining and load testing, handling costs, and ensuring security. Foreword by Joe Reis.

About the technology
Most business software is developed and improved iteratively, and can change significantly even after deployment. By contrast, because LLMs are expensive to create and difficult to modify, they require meticulous upfront planning, exacting data standards, and carefully executed technical implementation. Integrating LLMs into production products impacts every aspect of your operations plan, including the application lifecycle, data pipeline, compute cost, security, and more. Get it wrong, and you may have a costly failure on your hands.

About the book
LLMs in Production teaches you how to develop an LLMOps plan that can take an AI app smoothly from design to delivery. You'll learn techniques for preparing an LLM dataset, cost-efficient training hacks like LoRA and RLHF, and industry benchmarks for model evaluation. Along the way, you'll put your new skills to use in three exciting example projects: creating and training a custom LLM, building a VS Code AI coding extension, and deploying a small model to a Raspberry Pi.

What's inside
• Balancing cost and performance
• Retraining and load testing
• Optimizing models for commodity hardware
• Deploying on a Kubernetes cluster

About the reader
For data scientists and ML engineers who know Python and the basics of cloud deployment.

About the authors
Christopher Brousseau and Matt Sharp are experienced engineers who have led numerous successful large-scale LLM deployments.

Table of Contents
1 Generative AI: Why large language models have captured attention
2 Large language models: A deep dive into language modeling
3 Large language model operations: Building a platform for LLMs
4 Data engineering for large language models: Setting up for success
5 Training large language models: How to generate the generator
6 Large language model services: A practical guide
7 Prompt engineering: Becoming an LLM whisperer
8 Applications and Agents: Building an interactive experience
9 Creating an LLM project: Reimplementing Llama 3
10 Creating a coding copilot project: This would have helped you earlier
11 Deploying an LLM on a Raspberry Pi: How low can you go?
12 Creating a coding copilot project: Integrating an LLM service into VS Code with RAG
A History of linguistics
B Reinforcement learning with human feedback
C Multimodal latent spaces
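
The blurb highlights parameter-efficient fine-tuning with PEFT and LoRA. As a rough illustration of what that looks like in practice, a minimal sketch using the Hugging Face peft library might be the following; the base model name and hyperparameters are assumptions for the example, not choices taken from the book.

# Minimal sketch of LoRA fine-tuning with the Hugging Face peft library.
# Illustrative only: "gpt2" and the hyperparameters are assumptions.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

base = "gpt2"  # assumed small base model so the sketch runs on commodity hardware
model = AutoModelForCausalLM.from_pretrained(base)

# LoRA adds small trainable low-rank adapter matrices to selected layers,
# so only a tiny fraction of the parameters is updated during fine-tuning.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # reports the share of trainable parameters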

Details

Year: 2025
Language: English
Devices: all devices (except Kindle)
Layout: reflowable
ISBN: 9781638357254

Compatibility

Format:

EBooks sold by IBS.it are in ePub format and may be protected by Adobe DRM. When you download a DRM-protected file you receive a file in .acsm format (Adobe Content Server Message), which must be opened with Adobe Digital Editions and authorized with an Adobe account before it can be read on a PC or transferred to compatible devices.

Compatibility:

EBooks sold by IBS.it can be read on any of the following devices: PC, eReader, smartphone, or tablet, or with the Kobo app for iOS or Android.

Cloud:

EBooks sold by IBS.it are automatically synchronized to all Kobo reading clients after purchase. Thanks to the Kobo Cloud, reading progress, notes, and highlights are saved and automatically synchronized across all Kobo devices and reading apps you use.

