WA3417

Boosting Developer Productivity with AI Training

This Artificial Intelligence (AI) training gives developers a technical introduction to large language models (LLMs) and teaches them how to increase their coding productivity with various AI tools, including ChatGPT and GitHub Copilot.

Course Details

Duration

2 days

Prerequisites

  • An IT background or an interest in generative AI-driven programming

Target Audience

  • Software developers
  • IT architects
  • Technical managers

Skills Gained

  • Understand LLMs' fundamental concepts and principles
  • Gain insight into the applications of LLMs across diverse domains, including natural language processing, creative text generation, and code development
  • Enhance productivity and problem-solving with AI
  • Develop proficiency with popular LLM platforms and tools such as OpenAI's ChatGPT and GitHub Copilot
  • Explore ethical considerations and potential risks associated with LLM usage
  • Apply LLM-powered techniques to practical scenarios

Course Outline

  • Introduction to Large Language Models
    • What is Generative AI?
    • A Bit of History ...
    • ... and Then ...
    • RNNs
    • Problems with RNNs
    • Transformers
    • Encoders and Decoders
    • Generative AI and LLMs
    • Training the Model to Predict the Next Word Visually
    • The LLM Landscape
    • The Evolutionary Tree of LLMs
    • The Microsoft 365 Copilot Ecosystem
    • The LLM Capabilities vs LLM Size (in Parameters)
    • Does the Model Size Matter?
    • Inference Accuracy vs LLM Size
    • OpenAI GPT Models
    • Llama
    • The LLaMA Family of LLMs
    • LLaMA 2
    • The AI-Powered Chatbots
    • How Can I Access LLMs?
    • Options for Accessing LLMs
    • Cloud Hosting
    • Opinions about LLMs
    • Multimodality of LLMs
    • Infographic of Multimodality Tasks
    • Example of an LLM Explaining a Joke
    • Example of Cause & Effect Reasoning
    • Inferring Movie from Emoji
    • Prompt Engineering
    • The Right People, with the Right Skills, for the Right Time ...
    • Context Window and Prompts
    • Zero- and Few-Shot Prompting (see the sketch after this module)
    • The Training Datasets
    • The RedPajama Project (OSS LLaMA Dataset)
    • AI Alignment
    • Reinforcement Learning from Human Feedback (RLHF)
    • Problems with RLHF
    • Ethical AI
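
The prompting topics in this module, in particular "Zero- and Few-Shot Prompting," map directly onto a few lines of code. The sketch below is a minimal illustration, assuming the openai Python package (v1.x) and an OPENAI_API_KEY environment variable as used in the labs; the model name and the sentiment-labeling task are illustrative, not prescribed by the course.

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Few-shot prompting: the prompt places labeled examples inside the
    # context window so the model can infer the task without any fine-tuning.
    few_shot_prompt = """Classify the sentiment of each review as Positive or Negative.

    Review: "The battery lasts all day and the screen is gorgeous."
    Sentiment: Positive

    Review: "It stopped working after a week and support never replied."
    Sentiment: Negative

    Review: "Setup took five minutes and everything just worked."
    Sentiment:"""

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model name
        messages=[{"role": "user", "content": few_shot_prompt}],
        max_tokens=5,
    )
    print(response.choices[0].message.content.strip())  # e.g. "Positive"

Removing the two labeled examples turns the same call into a zero-shot prompt; the only difference is what the developer places in the context window.
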
  • LLMs, a Technologist's Perspective
    • LLM Operational Aspects
    • Understanding Model Sizes
    • Physical Model Sizes
    • The Training and Inference Costs
    • The Model Training Phase's Carbon Footprint
    • Quantization
    • Model Formats
    • LLM Accuracy Benchmarks
    • Open and Closed Book Benchmarks
    • The Perplexity Performance Metric
    • Embeddings
    • Where are Embeddings Used?
    • The Vector Databases
    • LLM Concerns
    • Ways to Interface with Local LLMs
    • Using a Supported Programming API (Binding)
    • UI Options
    • Customization Options for LLMs
    • Customization Options: Top-p and Top-k
    • Customization Options: Temperature and Repetition Penalty (see the sketch after this module)
    • Customization Option: The Turn Template
    • Configuration Presets
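
The decoding controls covered at the end of this module are ordinary request parameters in most LLM interfaces. The sketch below is a minimal illustration using the OpenAI chat completions endpoint, assuming the openai Python package (v1.x); note that top-k is typically exposed only by local inference runtimes (for example, llama.cpp-based servers), while the OpenAI API offers frequency and presence penalties in place of a single repetition penalty.

    from openai import OpenAI

    client = OpenAI()

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model name
        messages=[{"role": "user", "content": "Suggest three names for a code-review bot."}],
        temperature=1.2,        # higher values flatten the token distribution: more varied output
        top_p=0.9,              # nucleus sampling: draw only from the top 90% of probability mass
        frequency_penalty=0.5,  # discourages repeating tokens the model has already emitted
        max_tokens=60,
    )
    print(response.choices[0].message.content)

Lowering the temperature toward 0 makes the same request nearly deterministic, which is usually preferable for code-generation tasks.
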
  • Introduction to ChatGPT
    • A Stylized OpenAI ChatGPT Logo
    • OpenAI GPT Models
    • OpenAI Models
    • ChatGPT 4.0
    • ChatGPT Prompts
    • ChatGPT Prompt Strategies, Tactics, and Best Practices
    • Prompt Engineering: Dealing with ChatGPT's Hallucination Syndrome
    • Prompt Engineering: Break Down Complex Tasks into Smaller Ones
    • Prompt Engineering: Examples of Prompts
    • OpenAI API (see the sketch after this module)
    • GPT Embeddings
    • Embedding Models' Risks and Limitations
    • OK. How Can I Get My OpenAI Embeddings?
    • Tokens, Take 1
    • Tokens, Take 2
    • The Tokenizer UI
    • Prompts, Embeddings, and Tokens
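
The "OpenAI API," "GPT Embeddings," and tokenizer topics in this module come together in a short script. The sketch below is a minimal illustration, assuming the openai (v1.x) and tiktoken Python packages; the embedding model name is illustrative.

    import tiktoken
    from openai import OpenAI

    client = OpenAI()

    text = "LLMs turn text into tokens, and tokens into vectors."

    # Tokens: the unit of billing and of the context window. cl100k_base is the
    # encoding used by recent OpenAI chat and embedding models.
    encoding = tiktoken.get_encoding("cl100k_base")
    tokens = encoding.encode(text)
    print(len(tokens), "tokens:", tokens[:8], "...")

    # Embeddings: a fixed-length vector representation of the text, used for
    # semantic search, clustering, and retrieval from vector databases.
    result = client.embeddings.create(
        model="text-embedding-3-small",  # illustrative embedding model
        input=text,
    )
    vector = result.data[0].embedding
    print("embedding length:", len(vector))  # e.g. 1536 dimensions
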
  • AI-Powered Developer Productivity
    • Generative AI and LLMs for Developers
    • How to Become a Technologist and Philosopher All in One
    • Gartner on AI-augmented Development Tools
    • Developer-AI Pair Programming Paradigm
    • The Tooling
    • Some Facts ...
    • Code Generation: SQL Example
    • Code Generation: Using ThreadLocal Storage in Java
    • Code Generation: Thread-safe Singleton Design Pattern in C#
    • Code Generation: Bash Scripting
    • Code-to-Code Translation (see the sketch after this module)
    • Code Llama
    • Fine-Tuning Llama 2 Workflows
    • GitHub Copilot
    • Can I Trust AI-Generated Code?
    • The Safeguards
    • The General Recommendations ...
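
The code-generation and code-to-code translation topics in this module boil down to prompting a model from within a script or an IDE. The sketch below is a minimal illustration of translating a small Bash snippet into Python via the OpenAI chat completions endpoint, assuming the openai Python package (v1.x); the snippet, prompt, and model name are illustrative only.

    from openai import OpenAI

    client = OpenAI()

    bash_snippet = 'for f in *.log; do gzip "$f"; done'

    # Code-to-code translation: ask the model to port a small Bash loop to Python.
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # illustrative model name
        messages=[
            {"role": "system", "content": "You translate shell scripts into idiomatic Python."},
            {"role": "user", "content": f"Translate this Bash snippet to Python:\n\n{bash_snippet}"},
        ],
        temperature=0,  # near-deterministic output is preferable for code tasks
    )
    print(response.choices[0].message.content)

As with any generated code, the output still needs the review and safeguards discussed above before it is trusted.
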
  • Introduction to GitHub Copilot
    • What is GitHub Copilot?
    • Copilot Chat
    • IDE and REPL Integrations
    • Will Copilot Replace Developers?
    • Can I Trust Code Generated by GitHub Copilot?
    • GitHub Copilot's Modus Operandi
    • The Life of a Code Completion: The Big Picture
    • Code Suggestions Are Not Copy & Paste from Other People's Code
    • The Shebang Prologue Hint (see the sketch after this module)
    • Getting Started with GitHub Copilot
    • GitHub Copilot Plans
    • Copilot for Individuals
    • Copilot for Business
    • GitHub Copilot Security
    • Responsible Copilot
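
The "Shebang Prologue Hint" topic in this module refers to priming Copilot with a file's opening lines. The sketch below shows the kind of prologue a developer writes by hand (a shebang plus an intent comment), followed by a completion of the sort Copilot typically proposes; actual suggestions vary, so the function body here is illustrative only.

    #!/usr/bin/env python3
    # Read a CSV file of order records and report total revenue per customer.
    # The shebang and this comment are the hand-written "prologue": Copilot uses
    # them to infer the language and the intent before suggesting code.

    import csv
    from collections import defaultdict


    def revenue_per_customer(path: str) -> dict[str, float]:
        """Sum the 'amount' column for each 'customer' in the CSV at `path`."""
        totals: dict[str, float] = defaultdict(float)
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                totals[row["customer"]] += float(row["amount"])
        return dict(totals)

Whether to accept a suggestion like this is exactly the judgment call the "Can I Trust Code Generated by GitHub Copilot?" material addresses.
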
  • Lab Exercises
    • Lab 1. Learning the Colab Jupyter Notebook Environment
    • Lab 2. Hello, AI!
    • Lab 3. OpenAI Platform Overview
    • Lab 4. Using OpenAI API
    • Lab 5. Understanding Embeddings
    • Lab 6. OpenAI API Project
    • Lab 7. Copilot Environment Setup
    • Lab 8. Hello, Copilot!

Upcoming Course Dates

All sessions are online virtual classes scheduled from 10 AM to 6 PM ET; the price for each two-day session is USD $1,525.

  • Jan 6 - 7, 2025
  • Feb 18 - 19, 2025
  • Mar 31 - Apr 1, 2025
  • May 12 - 13, 2025
  • Jun 30 - Jul 1, 2025