What We Learned at the .NET AI Bootcamp with Jeff Fritz
See Jeff Fritz's .NET AI Bootcamp: https://www.youtube.com/live/nJYB9Fb0hr4
Iron Software proudly sponsored Jeff Fritz's eight-hour .NET AI Bootcamp, delivering exactly what the .NET community needed: a comprehensive, hands-on workshop that transforms AI from buzzword to practical development tool. This wasn't another theoretical overview; Jeff Fritz built working applications from scratch, demonstrating real-world implementation patterns that developers can immediately apply. A second workshop, focused on .NET Aspire, is happening the following week; you can find more information here.
As part of our continued commitment to supporting .NET developer education and community growth, Iron Software and Jeff Fritz made this free, full-day virtual event possible for thousands of developers worldwide. The workshop exemplifies our dedication to fostering innovation and collaboration within the .NET ecosystem.
Workshop Environment and Setup
The bootcamp emphasized treating the session as a focused workshop rather than passive viewing. Jeff Fritz recommended a clean development environment with .NET 9 Preview, Docker, and proper GitHub token configuration for accessing models.
The GitHub token setup proved remarkably straightforward: navigate to Developer Settings, create a fine-grained token with read access to Models, and set an appropriate expiration. GitHub Models provides free access to models such as GPT-4o mini without requiring an OpenAI or Azure subscription.
Jeff Fritz demonstrated three deployment options: GitHub Models for free access, Azure OpenAI Service for enterprise features, and Ollama for complete local privacy. The key insight was provider flexibility: applications can switch between services without code rewrites.
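To make that flexibility concrete, the sketch below (our illustration, not Fritz's code) builds the same IChatClient abstraction against two of those backends. The package names, the GitHub Models endpoint URL, the gpt-4o-mini model id, and the AsIChatClient() adapter name follow recent Microsoft.Extensions.AI previews and should be treated as assumptions that may differ in your package versions.

```csharp
// A minimal sketch of provider flexibility: the application only depends on
// IChatClient, so the concrete backend can change without rewriting calling code.
// Assumed packages: Microsoft.Extensions.AI, Microsoft.Extensions.AI.OpenAI,
// Microsoft.Extensions.AI.Ollama, OpenAI.
using System.ClientModel;
using Microsoft.Extensions.AI;
using OpenAI;

IChatClient CreateChatClient(string provider, string? gitHubToken = null) => provider switch
{
    // GitHub Models: free tier reached through an OpenAI-compatible endpoint.
    // Endpoint URL and AsIChatClient() reflect recent previews and may have changed.
    "github" => new OpenAIClient(
            new ApiKeyCredential(gitHubToken!),
            new OpenAIClientOptions { Endpoint = new Uri("https://models.inference.ai.azure.com") })
        .GetChatClient("gpt-4o-mini")
        .AsIChatClient(),

    // Ollama: fully local, assumes the default port. Azure OpenAI would follow the
    // same pattern through the Azure.AI.OpenAI client.
    "ollama" => new OllamaChatClient(new Uri("http://localhost:11434"), "llama3"),

    _ => throw new ArgumentOutOfRangeException(nameof(provider))
};
```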
Building Real Applications: The Quiz App
Rather than demonstrating another chatbot, Fritz built a practical quiz application using Blazor Server. The application generates dynamic trivia questions on any topic, showcasing real-world AI integration patterns.
The implementation highlighted clean integration through Microsoft.Extensions.AI dependency injection, making AI services as accessible as logging or HTTP clients. The application demonstrated prompt chaining in action, showing how multiple AI calls can work together to create sophisticated user experiences.
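As an illustration of that chaining idea (a hedged sketch under assumed APIs, not the workshop's quiz code), the helper below makes one call to generate a trivia question and a second call that feeds that question back in to produce answer options. GetResponseAsync and response.Text follow recent Microsoft.Extensions.AI previews; earlier previews used CompleteAsync and Message.Text.

```csharp
// A minimal sketch of prompt chaining: the output of one AI call feeds the prompt of the next.
using Microsoft.Extensions.AI;

async Task<(string Question, string[] Choices)> BuildQuizItemAsync(IChatClient chat, string topic)
{
    // Call 1: generate a single trivia question about the requested topic.
    var questionResponse = await chat.GetResponseAsync(
        $"Write one trivia question about {topic}. Return only the question text.");
    string question = questionResponse.Text;

    // Call 2: chain the generated question into a second prompt that produces answer options.
    var choicesResponse = await chat.GetResponseAsync(
        $"For the trivia question \"{question}\", list four answer options, one per line, " +
        "with the correct answer first.");
    string[] choices = choicesResponse.Text.Split('\n', StringSplitOptions.RemoveEmptyEntries);

    return (question, choices);
}
```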
Microsoft.Extensions.AI: Unified Provider Abstraction
Microsoft.Extensions.AI emerged as the workshop's most significant technical revelation. This package provides unified abstraction across AI providers, allowing applications to work with OpenAI, Ollama, or GitHub Models through consistent interfaces.
The abstraction registers AI clients in Program.cs using familiar .NET dependency injection patterns. Applications write against consistent interfaces while maintaining complete flexibility to change providers based on requirements, cost, or deployment constraints.
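A minimal Program.cs sketch of that registration is shown below, assuming the Microsoft.Extensions.AI and Microsoft.Extensions.AI.Ollama packages (Ollama is used here only because it needs no API key); the AddChatClient overload and GetResponseAsync naming follow recent previews.

```csharp
// Register the AI client once, then consume IChatClient anywhere via dependency injection.
using Microsoft.Extensions.AI;

var builder = WebApplication.CreateBuilder(args);

// Registered like any other service (logging, HttpClient, ...).
builder.Services.AddChatClient(services =>
    new OllamaChatClient(new Uri("http://localhost:11434"), "llama3"));

var app = builder.Build();

// Any endpoint or Blazor component can now take an IChatClient dependency.
app.MapGet("/hello-ai", async (IChatClient chat) =>
    (await chat.GetResponseAsync("Say hello to a .NET developer.")).Text);

app.Run();
```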
Security Best Practices from Day One
Jeff Fritz emphasized proper secret management throughout development. The workshop covered dotnet user-secrets for development, avoiding API keys in configuration files, and preventing credential commits to source control.
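A small sketch of that pattern follows, using a hypothetical secret name GitHubModels:Token (the workshop's exact key may differ): the token is stored with dotnet user-secrets and read back through configuration, so it never lands in appsettings.json or source control.

```csharp
// Set once from the project directory (values never touch the repository):
//   dotnet user-secrets init
//   dotnet user-secrets set "GitHubModels:Token" "<your fine-grained GitHub token>"
var builder = WebApplication.CreateBuilder(args);

// User secrets are loaded into configuration automatically in the Development environment.
string? token = builder.Configuration["GitHubModels:Token"];
if (string.IsNullOrWhiteSpace(token))
{
    throw new InvalidOperationException(
        "Missing GitHubModels:Token - set it with 'dotnet user-secrets set'.");
}
```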
The focus on security-first development rather than retrofitting protection later addresses a critical gap in many AI implementations. Given that AI applications often require multiple API keys and service credentials, establishing secure patterns early prevents significant security vulnerabilities.
Retrieval-Augmented Generation (RAG): The Essential Pattern
The workshop's most valuable segment covered Retrieval-Augmented Generation implementation. Fritz built a complete system that processes documents, creates chunks, generates vector embeddings, stores them in memory, and matches user queries to relevant content before generating responses.
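The sketch below compresses that flow into one method as an illustration of the pattern, not a reproduction of Fritz's code: naive fixed-size chunking, an in-memory list as the vector store, cosine similarity from System.Numerics.Tensors for retrieval, and a context-grounded prompt for the final answer. The Microsoft.Extensions.AI method names (GenerateAsync, GetResponseAsync) follow recent previews and may vary by version.

```csharp
// A compressed RAG sketch: chunk, embed, store, retrieve by similarity, answer with context.
using System.Numerics.Tensors;
using Microsoft.Extensions.AI;

async Task<string> AskOverDocumentsAsync(
    IChatClient chat,
    IEmbeddingGenerator<string, Embedding<float>> embedder,
    IEnumerable<string> documents,
    string question)
{
    // 1. Chunk: split each document into rough, fixed-size pieces (deliberately naive).
    var chunks = documents
        .SelectMany(doc => doc.Chunk(500).Select(piece => new string(piece)))
        .ToList();

    // 2. Embed and store: keep (chunk, vector) pairs in an in-memory list.
    var chunkEmbeddings = await embedder.GenerateAsync(chunks);
    var store = chunks.Zip(chunkEmbeddings, (text, emb) => (text, vector: emb.Vector)).ToList();

    // 3. Retrieve: embed the question and rank chunks by cosine similarity.
    var questionVector = (await embedder.GenerateAsync(new[] { question }))[0].Vector;
    var topChunks = store
        .OrderByDescending(c => TensorPrimitives.CosineSimilarity(c.vector.Span, questionVector.Span))
        .Take(3)
        .Select(c => c.text);

    // 4. Generate: answer the question grounded in the retrieved chunks.
    var prompt =
        "Answer using only the context below.\n" +
        "Context:\n" + string.Join("\n---\n", topChunks) + "\n" +
        "Question: " + question;
    return (await chat.GetResponseAsync(prompt)).Text;
}
```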
Fritz described RAG as "the most powerful pattern for real-world applications: legal, finance, knowledge bases, and beyond." This pattern transforms AI from generic question-answering to applications that understand and reason over specific organizational data, unlocking significant business value.
The demonstration showed progression from simple Q&A to applications that comprehend company documents, policies, and knowledge bases where practical AI implementation delivers measurable business impact.
Local Development with Ollama
For developers requiring complete control or avoiding external API dependencies, Jeff Fritz demonstrated local AI development using Ollama in Docker. The setup involves pulling Docker images, configuring GPU support when available, and downloading appropriate models.
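A minimal sketch of that local setup appears below, assuming the Microsoft.Extensions.AI.Ollama package and the official ollama/ollama container image; the Docker commands in the comment are the commonly documented ones and may need adjusting for your GPU and model choice.

```csharp
// Typical container setup (see Ollama's Docker documentation for specifics):
//   docker run -d -v ollama:/root/.ollama -p 11434:11434 --name ollama ollama/ollama
//   docker exec -it ollama ollama pull llama3
using Microsoft.Extensions.AI;

// The same IChatClient abstraction, now backed entirely by a local model.
IChatClient chat = new OllamaChatClient(new Uri("http://localhost:11434"), "llama3");

var reply = await chat.GetResponseAsync(
    "Summarize what retrieval-augmented generation means in one sentence.");
Console.WriteLine(reply.Text);
```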
Local deployment offers complete privacy, eliminates external dependencies, and requires surprisingly modest hardware resources. Fritz repeatedly emphasized that meaningful AI development doesn't require expensive GPU hardware; standard development machines handle most workflows effectively.
Practical Prompt Engineering
The workshop included actionable prompt engineering techniques, avoiding both oversimplification and unnecessary complexity. Fritz demonstrated structured prompts and conversational roles, showing how context like "you're a .NET expert helping a junior developer" significantly improves response quality.
The quiz application illustrated maintaining conversation context and guiding AI responses, critical capabilities for production applications that go beyond single-query interactions.
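The sketch below shows the shape of both techniques, assuming an IChatClient backed by any of the providers above: a system message sets the persona, and the growing message list carries conversation context across turns (our illustration, not the quiz application's code; GetResponseAsync follows recent Microsoft.Extensions.AI previews).

```csharp
using Microsoft.Extensions.AI;

// Any IChatClient works here; a local Ollama client is used only so the sample is self-contained.
IChatClient chat = new OllamaChatClient(new Uri("http://localhost:11434"), "llama3");

var messages = new List<ChatMessage>
{
    // The system role frames every later answer.
    new(ChatRole.System, "You are a .NET expert helping a junior developer. Keep answers short and concrete."),
    new(ChatRole.User, "When should I use IChatClient instead of calling a provider SDK directly?")
};

var response = await chat.GetResponseAsync(messages);
messages.Add(new(ChatRole.Assistant, response.Text)); // keep context for the next turn

messages.Add(new(ChatRole.User, "Show how that helps if I later switch providers."));
var followUp = await chat.GetResponseAsync(messages);
Console.WriteLine(followUp.Text);
```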
Current Limitations and Realistic Expectations
Jeff Fritz provided honest assessments of current AI limitations. Most large language models perform best with English, while other languages produce less reliable results; this remains an area requiring continued ecosystem improvement.
Cost analysis revealed GitHub Models as genuinely free for learning and small projects, while Azure OpenAI provides pay-per-token pricing that remains affordable at reasonable scale. The key advantage is starting with free tiers and scaling without code changes.
Hardware requirements remain accessible; standard development laptops handle AI development workflows, and local models run effectively on modest hardware configurations.
Getting Started: Resources and Next Steps
The bootcamp repository at github.com/csharpfritz/ai-bootcamp contains complete examples, Docker configurations, Blazor templates, and model setup instructions. YouTube replays enable code-along learning for those who missed the live session.
Recommended progression path:
- Begin with GitHub Models for cost-free experimentation
- Clone the bootcamp repository and implement the quiz application
- Explore Microsoft.Extensions.AI abstractions and provider flexibility
- Build RAG applications using demonstrated patterns
- Scale to Azure or local models when project requirements justify the complexity
The Broader Impact
This bootcamp demonstrated that AI integration in .NET has moved beyond experimental status into standard development practice. The combination of Microsoft.Extensions.AI for provider abstraction, GitHub Models for accessible LLM access, and proven patterns like RAG creates concrete opportunities for .NET developers.
The development path is clearer than many expected. Developers don't need AI expertise to build intelligent applications; the .NET ecosystem now provides abstractions that allow focus on application logic rather than AI integration complexity.
Jeff Fritz's workshop proved that developers can progress from zero AI knowledge to functional applications within a single day. For .NET developers curious about AI implementation but uncertain about entry points, this bootcamp demonstrates just how accessible the technology has become.
Why This Matters Now
As Iron Software continues supporting .NET community innovation, events like Fritz's bootcamp represent exactly the type of practical, hands-on learning that drives the ecosystem forward. This isn't theoretical AI discussion; it's actionable knowledge that developers can implement immediately in production applications.
The workshop validates what the .NET community suspected: AI integration is becoming a standard part of the developer toolkit, not a specialized niche. With proper abstractions, accessible models, and proven patterns, the barrier to entry has dropped significantly.
For organizations evaluating AI integration, the message is clear: the tools exist, the patterns are proven, and the .NET ecosystem provides the foundation for reliable, scalable AI-powered applications. The question isn't whether to integrate AI; it's how quickly teams can adapt these patterns to deliver business value.
Focus on What Makes Your Application Unique
While building AI capabilities into your applications, remember that certain foundational components are better left to proven, enterprise-grade solutions. Rather than spending development time recreating PDF generation, OCR processing, or barcode reading functionality, developers can focus on their application's unique value proposition.
Iron Software's suite of .NET libraries handles these infrastructure concerns, allowing development teams to concentrate on the AI features and business logic that differentiate their applications. From IronPDF for document processing to IronOCR for intelligent text extraction, these battle-tested libraries integrate seamlessly with modern AI workflows.
Start Building Intelligent Applications Today
Ready to implement the patterns demonstrated in Fritz's workshop? Iron Software offers a free trial of our complete .NET library suite, giving you access to the document processing and data extraction tools that complement AI integration perfectly.
Our libraries work alongside the Microsoft.Extensions.AI patterns Fritz demonstrated, enabling rapid development of sophisticated applications that combine AI intelligence with robust document processing capabilities. Whether you're building RAG systems that process PDFs, applications that extract data from scanned documents, or workflows that generate intelligent reports, Iron Software provides the foundational tools that let you focus on innovation rather than implementation complexity.