services.AddSingleton<IJarvasOrchestrator, JarvasOrchestrator>();
services.AddScoped<IMeetingIntelligenceService, MeetingIntelligenceService>();
services.AddDbContextFactory<ToolkitDbContext>();
app.MapHub<JarvasHub>("/jarvas-hub");
builder.Services.AddAzureOpenAI(config);
Technology & Architecture

Enterprise architecture. Your source code.

.NET 8, Blazor Server, Azure SQL, Azure OpenAI — a production-ready platform built for insurance distribution. Full source code ownership. Deploy on your infrastructure. Extend, customize, and scale without limits.

120+ injectable services · 180+ Razor components · 60+ CSS files · Two separate databases

The Stack

No experiments. Proven technology.

Every layer of the stack is production-hardened on the Microsoft ecosystem. No beta dependencies. No framework risks. Enterprise-ready from day one.

Framework

.NET 8

Backend framework with top-tier performance

UI

Blazor Server

Real-time server-rendered UI via SignalR

Database

Azure SQL

EF Core with Code-First migrations

ORM

Entity Framework Core

IDbContextFactory for safe concurrency

Cloud

Microsoft Azure

App Service, Key Vault, Managed Identity

AI / LLM

Azure OpenAI (GPT-5.2)

Coaching, extraction, insight generation

Speech

Azure Speech Services

STT, streaming TTS, speaker diarization

Real-time

SignalR

WebSocket-based audio & command streaming

Components

Custom Blazor + Radzen

180+ Razor components, 60+ CSS files

Client Audio

Web Audio API

PCM streaming, voice activity detection

Application Architecture

Built for real-time. Designed for scale.

A Blazor Server application where every component has direct access to server resources, AI services, and real-time communication — no API translation layer, no client-side state management headaches.

UI update latency

Blazor Server Model

UI renders on the server — DOM diffs sent to the browser via SignalR circuit. No WebAssembly, no client-side framework overhead. True server-side interactivity.

injectable services

Service-Oriented Design

120+ injectable services organized by domain — AI, data access, business logic, real-time. Clean separation of concerns with full dependency injection.

event propagation

Event-Driven Communication

Cross-component communication via C# events and SignalR hub methods. JARVAS commands, audio streams, and page control all flow through the real-time event bus.
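The C# event half of that bus can be sketched in a few lines; JarvasEventBus and CommandIssued are illustrative names for this sketch, not types from the codebase:

```csharp
using System;

var bus = new JarvasEventBus();
string last = "";
bus.CommandIssued += cmd => last = cmd;      // a component subscribes
bus.Publish("open-premium-calculator");      // another component publishes
Console.WriteLine(last);                     // prints "open-premium-calculator"

// Minimal in-process event bus: publishers and subscribers never
// reference each other directly, only the bus.
public class JarvasEventBus
{
    public event Action<string>? CommandIssued;
    public void Publish(string command) => CommandIssued?.Invoke(command);
}
```

The SignalR half follows the same publish/subscribe shape, with hub methods fanning commands out to connected circuits instead of in-process delegates.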

central orchestrator

Singleton Orchestrator

The JARVAS orchestrator manages all active AI sessions as a singleton service. Session state, voice pipeline coordination, and intervention level control — all centrally managed.
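Centrally managed session state in a singleton typically means a thread-safe map, since many SignalR circuits touch it concurrently. A minimal sketch, with illustrative names (OrchestratorSketch is not the product's actual JarvasOrchestrator):

```csharp
using System;
using System.Collections.Concurrent;

var orch = new OrchestratorSketch();
orch.StartSession("meeting-42", level: 2);
Console.WriteLine(orch.ActiveCount);   // prints 1
orch.EndSession("meeting-42");
Console.WriteLine(orch.ActiveCount);   // prints 0

public record SessionState(string MeetingId, int Level);

public class OrchestratorSketch
{
    // ConcurrentDictionary keeps session bookkeeping safe under
    // concurrent access from multiple circuits.
    private readonly ConcurrentDictionary<string, SessionState> _sessions = new();

    public SessionState StartSession(string meetingId, int level) =>
        _sessions.GetOrAdd(meetingId, id => new SessionState(id, level));

    public bool EndSession(string meetingId) => _sessions.TryRemove(meetingId, out _);

    public int ActiveCount => _sessions.Count;
}
```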

server-rendered

Interactive Server Render

All pages use server-side interactivity. No split between static and interactive — every page has full access to server resources, database, and AI services.
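In .NET 8 terms, "all pages interactive on the server" comes down to a few lines of standard wiring. This is the generic pattern, not the product's actual Program.cs:

```csharp
// Program.cs sketch: server-side interactivity for every page.
var builder = WebApplication.CreateBuilder(args);

builder.Services
    .AddRazorComponents()
    .AddInteractiveServerComponents();   // one SignalR circuit per user

var app = builder.Build();

app.MapRazorComponents<App>()
    .AddInteractiveServerRenderMode();   // no static/interactive split

app.Run();
```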

Database Architecture

Two databases. Complete isolation.

The Toolkit and RM Portal use separate databases with separate DbContexts. Data isolation by design — an RM can never access advisor data, and vice versa. Entity Framework Core manages both with production-proven patterns.

Toolkit Database

Advisor CRM, JARVAS sessions, gamification, training, communications

Lead Management Database

RM Portal leads, cross-sell feedback, lead lifecycle tracking

Complete data isolation — separate authentication, separate DbContexts, no cross-database queries

IDbContextFactory

Safe concurrent access — no DbContext sharing across threads

Transient Data Services

Each request gets a fresh DbContext — prevents concurrency deadlocks

Scoped UI Services

Stateful services share state across components within a circuit

Code-First Migrations

Schema changes tracked in code — auto-applied on startup

Connection Pooling

Azure SQL handles pooling automatically — zero config needed
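The IDbContextFactory pattern from the table looks like this in practice. ToolkitDbContext appears in the source; AdvisorDataService and the Advisors set are illustrative assumptions:

```csharp
// A transient data service that never shares a DbContext across threads.
public class AdvisorDataService
{
    private readonly IDbContextFactory<ToolkitDbContext> _factory;

    public AdvisorDataService(IDbContextFactory<ToolkitDbContext> factory)
        => _factory = factory;

    public async Task<int> CountAdvisorsAsync()
    {
        // Each operation creates and disposes its own DbContext,
        // which is what makes concurrent Blazor circuits safe.
        await using var db = await _factory.CreateDbContextAsync();
        return await db.Advisors.CountAsync();
    }
}

// Registration in Program.cs:
// builder.Services.AddDbContextFactory<ToolkitDbContext>(
//     o => o.UseSqlServer(connectionString));
```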

Voice AI Pipeline

From speech to response in under two seconds.

A streaming architecture that processes speech, generates AI responses, and synthesizes voice output in real time. Barge-in support means JARVAS stops speaking within 100ms if the customer starts talking.

Customer Speaks

2s silence buffer

Speech-to-Text

Azure STT with diarization

GPT Streaming

Context-aware response

Sentence Buffer

Natural speech cadence

Streaming TTS

Neural voice synthesis

~1-2s

End-to-end latency

100ms

Barge-in detection

2s

Silence buffer

3

Speaker diarization
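The sentence-buffer stage between GPT token streaming and TTS can be sketched as below; SentenceBuffer is an illustrative name for this sketch, not the product's actual type:

```csharp
using System;
using System.Collections.Generic;
using System.Text;

var buffer = new SentenceBuffer();
var ready = new List<string>();
foreach (var token in new[] { "Hel", "lo. How", " are you?" })  // simulated GPT stream
    ready.AddRange(buffer.Push(token));
Console.WriteLine(string.Join(" | ", ready));   // prints "Hello. | How are you?"

// Accumulates streamed tokens and releases only complete sentences,
// so TTS speaks with natural cadence instead of word-by-word.
public class SentenceBuffer
{
    private readonly StringBuilder _pending = new();

    // Feed one streamed token; yields any sentences now complete.
    public IEnumerable<string> Push(string token)
    {
        _pending.Append(token);
        var text = _pending.ToString();
        int start = 0;
        for (int i = 0; i < text.Length; i++)
        {
            if (text[i] is '.' or '!' or '?')
            {
                yield return text.Substring(start, i - start + 1).Trim();
                start = i + 1;
            }
        }
        _pending.Clear();
        _pending.Append(text.Substring(start));   // keep the unfinished tail
    }
}
```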

Azure Infrastructure

Your Azure subscription. Your infrastructure.

Every deployment is a fully isolated instance on the buyer's own Azure subscription. No shared tenancy. No data commingling. You control the region, the tier, and the scaling configuration.

Azure App Service

Hosts the Blazor Server application

Linux, .NET 8, B1 to P3v3 tiers

Azure SQL Database

Primary data storage

Two databases: Toolkit + Lead Management

Azure Key Vault

Secrets management

API keys, connection strings, certs

Azure OpenAI Service

GPT-5.2 for all AI features

Coaching, summaries, extraction, chat

Azure Speech Services

STT, TTS, and diarization

Neural voices, streaming output

Azure Managed Identity

Service-to-service auth

No stored credentials — identity-based

Deployment Process

1

Source code delivered via private Git repository

2

Provision Azure resources (App Service, SQL, Key Vault, OpenAI, Speech)

3

Configure secrets in Key Vault

4

Run dotnet publish for release build

5

Zip deploy to App Service via Azure CLI

6

Migrations auto-apply on first startup
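Step 6 is the standard EF Core migrate-on-startup pattern; this is the generic shape, and the product's actual startup code may differ in detail:

```csharp
// After app.Build(), before app.Run():
using var scope = app.Services.CreateScope();
await using var db = await scope.ServiceProvider
    .GetRequiredService<IDbContextFactory<ToolkitDbContext>>()
    .CreateDbContextAsync();
await db.Database.MigrateAsync();   // creates or updates the schema in place
```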

Typical production costs: $800–$1,500/month on Azure for a mid-size distributor (200–500 advisors). Azure OpenAI usage is billed per 1,000 tokens on top of that.

Security & Compliance

Enterprise security. Built in, not bolted on.

The Microsoft Azure security stack protects every layer — from identity and access to data encryption, AI safety guardrails, and network hardening. Each deployment is fully isolated with no shared infrastructure.

Identity & Access

  • ASP.NET Core Identity with BCrypt hashing
  • Role-based access control (RBAC)
  • Separate auth for Toolkit vs RM Portal
  • Account lockout policies
  • Azure AD / Entra ID integration possible
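The lockout-policy bullet maps to standard ASP.NET Core Identity options; the values here are illustrative, not the product's actual configuration:

```csharp
builder.Services.Configure<IdentityOptions>(o =>
{
    o.Lockout.MaxFailedAccessAttempts = 5;                        // attempts before lockout
    o.Lockout.DefaultLockoutTimeSpan = TimeSpan.FromMinutes(15);  // lockout duration
    o.Lockout.AllowedForNewUsers = true;
});
```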

Data Protection

  • HTTPS/TLS 1.2+ in transit
  • Azure SQL TDE at rest
  • Secrets in Azure Key Vault
  • No client-side sensitive data
  • Anti-forgery tokens on forms

AI Safety

  • Anti-hallucination guardrails
  • Temperature controls per use case
  • Transcript-grounded outputs
  • Early return for sparse data
  • Prompt injection defenses

Compliance

  • GDPR-ready consent framework
  • No audio recordings stored
  • Immutable audit trail
  • Buyer controls data residency
  • Azure SOC 2 / ISO 27001 inherited

Network Security

  • Built-in DDoS protection
  • IP filtering & TLS termination
  • Azure Front Door / WAF support
  • Private Endpoints available
  • Managed Identity — no stored creds

Data Isolation

  • Dedicated instance per buyer
  • No shared infrastructure
  • No multi-tenant data mixing
  • Separate databases per product
  • Buyer-controlled encryption keys

Scalability & Performance

From 10 advisors to 10,000.

SignalR circuits, connection pooling, streaming AI, and auto-scaling tiers — every layer is designed for concurrent real-time usage at enterprise scale.

~500KB

Per-circuit memory

Blazor Server

Each user maintains a SignalR circuit (WebSocket). Stateful server components with real-time DOM updates. Service lifetime management — Transient for data access, Scoped for shared UI state, Singleton for orchestration.
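Those three lifetimes look like this in Program.cs (IJarvasOrchestrator is from the source; the other service names are illustrative assumptions):

```csharp
builder.Services.AddSingleton<IJarvasOrchestrator, JarvasOrchestrator>(); // one per app
builder.Services.AddScoped<IUiStateService, UiStateService>();            // one per circuit
builder.Services.AddTransient<IAdvisorDataService, AdvisorDataService>(); // fresh per resolve
```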

0

Deadlock risk

Database

Entity Framework Core with IDbContextFactory for safe concurrent access. Transient DbContext instances prevent concurrency deadlocks. Azure SQL connection pooling handles the rest.

<2s

Voice latency

AI / Voice

Streaming architecture keeps voice latency under two seconds. GPT token streaming feeds sentence buffering, then streaming TTS and real-time audio. Barge-in within 100ms. Voice activity detection (VAD) for efficient processing.

100ms

Reconnect speed

Real-time

SignalR for all real-time features — JARVAS commands, audio streaming, page control. Auto-reconnection with client-side UI. Circuit error recovery with graceful degradation.

Scaling Reference

Concurrent Users

Recommended Tier

Est. App Service Cost

50

B1 / S1

~$50/mo

200

P1v3

~$200/mo

500

P2v3

~$500/mo

1,000+

Multi-instance + Front Door

Custom

Source Code Ownership

Your platform. Your competitive advantage.

This isn't SaaS you rent. It's a complete codebase you own. One-time purchase. Full source code. Deploy, customize, extend, and white-label without limits or dependencies.

Every Line of Code

Full source code delivered via private Git repository. No obfuscation, no compiled-only modules, no black boxes.

Fork, Extend, Evolve

Modify anything — UI components, business logic, AI prompts, database models. Add new modules or entirely new product verticals.

White-Label It

Replace all branding with yours. Custom domain, logos, colors, typography. Your customers never see "JARVAS."

Localize It

Full localization support — UI strings, AI prompts, calculator labels, communication templates. Deploy for any market.

Day-One Ready

Comprehensive deployment documentation, architecture guide, and fully commented source code. Your team is productive from day one.

No Vendor Lock-In

No recurring fees. No subscription. No per-user pricing. Your only ongoing costs are Azure infrastructure and API usage.

JarvasOrchestrator.cs

// Your code. Your platform. Forever.
public class JarvasOrchestrator : IJarvasOrchestrator
{
    private readonly IAzureOpenAIService _ai;
    private readonly ISpeechService _speech;
    private readonly ISessionManager _sessions;

    public async Task<JarvasResponse> ProcessAsync(
        MeetingContext context,
        InterventionLevel level)
    {
        // Your business logic. Your rules.
        // Modify anything. Extend everything.
    }
}
Technical Deep-Dive

Ready to evaluate the architecture?

Request a technical review and get a hands-on walkthrough of the codebase, deployment model, and integration patterns. Talk to architects, not salespeople.

.NET 8 · Blazor Server · Azure SQL · Azure OpenAI · Full source code ownership