This document explains how local development and authentication work for this Azure Functions project.
The fastest way to get started is using Docker Compose:
```bash
# Start both Azurite and the DTS emulator
docker compose up -d

# Generate local settings from your Azure environment
./scripts/generate-settings.sh

# Run the Functions app
cd src
func start
```

Dashboard URLs:
- Local DTS Dashboard: http://localhost:8082/ - Monitor orchestrations running locally
- Azurite Blob: http://localhost:10000/
Monitoring Orchestrations:

- Local: Open http://localhost:8082/ to view orchestrations running against the local DTS emulator
- Cloud: Generate the Azure DTS dashboard URL with the provided scripts:

```bash
./scripts/get-dts-dashboard-url.sh
```

or

```powershell
.\scripts\get-dts-dashboard-url.ps1
```
Then open the generated URL in your browser to view orchestrations running in Azure.
To stop the emulators:

```bash
docker compose down
```

If Docker is not available, you can use native Azurite and Azure Storage for Durable Functions:
```bash
# 1. Install Azurite globally via npm
npm install -g azurite

# 2. Start Azurite in a separate terminal
azurite

# 3. Temporarily switch to the Azure Storage backend (see configuration below)

# 4. Generate local settings
./scripts/generate-settings.sh

# 5. Run the Functions app
cd src
func start
```

Configuration changes for the Azure Storage backend:
Temporarily update the `durableTask` section of src/host.json:
"durableTask": {
"hubName": "%TASKHUB_NAME%",
"storageProvider": {
"type": "azureStorage",
"connectionStringName": "AzureWebJobsStorage"
},
"tracing": {
"traceInputsAndOutputs": true
}
}Your local.settings.json should have:
```json
{
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "TASKHUB_NAME": "default"
  }
}
```

Note: This approach uses Azure Storage (via Azurite) instead of DTS for orchestration state. You won't have the DTS dashboard, but all Durable Functions features will work.
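For reference, `UseDevelopmentStorage=true` is shorthand for Azurite's well-known development-storage account. The Python sketch below expands it into the documented default account name, key, and endpoints; this is background information, not project code:

```python
# The documented Azurite development-storage defaults that
# "UseDevelopmentStorage=true" stands for.
DEV_ACCOUNT = "devstoreaccount1"
DEV_KEY = (
    "Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="
)

def dev_storage_connection_string() -> str:
    """Build the expanded connection string that UseDevelopmentStorage=true implies."""
    return ";".join([
        "DefaultEndpointsProtocol=http",
        f"AccountName={DEV_ACCOUNT}",
        f"AccountKey={DEV_KEY}",
        f"BlobEndpoint=http://127.0.0.1:10000/{DEV_ACCOUNT}",
        f"QueueEndpoint=http://127.0.0.1:10001/{DEV_ACCOUNT}",
        f"TableEndpoint=http://127.0.0.1:10002/{DEV_ACCOUNT}",
    ])

print(dev_storage_connection_string())
```

The expanded form is occasionally useful when a tool doesn't understand the shorthand.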
Use the provided helper script to switch between DTS and Azure Storage:

```bash
# Switch to Azure Storage (for environments without Docker)
./scripts/switch-storage-backend.sh azureStorage

# Switch back to DTS (when Docker is available)
./scripts/switch-storage-backend.sh dts
```

PowerShell:
```powershell
# Switch to Azure Storage
.\scripts\switch-storage-backend.ps1 azureStorage

# Switch to DTS
.\scripts\switch-storage-backend.ps1 dts
```

The script automatically updates your host.json configuration and provides instructions for the required environment setup.
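Under the hood, the switch amounts to rewriting one object in host.json. A hypothetical Python sketch of that edit, using the Azure Storage fragment shown earlier (the real scripts may do more, and the exact type string they write for the DTS backend is not shown here):

```python
import json

# The host.json fragment from above (Azure Storage backend), used as sample input.
HOST_FRAGMENT = """
{
  "durableTask": {
    "hubName": "%TASKHUB_NAME%",
    "storageProvider": {
      "type": "azureStorage",
      "connectionStringName": "AzureWebJobsStorage"
    }
  }
}
"""

def switch_backend(fragment: str, provider_type: str, connection_name: str) -> str:
    """Rewrite durableTask.storageProvider in a host.json fragment and return new JSON."""
    config = json.loads(fragment)
    provider = config["durableTask"]["storageProvider"]
    provider["type"] = provider_type
    provider["connectionStringName"] = connection_name
    return json.dumps(config, indent=2)

print(switch_backend(HOST_FRAGMENT, "azureStorage", "AzureWebJobsStorage"))
```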
Before you begin, ensure you have:
- Docker installed for running the Durable Task Scheduler emulator
- Azurite installed for local Azure Storage emulation
- Azure CLI installed and configured
This project uses the Durable Task Scheduler (DTS) for orchestration instead of Azure Storage. For local development, you need to run the DTS emulator.
Pull and run the DTS emulator Docker container:
```bash
# Pull the emulator image
docker pull mcr.microsoft.com/dts/dts-emulator:latest

# Run the emulator (exposes ports 8080 for gRPC and 8082 for the dashboard)
docker run -d -p 8080:8080 -p 8082:8082 mcr.microsoft.com/dts/dts-emulator:latest
```

The emulator exposes two ports:
- 8080: gRPC endpoint that the Functions app connects to
- 8082: Dashboard UI at http://localhost:8082/
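If the Functions app can't reach the emulator, it helps to confirm that both ports are actually listening. A small illustrative Python probe (nothing project-specific about it):

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Check the two emulator ports listed above.
for port, role in [(8080, "gRPC endpoint"), (8082, "dashboard")]:
    status = "open" if is_port_open("localhost", port) else "not reachable"
    print(f"port {port} ({role}): {status}")
```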
The Azure Functions runtime still requires Azurite for some components:

```bash
azurite
```

Check that your local.settings.json includes these DTS-specific settings:
```json
{
  "Values": {
    "DTS_CONNECTION_STRING": "Endpoint=http://localhost:8080;Authentication=None",
    "TASKHUB_NAME": "default"
  }
}
```

This project uses Azure Managed Identity for authentication - both in production AND for local development. No API keys are required!
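As an aside, the `DTS_CONNECTION_STRING` value above follows the usual semicolon-delimited `Key=Value` shape used by Azure connection strings. A tiny illustrative parser (not part of the project):

```python
def parse_connection_string(value: str) -> dict[str, str]:
    """Split a Key=Value;Key=Value connection string into a dict."""
    parts = {}
    for segment in value.split(";"):
        if not segment:
            continue
        # partition on the FIRST '=' so values containing '=' survive intact
        key, _, val = segment.partition("=")
        parts[key] = val
    return parts

settings = parse_connection_string("Endpoint=http://localhost:8080;Authentication=None")
print(settings)
```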
- In Azure (Production): The Function App uses its User-Assigned Managed Identity with the `Cognitive Services OpenAI User` role
- Locally: Developers use their own Azure credentials via `az login` with the same role
When you provision the infrastructure, the Bicep deployment automatically:
- Creates the Azure OpenAI resource
- Assigns the `Cognitive Services OpenAI User` role to the Function App's managed identity
- Assigns the same role to YOU (the person running `azd provision`)
This is done via the `principalId` parameter in main.bicep:
```bicep
// This assigns the role to the person running azd provision
module openaiRoleAssignmentDeveloper 'app/rbac/openai-access.bicep' = if (!empty(principalId)) {
  name: 'openaiRoleAssignmentDeveloper'
  scope: rg
  params: {
    openAIAccountName: openai.outputs.aiServicesName
    roleDefinitionId: CognitiveServicesOpenAIUser
    principalId: principalId // Your user ID
  }
}
```

```bash
# Login to Azure
az login

# Verify you're using the correct subscription
az account show

# Run the postprovision script
./scripts/generate-settings.sh

# Verify it was created (notice: no API key!)
cat src/local.settings.json

# Run the Functions app
cd src
func start
```

The Azure Functions runtime will automatically use your Azure credentials via DefaultAzureCredential.
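One cheap way to confirm the keyless setup is to scan the generated local.settings.json for anything that looks like an embedded credential. A hypothetical checker (the name-based heuristic and the sample settings are assumptions for illustration):

```python
import json

# Sample keyless settings, mirroring what generate-settings.sh produces above.
SAMPLE_SETTINGS = """
{
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "DTS_CONNECTION_STRING": "Endpoint=http://localhost:8080;Authentication=None",
    "TASKHUB_NAME": "default"
  }
}
"""

def find_secret_like_keys(settings_text: str) -> list[str]:
    """Return setting names whose names suggest an embedded credential."""
    values = json.loads(settings_text).get("Values", {})
    suspicious = ("KEY", "SECRET", "PASSWORD", "TOKEN")
    return [name for name in values if any(s in name.upper() for s in suspicious)]

print(find_secret_like_keys(SAMPLE_SETTINGS))  # expect an empty list for keyless settings
```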
You can monitor orchestration instances in real-time using the DTS dashboard:
- Open http://localhost:8082/ in your browser
- Click on your task hub (default: default)
- View orchestration status, history, and execution details
Note: The DTS emulator stores data in memory, so all orchestration data is lost when the container stops.
✅ No secrets to manage - No API keys in config files or environment variables
✅ More secure - Credentials never leave Azure's identity system
✅ Production parity - Local dev works exactly like production
✅ Automatic role assignment - Developers get the right permissions when they provision
✅ Audit trail - All API calls are tied to specific user identities
- Check your Azure login:

  ```bash
  az login
  az account show
  ```

- Verify your role assignment:

  ```bash
  # Get your user principal ID
  az ad signed-in-user show --query id -o tsv

  # List role assignments (replace <resource-group> and <openai-name>)
  az role assignment list --assignee <your-principal-id> \
    --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.CognitiveServices/accounts/<openai-name>
  ```

- If the role is missing, run `azd provision` again or manually assign it:

  ```bash
  az role assignment create \
    --role "Cognitive Services OpenAI User" \
    --assignee <your-email-or-principal-id> \
    --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.CognitiveServices/accounts/<openai-name>
  ```
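The long `--scope` argument is easy to mistype. For illustration, here is how it decomposes into its parts (the example values are hypothetical):

```python
def openai_scope(subscription_id: str, resource_group: str, account_name: str) -> str:
    """Build the ARM resource ID used as --scope for the OpenAI role assignment."""
    return (
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.CognitiveServices"
        f"/accounts/{account_name}"
    )

# Hypothetical example values:
print(openai_scope("00000000-0000-0000-0000-000000000000", "my-rg", "my-openai"))
```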
If other developers join the project:
Option 1: They run `azd provision` (if they have subscription permissions)
- This will automatically assign them the role
Option 2: Manually assign the role (if you're the admin)
```bash
az role assignment create \
  --role "Cognitive Services OpenAI User" \
  --assignee <developer-email> \
  --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.CognitiveServices/accounts/<openai-name>
```

Q: Do I need an API key for local development?
A: No! Just make sure you're logged in with `az login`.
Q: What if I want to use an API key anyway?
A: You can manually add `AZURE_OPENAI_KEY` to your local.settings.json, but it's not recommended. The keyless approach is more secure.
Q: Does this work in CI/CD pipelines?
A: Yes! Use a service principal or managed identity for your CI/CD pipeline with the same role assignment.
Q: What about Cosmos DB access?
A: Same approach! Your user is assigned DocumentDB Account Contributor role during provisioning (see the cosmosDb module in main.bicep).