When you build an Azure WebJob that processes messages from a Service Bus queue or topic, controlling how many messages are processed in parallel can be crucial. Too much parallelism can overwhelm downstream systems or databases; too little can leave throughput on the table.
In this post, we’ll explore how to fine-tune concurrency using the WebJobs SDK’s Service Bus options — specifically MaxConcurrentCalls, MaxMessageBatchSize, and a few global settings.
🧩 The Basics: How Concurrency Works in Service Bus Triggers
Each Azure WebJob host runs a message pump that:
- Receives messages from Service Bus.
- Dispatches them to your function handler (a minimal handler is sketched below).
- Optionally acknowledges (completes) or abandons them.
Two primary settings control this behavior:
- MaxConcurrentCalls – how many message handlers can run at once per host.
- MaxMessageBatchSize – how many messages can be batched together in a single invocation.
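For context, here is a minimal sketch of the kind of handler the pump dispatches to, assuming the standard ServiceBusTrigger binding; the queue name "orders" is just a placeholder. Every invocation of a handler like this occupies one of the concurrency slots discussed below.

```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class Functions
{
    // Each invocation of this method occupies one concurrency "slot".
    // "orders" is a placeholder queue name; adjust to your own queue or topic subscription.
    public void ProcessOrder(
        [ServiceBusTrigger("orders")] string message,
        ILogger logger)
    {
        logger.LogInformation("Processing message: {Message}", message);
    }
}
```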
⚙️ MaxConcurrentCalls: The Core Concurrency Limit
```csharp
b.AddServiceBus(options =>
{
    options.MessageHandlerOptions.MaxConcurrentCalls = 3;
});
```
What it does: At most three function executions will run at any given time (per WebJob host instance). If more messages are available, they’ll wait in the Service Bus queue until one of the three slots is free.
Why it matters:
- Keeps CPU, memory, or I/O under control.
- Prevents flooding downstream APIs or databases.
- Useful for rate-limiting.
| Setting | Meaning |
|---|---|
| MaxConcurrentCalls = 1 | Processes messages one at a time. |
| MaxConcurrentCalls = 3 | Up to three parallel executions. |
| MaxConcurrentCalls = 20 | High throughput; only if your app and infrastructure can handle it. |
📦 MaxMessageBatchSize: Controlling Batch Size
```csharp
b.AddServiceBus(options =>
{
    options.BatchOptions.MaxMessageBatchSize = 5;
});
```
What it does: The trigger can pull up to five messages at once and deliver them in a single function invocation.
When to use batching:
- When the work per message is small and the overhead of function invocation is significant.
- When you want to perform operations in bulk (e.g., database inserts).
MaxMessageBatchSize doesn’t multiply concurrency — it changes the shape of each invocation. You could have three parallel function calls (due to MaxConcurrentCalls = 3), each processing up to five messages — for a total of 15 messages being processed at once.
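To receive messages as a batch, the function binds to an array instead of a single message. Here is a rough sketch, assuming the newer Azure.Messaging.ServiceBus-based extension (binding types differ in older versions) and the placeholder queue name "orders":

```csharp
using Azure.Messaging.ServiceBus;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public class BatchFunctions
{
    // Binding to an array asks the trigger to deliver up to MaxMessageBatchSize
    // messages in a single invocation. "orders" is a placeholder queue name.
    public void ProcessOrderBatch(
        [ServiceBusTrigger("orders")] ServiceBusReceivedMessage[] messages,
        ILogger logger)
    {
        logger.LogInformation("Received a batch of {Count} messages", messages.Length);

        // A typical batch pattern: translate the whole batch into one bulk
        // operation (for example, a single multi-row database insert).
    }
}
```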
🧮 Putting It Together
Let’s say you apply the following configuration:
```csharp
options.MessageHandlerOptions.MaxConcurrentCalls = 3;
options.BatchOptions.MaxMessageBatchSize = 5;
```
With these settings:
- Maximum parallel invocations per host = 3
- Maximum messages per invocation = 5
- Therefore, max messages processed in parallel per host = 3 × 5 = 15
🌍 Global Concurrency Cap: MaxConcurrentFunctionExecutions
```csharp
services.Configure<JobHostOptions>(o =>
{
    o.MaxConcurrentFunctionExecutions = 3;
});
```
This ensures that across your entire WebJob (and all triggers), no more than three functions run concurrently — even if multiple queues or timers are active.
☁️ Scaling Considerations
| Setting | Instances | Effective Max Parallelism |
|---|---|---|
| MaxConcurrentCalls = 3 | 1 | 3 |
| MaxConcurrentCalls = 3 | 2 | 6 |
| MaxConcurrentCalls = 3 | 4 | 12 |
If you truly need to cap total concurrency globally, either:
- Pin your WebJob to a single instance (see the settings.job sketch below).
- Or use a global cap (MaxConcurrentFunctionExecutions) and plan around scale-out behavior.
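If you go the single-instance route for a continuous WebJob, the documented mechanism is a settings.job file deployed alongside the WebJob binaries with the singleton flag set:

```json
{
  "is_singleton": true
}
```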
🧠 Recommended Patterns
| Scenario | Recommended Setting |
|---|---|
| Low-throughput, critical task | MaxConcurrentCalls = 1 |
| Moderate concurrency, safe processing | MaxConcurrentCalls = 3 |
| Bulk processing with small messages | MaxConcurrentCalls = 3, MaxMessageBatchSize = 5 |
| Global ceiling across multiple functions | MaxConcurrentFunctionExecutions = N |
| Session-enabled queues | MaxConcurrentSessions = N |
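For the session-enabled case in the last row, the setting lives on the same Service Bus options. A sketch, assuming the current extension where it is a top-level property (older versions expose it via a nested SessionHandlerOptions); the value 4 is arbitrary:

```csharp
b.AddServiceBus(options =>
{
    // With sessions, concurrency is governed per session rather than per message:
    // at most this many sessions are processed in parallel on this host.
    options.MaxConcurrentSessions = 4;
});
```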
✅ Full Configuration Example
Here’s a complete example for a .NET 6 WebJob that processes a Service Bus queue with controlled concurrency:
```csharp
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.ServiceBus;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var host = new HostBuilder()
    .ConfigureWebJobs(b =>
    {
        b.AddAzureStorageCoreServices();
        b.AddServiceBus(o =>
        {
            o.MessageHandlerOptions.MaxConcurrentCalls = 3;
            o.BatchOptions.MaxMessageBatchSize = 5;
        });
    })
    .ConfigureServices(s =>
    {
        s.Configure<JobHostOptions>(o =>
        {
            o.MaxConcurrentFunctionExecutions = 3; // optional global cap
        });
    })
    .Build();

host.Run();
```
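One thing the host setup above does not show is where the Service Bus connection comes from. By default the trigger looks for a connection setting named AzureWebJobsServiceBus in configuration or environment variables, or you can point an individual trigger at a differently named setting. A small sketch with placeholder names:

```csharp
using Microsoft.Azure.WebJobs;

public class OrderFunctions
{
    // "orders" and "MyServiceBusConnection" are placeholder names.
    // If Connection is omitted, the binding falls back to the
    // AzureWebJobsServiceBus connection setting.
    public void Process(
        [ServiceBusTrigger("orders", Connection = "MyServiceBusConnection")] string message)
    {
        // Handle the message here.
    }
}
```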
🧭 Summary
| Setting | Scope | Controls | Default | Typical Use |
|---|---|---|---|---|
| MaxConcurrentCalls | Per function | Parallel message handlers | 16 | Primary concurrency limit |
| MaxMessageBatchSize | Per function | Messages per batch | 1 | Batching efficiency |
| MaxConcurrentFunctionExecutions | Global | Total concurrent functions | ∞ | Hard ceiling across WebJob |
🔍 Final Thoughts
Tuning concurrency isn’t one-size-fits-all — it depends on your workload, message size, and downstream systems. Start conservative, measure throughput and latency, and scale up gradually.
By combining MaxConcurrentCalls, MaxMessageBatchSize, and MaxConcurrentFunctionExecutions, you gain precise control over performance, stability, and cost in your Azure WebJobs.
