Optimizing AX Batch Performance - Batch Group Configuration


Are you pushing lots of transactions through the AX batch framework? Are you wondering if you've optimized the system for your workload? We recently did some testing in our lab to illustrate some of the performance and concurrency characteristics of the AX 2009 batch framework. Our goal was to show the impact of two commonly overlooked configuration options (batch groups & batch threads) that can greatly affect the throughput of your system. In this post we'll address the topic of batch groups. We'll cover batch thread configuration in a subsequent post. Now to the details...

Test Environment

In our AX 2009 test environment we have 1 SQL Server hosting the AX database and 2 AOS servers configured for batch processing. 2 separate companies were created within AX, and each has 10,000 orders that need to be invoiced. A separate invoicing batch job was set up for each company to process its orders. To illustrate how batch groups affect batch job concurrency, we ran 2 different tests.

In scenario 1, a single batch group exists and all batch activity is configured to use it. Both AOS instances are assigned to this batch group and both are using the default of 8 batch threads.

In scenario 2, 2 batch groups exist. Company A is configured to use batch group A and Company B is configured to use batch group B. 1 AOS instance is assigned only to batch group A and the other AOS instance is assigned only to batch group B. Both AOS instances are using the default of 8 batch threads.
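To make the difference between the two setups concrete, here's a minimal sketch of the thread-pool topologies as plain Python data. This is purely an abstract model of the test configurations described above; the names below are not real AX tables, classes, or settings.

```python
# Abstract model of the two test configurations (illustrative only;
# these dictionaries do not correspond to real AX objects or tables).

# Scenario 1: one batch group served by both AOS instances.
# Both invoicing jobs compete for the same 16 threads (2 AOS x 8 threads).
scenario_1 = {
    "SharedBatchGroup": {"aos_instances": ["AOS1", "AOS2"], "batch_threads": 16},
}

# Scenario 2: two batch groups, each served by a dedicated AOS instance.
# Each invoicing job has its own pool of 8 threads.
scenario_2 = {
    "BatchGroupA": {"aos_instances": ["AOS1"], "batch_threads": 8},
    "BatchGroupB": {"aos_instances": ["AOS2"], "batch_threads": 8},
}
```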

Test Results

Scenario 1: The red line represents the number of invoicing batch tasks completed each minute for company A. The blue line represents the number of invoicing batch tasks completed each minute for company B. Both jobs were started at the same time. 

Scenario 2: The red line represents the number of invoicing batch tasks completed each minute for company A. The blue line represents the number of invoicing batch tasks completed each minute for company B. Both jobs were started at the same time.

  • When a single batch group was used (scenario 1), one job consumed all available threads across both AOS instances until it completed. Then the second job started and did the same thing. Each job only took 6.5 minutes, but the jobs executed serially.
  • When separate batch groups were used (scenario 2), each job had access to a set of dedicated batch threads. The jobs processed invoices concurrently, but they both took 13 minutes to complete since each job only had half the threads available to it.  
  • In the end, the total amount of time required to process all 20,000 orders was almost exactly the same, with both scenarios completing in 13 minutes. It's just the order in which the jobs were processed that differed (the quick arithmetic sketch below shows why the totals line up).
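
Here's a quick back-of-the-envelope check of those timings. The per-thread throughput figure is derived from the reported numbers rather than measured separately, so treat it as a rough estimate.

```python
# Rough sanity check of the reported timings (derived numbers, not measurements).
orders_per_company = 10_000
threads_per_aos = 8

# Scenario 1: each job runs on all 16 threads, one job after the other.
throughput_16 = orders_per_company / 6.5                 # ~1,538 orders/minute
per_thread = throughput_16 / (2 * threads_per_aos)       # ~96 orders/minute/thread

# Scenario 2: each job runs on 8 dedicated threads, both jobs at the same time.
throughput_8 = per_thread * threads_per_aos              # ~769 orders/minute
minutes_per_job = orders_per_company / throughput_8      # ~13 minutes

serial_total = 6.5 + 6.5            # scenario 1: jobs run back to back
concurrent_total = minutes_per_job  # scenario 2: jobs fully overlap
print(serial_total, round(concurrent_total, 1))          # 13.0 13.0
```

Either way, roughly 20,000 orders flow through the same 16 batch threads, so the overall wall-clock time is governed by total thread capacity rather than by how those threads are partitioned.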

Conclusions

Based on what we saw in the test results, I think you'll agree it's important to consider the different workloads you have and how they interact. If all of your batch jobs have the same priority and it's acceptable for any of those jobs to queue up behind another one for a while, then a single pool of batch resources (a single batch group) will likely meet your needs. But if you have a time-sensitive batch job mixed in with everything else, you can't guarantee that job will execute when you have it scheduled, because it doesn't have dedicated resources.

We have observed the job queuing behavior exhibited in scenario 1 a number of times with different customers: a large multi-threaded batch job consumes all available batch threads across all AOS instances for many hours while a small but time-sensitive job sits waiting to run until the other one is complete. This can be a frustrating situation, but the good news is there's a simple solution: move the time-sensitive job to a separate batch group to ensure resources are available when it needs to execute. Then you can rest easy knowing it will run on schedule regardless of the other activity on the system.
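
To see that queuing effect in miniature, here's a toy simulation using Python's ThreadPoolExecutor. The thread pools simply stand in for batch threads; this has nothing to do with the actual AX batch scheduler, and the task durations are made up, but the waiting pattern is the same one described above.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def task(seconds):
    time.sleep(seconds)  # stand-in for invoicing work

def run_shared_pool():
    """One shared pool: the small job queues behind the large job's tasks."""
    start = time.time()
    with ThreadPoolExecutor(max_workers=4) as pool:
        for _ in range(8):                 # large multi-threaded job
            pool.submit(task, 1.0)
        small = pool.submit(task, 0.1)     # small, time-sensitive job
        small.result()
        print(f"shared pool: small job done after {time.time() - start:.1f}s")

def run_dedicated_pools():
    """Separate pools: the small job starts immediately."""
    start = time.time()
    with ThreadPoolExecutor(max_workers=3) as big_pool, \
         ThreadPoolExecutor(max_workers=1) as small_pool:
        for _ in range(8):                 # large multi-threaded job
            big_pool.submit(task, 1.0)
        small = small_pool.submit(task, 0.1)
        small.result()
        print(f"dedicated pool: small job done after {time.time() - start:.1f}s")

run_shared_pool()       # small job waits roughly 2 seconds behind the large job
run_dedicated_pools()   # small job finishes in roughly 0.1 seconds
```

The total amount of work is the same in both runs; the only difference is that the dedicated pool guarantees the small job a thread the moment it's submitted, which is exactly what a separate batch group does for a time-sensitive job.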

In the next post on batch performance we'll discuss the topic of batch threads. Stay tuned...
