
Maximum number of records a trigger processes in a single batch?

Salesforce Asked on November 23, 2021

What is the maximum number of records a trigger needs to be able to process in a single transaction/batch?

Is it limited to 200, or do triggers need to be able to handle more?

4 Answers

This also applies to batch jobs: Salesforce does not reset governor limits between the auto-chunked trigger invocations. That is, if your batch job updates 1000 records of the same object in one batch, the trigger fires 5 times, and Salesforce does not reset governor limits between those trigger chunks.
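As a minimal sketch of that situation (the class name and query are hypothetical), a batch run with a scope size of 1000 hits this: each execute() call performs one DML statement on 1000 records, which fires the Account trigger in 5 chunks of 200 that all share the same governor limits:

public class AccountBatch implements Database.Batchable<SObject> {

    public Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id FROM Account');
    }

    public void execute(Database.BatchableContext bc, List<Account> scope) {
        // One DML on up to 1000 records: the trigger fires 5 times, and
        // Limits.getQueries() etc. accumulate across those chunks.
        update scope;
    }

    public void finish(Database.BatchableContext bc) {}
}

// Run with a scope size of 1000:
// Database.executeBatch(new AccountBatch(), 1000);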

Answered by sherry peng on November 23, 2021

In addition to the other answers, Platform Event triggers can be presented with up to 2000 records and are not chunked into 200.

Since Summer '19 there are ways to work around this; see the Platform Events Guide section on processing smaller batches.
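As a rough sketch of that workaround (the event object Order_Event__e is hypothetical), Summer '19's setResumeCheckpoint lets the trigger stop after a smaller slice and have the remaining events redelivered in a new invocation:

trigger OrderEventTrigger on Order_Event__e (after insert) {
    Integer processed = 0;

    for (Order_Event__e evt : Trigger.new) {
        // ... process the event ...

        // Record progress; events after this checkpoint are resent
        // if the trigger exits before consuming the whole batch.
        EventBus.TriggerContext.currentContext().setResumeCheckpoint(evt.ReplayId);

        if (++processed == 200) {
            break; // the rest of the batch is redelivered in a new invocation
        }
    }
}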

Answered by cropredy on November 23, 2021

As per the documentation

Implementation Considerations:

the maximum chunk size is 200 (for newer API versions). This mentions the Bulk API, but my understanding is that it applies to any "bulk operation" (including a DML operation in Apex).

As also covered in @cropredy's answer, platform event subscribers receive chunks of up to 2000 records, though this is now configurable.

Answered by Phil W on November 23, 2021

When a single DML operation affects more than 200 records, Salesforce runs the trigger in chunks of 200.

So, if 1000 records are updated, Salesforce runs the trigger 5 times, on 200 records each time.

This means that whatever you do in that trigger gets hit 5 times. For example, if you do 1 SOQL query per invocation, then the 1000-record update will use 5 queries.

For example, suppose we have this useless trigger on Account:

trigger AccountTrigger on Account (after insert) {
    // One SOQL query per invocation; limits accumulate across chunks
    List<Account> accounts = [SELECT Id FROM Account];

    System.debug('Number of SOQL queries used: ' + Limits.getQueries());
}

Then, the following test passes:

@IsTest
private class TriggerChunkingTest {

    @IsTest
    static void fiveChunks() {
        Integer nAccounts = 1000;
        List<Account> accounts = new List<Account>();

        for(Integer i=0; i < nAccounts; i++) {
            accounts.add(new Account(Name = String.valueOf(i)));
        }

        Test.startTest();
        insert accounts; // fires the trigger in 5 chunks of 200
        System.assertEquals(5, Limits.getQueries()); // one query per chunk
        Test.stopTest();
    }
}

With the following debug output:

11:50:15.253 (1264272078)|USER_DEBUG|[9]|DEBUG|Number of SOQL queries used: 1
11:50:17.429 (3437153880)|USER_DEBUG|[9]|DEBUG|Number of SOQL queries used: 2
11:50:18.693 (4705911499)|USER_DEBUG|[9]|DEBUG|Number of SOQL queries used: 3
11:50:20.586 (6601999951)|USER_DEBUG|[9]|DEBUG|Number of SOQL queries used: 4
11:50:22.141 (8156881669)|USER_DEBUG|[9]|DEBUG|Number of SOQL queries used: 5

As you can imagine, even with efficient triggers, this can blow up pretty fast. 5000 records? Your trigger is going to run 25 times.

As a rule of thumb, I like things to work OK on 1000 records.

If my system can cause updates to more records than that at once, I tend to use some asynchronous method to split that update up.

For example, Campaign Members tend to blow up in crazy ways. Suppose I have a trigger on Campaign which needs to update the corresponding Campaign Members (CMs), and those CMs have their own triggers. I would update the CMs using a Queueable which only updates 1000 or so at a time (see the sketch below). With Campaigns, you can easily end up with more than 10,000 members, so you have no option but to split that update yourself or you'll hit the DML row limit.
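A minimal sketch of that approach (the class name and the Needs_Update__c flag field are hypothetical): a Queueable that updates up to 1000 Campaign Members per job and re-enqueues itself until none remain:

public class CampaignMemberUpdater implements Queueable {

    private Id campaignId;

    public CampaignMemberUpdater(Id campaignId) {
        this.campaignId = campaignId;
    }

    public void execute(QueueableContext ctx) {
        List<CampaignMember> members = [
            SELECT Id FROM CampaignMember
            WHERE CampaignId = :campaignId AND Needs_Update__c = true
            LIMIT 1000
        ];

        // ... apply the field changes here, clearing Needs_Update__c
        // so the query above makes progress on the next run ...

        update members; // fires the CM triggers in 5 chunks of 200

        if (members.size() == 1000) {
            // More may remain; chain another job for the next slice
            System.enqueueJob(new CampaignMemberUpdater(campaignId));
        }
    }
}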

Answered by Aidan on November 23, 2021
