In the realm of Salesforce development, understanding the fundamental building blocks of data manipulation is paramount. At the heart of this lie Data Manipulation Language (DML) statements. These are not abstract concepts; they are the direct commands that developers issue to the Salesforce platform to interact with and modify the data stored within your organization’s database. Whether you’re building custom applications, automating business processes, or performing complex data migrations, a firm grasp of DML statements is essential for efficient, accurate, and secure data management. This article will delve into what DML statements mean in Salesforce, exploring their types, syntax, best practices, and their critical role in unlocking the full potential of the platform.

Understanding the Core of Data Manipulation in Salesforce
At its essence, Salesforce is a robust Customer Relationship Management (CRM) system built on a sophisticated database architecture. To effectively leverage this data, developers need a way to create, read, update, and delete records. This is precisely where DML statements come into play. They are the verbs of Salesforce’s data language, allowing you to orchestrate the flow of information and ensure your business processes are accurately reflected in your CRM.
The CRUD Operations: The Foundation of DML
DML statements are intrinsically linked to the fundamental operations performed on any database: Create, Read, Update, and Delete (CRUD). In Salesforce, these operations are mapped to specific DML keywords that form the backbone of data interaction.
Creating New Records (Insert)
The insert statement is used to add new records to Salesforce objects. This could be a new Lead, a new Account, a new Opportunity, or any custom object you’ve defined. When you execute an insert statement, you are essentially populating a new row in a database table with the specified values for each column (field).
- Syntax: The basic syntax involves the `insert` keyword followed by the list of records you wish to insert. These records are typically represented as sObject lists in Apex code.

```apex
List<Account> newAccounts = new List<Account>();
Account acc1 = new Account(Name='Acme Corporation', Industry='Technology');
Account acc2 = new Account(Name='Beta Solutions', Industry='Finance');
newAccounts.add(acc1);
newAccounts.add(acc2);
insert newAccounts;
```

- Considerations: When inserting records, it’s crucial to ensure that all required fields are populated. Salesforce enforces validation rules, and attempting to insert a record that violates these rules will result in an error. Furthermore, consider the potential for duplicate records, and implement strategies to prevent them if necessary.
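Duplicate prevention can also be handled programmatically. A minimal sketch using `Database.DMLOptions` to enforce duplicate rules on insert (this assumes duplicate rules are configured for Account in your org):

```apex
// Block records that an active duplicate rule flags as duplicates.
Database.DMLOptions dmlOpts = new Database.DMLOptions();
dmlOpts.DuplicateRuleHeader.allowSave = false;

Account acc = new Account(Name='Acme Corporation');
Database.SaveResult sr = Database.insert(acc, dmlOpts);
if (!sr.isSuccess()) {
    // The SaveResult carries the reason, e.g. a DUPLICATES_DETECTED status code.
    System.debug('Insert blocked: ' + sr.getErrors()[0].getMessage());
}
```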
Reading Existing Records (Querying: not strictly a DML statement, but the prelude to manipulation)
While SELECT statements in SOQL (Salesforce Object Query Language) are used for retrieving data, they are the crucial first step before performing any DML operation. You need to identify the records you want to manipulate before you can update or delete them.
- Syntax: SOQL queries use the `SELECT` keyword to specify the fields to retrieve, `FROM` to indicate the object, and `WHERE` to filter the results.

```apex
Account existingAccount = [SELECT Id, Name, Industry FROM Account WHERE Name = 'Acme Corporation' LIMIT 1];
```

- Importance in DML: Before updating or deleting records, you’ll often query for them to retrieve their unique `Id` and any other necessary information. This ensures you are targeting the correct data.
Updating Existing Records (Update)
The update statement is used to modify the values of existing records in Salesforce. This is a common operation for keeping customer information current, advancing opportunities through sales stages, or reflecting changes in business status.
- Syntax: Similar to `insert`, the `update` statement takes a list of sObjects (or a single sObject) whose fields have been modified.

```apex
Account existingAccount = [SELECT Id, Name, Industry FROM Account WHERE Name = 'Acme Corporation' LIMIT 1];
existingAccount.Industry = 'Software';
update existingAccount;
```

- Best Practices: When updating multiple records, it’s generally more efficient to query them into a list, modify their fields, and then perform a single `update` operation on the whole list, rather than performing one update per record. This minimizes the number of DML statements executed, which is important for governor limits.
Deleting Records (Delete)
The delete statement is used to remove records from Salesforce. This should be done with caution. Deleted records are moved to the Recycle Bin, where they can be restored for a limited time (typically 15 days) before being permanently purged; only a hard delete bypasses the Recycle Bin.
- Syntax: The `delete` statement takes a list of sObjects (or a single sObject) to be removed.

```apex
Account accountToDelete = [SELECT Id FROM Account WHERE Name = 'Beta Solutions' LIMIT 1];
delete accountToDelete;
```

- Caution and Permissions: Deleting records is a sensitive operation. Users must have the appropriate permissions to delete records, and the action can trigger workflows, Apex triggers, and other automation. Be mindful of cascading deletes and their impact.
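For the rare cases where a hard delete is genuinely required, the `Database.emptyRecycleBin` method permanently removes records that have already been deleted. A sketch, reusing the account from the example above:

```apex
Account accountToDelete = [SELECT Id FROM Account WHERE Name = 'Beta Solutions' LIMIT 1];
delete accountToDelete;                     // record moves to the Recycle Bin
Database.emptyRecycleBin(accountToDelete);  // hard delete: the record can no longer be undeleted
```

Because this is irreversible, it should be reserved for scenarios such as purging large data volumes after a migration.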
Upsert: A Powerful Combination
Beyond the individual CRUD operations, Salesforce offers the upsert statement, a highly efficient DML operation that combines the functionality of insert and update. It allows you to insert new records or update existing ones based on a specified external or custom unique identifier.
- How it Works: When you perform an `upsert` on a list of sObjects, Salesforce first attempts to find a matching record in the database based on the unique field you’ve designated (e.g., an external ID field); if no field is specified, the record `Id` is used. If a match is found, the existing record is updated with the provided data. If no match is found, a new record is inserted.
- Syntax: The `upsert` statement lets you specify the object and the field to use for matching.

```apex
List<Contact> newOrUpdatedContacts = new List<Contact>();
Contact contact1 = new Contact(LastName='Smith', Email='john.smith@example.com', External_ID__c='EMP123');
Contact contact2 = new Contact(LastName='Jones', Email='jane.jones@example.com', External_ID__c='EMP456');
newOrUpdatedContacts.add(contact1);
newOrUpdatedContacts.add(contact2);
upsert newOrUpdatedContacts External_ID__c;
```
- Benefits: `upsert` is incredibly valuable for data integration scenarios, bulk data loading, and synchronizing data from external systems. It simplifies your code by reducing the need for separate `if/else` logic to determine whether to insert or update.
DML in Apex: The Developer’s Toolkit
While DML statements can be executed through various means, their most prominent use in Salesforce development is within Apex, the proprietary programming language. Apex code allows developers to implement complex business logic, create custom user interfaces, and automate intricate processes, all of which heavily rely on DML operations.
Apex Triggers: Reactive Data Manipulation
Apex triggers are special Apex classes that execute automatically before or after a DML event occurs on a specific object. They are a powerful mechanism for enforcing business rules, performing complex validation, and automating related data changes.
- Trigger Contexts: Triggers can fire in various contexts: `before insert`, `after insert`, `before update`, `after update`, `before delete`, `after delete`, and `after undelete`. Each context provides access to different sets of records and allows for specific types of actions. `before` triggers are ideal for validating data, modifying field values before they are committed to the database, and preventing DML operations. `after` triggers are used for actions that should occur after the data has been successfully saved, such as updating related records, sending emails, or making callouts to external systems.
- DML within Triggers: You can perform DML operations within Apex triggers, but it’s crucial to do so responsibly. For example, an `after insert` trigger on an Account might insert related Contacts, or an `after update` trigger on an Opportunity might update a related Project record.
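The first pattern can be sketched as follows (the trigger name and default field values are illustrative, not a prescribed convention):

```apex
// Hypothetical example: create a default Contact for each new Account.
// Note the bulkified shape: one insert covering every record in Trigger.new.
trigger CreateDefaultContact on Account (after insert) {
    List<Contact> contactsToInsert = new List<Contact>();
    for (Account acc : Trigger.new) {
        contactsToInsert.add(new Contact(
            LastName = 'Primary Contact',
            AccountId = acc.Id
        ));
    }
    insert contactsToInsert;
}
```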
DML Statements and Governor Limits
Salesforce has a set of governor limits designed to ensure that Apex code runs efficiently and doesn’t consume excessive resources. DML operations are a significant factor in these limits.
- Total DML Statements: There’s a limit on the total number of DML statements you can execute within a single Apex transaction: 150.
- DML Rows: There’s also a limit on the total number of records that can be processed by DML operations within a transaction: 10,000 rows.
- Best Practices for DML Statements: To stay within these limits and ensure optimal performance, several best practices should be followed:
- Bulkify your code: Instead of performing DML operations on individual records within a loop, collect all records into lists and perform a single DML operation on the entire list. This is known as bulkification and is the most critical DML best practice.
- Minimize SOQL queries and DML statements: Efficiently query for the data you need and perform DML operations only when necessary.
- Use `upsert` strategically: `upsert` can reduce the number of DML statements required for data loading and integration.
- Consider `Database` methods: For more granular error handling, you can use the `Database` class methods (e.g., `Database.insert`, `Database.update`). These methods allow for partial success, meaning if some records fail, others can still succeed.
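The bulkification rule above is easiest to see as a before/after sketch (reusing the Account examples from earlier; `accountsToUpdate` is an assumed list of queried records):

```apex
// Anti-pattern: one DML statement per record inside a loop.
// A list of 200 accounts would consume 200 DML statements and
// exceed the 150-statement governor limit.
for (Account acc : accountsToUpdate) {
    acc.Industry = 'Technology';
    update acc; // DML inside a loop: avoid this
}

// Bulkified: modify records in the loop, then issue a single DML statement.
for (Account acc : accountsToUpdate) {
    acc.Industry = 'Technology';
}
update accountsToUpdate; // one DML statement regardless of list size
```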
Advanced DML Concepts and Considerations
Beyond the basic DML statements, Salesforce offers more advanced functionalities and considerations that are crucial for robust data management.
allOrNone Parameter
When performing DML operations through the `Database` class methods, you can specify the `allOrNone` parameter. This parameter dictates whether the entire operation must succeed as a unit or whether partial success is allowed.
- `allOrNone = true` (default): If any record in the DML operation fails validation or encounters an error, the entire operation is rolled back, and no changes are committed. This ensures data integrity by preventing partial updates or inserts.
- `allOrNone = false`: The DML operation can succeed partially. Records that can be successfully processed are committed, while those that fail are not. The operation returns a `Database.SaveResult` or `Database.DeleteResult` array, allowing you to inspect which records succeeded and which failed, along with the reasons for failure.
DML Options and Database Methods
The Database class in Apex provides methods that offer more control over DML operations than their literal statement counterparts. These methods are particularly useful when you need to handle partial successes or errors gracefully.
- `Database.insert(sObjectList, allOrNone)`: Inserts a list of sObjects.
- `Database.update(sObjectList, allOrNone)`: Updates a list of sObjects.
- `Database.delete(sObjectList)`: Deletes a list of sObjects.
- `Database.upsert(sObjectList, field)`: Performs an upsert operation.
These methods return arrays of Database.SaveResult or Database.DeleteResult objects, which contain success/failure status and error messages for each individual record processed.
Error Handling with DML
Robust error handling is essential when working with DML statements. Unexpected errors can occur due to validation rules, trigger logic, or data integrity issues.
- `try-catch` Blocks: In Apex, you should wrap your DML operations within `try-catch` blocks to gracefully handle any exceptions that may arise.

```apex
try {
    insert newAccounts;
} catch (DmlException e) {
    System.debug('DML operation failed: ' + e.getMessage());
    // Implement error logging or user notification
}
```
- Inspecting `SaveResult`: When using `Database` methods with `allOrNone = false`, you can iterate through the returned `SaveResult` array to identify and log errors for individual records.
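A minimal sketch of that pattern, assuming `newAccounts` is the list from the earlier insert example:

```apex
// Partial-success insert: failed records do not abort the whole operation.
Database.SaveResult[] results = Database.insert(newAccounts, false);
for (Database.SaveResult sr : results) {
    if (sr.isSuccess()) {
        System.debug('Inserted record with Id: ' + sr.getId());
    } else {
        // Each failed record carries one or more Database.Error entries.
        for (Database.Error err : sr.getErrors()) {
            System.debug('Insert failed: ' + err.getStatusCode() + ' - ' + err.getMessage());
        }
    }
}
```

In production code, the failure branch would typically write to a custom logging object or notify an administrator rather than only calling `System.debug`.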
By understanding and effectively implementing DML statements, developers can build powerful, data-driven solutions within Salesforce that accurately reflect and drive business operations. Mastering these commands is not just about writing code; it’s about wielding the fundamental tools that empower you to shape and manage the data that fuels your organization’s success.
