Saturday, December 17, 2022

Interview Questions on Data Loader and Import Wizard in Salesforce

Q. Difference between Bulk API and Data Loader?

  1. Data Loader is a good tool to insert up to 5 million records.
  2. Bulk API allows Data Loader to run inserts asynchronously. In serial mode, batches are processed one at a time; serial mode avoids the database contention that parallel mode can cause.
  3. Bulk API is very fast when you are working on large data sets, such as 2 million records. The SOAP API, Data Loader's default, is comparatively slow.

Q. Can we import history objects in Salesforce using data loader?

History records cannot be created, since AccountHistory, CaseHistory, and ContactHistory don't support the create() operation.

“Note that it is not possible to insert directly into the Opportunity History or Case History tables. These tables mainly carry the field audit history information of Opportunity or Case records. If such data does need to be migrated over, you should migrate this data into a Read Only Custom Object.”

Q. When to use data loader?

You need to load 50,000 to 5,000,000 records. Data Loader is supported for loads of up to 5 million records. If you need to load more than 5 million records, use a product from the AppExchange.

Q. Can we use record type name in data loader while importing the data?

You cannot use the record type name in Data Loader to map to a record type. You will need to use the RecordTypeId for this purpose.

Changing record types for multiple records via Data Loader is not as straightforward as it would seem: instead of the record type name, you must use the record type ID, gathered from the URL or from a query.
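Rather than copying the ID from the URL, you can also query it. A minimal SOQL sketch (filtering on Account is just an illustration):

SELECT Id, Name, DeveloperName
FROM RecordType
WHERE SobjectType = 'Account'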

Q. I want to be able to export the related Account fields for each Quote in Salesforce using Data Loader. Is it possible?

You can export a parent's (and its parents') attribute values from the child object. But from the parent, you cannot export child-object attributes.

You can traverse up to five levels of parent (foreign key) relationships from the root sObject; SOQL does not allow relationship queries more than five levels away from the root.

So the deepest supported path from Quote is:

Opportunity.Account.Parent.Parent.Parent.Name

The example below shows how to traverse from

Quote -> Opportunity -> Account -> Parent Account

Example

SELECT Id, Name,
       OpportunityId,
       Opportunity.Name,
       Opportunity.Account.Name,
       Opportunity.Account.Id,
       Opportunity.Account.ParentId,
       Opportunity.Account.Parent.Name
FROM Quote

Results: [screenshot showing the exported record hierarchy]

Q. What items cannot be migrated using data loader?

  1. Custom Metadata records, being metadata, cannot be migrated via Data Loader.
  2. Platform Events cannot be migrated.
  3. Object history cannot be migrated, because history records are created automatically whenever a user changes a record, and the history objects are not createable or updateable.

Q. I want to upload files (.doc, .pdf) from an external system as Attachments on Case records. What is the best approach?

We can go with Data Loader using the SOAP API, which is the default (not the Bulk API).

Why?

  1. You do not need to write any code to use Data Loader.
  2. It is a tried and tested solution, so no surprises.
  3. With the SOAP API you will need around 500 API calls (100,000 records / 200 batch size = 500 calls), which is not too many (Enterprise orgs have a limit of 100,000 API calls per 24 hours).
  4. Depending on the data and schema, there may also be locking issues with the Bulk API.
  5. The Bulk API does not allow a zip file larger than 10 MB.

Q. What to do when data load insert cause data locking?

Since your batch job is running in parallel across its batches, the batches can independently attempt to gain locks on the same Account, resulting in this exception.

The two techniques that are often applicable are

  • Order your incoming data by parent record ID, to prevent records in different batches from contending for the lock on the same parent (see the sample file after this list).
  • Ensure that the Data Loader is configured to utilize the Bulk API in Serial mode, rather than Parallel mode.
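As an illustration of the first technique, a contact import ordered by parent AccountId keeps all rows for one account adjacent, so they land in the same batch instead of contending across batches (all IDs below are made up):

AccountId,LastName,Email
001xx0000001AAA,Smith,smith@example.com
001xx0000001AAA,Jones,jones@example.com
001xx0000001BBB,Lee,lee@example.com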

“Enable serial mode for Bulk API”. This means your job still technically runs asynchronously, but you remove the ability to run in parallel. This checkbox is a good troubleshooting option if you are seeing a large number of locking errors, but run time increases because Data Loader processes one batch at a time.

Q. How to configure Bulk API in data loader?

The Bulk API is optimized to load or delete a large number of records asynchronously. It is faster than the SOAP-based API due to parallel processing and fewer network round-trips. By default, Data Loader uses the SOAP-based API to process records.

To configure Data Loader to use the Bulk API for inserting, updating, upserting, deleting, and hard deleting records:

  1. Open the Data Loader.
  2. Choose Settings | Settings.
  3. Select the Use Bulk API option.
  4. Click OK.

Q. What is Serial mode for Bulk API option?

You can also select the Enable serial mode for Bulk API option. Processing in parallel can cause database contention; when contention is severe, the load can fail. Serial mode processes batches one at a time, although it can increase the processing time for a load.

Q. Can we do hard delete when we configure Data loader to use Bulk API?

You can hard delete records when you configure Data Loader to Use Bulk API. Keep in mind that hard deleted records are immediately deleted and can’t be recovered from the Recycle Bin.

Q. Data Loader vs Import Wizard?

Use Data Loader when:

  • You need to load 50,000 to 5,000,000 records. Data Loader is supported for loads of up to 5 million records; for more than 5 million records, use a product from the AppExchange.
  • You need to load into an object that is not yet supported by the import wizards.
  • You want to schedule regular data loads, such as nightly imports.
  • You want to export your data for backup purposes.

Use the import wizards when:

  • You are loading fewer than 50,000 records.
  • The object you need to import is supported by import wizards.
  • You want to prevent duplicates by uploading records according to account name and site, contact email address, or lead email address.

Q. What is batch size in data loader?

In a single insert, update, upsert, or delete operation, records moving to or from Salesforce are processed in batches of this size. The maximum is 200 records, and the recommended value is between 50 and 100.

The maximum value is 10,000 if the Use Bulk API option is selected.

Q. How to attach zip files using Data Loader?

Select this option to use Bulk API to upload zip files containing binary attachments, such as Attachment records or Salesforce CRM Content.

This option is only available if the Use Bulk API option is selected.

Q. How assignment rule is added in data loader?

Specify the ID of the assignment rule to use for inserts, updates, and upserts. This option applies to inserts, updates, and upserts on cases and leads.

1. Create the lead assignment rule.

2. Put the assignment rule ID in the Data Loader settings.

Note: This won't work if the lead status is Converted.
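To find the rule ID for step 2, you can query the AssignmentRule object; a minimal SOQL sketch:

SELECT Id, Name
FROM AssignmentRule
WHERE SobjectType = 'Lead' AND Active = true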

Q. How to import attachments using dataloader?

Create an attachments.csv file (the name of the file is unimportant) with the following column headers:

ParentId – ID of the record to which the attachment should be associated

Name – Name of the attachment

ContentType – Content type of the file (typically the MIME type, e.g. application/pdf for a .pdf)

OwnerID – ID for the owner of the attachment

Body – File path to the Attachment on the local machine (C:\documents and settings\schun\desktop\attachments\file.xls)
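A sample attachments.csv might look like this (all IDs and paths are made up):

ParentId,Name,ContentType,OwnerId,Body
500xx0000001AAA,Contract.pdf,application/pdf,005xx0000001BBB,C:\attachments\Contract.pdf
500xx0000001AAC,Quote.xls,application/vnd.ms-excel,005xx0000001BBB,C:\attachments\Quote.xls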


Log in to the Data Loader.

Select the “Insert” command.

In the ‘Select Salesforce Object’ step, select the ‘Show all Salesforce objects’ checkbox and then select “Attachment”.

Choose the attachments.csv file.

In the mapping step, map the following fields:

Parent ID

Name

Owner ID

Body – Make sure to map the Body column, which you created previously with the file path. This is how you designate the file and location of each attachment to be inserted.

Click “OK” to proceed with the insert. It may take a few minutes, but the attachments should be successfully uploaded to your Salesforce org.

Q. How to import related (child) records using an External ID in Salesforce?

1. Make sure the parent object has an External ID field.

2. In the Data Loader, use the Upsert operation.

Note:
Do not select the Insert operation; use Upsert. Since we will match on the Id field, and Id is blank in the import file, all the records in the file will still be inserted as new records.

3. Match the new records on the Id field, so that the new records in the file will be inserted (created).

Note:
This matching-field option is available because we are using the Upsert operation.

4. For the parent lookup or master-detail field, select the External ID field.

5. The mapping will show the External Id mapping.

This avoids having to VLOOKUP parent record IDs while preparing the child-record file for insert.
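For illustration, assume Account has a custom External ID field hypothetically named ERP_Account_Number__c. A contact upsert file can then carry the parent's external ID instead of its Salesforce ID, with the column mapped to Account:ERP_Account_Number__c in the mapping step:

LastName,Email,ERP_Account_Number__c
Smith,smith@example.com,A-1001
Lee,lee@example.com,A-1002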

Q. Difference between 15-digit and 18-digit IDs?

ID fields in the Salesforce.com user interface contain 15-character, base-62, case-sensitive strings. Each of the 15 characters can be a numeric digit (0-9), a lowercase letter (a-z), or an uppercase letter (A-Z). Two unique IDs may only be different by a change in case.

Because there are applications like Access which do not recognize that 50130000000014c is a different ID from 50130000000014C, an 18-digit, case-safe version of the ID is returned by all API calls. The 18 character IDs have been formed by adding a suffix to each ID in the Force.com API. 18-character IDs can be safely compared for uniqueness by case-insensitive applications, and can be used in all API calls when creating, editing, or deleting data.

If you need to convert the 18-character ID to a 15-character version, truncate the last three characters. Salesforce.com recommends that you use the 18-character ID.

  • The 15-digit, case-sensitive version is referenced in the UI.
  • The 18-digit, case-insensitive version is referenced through the API.
  • The last 3 digits of the 18-digit ID are a checksum of the capitalizations of the first 15 characters; this ID length was created as a workaround for legacy systems that were not compatible with case-sensitive IDs.
  • The API will accept the 15-digit ID as input but will always return the 18-digit ID.
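If you need the 18-character form, the platform can do the conversion for you: the CASESAFEID(Id) formula function returns the 18-character version, and in Apex, assigning a 15-character value to an Id variable normalizes it. A minimal anonymous-Apex sketch (the ID value is made up):

Id shortId = '0013000000abcDE';        // made-up 15-character Account ID
System.debug(String.valueOf(shortId)); // prints the 18-character, case-safe version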

Q. Can We Bypass the Required Fields In Data Loader?

No. When inserting records through Data Loader, required fields cannot be skipped.

Q. What are the required fields when inserting Users through Data Loader?

  • Alias
  • Username
  • Email
  • First Name
  • Last Name
  • Locale (LOCALESIDKEY)
  • Language (LANGUAGELOCALEKEY)
  • Email Encoding (EMAILENCODINGKEY)
  • Time Zone (TIMEZONESIDKEY)
  • Currency (CURRENCYISOCODE)
  • ProfileId (not profile name but rather the 15-character ID from the URL of the profile in the UI) 
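A minimal user-insert file covering these fields might look like this (all values are illustrative, the ProfileId is made up, and CurrencyIsoCode is needed only in multi-currency orgs):

Alias,Username,Email,FirstName,LastName,LocaleSidKey,LanguageLocaleKey,EmailEncodingKey,TimeZoneSidKey,ProfileId
jsmith,jsmith@example.com,jsmith@example.com,John,Smith,en_US,en_US,UTF-8,America/New_York,00exx0000001AAA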

Q. How to handle commas within a field while uploading using Data Loader?

If the CSV file for import contains commas within a field's content, enclose that content within double quotation marks (" "). Data Loader will then parse the field correctly.
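For example:

Name,BillingCity,Description
"Acme, Inc.",Austin,"Strategic account, renewed in 2022"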

Q. What to do if Hard Delete is not working in Apex Data Loader?

Ans: Check your profile permissions: the “Bulk API Hard Delete” checkbox should be enabled to permanently delete records using the Apex Data Loader. Using the Hard Delete operation, we can permanently delete existing records from our Salesforce organization; deleted records will not be available in the Recycle Bin.

Q. How to import multi-select picklist values through Data Loader?

When inserting or updating multi-select picklist fields through the Apex Data Loader, separate multiple values for the field with a semicolon (";").
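For example, with a hypothetical multi-select picklist field Interests__c (the record ID is made up):

Id,Interests__c
001xx0000001AAA,Email;Phone;Web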

Q. How to Import Data in Different Languages Into Salesforce?

Follow the simple steps below to import data in other languages using the Apex Data Loader.

Open the .xls file that displays the other language characters correctly and which was prepared for import.

Click File –> Save As. In “Save as type”, select Unicode Text (*.txt).

Click “Save” button.

Open the .txt file in Notepad.

In Notepad, click File –> Save As. In “File name”, change the file’s extension from .txt to .csv. In “Save as type”, select “All Files”. In “Encoding”, select “UTF-8”. Click the “Save” button.

Import the .csv file.

Q. How to start loading data from row 101 of the CSV file in Data Loader?

Ans: Open Data Loader, go to Settings from the menu, scroll down, and set the “Start at Row” field so that processing starts from row 101 of the CSV file.


Q. How Can We Bypass Workflow Rules When Using The Data Loader?

There is no option to skip workflow rules, validation rules, and triggers from Data Loader. We can deactivate the workflow rules while loading data, but this is not good practice: while the workflow is deactivated in the app, other users may be creating or editing records that depend on it, and their work should not be interrupted.

Q. What is an External Id in data loader in salesforce?

Suppose we have an account table in Salesforce and an account table outside of Salesforce (e.g. a .csv file or a SQL database). In Salesforce, every record is identified by its record ID, but outside of Salesforce we cannot recognize records by Salesforce ID. To compare the outside table with the Salesforce table, we therefore enable External ID on one of the fields (External ID can be enabled on text, number, auto number, and email fields). With an External ID, we can compare that column against the corresponding column in the external table: if the values match, the record is updated; otherwise it is inserted.

An external ID field is simply a field which is a unique identifier for your record, other than the Salesforce Id, usually coming from an external system. For example, if you are importing accounts, you may have a field called ‘ERP Account Number’ which is the identifier for your Accounts inside your ERP system.

Q. Assign Permission Set to multiple Users Through Data Loader?

To assign a Permission Set to multiple users using the Apex Data Loader, follow these steps:

Create a .csv file with the User ID and the Permission Set ID to be assigned.

Log in to the Apex Data Loader.

Click the “Insert” button.

Select “PermissionSetAssignment” as the object and browse to the .csv file created in step 1.
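The file needs only two columns; a sample (IDs are made up):

AssigneeId,PermissionSetId
005xx0000001AAA,0PSxx0000001AAA
005xx0000001AAB,0PSxx0000001AAA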

Q. How To Use Upsert Operation With Apex Data Loader?

The upsert operation uses the sObject record’s primary key (the Salesforce record ID) or the external ID, if specified, to determine whether a new record should be created or an existing record updated.

Q. What is the field mapping file format of the Data Loader?

.SDL (the “Salesforce Data Loader” field-mapping file)

Q. How to insert null values using Data Loader?

In the Data Loader settings, enable the “Insert null values” checkbox; otherwise we can’t insert null values.

Q. Which file format do Data Loader and the Import Wizard support?

.CSV (Comma Separated Values)

Q. Which operations can we perform with Data Loader?

Insert (inserts brand-new records; no ID needed)

Update (updates existing records based on the record ID)

Upsert (requires an external ID field on the object; based on the external ID field, if the value already exists the record is updated, and if it doesn’t exist the record is inserted)

Delete (deletes records based on the IDs provided; to delete we need only the ID, and deleted records go to the Recycle Bin)

Hard delete (deletes records based on the IDs provided; deleted records can’t be recovered from the Recycle Bin. Note: Hard delete is available only when Bulk API is enabled in the Data Loader settings.)

Export (exports records from any object based on a SOQL query. Note: Export does not return deleted records that are in the Recycle Bin.)

Export All (exports records from any object based on a SOQL query. Note: Export All also returns deleted records that are in the Recycle Bin.)
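For example, an export is driven by a SOQL query such as:

SELECT Id, Name, CreatedDate
FROM Account
WHERE CreatedDate = THIS_YEAR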

Q. Can users mass transfer records to which they do not have read access?
False.

Q. What is DateTime format supported in .csv file for upload in salesforce?
YYYY-MM-DDThh:mm:ss.00Z
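For example, a CSV row with a datetime value (the record ID and custom field name are made up):

Id,Renewal_Date_Time__c
001xx0000001AAA,2022-12-17T09:30:00.00Z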

Q. Is there any option to specify the time zone for uploading records in the Apex Data Loader?
Yes, it is present in the Settings page of the Apex Data Loader.
