Total 202 Questions
Last Updated On : 16-Jul-2025
Preparing with the Salesforce-B2C-Commerce-Cloud-Developer practice test is essential to ensure success on the exam. This practice test, aligned with the Salesforce Spring '25 (SP25) release, lets you familiarize yourself with the exam question format and identify your strengths and weaknesses. By practicing thoroughly, you can maximize your chances of passing the certification exam on your first attempt. Surveys from different platforms and user-reported pass rates suggest practice exam users are roughly 30-40% more likely to pass.
Given a job step configured in the steptypes.json, a developer needs to add a custom status code
"NO_FILES_FOUND".
Which code snippet will complete the requirement?
A. var status = {success: 'OK', Message: 'NO_FILES_FOUND'};
return status;
B. var status = {success: 'OK', Message: 'NO_FILES_FOUND'};
return status;
C. var Status = require('dw/system/Status');
return new Status(Status.OK, 'NO_FILES_FOUND');
D. this.status = 'NO_FILES_FOUND';
return this;
E. return 'NO_FILES_FOUND';
Explanation:
In a task-oriented CommonJS job step, you control the exit status using the dw.system.Status class. This allows you to:
Indicate whether the step succeeded or failed
Provide a custom status code (like "NO_FILES_FOUND")
Optionally include a message
✅ Correct Implementation:
var Status = require('dw/system/Status');
return new Status(Status.OK, 'NO_FILES_FOUND');
This tells the job framework:
The step completed successfully (Status.OK)
The custom status code is "NO_FILES_FOUND" — which can be referenced in the steptypes.json file
❌ Why the other options are incorrect
A & B (object literal with success and Message): These are not valid return types for job steps; the framework expects a dw.system.Status object.
D. this.status = 'NO_FILES_FOUND': Assigning to this is not how job steps report status.
E. return 'NO_FILES_FOUND': Returning a bare string does not provide the structured Status object the job framework requires.
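For the job framework to accept the custom code, the step type definition must also declare it. Below is a minimal, illustrative steptypes.json fragment; the type ID, module path, and descriptions are assumptions, not from the original question:

```json
{
    "step-types": {
        "script-module-step": [
            {
                "@type-id": "custom.ProcessFiles",
                "module": "app_custom/cartridge/scripts/steps/processFiles.js",
                "function": "execute",
                "transactional": "false",
                "status-codes": {
                    "status": [
                        { "@code": "OK", "description": "Files processed successfully." },
                        { "@code": "NO_FILES_FOUND", "description": "No input files were found." },
                        { "@code": "ERROR", "description": "An error occurred while processing files." }
                    ]
                }
            }
        ]
    }
}
```

A job flow transition can then branch on the NO_FILES_FOUND exit status returned by the script.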
Once the Cache Information tool of the storefront toolkit is enabled, how can a Digital Developer view caching information for a particular component of the page?
A. Hover over the caching icons now present on the storefront.
B. Open the Request Logs to view the caching information.
C. Start a pipeline debugging session and view the caching information provided.
D. Right-click on the component in UX Studio and view the caching properties of the file.
Explanation:
When you enable the Cache Information tool in the Storefront Toolkit, the site displays small caching icons (usually in red, yellow, or green) over various page components to indicate their caching status.
To view detailed caching information for a specific component:
Simply hover over these icons.
A tooltip will appear showing details like:
Cache status (cached / not cached)
Cache time-to-live (TTL)
Cache ID
❌ Why the other options are incorrect:
B. Open the Request Logs
The request log provides useful diagnostics but does not show per-component cache info.
C. Start a pipeline debugging session
Pipelines are legacy and not used in SFRA. Even if debugging, it doesn't show storefront cache overlay info.
D. Right-click in UX Studio
UX Studio lets you edit metadata and templates, but does not display runtime cache data from the storefront.
When inspecting the weekly service status report for a critical internally hosted web service used in the
application, a developer notices that there are too many instances of unavailability.
Which two solutions can reduce the unavailability of the service?
Choose 2 answers.
A. Update the service to have a faster response time.
B. Modify the code that makes the request to the external service to be wrapped in a try/catch block.
C. Increase the web service timeout
D. Change the code that sets the throwOnError attribute of the service to be true.
Explanation:
To reduce unavailability of a critical internally hosted web service, the developer should:
B. Wrap Requests in Try/Catch Blocks
Why? Prevents unhandled exceptions from crashing the application when the service fails.
Example:
var HTTPClient = require('dw/net/HTTPClient');
var Logger = require('dw/system/Logger');
var httpClient = new HTTPClient();
try {
    httpClient.open('GET', serviceURL);
    httpClient.send();
} catch (e) {
    Logger.error('Service unavailable: ' + e.message);
    // Fallback logic (e.g., cached data)
}
C. Increase the Web Service Timeout
Why? Prevents premature failures due to temporary latency spikes.
How? Adjust the timeout in the service configuration in Business Manager (Administration > Operations > Services) or, when calling an endpoint directly with dw.net.HTTPClient, in code:
httpClient.setTimeout(5000); // e.g., raise from 2 seconds to 5 seconds
Why Not the Other Options?
A. Faster response time: This requires optimization on the service side, which the storefront developer does not control.
D. throwOnError = true: This makes service errors throw exceptions instead of returning an error status, which does nothing to reduce unavailability.
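The defensive pattern in option B can also be sketched outside the platform. The following is a generic, runnable Node.js illustration (not the dw API; fetchLiveRates and cachedRates are invented names) of catching a failed call and serving last-known-good data instead of propagating the error:

```javascript
// Generic try/catch-with-fallback sketch (plain Node.js, not the dw API).
// A failing call is logged and replaced by cached data.
var cachedRates = { source: 'cache', eurUsd: 1.1 };

function fetchLiveRates() {
    // Simulate the unavailable service
    throw new Error('Service unavailable');
}

function getRates() {
    try {
        return fetchLiveRates();
    } catch (e) {
        // In B2C code this would be Logger.error(...); here we just note it
        console.error('Falling back to cache: ' + e.message);
        return cachedRates;
    }
}

console.log(getRates().source); // prints "cache"
```

The shopper-facing page keeps rendering with slightly stale data rather than failing outright.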
Given the requirements:
• To integrate with an external web service using HTTP requests
• To create a service for this purpose with the Service framework using the LocalServiceRegistry
class.
• To test the service before the external service provider makes the API available
Which solution allows the developer to satisfy the requirements?
A. Create a service and implement the mockFull callback and a site preference to enable or disable the mock response.
B. Create a service, implement the mockFull callback, and set the service mode to Mocked.
C. Create a service and a site preference that induces the service to respond with a mock response using a conditional.
D. Create two services, one mock and one real, and a site preference that enables either the mock or the real one.
Explanation:
To test a web service integration before the external API is available, Salesforce B2C Commerce provides a built-in mocking mechanism via the Service Framework. The recommended approach is:
✅ Step-by-step:
Create the service using LocalServiceRegistry.createService(...)
Implement the mockFull callback in the service definition
This callback simulates the entire service lifecycle: request creation, execution, and response parsing
Set the service mode to mock in Business Manager:
Go to: Administration > Operations > Services
Select your service and set Service Mode to Mocked
✅ Example:
var LocalServiceRegistry = require('dw/svc/LocalServiceRegistry');

var service = LocalServiceRegistry.createService('my.http.service', {
    createRequest: function (svc, args) {
        // Used in live mode to build the request
    },
    parseResponse: function (svc, response) {
        // Used in live mode to process the response
        return response;
    },
    mockFull: function (svc, args) {
        // Replaces the entire service execution when the service mode is Mocked;
        // the returned object becomes the service call's result object
        return {
            responseCode: 200,
            message: 'Mocked response'
        };
    }
});
❌ Why the other options are incorrect
A. mockFull callback and site preference toggle: mockFull is the right callback, but using a site preference to toggle mocking is not required and adds unnecessary complexity; the framework already provides a per-service mode.
C. Site preference conditional logic Again, not needed. The Service Framework already supports mocking via configuration.
D. Two separate services (mock and real) This is inefficient and harder to maintain. The Service Framework is designed to handle mocking within a single service definition.
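To see why a single service definition suffices, here is a generic, runnable Node.js sketch (not the actual dw.svc implementation; createService, liveExecute, and the mode flag are all illustrative assumptions) of how a framework can short-circuit to mockFull when the configured mode is Mocked:

```javascript
// Generic dispatch sketch: in "mocked" mode, createRequest/parseResponse
// are skipped entirely and mockFull supplies the result.
function createService(callbacks, mode) {
    return {
        call: function (args) {
            if (mode === 'mocked') {
                return callbacks.mockFull(this, args); // full mock: no live call
            }
            var request = callbacks.createRequest(this, args);
            var raw = liveExecute(request); // would perform the real HTTP call
            return callbacks.parseResponse(this, raw);
        }
    };
}

function liveExecute(request) {
    // Stands in for the real transport, which is not available yet
    throw new Error('External API not available');
}

var service = createService({
    createRequest: function (svc, args) { return args; },
    parseResponse: function (svc, raw) { return raw; },
    mockFull: function (svc, args) {
        return { responseCode: 200, message: 'Mocked response' };
    }
}, 'mocked');

console.log(service.call({}).message); // prints "Mocked response"
```

Flipping the mode back to live exercises the real callbacks with no code changes, which is exactly why a second "mock service" is unnecessary.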
Which Technical Reports data point measures the performance of a controller's script execution if network factors and Web Adapter processing are ignored?
A. Processing time
B. Cache hit ratio
C. Call count
D. Response time
Explanation:
In Salesforce B2C Commerce, the Technical Reports dashboard provides several metrics to evaluate controller and pipeline performance. Among these, Processing Time specifically measures:
The execution time of the controller’s script logic
Excludes network latency and Web Adapter processing
Helps isolate performance issues in server-side code
This makes it the most accurate datapoint for understanding how efficiently your controller scripts are running.
Why the other options are incorrect
B. Cache hit ratio: measures how often content is served from the cache.
C. Call count: the number of times a controller or pipeline is invoked.
D. Response time: the total time, including network, Web Adapter, and server-side processing.
Given a file in a plug-in cartridge with the following code:
'use strict';
var base = module.superModule;
function applyCustomCache(req, res, next) {
    res.cachePeriod = 6; // eslint-disable-line no-param-reassign
    res.cachePeriodUnit = 'hours'; // eslint-disable-line no-param-reassign
    next();
}
module.exports = base;
module.exports.applyCustomCache = applyCustomCache;
What does this code extend?
A. A controller
B. A middleware script
C. A decorator
D. A model
Explanation:
This code snippet uses module.superModule and module.exports to extend functionality, which is a common pattern in Salesforce B2C Commerce controller extension.
🔍 Breakdown of the code:
'use strict';
var base = module.superModule;
function applyCustomCache(req, res, next) {
res.cachePeriod = 6; // eslint-disable-line no-param-reassign
res.cachePeriodUnit = 'hours'; // eslint-disable-line no-param-reassign
next();
}
module.exports = base;
module.exports.applyCustomCache = applyCustomCache;
module.superModule – used to inherit the original implementation (typical in controllers).
applyCustomCache – a middleware function, likely intended to be inserted into a controller route.
module.exports = base; followed by module.exports.applyCustomCache = applyCustomCache; – shows that it's adding functionality to the base (extending it).
✅ Why it's a controller:
Controllers are typically extended using module.superModule to override or add new route logic.
The applyCustomCache function is structured like middleware that would be used in SFRA-style controllers.
❌ Why the others are incorrect:
B. Middleware script – Middleware is used within controllers, but this file exports and extends a controller, not just middleware.
C. Decorator – Decorators modify model objects, and use a different extension pattern.
D. Model – Models use module.superModule too, but this file deals with req, res, and next, which are specific to controllers and middleware.
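The (req, res, next) signature discussed above can be demonstrated with a generic, runnable Node.js sketch; runChain is an invented stand-in for SFRA's route handling, not the real server module:

```javascript
// Generic middleware-chain sketch: applyCustomCache stamps cache settings
// on the response, then hands control to the next step via next().
function applyCustomCache(req, res, next) {
    res.cachePeriod = 6;
    res.cachePeriodUnit = 'hours';
    next();
}

function runChain(steps, req, res) {
    var i = 0;
    function next() {
        if (i < steps.length) {
            steps[i++](req, res, next);
        }
    }
    next();
}

var res = {};
runChain([
    applyCustomCache,
    function render(req, res, next) {
        res.rendered = true; // the route's main handler runs after the middleware
        next();
    }
], {}, res);

console.log(res.cachePeriod + ' ' + res.cachePeriodUnit); // prints "6 hours"
```

Because each step calls next(), the cache middleware composes cleanly with whatever handler the base controller already registered.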
Universal Containers specifies a new category hierarchy for navigating the digital commerce storefront. A Digital Developer uses Business Manager to manually create a catalog with the specified category hierarchy, then uses the Products & Catalogs > Import & Export module to export the catalog as a file.
How can other Developers with sandboxes on the same realm create the same catalog in their own sandboxes?
A. Use Business Manager to upload and import a copy of the export file obtained from the original Developer.
B. Use the remote upload capability of the Site Import & Export module of Business Manager.
C. Use the import capability of the Site Import & Export module of Business Manager.
D. Use the Business Manager Data Replication module to replicate the catalog from the original Developer’s sandbox.
Explanation:
To replicate a manually created catalog and category hierarchy across sandboxes in the same realm, the most efficient and scalable method is to:
Export the catalog from the original sandbox using:
Merchant Tools > Products and Catalogs > Import & Export
Import the catalog into other sandboxes using:
Administration > Site Development > Site Import & Export > Import
This ensures that:
The entire catalog structure, including categories and product assignments, is preserved
The import is schema-compliant and follows the two-pass import process (objects first, then relationships)
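For orientation, an exported catalog file is XML of roughly the following shape; the catalog ID, category IDs, and display names below are illustrative assumptions, not the client's actual data:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<catalog xmlns="http://www.demandware.com/xml/impex/catalog/2006-10-31" catalog-id="storefront-catalog">
    <!-- Root of the category hierarchy -->
    <category category-id="root">
        <display-name xml:lang="x-default">Storefront Root</display-name>
    </category>
    <!-- Child category attached to the root via its parent element -->
    <category category-id="mens">
        <display-name xml:lang="x-default">Mens</display-name>
        <parent>root</parent>
    </category>
</catalog>
```

Importing this file into another sandbox recreates the same category hierarchy.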
❌ Why the other options are incorrect
A. Use Business Manager to upload and import a copy of the export file
This refers to the Catalog Import & Export module, which is fine for product-level imports but not ideal for full site-level replication.
B. Use remote upload capability of Site Import & Export
This is valid only if you're using WebDAV or FTP, but it doesn't complete the import itself.
D. Use Data Replication module
Data replication is used between staging and production, not across developer sandboxes.
A developer wants to add a link to the My Account Page.
What is the correct code to accomplish this?
A. href="${URLUtils.get('Account-Show')}>${Resource.msg('myaccount','account',request.locale())}
B. ="${url.get('Account-Show')}">${Resource.message('myaccount')
C. ="${URLUtils.url('Account-Show')}">${Resource.msg('myaccount','account',null)}
D. ('Account-Show')}>${ResourceMgr.getPropierties('myaccount','account',null)}
Explanation:
To add a link in an ISML template (used in Salesforce B2C Commerce / SFRA), you typically use:
URLUtils.url('Controller-Action') to generate URLs.
Resource.msg('key', 'bundle', null) to fetch localized message strings from properties files.
✅ Why Option C is Correct:
<a href="${URLUtils.url('Account-Show')}">${Resource.msg('myaccount','account',null)}</a>
URLUtils.url('Account-Show') → Correct way to generate the URL to the My Account page.
Resource.msg('myaccount','account',null) → Looks up the localized text for the link label from the account.properties file using key myaccount.
❌ Why the Other Options Are Incorrect:
A. href="${URLUtils.get('Account-Show')}>
❌ URLUtils.get() is not a valid method. The correct method is url().
B. url.get() and Resource.message()
❌ url.get() and Resource.message() are both invalid function names in this context.
D. ResourceMgr.getPropierties()
❌ Not valid syntax and ResourceMgr is not typically used in ISML for rendering localized text.
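For context, Resource.msg('myaccount', 'account', null) resolves the key myaccount in the account resource bundle. A minimal sketch of such a bundle (the label text is an assumption):

```properties
# cartridge/templates/resources/account.properties
myaccount=My Account
```

Locale-specific variants such as account_fr_FR.properties override the default bundle for matching request locales.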
Universal Containers sells physical gift cards for the holidays.
What needs to occur to guarantee the cards will always be available?
A. Create an inventory record with an unlimited Allocation value
B. Create an inventory record with an extremely high Allocation value (i.e., 1 billion certificates).
C. Create a perpetual inventory record
D. Create an inventory record with Backorder Handling enabled
Explanation:
In Salesforce B2C Commerce, a perpetual inventory record ensures that a product is always considered in stock, regardless of actual allocation or stock levels. This is ideal for products like gift cards, which:
Are not physically limited in quantity
Can be generated or fulfilled digitally or on demand
Should never show as out of stock to shoppers
✅ Key Attribute:
In Business Manager, set the Perpetual flag to true on the product’s inventory record.
This tells the system:
“This product is always available for purchase.”
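In an inventory feed, the perpetual flag appears as an element on the inventory record. Below is a minimal, illustrative import fragment; the list ID and product ID are assumptions:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<inventory xmlns="http://www.demandware.com/xml/impex/inventory/2007-05-31">
    <inventory-list>
        <header list-id="site-inventory">
            <default-instock-flag>false</default-instock-flag>
        </header>
        <records>
            <!-- perpetual=true keeps the gift card orderable regardless of allocation -->
            <record product-id="gift-card-25">
                <allocation>0</allocation>
                <perpetual>true</perpetual>
            </record>
        </records>
    </inventory-list>
</inventory>
```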
❌ Why the other options are incorrect
A. Unlimited Allocation value: Allocation is a numeric field; even a high value can eventually be depleted, so it is not a guarantee of perpetual availability.
B. Extremely high Allocation (e.g., 1 billion): This is a workaround, not a best practice; it can lead to performance issues and is not semantically correct.
D. Backorder Handling enabled: Backorders allow purchases when stock is depleted, but they imply future fulfillment, which is not ideal for gift cards that should be instantly available.
A client that sells to multiple countries in Europe needs to disable Apple Pay for Denmark.
Which Business Manager module is used to achieve this requirement?
A. Locale Payments
B. Payment Methods
C. Payment Processors
D. Apple Pay
Explanation:
To enable or disable specific payment methods (like Apple Pay) per country or locale, the correct place to do that in Salesforce B2C Commerce Business Manager is the Payment Methods module.
✅ Why "Payment Methods" is correct:
The Payment Methods module allows you to:
Activate or deactivate payment methods (like Apple Pay).
Restrict payment methods by country, currency, or site.
Customize availability based on locale, customer group, or other criteria.
So, to disable Apple Pay for Denmark, you'd go to:
Merchant Tools > Ordering > Payment Methods
Then configure the availability conditions so that Apple Pay is not available for customers from Denmark.
❌ Why the other options are incorrect:
A. Locale Payments
🔴 Not an actual Business Manager module.
C. Payment Processors
🔴 This is where you configure how a payment is processed (e.g., integration settings), not where it's enabled/disabled by country.
D. Apple Pay
🔴 There's no standalone "Apple Pay" module in Business Manager; Apple Pay is managed under Payment Methods.