A developer receives a request for tracking data for all sends associated with a specific JobID. The developer needs to see Sends, Opens, Clicks, and Bounces. Which two activities could the developer use?
(Choose 2 answers)
A. Tracking Extract Activity
B. Server-Side JavaScript Activity
C. Campaign Data Extract
D. SQL Query Activity
Summary:
To retrieve individual send, open, click, and bounce tracking data tied to a specific JobID in Marketing Cloud, the data must come from the underlying tracking data views (_Sent, _Open, _Click, _Bounce). The only two Automation Studio activities that can extract or query this granular tracking data at the JobID level are the Tracking Extract and the SQL Query Activity (targeting a data extension).
Correct Option:
A. Tracking Extract Activity
The Tracking Extract can export Job-level tracking details (Sends, Opens, Clicks, Bounces, Unsubscribes, etc.) for one or more specific JobIDs. When configured with "Job" scope and the desired JobID(s), it generates a _Job.csv file plus the corresponding _Sent.csv, _Open.csv, _Click.csv, and _Bounce.csv files containing subscriber-level records.
D. SQL Query Activity
SQL Query Activities can directly query the tracking data views (_Sent, _Open, _Click, _Bounce) using a WHERE JobID = XXXXX condition. This is the most common and flexible method developers use to pull send, open, click, and bounce data for a specific JobID into a data extension for reporting or further processing.
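As an illustration, a SQL Query Activity along these lines would pull the four event types for one job into the activity's target data extension (a sketch only; 12345 stands in for the actual JobID, and the joins are simplified by filtering opens and clicks to unique events):

SELECT s.JobID,
       s.SubscriberKey,
       s.EventDate AS SentDate,
       o.EventDate AS OpenDate,
       c.EventDate AS ClickDate,
       b.EventDate AS BounceDate
FROM _Sent s
LEFT JOIN _Open o
       ON o.JobID = s.JobID AND o.SubscriberKey = s.SubscriberKey AND o.IsUnique = 1
LEFT JOIN _Click c
       ON c.JobID = s.JobID AND c.SubscriberKey = s.SubscriberKey AND c.IsUnique = 1
LEFT JOIN _Bounce b
       ON b.JobID = s.JobID AND b.SubscriberKey = s.SubscriberKey
WHERE s.JobID = 12345

The results are written to whichever data extension is selected as the query activity's target.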
Incorrect Option:
B. Server-Side JavaScript Activity
Server-Side JavaScript cannot directly access the tracking data views or perform a tracking extract. It can only call limited WSProxy or Core functions and is not designed for bulk tracking exports.
C. Campaign Data Extract
Campaign Data Extract is part of the legacy Automation Studio "Data Extract" activity and is limited to campaign-level summaries. It does not provide individual tracking events tied to a specific JobID.
Reference:
Salesforce Official Documentation: Tracking Extract
Northern Trail Outfitters wants to trigger follow-up messages after a subscriber opens an email. What process would they use to get real-time engagement data?
A. Query Activity
B. Client-Side JavaScript
C. WSproxy Service
D. Event Notification Service
Summary:
This question focuses on capturing real-time engagement events (like an email open) to trigger an immediate follow-up action. Traditional methods like Query Activities run on a scheduled basis and introduce significant latency. The solution requires a mechanism that can instantly notify an external system or service the moment an event occurs, enabling a near-instantaneous response.
Correct Option:
D. Event Notification Service:
This is the correct process. The Event Notification Service (ENS) is a Marketing Cloud feature that pushes real-time notifications for specific events (Sends, Opens, Clicks, Bounces, etc.) to a pre-configured HTTP endpoint (webhook). This allows an external application to receive the open event data immediately and trigger a follow-up message without any delay.
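As a rough sketch of the setup, ENS is configured through the REST API by registering a callback URL and then subscribing it to the desired event categories (the subdomain, names, and exact payload shapes below are illustrative placeholders and should be checked against the Event Notification Service API reference):

POST https://YOUR_SUBDOMAIN.rest.marketingcloudapis.com/platform/v1/ens-callbacks
Authorization: Bearer YOUR_ACCESS_TOKEN
Content-Type: application/json

[{ "callbackName": "OpenEventsWebhook", "url": "https://example.com/mc/open-events" }]

Then, after the callback URL has been verified, it is subscribed to the email open category:

POST https://YOUR_SUBDOMAIN.rest.marketingcloudapis.com/platform/v1/ens-subscriptions
Authorization: Bearer YOUR_ACCESS_TOKEN
Content-Type: application/json

[{ "subscriptionName": "EmailOpens",
   "callbackId": "CALLBACK_ID_RETURNED_EARLIER",
   "eventCategoryTypes": ["EngagementEvents.EmailOpen"],
   "filters": [] }]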
Incorrect Option:
A. Query Activity:
A Query Activity runs on a predefined schedule (e.g., hourly, daily) within Automation Studio. It queries Data Views, which are updated periodically, not in real-time. This method introduces a latency of minutes to hours, making it unsuitable for triggering immediate follow-ups based on an open.
B. Client-Side JavaScript:
While JavaScript can be used to track opens and fire additional actions, it is client-side and unreliable (can be blocked by email clients). More importantly, it is not a Marketing Cloud process for getting data into the platform or a connected system to trigger a new send; it executes in the subscriber's inbox.
C. WSproxy Service:
WSProxy is a Server-Side JavaScript object used to make SOAP API calls from CloudPages, Script Activities, and other SSJS contexts. It is a tool for programmatically interacting with Marketing Cloud data, but it is not a service designed for receiving or pushing real-time engagement event streams.
Reference:
Salesforce Official Documentation: Event Notification Service
A developer receives Error Code 5 when performing a SOAP API call. The error states: "Cannot Perform 'Post' on objects of type 'SentEvent'". What could be the issue?
A. SOAP does not support POST; use REST
B. The authentication token has expired.
C. It may be a temporary network issue.
D. 'SentEvent' is not able to be updated using SOAP.
Summary:
This error indicates an attempt to perform an unsupported operation on a specific API object. The SentEvent object represents a record in the Sent Data View, which is a system-generated, read-only log of email send events. Error Code 5 typically signifies an invalid operation, such as trying to create, update, or delete a record that is not allowed by the system's data model. The key is recognizing which objects are mutable and which are immutable.
Correct Option:
D. 'SentEvent' is not able to be updated using SOAP.
This is the correct issue. The SentEvent object, like most tracking event objects (e.g., OpenEvent, ClickEvent), is strictly read-only within the SOAP API. It is a historical log entry, and the system does not allow creating (Post), updating, or deleting these records via any API. The only permitted operation is Retrieve to pull this data out for reporting.
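For illustration, a Retrieve of SentEvent records for a single send might look roughly like the envelope below (a sketch only; the token and ID value are placeholders, and SendID is the SOAP property that corresponds to the JobID):

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <soapenv:Header>
    <fueloauth xmlns="http://exacttarget.com">YOUR_ACCESS_TOKEN</fueloauth>
  </soapenv:Header>
  <soapenv:Body>
    <RetrieveRequestMsg xmlns="http://exacttarget.com/wsdl/partnerAPI">
      <RetrieveRequest>
        <ObjectType>SentEvent</ObjectType>
        <Properties>SendID</Properties>
        <Properties>SubscriberKey</Properties>
        <Properties>EventDate</Properties>
        <Filter xsi:type="SimpleFilterPart">
          <Property>SendID</Property>
          <SimpleOperator>equals</SimpleOperator>
          <Value>12345</Value>
        </Filter>
      </RetrieveRequest>
    </RetrieveRequestMsg>
  </soapenv:Body>
</soapenv:Envelope>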
Incorrect Option:
A. SOAP does not support POST; use REST.
This is incorrect. The SOAP API absolutely supports the Create method (which corresponds to the Post operation mentioned in the error). The issue is not the HTTP verb but the specific object type the operation is being applied to. The REST API also would not allow creating a SentEvent record.
B. The authentication token has expired.
An expired or invalid authentication token typically results in a different class of error, such as an "HTTP 401 Unauthorized" or a SOAP fault indicating a security failure, not Error Code 5 related to a specific object operation.
C. It may be a temporary network issue.
Temporary network issues generally cause connection timeouts, request failures, or non-specific server errors (e.g., HTTP 500). They do not produce a specific, descriptive error message about the inability to perform a Post on a particular object type.
Reference:
Salesforce Official Documentation: SentEvent Object
A developer is making an API REST call to trigger an email send. An access token is used to authenticate the call. How long are Marketing Cloud v1 access tokens valid?
A. Access tokens expire after 24 hours.
B. REST calls do not require an access token.
C. Each API call requires a new access token.
D. Access tokens expire after one hour.
Summary:
When using the older v1 Marketing Cloud Authentication endpoint (/v1/requestToken) for server-to-server (installed package) integrations, the generated access token is valid for a default period of one hour (3600 seconds). This token is reusable for multiple API calls within that hour, allowing the application to maintain a session and avoid requesting a new token for every interaction. Developers must store the token and the expiration time (expiresIn) and be prepared to request a new token when the current one expires to maintain continuous access.
Correct Option:
D. Access tokens expire after one hour.
The Marketing Cloud REST Authentication service (used for v1 tokens via the requestToken endpoint) returns an expiresIn value of 3600 seconds, which is one hour.
This is the standard lifetime for the access token, and the application must refresh the token before or immediately after this period to continue making successful API calls.
It is a recommended best practice to refresh the token slightly before the one-hour mark to prevent call failures due to expiration.
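A minimal sketch of the legacy v1 token request and response (the credentials are placeholders taken from an installed package):

POST https://auth.exacttargetapis.com/v1/requestToken
Content-Type: application/json

{ "clientId": "YOUR_CLIENT_ID", "clientSecret": "YOUR_CLIENT_SECRET" }

Example response:

{ "accessToken": "a1b2c3d4...", "expiresIn": 3600 }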
Incorrect Options:
A. Access tokens expire after 24 hours.
This is incorrect. A 24-hour expiration is significantly longer than the standard one-hour (3600 seconds) lifetime for the v1 access tokens, and is not the default configuration. Using an access token past its one-hour validity will result in an "Unauthorized" (401) HTTP error response.
B. REST calls do not require an access token.
This is incorrect. All Marketing Cloud REST API calls require an authenticated access token in the Authorization header (using the Bearer scheme). The token acts as the session credential, granting access permissions defined by the installed package.
C. Each API call requires a new access token.
This is incorrect. The access token is explicitly designed to be reusable for the duration of its one-hour validity. Requesting a new token for every API call is inefficient, adds latency, and can lead to API throttling issues, which is strongly discouraged by Salesforce documentation.
Reference:
Salesforce Developers Documentation - Get an Access Token
A developer is using the REST Authorization Service to obtain an OAuth access token. Which method should be used to include the access token in the API requests?
A. Include the header x-access-token: your_access_token
B. Include as a query parameter access_token=YOUR_ACCESS_TOKEN
C. Include the header Authorization: Basic your_access_token
D. Include the header Authorization: Bearer YOUR_ACCESS_TOKEN
Summary:
This question tests the knowledge of the standard and correct method for presenting an OAuth 2.0 access token in the HTTP header of a REST API request. After successfully authenticating and receiving an access token from the Authorization Service, this token must be included in all subsequent API calls to prove the request is authorized. The OAuth 2.0 Bearer Token specification defines the exact format for this.
Correct Option:
D. Include the header Authorization: Bearer YOUR_ACCESS_TOKEN:
This is the correct and standardized method. The Authorization header is used for this purpose, and the Bearer keyword specifies the type of authentication being used, which is an OAuth 2.0 bearer token. The API endpoint will validate this token to authorize the request.
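For example, a subsequent REST call carries the token like this (the subdomain and token are placeholders; any REST resource follows the same pattern):

GET https://YOUR_SUBDOMAIN.rest.marketingcloudapis.com/asset/v1/content/assets
Authorization: Bearer YOUR_ACCESS_TOKEN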
Incorrect Option:
A. Include the header x-access-token: your_access_token:
This is a non-standard and incorrect method for the Marketing Cloud REST API. While some custom APIs might use a similar custom header, the official specification for OAuth 2.0 requires the use of the standard Authorization: Bearer header.
B. Include as a query parameter access_token=YOUR_ACCESS_TOKEN:
This method is discouraged for security reasons. Placing the token in the URL as a query parameter can expose it in server logs, browser history, and referrer headers, making it more vulnerable to being intercepted and stolen.
C. Include the header Authorization: Basic your_access_token:
This is incorrect. The Authorization: Basic header is used for Basic Authentication, where the credentials are a base64-encoded string of a username and password. It is not used for presenting an OAuth 2.0 access token. Using a bearer token with the Basic scheme will result in an authentication failure.
Reference:
Salesforce Official Documentation: Use an Access Token
A developer created a landing page in CloudPages which returns unique content when subscriber data is located in a related data extension. The developer does not know if all subscribers have rows in the related data extension, and wants default content to render if no subscriber data is found in the related data extension. Which best practice should the developer follow to control the unique and default content?
A. Use the RowCount function and an IF statement
B. Use the Lookup, Row and Field functions
C. Use the LookupOrderedRows and Row functions
D. Use the DataExtensionRowCount function
Summary:
To display personalized content when a subscriber has a row in a related data extension and fall back to default content when no row exists, the best practice is to first attempt to retrieve the row(s) and then check how many rows were returned. The most reliable and commonly used functions for this pattern in CloudPages are RowCount() combined with LookupRows() (or LookupOrderedRows()) inside an IF statement.
Correct Option:
A. Use the RowCount function and an IF statement
This is the recommended best practice. The developer performs a LookupRows() or LookupOrderedRows() call, stores the result in a variable (e.g., @rows), then uses RowCount(@rows) inside an IF statement. If RowCount(@rows) > 0, personalized content is shown using Row() and Field(); otherwise, default content is rendered. This pattern explicitly handles the "no row found" scenario safely.
C. Use the LookupOrderedRows and Row functions
Also correct and widely used. LookupOrderedRows() is preferred over LookupRows() when a specific sort order is needed or to guarantee consistent results. Combined with RowCount() and Row()/Field(), it achieves the same safe personalized/default content logic and is considered a best-practice variation of option A.
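A minimal AMPscript sketch of the pattern both of these options describe (the data extension name, its columns, and the skey query parameter are hypothetical; on a CloudPage the subscriber key is typically passed in, for example via CloudPagesURL, rather than read from the email-send context):

%%[
VAR @subKey, @rows, @rowCount, @row, @firstName

/* Hypothetical: subscriber key arrives as a query string parameter */
SET @subKey = RequestParameter("skey")

/* Look up related rows for this subscriber */
SET @rows = LookupRows("LoyaltyDetails", "SubscriberKey", @subKey)
SET @rowCount = RowCount(@rows)

IF @rowCount > 0 THEN
  SET @row = Row(@rows, 1)
  SET @firstName = Field(@row, "FirstName")
]%%
  <p>Welcome back, %%=v(@firstName)=%%! Here is your personalized offer.</p>
%%[ ELSE ]%%
  <p>Welcome! Here are this week's offers for all subscribers.</p>
%%[ ENDIF ]%%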
Incorrect Option:
B. Use the Lookup, Row and Field functions
Lookup() returns only a single value, not a rowset, so RowCount() cannot be used with it. It also fails silently with empty strings when no row is found, making it harder to reliably detect "no data" versus a legitimate empty value.
D. Use the DataExtensionRowCount function
DataExtensionRowCount() only returns the total row count of an entire data extension and cannot be filtered by SubscriberKey or EmailAddress. It is useless for checking if the current subscriber has a matching row.
Reference:
Salesforce Help - RowCount Function
Certification Aid wants to import an encrypted CSV file from the Marketing Cloud Enhanced FTP server. Which two File Transfer activities are needed to achieve this?
(Choose 2.)
A. To decrypt the import file on the Enhanced FTP server.
B. To move the import file from the Safehouse to Marketing Cloud.
C. To decrypt the import file on the Safehouse.
D. To decrypt the import file on the Safehouse.
E. To move the import file from the Enhanced FTP server to the Safehouse
Summary:
This question involves the process for handling encrypted files within Marketing Cloud's secure architecture. The Enhanced FTP server is an external-facing location, while the Safehouse is a highly secure, internal, and transient storage area designed for decrypting files. The process requires two distinct steps: first, moving the file into the secure decryption environment, and second, performing the decryption itself within that environment before the file can be used for import.
Correct Option:
B. To move the import file from the Safehouse to Marketing Cloud.
This is the second step. After the file is decrypted within the Safehouse, a second File Transfer activity is needed to move the now-decrypted file from the Safehouse into a standard, accessible Marketing Cloud file location (like the Import folder) so it can be used by a subsequent Import Activity.
E. To move the import file from the Enhanced FTP server to the Safehouse.
This is the first and crucial step. The encrypted file must be transferred from the external Enhanced FTP server into the secure, isolated Safehouse environment. This is a prerequisite for the decryption process to occur.
Incorrect Option:
A. To decrypt the import file on the Enhanced FTP server.
This is incorrect and not possible. The Enhanced FTP server does not have decryption capabilities. Decryption is a function that can only be performed within the secure confines of the Safehouse.
C. To decrypt the import file on the Safehouse.
This is a duplicate of option D and is conceptually the core action. However, "decrypt" is not a type of File Transfer activity. Decryption is an automatic process that happens when a file is transferred out of the Safehouse to a Marketing Cloud location (addressed in option B). You do not select a "decrypt" activity.
D. To decrypt the import file on the Safehouse.
This is a duplicate of option C and is incorrect for the same reason. Decryption is not a selectable File Transfer activity type. The act of transferring the file from the Safehouse to Marketing Cloud (option B) triggers the decryption.
Reference:
Salesforce Official Documentation: Import Encrypted Files
Certification Aid sends an email to a newly imported List with Subscribers who have no associated Subscriber Key. Which value will become the Contact Key?
A. ContactID
B. Email address
C. Subscriber ID
D. Unique random number
Summary:
When an email is sent to a List in Marketing Cloud Engagement (which is a legacy sending method), and the Subscribers being imported have no explicit Subscriber Key defined, Marketing Cloud must assign a unique identifier. In this specific scenario, the system defaults to using the Email Address as the definitive unique identifier for the subscriber. This value will then be promoted to become the Contact Key (also known as the Subscriber Key) within the Marketing Cloud Contact model.
Correct Option:
B. Email address
When importing into a List without specifying a Subscriber Key, the Email Address is automatically used as the unique identifier.
Marketing Cloud's core contact model requires every subscriber to have a unique Contact Key. In the absence of a developer-specified key, the system uses the next best unique piece of data, which is the email address.
For all future interactions (opens, clicks, bounces), the system will identify this subscriber based on this email address, treating it as the Contact Key.
Incorrect Options:
A. ContactID
ContactID is an internal, system-generated, and non-modifiable numeric identifier used within the Marketing Cloud database for internal tracking. It is not the same as the user-facing and external system-integrating Contact Key (Subscriber Key) and is not used as a default identifier upon import.
C. Subscriber ID
Subscriber ID is another internal, system-generated numeric identifier, similar to ContactID, that is specific to the Email Studio environment. Like the ContactID, it is not the value that Marketing Cloud defaults to using as the unique, business-level Contact Key when one is missing upon initial list import.
D. Unique random number
While Marketing Cloud could generate a random number, it prioritizes using a known, unique, and business-relevant value for the Contact Key to facilitate data synchronization with external systems like Sales/Service Cloud (e.g., using the Salesforce Contact ID). A random number is only used as a last resort in some specific cases, but for a list import with a valid email, the email address is the preferred default key.
Reference:
Salesforce Help Documentation - Marketing Cloud Contacts
A developer wants to configure performance tracking of the content dynamically created via AMPscript in an email. Which two steps should be performed to achieve this objective?
(Choose 2)
A. Request the Impression Tracking feature be enabled on the account
B. Include the functions BeginImpressionRegion and EndImpressionRegion
C. Configure dynamic content block in Content Builder
D. Add a unique identifier in the HTML tags within the generated content
Summary:
This question focuses on tracking impressions for content that is not a standard, pre-built dynamic content block but is instead generated on-the-fly using AMPscript logic (e.g., via IF statements or building HTML strings). Standard dynamic content tracking does not apply here. The solution requires a specific feature and corresponding AMPscript functions to mark the dynamic region for the system to monitor.
Correct Option:
A. Request the Impression Tracking feature be enabled on the account:
This is the first necessary step. Impression Region tracking is not a standard, enabled feature for all accounts. A Salesforce support ticket must be submitted to have this capability activated before the AMPscript functions will have any effect.
B. Include the functions BeginImpressionRegion and EndImpressionRegion:
This is the core technical step. Once the feature is enabled, you wrap the dynamically generated AMPscript content between the BeginImpressionRegion and EndImpressionRegion functions, assigning a unique region ID. This tells the system to track when this specific block of content is displayed (rendered) in an email.
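A brief sketch of how the two functions wrap AMPscript-generated content (the region name and the @tier variable are hypothetical):

%%=BeginImpressionRegion("dynamic_offer")=%%
%%[ IF @tier == "Gold" THEN ]%%
  <p>Gold members: enjoy 20% off this week.</p>
%%[ ELSE ]%%
  <p>Members: enjoy 10% off this week.</p>
%%[ ENDIF ]%%
%%=EndImpressionRegion()=%%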
Incorrect Option:
C. Configure dynamic content block in Content Builder:
This is the standard method for tracking pre-defined content variations within a Content Builder dynamic block. It does not apply to content generated purely through inline AMPscript logic, which is what this scenario describes.
D. Add a unique identifier in the HTML tags within the generated content:
While this is a good practice for general development and CSS targeting, a simple HTML id or class is not recognized by the Marketing Cloud tracking system. It will not enable impression tracking for the content. The system specifically requires the Impression Region AMPscript functions.
Reference:
Salesforce Official Documentation: Impression Region Tracking
A developer wants to create a complex dynamic email with three different sections and four different possible content blocks in each section. The email will be sent to an audience of over one million contacts. Which best practice should the developer use to ensure a blank email will not be sent?
A. Send a test of every possible version using Test Send
B. Review every possible version using Subscriber Preview
C. Create separate emails for each version
D. Confirm every version has default content
Summary:
With three sections and four possible content blocks per section, there are 4³ = 64 possible email combinations. When sending to over one million contacts, the only scalable and reliable way to guarantee no subscriber receives a completely blank email is to ensure every dynamic section has a default/fallback content block that renders when no personalized content is matched. Relying solely on manual testing or preview cannot cover all edge cases at this scale.
Correct Option:
D. Confirm every version has default content
This is the only true best practice for large-scale dynamic emails. Every dynamic section must include a fallback (default) content block using techniques like ContentBlockByKey/ID with a default key, ClaimRow, or IF/ELSE logic with a final ELSE that outputs generic content. This guarantees that even if all personalization rules fail, the section will never be blank, preventing an empty email.
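As one hedged illustration of a fallback for a single section (the data extension, column, attribute, and content block key names are hypothetical):

%%[
VAR @offerHtml
/* Try to find segment-specific content for this subscriber */
SET @offerHtml = Lookup("SectionOffers", "OfferHTML", "Segment", AttributeValue("Segment"))
IF NOT EMPTY(@offerHtml) THEN
]%%
  %%=TreatAsContent(@offerHtml)=%%
%%[ ELSE ]%%
  %%=ContentBlockByKey("default-offer-block")=%%
%%[ ENDIF ]%%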
Incorrect Option:
A. Send a test of every possible version using Test Send
Impossible at scale. Manually testing 64 versions via Test Send is impractical and still misses real-world data mismatches.
B. Review every possible version using Subscriber Preview
Subscriber Preview can check many combinations but cannot realistically verify all 64 versions, especially when data varies across one million subscribers. It also does not catch runtime failures.
C. Create separate emails for each version
Creating 64 separate emails defeats the purpose of dynamic content, dramatically increases maintenance overhead, and is not a viable solution.
Reference:
Salesforce Help - Dynamic Content Best Practices