Salesforce-Marketing-Cloud-Engagement-Consultant Practice Test Questions

Total 293 Questions


Last Updated On: 11-Dec-2025




When using Journey Builder interactions to send post-purchase communications to customers, which contact entry mode fits?



A. Re-entry anytime


B. Re-entry only after exit


C. No re-entry





C.
  No re-entry

Explanation:

For post-purchase communications in Journey Builder—such as order confirmations, shipping updates, delivery notifications, or review requests—the appropriate contact entry mode is No re-entry. This setting ensures that a contact enters the journey only once per purchase event. Since each purchase represents a distinct transaction, the customer should receive the post-purchase sequence exactly one time for that specific order. Using No re-entry prevents the system from injecting the same contact multiple times for the same purchase, which eliminates the risk of sending duplicate emails and provides a clean, professional customer experience.

Why the Other Options Are Not Correct

A. Re-entry anytime is incorrect because it allows a contact to enter the journey again even while they are still active in it from a previous entry. In a post-purchase scenario, if a customer happens to meet the entry criteria again (for example, due to data refresh or another minor event), they would immediately start receiving the same sequence a second time for the original order—resulting in duplicate order confirmations or shipping updates, which is a major customer experience issue.

B. Re-entry only after exiting is also not suitable for transactional post-purchase journeys. Although it prevents overlap while the contact is active, it still permits the contact to re-enter the same journey as soon as they complete or exit it. If a customer makes another purchase shortly after the first one finishes, they could receive two nearly identical sequences running close together, or worse, the system might confuse messaging for different orders. Salesforce strongly recommends avoiding this mode for any truly transactional flow where one sequence per event is required.

References:
Salesforce Help: Journey Builder Entry Modes
Trailhead: Build Transactional Journeys in Marketing Cloud Engagement

The customer has the following requirements for storing engagement data in their data warehouse:

All email open and click activity must be pulled daily from MC
Output files must meet the specific requirements for the data warehouse
All the activity must be provided via FTP in one file.

Which automation workflow meets the customer requirements?



A. Report activity that generates recent send summary report → Report delivered directly to FTP


B. Query activity to pull data view information → Extract activity of data extension → Transfer activity


C. Extract activity of tracking extracts that combines data into required file → Transfer activity


D. Extract activity of data view tables → Query activity to create the required file → Transfer activity





C.
  Extract activity of tracking extracts that combines data into required file → Transfer activity

Explanation:

Let's break down the requirements and see which option fulfills them all:

"All email open and click activity": This data is stored in system Tracking Extracts (Open, Click, Sent, etc.), not in standard Data Extensions or Reports.

"must be pulled daily": Requires a scheduled Automation.

"Output files must meet the specific requirements for the data warehouse": The Extract activity allows you to combine multiple data sources and define the exact file format (CSV, TAB, pipe-delimited), column order, and headers.

"All the activity must be provided via FTP in one file": The Extract activity can combine multiple data sources (e.g., Open and Click extracts) into a single output file. The subsequent Transfer activity is the component specifically designed to send files via FTP/SFTP.

Option C is a direct, two-step automation workflow that meets every requirement precisely.

Why the other options are incorrect:

A. Report activity that generates recent send summary report → Report delivered directly to FTP: This fails multiple requirements. Reports are aggregated summaries, not the raw, record-level "activity" data. They cannot be customized to the warehouse's specific file format. The "Recent Send Summary" does not provide the granular open and click detail needed.

B. Query activity to pull data view information → Extract activity of data extension → Transfer activity: This chain is workable but indirect. A Query activity can read the _Open and _Click Data Views directly into a staging Data Extension, but combining opens and clicks into the warehouse's required single file then takes extra SQL, plus an Extract of that Data Extension, before the Transfer. Option C achieves the same outcome natively in two steps, because a Tracking Extract can combine multiple activity sources into one formatted file.

D. Extract activity of data view tables → Query activity to create the required file → Transfer activity: This is incorrect because you cannot use an Extract activity on Data Views directly. Data Views are virtual tables and must be queried into a Data Extension first. This option has the steps in the wrong order. The correct flow to use a Query would be: SQL Query Activity (to pull from Data Views into a DE) → Extract Activity (on that DE) → Transfer Activity.
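As a sketch of that corrected query-based flow, a Query Activity can read the system Data Views directly and stage the combined activity in a Data Extension. The target DE name and the daily window below are illustrative assumptions; `_Open` and `_Click` are the real system Data Views.

```sql
/* SFMC Query Activity (SQL Server syntax): combine yesterday's open and
   click activity from the system Data Views into one staging DE.
   Target DE name "Daily_Engagement_Staging" is illustrative. */
SELECT o.SubscriberKey,
       o.EventDate,
       'Open' AS EventType,
       o.JobID
FROM _Open o
WHERE o.EventDate >= DATEADD(DAY, -1, GETUTCDATE())

UNION ALL

SELECT c.SubscriberKey,
       c.EventDate,
       'Click' AS EventType,
       c.JobID
FROM _Click c
WHERE c.EventDate >= DATEADD(DAY, -1, GETUTCDATE())
```

An Extract activity on the staging DE and a Transfer activity would then complete the flow, though the Tracking Extract route in option C remains the shorter, two-step path.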

Reference:
Salesforce Help Article: "Create an Extract Activity" – This documentation details how the Extract Activity is the primary tool for exporting data, including the ability to "Combine multiple data sources into one file" and define the exact file format.

Salesforce Help Article: "Tracking Extracts" – This confirms that raw open, click, sent, etc., data is exported via system-generated Tracking Extracts.

Northern Trail Outfitters is expanding globally into 16 new countries and wants to start localizing their email content to speak to subscribers in their own language. They want to do this as efficiently as possible and are anticipating growth into other locales in the near future. Which two options could be recommended? (Choose 2 answers)



A. Leverage Content Builder to create email templates for each language and populate the templates via the UI.


B. Leverage enhanced dynamic content blocks within Content Builder to create language-specific emails.


C. Leverage personalization strings within the email template to pull in language-specific content.


D. Leverage AMPscript within an email template to lookup subscriber language and personalize the email based on the value.





B.
  Leverage enhanced dynamic content blocks within Content Builder to create language-specific emails.

D.
  Leverage AMPscript within an email template to lookup subscriber language and personalize the email based on the value.

Explanation

These two options offer the best balance of efficiency, scalability, and maintainability for a growing number of languages:

B. Leverage enhanced dynamic content blocks within Content Builder
Efficiency & Scalability: This method is highly recommended for localization. You can create one master email and use Dynamic Content blocks to swap out large sections of text, images, or entire modules based on the subscriber's language preference (stored in a Data Extension).

Maintainability: Content Builder's dynamic content interface is user-friendly, allowing marketers to manage the localized content for 16+ languages in a single, organized view within one email. This is much easier than managing 16 separate email copies.

D. Leverage AMPscript within an email template to lookup subscriber language and personalize the email based on the value.
Efficiency & Scalability: AMPscript is the most powerful and scalable solution for highly complex or highly repetitive localization. You can use the LookupRows() or Lookup() functions to pull all necessary content (Subject Line, Body Text, Footers, etc.) from a single Localization Data Extension using the subscriber's Language ID as the key.

Maintainability: By storing all localized phrases in one central Data Extension, translation updates and additions for new locales become a simple data import process, rather than requiring the editing of multiple email templates. This is ideal for supporting growth into many future locales.
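As a minimal sketch of that lookup pattern — the Data Extension name "Localization_Content" and the fields "LanguageID" and "Greeting" are illustrative assumptions, while AttributeValue, Empty, Lookup, and v are standard AMPscript functions:

```
%%[
/* Hypothetical localization lookup; DE and field names are illustrative.
   Reads the subscriber's language from the sendable data and falls
   back to English when no value is present. */
VAR @lang, @greeting
SET @lang = AttributeValue("Language")
IF Empty(@lang) THEN
  SET @lang = "EN"
ENDIF
SET @greeting = Lookup("Localization_Content", "Greeting", "LanguageID", @lang)
]%%
%%=v(@greeting)=%%
```

Adding a new locale then means adding rows to the localization DE, with no change to the email template itself.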

Why the Incorrect Answers are Wrong

A. Leverage Content Builder to create email templates for each language and populate the templates via the UI.
Inefficient & Not Scalable: Creating and maintaining 16 separate email templates is time-consuming and prone to errors. If a design or branding change is needed, it would have to be implemented 16 times, which is the opposite of an efficient process and will not scale well.

C. Leverage personalization strings within the email template to pull in language-specific content.
Not Suitable for Full Localization: Personalization strings (%%fieldName%%) are designed to pull in single, short pieces of data like a name or a city. They are not suitable for pulling in entire blocks of localized paragraphs, headings, or footer content, which is required for full email localization. This would require an extremely large and complex Sendable Data Extension.

Reference
Both Dynamic Content and AMPscript are the officially recommended methods within Salesforce Marketing Cloud for advanced content personalization and localization.

Salesforce Help: Dynamic Content is described as a feature in Content Builder for displaying personalized content based on subscriber attributes.

Salesforce Help/Developer Documentation: AMPscript functions (specifically Lookup functions) are the standard method for retrieving external, personalized data like localization strings from a central Data Extension during send time.

A customer needs to import data from an SFTP site. The customer wants to:

Segment the contents of the file and then send emails.
Transfer the file to the SFTP site at various times daily.
Send to data extensions.

What sequence of automation activities should meet these requirements?



A. Scheduled: Import File & SQL Query(s) & Send Email(s)


B. Scheduled: Transfer File & Import File & SQL Query(s) & Send Email(s)


C. File Drop: Import File & SQL Query(s) & Send Email(s)


D. File Drop: Import File & Group Refresh & Send Email(s)





C.
  File Drop: Import File & SQL Query(s) & Send Email(s)

Explanation:

Here’s why:

Trigger type: File Drop (not Scheduled)
The requirement says the customer will transfer the file to the SFTP site at various times daily.
Because times vary, you don’t want a fixed Scheduled automation (A or B).
A File Drop automation listens for a file to appear in a specific SFTP folder and then runs automatically when the file arrives — perfect for “various times daily.”

Activity sequence
For what they want to do:

Import File
Pulls the data from the SFTP (Enhanced FTP) into a Data Extension.

SQL Query Activity (segment the contents)
Lets you segment and move data into one or more target Data Extensions (e.g., filtered audiences, different campaigns, etc.).

Send Email Activity
Sends to the resulting Data Extensions, as required.

That matches exactly:
“Segment the contents of the file and then send emails” and “Send to data extensions.”
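The segmentation step in that sequence might look like the following Query Activity SQL; the source DE "Imported_Customers" and the filter criteria are illustrative assumptions:

```sql
/* Hypothetical Query Activity: segment the freshly imported file into a
   sendable audience DE. Names and criteria are illustrative. */
SELECT i.SubscriberKey,
       i.EmailAddress,
       i.FirstName,
       i.Region
FROM Imported_Customers i
WHERE i.Region = 'EMEA'
  AND i.OptIn = 'True'
```

Each such query targets its own Data Extension, and each Send Email activity then points at the corresponding target DE.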

Why the others are wrong

A. Scheduled: Import File & SQL Query(s) & Send Email(s)
Uses a Scheduled start, which doesn’t align well with “various times daily” unless you overschedule and risk missing or duplicating runs.

B. Scheduled: Transfer File & Import File & SQL Query(s) & Send Email(s)
Same scheduling issue as A.
Also assumes you need a Transfer File activity, which is typically used to move/decrypt/unzip files, not necessary if the external system is already placing the final file on the SFTP location MC is watching.

D. File Drop: Import File & Group Refresh & Send Email(s)
Uses Group Refresh, which applies to lists / groups, not data extensions.
The requirement explicitly says: “Send to data extensions.” So this doesn’t fit.

So the best and most accurate sequence is:
C. File Drop: Import File & SQL Query(s) & Send Email(s) ✅

Northern Trail Outfitters receives a nightly encrypted unsub file to their Marketing Cloud SFTP from a third-party email platform. These files are used to unsubscribe existing subscribers. They do not use Email Address as Subscriber Key. What Automation Studio Activity sequence should be used to ensure the appropriate subscribers are unsubscribed from the All Subscriber List?



A. Import File & Data Extract & File Transfer & Import File


B. File Transfer & Import File & Query & Data Extract & File Transfer & Import File


C. Import File & Query & Data Extract & File Transfer & Import File


D. File Transfer & Import File & Data Extract & File Transfer & Import File





D.
  File Transfer & Import File & Data Extract & File Transfer & Import File

Explanation:

When the nightly unsubscribe file arrives encrypted on Marketing Cloud’s Enhanced FTP and the account does not use Email Address as Subscriber Key, the only supported way to process global unsubscribes is the official encrypted-unsubscribe-file workflow.

The exact sequence is:

File Transfer – decrypts the file and places the plain-text version in the Safehouse
Import File – imports the decrypted file into a dedicated Unsubscribe Data Extension
Data Extract – creates an _Unsubscribe extract file (the special extract type required for All Subscribers processing)
File Transfer – moves the generated _Unsubscribe extract from the Safehouse back to the Import directory
Import File – imports that extract into All Subscribers, which automatically sets the Unsubscribed status at the Subscriber Key level

This is the documented, supported process when Subscriber Key ≠ Email Address and the file is encrypted.

Why the Other Options Are Not Correct

A is missing the initial File Transfer to decrypt the file and the second File Transfer to move the extract – the process would fail at decryption or at the final import step.
B is overcomplicated and wrong – adding a Query activity is unnecessary and breaks the official flow.
C starts with Import File on an encrypted file (which Marketing Cloud will reject) and again includes an unnecessary Query.

References
Salesforce Help: Process Encrypted Unsubscribe Files from Third-Party Systems 
Knowledge Article: “How to unsubscribe subscribers when Subscriber Key is not Email Address and file is encrypted” (explicitly lists the 5-step sequence in option D)
Automation Studio: Data Extract Activity → Extract Type = Unsubscribe

Therefore, the only sequence that correctly processes an encrypted third-party unsubscribe file when Subscriber Key ≠ Email Address is D.

What are two possible outcomes when “Multipart MIME” is selected during the send process? Choose 2 answers



A. An auto-generated text version will be sent with your HTML email.


B. A custom text version will be sent with your HTML email.


C. The email will avoid detection by various SPAM filters.


D. Open and click activity is tracked in either version.





A.
  An auto-generated text version will be sent with your HTML email.

D.
  Open and click activity is tracked in either version.

Explanation:

Multipart MIME is a technical email format that packages both an HTML version and a Text version of an email into a single message. The recipient's email client decides which version to display.

A. An auto-generated text version will be sent with your HTML email: This is the primary, default behavior when you select "Multipart MIME" without providing a custom text version. The system will automatically generate a plain-text version by stripping the HTML tags from your HTML content. This ensures a text version is always present.

D. Open and click activity is tracked in either version: This is a key point. Regardless of whether the recipient's email client displays the HTML or the Text part, the tracking is unified. Opens are tracked via a 1x1 pixel image (present in the HTML part). Clicks are tracked via rewritten links (which are present in both the HTML and the auto-generated Text parts). The tracking data is attributed to the single send job.

Why the other options are incorrect:

B. A custom text version will be sent with your HTML email: This is not a direct outcome of selecting "Multipart MIME." A custom text version is sent only if you manually create and associate a Text Block in Content Builder or provide a text content area. Selecting "Multipart MIME" alone does not create custom text; it creates an auto-generated one. This option describes a best practice action, not a system outcome of the setting.

C. The email will avoid detection by various SPAM filters: This is false and misleading. Including a text version is a positive factor for sender reputation and spam filtering, as it signals you are following good formatting practices. However, it does not "avoid detection." Spam filters are complex and consider hundreds of factors (content, sender reputation, authentication, etc.). Multipart MIME is a baseline formatting standard, not a spam filter bypass.

Reference:
Salesforce Help Article: "Formatting Options for Email Sends" – Explicitly states: "Multipart MIME: Sends both the HTML and text parts of the email... If you don't specify a text part, Marketing Cloud generates one for you automatically." It also confirms that tracking applies.

Email Deliverability Best Practices: General industry knowledge confirms that providing a text alternative is a positive inbox placement factor, but never a guarantee. The exam tests on platform-specific, factual outcomes, not general deliverability claims.

A retail company needs to create journeys that will target subscribers based on website behavior. They have identified 3 separate groups:

Customers who searched for an item on their website.
Customers who abandoned a cart on their website.
Customers who made a purchase on their website.

What should the consultant ask in order to design the data structure for this solution? Choose 3 answers



A. Should customers exit the journey when the goal is met?


B. How are subscribers identified in your web analytics?


C. How many messages should be included in each journey?


D. How long after the behavior occurs will a subscriber need to enter a journey?


E. Should a single customer exist in multiple journeys at the same time?





B.
  How are subscribers identified in your web analytics?

D.
  How long after the behavior occurs will a subscriber need to enter a journey?

E.
  Should a single customer exist in multiple journeys at the same time?

Explanation:

✅ B. How are subscribers identified in your web analytics?
To connect website behavior (search, cart, purchase) with Marketing Cloud subscribers, you need a common key:

Is it Subscriber Key, customer ID, email, cookie ID, or something else?
This determines what fields you must store in the web event data extensions.
Without this, you can’t reliably join web behavior data to contact data.

This is a core data-structure question.

✅ D. How long after the behavior occurs will a subscriber need to enter a journey?
This impacts:

Retention period for the web-behavior data extensions (e.g., keep abandoned cart records for 7, 14, 30 days?).
How long the event data must remain available for entry filters or journey entry events.
Whether you need rolling windows (e.g., “people who searched in last 24 hours”).

So this directly affects how you design data extensions, retention, and indexing.
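A rolling-window segment like the last example above could be expressed in a Query Activity as follows; the DE name "Website_Search_Events" and its fields are illustrative assumptions:

```sql
/* Hypothetical rolling window: subscribers with a search event in the
   last 24 hours. DE and field names are illustrative. */
SELECT s.SubscriberKey,
       s.SearchTerm,
       s.EventDate
FROM Website_Search_Events s
WHERE s.EventDate >= DATEADD(HOUR, -24, GETUTCDATE())
```

The retention window on the event DE must be at least as long as the widest lookback any journey needs.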

✅ E. Should a single customer exist in multiple journeys at the same time?
This matters because:

If a customer can be in multiple behavior-based journeys (search, abandon, purchase), you may need:
Separate behavioral data extensions for each event type, or
A unified event model with flags/event types and journey-specific filters.
If they should not be in multiple journeys, you may need:
Status fields or flags in a DE to indicate current journey membership.
Data structures that support exclusion logic or priority handling.

That’s again a structural/architectural design point.

❌ Why the others are not primarily data-structure questions

A. Should customers exit the journey when the goal is met?
This is about journey configuration / logic, not how data is stored.

C. How many messages should be included in each journey?
This is about content and orchestration, not the data model.

So, for designing the data structure to support behavior-based journeys (search, abandoned cart, purchase), the consultant should ask:

B, D, and E ✅

What are data extension data retention policies?



A. Settings to "soft" delete all data in a Data Extension so there is no data loss.


B. Settings to control when a data extension creates a back-up of the data it contains.


C. Settings to define when a data extension or the data within the data extension is deleted.


D. Settings to prevent users from deleting a Data Extension created by another user





C.
  Settings to define when a data extension or the data within the data extension is deleted.

Explanation:

Data Retention Policies in Salesforce Marketing Cloud are a set of rules applied to a Data Extension (DE) to automatically remove old or irrelevant data. Their primary purpose is to:

Manage Database Size: Control the overall size of your Marketing Cloud database, which improves performance for queries and sends.
Maintain Data Relevance: Ensure the data you use for segmentation and personalization is current and useful.
Comply with Regulations: Help adhere to data privacy regulations (like GDPR or CCPA) by automatically deleting personal data after a specified time limit.

When setting a policy, you can choose to delete:

Individual Records: Based on a field's date value (e.g., delete a record 90 days after the last purchase date).
All Records: Delete all rows in the DE after a set number of days.
The Entire Data Extension: Delete both the records and the DE structure itself after a set number of days.

Why the Incorrect Answers are Wrong

A. Settings to "soft" delete all data in a Data Extension so there is no data loss.
Incorrect. The purpose of retention is permanent deletion (or "hard" deletion) to manage storage and compliance. While the data might be temporarily moved to a staging area for a short time, the policy's goal is data removal, not prevention of loss.

B. Settings to control when a data extension creates a back-up of the data it contains.
Incorrect. Marketing Cloud's standard Backup and Restore service handles backups of the entire environment. Retention policies are for deletion, not backup creation.

D. Settings to prevent users from deleting a Data Extension created by another user.
Incorrect. User permissions and roles control who can delete a DE, not the data retention policy. The policy is focused purely on automated, time-based deletion of the data or the structure.

Reference:
Salesforce Marketing Cloud documentation confirms the role of data retention policies in managing the age and volume of data:

Data Retention Policy: Define how long a data extension or the data contained within the data extension is kept before being automatically deleted. Setting up a data retention policy can help keep your data current and reduce unnecessary storage.

How are Publication Lists used?



A. To allow subscribers to opt-down/out instead of unsubscribing from all


B. To build dynamic content rules by subscriber type


C. To manage subscribers in guided and triggered email sends


D. To send communication to all subscribers, regardless of opt-in status





A.
  To allow subscribers to opt-down/out instead of unsubscribing from all

Explanation:

Why A is Correct: Preference Management

Granular Control: Publication Lists are a subscription management tool. They act as a filter applied to a send to ensure only subscribers who have opted into that specific communication category receive the email.
Opt-Down Functionality: By creating a separate Publication List for each type of email (e.g., "Weekly Newsletter," "Product Alerts," "Event Invitations"), you allow subscribers to opt-out (unsubscribe) from just one category without being globally unsubscribed from your entire account (the All Subscribers List).
Deliverability and Compliance: This approach is crucial for good email hygiene, deliverability, and compliance (like CAN-SPAM). It keeps your subscribers happier and maintains a larger active audience.

Why the Incorrect Answers are Wrong

B. To build dynamic content rules by subscriber type:
Incorrect. Dynamic content rules are built using the data fields within a Data Extension (or List Attributes), typically using AMPscript or Dynamic Content blocks, not Publication Lists. Publication Lists are solely for managing subscription status.

C. To manage subscribers in guided and triggered email sends:
Misleading. While Publication Lists are used in these sends, their purpose is not to "manage subscribers" (that is the role of the Data Extension or Contact Model) but specifically to filter out unsubscribed contacts for that communication category. The broader term "manage subscribers" is too generic for the specific opt-out function of the Publication List.

D. To send communication to all subscribers, regardless of opt-in status:
Incorrect. Publication Lists are designed to honor the subscriber's opt-out status. Ignoring a subscriber's opt-in status would violate compliance rules and best practices. The only time a Publication List might bypass a global unsubscribe is for Transactional Sends, but even then, it honors its own category unsubscribe status.

Reference
Salesforce Marketing Cloud documentation defines the role of Publication Lists in subscriber preference management:
Publication Lists: Publication lists help you manage subscribers' unsubscribe or opt-out actions. Having a separate publication list for each communication type enables you to honor an opt-out request from one publication type without unsubscribing that person from all previously subscribed-to publications.

A customer wants to automate a series of three emails as part of a Membership renewal drip campaign.

Email #1 will be sent one month prior to the member's renewal date
Email #2 will be sent one week prior to the member's renewal date
Email #3 will be sent on the member's renewal date
A master audience is updated in real time via the API

Which steps should be included in the customer's automation?



A. Import activity → Three filter activities → Three send definitions to the filtered audiences


B. Three send definitions to the master data extension


C. Import activity → Three send definitions to the master data extension


D. Three filter activities → Three send definitions to the filtered audiences





D.
  Three filter activities → Three send definitions to the filtered audiences

Explanation:

Here’s why:

You have:
A master audience Data Extension, updated in real time via the API

Three emails based on relative time to Renewal Date:

Email 1: 1 month before
Email 2: 1 week before
Email 3: On the renewal date

You want a recurring/scheduled Automation Studio process that, each day:
Identifies who should get which email that day, based on the renewal date.
Sends the correct email to those people.

Why D is correct

Three Filter Activities
Filter 1: members whose renewal date = Today + 30 days → audience for Email #1
Filter 2: members whose renewal date = Today + 7 days → audience for Email #2
Filter 3: members whose renewal date = Today → audience for Email #3

Each filter produces a filtered Data Extension (or filtered audience) containing only the members who should receive that specific email on that run.

Three Send Definitions
Each filtered DE then feeds into its corresponding Send Definition:
Filtered DE #1 → Send Email #1
Filtered DE #2 → Send Email #2
Filtered DE #3 → Send Email #3

The automation can run daily, and the filters will always pull the correct audience based on the dates.
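Filter activities are the right answer here, but for reference, the equivalent daily logic for the Email #1 audience could also be written as Query Activity SQL; the DE and field names below are illustrative assumptions:

```sql
/* Hypothetical daily query: members whose renewal date is exactly
   30 days out. DE and field names are illustrative. */
SELECT m.SubscriberKey,
       m.EmailAddress,
       m.RenewalDate
FROM Master_Members m
WHERE CONVERT(DATE, m.RenewalDate) = CONVERT(DATE, DATEADD(DAY, 30, GETUTCDATE()))
```

The 7-day and same-day audiences follow the same pattern with `DATEADD(DAY, 7, ...)` and `GETUTCDATE()` respectively.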

Why the other options are wrong

A. Import activity → Three filter activities → Three send definitions
The master audience is already updated via API in real time. No need for an Import Activity in the automation.

B. Three send definitions to the master data extension
This would send all three emails to everyone in the master DE, regardless of renewal date — not date-driven targeting.

C. Import activity → Three send definitions to the master data extension
Same problem as B (no segmentation by date), plus unnecessary Import.

So the correct steps for the automation are:
D. Three filter activities followed by three send definitions to the filtered audiences. ✅

