Integration-Architect Practice Test Questions

Total 106 Questions


Last Updated On: 2-Jun-2025



Preparing with the Integration-Architect practice test is essential to success on the exam. This Salesforce SP25 practice test lets you familiarize yourself with the Integration-Architect exam question format and identify your strengths and weaknesses. By practicing thoroughly, you can maximize your chances of passing the Salesforce certification Spring 2025 release exam on your first attempt.

Surveys from different platforms and user-reported pass rates suggest Integration-Architect practice exam users are ~30-40% more likely to pass.

An organization needs to integrate Salesforce with an external system and is considering authentication options. The organization has already implemented SAML, using a third-party Identity Provider for integrations between other systems. Which use case can leverage the existing SAML integration to connect Salesforce with other internal systems?



A.

Make formula fields with HYPERLINK() to external web servers more secure.


B.

Make Apex SOAP outbound integrations to external web services more secure.


C.

Make Apex REST outbound integrations to external web services more secure.


D.

Make an API inbound integration from an external Java client more secure.





D.
  

Make an API inbound integration from an external Java client more secure.



Explanation:

Salesforce supports SAML-based Single Sign-On (SSO) for both UI and API authentication. If a company already uses a SAML-compliant Identity Provider (IdP), this can be extended to secure inbound API connections. For example, a Java client integrating with Salesforce can authenticate users using SAML Bearer Assertion Flow. This allows the external system to obtain an access token for Salesforce using a previously authenticated SAML session, eliminating the need to store or transmit passwords.

Options B and C (Apex outbound calls) are not secured via SAML, as SAML is primarily for inbound user authentication—not for securing outbound REST/SOAP integrations from Salesforce.
Option A (HYPERLINK fields) isn't relevant to authentication at all.

So, option D is correct because it properly applies SAML for authenticating API requests coming into Salesforce, leveraging existing identity infrastructure and enhancing security by removing reliance on stored credentials.
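As a concrete illustration, the SAML Bearer Assertion Flow boils down to a single OAuth token request. The sketch below (Python, with a placeholder assertion) shows how a client such as the Java integration would shape that request against the standard Salesforce token endpoint:

```python
import base64

# Standard Salesforce OAuth 2.0 token endpoint (use test.salesforce.com for sandboxes).
TOKEN_URL = "https://login.salesforce.com/services/oauth2/token"

def build_saml_bearer_request(saml_assertion_xml: str) -> dict:
    """Build the form body for the OAuth 2.0 SAML Bearer Assertion flow.

    Per RFC 7522, the SAML assertion is base64url-encoded and exchanged
    for an access token -- no password is ever stored or transmitted.
    """
    encoded = base64.urlsafe_b64encode(saml_assertion_xml.encode()).decode()
    return {
        "grant_type": "urn:ietf:params:oauth:grant-type:saml2-bearer",
        "assertion": encoded,
    }

# A real client would POST this body (application/x-www-form-urlencoded)
# to TOKEN_URL and read "access_token" from the JSON response.
body = build_saml_bearer_request("<samlp:Response>...</samlp:Response>")
```

The assertion here is a placeholder; in practice it comes from the existing IdP and must be signed and time-bounded.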

Universal Containers (UC) owns a variety of cloud-based applications, including Salesforce, alongside several on-premise applications. The on-premise applications are protected behind a corporate network with limited outside access to external systems. UC would like to expose data from the on-premise applications to Salesforce for a more unified user experience. The data should be accessible from Salesforce in real time. Which two actions should be recommended to fulfill this system requirement?
Choose 2 answers



A.

Develop an application in Heroku that connects to the on-premise database via an ODBC string and VPC connection.


B.

Develop custom APIs on the company's network that are invokable by Salesforce.


C.

Deploy MuleSoft to the on-premise network and design externally facing APIs to expose the data.


D.

Run a batch job with an ETL tool from an on-premise server to move data to Salesforce.





B.
  

Develop custom APIs on the company's network that are invokable by Salesforce.



C.
  

Deploy MuleSoft to the on-premise network and design externally facing APIs to expose the data.



Explanation:

To achieve real-time data access from Salesforce to on-premise systems, the integration must overcome network barriers and ensure secure, low-latency access. Developing custom REST or SOAP APIs (B) on the internal network that Salesforce can invoke (via callouts) is a direct and flexible approach. These APIs should be exposed securely through a DMZ or API gateway.

Alternatively, MuleSoft (C) is an ideal middleware solution for hybrid integrations. When deployed on-premise, MuleSoft can bridge cloud-to-ground communication, managing authentication, transformation, and secure API exposure. It simplifies complex integration flows and allows centralized governance and error handling.

Option A (Heroku + ODBC) is overly complex and introduces unnecessary hops. Option D (batch ETL) does not support real-time use cases and contradicts the requirement for live access.

Therefore, the best strategy includes secure, API-based access with minimal latency, enabled either directly or through a robust integration platform like MuleSoft.
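To make the callout pattern concrete, the sketch below shows the shape of the real-time request Salesforce would issue against an on-premise API exposed through a DMZ gateway or MuleSoft. The gateway host and resource paths are hypothetical:

```python
# Hypothetical externally facing gateway in the DMZ; the real host and
# resource paths depend on the company's ESB/API design.
GATEWAY_BASE = "https://api.example-corp.com"

def build_order_lookup_request(customer_id: str) -> dict:
    """Shape of the real-time callout Salesforce would make to fetch
    on-premise data; authentication is handled at the gateway (e.g., a
    bearer token supplied via a Named Credential)."""
    return {
        "method": "GET",
        "url": f"{GATEWAY_BASE}/v1/customers/{customer_id}/orders",
        "headers": {
            "Authorization": "Bearer <token-from-named-credential>",
            "Accept": "application/json",
        },
    }
```

The gateway terminates TLS, validates the token, and forwards the request to the internal API, so no on-premise server is exposed directly.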

Only authorized users are allowed access to the EBS and the Enterprise DMS. Customers call Customer Support when they need clarification on their bills. Customer Support needs seamless access to customer billing information from the EBS and to view generated bills from the DMS. Which three authorization and authentication needs should an integration consultant consider while integrating the DMS and EBS with Salesforce?
Choose 3 answers



A.

Users should be authorized to view information specific to the customer they are servicing, without needing to search for the customer.


B.

Identify options to maintain DMS and EBS authentication and authorization details in Salesforce.


C.

Consider Enterprise security needs for access to DMS and EBS.


D.

Consider options to migrate DMS and EBS into Salesforce.


E.

Users should be authenticated into DMS and EBS without having to enter a username and password.





A.
  

Users should be authorized to view information specific to the customer they are servicing, without needing to search for the customer.



C.
  

Consider Enterprise security needs for access to DMS and EBS.



E.
  

Users should be authenticated into DMS and EBS without having to enter a username and password.



Explanation:

Integrating Salesforce with systems like Document Management Systems (DMS) and Enterprise Billing Systems (EBS) requires strong identity and access control. First, users should be able to view only data for the customer they are supporting (A)—this is critical for both data privacy and security compliance.

Enterprise-level security (C) must be accounted for, including firewall restrictions, audit logging, and encryption. The architecture should align with internal IT and security policies, especially when exposing sensitive customer billing and document data.

For seamless user experience, users should be authenticated into DMS and EBS without re-entering credentials (E). This is best achieved with SSO or token-based authentication (like SAML or JWT), which improves usability and security by avoiding password proliferation.

Option B (storing credentials in Salesforce) is discouraged due to security risks. Option D (migrating DMS/EBS into Salesforce) is impractical for most enterprises due to cost, complexity, and compliance limitations.
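To illustrate the single sign-on requirement (E), the sketch below shows the kind of short-lived, signed token claims a Salesforce-side identity provider might issue so DMS or EBS can authenticate the agent without a password prompt. All claim values are illustrative:

```python
import time

def build_sso_jwt_claims(user_id: str, audience: str) -> dict:
    """Claims for a short-lived identity token (e.g., a JWT) that DMS or
    EBS would validate instead of prompting for credentials. In practice
    the token is signed with a key both systems trust."""
    now = int(time.time())
    return {
        "iss": "salesforce-org",  # issuing party (illustrative)
        "sub": user_id,           # the support agent's identity
        "aud": audience,          # target system: "dms" or "ebs"
        "iat": now,
        "exp": now + 300,         # short-lived: expires in 5 minutes
    }
```

Short expirations limit the blast radius of a leaked token, which is why this pattern is preferred over storing credentials in Salesforce (option B).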

A healthcare services company maintains a Patient Prescriptions System that has 50+ million records in a secure database. Their customer base and data set are growing rapidly.

They want to make sure that the following policies are enforced:
1. Identifiable patient prescriptions must exist only in their secure system's database and be encrypted at rest.
2. Identifiable patient prescriptions must be made available only to people explicitly authorized in the Patient Prescriptions System: assigned nurses and doctors, the patient, and people the patient explicitly authorizes.
3. Must be available only to verified and pre-approved people or legal entities.

To enable this, the company provides the following capabilities:
1. One-time use identity tokens for patients, nurses, doctors, and other people that expire within a few minutes.
2. Certificates for legal entities.
3. RESTful services.

The company has a Salesforce Community Cloud portal for patients, nurses, doctors, and other authorized people. A limited number of employees analyze de-identified data in Einstein Analytics.

Which two capabilities should the integration architect require for the Community Cloud portal and Einstein Analytics?
Choose 2 answers



A.

Identity token data storage


B.

Bulk load for Einstein Analytics


C.

Callouts to RESTful services


D.

Encryption in transit and at rest





C.
  

Callouts to RESTful services



D.
  

Encryption in transit and at rest



Explanation:

To meet strict data privacy requirements for medical data, integrations must ensure security and compliance. The Community Cloud portal (now called Experience Cloud) must retrieve sensitive prescription data via secure RESTful APIs (C). These APIs are built to support token-based access control (e.g., JWT, one-time-use tokens) and ensure that only authorized users can access specific data.

In addition, encryption in transit and at rest (D) is a regulatory and ethical requirement. TLS/SSL ensures data is secure during transmission, while database-level or platform encryption protects stored data from unauthorized access. Einstein Analytics (now Tableau CRM) must also adhere to these policies if it handles even de-identified data.

Option A (storing identity tokens) is incorrect, as tokens are transient and should not be persisted. Option B (bulk loading into analytics) violates the real-time, secure, and minimal retention principles defined in the scenario.

Northern Trail Outfitters' (NTO) Salesforce org usually runs 8k-10k batches a day to sync data from external sources. NTO's Integration Architect has received requirements for a new custom object, FooBar__c, for which 90M records will need to be loaded into the org. Once complete, 20GB (about 30M records) needs to be extracted to an external auditing system. What should the architect recommend to meet these requirements in a day?



A.

Insert using Bulk API 2.0 and query using REST API.


B.

Insert and query using Bulk API 1.0.


C.

Insert using Bulk API 1.0 and query using REST API.


D.

Insert and query using Bulk API 2.0.





D.
  

Insert and query using Bulk API 2.0.



Explanation:

Salesforce Bulk API 2.0 is optimized for large-scale data operations with simplified job management and better performance over Bulk API 1.0. It automatically handles batching and concurrency behind the scenes. For inserting 90 million records in a day, Bulk API 2.0 is the best-suited tool due to its ability to parallelize jobs and manage throughput automatically.

Moreover, querying 30 million records for extraction (e.g., to the audit system) is also more efficient with Bulk API 2.0, which automatically chunks large result sets and handles them far better than the REST API. The REST API isn't built for high-volume querying; it's better suited to transactional data.

Bulk API 1.0 (options B and C), while still valid, requires manual batch splitting and monitoring and doesn't scale as effectively.
Thus, Bulk API 2.0 is the correct tool for both insertion and extraction at this scale, while simplifying operations and improving reliability.
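The Bulk API 2.0 job lifecycle the recommendation relies on can be sketched as two simple REST requests. The instance URL and object name below are examples, and the API version is any recent one:

```python
API_VERSION = "v59.0"  # any recent version supports Bulk API 2.0

def build_ingest_job(instance_url: str, object_name: str) -> dict:
    """Opens a Bulk API 2.0 ingest job. CSV data is then uploaded with a
    PUT to the job's /batches resource, and the job is closed by PATCHing
    its state to "UploadComplete"; Salesforce batches and parallelizes
    the work itself."""
    return {
        "method": "POST",
        "url": f"{instance_url}/services/data/{API_VERSION}/jobs/ingest",
        "body": {"object": object_name, "operation": "insert", "contentType": "CSV"},
    }

def build_query_job(instance_url: str, soql: str) -> dict:
    """Opens a Bulk API 2.0 query job for large extracts; results are
    fetched as pages of CSV from the job's /results resource."""
    return {
        "method": "POST",
        "url": f"{instance_url}/services/data/{API_VERSION}/jobs/query",
        "body": {"operation": "query", "query": soql},
    }
```

Note how neither call specifies batch sizes or concurrency: that is exactly the operational simplification over Bulk API 1.0.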

A customer is evaluating a Platform Events solution and would like help comparing and contrasting it with Outbound Messages for real-time / near-real-time needs. They expect 3,000 consumers of messages from Salesforce. Which three considerations should be evaluated and highlighted when deciding between the solutions?
Choose 3 answers



A.

Both Platform Events and Outbound Messages offer declarative means for asynchronous near-real-time needs. They aren't best suited for real-time integrations.


B.

In both Platform Events and Outbound Messages, event messages are retried and delivered in sequence, and only once. Salesforce ensures there is no duplicate message delivery.


C.

Message sequence is possible in Outbound Message but not guaranteed with Platform Events. Both offer very high reliability. Fault handling and recovery are fully handled by Salesforce.


D.

The number of concurrent subscribers to Platform Events is capped at 2,000. An Outbound Message configuration can pass only 100 notifications in a single message to a SOAP endpoint.


E.

Both Platform Events and Outbound Message are highly scalable. However, unlike Outbound Message, only Platform Events have Event Delivery and Event Publishing limits to be considered.





A.
  

Both Platform Events and Outbound Messages offer declarative means for asynchronous near-real-time needs. They aren't best suited for real-time integrations.



D.
  

The number of concurrent subscribers to Platform Events is capped at 2,000. An Outbound Message configuration can pass only 100 notifications in a single message to a SOAP endpoint.



E.
  

Both Platform Events and Outbound Message are highly scalable. However, unlike Outbound Message, only Platform Events have Event Delivery and Event Publishing limits to be considered.



Explanation:

When comparing Platform Events and Outbound Messaging, it's important to evaluate scale, reliability, and delivery characteristics. Both are asynchronous, near-real-time mechanisms that can be configured declaratively (e.g., via Workflow Rules or Flow); Platform Events can additionally be published from Apex.

Platform Events (PE) support multiple subscribers (up to 2,000), while Outbound Messages (OM) only target a single endpoint per configuration and support up to 100 messages per call (D). PE is more flexible for event-driven, publish-subscribe patterns across many consumers.

However, PE has strict event publishing and delivery limits (E), which can throttle high-throughput use cases. Outbound Messaging doesn't have such publish limits but lacks the scalability and retry customization offered by PE.

Option B is false — PE doesn’t guarantee message order or uniqueness, and messages may be delivered more than once. Option C is also incorrect — message sequencing is not guaranteed in OM and reliability varies.
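For context, publishing a platform event is just an sObject REST insert on the event's __e object; the event name and fields below are illustrative:

```python
API_VERSION = "v59.0"

def build_event_publish_request(instance_url: str, event_api_name: str, payload: dict) -> dict:
    """Publishing a platform event is a plain REST insert on the __e
    object; all subscribers (CometD / Pub/Sub API clients, flows,
    triggers) then receive the event asynchronously."""
    return {
        "method": "POST",
        "url": f"{instance_url}/services/data/{API_VERSION}/sobjects/{event_api_name}",
        "body": payload,
    }

# Hypothetical event and fields for illustration.
req = build_event_publish_request(
    "https://example.my.salesforce.com",
    "Order_Update__e",
    {"Order_Id__c": "801xx0000000001", "Status__c": "Shipped"},
)
```

Each such publish counts against the org's event publishing allocation, which is exactly the limit consideration raised in option E.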

Northern Trail Outfitters is planning to create a native employee-facing mobile app with the look and feel of Salesforce's Lightning Experience. The mobile app needs to integrate with their Salesforce org. Which Salesforce API should be used to implement this integration?



A.

Streaming API


B.

REST API


C.

Connect REST API


D.

User Interface API





D.
  

User Interface API



Explanation:

Salesforce’s User Interface (UI) API is purpose-built for building custom apps (especially mobile or SPA apps) that require the same metadata-driven behavior as the Lightning Experience. It automatically respects things like record layouts, field-level security, and Lightning page configurations without the developer having to hard-code these aspects.

If you want your mobile app to mimic the Lightning Experience, the UI API is ideal—it’s what Lightning Experience and Salesforce mobile use under the hood. It minimizes development effort while ensuring consistency in business logic and page rendering.

REST API (B) and Connect REST API (C) are great for data access, but they don’t replicate the user interface logic (e.g., layouts, dynamic components). Streaming API (A) is for event-based push notifications, not for rendering or form-building.

So, for an employee-facing app that needs to look and feel like Lightning and respect dynamic layouts and metadata-driven UI behavior, UI API is the best and most efficient choice.
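As an illustration, a single User Interface API call returns both the record data and the layout metadata the app needs to render it. The instance URL and record Id below are placeholders:

```python
API_VERSION = "v59.0"

def build_record_ui_request(instance_url: str, record_id: str) -> str:
    """User Interface API call that returns record data *and* layout
    metadata in one response, already filtered by field-level security --
    which is why it suits a Lightning-look-alike mobile app."""
    return (
        f"{instance_url}/services/data/{API_VERSION}/ui-api/record-ui/{record_id}"
        "?layoutTypes=Full&modes=View"
    )
```

With the plain REST API, the app would instead have to fetch layouts, field metadata, and record data separately and reimplement the merge logic itself.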

Northern Trail Outfitters requires an integration to be set up between one of their Salesforce orgs and an external data source using Salesforce Connect. The external data source supports the Open Data Protocol. Which three configurations should an Integration Architect recommend be implemented in order to secure requests coming from Salesforce?

Choose 3 answers



A.

Configure Identity Type for OData connection.


B.

Configure a Certificate for OData connection.


C.

Configure Special Compatibility for OData connection.


D.

Configure CSRF Protection for OData connection.


E.

Configure CSRF Protection on External Data Source.





A.
  

Configure Identity Type for OData connection.



B.
  

Configure a Certificate for OData connection.



D.
  

Configure CSRF Protection for OData connection.



Explanation:

For OData (Open Data Protocol) security:

Identity Type (A): Configure authentication (e.g., OAuth, Basic Auth).
Certificate (B): Encrypt connections via TLS.
CSRF Protection (D): Prevent cross-site request forgery.

"Special Compatibility" (C) and External Data Source CSRF (E) are red herrings.

A company has an external system that processes and tracks orders. Sales reps manage their leads and opportunity pipeline in Salesforce. In the current state, the two systems are disconnected, and processing orders requires a lot of manual entry on the sales reps' part. This creates delays in processing orders and incomplete data due to manual entry.

As part of modernization efforts, the company decided to integrate Salesforce and the order management system. The following technical requirements were identified:
1. Orders need to be created in real time from Salesforce.
2. Minimal customization and code should be written due to a tight timeline and lack of developer resources.
3. Sales reps need to be able to see order history and the most up-to-date information on current order status.
4. Managers need to be able to run reports in Salesforce to see daily and monthly order volumes and fulfillment timelines.
5. The legacy system is hosted on-premise and is currently connected to the Enterprise Service Bus (ESB). The ESB is flexible enough to provide any methods and connection types needed by the Salesforce team.
6. There are 1,000 sales reps. Each user processes/creates on average 15 orders per shift. Most orders contain 20-30 line items.

How should an integration architect integrate the two systems based on the technical requirements and system constraints?



A.

Use Salesforce external object and OData connector.


B.

Use Salesforce custom object, custom REST API and ETL.


C.

Use Salesforce standard object, REST API and ETL.


D.

Use Salesforce big object, SOAP API and Dataloader.





C.
  

Use Salesforce standard object, REST API and ETL.



Explanation:

This integration scenario requires real-time order creation, minimal customization, and support for a large sales team. Using Salesforce standard objects like Orders, Products, and OrderItems minimizes customization (requirement #2), and aligns with Salesforce data models.

For real-time order creation (#1), Salesforce REST API is well-suited — it's lightweight, stateless, and easy to implement. Since the legacy system is already connected to an ESB, the ESB can expose RESTful endpoints that Salesforce can call. This also supports current order status visibility (#3) and avoids duplicating data.

For reporting (#4), Salesforce needs to ingest daily volumes. An ETL process can extract summarized data from the legacy system and load it into Salesforce standard objects or a custom reporting layer.

Big Objects (D) are not suitable due to limited reporting tools. Custom REST APIs (B) increase dev effort. External objects (A) can’t be reported on easily and may perform poorly for complex data sets.
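Because each order carries 20-30 line items, the real-time insert is best done in a single round trip. The sketch below builds a Composite REST request that creates the Order and its OrderItems together; the field values are illustrative, and a real OrderItem would also need pricebook entry references:

```python
API_VERSION = "v59.0"

def build_order_composite(order: dict, line_items: list) -> dict:
    """Composite REST request that inserts an Order and its OrderItems in
    one call; the @{refOrder.id} reference links each line item to the
    Order created earlier in the same transaction."""
    requests = [{
        "method": "POST",
        "url": f"/services/data/{API_VERSION}/sobjects/Order",
        "referenceId": "refOrder",
        "body": order,
    }]
    for i, item in enumerate(line_items):
        requests.append({
            "method": "POST",
            "url": f"/services/data/{API_VERSION}/sobjects/OrderItem",
            "referenceId": f"refItem{i}",
            "body": dict(item, OrderId="@{refOrder.id}"),  # link to the new Order
        })
    return {"allOrNone": True, "compositeRequest": requests}
```

This is a sketch of the pattern, not a production client: Composite caps the number of subrequests per call, so for the largest orders the sObject Tree or sObject Collections resources may fit better.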

Northern Trail Outfitters is seeking to improve the performance and security of outbound integrations from Salesforce to on-premise servers. What should the Architect consider before recommending a solution?



A.

External gateway products in use


B.

Default gateway restrictions


C.

Considerations for using Deterministic Encryption


D.

Shield Platform Encryption Limitations





A.
  

External gateway products in use



Explanation:

When Salesforce performs outbound callouts to on-premise systems, the key challenge is passing through network boundaries and firewalls securely. In most enterprise environments, this is managed via API gateways or proxy servers deployed in a DMZ. These external gateway products (e.g., Apigee, MuleSoft, Azure API Management) handle tasks like IP whitelisting, SSL offloading, load balancing, authentication, and throttling.

Before designing the integration, the architect must understand the current gateway infrastructure (A) to align with enterprise architecture, reduce friction, and ensure compliance. This may affect endpoint URLs, header formats, and certificate management.

Option B (Default gateway restrictions) is vague and not directly actionable. Option C (Deterministic Encryption) and Option D (Shield Encryption limitations) are more relevant to data storage, not to outbound connection strategy.

So, understanding what external gateway technology is in use is critical for ensuring compatibility, security, and maintainability of the integration.
