
DP-700 Exam Dumps - Implementing Data Engineering Solutions Using Microsoft Fabric

Question # 9

You need to resolve the sales data issue. The solution must minimize the amount of data transferred.

What should you do?

A.

Split the dataflow into two dataflows.

B.

Configure scheduled refresh for the dataflow.

C.

Configure incremental refresh for the dataflow. Set Store rows from the past to 1 Month.

D.

Configure incremental refresh for the dataflow. Set Refresh rows from the past to 1 Year.

E.

Configure incremental refresh for the dataflow. Set Refresh rows from the past to 1 Month.

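For background on the incremental refresh options above, the sketch below shows the equivalent filtering logic in a notebook: each refresh reads only the rows whose date falls inside the refresh window, which is what keeps the amount of transferred data small. This is a conceptual sketch only, not the Dataflow Gen2 settings dialog, and the table and column names are hypothetical.

```python
# Conceptual sketch of what "Refresh rows from the past = 1 Month" means:
# each run reads only the recently changed rows instead of the full table.
# Table and column names (sales_source, sales_destination, OrderDate) are hypothetical.
from datetime import datetime, timedelta, timezone

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Start of the one-month refresh window.
window_start = (datetime.now(timezone.utc) - timedelta(days=30)).strftime("%Y-%m-%d")

recent_sales = (
    spark.read.table("sales_source")               # full source table
    .where(F.col("OrderDate") >= window_start)     # limit the read to the refresh window
)

# A real incremental refresh replaces the affected period in the destination;
# shown here as a simple append for brevity.
recent_sales.write.mode("append").saveAsTable("sales_destination")
```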
Question # 10

You have a Fabric workspace named Workspace1 that contains an Apache Spark job definition named Job1.

You have an Azure SQL database named Source1 that has public internet access disabled.

You need to ensure that Job1 can access the data in Source1.

What should you create?

A.

an on-premises data gateway

B.

a managed private endpoint

C.

an integration runtime

D.

a data management gateway

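For context on the options above, once network connectivity to the Azure SQL database is in place, a Spark job definition typically reads it over JDBC. A minimal sketch, assuming hypothetical server, database, credential, and table names; the read only succeeds after private connectivity to Source1 has been established, whichever option provides it.

```python
# Minimal PySpark sketch of a Spark job reading from Azure SQL over JDBC.
# The server, database, user, and table names below are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("Job1").getOrCreate()

jdbc_url = (
    "jdbc:sqlserver://source1-server.database.windows.net:1433;"
    "database=Source1;encrypt=true;trustServerCertificate=false;"
)

df = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", "dbo.SalesOrders")          # hypothetical table
    .option("user", "job1_reader")                 # hypothetical credential
    .option("password", "<secret retrieved from a key vault>")
    .load()
)

df.show(5)
```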
Question # 11

You need to ensure that usage of the data in the Amazon S3 bucket meets the technical requirements.

What should you do?

A.

Create a workspace identity and enable high concurrency for the notebooks.

B.

Create a shortcut and ensure that caching is disabled for the workspace.

C.

Create a workspace identity and use the identity in a data pipeline.

D.

Create a shortcut and ensure that caching is enabled for the workspace.

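As background for the shortcut options above, data surfaced through an Amazon S3 shortcut is read from a notebook like any other lakehouse path. A minimal sketch, assuming the notebook has a default lakehouse attached; the shortcut name and folder are hypothetical, and authentication to the bucket is handled by the shortcut's connection rather than by notebook code.

```python
# Reading files exposed by an Amazon S3 shortcut in a lakehouse.
# "s3_sales" is a hypothetical shortcut name under the Files section.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Relative lakehouse path: Files/<shortcut name>/<folder>
df = spark.read.parquet("Files/s3_sales/2024/")

df.printSchema()
print(df.count())
```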
Question # 12

You need to ensure that WorkspaceA can be configured for source control. Which two actions should you perform?

Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

A.

Assign WorkspaceA to Cap1.

B.

From the Tenant settings, set Users can synchronize workspace items with their Git repositories to Enabled.

C.

Configure WorkspaceA to use a Premium Per User (PPU) license.

D.

From the Tenant settings, set Users can sync workspace items with GitHub repositories to Enabled.

Question # 13

You need to schedule the population of the medallion layers to meet the technical requirements.

What should you do?

A.

Schedule a data pipeline that calls other data pipelines.

B.

Schedule a notebook.

C.

Schedule an Apache Spark job.

D.

Schedule multiple data pipelines.

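For background on the orchestration options above, one common pattern is a single schedulable entry point that runs the bronze, silver, and gold steps in order. The sketch below shows notebook-based sequencing with mssparkutils, which is built into the Fabric notebook runtime; the child notebook names are hypothetical placeholders, and this illustrates only one of the listed approaches rather than a pipeline calling other pipelines.

```python
# Hypothetical orchestration notebook that populates the medallion layers in sequence.
# The child notebook names below are placeholders for bronze, silver, and gold notebooks.
from notebookutils import mssparkutils

child_notebooks = ["Load_Bronze", "Transform_Silver", "Build_Gold"]

for name in child_notebooks:
    # Runs the child notebook and waits for it to finish (timeout in seconds).
    # An exception from a child stops the loop, so a later layer never runs
    # against an incomplete earlier layer.
    exit_value = mssparkutils.notebook.run(name, 3600)
    print(f"{name} completed with exit value: {exit_value}")
```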
Question # 14

You need to ensure that the data engineers are notified if any step in populating the lakehouses fails. The solution must meet the technical requirements and minimize development effort.

What should you use? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Question # 15

You need to recommend a solution to resolve the MAR1 connectivity issues. The solution must minimize development effort.

What should you recommend?

A.

Add a ForEach activity to the data pipeline.

B.

Configure retries for the Copy data activity.

C.

Configure Fault tolerance for the Copy data activity.

D.

Call a notebook from the data pipeline.

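As plain-Python background for the options above: a retry re-runs the whole copy when a transient error (such as a dropped connection) occurs, while fault tolerance skips individual incompatible rows and keeps going. The sketch below is conceptual only; copy_all_rows and copy_row are hypothetical placeholders, not Fabric APIs or the Copy data activity's actual implementation.

```python
# Conceptual contrast between "retry" and "fault tolerance" semantics.
# copy_all_rows() and copy_row() are hypothetical placeholder callables.
import time


def copy_with_retries(copy_all_rows, max_attempts=3, wait_seconds=30):
    """Retry: re-run the entire copy when a transient failure occurs."""
    for attempt in range(1, max_attempts + 1):
        try:
            return copy_all_rows()
        except ConnectionError:          # transient, e.g. intermittent connectivity
            if attempt == max_attempts:
                raise
            time.sleep(wait_seconds)


def copy_with_fault_tolerance(rows, copy_row):
    """Fault tolerance: skip rows that fail instead of failing the whole run."""
    skipped = []
    for row in rows:
        try:
            copy_row(row)
        except ValueError:               # e.g. an incompatible row
            skipped.append(row)
    return skipped
```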
Question # 16

You need to recommend a solution for handling old files. The solution must meet the technical requirements.

What should you include in the recommendation?

A.

a data pipeline that includes a Copy data activity

B.

a notebook that runs the VACUUM command

C.

a notebook that runs the OPTIMIZE command

D.

a data pipeline that includes a Delete data activity

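For background on the options above, both commands run from a notebook against a Delta table: VACUUM removes old data files that are no longer referenced by the table's transaction log, while OPTIMIZE compacts small files without deleting history. A minimal Spark SQL sketch, assuming a default lakehouse is attached; the table name and retention period are hypothetical.

```python
# Running Delta Lake maintenance commands from a Fabric notebook.
# "sales" and the 720-hour retention below are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# VACUUM deletes data files no longer referenced by the Delta log
# and older than the retention threshold (720 hours = 30 days here).
spark.sql("VACUUM sales RETAIN 720 HOURS")

# OPTIMIZE rewrites small files into larger ones; it does not remove old files.
spark.sql("OPTIMIZE sales")
```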