Plan, implement, and manage a solution for data analytics
10–15%
The Implementing Analytics Solutions Using Microsoft Fabric (DP-600) exam covers planning, implementing, and managing data analytics solutions. This includes planning a data analytics environment, implementing and managing it effectively, and overseeing the analytics development lifecycle to ensure smooth operations and continuous improvement.

Prepare and serve data
40–45%
This topic of the DP-600 exam covers fundamental tasks for data engineers: creating objects in a lakehouse or warehouse, copying data efficiently, transforming data, and optimizing performance so that data is ready for analytical workloads.

Implement and manage semantic models
20–25%
This area of the DP-600 syllabus covers designing and building semantic models, which are essential for structuring data meaningfully. It also emphasizes optimizing large-scale semantic models to improve efficiency and usability within enterprise analytics environments.

Explore and analyze data
20–25%
Here, the focus of the DP-600 exam is on conducting exploratory analytics to derive insights and querying data with SQL to extract specific information, then leveraging these capabilities to drive informed decision-making through comprehensive data exploration and analysis.
Plan, implement, and manage a solution for data analytics |
10–15% |
This section covers planning a data analytics environment by identifying requirements for a solution such as components, features, performance, and capacity SKUs. It also involves recommending settings in the Fabric admin portal, choosing a data gateway type, and creating a custom Power BI report theme.
|
Implement and manage a data analytics environment |
30% |
This section looks at implementing workspace and item-level access controls for Fabric items, as well as implementing data sharing for workspaces, warehouses, and lakehouses. It also covers managing sensitivity labels in semantic models and lakehouses, configuring Fabric-enabled workspace settings, and managing Fabric capacity, including configuring capacity settings.
|
Manage the analytics development lifecycle |
10–15%
This aspect focuses on implementing version control for a workspace and on creating and managing a Power BI Desktop project file. It also deals with planning and implementing deployment solutions, and with performing impact analysis of downstream dependencies from lakehouses, data warehouses, dataflows, and semantic models. Deploying and managing semantic models by using the XMLA endpoint is covered, along with creating and updating reusable assets such as Power BI template files, data source files, and shared semantic models.
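
At its core, impact analysis of downstream dependencies is a graph traversal: starting from a changed item, find everything that transitively consumes it. The sketch below illustrates that idea with a hypothetical item lineage (the item names and the `downstream_impact` helper are invented for this example; Fabric exposes this through its lineage view, not through this code).

```python
from collections import deque

def downstream_impact(dependencies: dict, changed_item: str) -> set:
    """Return every item transitively downstream of changed_item.

    `dependencies` maps an item to the items that consume it directly,
    e.g. a lakehouse feeding a semantic model feeding a report.
    """
    impacted = set()
    queue = deque(dependencies.get(changed_item, []))
    while queue:
        item = queue.popleft()
        if item not in impacted:
            impacted.add(item)
            queue.extend(dependencies.get(item, []))
    return impacted

# Hypothetical lineage: lakehouse -> semantic model -> two reports
lineage = {
    "SalesLakehouse": ["SalesModel"],
    "SalesModel": ["SalesReport", "ExecDashboard"],
}
print(downstream_impact(lineage, "SalesLakehouse"))
```

A breadth-first walk like this is why a change to a single lakehouse can flag several reports for review.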
|
Prepare and serve data |
40–45% |
This section examines creating objects in a lakehouse or warehouse and ingesting data by using a data pipeline, dataflow, or notebook. It also touches on creating and managing shortcuts, implementing file partitioning for analytics workloads in a lakehouse, and creating views, functions, and stored procedures. Enriching data by adding new columns or tables, and copying data by choosing an appropriate method (a data pipeline, dataflow, or notebook), is discussed. Scheduling data pipelines and dataflows, implementing Fast Copy when using dataflows, and adding stored procedures, notebooks, and dataflows to a data pipeline are also addressed.
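
File partitioning for lakehouse workloads typically means laying files out in Hive-style folders (`year=…/month=…/day=…`) so queries can prune irrelevant partitions. A minimal sketch of that path convention, with a made-up base folder and helper name:

```python
from datetime import date

def partition_path(base: str, event_date: date) -> str:
    """Build a Hive-style partition path, the folder layout commonly
    used for date-partitioned files in a lakehouse."""
    return (f"{base}/year={event_date.year}"
            f"/month={event_date.month:02d}"
            f"/day={event_date.day:02d}")

print(partition_path("Files/sales", date(2024, 3, 7)))
# Files/sales/year=2024/month=03/day=07
```

Zero-padding the month and day keeps partition folders sorting correctly as plain strings.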
|
Implement and manage semantic models |
20–25% |
This section delves into designing and building semantic models, including choosing a storage mode such as Direct Lake. It also deals with identifying use cases for DAX Studio and Tabular Editor 2. Designing and building composite models, including aggregations, is studied, along with implementing dynamic row-level security and object-level security. Optimizing enterprise-scale semantic models covers improving the performance of queries and report visuals and implementing improvements by using DAX Studio and Tabular Editor 2. Implementing incremental refresh is another aspect covered.
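
An incremental refresh policy boils down to two rolling boundaries: rows older than the archive window are dropped, and only rows newer than the refresh window are reloaded on each refresh. The helper below is an illustrative sketch of that date arithmetic (the function name and parameters are invented; in Power BI the windows are defined in the refresh policy, not computed in code):

```python
from datetime import date

def refresh_window(today: date, archive_years: int, refresh_days: int):
    """Compute the boundaries an incremental refresh policy implies:
    data before archive_start is dropped, data on or after
    refresh_start is reloaded, and the span between is kept as-is."""
    archive_start = today.replace(year=today.year - archive_years)
    refresh_start = date.fromordinal(today.toordinal() - refresh_days)
    return archive_start, refresh_start
```

For example, a policy of "archive 5 years, refresh the last 10 days" evaluated on 15 June 2024 yields boundaries of 15 June 2019 and 5 June 2024.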
|
Explore and analyze data |
20–25% |
This part examines performing exploratory analytics involving descriptive, diagnostic, and prescriptive analytics. It also touches upon profiling data and querying lakehouses and warehouses in Fabric by using SQL queries or the visual query editor, and querying semantic models via the XMLA endpoint.
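
Descriptive analytics against a warehouse often takes the shape of a grouped aggregate query. In Fabric you would run T-SQL against a warehouse or a lakehouse SQL analytics endpoint; the sketch below uses Python's built-in sqlite3 as a local stand-in purely to show the query shape, with made-up table and column names:

```python
import sqlite3

# In-memory database as a stand-in for a warehouse connection.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('West', 120.0), ('West', 80.0), ('East', 200.0);
""")

# Descriptive analytics: order count and average amount per region.
rows = conn.execute("""
    SELECT region, COUNT(*) AS orders, AVG(amount) AS avg_amount
    FROM sales GROUP BY region ORDER BY region
""").fetchall()
print(rows)  # [('East', 1, 200.0), ('West', 2, 100.0)]
```

The same `GROUP BY` aggregation pattern carries over directly to T-SQL in a Fabric warehouse.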
|
Official Information |
|
https://learn.microsoft.com/en-us/credentials/certifications/resources/study-guides/dp-600 |