Microsoft Fabric was introduced almost a year ago, and the associated certification was released in January 2024. After a two-month beta period, the exam is now generally available for candidates to take. If you’re planning to take the DP-600 exam, it’s essential to understand what it entails and how to prepare effectively.
DP-600 is a comprehensive exam covering a wide range of topics. In this blog, we’ll break down the exam format, the skills you need to master, and the key topics you should focus on to succeed.
Format of the exam
The DP-600 exam typically consists of 55-60 questions, presented in various formats:
- Multiple-choice questions with four options, presented as:
  - Radio buttons
  - Drop-down menus
  - Drag-and-drop functionality
- Drag-and-drop questions that require arranging options in the correct order
- True/False questions
- Case study questions (usually 1-2 case studies with multiple related questions)
During the exam, you’ll have access to Microsoft documentation. However, it’s crucial to manage your time wisely, as relying too heavily on the documentation can eat into your allotted time.
Exam Content
The complete skill set required for the exam is described in the official DP-600 study guide on Microsoft Learn.
Below, we break down these skills from a developer’s perspective, which makes it easier to understand what a developer must know.
Interaction Languages
Candidates taking the DP-600 exam should be familiar with SQL, Power Query, and DAX at an intermediate level. Knowledge of Python is also required, particularly for working with PySpark functions.
To simplify the required skillset, we’ve categorized the exam content into three main areas: Power BI, Fabric Data Engineering, and Others.
1. Power BI
The Power BI section can be further divided into the following subtopics:
DAX
- Knowing which DAX functions to use in a formula to derive the required result.
- Knowledge of how to analyze the performance of DAX queries.
- Being aware of how to optimize inefficient DAX queries.
Semantic Model
- A solid understanding of semantic models in Fabric.
- Familiarity with the use cases and benefits of Direct Lake.
- Knowledge of administrative options (tenant settings) related to semantic models.
- Ability to use third-party tools like Tabular Editor to modify semantic model properties.
- Awareness of scripting against the Tabular Object Model (TOM) with C#.
Power BI Desktop
- Proficiency in using the various GUI functions within Power Query.
- Knowledge of different options in Power Query for data exploration.
- Familiarity with Power BI features such as themes, visuals, and the modelling tab.
- Understanding of how to use Git integration and deployment pipelines.
2. Fabric Data Engineering
Ingestion and Transformation
- Knowledge of how to orchestrate with activities.
- Understanding of how to use Copy Data.
- Proficiency in using Dataflow Gen2 for ingestion and transformation tasks with Power Query (a notebook-based alternative is sketched below).
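As a rough illustration of the notebook option, here is a minimal PySpark sketch of ingesting a file into a lakehouse. The file path and table name are hypothetical, and `spark` is the session a Fabric notebook provides.

```python
# Minimal sketch: ingest a CSV from the lakehouse Files area into a
# Delta table. The path and table name below are hypothetical.
df = (
    spark.read
    .option("header", "true")        # first row holds column names
    .option("inferSchema", "true")   # let Spark infer column types
    .csv("Files/raw/sales.csv")      # relative path within the lakehouse
)

# Save the result as a managed Delta table in the Tables area
df.write.mode("overwrite").format("delta").saveAsTable("sales")
```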
Spark
- Strong understanding of working with Delta tables.
- Familiarity with the various options provided by Spark for managing Delta tables.
- Knowledge of maintenance operations on Delta tables, as shown in the sketch below.
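Here is a minimal sketch of common Delta table maintenance using the Delta Lake Python API in a Fabric notebook; the table name is hypothetical.

```python
from delta.tables import DeltaTable

# Hypothetical Delta table in the lakehouse
tbl = DeltaTable.forName(spark, "sales")

# Compact many small files into fewer, larger ones
tbl.optimize().executeCompaction()

# Remove files that are no longer referenced and are older than
# the retention window (in hours)
tbl.vacuum(168)
```

The same operations are also available as Spark SQL statements (`OPTIMIZE sales` and `VACUUM sales`).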
OneLake
- Understanding of how to use shortcuts and their nuances (see the sketch after this list).
- Ability to combine item permissions and workspace permissions for granular access control.
- Clear understanding of the differences between Lakehouse and Warehouse.
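To make the shortcut behaviour concrete, here is a minimal sketch assuming a table shortcut and a file shortcut already exist in a lakehouse. The names are hypothetical, and the shortcuts themselves are created through the Fabric UI or REST API, not in the notebook.

```python
# A shortcut placed under Tables/ behaves like a native Delta table
orders = spark.read.table("orders_shortcut")

# A shortcut placed under Files/ is read like any local folder
logs = spark.read.text("Files/external_logs/")

orders.show(5)
```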
T-SQL
- Knowledge of commonly used T-SQL functions.
- Familiarity with T-SQL syntax for statements typically executed in a warehouse.
Others
- Understanding the license options available and identifying the appropriate license for specific scenarios.
- Knowledge of tenant settings available for Fabric and Power BI.
- Familiarity with Workspace settings and grouping.
References for the exam
Below are the links to help you learn the skills required for the exam from scratch. Additionally, we have provided resources tailored to each section of the skillset described in the study guide.
Video Links for End-to-End Learning:
Microsoft Fabric – DP 600 Certificate Training (youtube.com)
Documentation:
Microsoft Fabric documentation – Microsoft Fabric | Microsoft Learn
Get started with Microsoft Fabric – Training | Microsoft Learn
End-to-end tutorials in Microsoft Fabric – Microsoft Fabric | Microsoft Learn
Plan, implement, and manage a solution for data analytics (10–15%)
1. Plan a data analytics environment
| Topic | Resource 1 | Resource 2 |
| --- | --- | --- |
| Identify requirements for a solution, including components, features, performance, and capacity stock-keeping units (SKUs) | MSDoc | |
| Recommend settings in the Fabric admin portal | MSDoc | Youtube |
| Choose a data gateway type | MSDoc | Youtube |
| Create a custom Power BI report theme | MSDoc | Youtube |
2. Implement and Manage a Data Analytics environment
| Topic | Resource 1 | Resource 2 | Resource 3 |
| --- | --- | --- | --- |
| Implement workspace and item-level access controls for Fabric items | MSDoc | MSDoc | |
| Implement data sharing for workspaces, warehouses, and lakehouses | MSDoc | | |
| Manage sensitivity labels in semantic models and lakehouses | MSDoc | | |
| Configure Fabric-enabled workspace settings | MSDoc | Youtube | MSDoc |
| Manage Fabric capacity | Blog | Blog | |
3. Manage the Analytics Development Lifecycle
| Topic | Resource 1 | Resource 2 |
| --- | --- | --- |
| Implement version control for a workspace | Youtube | MSDoc |
| Create and manage a Power BI Desktop project (.pbip) | MSDoc | Blog |
| Plan and implement deployment solutions | MSDoc | Youtube |
| Perform impact analysis of downstream dependencies from lakehouses, data warehouses, dataflows, and semantic models | MSDoc | |
| Deploy and manage semantic models by using the XMLA endpoint | MSDoc | |
| Create and update reusable assets, including Power BI template (.pbit) files, Power BI data source (.pbids) files, and shared semantic models | Blog | |
Prepare and serve data (40–45%)
1. Create objects in a lakehouse or warehouse
| Topic | Resource 1 | Resource 2 | Resource 3 | Resource 4 |
| --- | --- | --- | --- | --- |
| Ingest data by using a data pipeline, dataflow, or notebook | Blog | Youtube | MSDoc | MSDoc |
| Create and manage shortcuts | Blog | MSDoc | | |
| Implement file partitioning for analytics workloads in a lakehouse | MSDoc | | | |
| Create views, functions, and stored procedures | Blog | Blog | | |
| Enrich data by adding new columns or tables | Blog | Blog | | |
2. Copy Data
| Topic | Resource 1 | Resource 2 | Resource 3 |
| --- | --- | --- | --- |
| Choose an appropriate method for copying data from a Fabric data source to a lakehouse or warehouse | MSDoc | MSDoc | |
| Copy data by using a data pipeline, dataflow, or notebook | MSDoc | | |
| Add stored procedures, notebooks, and dataflows to a data pipeline | MSDoc | Blog | |
| Schedule data pipelines | MSDoc | | |
| Schedule dataflows and notebooks | MSDoc | MSDoc | Youtube |
3. Transform Data
Note: Some of the transformation skills are not specific to any tool and can be achieved using Spark, Dataflows, or T-SQL. Links have been provided for each; a brief PySpark sketch of a few of these operations follows the table.
| Topic | Resource 1 | Resource 2 | Resource 3 |
| --- | --- | --- | --- |
| Implement a data cleansing process | | | |
| Implement a star schema for a lakehouse or warehouse, including Type 1 and Type 2 slowly changing dimensions | MSDoc | MSDoc | |
| Implement bridge tables for a lakehouse or a warehouse | MSDoc | | |
| Denormalize data | Blog | Blog | |
| Aggregate or de-aggregate data | MSDoc | Blog | Blog |
| Merge or join data | Blog | MSDoc | Youtube |
| Identify and resolve duplicate data, missing data, or null values | Blog | Blog | |
| Convert data types by using SQL or PySpark | MSDoc | Blog | |
| Filter data | Blog | MSDoc | |
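As promised above, here is a minimal PySpark sketch covering a few of the transformation skills from the table: deduplication, handling missing values, and type conversion. The table and column names are hypothetical.

```python
from pyspark.sql import functions as F

df = spark.read.table("raw_sales")  # hypothetical source table

cleaned = (
    df.dropDuplicates(["order_id"])             # resolve duplicate rows
      .fillna({"quantity": 0})                  # replace missing values
      .filter(F.col("order_date").isNotNull())  # drop rows missing a key field
      .withColumn("amount",
                  F.col("amount").cast("decimal(18,2)"))  # convert data type
)

cleaned.write.mode("overwrite").format("delta").saveAsTable("clean_sales")
```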
4. Optimize Performance
| Topic | Resource 1 | Resource 2 |
| --- | --- | --- |
| Identify and resolve data loading performance bottlenecks in dataflows, notebooks, and SQL queries | MSDoc | |
| Implement performance improvements in dataflows, notebooks, and SQL queries | Blog | |
| Identify and resolve issues with Delta table file sizes | Blog | MSDoc |
Implement and manage semantic models (20–25%)
1. Design and build semantic models
| Topic | Resource 1 | Resource 2 | Resource 3 |
| --- | --- | --- | --- |
| Choose a storage mode, including Direct Lake | Blog | Blog | |
| Identify use cases for DAX Studio and Tabular Editor 2 | Youtube | Blog | |
| Implement a star schema for a semantic model | MSDoc | Blog | Blog |
| Implement relationships, such as bridge tables and many-to-many relationships | MSDoc | Blog | |
| Write calculations that use DAX variables and functions, such as iterators, table filtering, windowing, and information functions | MSDoc | MSDoc | |
| Implement calculation groups, dynamic strings, and field parameters | Blog | Blog | |
| Design and build a large-format dataset | Blog | MSDoc | |
| Design and build composite models that include aggregations | Blog | MSDoc | |
| Implement dynamic row-level security and object-level security | Blog | MSDoc | |
| Validate row-level security and object-level security | MSDoc | Blog | |
2. Optimize enterprise-scale semantic models
| Topic | Resource 1 | Resource 2 | Resource 3 |
| --- | --- | --- | --- |
| Implement performance improvements in queries and report visuals | Blog | Blog | MSDoc |
| Improve DAX performance by using DAX Studio | Blog | | |
| Optimize a semantic model by using Tabular Editor 2 | Blog | Blog | |
| Implement incremental refresh | Blog | MSDoc | |
Explore and Analyze data (20–25%)
1. Perform exploratory analytics
Note: This sub-topic covers implementing descriptive, diagnostic, prescriptive, and predictive analytics. These skill sets are too broad to cover with a couple of links, so we suggest working with the various visuals available in Power BI to get an understanding of the different options available.
Design Power BI reports – Training | Microsoft Learn
Perform advanced analytics in Power BI – Training | Microsoft Learn
2. Query Data by using SQL
| Topic | Resource 1 | Resource 2 |
| --- | --- | --- |
| Query a lakehouse in Fabric by using SQL queries or the visual query editor | MSDoc | Blog |
| Query a warehouse in Fabric by using SQL queries or the visual query editor | MSDoc | MSDoc |
| Connect to and query datasets by using the XMLA endpoint | Blog | Blog |
Conclusion
We hope this blog and its curated links help you prepare for the DP-600 exam and become a certified Fabric Analytics Engineer. We wish you all the best from Data Crafters!