Writing quality code with NDepend v2017
NDepend v2017 has totally blown my mind. I can't stop exploring the new and enhanced Dashboard with features like Technical Debt Estimation, Quality Gates, Rules and Issues. In this post I will try to summarise what's new in NDepend v2017 and how to use these new features to write quality code.
I'm sure you have experienced the feeling: you start typing code and, a few weeks into the project, you don't really know if what you have designed and coded is actually good or bad (by bad I mean that it's in some way rigid, fragile or non-reusable). I'm an advocate of Continuous Integration, so I do have plenty of metrics that help me identify broken windows or code smells in my code at the check-in stage. These metrics cover areas such as Design, Globalisation, Interoperability, Mobility, Naming, Performance, Portability, Security, Usage and so on. But none of them gives a global rating that I could easily use to check whether my project is actually good or bad.
Enhanced Dashboard
This new version contains a new set of application metrics that really improve the overall quality of the product. I will start integrating this as part of my release procedure, as it gives a really good grasp of the status of the project from the coding side. Here is a quick sneak peek of the new dashboard:
The aspects I'm most interested in, and that I will cover in detail, are the following:
Technical Debt Estimation
This new feature is a MUST for me. Just analyse your project with NDepend v2017 and let it give you the percentage of technical debt according to the rules you have configured in your project. After every analysis you can see the trend and act accordingly using this metric. This section takes into account the settings I have configured for my project. In this case the debt has increased from 7.75% to 7.93% due to the increase in the number of issues in the solution. It also determines that the time needed to reach band "A" is 3 hours and 32 minutes. The total amount of time needed to fix all the issues is the Debt (1 day and 1 hour).
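To make those dashboard figures a bit more tangible, here is a minimal C# sketch of how a debt percentage and a rating band can be derived from a total debt figure. The 5% threshold for band "A" and the estimated development effort below are assumptions for illustration only; NDepend derives the real values from its own configurable debt settings, so treat this purely as a back-of-envelope model:

    using System;

    // Back-of-envelope model of the dashboard figures. All values are illustrative
    // assumptions; this is not NDepend's internal formula.
    class DebtRatioSketch
    {
        static void Main()
        {
            double debtHours = 7.0;        // total Debt, e.g. 1 day + 1 hour at 6 coding hours per day
            double devEffortHours = 88.0;  // assumed effort to develop the code base from scratch
            double bandAThreshold = 0.05;  // assumed: band "A" requires a debt ratio below 5%

            double debtRatio = debtHours / devEffortHours;                      // roughly 8%
            double hoursToReachA = debtHours - bandAThreshold * devEffortHours; // debt to pay off

            Console.WriteLine($"Debt ratio: {debtRatio:P2}");
            Console.WriteLine($"Debt to pay off to reach band A: {hoursToReachA:F1} hours");
        }
    }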
To get values closer to reality, you have to configure your project to specify how long it will take you or any member of your team to fix an issue (most of the time I just specify half a day per issue as a rule). Here are the settings I have specified in my solutions as a rule of thumb, and that you can consider in your projects:
These settings use the following considerations:
- Your team will mostly code about 6 hours a day. The rest of the time is spent on meetings, emails, research, etc.
- The estimated effort to fix one issue is 4 hours. That's the minimum I would give as an average: some issues are fixed in 5 minutes and others might take quite a bit of time. Don't forget that this time also includes filling in the ticket details in your scrum environment, documentation, etc.
- Then, depending on the severity of the issue, there is also a threshold, as you can see in the figure above. (The sketch below shows how these numbers roughly combine.)
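As a quick sanity check of these settings, the plain C# sketch below shows how an issue count, the 4-hour effort per issue and the 6 coding hours per day convert into a day-based debt figure. The issue count is an example value, and NDepend estimates debt per rule and per issue rather than with a flat cost, so this is only an approximation of the idea:

    using System;

    // Illustrative conversion only; NDepend's own estimation is per rule and per issue,
    // but this shows how the settings above combine into "days of debt".
    class DebtSettingsSketch
    {
        static void Main()
        {
            int issueCount = 10;             // example number of open issues
            double hoursPerIssue = 4.0;      // setting: estimated effort to fix one issue
            double codingHoursPerDay = 6.0;  // setting: productive coding hours per day

            double debtHours = issueCount * hoursPerIssue;
            double debtDays = debtHours / codingHoursPerDay;

            Console.WriteLine($"Estimated debt: {debtHours} hours, about {debtDays:F1} working days");
        }
    }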
Another aspect to consider for a proper estimation is code coverage. If you configure coverage correctly in your solution, NDepend can pick up that data and use it to produce a more comprehensive estimation.
To configure code coverage for NDepend you can follow the steps below:
Configuring JetBrains DotCover
Once you've run your initial analysis, NDepend will also ask you to configure code coverage to get more information about your project and some additional metrics. Go to the NDepend project coverage settings under the Analysis tab and select the XML file generated by DotCover there.
If you run your tests with ReSharper, you can select the coverage option, then go to the export button in that menu and select "Export to XML for NDepend". Leave this file in a known folder so you can automate this easily later on. The goal here is to configure everything manually first; afterwards you will have to do the extra work so you can trigger all this from your build agent and get the report at the end of the run.
Choose the exported file and run your analysis again:
Now, with all these details in place, if you run NDepend you should get something like this:
Now you can see the proper debt and the coverage. This is a little project that I'm currently working on, and it works really well to demonstrate how good NDepend is in this case. If you don't know what one of the terms means, you can just click on it and you'll be redirected to a panel with all the details about that specific metric and its description.
The following three additional panels help shape the technical debt information: Quality Gates, Rules and Issues. Below you'll find a quick introduction to each section and its relevance.
Quality Gates
Quality gates are based on Rules, Issues and Coverage. Basically, this section defines criteria that your project should meet in order to pass the "quality" bar. For example: your project should have a minimum percentage of code coverage, your project should not contain Blocker or Critical issues, etc. Here are some of these gates for your reference:
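Quality gates themselves are written in CQLinq, NDepend's C# LINQ dialect, so you can also define your own. The snippet below is a minimal sketch in the spirit of the built-in coverage gate; treat the thresholds and the codeBase.PercentageCoverage metric name as assumptions to verify against your NDepend version:

    // <QualityGate Name="Percentage Coverage" Unit="%" />
    // Fail the gate when overall coverage drops below 70%, warn below 80%.
    failif value < 70%
    warnif value < 80%
    codeBase.PercentageCoverage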
Rules
Rules are defined as Project Rules and they check for violations in your code. They are similar to the rules defined by FXCop and provide real arguments as to why your code is breaking a rule or needs to be better. Once you've gone through several iterations of fixing these, your code will get cleaner and better (I promise you!). And most of all, you will understand the reasoning behind each rule! Here are some of these rules:
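Rules are written in the same CQLinq dialect, so you can read (and tweak) exactly what each one checks. Here is a minimal sketch of a custom rule flagging long, complex methods; NbLinesOfCode and CyclomaticComplexity are standard CQLinq metrics, but the thresholds are just example values, not recommendations:

    // <Name>Avoid methods too big and too complex (example)</Name>
    // Flag methods in my own code that are both long and complex.
    warnif count > 0
    from m in JustMyCode.Methods
    where m.NbLinesOfCode > 30 && m.CyclomaticComplexity > 20
    orderby m.CyclomaticComplexity descending
    select new { m, m.NbLinesOfCode, m.CyclomaticComplexity }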
If you think that one of these rules does not apply to your project, you can just uncheck it and the framework will take care of it, so you don't have to worry about it anymore.
Issues
The issues are just a way of grouping the rule violations so you can determine which ones are the important ones to fix. You might violate only a few rules, but these violations are categorised from Blocker down to Low. So even though the project is violating 18 rules, one of them is just Low. This gives you an understanding of what is important to fix now and what can wait. Each issue also comes with a clear estimate of the time it could take to fix:
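Because issues carry both a severity and a debt estimate, you can also query them directly with CQLinq to see where the expensive fixes are. The sketch below assumes the Issues collection with Severity and Debt properties exposed by NDepend v2017; double-check the exact property names in the CQLinq documentation before using it:

    // List the most expensive high-severity issues first (assumed CQLinq API).
    from i in Issues
    where i.Severity == Severity.Blocker || i.Severity == Severity.Critical
    orderby i.Debt descending
    select new { i, i.Severity, i.Debt }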
Conclusion
To conclude, writing quality code is one of my main concerns nowadays. It's really easy to write code, and even code that works, but the difference between code that works and excellent code is quality, and NDepend has a solution for that. I have been fiddling with tools like FXCop and NDepend for a while now, and I must say that NDepend is a must-have in my tool belt. It's really easy to use, and with just one click you can have real arguments about the issues that need to be fixed in your solution and how long the team should take to fix them.