If you are currently working on a legacy project and you want to make your life easier, then this article is for you.
Recently, I started working on a high-traffic, high-profile fashion website. Due to a time-sensitive warehouse relocation, the website's order management system had to change completely by a fixed date, so getting something working in time was the focus. Like a lot of things done in a rush, the codebase and the release process suffered. The team hit all the business objectives and now has some time to focus on improving things. This raised two questions: first, how should the team tackle these improvements, and second, how can we create some metrics around these changes so we can track progress? In essence, how can we tell whether the team is making meaningful progress rather than just doing busy work?
In a stroke of fortuitous timing, the folks over at NDepend got in contact with me on the exact same day the team started discussing these changes. For those of you who have never looked at NDepend before, it's a tool that integrates into Visual Studio and claims to help with things like measuring code quality, enforcing coding rules, and visualizing dependencies.
If you are interested in learning how my experiment with NDepend went, and whether you can use it to make your life easier, then read on.
When I start working with a new client on a project, one of the first things I need to do is figure out how bad things are. It's all very well having the team say 'the codebase is crap'; however, if you can't be specific about how bad things are, what you will fix, and how long it will take, it will be very hard for your product owner to agree to the time. My personal opinion is that all companies should set aside time to fix things on a weekly basis; I also think I deserve to win the lottery. In reality, you need to be realistic: if you want to improve your codebase, you will very likely have to sell the concept to someone in order for them to sign off the time. Previously, this has been pretty difficult. I've used tools like SonarQube to determine how 'good' a codebase was, but that involved setting up a continuous integration pipeline, which took months. The feature I was most optimistic about in NDepend was its ability to generate a code quality report directly within Visual Studio with a few clicks of a button.
Installing NDepend is idiot-proof: you download an installer, run it, click a few buttons and it's installed. After NDepend is installed locally, you will see an 'NDepend' menu option within Visual Studio.
Before you can generate an NDepend code quality report, you will need to create an NDepend project. Within Visual Studio, open the solution you want to run the analysis against and, within the NDepend menu, select 'Project' -> 'New Project', like so:
This project is where NDepend will store its configuration and the settings relevant to your solution. It does not affect your end website, so you can call it whatever you want. After you have created a project, you can then select 'Run Analysis on this project':
It's that simple. The report gets generated as a standalone HTML website, so you can email it around the team; you can also view it inside Visual Studio. I wasn't really too sure what to expect from the report, but it contained loads of useful insights, including which areas of the website contained the most code smells, which classes had the most critical issues, and which rules were being violated.
Longtime readers of this site will know that I'm into good development practices. I don't like working in teams where things like ensuring good quality are manual processes. I like tools that can take subjective issues and turn them into binary decisions.
For this project, NDepend definitely helped increase code quality, and its report generated much-needed buy-in from the business to spend dev time fixing things.
Before I installed NDepend, I had a preconceived notion that there would be a steep learning curve before I would get any benefit from using it. This even put me off trying it for a few weeks. In reality, figuring out how to generate a report is simple: within about 10 minutes of installing it, I had a report that gave me a much clearer picture of which areas of the website contained the most code smells and which classes had the most critical issues.
The thing I like about the report is that, because it all lives within Visual Studio rather than in something like SonarQube, I can generate a report, click through the 'Queries and Rules' explorer, and fix things quickly.
After the refactoring, once the report was down to no errors, it was really simple to start enforcing stricter coding standards around the team, as it allowed us to introduce more binary standards into the code review process. If you have ever worked with code reviews, one issue is always getting a consistent standard: more senior people on the team will give harder and more in-depth feedback to junior people, and because code reviews can be very subjective and open to individual preferences, agreement on what counts as 'good' code is always open to debate. When the process is reduced to a binary question, 'does your commit add extra code smells as defined by NDepend, yes or no?', that debate largely disappears.
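To give a flavour of what one of these binary standards can look like: NDepend rules are written in CQLinq, a LINQ-style query language over your code model, and you can browse and edit them in the 'Queries and Rules' explorer. The sketch below is illustrative rather than a rule from this project; the rule name and the complexity threshold of 20 are my own assumptions, so pick numbers your team actually agrees on.

```csharp
// <Name>Avoid overly complex methods (illustrative sketch)</Name>
// Raise a warning if any method in our own code exceeds an
// agreed cyclomatic complexity threshold (20 here is an assumption).
warnif count > 0
from m in JustMyCode.Methods
where m.CyclomaticComplexity > 20
orderby m.CyclomaticComplexity descending
select new { m, m.CyclomaticComplexity }
```

Because a rule like this either matches methods or it doesn't, a commit either introduces new violations or it doesn't, which is exactly the kind of yes/no check that takes personal taste out of a code review.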
Software Architect, Programmer and Technologist Jon Jones is founder and CEO of London-based tech firm Digital Prompt. He has been working in the field for nearly a decade, specializing in new technologies and technical solution research in the web business. A passionate blogger, speaker and consultant from England, always on the hunt for the next challenge.