waldo: Dependency Analysis Tool (ALDependencyAnalysis)
Remember this post? Probably not. Nearly a year ago at Directions US, I showed some "how did I do stuff" during a number of sessions. And it ended with a lot of feedback, which came down to: "can I have it?". So that's where I wrote the post "I have work to do" ;-).
The "DevOps" part of the work is done: ALOps is available and well used :-).
But the second promise – the "Dependency Analysis" – I only completed in November 2019, and totally forgot to blog about it. In my defense: I did explain it at NAV TechDays, and that video IS online. You can find it here: NAV TechDays 2019 – Development Methodologies for the future (this is a link to the point in the video that explains the "dependency analysis" part).
What is it all about?
Well, in the move from C/AL to AL, you have a few options.
Either you migrate your old code to AL (you know, the txt2al way of doing things) and basically end up with "old wine in new bottles".
Or you rewrite. And if you rewrite, you either rewrite everything in just one app, or you take the opportunity and divide your old monolith into a multitude of apps.
In my opinion, it does make sense to rewrite the solution/product in AL, and take the opportunity to split it into multiple apps – and create dependencies where necessary.
Thing is – when you have a product that multiple people have been working on for multiple years, there is not one single person that has an overall overview of all the created functionality, let alone how it was developed (and therefore how the pieces depend on each other). But if you are rewriting your product, you probably WANT to have a complete overview of all this, INCLUDING a view on the dependencies.
So you have to analyze that. Hence the name: "Dependency Analysis".
How do I analyze an old codebase and still have a complete overview of the entire functionality – and how do I decide on how to split it into apps? In my opinion, the only way to do that is to automate the crap out of it. The only way to not forget anything is to not use your memory.
In my company, we created a set of tools that I'd like to share with you. It contains:
All contributions are very welcome ;-).
On how to use it, I'd like to refer you to the video, of course – it will get you started in 20 minutes and explain the basic steps. In fact, over the past couple of months, I referred a few partners to this video, and they were all able to do their dependency analysis – so I guess it's descriptive enough, and the tools work (well enough ;-)). But still, let me give you a short overview of the steps I think you should take – with a few remarks I think are interesting to consider.
Step 0a: Set up waldo.model.tools
You might remember this blogpost: C/AL Source Code Analysis with PowerShell. Well, it's that tool we will be using for the next steps. It can analyze C/AL code – so it's exactly what we need ;-). And apparently people are able to get it running: I actually came across this blogpost where Ricardo Paiva Moinhos used the tool to create a generic data migration script from C/AL to AL. Awesome!
Step 0b: Set up a Business Central environment with the ALDependencyAnalysis app
This is actually as simple as cloning the app from the ALDependencyAnalysis repo and publishing it to the environment where you would like to perform the analysis – in my case, a simple Docker container on my laptop. Make sure APIs are available, because the app will deploy some custom APIs for us to be able to upload data.
When you have installed the app, you'll have a new role center: the "Dependency Analysis Rolecenter".
Welcome to your "Dependency Analysis Control Center" ;-).
Step 1: Get all objects from C/AL and automatically tag them if possible
The assumption here is that you have exported all objects from C/AL (ALL of them, also the default objects, because most likely you made changes in those, and you'd want the references to where you made those changes).
With waldo.model.tools, you can analyze the C/AL code – so we'll use that. In the Scripts folder of the ALDependencyAnalysis repo, you'll find the scripts I used to upload the necessary data to perform the analysis.
So for uploading the objects, you need to run the "prepare" script first to load the objects in PowerShell. You'll see that the script loads the model into the $Model variable, which will be used for the magic. That variable will be quite big in memory ;-). The prepare script is also going to load all companies from your API, because it needs those in the upcoming scripts.
Next, there is the 1_UploadObjects.ps1 script, which simply loops over all objects from the $Model variable and uploads them via the API to your Business Central environment.
It is quite important that, during this step, you tag your objects. In a way, every object needs to have a "reason to exist". An "intent". A – let's call it – "module". This module code is usually a piece of business logic that you added to the solution. "FA" could be a module name for "Fixed Assets", for example. In this example, you see what I mean – all objects get a module (last column): the reason why they were created.
You can imagine that doing this manually is a huge job. But probably you have some logic that can determine quite a lot of the object modules for you, like the prefix of an object, the object range, or something like that. So we created a function in "ModelObject.Codeunit.al" that can handle that for you. Just change it to whatever works best for you!
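To make that concrete, here is a minimal sketch of such prefix- and range-based auto-tagging. It's in Python purely for illustration – the real implementation lives in "ModelObject.Codeunit.al" in AL – and the prefixes, ranges, and module codes below are invented examples; replace them with your own conventions.

```python
# Hypothetical tagging rules -- adapt to your own naming and numbering conventions.
PREFIX_MODULES = {
    "FA": "FA",        # objects named "FA ..." belong to the Fixed Assets module
    "WHS": "WHS",      # warehouse customizations
}

RANGE_MODULES = [
    (50000, 50099, "CORE"),   # base customization range
    (50100, 50199, "FA"),     # fixed-assets object range
]

def guess_module(object_id: int, object_name: str) -> str:
    """Return a module code for an object, or 'UNTAGGED' when no rule matches."""
    # Name prefix wins first...
    for prefix, module in PREFIX_MODULES.items():
        if object_name.startswith(prefix):
            return module
    # ...then fall back to the object's ID range.
    for low, high, module in RANGE_MODULES:
        if low <= object_id <= high:
            return module
    return "UNTAGGED"
```

Objects that come back as "UNTAGGED" are exactly the ones you'll correct by hand in the next step.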
Step 2: Manually correct/ignore modules
From the moment you have all objects in the table of the app, it's time to correct the modules, so that every single object in that table is tagged with the right module. Your procedure might not have been able to decently tag all objects, and further on in the analysis, it's important to have the right module names for all objects.
This is also where you would like to ignore the useless modules. Just imagine you already know which parts of your product you will skip – then it makes no sense to take them further into your analysis.
Step 3: Get Where-Used per object
This is where it gets interesting (at least in my opinion ;-)).
The idea is that we are going to create dependencies between these modules. Now, do know this:
So – you can already imagine, there is another script in the Scripts folder: 2_UploadObjectLinks.ps1. That script is slightly more complicated. It will figure out all links, remove the ones that refer to themselves, build an object collection, loop it, and send it to the assigned API, resulting in yet another few tables that get filled.
The "Object Links" table is the "raw data": the links between objects. So that's basically what the PowerShell script was able to find out. But while uploading this data, the app also fills the "Module Links" table. And it speaks for itself: this is the really interesting table that you want to analyze.
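Conceptually, what happens during that upload can be sketched in a few lines. This is Python for illustration only (the actual work is done by the PowerShell script and the app's APIs), and the object and module names are invented:

```python
def module_links(object_links, module_of):
    """Reduce raw object-to-object links to distinct module-to-module links.

    object_links: iterable of (source_object, target_object) pairs (the raw
                  where-used data).
    module_of:    dict mapping object name -> module code (the tagging from
                  steps 1 and 2).
    Links of a module to itself are dropped -- they are noise for this analysis.
    """
    links = set()
    for src, tgt in object_links:
        src_mod, tgt_mod = module_of[src], module_of[tgt]
        if src_mod != tgt_mod:
            links.add((src_mod, tgt_mod))
    return links
```

So if a (hypothetical) "Tab50100" tagged "FA" references "Cod50000" tagged "CORE", the module-level result is simply FA → CORE, no matter how many object-level links contributed to it.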
Step 4: Analyze dependencies per module
Looking at a bunch of data in a table is hard. Since we're talking about "links", why not use "graphviz" to visualize what we have? And that's exactly what we did – we used this tool: http://www.webgraphviz.com/. A very simple way to show a (dependency) graph – by easily creating a bunch of text that can be copied into this online tool. And that's exactly what we can do now. With the action "Show Full Graphiz", it shows a message. Just copy this message into the webgraphviz tool, and you'll have a visual representation of the interdependencies of all modules of your product. Like we did:
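The graph text that such an action produces is plain graphviz "dot" notation, and generating it is trivial. A sketch (Python with invented module names – the app builds an equivalent string in AL):

```python
def to_graphviz(links):
    """Render (source, target) module links as digraph text that an online
    graphviz viewer such as http://www.webgraphviz.com/ can draw."""
    lines = ["digraph ModuleDependencies {"]
    for src, tgt in sorted(links):          # sorted for stable, diffable output
        lines.append(f'  "{src}" -> "{tgt}";')
    lines.append("}")
    return "\n".join(lines)

print(to_graphviz({("FA", "CORE"), ("WHS", "CORE")}))
```

Paste the resulting `digraph { ... }` text into the online tool and you get the arrows-between-boxes picture.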
Step 5: Solve circular dependencies
You might ask: "what are circular dependencies?". Well – easy: just imagine there are a bunch of dependencies, but they make a circle. Like:
Well – if you have a big monolith, with a bunch of modules with a lot of interdependencies, your graph may look like the one I showed above – and all those red arrows basically indicate "you have work to do".
You can solve these interdependencies by either "not implementing modules anymore" (you can simply do that by toggling the ignore action), or by starting to combine modules if you realize it doesn't make sense to split the functionality into separate modules. In any case: you can't implement modules that are circularly dependent.
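Finding those circles in the module links is a standard graph problem: cycle detection in a directed graph. A depth-first-search sketch in Python (illustrative only – not how the app implements it), just to show what the red arrows represent:

```python
def find_cycle(links):
    """Return one list of modules forming a cycle (first == last), or None
    if the dependency graph is acyclic."""
    graph = {}
    for src, tgt in links:
        graph.setdefault(src, []).append(tgt)

    WHITE, GRAY, BLACK = 0, 1, 2     # unvisited / on current path / done
    color = {}
    stack = []                       # the current DFS path

    def dfs(node):
        color[node] = GRAY
        stack.append(node)
        for nxt in graph.get(node, []):
            if color.get(nxt, WHITE) == GRAY:
                # nxt is already on the current path: we closed a circle
                return stack[stack.index(nxt):] + [nxt]
            if color.get(nxt, WHITE) == WHITE:
                cycle = dfs(nxt)
                if cycle:
                    return cycle
        stack.pop()
        color[node] = BLACK
        return None

    for node in list(graph):
        if color.get(node, WHITE) == WHITE:
            cycle = dfs(node)
            if cycle:
                return cycle
    return None
```

For example, links A → B, B → C, C → A yield the cycle A, B, C, A – exactly the situation you have to design away before you can ever compile those modules as apps.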
Step 6: Manually create the app layer
Once you have solved all dependencies, you might want to decide to combine multiple modules into a set of apps.
Now, this step is obviously not mandatory – if you want to create a separate app for every module, please do. Honestly, I wish I could turn back time and had done just that. Or at least had gone a bit more extreme – but we didn't. We couldn't imagine at that point having to maintain all those modules (80+) as apps. So we continued the analysis by simply creating an app layer and starting to assign modules to apps. So: simply create a record for each app in the "Apps" table, and assign an app to each module.
Step 7: Analyze dependencies per app
Now, be careful. Modules can have a decent dependency flow (you solved that in step 5), but once you start combining them again, you might end up with circular dependencies again. Just look at this:
So again, you have a "Get Full Graph Text" action for apps, which you can use to analyze.
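Why combining can reintroduce circles is easy to see once you sketch the collapse from modules to apps (Python again, with invented module and app names): a perfectly acyclic module graph can still produce a cyclic app graph once several modules land in the same app.

```python
def app_links(module_links, app_of):
    """Collapse module-level links to app-level links, dropping links
    that stay inside one app.

    module_links: set of (source_module, target_module) pairs.
    app_of:       dict mapping module code -> app code (your step-6 assignment).
    """
    return {(app_of[src], app_of[tgt])
            for src, tgt in module_links
            if app_of[src] != app_of[tgt]}
```

For example: modules FA → CORE, FA → WHS, WHS → CORE contain no circle. But assign FA and CORE to APP1 and WHS to APP2, and the app graph becomes APP1 → APP2 and APP2 → APP1 – a circular dependency that only exists at the app level.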
Step 8: Solve circular dependencies
This is the last step! Now you need to solve the circular dependencies again! You can simply do that by combining modules into one app, moving modules from app to app, splitting apps, or simply removing modules again ;-). You know what I mean – structure your functionality, and come up with an architecture that is possible as a combination of AL apps.
We ended up with this:
And again – I wish I had gone a little bit more extreme on the "BASE" app – that would have helped us a lot more with new apps that could use a part of the BASE app, but not all of it.
Anyway – that's for you to decide.
Look at this blogpost/solution as a way to get a good mental picture of the monolith you might have in C/AL – or as one way to have a complete picture of it. And once you have that, it's going to be so much easier to make decisions regarding dependencies, or things to ignore, or ... .
Do NOT judge my code, please. It was developed because we needed a tool, quickly, for one time only – not to be sold, not to be used for anything else. I just decided to share it because I noticed that many people were interested.
The tool is there AS IS. I'm not going to support it, nor update it. Any contributions are always welcome, of course ;-).