Development Project Assessment Tools

    Programming is a creative process, and attempts to measure any aspect of a project are often seen as almost subversive. Indeed, many programmers smirk at attempts to measure team performance, such as counting lines of source code written. On the other hand, any process has a set of properties and changing parameters. Knowing their values is important: it helps you answer the question "How are things going?", understand what is happening at the moment, and plan concrete steps, for example, to reduce the number of defects in the product.

    Obviously, the more parameters you can measure, the more opportunities you have to analyze the situation. It's great if you have data on the current number of bugs in your system. It's even better if you can break that down by the main components, and better still if you know how quickly changes were being made to those components, so you can assess risks and prioritize fixing the existing errors. This is one of the simpler scenarios, but to implement it you need tools that collect data in a form suitable for analysis. This is the very "primary accounting" of the project which, once turned into reports, lets you make decisions at a higher level.
    And then it turns out that, by and large, these data either have to be entered manually, or they are stored in such an unstructured form that analyzing them becomes a time-consuming task.

    Where to get the data from?

    Automation of development processes usually comes down to three main components: task management ("tasks and bugs"), source code, and automated builds. A great many software products have been built on these "three pillars", and the scheme has proved its worth in creating a continuous integration environment. There are many solutions on the market, both free and paid, that will help you manage your code, tasks and bugs, and build a project. Moreover, there are suites that integrate these parts with each other and allow you, for example, to associate changes made to the source code with errors or tasks, thereby letting you manage changes in the project at a higher level.

    The level of integration of these components varies, but measurement and reporting somehow never get much attention. Really, what kind of reports can you build on top of a source code repository alone? The situation with tasks is slightly better: many systems let you filter "tasks and bugs" by status. But if you look at the picture as a whole, it turns out that combining all three components into a single set of reporting data is a non-trivial task.


    When Visual Studio Team System was being designed in 2003, it was decided to make a complete analytics system a key component of the suite. The main goals were as follows:
    1) Automatically collect all possible information
    2) Minimize requests to users for manual input of information
    3) Provide quick, on-demand (ad hoc) analysis tools

    In this scheme, the version control, task management, and build components are treated as data sources that, with the help of special adapters, feed information into the reporting database. Note that the set of adapters is extensible: this makes it possible to populate the database with information from additional data sources, extending the schema. The source code of an example adapter can be found on the CodePlex website.
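    Conceptually, such an adapter listens for events from a data source and appends them as fact rows to the reporting store, from which the OLAP cube is later processed. The sketch below illustrates the idea in Python; all names here (Warehouse, FactRow, VersionControlAdapter, the "CodeChurn" fact) are hypothetical stand-ins, not the actual TFS adapter API, which is .NET-based.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class FactRow:
    table: str                   # fact table name, e.g. "CodeChurn"
    dimensions: Dict[str, str]   # e.g. {"Component": "Billing", "Date": "2010-05-01"}
    measures: Dict[str, float]   # e.g. {"LinesAdded": 120, "LinesDeleted": 30}

class Warehouse:
    """Toy stand-in for the relational warehouse that feeds the OLAP cube."""
    def __init__(self) -> None:
        self.rows: List[FactRow] = []

    def append(self, row: FactRow) -> None:
        self.rows.append(row)

class VersionControlAdapter:
    """Translates check-in events from version control into code-churn facts."""
    def __init__(self, warehouse: Warehouse) -> None:
        self.warehouse = warehouse

    def on_checkin(self, component: str, date: str, added: int, deleted: int) -> None:
        self.warehouse.append(FactRow(
            table="CodeChurn",
            dimensions={"Component": component, "Date": date},
            measures={"LinesAdded": added, "LinesDeleted": deleted},
        ))

warehouse = Warehouse()
adapter = VersionControlAdapter(warehouse)
adapter.on_checkin("Billing", "2010-05-01", added=120, deleted=30)
print(len(warehouse.rows))  # → 1
```

    The point of the design is that any number of such adapters, each knowing only its own source, can feed the same warehouse schema.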

    Data Acquisition Scheme in TFS

    The processed data in the OLAP database can then be analyzed using reports built on Reporting Services or Excel pivot tables.

    This approach improved the quality of the source data by an order of magnitude, but most importantly it lets you analyze the data as a single whole, with a wide variety of filters and slices. Team Foundation Server 2010 already ships with several predefined reports based on Reporting Services; these reports are described in the documentation for the Agile and CMMI process templates. But things get much more interesting with the reports that can be built on top of OLAP and Excel pivot tables.

    Excel as a Swiss Army knife

    Working with the project's analytic data in Excel is very simple. Just connect to the TFS cube as an external data source:

    Excel Data Source Connection Menu

    Usually the OLAP server is the same server that hosts the TFS database. Do not forget to grant access rights on this server to the user account you are connecting as.

    Dialog for entering OLAP server address in Excel

    Then you just have to select the information display mode:

    Pivot table mode

    And you can create your first Pivot table based on TFS data.

    Pivot TFS table

    The TFS analytic cube contains 15 fact tables: data that can be expressed in aggregated form, such as the number of modified lines of code, the number of unit tests passed, the number of lines of code covered by unit tests (code coverage), and others. This information can be sliced across more than 60 dimensions, for example date/time, project areas, iterations, modules, team members, test categories, projects, work item fields, and so on.
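    The mechanics of a pivot table can be illustrated outside Excel: given fact rows carrying dimension values and measures, a pivot groups the rows by a chosen dimension and aggregates a measure. A minimal sketch in Python (the fact rows and the "Component" / "LinesChanged" names are invented for illustration, not taken from the actual cube schema):

```python
from collections import defaultdict

# Toy fact rows, mimicking a code-churn fact sliced by the Component dimension.
facts = [
    {"Component": "Billing", "Date": "2010-05-01", "LinesChanged": 120},
    {"Component": "Billing", "Date": "2010-05-02", "LinesChanged": 80},
    {"Component": "Reports", "Date": "2010-05-01", "LinesChanged": 40},
]

def pivot_sum(rows, dimension, measure):
    """Group rows by one dimension and sum a measure: the core of a pivot table."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[dimension]] += row[measure]
    return dict(totals)

print(pivot_sum(facts, "Component", "LinesChanged"))
# → {'Billing': 200, 'Reports': 40}
```

    An OLAP cube effectively precomputes such aggregations along every dimension, which is why the interactive reports respond instantly.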


    Putting it all together

    All these analytical slices and measurable parameters let you build complex queries in literally "three clicks", or take the ready-made samples and adapt them to your liking. With this information, the team and management can clearly understand the current situation on the project. For example, you can build a query that shows, for each component of the project, the number of tests passed along with the rate at which changes are being made to that component. Such a report lets you analyze the situation across several sets of parameters at once: even if some component of your project has only a small number of reported errors, a high rate of code changes can flag it as a candidate for closer attention from the testers.
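    The heuristic described above, flagging components where code churn is high even if the bug count is low, can be sketched as a simple scoring function. The formula and weights here are illustrative assumptions, not part of TFS; the point is only that combining measures from several fact tables yields a ranking you could not get from any one of them alone.

```python
def risk_score(bugs: int, churn_lines: int, coverage_pct: float) -> float:
    """Rough risk heuristic: more bugs and more churn raise the risk,
    higher unit-test coverage lowers it. Weights are illustrative."""
    return (bugs + churn_lines / 100.0) * (1.0 - coverage_pct / 100.0)

# Per-component metrics as they might come out of the cube (made-up numbers).
components = {
    "Billing": {"bugs": 2, "churn_lines": 5000, "coverage_pct": 40.0},
    "Reports": {"bugs": 10, "churn_lines": 200, "coverage_pct": 85.0},
}

# A component with few bugs but heavy churn and weak coverage can still
# rank highest: a candidate for closer attention from testers.
ranked = sorted(components, key=lambda c: risk_score(**components[c]), reverse=True)
print(ranked)  # → ['Billing', 'Reports']
```

    Any real weighting would, of course, be tuned to the project; the cube simply makes all three inputs available in one place.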


    Report Output Options

    An additional interesting feature of Excel reports is their "self-sufficiency". Having built the report you need, you can save it and send it by mail to the interested parties; when they need up-to-date figures, they simply open the file again and refresh the data from the source. Another interesting scenario is publishing the Excel file on a SharePoint site, in a document library with Excel Services enabled. This lets users see the charts and tables in HTML form and does not require them to have access rights to the TFS OLAP cube, or even the ability to connect to it.

    But the analytical capabilities of the OLAP data source can be used not only from Excel. Using standard SharePoint Portal Server tools, data from the TFS OLAP cube can be turned into KPIs and assembled into dashboards, which in a concise "traffic light" form help you instantly evaluate the current state of the project. Examples of such indicators are threshold values for the number of errors on a project, the number of tasks still open as of a given date, or the percentage of code covered by unit tests (code coverage).
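    A "traffic light" KPI is essentially a threshold mapping from a metric to a status color. A minimal sketch, with example thresholds that are assumptions rather than SharePoint defaults:

```python
def traffic_light(value: float, green_max: float, yellow_max: float) -> str:
    """Map a metric where higher is worse (e.g. open bug count) to a KPI color."""
    if value <= green_max:
        return "green"
    if value <= yellow_max:
        return "yellow"
    return "red"

open_bugs = 37  # made-up current value pulled from the cube
print(traffic_light(open_bugs, green_max=20, yellow_max=50))  # → yellow
```

    For a metric where higher is better, such as code coverage, the comparison is simply inverted; the dashboard idea is the same either way.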


    I would also like to mention a new tool, Live Labs Pivot, a very interesting data visualization mechanism. There are already examples of its use for displaying data from TFS data sources.


    Forewarned is forearmed, as the proverb says. More than ever, success in building software systems depends on whether you can adequately assess the current situation on the project using analytical tools, formal metrics, and reports. If you know what is happening, you can take adequate steps to prevent failure. With the tools of Team System 2010, assessing the current situation on a project becomes a solvable task, and you can focus the team on precisely the things that affect the success of the whole project.
