Visual Studio 2019 support in PVS-Studio
Supporting Visual Studio 2019 in PVS-Studio affected several components at once: the IDE plugin itself, the command-line analysis application, the C++ and C# analyzers, and several utilities. Below I'll briefly cover the problems we ran into while supporting the new version of the IDE and how we solved them.
Before I start, I'd like to look back at the history of supporting previous Visual Studio versions; it will give you a better idea of how we see the task and of the decisions we made in certain situations.
Starting with the first version of PVS-Studio that shipped a plugin for the Visual Studio environment (the version supported back then was Visual Studio 2005), supporting new versions of this IDE was a fairly trivial task for us: it mostly boiled down to updating the plugin's project file and the dependencies on the various Visual Studio extension APIs. Occasionally we also had to support new C++ language features that the Visual C++ compiler was gradually learning, but that usually caused no problems either ahead of the release of the next Visual Studio edition. And PVS-Studio had only one analyzer back then, for C and C++.
Everything changed with the release of Visual Studio 2017. Besides the fact that many of the IDE's extension APIs changed significantly in that version, after the update we ran into problems ensuring backward compatibility of the then-new C# analyzer (as well as our new C++ analyzer for MSBuild projects) with older versions of MSBuild / Visual Studio.
So before reading this article, I strongly recommend the related article about Visual Studio 2017 support: "Visual Studio 2017 and Roslyn 2.0 support in PVS-Studio: sometimes using ready-made solutions is not as easy as it seems at first glance". It describes the problems we encountered last time, as well as how the various components interact (for example, PVS-Studio, MSBuild, Roslyn). Understanding that interaction will be useful when reading this article.
Solving those problems ultimately brought significant changes into our analyzer, and, as we hoped, the new approaches we adopted then would let us support updated versions of Visual Studio / MSBuild much more easily and quickly in the future - a hope the numerous Visual Studio 2017 updates largely confirmed. Did the new approach pay off when it came to supporting Visual Studio 2019? Read on.
PVS-Studio Plugin for Visual Studio 2019
It all started out well enough. Porting the plugin to Visual Studio 2019 was easy: it simply started and worked fine. Yet two problems surfaced right away, promising trouble down the road.
The first: the IVsSolutionWorkspaceService interface, used to support the Lightweight Solution Load mode (which, incidentally, had already been disabled in one of the earlier Visual Studio 2017 updates), was decorated with the Deprecated attribute. For now that produces only a build warning, but it clearly promised big problems in the future. Microsoft introduced this mode and abandoned it rather quickly... We handled this problem quite simply: we stopped using the interface.
The second: when loading Visual Studio with the plugin installed, the following message appeared: Visual Studio has detected one or more extensions that are at risk or not functioning in a future VS update.
The Visual Studio launch logs (the ActivityLog file) finally dotted the i's:
Warning: Extension 'PVS-Studio' uses the 'synchronous auto-load' feature of Visual Studio. This feature will no longer be supported in a future Visual Studio 2019 update, at which point this extension will not work. Please contact the extension vendor to get an update.
For us this meant one thing: the plugin had to switch to the asynchronous loading mode. I hope you won't mind if I don't overload you with details of interacting with Visual Studio COM interfaces and just go over the changes briefly.
Microsoft has an article on creating asynchronously loaded plugins: "How to: Use AsyncPackage to load VSPackages in the background". However, it was clear to everyone that things wouldn't be limited to the changes described there.
One of the main changes concerns the way loading, or rather initialization, is done. Previously, initialization was split between two methods: the overridden Initialize method of our class inheriting Package, and the OnShellPropertyChange method. Part of the logic had to go into OnShellPropertyChange because, when a plugin is loaded synchronously, Visual Studio itself may not yet be fully loaded and initialized, so not all the required actions can be performed at the plugin initialization stage. One way around this is to wait for Visual Studio to leave its 'zombie' state and defer those actions. That is exactly the logic that went into OnShellPropertyChange, together with a check of the 'zombie' state.
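To make this clearer, here is a minimal sketch of that old synchronous pattern, assembled from the documented VSSDK interfaces; the package name and the FinishInitialization method are illustrative, not our actual code:

using System;
using Microsoft.VisualStudio;
using Microsoft.VisualStudio.Shell;
using Microsoft.VisualStudio.Shell.Interop;

public sealed class MyPackage : Package, IVsShellPropertyEvents
{
  private uint _cookie;

  protected override void Initialize()
  {
    base.Initialize();
    var shell = (IVsShell)GetService(typeof(SVsShell));
    // Is the IDE still initializing (the 'zombie' state)?
    shell.GetProperty((int)__VSSPROPID.VSSPROPID_Zombie, out object zombie);
    if ((bool)zombie)
      shell.AdviseShellPropertyChanges(this, out _cookie); // defer the rest
    else
      FinishInitialization();
  }

  public int OnShellPropertyChange(int propid, object var)
  {
    // Wait until the 'zombie' state is over, then do the deferred work.
    if (propid == (int)__VSSPROPID.VSSPROPID_Zombie && (bool)var == false)
    {
      ((IVsShell)GetService(typeof(SVsShell))).UnadviseShellPropertyChanges(_cookie);
      FinishInitialization();
    }
    return VSConstants.S_OK;
  }

  private void FinishInitialization() { /* actions requiring a fully loaded IDE */ }
}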
In the abstract AsyncPackage class, from which asynchronously loaded plugins inherit, the Initialize method has the sealed modifier, so initialization had to move to the overridden InitializeAsync method, which is what we did. We also had to change the logic of tracking Visual Studio's 'zombie' state, since the plugin no longer received that information. Still, a number of actions that have to run after plugin initialization didn't go anywhere. The solution was to use the OnPackageLoaded method of the IVsPackageLoadEvents interface and perform the actions requiring deferred execution there.
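A rough sketch of the resulting scheme follows. The method bodies are simplified, and the exact signature of IVsPackageLoadEvents.OnPackageLoaded is my assumption for this sketch:

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.VisualStudio.Shell;
using Microsoft.VisualStudio.Shell.Interop;

[PackageRegistration(UseManagedResourcesOnly = true, AllowsBackgroundLoading = true)]
public sealed class MyAsyncPackage : AsyncPackage, IVsPackageLoadEvents
{
  // Initialize is sealed in AsyncPackage, so the work moves here.
  protected override async Task InitializeAsync(
    CancellationToken cancellationToken,
    IProgress<ServiceProgressData> progress)
  {
    await base.InitializeAsync(cancellationToken, progress);
    // Parts of initialization that touch the UI need the main thread.
    await JoinableTaskFactory.SwitchToMainThreadAsync(cancellationToken);
    // ... register commands, services, and so on
  }

  // Called when the package has finished loading - a natural home for the
  // actions we used to defer until the IDE left the 'zombie' state.
  // (The parameterless signature is an assumption made for this sketch.)
  public void OnPackageLoaded()
  {
    // ... deferred actions
  }
}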
Another problem, a logical consequence of asynchronous plugin loading, is that the PVS-Studio plugin commands are not available at Visual Studio startup. When opening the analyzer log by double-clicking it in a file manager (if it has to be opened through Visual Studio), the required version of devenv.exe was launched with a command to open the analyzer report. The launch command looked something like this:
"C:\Program Files (x86)\Microsoft Visual Studio\
2017\Community\Common7\IDE\devenv.exe"
/command "PVSStudio.OpenAnalysisReport
C:\Users\vasiliev\source\repos\ConsoleApp\ConsoleApp.plog"
The "/ command" flag here is used to invoke a command registered in Visual Studio. Now this approach did not work, since the commands were not available until the plug-in was downloaded. As a result, I had to stop on the “crutch” with parsing the launch line of devenv.exe after loading the plugin, and if there is a string representation of the command to open the log - in fact, loading the log. Thus, in this case, having refused to use the “correct” interface for working with commands, it was possible to maintain the necessary functionality by delaying the loading of the log until the plug-in is fully loaded.
Phew, sorted out at last, and everything works: everything loads and opens correctly, with no warnings. Finally.
And then the unexpected happens: Pavel (hi!) installs the plugin and asks why we still haven't made loading asynchronous.
To say we were surprised is to say nothing. How could that be? No, really: here is the new version of the plugin installed, and here is the message saying the package loads synchronously. Alexander (hi to you too) and I install the same version on our machines: everything is fine. Nothing makes sense. We decided to look at which versions of the PVS-Studio libraries were loaded in Visual Studio. And it suddenly turned out that Visual Studio 2019 was using the PVS-Studio libraries built for Visual Studio 2017, even though the VSIX package contained the correct ones, built for Visual Studio 2019.
After tinkering with VSIXInstaller, I managed to find the cause of the problem: the package cache. The theory was confirmed by the fact that, when access rights to the cached package (C:\ProgramData\Microsoft\VisualStudio\Packages) were restricted, VSIXInstaller wrote error information to the log. Surprisingly, when there is no error, no information about the package being installed from the cache gets logged at all.
Note. While studying the behavior of VSIXInstaller and the related libraries, I noted to myself how cool it is that Roslyn and MSBuild are open source, which makes it convenient to read, debug, and trace their logic.
What happened was this: when installing the plugin, VSIXInstaller saw that the corresponding package was already in the cache (the .vsix package for Visual Studio 2017 was there) and installed it instead of the package actually being installed. Why this ignores the restrictions / requirements described in .vsixmanifest (for example, the Visual Studio version the extension can be installed on) is an open question. As a result, even though .vsixmanifest contained the necessary restrictions, the plugin built for Visual Studio 2017 got installed on Visual Studio 2019.
Worst of all, such an installation broke Visual Studio's dependency graph, and although outwardly the IDE might even seem to be working fine, things were actually very bad: extensions couldn't be installed, uninstalled, or updated, and so on. The 'recovery' process was also rather unpleasant: the extension (its files) had to be deleted, and the configuration files storing information about the installed package had to be edited by hand. Not pleasant at all, in short.
To solve this problem and avoid similar situations in the future, we decided to give the new package its own GUID in order to completely separate the Visual Studio 2017 and Visual Studio 2019 packages (older packages don't have this problem; they have always shared a common GUID).
Since we're talking about unpleasant surprises, I'll mention one more: after updating to Preview 2, the menu item 'moved' under the 'Extensions' tab. It may seem like nothing, but access to the plugin's functionality became less convenient. This behavior persisted in the subsequent Visual Studio 2019 versions, including the release. At the time this 'feature' shipped, I found no mention of it in either the documentation or the blog.
Now everything seemed to work, and plugin support for Visual Studio 2019 seemed finished. The day after releasing PVS-Studio 7.02 with Visual Studio 2019 support, we found out this wasn't the case: another problem with the asynchronous plugin turned up. For a user it could look like this: when opening the window with analysis results (or starting the analysis), our window would sometimes come up 'empty', with no contents: no buttons, no table of analyzer warnings, and so on.
In fact, this problem had occasionally shown up earlier during development. But it reproduced on only one machine, and appeared only after Visual Studio was updated to one of the first 'Preview' versions, so we suspected something had broken during installation / update. Over time the problem stopped reproducing even on that machine, and we decided it had 'fixed itself'. It turned out it hadn't - we had just been lucky. Or rather, unlucky.
The issue turned out to be the initialization order of the IDE window itself (a descendant of the ToolWindowPane class) and its contents (essentially, our control with the grid and buttons). Under certain conditions the control was initialized before the pane was, and even though everything ran without errors and the FindToolWindowAsync method (which creates the window on first call) worked correctly, the control stayed invisible. We fixed this by adding lazy initialization of our control to the pane-filling code.
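In simplified form the fix looks something like this sketch (the control class name is illustrative, not our real code): the pane creates its content lazily, so it no longer matters which of the two gets initialized first:

using Microsoft.VisualStudio.Shell;

// A hypothetical stand-in for our grid-and-buttons control.
public sealed class AnalysisResultsControl : System.Windows.Controls.UserControl { }

public sealed class AnalysisResultsWindow : ToolWindowPane
{
  private AnalysisResultsControl _control;

  public override object Content
  {
    get
    {
      // The control is created only when the fully initialized pane
      // actually asks for its content.
      return _control ?? (_control = new AnalysisResultsControl());
    }
  }
}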
C# 8.0 support
Using Roslyn as the analyzer's basis has a significant advantage: we don't need to support new language constructs manually. All of that is supported and implemented in the Microsoft.CodeAnalysis libraries, and we just use the ready results. So new syntax is supported by updating the libraries.
Of course, as far as static analysis itself is concerned, everything has to be done by hand, in particular handling the new language constructs. Yes, we get the new syntax tree automatically simply by using a newer version of Roslyn, but we still have to teach the analyzer to perceive and process the new / changed nodes of the tree, as the sketch below shows.
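For illustration (this is not one of our real diagnostics), here is the kind of hook the analyzer gains after a Roslyn update; a syntax walker can now react to C# 8 switch expressions:

using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

class NewSyntaxWalker : CSharpSyntaxWalker
{
  // This override exists only after updating to a Roslyn version
  // that knows about C# 8 switch expressions.
  public override void VisitSwitchExpression(SwitchExpressionSyntax node)
  {
    // ... diagnostic logic for the new construct goes here
    base.VisitSwitchExpression(node);
  }
}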
Probably the most talked-about C# 8 feature is nullable reference types. I won't cover them here: it's a big topic worthy of a separate article (which is already being written). For now we have settled on ignoring nullable annotations in our dataflow mechanism (that is, we understand, parse, and skip them). The point is that, despite a variable's non-nullable reference type, null can still be written into it fairly easily (either deliberately or by mistake), which can lead to a NullReferenceException when the corresponding reference is dereferenced. Our analyzer can spot such an error and issue a warning about the use of a potentially null reference (if it sees such an assignment in the code, of course), despite the variable's non-nullable reference type.
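A trivial illustration of the point: the declared type is non-nullable, but null still gets in, and the dereference throws at run time:

#nullable enable
using System;

class Demo
{
  static void Main()
  {
    // The null-forgiving operator silences the compiler warning...
    string message = null!;
    // ...but at run time this still throws a NullReferenceException.
    Console.WriteLine(message.Length);
  }
}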
I should note that nullable reference types and the accompanying syntax open up the possibility of writing very interesting code. We nicknamed it 'emotional syntax'. This code compiles just fine:
obj.Calculate();
obj?.Calculate();
obj!.Calculate();
obj!?.Calculate();
obj!!!.Calculate();
By the way, along the way I found a couple of ways to crash Visual Studio using the new syntax. The point is that the number of '!' characters you append is not limited to one. That is, you can write not only code like:
object temp = null!;
but also:
object temp = null!!!;
or, if you feel like going all in, like this:
object temp = null!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!;
This code compiles successfully. But if you request information about the syntax tree using the Syntax Visualizer from the .NET Compiler Platform SDK, Visual Studio will crash.
You can get information about the problem from the Event Viewer:
Faulting application name: devenv.exe,
version: 16.0.28803.352, time stamp: 0x5cc37012
Faulting module name: WindowsBase.ni.dll,
version: 4.8.3745.0, time stamp: 0x5c5bab63
Exception code: 0xc00000fd
Fault offset: 0x000c9af4
Faulting process id: 0x3274
Faulting application start time: 0x01d5095e7259362e
Faulting application path: C:\Program Files (x86)\
Microsoft Visual Studio\2019\Community\Common7\IDE\devenv.exe
Faulting module path: C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\
WindowsBase\4480dfedf0d7b4329838f4bbf953027d\WindowsBase.ni.dll
Report Id: 66d41eb2-c658-486d-b417-02961d9c3e4f
Faulting package full name:
Faulting package-relative application ID:
If you go further and increase the number of exclamation marks severalfold, Visual Studio crashes all by itself: no help from Syntax Visualizer needed. The Microsoft.CodeAnalysis libraries and the csc.exe compiler can't digest such code either.
These are synthetic examples, of course, but I still found this behavior amusing.
Toolset
Updating the toolset was obviously going to be the most labor-intensive task. At least that's how it looked at the start; now I'm inclined to think the plugin support was the most problematic part. That was largely thanks to our existing toolset and MSBuild project-model building mechanism, which worked successfully, even though it needed extension. Not having to write the algorithms from scratch simplified things greatly. Our bet on 'our own' toolset, made back when supporting Visual Studio 2017, paid off once again.
Traditionally, it all starts with updating NuGet packages. The NuGet package management tab for the solution has an 'Update' button... which doesn't help. Updating all packages at once caused multiple version conflicts, and resolving them all didn't look like the right path. A more painful but likely more reliable way was a point-by-point update of the target Microsoft.Build / Microsoft.CodeAnalysis packages.
The diagnostic-rule tests immediately caught one difference: the structure of the syntax tree changed for one of the existing nodes. No big deal; we fixed it quickly.
Let me remind you that we test our analyzers (C#, C++, Java) on open-source projects. This lets us test diagnostic rules thoroughly: find false positives, for example, or see which cases we haven't covered yet (reduce the number of false negatives). These tests also help track possible regressions at the initial stage of updating libraries / the toolset. This time was no exception, and a number of problems surfaced.
One was a behavioral regression inside the CodeAnalysis libraries. More specifically, on a number of projects the library code threw exceptions during various operations: obtaining semantic information, opening projects, and so on.
Attentive readers of the article about Visual Studio 2017 support will remember that our distribution contains a stub: a 0-byte file named MSBuild.exe.
This time we had to go further: the distribution now also contains empty compiler stubs: csc.exe, vbc.exe, VBCSCompiler.exe. What for? It started with the analysis of one of the projects in our test base, which produced 'diffs' in reports: a number of warnings were missing when using the new analyzer version.
The problem turned out to be conditional compilation symbols: when analyzing the project with the new analyzer version, some of the symbols were extracted incorrectly. To better understand the cause, I had to dive into the Roslyn libraries.
Conditional compilation symbols are parsed by the GetDefineConstantsSwitch method of the Csc class from the Microsoft.Build.Tasks.CodeAnalysis library. Parsing is done via the String.Split method with a set of delimiters:
string[] allIdentifiers
= originalDefineConstants.Split(new char[] { ',', ';', ' ' });
This parsing works fine; all the necessary conditional compilation symbols are extracted successfully. Digging further.
The next key point is the call of the ComputePathToTool method of the ToolTask class. This method builds the path to the executable file (csc.exe) and checks whether it exists. If it does, the method returns the path to it; otherwise it returns null.
The calling code:
....
string pathToTool = ComputePathToTool();
if (pathToTool == null)
{
// An appropriate error should have been logged already.
return false;
}
....
Since there is no csc.exe file (why would we need it, after all?), pathToTool is null at this point, and the current method (ToolTask.Execute) finishes with the result false. Consequently the results of the task, including the extracted conditional compilation symbols, are ignored.
Well, let's see what happens if we put a csc.exe file in the expected location.
Now pathToTool points to an actually existing file, and execution of the ToolTask.Execute method continues. The next key point is the call of the ManagedCompiler.ExecuteTool method, which starts like this:
protected override int ExecuteTool(string pathToTool,
string responseFileCommands,
string commandLineCommands)
{
if (ProvideCommandLineArgs)
{
CommandLineArgs = GetArguments(commandLineCommands, responseFileCommands)
.Select(arg => new TaskItem(arg)).ToArray();
}
if (SkipCompilerExecution)
{
return 0;
}
....
}
The SkipCompilerExecution property is true (logically enough, since we're not actually compiling). So the calling method (the already mentioned ToolTask.Execute) checks that ExecuteTool's return code is 0 and, if so, finishes with the value true. Whatever was behind csc.exe - the real compiler or Leo Tolstoy's 'War and Peace' in text form - makes no difference.
As a result, the main problem stems from the sequence of steps being defined in this order:
- check that the compiler exists;
- check whether the compiler needs to be started;
and not the other way around. The compiler stubs solve this problem neatly; in essence, the fix amounts to the sketch below.
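A sketch of the idea (not our installer code; toolsetBinDirectory is a hypothetical path):

using System.IO;

static class CompilerStubs
{
  public static void Create(string toolsetBinDirectory)
  {
    // Empty files are enough: ComputePathToTool only checks for existence,
    // and SkipCompilerExecution keeps the 'compiler' from being started.
    foreach (string stub in new[] { "csc.exe", "vbc.exe", "VBCSCompiler.exe" })
    {
      string path = Path.Combine(toolsetBinDirectory, stub);
      if (!File.Exists(path))
        File.Create(path).Dispose(); // a 0-byte stub
    }
  }
}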
But how did the conditional compilation symbols get extracted successfully before, when the csc.exe file was missing (and the task's result was ignored)?
There is a method for that case as well: CSharpCommandLineParser.ParseConditionalCompilationSymbols from the Microsoft.CodeAnalysis.CSharp library. Its parsing is also done with the String.Split method and a set of delimiters:
string[] values
= value.Split(new char[] { ';', ',' } /*,
StringSplitOptions.RemoveEmptyEntries*/);
Notice the difference from the delimiter set in the Csc.GetDefineConstantsSwitch method? Here a whitespace is not a delimiter. So conditional compilation symbols written with spaces between them are parsed incorrectly by this method.
That was exactly the situation with the problematic projects: their conditional compilation symbols were space-separated, so they were parsed successfully by GetDefineConstantsSwitch, but not by ParseConditionalCompilationSymbols.
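The discrepancy is easy to reproduce with the two Split calls taken from the methods above:

using System;

class SplitDemo
{
  static void Main()
  {
    // Conditional compilation symbols written with a space between them:
    string defines = "DEBUG;TRACE MY_SYMBOL";

    // Csc.GetDefineConstantsSwitch: ',', ';' and ' ' are delimiters.
    string[] fromCsc = defines.Split(new char[] { ',', ';', ' ' });
    Console.WriteLine(string.Join(" | ", fromCsc));
    // -> DEBUG | TRACE | MY_SYMBOL

    // CSharpCommandLineParser.ParseConditionalCompilationSymbols:
    // only ';' and ',' are delimiters.
    string[] fromParser = defines.Split(new char[] { ';', ',' });
    Console.WriteLine(string.Join(" | ", fromParser));
    // -> DEBUG | TRACE MY_SYMBOL  ("TRACE MY_SYMBOL" is not a valid symbol)
  }
}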
Another problem that revealed itself after the library update was regressed behavior in a number of cases, in particular on projects that failed to build. The problems arose in the Microsoft.CodeAnalysis libraries and came back to us as various exceptions: ArgumentNullException (some internal logger was not initialized), NullReferenceException, and others.
I'd like to dwell on one of these problems below: it struck me as quite interesting.
We hit this problem when checking the latest version of the Roslyn project: a NullReferenceException was thrown from the code of one of the libraries. Thanks to detailed information about the problem's location, we quickly found the offending code and, out of curiosity, decided to check whether the problem reproduced when working from Visual Studio.
It did reproduce in Visual Studio (the experiment was conducted on Visual Studio 16.0.3). To see it, you need a class definition like this:
class C1<T1>
{
void foo()
{
T1 val = default;
if (val is null)
{ }
}
}
We'll also need Syntax Visualizer (part of the .NET Compiler Platform SDK). Request the TypeSymbol (menu item 'View TypeSymbol (if any)') for the syntax tree node of type ConstantPatternSyntax (the null). After that Visual Studio restarts, and the Event Viewer shows information about the problem, in particular the stack trace:
Application: devenv.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.NullReferenceException
at Microsoft.CodeAnalysis.CSharp.ConversionsBase.
ClassifyImplicitBuiltInConversionSlow(
Microsoft.CodeAnalysis.CSharp.Symbols.TypeSymbol,
Microsoft.CodeAnalysis.CSharp.Symbols.TypeSymbol,
System.Collections.Generic.HashSet'1
ByRef)
at Microsoft.CodeAnalysis.CSharp.ConversionsBase.ClassifyBuiltInConversion(
Microsoft.CodeAnalysis.CSharp.Symbols.TypeSymbol,
Microsoft.CodeAnalysis.CSharp.Symbols.TypeSymbol,
System.Collections.Generic.HashSet'1
ByRef)
at Microsoft.CodeAnalysis.CSharp.CSharpSemanticModel.GetTypeInfoForNode(
Microsoft.CodeAnalysis.CSharp.BoundNode,
Microsoft.CodeAnalysis.CSharp.BoundNode,
Microsoft.CodeAnalysis.CSharp.BoundNode)
at Microsoft.CodeAnalysis.CSharp.MemberSemanticModel.GetTypeInfoWorker(
Microsoft.CodeAnalysis.CSharp.CSharpSyntaxNode,
System.Threading.CancellationToken)
at Microsoft.CodeAnalysis.CSharp.SyntaxTreeSemanticModel.GetTypeInfoWorker(
Microsoft.CodeAnalysis.CSharp.CSharpSyntaxNode,
System.Threading.CancellationToken)
at Microsoft.CodeAnalysis.CSharp.CSharpSemanticModel.GetTypeInfo(
Microsoft.CodeAnalysis.CSharp.Syntax.PatternSyntax,
System.Threading.CancellationToken)
at Microsoft.CodeAnalysis.CSharp.CSharpSemanticModel.GetTypeInfoFromNode(
Microsoft.CodeAnalysis.SyntaxNode, System.Threading.CancellationToken)
at Microsoft.CodeAnalysis.CSharp.CSharpSemanticModel.GetTypeInfoCore(
Microsoft.CodeAnalysis.SyntaxNode, System.Threading.CancellationToken)
....
As you can see, the problem is caused by a null reference dereference.
As I mentioned earlier, we ran into the same problem while testing the analyzer. If you build the analyzer with the debug versions of the Microsoft.CodeAnalysis libraries, you can get to the exact spot by requesting the TypeSymbol of the relevant syntax tree node in the debugger.
This leads us to the ClassifyImplicitBuiltInConversionSlow method mentioned in the stack trace above:
private Conversion ClassifyImplicitBuiltInConversionSlow(
TypeSymbol source,
TypeSymbol destination,
  ref HashSet<DiagnosticInfo> useSiteDiagnostics)
{
Debug.Assert((object)source != null);
Debug.Assert((object)destination != null);
if (source.SpecialType == SpecialType.System_Void ||
destination.SpecialType == SpecialType.System_Void)
{
return Conversion.NoConversion;
}
Conversion conversion
= ClassifyStandardImplicitConversion(source, destination,
ref useSiteDiagnostics);
if (conversion.Exists)
{
return conversion;
}
return Conversion.NoConversion;
}
The problem is that the destination parameter is null here, so calling destination.SpecialType throws a NullReferenceException. Yes, there is a Debug.Assert call above the dereference, but it isn't enough: it effectively protects against nothing, since it only helps surface the problem in debug versions of the libraries. Or, in this case, doesn't help.
Changes in building the model of C++ projects
Nothing particularly interesting happened here: the old algorithms needed no significant modifications worth describing. There are, perhaps, two points worth dwelling on.
First, we had to modify the algorithms that relied on ToolsVersion being written in numeric format. Without going into details: there are several cases where toolsets have to be compared, for example, to pick the most recent version, which accordingly had the greater numeric value. We expected that the ToolsVersion for the new version of MSBuild / Visual Studio would be 16.0. No such luck... Just for fun, here is a table of how the values of various properties changed across Visual Studio versions:
Visual Studio product name | Visual Studio version number | ToolsVersion | PlatformToolset version
Visual Studio 2010 | 10.0 | 4.0 | 100
Visual Studio 2012 | 11.0 | 4.0 | 110
Visual Studio 2013 | 12.0 | 12.0 | 120
Visual Studio 2015 | 14.0 | 14.0 | 140
Visual Studio 2017 | 15.0 | 15.0 | 141
Visual Studio 2019 | 16.0 | Current | 142
The joke is old, of course, but you can't help recalling the version changes of Windows and Xbox to realize that predicting Microsoft's future values (be it a name or a version number) is an unreliable business. :)
The solution was simple enough: introduce toolset prioritization (that is, single out priority as a separate entity). The sketch below conveys the idea.
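In sketch form (our real code is structured differently), the idea is to map each known ToolsVersion to an explicit priority instead of comparing raw numeric values:

using System;
using System.Globalization;

static class ToolsetPriority
{
  public static int Get(string toolsVersion)
  {
    // 'Current' (Visual Studio 2019) outranks any numeric ToolsVersion.
    if (string.Equals(toolsVersion, "Current", StringComparison.OrdinalIgnoreCase))
      return int.MaxValue;

    return double.TryParse(toolsVersion, NumberStyles.Float,
                           CultureInfo.InvariantCulture, out double value)
      ? (int)(value * 10)   // 15.0 -> 150, 16.0 -> 160, ...
      : 0;                  // unknown toolsets get the lowest priority
  }
}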
The second point: problems when working in Visual Studio 2017, or in an adjacent environment (for example, when the VisualStudioVersion environment variable is set). They stem from the fact that computing the parameters needed to build a C++ project model is a much more complicated task than building a .NET project model. For .NET we use our own toolset and the corresponding ToolsVersion value. For C++ we can rely both on our own toolset and on the ones installed in the system. Starting with the Build Tools for Visual Studio 2017, toolsets are written in the MSBuild.exe.config file rather than in the registry. So we can't get them from the global list of toolsets (through Microsoft.Build.Evaluation.ProjectCollection.GlobalProjectCollection.Toolsets, for example), unlike the toolsets recorded in the registry (those for <= Visual Studio 2015).
As a consequence, building a project model with ToolsVersion 15.0 simply won't work: the system won't see the required toolset. The newest toolset, Current, would still be available, since it is our own toolset, so there's no such problem for Visual Studio 2019. The solution turned out to be fairly simple and let us solve the problem without changing the existing model-building algorithms: alongside our own Current toolset, we added another one to the list, 15.0; the sketch below shows the gist of the wiring.
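A sketch using real MSBuild APIs but simplified wiring (ourToolsetPath is hypothetical):

using Microsoft.Build.Evaluation;

static class OwnToolsets
{
  public static void Register(ProjectCollection collection, string ourToolsetPath)
  {
    // The same toolset is registered under both names, so projects asking
    // for either ToolsVersion resolve to our own toolset.
    foreach (string toolsVersion in new[] { "Current", "15.0" })
    {
      collection.AddToolset(new Toolset(toolsVersion, ourToolsetPath,
                                        collection, msbuildOverrideTasksPath: null));
    }
  }
}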
Changes in building the model of C# .NET Core projects
This task solved two problems at once, since they turned out to be related:
- after adding the 'Current' toolset, analysis of .NET Core projects stopped working in Visual Studio 2017;
- analysis of .NET Core projects did not work on systems without at least one installed version of Visual Studio.
The problem in both cases was the same: some of the base .targets / .props files were looked up at wrong paths. As a result, a project model could not be built using our toolset.
With no Visual Studio installed, you would see an error like this (with the previous toolset version, 15.0):
The imported project "C:\Windows\Microsoft.NET\Framework64\
15.0\Microsoft.Common.props" was not found.
When building the model of a C# .NET Core project in Visual Studio 2017, you would see the following problem (with the current toolset version, Current):
The imported project
"C:\Program Files (x86)\Microsoft Visual Studio\
2017\Community\MSBuild\Current\Microsoft.Common.props" was not found.
....
Since the problems have the same origin (or so it looks), we could try killing two birds with one stone.
Below I describe how we solved this problem, without going into technical details. Those details (on building the models of C# .NET Core projects, as well as the changes to model building in our toolset) are coming in one of our future articles. By the way, if you've read the text above carefully, you may have noticed this is the second reference to future articles. :)
So how did we solve the problem? We expanded our own toolset with the main .targets / .props files from the .NET Core SDK (Sdk.props, Sdk.targets). That gave us more control over the situation and more flexibility both in managing imports and in building the .NET Core project model in general. Yes, our toolset grew a bit again, and we also had to add logic for setting up the environment required to build .NET Core project models, but it looks like it was worth it.
Previously, building the model of .NET Core projects worked like this: we simply requested the build, and MSBuild did the rest.
Now that we've taken more control into our own hands, it looks a bit different:
- prepare the environment required for building the model of .NET Core projects;
- build the model:
  - start the build using the .targets / .props files from our toolset;
  - continue the build using the external files.
The steps above make the two main goals of setting up the environment obvious:
- initiate model building with the .targets / .props files from our own toolset;
- redirect all further operations to the external .targets / .props files.
A special library, Microsoft.DotNet.MSBuildSdkResolver, is used to locate the .targets / .props files needed to build the model of .NET Core projects. To initiate the build with the files from our toolset, we use a special environment variable honored by that library, telling it where to import the necessary files from (our toolset). Since the library ships with our distribution, there is no fear that its logic will suddenly change and break things.
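As a sketch, and assuming the variable in question is MSBuildSDKsPath (the one the SDK resolver consults when looking for SDK files; ourToolsetSdksDirectory is hypothetical):

using System;

static class DotNetCoreEnvironment
{
  public static void Prepare(string ourToolsetSdksDirectory)
  {
    // Point the SDK resolver at the Sdk.props / Sdk.targets copies that
    // ship inside our toolset: evaluation then starts from our files,
    // which in turn import the external ones.
    Environment.SetEnvironmentVariable("MSBuildSDKsPath", ourToolsetSdksDirectory);
  }
}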
Now the Sdk files are imported from our toolset first, and since we can easily change them, we control all further model-building logic. That means we can now decide for ourselves which files get imported and from where. The same goes for the Microsoft.Common.props mentioned above. We import this and other base files from our own toolset, confident of their availability and contents.
After that, once the needed imports are done and a number of properties are set, we pass further control of model building to the actual .NET Core SDK, where all the remaining actions take place.
Conclusion
Overall, supporting Visual Studio 2019 went more smoothly than supporting Visual Studio 2017, which, as I see it, comes down to several factors. First, Microsoft didn't change as many things as it did between Visual Studio 2015 and Visual Studio 2017. Yes, the main toolset was changed and Visual Studio plugins were steered toward asynchrony, but still. Second, we already had a ready solution with our own toolset and project-model building: there was no need to invent everything from scratch; extending the existing solution was enough. The relative ease of supporting .NET Core project analysis under the new conditions (including analysis on machines without installed Visual Studio instances) by extending our model-building system also suggests that we made the right choice back then.
Still, I'd like to repeat a thought from the previous article: sometimes using ready-made solutions is not as easy as it seems at first glance.