How to merge file contents with PowerShell, and not suffer

    How it was supposed to go

    Once I needed to merge a pile of text files in one directory into a single file. I didn't want to do it by hand, and, as always, Google came to the rescue. I had heard a lot about the power of a tool like PowerShell and decided to use it for this mega-task, if only because I know the wretchedness of cmd.exe first-hand. Besides, doing it by hand is not our way.

    Something went wrong

    Google told me that you can do this with one simple command:

    Get-ChildItem -Filter *.log | Get-Content | Out-File result.txt

    "Really cool! Almost the Unix way!" I thought. I copied the command, modified it slightly, and pressed Enter. The cursor moved to a new line... and nothing more. In a file manager I opened the resulting file: there really was something like the desired result, with many lines from the source files. Returning to the console, I saw that the process was still... in progress. I pressed Ctrl+C.

    Looking closely at the file size, I saw that it was suspiciously large: it exceeded 100 megabytes, although the source data was nowhere near that big.

    Why did this happen?

    It was all down to my "slight modification". I simply didn't need the extension filter, and since that parameter is optional, I removed it. As a result, the command created the output file, saw it in the directory, read it, and appended its contents to its own end, and kept doing so until I pressed Ctrl+C. At first I couldn't explain the continuous growth of the output file to myself.
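The mechanism is easy to model outside PowerShell. Below is a minimal Python sketch of the same mistake (all file and variable names are invented for the illustration): the output file is created before the directory is listed, so it appears among its own inputs, and reading it while appending to it never reaches the end. A `limit` guard stops the sketch; the real command never stops.

```python
import os
import tempfile

def merge_in_place(workdir, limit=1000):
    """Copy every file in workdir into result.txt *inside the same
    directory* -- a Python analogue of the unfiltered pipeline.
    limit is a safety guard; the real command never stops."""
    out_path = os.path.join(workdir, "result.txt")
    out = open(out_path, "w")                  # output exists before listing
    written = 0
    for name in sorted(os.listdir(workdir)):   # result.txt is listed too
        with open(os.path.join(workdir, name)) as src:
            for line in src:                   # keeps yielding new lines...
                out.write(line)                # ...that we ourselves append
                out.flush()
                written += 1
                if written >= limit:
                    out.close()
                    return written
    out.close()
    return written

workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "hello.txt"), "w") as f:
    f.write("Hello world\n")

n = merge_in_place(workdir)
print(n)  # 1000: the guard fired; one input line became a thousand
```

The reader of result.txt stays a constant few bytes behind the writer, so it never sees end-of-file, which is exactly why the PowerShell pipeline never terminated.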

    I repeated it under "sterile" conditions. For simplicity and purity of the experiment I worked in a separate directory, since I didn't want to wreck my working machine.

    1. I create a text file:

      echo "Hello world" > hello.txt

    2. I execute the command:

      Get-ChildItem | Get-Content | Out-File result.txt

      or, in short form:

      dir | cat | Out-File result.txt

      The problem repeats: the resulting file grows, replenished with lines from the source (or with lines from itself?). After 10 seconds of execution:

      • one line of the source file turns into 400 thousand lines;
      • the file size grows from 11 bytes to almost 8 megabytes;
      • the CPU is loaded at about 20-25%;
      • there is no overload of the disk subsystem or RAM; apparently, PowerShell is well optimized for working with them. :)

    It is also interesting that if you pass the name of the only file in the directory as the parameter to the last command, then, as you have no doubt guessed... drum roll... emptiness is written to the file!
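That emptiness is plausibly explained by truncation: the output file is opened for writing, and thus emptied, before anything gets around to reading it. A Python sketch of the same effect (file names invented for the illustration):

```python
import os
import tempfile

workdir = tempfile.mkdtemp()
path = os.path.join(workdir, "hello.txt")
with open(path, "w") as f:
    f.write("Hello world\n")

# Opening the target for writing truncates it on the spot; the "merge"
# then reads the already-empty file and writes that emptiness back.
out = open(path, "w")            # hello.txt is now 0 bytes
data = open(path).read()         # reads the now-empty file: ""
out.write(data)                  # writes nothing
out.close()

print(repr(open(path).read()))  # ''
```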

    That's some "interesting" logic of work.

    What happened

    The file created in the first step starts to grow. This behavior is unpredictable, to say the least.
    I was also surprised that the operating system keeps working normally: the file grows slowly (or does it?) without blocking the user's work.

    Why is it dangerous

    Invisible filling of disk space.

    How to avoid

    Filter the list of input files:

    Get-ChildItem -Filter *.log | Get-Content | Out-File result.txt

    But this will not save you if both the input files and the output file match your filter. It is more reliable to exclude the output file explicitly (for example, Get-ChildItem -Exclude result.txt) or to write the result outside the directory being merged.
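The same defensive idea, sketched in Python with invented names: take a snapshot of the input list before the output file exists, and exclude the output path explicitly, so it can never become one of its own inputs.

```python
import os
import tempfile

workdir = tempfile.mkdtemp()
for name in ("a.log", "b.log"):
    with open(os.path.join(workdir, name), "w") as f:
        f.write(name + "\n")

out_path = os.path.join(workdir, "result.txt")

# Snapshot the input list *before* creating the output, and exclude the
# output path explicitly: no feedback loop is possible.
inputs = [os.path.join(workdir, name)
          for name in sorted(os.listdir(workdir))
          if os.path.join(workdir, name) != out_path]

with open(out_path, "w") as out:
    for p in inputs:
        with open(p) as src:
            out.write(src.read())

merged = open(out_path).read()
print(merged)  # the two source lines, and nothing else
```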


    I am using PowerShell version 5.1.17134.407. By the way, while trying to find that out, I tried every method that logic and common sense suggested (namely, flags like -Version, --version, -v, -h), but none of them helped. As always, Stack Overflow came to the rescue. Here is how to get the PowerShell version:

      $PSVersionTable.PSVersion


    That answer has collected almost 3,000 upvotes! Fewer, of course, than the answer to the question of how to exit vim, but quite telling all the same!

    All in all, PowerShell really is a powerful thing (at least compared to cmd.exe), and I will, of course, keep using it.
