PowerShell - Low priority remote file transfer
I am writing a PowerShell script that needs to download many large files from many servers (i.e. Windows Server hosts) at once. Many of these servers may be used for important activities, so it is important that the transfer method can fade into the background when load on the servers starts to get high.



Here are 3 transfer methods I know of along with reasons why I don't think they're good enough for the job.



BitsTransfer: The first thing that came to mind was BitsTransfer (https://docs.microsoft.com/en-us/powershell/module/bitstransfer/start-bitstransfer?view=win10-ps); however, there is a fatal issue with this approach: it can't transfer files that are open for writing by other processes, and many of the files that need to be downloaded from the servers will be in that state. See the question "start-bitstransfer : The process cannot access the file because it is being used by another process" for that issue.
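For concreteness, the BITS call that would otherwise fit the requirement is sketched below (paths and display name are hypothetical); it fails with the error above whenever the source file is open for writing:

```powershell
# Sketch only: Start-BitsTransfer supports -Priority Low, which is exactly the
# "fade into the background" behaviour wanted -- but it errors on in-use files.
Import-Module BitsTransfer
Start-BitsTransfer -Source '\\server01\logs\app.log' `
                   -Destination 'C:\staging\app.log' `
                   -Priority Low `
                   -DisplayName 'background-pull'
```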



Copy-Item: This gives no control over its priority, so a large file transfer could grind a server to a halt while it is processing an important load.



Robocopy: Again, I see no mention of priority or background processing.
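One partial counterpoint worth recording: robocopy does expose /IPG:n (inter-packet gap, in milliseconds), which throttles network bandwidth, though it does nothing for CPU or disk priority on the server, so it only partly meets the requirement. A hypothetical invocation:

```powershell
# /IPG:50 inserts a 50 ms gap between transferred blocks, capping bandwidth;
# /Z enables restartable mode for large files. Paths are hypothetical.
robocopy \\server01\logs C:\staging app.log /IPG:50 /Z
```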










Tags: windows, powershell, file-transfer, windows-server-2008-r2






      asked Dec 18 '18 at 15:57









      Lost Crotchet

          1 Answer






          You should look into multi-threading, parallel processing, and PowerShell's Background Jobs, Runspace Jobs, and Thread Jobs for your use case.




          https://randombrainworks.com/2018/01/28/powershell-background-jobs-runspace-jobs-thread-jobs




          Your code should first check whether each file is in use; if it is, skip it and place it in a collection that you come back to later for a retry.
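A minimal sketch of that check (Test-FileLocked, $sourceFiles, and $destinationDir are my own hypothetical names):

```powershell
# Try to open the file exclusively; an IOException means another process has it.
function Test-FileLocked {
    param([string]$Path)
    try {
        $fs = [System.IO.File]::Open($Path, 'Open', 'Read', 'None')
        $fs.Close()
        return $false
    } catch [System.IO.IOException] {
        return $true
    }
}

$retryQueue = [System.Collections.Generic.Queue[string]]::new()
foreach ($file in $sourceFiles) {          # $sourceFiles: hypothetical input list
    if (Test-FileLocked -Path $file) {
        $retryQueue.Enqueue($file)         # come back to these later
    } else {
        Copy-Item -Path $file -Destination $destinationDir
    }
}
```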



          That said, you also seem to be asking for dynamic throttling based on the server's resource consumption; that would require separate code to check the resource state before taking any action, and IMHO queueing is about as close as you'd get.




          https://dille.name/blog/2015/09/08/processing-a-queue-using-parallel-powershell-jobs-with-throttling
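The queueing pattern from that post boils down to something like this sketch ($servers, $destinationDir, and the share path are hypothetical):

```powershell
# Throttle: never run more than $maxJobs copy jobs at once, so the combined
# load stays bounded; each background job pulls one file.
$maxJobs = 4
foreach ($server in $servers) {
    while ((Get-Job -State Running).Count -ge $maxJobs) {
        Start-Sleep -Seconds 2
    }
    Start-Job -ScriptBlock {
        param($Source, $Destination)
        Copy-Item -Path $Source -Destination $Destination
    } -ArgumentList "\\$server\share\big.dat", $destinationDir
}
Get-Job | Wait-Job | Receive-Job
Get-Job | Remove-Job
```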







                answered Dec 18 '18 at 23:23









                postanote
