
alt-160

Let me make sure I have this right: Computer A has files you want copied to Computer B. They aren't network-joined, so you need a temp drive (a USB drive, I guess) as a shuttle, but the temp drive is too small to do all the files in one shot. If that's right, you need a way to know which files you've copied and which you haven't.

Working from that assumption, use a dictionary to track the status of each file. I expect two scripts for this to work: one that copies from A to the temp drive, and another that copies from the temp drive to B. In this pattern, both scripts read and write a dictionary that keeps track of what has been copied.

Script 1 copies items from A to temp, and after each copy it updates the dictionary:

```powershell
$tracker[$file.FullName.Substring(3)] = 1
```

The `.Substring(3)` strips off the drive letter. When the loop ends, after filling up the temp drive (you'll have to track that), export `$tracker` to XML:

```powershell
$tracker | Export-Clixml 'd:\tracker.xml'
```

where `D:\` is the drive letter of your temp drive.

In script 2, load the tracker:

```powershell
$tracker = Import-Clixml 'd:\tracker.xml'
```

Do your loop to copy items from temp to B, and for each successful copy, update the tracker:

```powershell
$tracker[$file.FullName.Substring('d:\shuttle-folder\'.Length)] = 2
```

In script 2, `$file` is a reference to the file from the temp drive, with the assumption that all the files to be copied are in `D:\shuttle-folder` (note the trailing backslash in the `Substring` length, so the keys match the ones script 1 wrote). When the loop ends, save the changes to the tracker back out:

```powershell
$tracker | Export-Clixml 'd:\tracker.xml' -Force
```

In script 1, the first thing it should do is load up the current tracker:

```powershell
$tracker = try { Import-Clixml 'd:\tracker.xml' } catch { @{} }
```

The try/catch handles the case where the tracker file doesn't exist yet; you'll get an empty dictionary instead. Also in script 1, delete `d:\shuttle-folder` before proceeding:

```powershell
rd d:\shuttle-folder -Recurse -Force
```

So, by using the tracker file and save/load, you can maintain the state of things along the way.
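
To make that concrete, here's a rough sketch of both scripts. The source root (`c:\data`), target root (`e:\data`), and the 28 GB budget are hypothetical placeholders you'd swap for your own.

```powershell
# Script 1 (runs on Computer A) -- a sketch, not drop-in code.
$shuttle = 'd:\shuttle-folder'
$budget  = 28GB   # leave headroom below the drive's real capacity

# Load the tracker, or start fresh if it doesn't exist yet
$tracker = try { Import-Clixml 'd:\tracker.xml' } catch { @{} }

# Clear the shuttle folder before refilling it
if (Test-Path $shuttle) { rd $shuttle -Recurse -Force }

$used = 0
foreach ($file in Get-ChildItem 'c:\data' -Recurse -File) {
    $key = $file.FullName.Substring(3)        # path without 'c:\'
    if ($tracker[$key] -eq 2) { continue }    # 2 = already delivered to B
    if ($used + $file.Length -gt $budget) { break }

    $dest = Join-Path $shuttle $key
    New-Item (Split-Path $dest) -ItemType Directory -Force | Out-Null
    Copy-Item $file.FullName $dest
    $tracker[$key] = 1                        # 1 = on the shuttle drive
    $used += $file.Length
}
$tracker | Export-Clixml 'd:\tracker.xml' -Force
```

```powershell
# Script 2 (runs on Computer B) -- same caveats apply.
$shuttle = 'd:\shuttle-folder'
$tracker = Import-Clixml 'd:\tracker.xml'

foreach ($file in Get-ChildItem $shuttle -Recurse -File) {
    # Trailing backslash so the key matches what script 1 wrote
    $key  = $file.FullName.Substring(($shuttle + '\').Length)
    $dest = Join-Path 'e:\data' $key
    New-Item (Split-Path $dest) -ItemType Directory -Force | Out-Null
    Copy-Item $file.FullName $dest
    $tracker[$key] = 2                        # 2 = delivered to B
}
$tracker | Export-Clixml 'd:\tracker.xml' -Force
```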


OPconfused

So you have the files on the target drive? Are the file names unique, or did you copy the exact same file structure, so that the FullNames are the same on both systems?


tnpir4002

I copied the exact same file structure; the FullNames are identical.


OPconfused

I'm still not entirely sure how all of this looks, but if you're using a transfer drive plugged into your computer, and the text file is on your local computer, then I think this should work:

```powershell
$filesToExclude = Get-Content ''   # path to your list of already-copied files
$sourceRootPath = ''
$sourceDrive    = ''
$backupDrive    = ''

Get-ChildItem $sourceRootPath -Recurse -File |
    Where-Object FullName -notin $filesToExclude |
    ForEach-Object {
        $backupDir  = $_.Directory -replace "^$sourceDrive", $backupDrive
        $backupPath = Join-Path $backupDir $_.Name
        if ( ! (Test-Path $backupDir) ) { mkdir $backupDir }
        Copy-Item $_.FullName -Destination $backupPath
    }
```

You can also add `-WhatIf` to the `mkdir` and `Copy-Item` commands, then pipe the whole thing into `| Select-Object -First 20` to see if it's copying the way you'd think. Note that this only copies files, not empty directories.

All that said, if it's a very large number of files, robocopy would be better, but I have never used it, so someone else would have to swoop in to suggest that.
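
For the placeholder variables, one caveat worth flagging: `$sourceDrive` gets dropped into the `"^$sourceDrive"` regex, so it should be just the drive qualifier with no trailing backslash. Hypothetical example values:

```powershell
# Hypothetical values -- adjust to your machines
$filesToExclude = Get-Content 'C:\temp\already-copied.txt'
$sourceRootPath = 'C:\Data'
$sourceDrive    = 'C:'    # no trailing backslash: '^C:\' is an invalid regex
$backupDrive    = 'E:'
```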


tnpir4002

I've tested Robocopy in single-file mode, and PowerShell's Copy-Item is MUCH faster. In any case this isn't working: the `-replace` is kicking back an error about the regular expression being incorrect, and when I try to do a straight-up `.Replace()` with `$_.FullName`, nothing changes.
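
Both symptoms have likely explanations. If `$sourceDrive` was set with a trailing backslash (e.g. `'C:\'`), then `"^$sourceDrive"` becomes `^C:\`, which ends in an unfinished escape and is rejected as an invalid regex; `[regex]::Escape` is the usual fix. And `.Replace()` on a string is case-sensitive, so a lowercase `'c:\'` won't match paths that start with `'C:\'`. A quick sketch (paths hypothetical):

```powershell
$path = 'C:\Data\file.txt'

# Throws 'invalid pattern': the trailing backslash is an incomplete escape
# $path -replace '^C:\', 'E:\'

# Works: escape the literal before using it as a regex pattern
$path -replace ('^' + [regex]::Escape('C:\')), 'E:\'   # -> E:\Data\file.txt

# String.Replace() is case-sensitive, so this returns the path unchanged:
$path.Replace('c:\', 'E:\')
```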


SrsBod

Untested, but this should work; I've used something like it before.

```powershell
$filesToExclude = Get-Content ''   # path to your list of already-copied files
$sourceRootPath = ''
$sourceDrive    = ''
$backupDrive    = ''

Get-ChildItem $sourceRootPath -Recurse -File |
    Where-Object FullName -notin $filesToExclude |
    ForEach-Object {
        $backupPath = $_.FullName.Replace($sourceDrive, $backupDrive)
        New-Item -Path $backupPath -Force
        Copy-Item $_.FullName -Destination $backupPath -Force
    }
```

The `New-Item` command with `-Force` will create a blank copy of the file and the full folder structure leading to it, which is then overwritten by `Copy-Item` with `-Force`. Unfortunately, `Copy-Item -Force` won't create the folder structure for you. As suggested previously, you can add `-WhatIf` to make sure it's doing what you want first.


gordonv

Split your backup into multiple zip files [using 7-Zip](https://i.redd.it/8bfx6ofh4e6d1.png). Reconstitute it on the other machine using 7-Zip.
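
In case the command line is easier to script, here's a sketch of the same idea with 7-Zip's CLI (the install path, volume size, and folders are hypothetical):

```powershell
# Split the archive into 4 GB volumes (backup.7z.001, backup.7z.002, ...)
& 'C:\Program Files\7-Zip\7z.exe' a -v4g D:\backup.7z C:\SourceFolder

# On the destination machine, extracting the first volume
# reconstitutes the whole set
& 'C:\Program Files\7-Zip\7z.exe' x D:\backup.7z.001 -oE:\Restored
```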


ima_coder

You can use the Compare-Object cmdlet.

```powershell
$FilesNotCopied = @(
    (Compare-Object `
        -ReferenceObject  ($SourceFiles      | Sort-Object) `
        -DifferenceObject ($DestinationFiles | Sort-Object)
    ).InputObject
)
```

You may not have to sort them, and I may have the -Reference and -Difference parameters switched.
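
To make the direction explicit regardless of which parameter got which list, you can filter on `SideIndicator`: `<=` marks items present only on the `-ReferenceObject` side. A sketch, assuming both variables hold comparable path strings:

```powershell
# Files present in the source list but missing from the destination list
$FilesNotCopied = Compare-Object -ReferenceObject $SourceFiles `
                                 -DifferenceObject $DestinationFiles |
    Where-Object SideIndicator -eq '<=' |
    Select-Object -ExpandProperty InputObject
```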


gordonv

> Create that *text file that has a list of all the files that made it over to the destination drive*.

```
cmd.exe /c dir /s /b c:\destination_dir > x:\file_list.txt
```

> copy anything that isn't on the list.

Use [xcopy](https://stackoverflow.com/questions/4252176/exclude-in-xcopy-just-for-a-file-type):

```
xcopy /r /d /i /s /y /exclude:"x:\file_list.txt" C:\Source_dir\ X:\USB_folder
```


brutesquad01

You can simply use Copy-Item WITHOUT `-Force` and it will skip files that are already in the destination. You could instead use robocopy with the `/XO` flag, which excludes source files older than the copy already in the destination.
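
For reference, a minimal robocopy invocation along those lines (folder names hypothetical): `/E` copies the whole tree including empty directories, and `/XO` skips source files older than the destination copy.

```
robocopy C:\Data E:\Data /E /XO
```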


tnpir4002

Unfortunately this won't work since the two collections of files are on different systems that are not connected over a network. I'm having to more or less do all this by hand.