
Moving Files & Folders by Last Modified Date



Torantis
03-27-2018, 09:04 PM
Please keep it simple:


Sub snb()
    c00 = "G:\OF"     'source folder (searched recursively)
    c01 = "G:\old"    'destination folder

    'create the destination folder if it does not exist yet
    If Dir(c01, 16) = "" Then MkDir c01

    'list all .csv files in G:\OF and its subfolders, newest first
    sn = Split(CreateObject("wscript.shell").exec("cmd /c dir G:\OF\*.csv /b /s /o-d").stdout.readall, vbCrLf)

    For j = 1 To UBound(sn)
        If sn(j) = "" Then Exit Sub
        'files are sorted newest first, so stop at the first one older than 30 days
        If DateDiff("d", FileDateTime(sn(j)), Date) > 30 Then Exit Sub
        'move the file into the destination folder (note the added path separator)
        Name sn(j) As c01 & "\" & Dir(sn(j))
    Next
End Sub



This code looks in folder G:\OF and its subfolders for all .csv files, sorted by date (descending).
If a file was last saved within the past 30 days, it is moved to folder G:\Old.

This code works well and I have used it a couple of times. One piece I am trying to adjust: I do not want it to move the file to the new location, just make a copy of the file, so the data can be changed and sorted within the individual documents without impacting the original file. I am fairly new to using WScript in my code, so I do not fully understand which piece to adjust. I see that it essentially renames the file into the other directory with the same file name.

Torantis
03-27-2018, 09:31 PM
I was able to figure it out and changed it to the following:



Sub snb()
    c00 = "C:\TEST\Move"    'source folder (searched recursively)
    c01 = "C:\TEST\Data"    'destination folder for the copies

    'create the destination folder if it does not exist yet
    If Dir(c01, 16) = "" Then MkDir c01

    'list all .csv files in the source folder and its subfolders, newest first
    sn = Split(CreateObject("wscript.shell").exec("cmd /c dir C:\TEST\Move\*.csv /b /s /o-d").stdout.readall, vbCrLf)

    For j = 1 To UBound(sn)
        If sn(j) = "" Then Exit Sub
        'files are sorted newest first, so stop at the first one older than 2 days
        If DateDiff("d", FileDateTime(sn(j)), Date) > 2 Then Exit Sub
        'copy (rather than move) the file into the destination folder
        FileCopy sn(j), c01 & "\" & Dir(sn(j))
    Next

    'This code looks in folder C:\TEST\Move and its subfolders for all .csv files, sorted by date (descending).
    'If a file was last saved within the past 2 days, it is copied to folder C:\TEST\Data.
End Sub


The only other question I have about this: the code will copy several files from different folders into one folder, and the files being copied may be modified at the time of the copy by the source system (pulling from 10 different networked computers). I do not want to prevent the source system from modifying a file while it is being copied. The files store part information in a .csv file, so each part that gets run is logged into the individual batch (production order) file when the cycle is completed.
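As a hedged aside (not part of the original post): VBA's FileCopy raises a run-time error if the source file happens to be locked for writing at the moment of the copy, so trapping that error inside the loop above lets one busy file be skipped and picked up on the next run instead of stopping everything. A sketch of the same loop with that trap added:

For j = 1 To UBound(sn)
    If sn(j) = "" Then Exit Sub
    If DateDiff("d", FileDateTime(sn(j)), Date) > 2 Then Exit Sub
    On Error Resume Next                       'a locked/in-use source file raises an error here...
    FileCopy sn(j), c01 & "\" & Dir(sn(j))
    If Err.Number <> 0 Then Err.Clear          '...so clear it and catch that file on the next cycle
    On Error GoTo 0
Next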

Another thing: At least one of the folders contains 60,000+ csv files - is there a way to speed up this process?

Someone originally built a tracker which essentially does the same thing I am trying to do; however, it kept breaking in the macro, stuck looping on a process handle that never resolved. Here is a sample of what they did (it appears they were trying to set it up to wait its turn):



Public Function ShellX( _
    ByVal PathName As String, _
    Optional ByVal WindowStyle As VbAppWinStyle = vbMinimizedFocus, _
    Optional ByVal Events As Boolean = True _
    ) As Long

    'Declarations:
    Const STILL_ACTIVE = &H103&
    Const PROCESS_QUERY_INFORMATION = &H400&
    Dim ProcId As Long
    Dim ProcHnd As Long

    'Get process-handle:
    ProcId = Shell(PathName, WindowStyle)
    ProcHnd = OpenProcess(PROCESS_QUERY_INFORMATION, True, ProcId)

    'wait for process end:
    Do
        If Events Then DoEvents
        GetExitCodeProcess ProcHnd, ShellX
    Loop While ShellX = STILL_ACTIVE

    'clean up:
    CloseHandle ProcHnd

End Function
Sub testdunno()
    Dim x As Long
    x = ShellX("CMD /C robocopy \\sourcefolder$\prod\run_data C:\Tracker\Data /maxage:2", vbHide)
End Sub
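A note added for reference (not in the original post): ShellX calls the Windows API functions OpenProcess, GetExitCodeProcess and CloseHandle, so the module also needs declarations along these lines (shown here in 32-bit form; 64-bit Office needs Declare PtrSafe with LongPtr handles):

'kernel32 declarations ShellX relies on (32-bit VBA form)
Private Declare Function OpenProcess Lib "kernel32" (ByVal dwDesiredAccess As Long, ByVal bInheritHandle As Long, ByVal dwProcessId As Long) As Long
Private Declare Function GetExitCodeProcess Lib "kernel32" (ByVal hProcess As Long, lpExitCode As Long) As Long
Private Declare Function CloseHandle Lib "kernel32" (ByVal hObject As Long) As Long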

snb
03-28-2018, 12:33 AM
At least one of the folders contains 60,000+ csv files - is there a way to speed up this process?

That's the reason you should move files from this directory (and store them elsewhere) once they do not serve any purpose anymore.

SamT
03-28-2018, 07:17 AM
This and the previous three posts were moved from
http://www.vbaexpress.com/forum/showthread.php?43880

Torantis
03-28-2018, 10:18 PM
That's the reason you should move files from this directory (and store them elsewhere) once they do not serve any purpose anymore.

I completely agree; however, the company wants to store these in the same location for now due to regulations. The folder actually contains 207,416 csv files.

snb
03-29-2018, 01:09 AM
Do the filenames contain any clue to a creation date ?

Torantis
03-29-2018, 08:45 PM
Unfortunately they do not. They all have unique IDs of the form PartNum_ProductOrderNum.csv.

snb
03-30-2018, 01:47 AM
You can't programmatically reduce inefficiency created by policy.
If files were stored in two different directories you could use the 'mirror' directory to modify files and move them after modification.
So you can keep the 'mirror' directory as small as possible.

SamT
03-30-2018, 07:33 AM
Just to clarify... You said in various posts...

A
do not want it to move the file to the new location, just make a copy of the file so the data can be changed and sorted ["stored"?] within the individual documents without impacting the original file.

B
the files that will be copied will potentially be modified at the time of the copy by the source system (...) and I do not want to prevent the source system from being able to modify the file at the time of the copy.

C
the company wants to store these in the same location due to regulations at the moment.

D
files are storing part information in a .csv file so each part that is getting ran is logged into the individual batch (production order) file

E
They all have unique IDs to them that contain PartNum_ProductOrderNum.csv

F
The folder actually contains 207,416 csv files in it.

Soooooo many questions about the entire process.

In no particular order...


Where do the original Part Description files originate?
Where are the originals stored?
Where are the modified files stored?
What do the Original and the modified file names look like?
What is the difference between an original and a modified file's contents?
Can the source target folder and output folders of the modifying application be changed?
What is your ideal folder configuration?
What are the company's folder requirements?


Please step us thru the process as is from creating the original file to the end of the process when the file is no longer used. In great and verbose detail, please.

Please describe the process as you want it.

Torantis
04-02-2018, 03:22 AM
Answers to your questions:

· Where do the original Part Description files originate?
· The original part description files are created when the paperwork is scanned at the workstation and then saved directly to the run folder.


· Where are the originals stored?
· All files are stored in a single folder.


· Where are the modified files stored?
· Files are created, modified, and stored in the same folder. (The files are not stored or copied to any other location.)


· What do the Original and the modified file names look like?
· Files are named Part Number_Batch Number.csv (e.g. H749##########_21917063.csv).


· What is the difference between an original and a modified file's contents?
· The difference between the start and end of the file is the addition of each part run within that batch. Every time a part is run, the file is updated with the run parameters of that part.


· Can the source target folder and output folders of the modifying application be changed?
· Yes and no - they can be changed; however, I cannot simply change them without updating validation documents and re-qualifying the system (something I have been working on). I would actually like to create an archive folder that files older than a specific age get moved into, reducing the file count within the run data folders.


· What is your ideal folder configuration?
· A run_data folder and an archive folder.


· What is the company's folder requirements?
· The company's folder requirements are fairly simple with regard to the file structure: the system owner is able to specify the structure based on their application needs. The twist is that once the structure is configured, making any change to it requires several documents to be updated and approved and the systems to be tested for functionality in a test environment before the change can take place. That is not impossible to do, but the existing "file system" has been working for the application that accesses and controls the files, and the project I am working on is for tracking purposes only - to monitor and display throughput, cycle times, and efficiencies.
Honestly, I would like all the information to be dumped or pulled into a SQL database via a script so it can be looked at, sorted, manipulated, or whatever is needed in the end. That work is in progress, but again, many view it as "if it is not broken, don't fix it". On the other hand, I have done many different things with Excel and always try to figure out how I can do more. This one actually has me stumped right now, and I am beginning to think that with the current structure it is going to be hard to accomplish.

The Excel sheet was running fine until recently, when corporate IT decided to make changes (I am uncertain what changes they made because they are not giving any specific answers).

This is basically the procedure before the changes were made:


Sub RunBoardProgram()
    SendKeys "{HOME}"

    Application.ScreenUpdating = False
    Workbooks("All Folder hour by hour.xlsm").Worksheets("HourBoards").Activate

    Range("V39") = Now()
    ClearData
    CreateFolder
    ShellProcedure
    SortData
    FindShift
    ChangeOvers
    FillBoards
    FillCycleTime
    Display_Format
    Update_board2
    Workbooks("All Folder hour by hour.xlsm").Worksheets("HourBoards").Activate
    Test
    Range("A1").Select
    SendKeys "{HOME}"
    Application.CutCopyMode = False

    EnterFormulas

    ActiveWorkbook.Save
    Application.ScreenUpdating = True

    EnterFormulas
    'Kill "C:\Tracker\Data\*.*"
    'dTime = Now() + TimeValue("00:10:00")
    'Application.OnTime dTime, "RunBoardProgram"

    Exit Sub
End Sub



Public Sub ShellProcedure()
    Dim myDir As String, fn As String, txt As String, a(), n As Long, i As Long, ff As Integer

    'Dim wsh As Object
    'Set wsh = VBA.CreateObject("WScript.Shell")
    'Dim waitOnReturn As Boolean: waitOnReturn = True
    'Dim windowStyle As Integer: windowStyle = 1
    'Dim errorCode As Integer

    'wsh.Run "C:\Tracker\BatchFiles\BatchFile1.bat", windowStyle, waitOnReturn
    'wsh.Run "C:\Tracker\BatchFiles\BatchFile2.bat", windowStyle, waitOnReturn
    'wsh.Run "C:\Tracker\BatchFiles\BatchFile3.bat", windowStyle, waitOnReturn
    'wsh.Run "C:\Tracker\BatchFiles\BatchFile4.bat", windowStyle, waitOnReturn

    'copy the recent .csv files from the four source folders
    testdunno
    testdunno1
    testdunno2
    testdunno3

    myDir = "C:\Tracker\Data\" '<- change here
    fn = Dir(myDir & "*.csv")

    'read every copied csv file line by line into a jagged array
    Do While fn <> ""
        ff = FreeFile
        Open myDir & "\" & fn For Input As #ff
        Do While Not EOF(ff)
            Line Input #ff, txt
            n = n + 1: ReDim Preserve a(1 To n)
            a(n) = Split(txt, ",")
        Loop
        Close #ff
        fn = Dir()
    Loop

    'write the collected rows to the Data sheet, one row at a time
    With ThisWorkbook.Sheets("Data").Range("a1")
        For i = 1 To n
            .Offset(i - 1).Resize(, UBound(a(i)) + 1).Value = a(i)
        Next
    End With

    On Error Resume Next
    'Kill "C:\Tracker\Data\*.*"

    Cells.Replace What:="""", Replacement:="", LookAt:=xlPart, SearchOrder _
        :=xlByRows, MatchCase:=False, SearchFormat:=False, ReplaceFormat:=False

End Sub
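As a hedged side note (this helper is not part of the original workbook): with tens of thousands of rows, the row-by-row Range writes above are usually the slow part. A sketch of writing everything in one assignment, assuming the same jagged array a(1 To n) that ShellProcedure builds, could look like this:

Sub WriteRowsInOneShot(a() As Variant, n As Long, ws As Worksheet)
    'sketch only: copy the jagged Split() results into one 2-D array and write it in a single assignment
    Dim i As Long, j As Long, maxCols As Long
    Dim out() As Variant

    If n = 0 Then Exit Sub

    'find the widest row so the output array can hold every field
    For i = 1 To n
        If UBound(a(i)) + 1 > maxCols Then maxCols = UBound(a(i)) + 1
    Next i
    If maxCols = 0 Then Exit Sub

    ReDim out(1 To n, 1 To maxCols)
    For i = 1 To n
        For j = 0 To UBound(a(i))
            out(i, j + 1) = a(i)(j)
        Next j
    Next i

    'one write to the sheet instead of one write per row
    ws.Range("A1").Resize(n, maxCols).Value = out
End Sub

It would be called as WriteRowsInOneShot a, n, ThisWorkbook.Sheets("Data") in place of the With block above.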



Sub testdunno()
    Dim x As Long
    x = ShellX("CMD /C robocopy \\SourceLocation\hiddendirectory$\producttype\run_data C:\Tracker\Data /maxage:2", vbHide)
End Sub
Public Function ShellX( _
    ByVal PathName As String, _
    Optional ByVal windowStyle As VbAppWinStyle = vbMinimizedFocus, _
    Optional ByVal Events As Boolean = True _
    ) As Long

    'Declarations:
    Const STILL_ACTIVE = &H103&
    Const PROCESS_QUERY_INFORMATION = &H400&
    Dim ProcId As Long
    Dim ProcHnd As Long

    'Get process-handle:
    ProcId = Shell(PathName, windowStyle)
    ProcHnd = OpenProcess(PROCESS_QUERY_INFORMATION, True, ProcId)

    'wait for process end:
    Do
        If Events Then DoEvents
        GetExitCodeProcess ProcHnd, ShellX
    Loop While ShellX = STILL_ACTIVE

    'clean up:
    CloseHandle ProcHnd

End Function


Please keep in mind that I am not the original author of the above code; I have been tasked with trying to get the sheet to function again. I do know that one of the changes IT made removed all users' access to the source folders. Once I was able to get access back to the tracker, the sheet would not function. The sheet used to update every 15 minutes (with the number of files in each run folder, it would take about 5 minutes to update). Now the sheet errors out in different places, but most consistently it gets stuck at:


Do
If Events Then DoEvents
GetExitCodeProcess ProcHnd, ShellX
Loop While ShellX = STILL_ACTIVE

And it would get stuck in that loop because ShellX = 259, which matches these constants:

' --------------------------------------------------------------
' Constants required for return values (Error code checking)
' --------------------------------------------------------------
Private Const ERROR_SUCCESS As Long = 0
Private Const ERROR_ACCESS_DENIED As Long = 5
Private Const ERROR_NO_MORE_ITEMS As Long = 259

(Note that STILL_ACTIVE = &H103 is also 259 decimal, so the two constants share the same value.) I even tried clearing ShellX when it equals 259 and the sheet would continue, but 85% of the data was missing because it did not copy over the files when it thought there were no more items.
Essentially here is what it needs to do:
Update every 15 mins
Copy .csv files that are two days old or younger from 4 different folders into a separate folder (a rough sketch of this step follows the list):
Folder 1 contains 207,475 csv files
Folder 2 contains 67,000 csv files
Folder 3 contains less than 3,000 csv files
Folder 4 contains less than 1,000 csv files
Copy each individual part's information from each csv file into the master sheet
Manipulate the data and display it
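Purely as a hedged sketch (the UNC paths below are placeholders, not the real share names): using WScript.Shell's Run method with its wait flag set to True blocks until each robocopy call finishes, which would avoid the OpenProcess/GetExitCodeProcess polling loop altogether:

Sub CopyRecentCsvFiles()
    'sketch only: pull csv files modified in the last 2 days from each source folder into C:\Tracker\Data
    Dim wsh As Object, k As Long, src As Variant

    Set wsh = CreateObject("WScript.Shell")
    src = Array("\\server\folder1\run_data", _
                "\\server\folder2\run_data", _
                "\\server\folder3\run_data", _
                "\\server\folder4\run_data")          'placeholder source folders

    For k = LBound(src) To UBound(src)
        '/maxage:2 = skip files older than 2 days; /R:1 /W:1 = one quick retry on locked files
        'window style 0 = hidden; final True = wait for the command to finish before continuing
        wsh.Run "cmd /c robocopy """ & src(k) & """ ""C:\Tracker\Data"" *.csv /maxage:2 /R:1 /W:1", 0, True
    Next k
End Sub

Application.OnTime could then re-run the whole routine every 10-15 minutes, much like the commented-out lines at the end of RunBoardProgram.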

SamT
04-02-2018, 06:16 AM
See if either of these links gives you any ideas
http://www.vbaexpress.com/kb/getarticle.php?kb_id=405
https://technet.microsoft.com/en-us/library/ee176615.aspx

However, in the interest of speed for the initial move, I would use Windows Explorer.

In Details view, by right-clicking on the file Name header in Windows Explorer, you can select which columns are displayed. Clicking on any column header will sort the list by that attribute. Then, by using the Shift key, you can select all the contiguous files you want to move and move them all at once. Since you are moving them to a subfolder (or any folder on the same hard drive), it is very fast.

After that, just let VBA handle it.



Do
If Events Then DoEvents
GetExitCodeProcess ProcHnd, ShellX
Loop While ShellX = STILL_ACTIVE

Loop While ShellX = STILL_ACTIVE is recursively calling ShellX without the required path name, so ShellX is probably returning some error number. Not only that, but you are not setting ShellX to any value.

Try something like

Dim ExitCode As Long

Do
    If Events Then DoEvents
    ExitCode = GetExitCodeProcess(ProcHnd, ShellX)
    If ExitCode = ERROR_SUCCESS Then Exit Do '???
    If ExitCode = ERROR_NO_MORE_ITEMS Then MsgBox "Oops!"
Loop While ExitCode = STILL_ACTIVE

ShellX = ???
Note that I don't truly understand your code
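A hedged aside from the editor (not something posted in the thread): GetExitCodeProcess returns nonzero when the call itself succeeds and writes the process's exit code into its ByRef second argument; that value stays at STILL_ACTIVE (&H103) while the process is running, so the usual polling pattern checks the ByRef value rather than the API's own return value:

'sketch only: poll the ByRef exit code, not the API's return value
Dim ExitCode As Long
Do
    If Events Then DoEvents
    If GetExitCodeProcess(ProcHnd, ExitCode) = 0 Then Exit Do   'the API call itself failed
Loop While ExitCode = STILL_ACTIVE
ShellX = ExitCode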

Torantis
04-03-2018, 03:43 AM
I found the links interesting; however, the code suggestion will not be very beneficial because, as I mentioned, it gets stuck in the loop with error 259, so it does not resolve the issue. In addition, as mentioned, if I clear the loop based on 259, it does not pull in all the information needed since it is not present in the tracking folder.

SamT
04-03-2018, 07:15 AM
does not pull in all the information needed since it is not present in the tracking folder.

Do you need code to place the needed info into the tracking folder?

Torantis
04-03-2018, 11:28 PM
Yes, ideally I would need code that works without bogging down the system or taking over 45 minutes to run. Currently robocopy is taking far too long to copy over the needed files, even if I have the system index the files. I tried doing it in a scheduled batch file instead, but it still takes just as long. I am not certain of the quickest and easiest way to copy the files with the current directory structure. Essentially, I am trying to get the sheet as close as possible to how accurate it was before, updating every 10 minutes, until I am able to implement a more robust solution for the directory structure.

Torantis
04-04-2018, 11:23 PM
As I thought more about it, it seems as though robocopy is looking at too much detail when determining which files need to be copied. I switched from robocopy to xcopy, run through a batch file executed by Excel VBA, and the copy is performed much faster. I was able to get robocopy to run a little faster by having the machine that runs the macros index the network location, but it still took 30-45 minutes. XCOPY performs the task on the larger directory in <5 minutes, which is ideal for my situation. This will let the sheet function today while I work on porting the information into SQL Server.
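For illustration only (the share path is a placeholder and the exact switches used are not shown in the thread): xcopy's /D:m-d-y switch copies only files changed on or after the given date, so a call of the kind described could be built from VBA like this:

Sub CopyRecentWithXcopy()
    'sketch only: copy csv files modified in the last 2 days using xcopy's /D date filter
    Dim cmd As String
    '/D:m-d-y keeps files changed on or after that date (the date text here assumes a US-style locale)
    cmd = "xcopy ""\\server\run_data\*.csv"" ""C:\Tracker\Data\"" /D:" & _
          Format(Date - 2, "mm-dd-yyyy") & " /C /I /Y"
    CreateObject("WScript.Shell").Run "cmd /c " & cmd, 0, True   'hidden window, wait until it finishes
End Sub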

Thank you for taking the time to assist me with this; it is good to know that I am not the only one who was baffled by the code or by the direction the original coder took to accomplish the task.