#1
Morning everyone,
I am dealing with a range of files ouput by a structural analysis program (extension used is .out) which are formatted as standard text files with information arranged in columns after some initial header information. For each new analysis that I perform, possibly a hundred different iterations, a new folder with 20 or so new output files will be created. For info the files are typically as follows: DISP_LS1, DISP_LS2, DISP_LS3, DISP_LS4, DISP_LS5.OUT EQUIV1, EQUIV2, EQUIV3, EQUIV4, EQUIV5.OUT LONGIT1, LONGIT2, LONGIT3, LONGIT4, LONGIT5.OUT REACT_1, REACT_2, REACT_3, REACT_4, REACT_5.OUT The above output files are determined based on how many loadsteps I have defined in the analysis, there may 5 load steps as above, or 20 or more, etc. I would like to be able to read the separate text files into a standard excel file that is placed in each directory where the information is located. This is easily done with the import data feature, but it is time consuming and must be done for each input file separately. Preferably, I would like an automatic import of all files within the same directory, with a particular extension (i.e. *.out) pulled into their own independent worksheets within the file. I will then use additonal worksheets to combine the information and analyze it as necessary. Obviously a standard column width for each file type would need to be defined within the setup. Finally, if possible, I would like to control where the input from the files begins. In the import data wizard, you can specify which row to start the input from. the different types of input files, i.e. DISP or REACT, may start result information on row 10 or row 15 respectively. This isn't a must have, as I can clean up the data manually, but it would be a nice feature. The idea here is to create a standard template file that I can copy and place into each subdirectory once the outputs are complete and then import and coalate all the data into stanadard excel files for analysis and reporting. 
The preferred means of beginning the import would be simply running a macro. Any ideas?

Thanks,
Jason Falls
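For reference, a minimal sketch of the kind of macro being asked for: import every *.out file in the template workbook's own folder into its own worksheet. This is only an illustration under assumptions the post doesn't confirm -- it guesses that the data is space-delimited, and the starting rows (10 for DISP, 15 for REACT) are the example values from the question, not known values:

```vba
Sub ImportAllOutFiles()
    'Sketch only: assumes space-delimited data and illustrative start rows.
    Dim destWkbk As Workbook
    Dim tmpWkbk As Workbook
    Dim myPath As String
    Dim myFile As String
    Dim startRow As Long

    Set destWkbk = ThisWorkbook
    myPath = destWkbk.Path & "\"

    myFile = Dir(myPath & "*.out")
    Do While myFile <> ""
        'Guess the first data row from the file prefix (illustrative values).
        If UCase(Left(myFile, 4)) = "DISP" Then
            startRow = 10
        ElseIf UCase(Left(myFile, 5)) = "REACT" Then
            startRow = 15
        Else
            startRow = 1
        End If

        'Import as space-delimited text, skipping the header rows.
        Workbooks.OpenText Filename:=myPath & myFile, _
            StartRow:=startRow, DataType:=xlDelimited, _
            ConsecutiveDelimiter:=True, Space:=True
        Set tmpWkbk = ActiveWorkbook

        'Copy the imported sheet into the template workbook,
        'named after the source file (minus the extension).
        tmpWkbk.Worksheets(1).Copy _
            After:=destWkbk.Worksheets(destWkbk.Worksheets.Count)
        With destWkbk.Worksheets(destWkbk.Worksheets.Count)
            .Name = Left(myFile, Len(myFile) - 4)
            .UsedRange.Columns.AutoFit
        End With

        tmpWkbk.Close SaveChanges:=False
        myFile = Dir()
    Loop
End Sub
```

Since the macro lives in the template workbook itself, copying the template into each results folder and running it gives the one-workbook-per-folder layout the question describes.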
#2
Version of Excel is 2002 (XP)
#3
So essentially, you're just opening up a bunch of text .Out files and saving
them as .xls files? If that's true, then this may get you started:

Option Explicit
Sub testme01()

    Dim myFiles() As String
    Dim fCtr As Long
    Dim myFile As String
    Dim myPath As String
    Dim ValidPrefixes As Variant
    Dim RowsToDelete As Variant
    Dim iCtr As Long
    Dim GoodFileName As Boolean
    Dim newWks As Worksheet

    ValidPrefixes = Array("disp_ls", "equiv", "longit", "react")
    RowsToDelete = Array(1, 0, 6, 18)

    If UBound(ValidPrefixes) <> UBound(RowsToDelete) Then
        MsgBox "Design error--match rows with prefixes!"
        Exit Sub
    End If

    With Application.FileDialog(msoFileDialogFolderPicker)
        If .Show = True Then
            myPath = .SelectedItems(1)
        Else
            Exit Sub
        End If
    End With
    myPath = myPath & "\"

    myFile = Dir(myPath & "*.out")
    If myFile = "" Then
        MsgBox "no files found"
        Exit Sub
    End If

    'get the list of files
    fCtr = 0
    Do While myFile <> ""
        fCtr = fCtr + 1
        ReDim Preserve myFiles(1 To fCtr)
        myFiles(fCtr) = myFile
        myFile = Dir()
    Loop

    If fCtr > 0 Then
        For fCtr = LBound(myFiles) To UBound(myFiles)
            GoodFileName = False
            For iCtr = LBound(ValidPrefixes) To UBound(ValidPrefixes)
                If LCase(Left(myFiles(fCtr), Len(ValidPrefixes(iCtr)))) _
                   = LCase(ValidPrefixes(iCtr)) Then
                    GoodFileName = True
                    Exit For
                End If
            Next iCtr

            If GoodFileName = False Then
                'do nothing
            Else
                Workbooks.OpenText Filename:=myPath & myFiles(fCtr), _
                    Origin:=437, _
                    StartRow:=1, DataType:=xlDelimited, _
                    TextQualifier:=xlDoubleQuote, _
                    ConsecutiveDelimiter:=False, Tab:=False, _
                    Semicolon:=False, _
                    Comma:=True, Space:=False, Other:=False, _
                    FieldInfo:=Array(Array(1, 1), Array(2, 1), _
                                     Array(3, 1), Array(4, 1)), _
                    TrailingMinusNumbers:=True

                Set newWks = ActiveSheet
                With newWks
                    If RowsToDelete(iCtr) > 0 Then
                        .Rows("1:" & RowsToDelete(iCtr)).Delete
                    End If
                    .UsedRange.Columns.AutoFit
                End With

                newWks.Parent.SaveAs _
                    Filename:=myPath & Left(myFiles(fCtr), _
                        Len(myFiles(fCtr)) - 4) & ".xls", _
                    FileFormat:=xlWorkbookNormal
                newWks.Parent.Close savechanges:=False
            End If
        Next fCtr
    End If

End Sub

The bad news is I didn't know how to parse the input files.
I guessed that the files were comma separated. But if you meant that the files are fixed width (arranged into columns??), you can record a macro when you open one of the files. You'll be able to replace this hunk of code:

Workbooks.OpenText Filename:=myPath & myFiles(fCtr), _
    Origin:=437, _
    StartRow:=1, DataType:=xlDelimited, _
    TextQualifier:=xlDoubleQuote, _
    ConsecutiveDelimiter:=False, Tab:=False, _
    Semicolon:=False, _
    Comma:=True, Space:=False, Other:=False, _
    FieldInfo:=Array(Array(1, 1), Array(2, 1), _
                     Array(3, 1), Array(4, 1)), _
    TrailingMinusNumbers:=True

with your code (with slight modifications to get the correct file). If all the text files have different layouts, the macro could accommodate that, too.

And I chose to import all the data and clean it up after the import (the RowsToDelete variable). You could modify it to use the StartRow in the import (six of one, half a dozen of the other...).

And if you're new to macros, you may want to read David's intro:
http://www.mvps.org/dmcritchie/excel/getstarted.htm

--
Dave Peterson
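If the files do turn out to be fixed width, recording the import produces a call along these lines. This is a sketch only: the column start positions and data types below are made-up placeholders, and recording a macro against one of the real .out files is what supplies the correct FieldInfo array:

```vba
'Fixed-width variant of the import call. For xlFixedWidth, each
'FieldInfo pair is (start position, data type), with positions
'counted from zero. The values 0, 10, 20 here are placeholders.
Workbooks.OpenText Filename:=myPath & myFiles(fCtr), _
    Origin:=437, _
    StartRow:=1, DataType:=xlFixedWidth, _
    FieldInfo:=Array(Array(0, 1), Array(10, 1), Array(20, 1)), _
    TrailingMinusNumbers:=True
```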