No it is on this page in Post #124.
I made quite a few posts during that period.
You should click the Star under OBP's name and give him some Reputation, he went above and beyond for you.
I'll start looking at the FSO scripting you posted earlier. So the benefit is that it reads the file one line at a time. I can see the limitations the Open statement has, and FSO takes far less coding.
OBP has been great, no doubt there. I have tried to show my gratitude throughout the thread, but yes, I will do it the formal way for this site via the reputation.
After putting this through a test, it performs just fine. When appending files, though, it creates blank rows in the master file. Any reason that happens? I think it can be corrected after importing the file to an Access table by running a delete query.
Also, there may be instances when this application is run under Access Runtime. I have never read anything saying that any of the VBA reference libraries wouldn't work correctly under Runtime. Is that misguided?
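For reference, the blank rows mentioned above could be cleared after import with a delete query along these lines. This is only a sketch: "tblImportMaster" and "Field1" are placeholder names, not names from the actual database.

```vba
' Hypothetical cleanup: delete rows whose first data field is empty.
' Table and field names are placeholders for illustration only.
Sub DeleteBlankRows()
    Dim sql As String
    sql = "DELETE FROM tblImportMaster " & _
          "WHERE Field1 IS NULL OR Trim(Field1) = ''"
    CurrentProject.Connection.Execute sql
End Sub
```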
I will take a look at the blank line problem, but it may be fixed by using FSO.
I have never used Runtime, so I am not sure.
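For anyone following along, here is a minimal sketch of what reading a file line by line with FSO looks like, skipping blank lines as it goes. The file path is a placeholder, and this uses late binding so no extra reference is needed.

```vba
' Minimal FSO sketch: read one line at a time and skip blanks.
' The path "C:\Data\source.dat" is a placeholder.
Sub ReadLinesWithFSO()
    Dim fso As Object, ts As Object, sLine As String
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set ts = fso.OpenTextFile("C:\Data\source.dat", 1) ' 1 = ForReading
    Do Until ts.AtEndOfStream
        sLine = ts.ReadLine              ' a whole line, no vbLf handling needed
        If Len(Trim(sLine)) > 0 Then
            ' process the line here
        End If
    Loop
    ts.Close
End Sub
```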
Hey again, I'm back to working on the import routine. These last couple of posts are where I picked up. I have tried everything to get this file to import. I get the error message "Item cannot be found in the collection corresponding to the requested name or ordinal." The error message box in the VBA also shows reccount = 0. I have made sure the table matches the file and cut the table size down as you suggested in the last post. The file is no different from the one in the routine we worked on in the past; the only differences are the data in it and the field sizes. I'm out of good ideas to try. Maybe the VBA I have revised is incorrect. If you have a chance to look at it, some feedback would be appreciated.
Code:
Dim i As Integer
Do Until EOF(1)
    xchar = Input(1, #1)
    If xchar = vbLf Then
        reccount = reccount + 1
        fieldcount = 1
        reclength = 0
        If reccount > 1 Then
            With rst
                .AddNew
                For i = 1 To 199
                    .Fields(i) = fieldstring(i)
                    fieldstring(i) = ""
                Next i
                .Update
            End With
            With rst2
                .AddNew
                .Fields(72) = keyfield
                For i = 1 To maxfields - 199
                    .Fields(i) = fieldstring(i + 199)
                    fieldstring(i + 199) = ""
                Next i
                .Update
            End With
            With rst
                .AddNew
                .Fields(199) = keyfield
                .Update
            End With
        End If
    Else
        If reccount > 0 Then
            x = x + 1
            If xchar = Chr(34) Then
                If first = 0 Then
                    first = 1
                Else
                    first = 0
                End If
            End If
Andy, first of all, which file are you trying to import from? I have just tested the code that was developed for TESTMAIN, which works OK, on the other two files that you sent me, and neither of them completes the input.
SECTESTPRIMARY5736298.1500919251120784620 - Copy.dat fails due to key violations: the fields after the first two are completely blank, so it tries to put two blanks in the Location key field, which creates a duplicate.
Also, SECTESTPRIMARY5736298.1500919251120784620 - Copy has completely different and far fewer fields than the other two, so it should obviously go in a different table.
SECTVAIELOCMAINNUC5006.1500915498085621484uSE stops due to a field being too large. I have overcome that by clipping the field to 255 characters, but the question arises of why that particular field comes out at over 255 characters. I will have a closer look at it.
I have also realised that, because of the data structure of the original files, FSO may have the same problem that I had to overcome, i.e. the use of "," within a field, which creates an extra field and shifts everything one field to the right. I will check that out too.
OK, I was right: FSO has exactly the same problem with the comma within quotes, causing too many fields to be created, so that is going to cause a major problem if we try to use it for importing the data.
It should still work OK for appending, though.
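For what it's worth, the comma-inside-quotes problem can be handled with the same 0/1 quote-state toggle the import loop already uses (the "first" variable). A standalone sketch of that idea, not taken from the thread's actual code:

```vba
' Split a line on commas, ignoring commas inside double quotes.
' Uses the same quote-state toggle idea as the import loop.
Function SplitQuoted(sLine As String) As Collection
    Dim i As Integer, inQuotes As Integer
    Dim ch As String, fieldstring As String
    Set SplitQuoted = New Collection
    For i = 1 To Len(sLine)
        ch = Mid(sLine, i, 1)
        If ch = Chr(34) Then
            inQuotes = 1 - inQuotes          ' toggle in/out of quotes
        ElseIf ch = "," And inQuotes = 0 Then
            SplitQuoted.Add fieldstring      ' field ended outside quotes
            fieldstring = ""
        Else
            fieldstring = fieldstring & ch
        End If
    Next i
    SplitQuoted.Add fieldstring              ' last field on the line
End Function
```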
SECTVAIELOCMAINNUC5006.1500915498085621484uSE has two fields that are too large, one of them over 300 characters. I think they are also the ones with quotes and commas, which look as if they have concatenated two fields.
I have set both fields to Memo and they import OK.
You could of course set most of the fields with real text data in them to memo to ensure that they never exceed the 255 character limit.
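If you would rather not change each field in table Design view, the same change can be made with a DDL statement. A sketch only: the table and field names below are placeholders, not the real ones.

```vba
' Hypothetical: change a Text field to Memo so values over
' 255 characters are not truncated. Names are placeholders.
Sub MakeFieldMemo()
    CurrentProject.Connection.Execute _
        "ALTER TABLE tblImportTESTPRIMARY1 ALTER COLUMN LongField MEMO"
End Sub
```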
Yes, when I set those two fields to Memo the import worked just fine. So it is TESTPRIMARY that is giving me the difficulties. As far as the extra fields being created, that should be something I can take care of after the import by running a delete query. At least I think it should.
also see thread: How to import a text file with more than 255 fields into Access
The new version of EXPTESTPRIMARY1808398.1491995706329461740 that you sent me has a problem for you: it has extra field names not in the current TableImport2.
In fact, it looks like an extra 24 fields.
ALNVALUE,ASSETATTRSD,LS_CHANGEBY,LOCATIONSPEC.CHANGEDATE,CLASSSTRUCTURESD_LOCSPEC,DISPLAYSEQUENCE,INHERITEDFROMITEM,ITEMSPECVALCHANGED,LINKEDTOATTRIBUTE,LINKEDTOSECTION,LOCATIONSPECSD,LS01,LS02,LS03,LS04,LS05,MANDATORY,MEASUREUNITSD,NUMVALUE,LS_ORGSD,SECTION,TABLEVALUE
That means that TestPrimary needs its own table, or you extend the current one.
LOCATIONSPEC.CHANGEDATE may give you a headache, as full stops are not allowed in Access field names.
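One simple way around that is to clean the incoming field names before creating the table. A purely illustrative helper:

```vba
' Replace characters Access does not allow in field names,
' e.g. LOCATIONSPEC.CHANGEDATE becomes LOCATIONSPEC_CHANGEDATE.
Function CleanFieldName(sName As String) As String
    CleanFieldName = Replace(sName, ".", "_")
End Function
```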
Can I recommend FreeFileViewer to you? It is great for looking at files and finding things with its search.
OK, understood. I can't get around the error message I posted earlier for TESTPRIMARY. I'm using the same routine that we did for the other file. The keyfield is in the second table, so I moved it to add the keyfield after rst2 is processed.
When I spoke of adding extra rows, it was in regard to the append procedure. Even if it does add them, they should be deletable by a query after they are imported. I crossed up two threads into one comment and made that confusing.
You are correct. I have created two tables for the TESTPRIMARY import that are independent and only for TESTPRIMARY.
rst.Open "tblImportTESTPRIMARY1", CurrentProject.Connection, adOpenDynamic, adLockOptimistic
rst2.Open "tblImportTESTPRIMARY2", CurrentProject.Connection, adOpenDynamic, adLockOptimistic
I didn't post all the code, just the record-adding piece. Sorry for not giving the complete picture.
I compared the fields in the file against the tables and they are all accounted for. The primary key is in table 2, and it needs to be added to table 1 to establish a joining field. Do I manually add the primary key field in table 1, or does the code take care of that? Also, RecordID was added for the other file during the import. Do I add that manually to the table, or is it automated as well?
I have 198 fields in the first table. There will be 199 if I am to manually add the primary key. Table 2 has 84 fields with field 73 being the primary key.
You have to manually add the KeyID field to Table 1; the code should then be modified to update that field when the field count gets to 199 during import.
I suggest that when you have finished the table and code changes you send me a copy, so that we will both be talking about the same things in future.
To overcome adding blank lines, replace the subroutine code with this one:
Code:
Appendsub:
    MsgBox "current file appending to - " & FileFound & " from - " & FileName
    reccount = 0
    datastring = ""
    Open firstfile For Input As #1
    Open sDest & FileFound For Append As #2
    Do Until EOF(1)
        xchar = Input(1, #1)
        If xchar = vbLf Then
            reccount = reccount + 1
            datastring = datastring & xchar
            If reccount > 1 And Len(datastring) > 20 Then
                'MsgBox Len(datastring) & Len(datastring) > 100
                Print #2, datastring
                datastring = ""
            End If
        End If
        If reccount > 1 Then datastring = datastring & xchar
    Loop
    'MsgBox datastring
    'Print #2, datastring
    Close #1
    Close #2
    MsgBox "data transferred to " & FileFound
    Return
This is the part that controls it; the 20 could probably be reduced to 5 or 2 and still work OK:
And Len(datastring) > 20