
Array slicing optimisation

Threaded View

Kyle123 Array slicing optimisation 02-01-2012, 08:27 AM
OnErrorGoto0 Re: Array slicing optimisation 02-01-2012, 09:19 AM
Kyle123 Re: Array slicing optimisation 02-01-2012, 09:29 AM
snb Re: Array slicing optimisation 02-01-2012, 09:37 AM
nilem Re: Array slicing optimisation 02-01-2012, 10:22 AM
Kyle123 Re: Array slicing optimisation 02-01-2012, 10:43 AM
  1. #1
    Forum Guru Kyle123
    Join Date: 03-10-2010
    Location: Leeds
    MS-Off Ver: 365 Win 11
    Posts: 7,239

    Array slicing optimisation

    Hi All,

    I am optimising some code that slices large arrays, and I'm hoping there is a more efficient way than what I have.

    The arrays are 40k+ lines.

    I currently loop through three workbooks, each with 40k+ lines, read each range directly into a variant array, and store those arrays in a container array.

    I then loop through the container array to access each sub-array, converting each sub-array to 1-D so I can use the Filter function. I have tried using INDEX and tried transposing the data from each workbook; both error, which I assume is due to the size of the data.

    Currently I'm using the below (which is an excerpt):

    Sub getData()
        Const fPath     As String = "D:\My documents\xyz\"
        Dim myfile      As String
        Dim tempArr()   As String
        Dim x           As Integer
        Dim i           As Long
        Dim rs          As Variant
        Dim rscoll(1 To 3) As Variant
        Dim wb          As Workbook
        
        myfile = Dir(fPath & "*.csv")
        x = 1
        Do Until myfile = ""
            ' Open read-only without updating links
            Set wb = Workbooks.Open(fPath & myfile, False, True)
            ' 2-D variant array of the non-blank constants in column A
            rs = wb.Sheets(1).Columns(1).SpecialCells(xlCellTypeConstants).Value
            rscoll(x) = rs
            x = x + 1
            wb.Close False
            myfile = Dir()
        Loop
        
        For x = 1 To 3
            ReDim tempArr(1 To UBound(rscoll(x)))
            
            ' Flatten the 2-D slice to 1-D so Filter() can be used
            For i = 1 To UBound(rscoll(x))
                tempArr(i) = UCase(rscoll(x)(i, 1))
            Next i
            
            ' Find data in array using Filter()
            ' Do some other stuff
            
        Next x
    End Sub
    To give a bit more background, the files are tab-delimited and I need to find given words (60+) that may appear on any row, in any of a number of fields on each row.
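    Since the flattening step exists only so VBA's Filter function can be applied, here is a minimal sketch of that matching step, using made-up sample words rather than the real word list:

    Sub FilterDemo()
        Dim tempArr(1 To 3) As String
        Dim hits As Variant
        
        tempArr(1) = "ALPHA RED"
        tempArr(2) = "BETA BLUE"
        tempArr(3) = "GAMMA RED"
        
        ' Filter returns a zero-based array of the elements
        ' that contain the match string
        hits = Filter(tempArr, "RED", True, vbTextCompare)
        
        Debug.Print UBound(hits) - LBound(hits) + 1   ' count of matching rows
    End Sub

    Note that Filter only does substring matching, so a search word that happens to be embedded in a longer word will also be counted.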

    I've also tried querying the workbooks with ADO and filtering the recordsets, but this was unbelievably slow.
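    For comparison, since the files are plain tab-delimited text, one possible way to avoid the Workbooks.Open overhead entirely is to read each file with Line Input and Split. This is only a sketch under that assumption (the file name and the field test are hypothetical):

    Sub ReadAsText()
        ' Hypothetical example path; substitute the real file name
        Const fName As String = "D:\My documents\xyz\sample.csv"
        Dim f As Integer
        Dim oneLine As String
        Dim fields() As String
        
        f = FreeFile
        Open fName For Input As #f
        Do While Not EOF(f)
            Line Input #f, oneLine
            fields = Split(oneLine, vbTab)   ' one row, split on tabs
            ' ... test each field against the word list here ...
        Loop
        Close #f
    End Sub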

    Any ideas as to whether this can be sped up at all, or a better way of attacking the problem?

    Cheers

    Kyle
    Last edited by Kyle123; 02-01-2012 at 10:44 AM.
