
Thread: Maximum buffer error

  1. #1

    Question Maximum buffer error

    Hello LAE team!

    What can I do to resolve the attached error occurring at a Lookup node? I've got LAE 6.1.4 Professional Plus Edition set up on my computer.

    Situation:

    Input key is from a Filter node which has 7,024,637 records across 57 columns

    Lookup key is from a Filter node which has 1,362 records across 8 columns

    Hope to hear from the team soonest.

    Attached: Lookup node error.jpg

    Regards,

    tris

  2. #2
    Lavastorm Employee gmullin's Avatar
    Join Date
    May 2014
    Location
    Chicago
    Posts
    146

    Default

    If you open <LAE Install Dir>/conf/brain/ls_brain_node.prop and find the parameter called ls.brain.node.lookup.maxBufferSize, you can increase the value it is set to. Try doubling it to 1073741824, for example.
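    For reference, the edited line in the .prop file would look something like the fragment below (the comment and the 1 GiB value are illustrative; your file's existing default may differ by version):

    ```properties
    # <LAE Install Dir>/conf/brain/ls_brain_node.prop
    # Lookup node buffer limit, in bytes (1073741824 = 1 GiB)
    ls.brain.node.lookup.maxBufferSize=1073741824
    ```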

    Or, if you don't want to edit the file, you can set a parameter in the Lookup node itself: press the "Declare Parameters" button and enter a parameter like the one in the screenshot, then set the value within the node.
    Attached Images

  3. #3

    Default

    thanks so much for the very quick turnaround @gmullin! much appreciated!

  4. #4
    Lavastorm Employee stonysmith's Avatar
    Join Date
    Nov 2006
    Location
    Grapevine Tx
    Posts
    770

    Default

    Is one or more of those 8 columns exceptionally long? 1,300 records should not have caused that problem.

  5. #5

    Default

    hello @gmullin and @stonysmith!

    1. I successfully created the MaxBufferSize parameter within the Lookup node; unfortunately, it still shows the same error.
    2. I tried to change ls_brain_node.prop, but an "Access denied" window appears when I try to save the recommended change.
    3. The contents of the two files I'm trying to do a lookup on are not exceptionally long. Just to troubleshoot, I tried a sample size of 200 records for the lookup key and the node worked. Is 7MM records x 57 cols. too much for the node to handle?

    Any other option I can do?

    Thanks heaps team!

    Regards,

    tris
    Last edited by Tris_717684; 12-21-2017 at 08:03 PM.

  6. #6
    Lavastorm Employee stonysmith's Avatar
    Join Date
    Nov 2006
    Location
    Grapevine Tx
    Posts
    770

    Default

    Quote Originally Posted by Tris_717684 View Post
    Is 7MM records x 57 cols. too much for the node to handle?
    The size of the input file (pin1) does not matter in any way.
    The total byte size of the lookup file (pin2) cannot exceed the buffer size.

    If there are columns in the lookup (pin2) that you are not using (neither in the lookup key nor sent to the output) then you could remove those columns with a filter node, thus reducing the memory required by the lookup.
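    The constraint above can be sanity-checked with a rough back-of-envelope estimate: multiply the lookup (pin2) record count by an average row width and compare against the buffer. This is a sketch, not LAE's actual accounting; the average bytes-per-record value is a placeholder you would replace with your real row width.

    ```python
    # Rough estimate of the lookup (pin2) memory footprint vs. the buffer limit.
    # Record count is from the thread; avg_bytes_per_record is a hypothetical
    # placeholder, not a measured value.
    records = 1362
    avg_bytes_per_record = 200            # assumed average row width in bytes
    buffer_size = 1073741824              # ls.brain.node.lookup.maxBufferSize

    total_bytes = records * avg_bytes_per_record
    print(total_bytes, total_bytes < buffer_size)  # → 272400 True
    ```

    Even with very wide rows, 1,362 records should fit comfortably; if the estimate comes nowhere near the buffer size, the error likely has another cause.
    
    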

  7. #7

    Default

    Thanks @stonysmith! will play around with it.

  8. #8

    Default

    Further to @gmullin's suggestion, I have attempted many times to edit and save, and I always get "Access denied" when saving the ls_brain_node.prop file. What do I use to edit and save the prop file?

  9. #9
    Lavastorm Employee gmullin's Avatar
    Join Date
    May 2014
    Location
    Chicago
    Posts
    146

    Default

    Are you able to open Notepad as Admin and then open the ls_brain_node.prop file from there? Alternatively, can you edit the conf/site.prop file? You can add the same parameter there and it will override the value.
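    If the site.prop route works for you, the override would presumably be the same key/value line (property name from this thread; the comment is illustrative):

    ```properties
    # <LAE Install Dir>/conf/site.prop -- site-wide override of the
    # brain default, assuming site.prop uses the same key=value syntax
    ls.brain.node.lookup.maxBufferSize=1073741824
    ```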

    I don't see why creating the parameter in the Lookup node itself wouldn't work. What value did you set the parameter to? Maybe try increasing it further.

  10. #10

    Default

    I am able to open and edit the *.prop file using WordPad or Notepad without any problems. I have copied the original *.prop file and renamed the copy to ls_brain_node (old).prop as a fallback (which confirms that I have admin access to the folder). However, on saving the edited *.prop file, an error window pops up saying that my access is denied. I cannot save it as a *.txt file either; it shows the same error.

    I have increased it to 1,500,000,000 (nearly triple) and the declared parameter still doesn't work.

    I have reduced the number of columns from 57 to 27 (same number of rows)... and the Lookup node still fails.
