Arc Forum
Ask AF: How to troubleshoot a memory leak?
3 points by evanrmurphy 3320 days ago | 12 comments
I have some code that goes as follows:

  (defop foo req
    (let x <something that allocates a bunch of memory>
      ...))
Each time page foo is requested, a new x should be created. That works. The problem is, after the user leaves page foo, x's memory isn't getting freed. [1] Any idea how to make sure that happens?

I spent the evening sniffing for clues in srv.arc. Though my understanding of srv.arc is still incomplete, I've gotten far enough to realize it knows how to kill its idle threads (that is, unless I misunderstand how handle-request-1 works).

Suggestions would be appreciated.


[1] The way I diagnosed this was to test the app with top open in a terminal. Each time I opened a new browser tab to the URL, top showed the memory usage of my mzscheme process increase. It showed no significant decrease after I closed the tabs, even after waiting several minutes. I tested this repeatedly across tens of tabs.

4 points by rocketnia 3320 days ago | link

Here's the code I tried:

  (do1 nil (= foo (string:n-of 100000 "@_")))
  (defop foo req
    (let x copy.foo
      (prn ellipsize.x)))
  ; I'm defining 'bar just to be sure it's the strings themselves which
  ; consume the memory.
  (do1 nil (= bar (string:n-of 10 "@_")))
  (defop bar req
    (let x copy.bar
      (prn ellipsize.x)))
  ; And here's an op that should really cause a problem.
  (do1 nil (= baz (string:n-of 100000 "@_")))
  (defop baz req
    (let x copy.baz
      (push x bazzes)
      (prn ellipsize.x)))
If I repeatedly request foo, my Racket process's memory usage (Racket 5.0.1 on Windows XP) climbs as high as 50-60 MB, but after another request, it garbage-collects and releases about half the memory. It does reach higher peaks if I keep going, to the point where now it's going up to 120 MB and dropping to 60, but clearly all those strings I'm making are at least eligible for garbage collection, even if the garbage collector lets more and more of them stick around.
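The distinction rocketnia is probing here, strings that nothing references anymore versus strings pushed onto a global like bazzes, can be sketched in Python terms. This is only an analogy of the Arc experiment, not the actual srv.arc code; the class and names are invented for illustration:

```python
import gc
import weakref

class Blob:
    """Stand-in for one of the big strings (a bare str can't be weak-referenced)."""
    def __init__(self, n):
        self.data = "@_" * n

pinned = []  # analog of the global 'bazzes' list

def handle_foo():
    # Allocates per request; no reference survives the handler.
    x = Blob(100000)
    return weakref.ref(x)

def handle_baz():
    # Same allocation, but a reference escapes, like (push x bazzes).
    x = Blob(100000)
    pinned.append(x)
    return weakref.ref(x)

r_foo = handle_foo()
r_baz = handle_baz()
gc.collect()
assert r_foo() is None      # eligible for collection: nothing holds it
assert r_baz() is not None  # 'pinned' keeps it alive indefinitely
```

The asserts show exactly the behavior described above: the foo-style allocation is at least *eligible* for collection once the handler returns, while the baz-style allocation can never be reclaimed until the global list lets go of it.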

So I've seen no evidence of a memory leak yet... but then again, I haven't had an Arc server run for longer than it takes to do poking around like this. Maybe you should try out a more recent version of Racket?


4 points by waterhouse 3320 days ago | link

On a side note, I notice you prefix certain things with "(do1 nil", presumably so that it returns nil rather than returning a gigantic string that the REPL will try to print in its entirety. I stick "no:" in front of things for the same purpose. I.e.:

  (no:= foo (string:n-of 100000 "@_"))
  ;instead of
  (do1 nil (= foo (string:n-of 100000 "@_")))
I find this rather handy and thought I'd share it. (On another note, it seems "~" does the same thing as "no:", which I think is not much of an improvement (1 character instead of 3) and probably not worth allocating a whole ssyntax character for.)


4 points by fallintothis 3320 days ago | link

On another note, it seems "~" does the same thing as "no:", which I think is not much of an improvement



1 point by evanrmurphy 3320 days ago | link

Maybe you should try out a more recent version of Racket?

That's a good idea. I'm still using MzScheme 4.2.1. (Although before I go to the trouble, maybe I should skim recent release notes to see if they were supposed to have fixed any memory leak.)

Thanks for doing that test. I just tried again on my case and it got to > 450 MB (90%+ on a 512 MB machine) without showing signs of GCing! I'll push it even further later to see if it collects; if it does, then maybe I just need to lower a global like threadlimit* if I don't want the entire machine's memory to get consumed.


5 points by waterhouse 3320 days ago | link

You might find the Scheme/Racket functions "current-memory-use" and "collect-garbage" useful for this kind of thing. And perhaps interesting just to play with.

  ;Assuming ($ x) = "evaluate x in Scheme"
  (mac mem-use body
    (w/uniq x
      `(let ,x ($.current-memory-use)
         (- ($.current-memory-use) ,x))))

  ;In action:
  arc> (mem-use (+ 1 2))
  arc> (mem-use (= x range.10))
  arc> (mem-use (= x (range 1 10)))
  arc> x
  (1 2 3 4 5 6 7 8 9 10)
  arc> (mem-use (* 1 2))
  arc> (mem-use ($.collect-garbage))
  arc> (repeat 5 (prn (mem-use:$:collect-garbage)))
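For comparison, here is a rough Python analog of the mem-use macro, built on the standard tracemalloc module (the use of tracemalloc is this sketch's assumption; the thread itself relies on Racket's current-memory-use):

```python
import tracemalloc

def mem_use(thunk):
    """Return the approximate net bytes allocated while evaluating thunk."""
    tracemalloc.start()
    before, _ = tracemalloc.get_traced_memory()
    result = thunk()  # keep the result alive until the second measurement
    after, _ = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return after - before

n = 100000
delta = mem_use(lambda: "@_" * n)  # a roughly 200 KB string
print(delta)
```

As with the Arc macro, the result is only approximate: the garbage collector and interpreter may allocate or free unrelated memory between the two measurements.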


1 point by evanrmurphy 3319 days ago | link

I put ($.collect-garbage) at the top of my defop and tested again. Memory leaks just the same. Somebody in Arcland is forgetting to take the trash out because Scheme has none to collect. :P

Nice macro, by the way. I added it to my utils with an attribution.


3 points by aw 3317 days ago | link

Start with replacing "..." with doing nothing:

  (defop foo req
    (let x <something that allocates a bunch of memory>
      nil))
Does this leak memory? (Using, of course, the trick the other people mentioned of forcing a garbage collection, so that you can find out whether you're really leaking.) If it does, then your "<something that allocates a bunch of memory>" is storing a reference to all or part of your data somewhere outside of the defop, and that's your problem. If you want proof that it's the "something" that's leaking memory and not the fault of srv.arc, turn the defop into a regular function and see that calling it still leaks memory.

If the above doesn't leak memory, then something in your "..." is doing it. Add "..." stuff back in until you start leaking memory again, and you've found your culprit.

Another question: are you using Arc 3.1 or Anarki? I haven't looked at Anarki, so I don't know what, if anything, it does differently, but you might want to make sure you aren't returning some large object from the defop. In Arc 3.1 (I could be wrong, I'm just reading through the code), it looks like the return value gets discarded in the whilet loop in handle-request-thread, but maybe Anarki does something differently.


5 points by evanrmurphy 3305 days ago | link

I finally figured it out. Inside "..." is a flink whose continuation function references x. srv.arc's default threshold for harvest-fnids is very large, so my memory was getting completely consumed before any fnids could be harvested; the references to each x lingered, and mzscheme couldn't consider them garbage.

My workaround is to have foo make a call to harvest-fnids with a calculated threshold much lower than the default, and now memory consumption isn't getting out of hand. Thanks for everyone's help!
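The mechanism evanrmurphy describes, link continuations stored in a global table keeping each x alive until the table is pruned, can be sketched in Python. The table and handler shapes here are invented for illustration; srv.arc's actual fnid machinery differs:

```python
import weakref

fnids = {}  # analog of srv.arc's table of fnid -> continuation

class Big:
    def __init__(self):
        self.data = "@_" * 100000

def handle_request(fnid):
    x = Big()
    # The link's continuation closes over x, like an flink's closure.
    fnids[fnid] = lambda: len(x.data)
    return weakref.ref(x)

def harvest_fnids():
    fnids.clear()  # analog of harvest-fnids pruning old continuations

r = handle_request("f1")
assert r() is not None  # the stored continuation keeps x alive
harvest_fnids()
assert r() is None      # dropping the closure lets x be collected
```

This matches the fix described above: x only becomes garbage once harvesting removes the continuation that captured it, so harvesting earlier (a lower threshold) bounds memory use.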


1 point by evanrmurphy 3317 days ago | link

Thanks, aw. That's a really thorough method I'll plan to try.

Another question is are you using Arc 3.1 or Anarki?

Arc 3.1, and that's a great point about defop's return value.


3 points by garply 3320 days ago | link

I've had problems with a memory leak using srv.arc before as well - I was unable to track it down. Curious to hear what you find out.

On the other hand, have you considered switching to http.arc and dispatch.arc? I personally would like to phase srv.arc out of Anarki.


1 point by evanrmurphy 3320 days ago | link

I haven't tried http.arc and dispatch.arc yet. Were those written by palsecam? Have you had good experiences with them?


2 points by garply 3320 days ago | link

Yes and yes. With pg's code, I kept having to make ugly hacks to things like respond to be able to do what I wanted. It was hard to modify, hard to maintain, and hard to debug. IMO, palsecam's code is much more flexible and much better written. I highly recommend it.