I understand that Lua uses a mark-and-sweep GC algorithm. How
predictable is its behavior? What triggers it?
I'm interested in the possibility of using Lua in semi-real-time
applications. This is not hard real time, but I cannot tolerate the GC
stepping in for a couple of seconds once in a while.
How can the GC behavior be controlled? Is there any way by which
possible "random" GC hits can be avoided?
> I'm interested in the possibility of using Lua in semi-real-time
> applications. This is not hard real time, but I cannot tolerate the GC
> stepping in for a couple of seconds once in a while.
Your program must have a lot of objects for the GC to take a couple of
seconds. In typical programs one GC cycle takes less than 1/10 of a second.
(Nevertheless, you can always use "collectgarbage", as suggested by lhf.
If you call it with INT_MAX, for instance, you will stop the collector.)
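To make the advice above concrete: a sketch of explicit GC control using
the Lua 5.x API (the collectgarbage interface has changed since this
thread was written; "stop", "step", and "restart" are the modern options,
not the INT_MAX trick mentioned above).

```lua
-- Disable the automatic collector before a time-critical section,
-- then pay the GC cost back in small, explicit slices.
collectgarbage("stop")           -- no automatic collection from here on

-- ... time-critical work runs here with no GC pauses ...

collectgarbage("step", 10)       -- do a small, bounded slice of GC work
collectgarbage("restart")        -- re-enable automatic collection
print(collectgarbage("count"))   -- current heap size, in kilobytes
```

With this pattern the application decides when collection happens, which
trades automatic memory reclamation for predictable pause placement.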
Let me first say that Lua is a great language. I agree with almost every
design choice that's gone into it. The exception is the use of
'local'. Perhaps I'm missing something, but I think having a keyword
called 'global' would have been better. Writing 'global x' would give
you access to the global 'x'; writing 'global x=3' would assign/define
the global. Otherwise all references would be local.
Currently, 'local' is needed in front of almost every variable in a
function, and I've had several hard-to-track-down bugs from inadvertently
creating globals by forgetting it.
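The kind of bug being described here is easy to reproduce: omitting
'local' silently creates (or clobbers) a global, so state that was meant
to be private leaks between uses. A small illustration:

```lua
function make_counter()
  count = 0                 -- forgot 'local': this is a GLOBAL
  return function()
    count = count + 1
    return count
  end
end

local c1, c2 = make_counter(), make_counter()
c1(); c1()
print(c2())  -- prints 3, not 1: both closures share the one global 'count'
```

Writing 'local count = 0' instead gives each closure its own upvalue and
the two counters become independent, which is almost always the intent.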
It's obviously too late to change Lua, but what about a function
declaration (called 'method', maybe) that defaults to local and requires
the 'global' keyword to reach outside the scope of the method?
So you would have

method f()
  i = 'erer'  -- 'i' is local by default; the global 'i' is unchanged
end
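Short of a language change, accidental globals can at least be caught at
run time in today's Lua. A sketch of the classic "strict globals" trick:
trap assignments to undeclared globals with a metatable on _G (this is
the idea behind the well-known strict.lua module, shown here in minimal
form).

```lua
-- Any assignment to a name not already in _G now raises an error
-- instead of silently creating a global.
setmetatable(_G, {
  __newindex = function(_, name)
    error("attempt to create global '" .. name .. "'", 2)
  end,
})

local ok, err = pcall(function() x = 3 end)  -- forgot 'local'
print(ok, err)  -- false, plus a message naming 'x'
```

This doesn't make locals the default, but it turns the hard-to-track-down
bugs above into immediate, located errors.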
Just some thoughts,