[ANN] updated ltokenp, a token processor for Lua


[ANN] updated ltokenp, a token processor for Lua

Luiz Henrique de Figueiredo
I've updated my ltokenp:
        http://www.tecgraf.puc-rio.br/~lhf/ftp/lua/#ltokenp

The package is self-contained and includes Lua 5.3.5. It can process all
programs for Lua 5.1 and later. It builds out of the box on Linux and macOS.

Here is the README. Enjoy. All feedback welcome.

ltokenp is a token processor for Lua: it allows you to process the stream
of tokens coming from the Lua lexer.

Potential uses of ltokenp include:
- Compressing Lua programs by removing comments and whitespace (strip.lua)
- Removing assertions (assert.lua)
- Adding new syntax sugar (self.lua)
- Experimenting with new syntax without hacking the Lua source (reserved.lua)
See also a sample skeleton in skel.lua.

ltokenp accepts Lua scripts to run and Lua files to process.
Scripts are run and files are processed in the order they appear on the command line.
Each script appears as a separate argument after '-s', one '-s' per script.

Typical usage is
        ltokenp -s script.lua [file.lua ...]
but you can also do
        ltokenp -s s1.lua f1.lua -s s2.lua f2.lua

Scripts should define a global function FILTER to process the token stream.
        function FILTER(line,token,text,value) ... end
The arguments are:
- the line number where the token appears
- the token as a number
- the token as text
- the value of names, numbers, and strings; for other tokens,
  the value is the same as the text.

If no scripts are given, ltokenp just dumps the token stream with this:
        function FILTER(line,token,text,value)
                print(line,token,text,value)
        end
which is useful for debugging.
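Beyond printing, a FILTER can also accumulate information across the stream. Here is a hypothetical sketch that tallies identifier occurrences; the convention that `text` is "<name>" for name tokens is an assumption based on the README, so check skel.lua for the exact conventions of your ltokenp build.

```lua
-- Hypothetical sketch: tally identifier occurrences instead of printing.
-- Assumption: for name tokens, `text` is "<name>" and `value` is the
-- identifier itself; see skel.lua for the real conventions.

local count = {}

function FILTER(line, token, text, value)
  if text == "<name>" then
    count[value] = (count[value] or 0) + 1
  end
end

-- Simulated token stream for the chunk `local x = x + 1`,
-- so the sketch can be tried outside ltokenp:
local sample = {
  {1, "local",    "local"},
  {1, "<name>",   "x"},
  {1, "=",        "="},
  {1, "<name>",   "x"},
  {1, "+",        "+"},
  {1, "<number>", 1},
}
for _, t in ipairs(sample) do FILTER(t[1], 0, t[2], t[3]) end
print(count.x)  --> 2
```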

Scripts typically output the contents of the files with some modifications.
Unfortunately, all comments and whitespace are eaten by the lexer and never
reach the token stream.
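Since comments and whitespace never reach FILTER, a bare re-emission of the tokens already acts as a crude compressor. Here is a hypothetical sketch in the spirit of strip.lua; the real strip.lua must be more careful (re-quoting string values, preserving line numbers, and not gluing tokens in ways that change the program), so this only illustrates the idea.

```lua
-- Hypothetical minimal "strip": re-emit every surviving token,
-- separated by single spaces. Comments never reach FILTER, so they
-- disappear automatically. (A real stripper must re-quote string
-- values and avoid joining tokens in ways that change the program.)

local out = {}

function FILTER(line, token, text, value)
  -- Per the README: for names, numbers, and strings `value` carries
  -- the actual source value; for other tokens `value` equals `text`.
  out[#out + 1] = tostring(value)
end

-- Simulated stream for `local x = 1   -- a comment`
-- (the comment is eaten by the lexer and never seen here):
local sample = {
  {1, "local",    "local"},
  {1, "<name>",   "x"},
  {1, "=",        "="},
  {1, "<number>", "1"},
}
for _, t in ipairs(sample) do FILTER(t[1], 0, t[2], t[3]) end
print(table.concat(out, " "))  --> local x = 1
```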

ltokenp is actually a full-featured non-interactive Lua interpreter.
You can run ordinary Lua programs with ltokenp -s foo.lua.

To build ltokenp and run a simple test, just do make.

To install ltokenp where you can find it, use a variant of these:
        make install
        sudo make install
        sudo make install INSTALL_DIR=/usr/local/bin

This code is hereby placed in the public domain and also under the MIT license.

--lhf


Re: [ANN] updated ltokenp, a token processor for Lua

Emeka
Great Job!

This sounds like a transpiler and an uglifier joined together; I am borrowing trending JavaScript terms here.
Is there a way to allow someone to inject his/her own sugar?

Regards, Janus

On Tue, Jul 31, 2018 at 12:41 AM, Luiz Henrique de Figueiredo <[hidden email]> wrote:


Re: [ANN] updated ltokenp, a token processor for Lua

Luiz Henrique de Figueiredo
> Is there a way to allow someone to inject his/her own sugar?

Sure. That's one of the intended uses for ltokenp.
See self.lua for an example: it adds '@' as sugar for 'self.'
See also the README.
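For the curious, such sugar injection might look roughly like the sketch below. The details are assumptions (that '@' reaches FILTER as its own token, and that the script re-emits the program itself), so treat self.lua as the authoritative example.

```lua
-- Hypothetical sketch of '@' -> 'self.' sugar, in the spirit of
-- self.lua. Assumes ltokenp's lexer passes '@' through as a token.

local out = {}

function FILTER(line, token, text, value)
  if text == "@" then
    out[#out + 1] = "self."          -- no trailing space: glue to the name
  else
    out[#out + 1] = tostring(value) .. " "
  end
end

-- Simulated stream for `@x = 1`:
local sample = {
  {1, "@",        "@"},
  {1, "<name>",   "x"},
  {1, "=",        "="},
  {1, "<number>", "1"},
}
for _, t in ipairs(sample) do FILTER(t[1], 0, t[2], t[3]) end
local result = table.concat(out):gsub("%s+$", "")
print(result)  --> self.x = 1
```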