
A basic Lua Lexer I've written, basically a Tokenizer.


📜 Lua-Lexer

Lua-Lexer is a lightweight lexer (tokenizer) for the Lua programming language. It breaks down Lua code into a series of tokens, providing a foundational tool for building an Abstract Syntax Tree (AST) or performing code analysis.

✨ Features

  • Simple and easy-to-use Lua lexer.
  • Converts Lua source code into tokens.
  • Perfect for building custom parsers or AST generators.

🚀 Getting Started

To get started with Lua-Lexer, clone the repository by running the following command in your terminal:

git clone https://github.com/Zaenalos/Lua-Lexer
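If your script does not live inside the cloned directory, `require("Tokenizer")` may not find the module. One way to handle this is to extend Lua's module search path first. This is a minimal sketch that assumes the module file is `Tokenizer.lua` at the repository root (verify the actual filename in your clone):

```lua
-- Make the cloned repository visible to require().
-- Assumes the module file is Lua-Lexer/Tokenizer.lua; adjust the
-- path if your clone lives elsewhere or the file is named differently.
package.path = package.path .. ";./Lua-Lexer/?.lua"

local Tokenizer = require("Tokenizer")
```

Alternatively, copy `Tokenizer.lua` next to your own script so the default search path finds it.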

📦 Installation and Usage

After cloning the repository, you can start using the lexer in your Lua projects. Follow the steps below:

  1. Require the Tokenizer module in your Lua script:

    local Tokenizer = require("Tokenizer")
  2. Pass your Lua code to the Tokenizer:

    local code = [=[print("Hello World")]=]
    local tokens = Tokenizer(code)
  3. Use the tokens for further processing:

    -- `tokens` is a table of tokens produced by the lexer
    for _, token in ipairs(tokens) do
        print(token.type, token.value)
    end

The Tokenizer function will return a table of tokens that you can use to build an AST or analyze the code further.
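As a sketch of what downstream processing might look like, the helper below tallies how often each token type appears in the stream. It only assumes that each token carries the `type` field shown in the loop above; the actual set of type names is defined by the lexer itself:

```lua
-- Count occurrences of each token type in a token stream.
-- Assumes each token is a table with a `type` field, as in the
-- iteration example above.
local function countTokenTypes(tokens)
    local counts = {}
    for _, token in ipairs(tokens) do
        counts[token.type] = (counts[token.type] or 0) + 1
    end
    return counts
end

-- Usage, with `tokens` returned by Tokenizer(code):
-- for ttype, n in pairs(countTokenTypes(tokens)) do
--     print(ttype, n)
-- end
```

A frequency table like this is a quick sanity check that the lexer classified your input the way you expected before you move on to building a parser.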

🤝 Contributing

Contributions are welcome! If you have ideas for new features or improvements, feel free to fork the repository and submit a pull request. Let's make Lua-Lexer better together!

📜 License

This project is licensed under the MIT License. See the LICENSE file for details.
