Work started on a Lua implementation

I’ve started work on an implementation of Fluent in Lua. My initial use case is to localize documents generated using the SILE Typesetter, but in some ways that is going to be an esoteric use case. If there is anybody interested in using it in more generic ways (such as UI localization) I’d love to hear feedback from them as I go along so that I don’t get stuck in my own use-case specifics. I hope the API ends up being broadly useful for all i18n needs in Lua.

My plan is to stick to pretty idiomatic Lua. In broad terms, the idea is to parse input resources into tables (one per locale) that can be queried for localized messages using table metamethods.
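To illustrate what I mean, here is a rough sketch of how such a lookup could work. The `messages` field and the fallback behavior are hypothetical, not a final API:

```lua
-- A hypothetical bundle: parsed messages stored in a plain table,
-- with an __index metamethod handling lookups of missing keys.
local bundle = setmetatable({
  messages = {
    hello = "Hello, World!"
  }
}, {
  __index = function (self, key)
    local message = rawget(self, "messages")[key]
    if message then return message end
    return "[missing: " .. key .. "]"  -- fall back rather than error
  end
})

print(bundle.hello)    -- Hello, World!
print(bundle.goodbye)  -- [missing: goodbye]
```

The appeal of metamethods here is that consumers can treat the bundle like an ordinary table while the lookup logic (fallback chains, formatting) stays hidden behind it.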

And of course — if anybody wants to contribute I’d be glad for the help.

Hi Caleb,

Thanks for opening a thread here. Feel free to send a PR too to add a pointer back to your project.

One thing I noticed is that your API names differ quite a bit from what we’re doing in other implementations. With the cross-platform nature of Fluent, we’re trying to align the APIs and their names. In your case, fluent.bundle (no idea how Lua segments packages) is probably right?

The other thing to note is that the JavaScript implementation hides the concept of a Localization class quite well. It exists in all the bindings, but not in @fluent/bundle. In Rust, it’s fluent-fallback, and I’m just adding fluent.runtime.Localization on the Python side.

For the parser implementations, we all rely on the shared test fixtures. The actual resolver spec is gonna get worked on for real in Q4 this year.


Thanks for the feedback @Axel. Even before seeing this message I had started (re)naming objects similarly to the other implementations. Obviously I’m going to try to keep it idiomatic Lua and not just smash in some other language’s paradigm, but where possible I’ll break it down into modules and name them similarly.

I started working on a PEG grammar today. I’m not far enough along to pass it fixtures yet, but I’ll keep those in mind and test against them as soon as I get far enough through the EBNF.

I got far enough along with the PEG grammar today that it has rules for parsing everything in the EBNF. I copied the test fixtures and it actually parsed them all on the first go. I’m still in shock. The AST generated in Lua is different enough that I can’t fully compare them to the references yet but spot checking them looks like I’m on the right track.

One thing I was very confused about and still haven’t found documented to my satisfaction is what to actually do with Junk. The test fixtures have it all parsed into the AST (and because that’s the way the EBNF specifies it, my implementation does now too), but the playground and the Rust and JS implementations I played with all throw errors on encountering junk.
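For example, given a resource with one valid message followed by a stray invalid line, my parser currently produces a table along these lines (the node names are mine and still in flux):

```lua
-- Sketch of a parsed resource for the input:
--   hello = Hello
--   %% this line is not valid FTL
local resource = {
  type = "Resource",
  body = {
    { type = "Message", id = "hello", value = "Hello" },
    -- invalid content is kept verbatim as Junk instead of aborting
    { type = "Junk", content = "%% this line is not valid FTL\n" }
  }
}

for _, node in ipairs(resource.body) do
  print(node.type)
end
-- Message
-- Junk
```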

Was that just for the reference implementation and actual production implementations should raise errors? Or are the other implementations lagging behind the spec?

Rust will actually return Result<ast::Resource, (ast::Resource, Vec<ParserError>)> which means you get Ok(Resource) if there were no errors, or Err((Resource, Vec<Error>)) in case of errors.

We collect Junk into the AST, and accumulate errors. This way we will recover if there are parser errors.
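A self-contained sketch of that shape, using stand-in types rather than the real fluent-rs ones, and a deliberately naive line-based "parser" just to show the recovery pattern:

```rust
// Stand-in types mimicking the fluent-rs parser's recovery pattern.
#[derive(Debug)]
struct Resource {
    messages: Vec<String>,
    junk: Vec<String>, // invalid spans preserved verbatim
}

#[derive(Debug)]
struct ParserError(String);

// Valid `key = value` lines become messages; anything else is
// collected as Junk, with an error accumulated alongside it.
fn parse(source: &str) -> Result<Resource, (Resource, Vec<ParserError>)> {
    let mut res = Resource { messages: vec![], junk: vec![] };
    let mut errors = vec![];
    for line in source.lines() {
        if line.contains('=') {
            res.messages.push(line.to_string());
        } else if !line.trim().is_empty() {
            res.junk.push(line.to_string());
            errors.push(ParserError(format!("unexpected line: {line}")));
        }
    }
    if errors.is_empty() { Ok(res) } else { Err((res, errors)) }
}

fn main() {
    // Even when errors occur, the caller still gets the partial Resource.
    match parse("hello = Hello\n???") {
        Ok(_) => println!("no errors"),
        Err((res, errors)) => {
            assert_eq!(res.messages.len(), 1);
            assert_eq!(res.junk.len(), 1);
            println!("recovered with {} error(s)", errors.len());
        }
    }
}
```

The point is that errors and output are not mutually exclusive: the consumer decides whether junk is fatal, while the parser always recovers and keeps going.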