lexer now produces a vector of heap allocated tokens
This removes the possibility of expensive copies when working with the tokens the lexer produces (copies that C++ just... does silently): now we hold pointers, which are trivially cheap to copy. I want expensive work to be done by me, and for a reason: I want to be holding the shotgun.
@@ -91,6 +91,6 @@ enum class lerr_t
 };
 const char *lerr_as_cstr(lerr_t);

-lerr_t tokenise_buffer(std::string_view, std::vector<token_t> &);
+lerr_t tokenise_buffer(std::string_view, std::vector<token_t *> &);

 #endif
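To illustrate the change, here is a minimal sketch of the new calling convention. The diff only shows the signature, so the token_t fields, the tokenising logic, and the lerr_t values below are illustrative assumptions, not the repository's actual implementation.

#include <cctype>
#include <cstdio>
#include <string>
#include <string_view>
#include <vector>

// Illustrative stand-in: the real token_t's fields are not shown in the diff.
struct token_t
{
  std::string content;
};

// Illustrative stand-in for the real error enum.
enum class lerr_t
{
  OK,
};

// Sketch of the new signature: the lexer heap-allocates each token and the
// vector stores raw pointers, so growing or moving the vector only copies
// pointers, never the (potentially large) token contents.
lerr_t tokenise_buffer(std::string_view buffer, std::vector<token_t *> &tokens)
{
  size_t i = 0;
  while (i < buffer.size())
  {
    // Skip whitespace between tokens.
    if (std::isspace(static_cast<unsigned char>(buffer[i])))
    {
      ++i;
      continue;
    }
    // Consume one whitespace-delimited word as a token.
    size_t start = i;
    while (i < buffer.size() &&
           !std::isspace(static_cast<unsigned char>(buffer[i])))
      ++i;
    tokens.push_back(
        new token_t{std::string{buffer.substr(start, i - start)}});
  }
  return lerr_t::OK;
}

int main()
{
  std::vector<token_t *> tokens;
  if (tokenise_buffer("mov %reg 42", tokens) != lerr_t::OK)
    return 1;
  for (const token_t *tok : tokens)
    std::printf("%s\n", tok->content.c_str());
  // The caller now owns the heap allocations and must free them explicitly.
  for (token_t *tok : tokens)
    delete tok;
  return 0;
}

The trade-off is that deallocation is now the caller's explicit responsibility (std::unique_ptr would automate it), which matches the commit's stated intent: expensive or dangerous work happens only where the author deliberately wrote it.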