Adding a self parameter to c_str so that when a non-string
occurs, the error is reported against a function.
Legend:
A - Pass existing self to c_str.
B - Define self and pass to c_str and possibly other
functions.
C - Take new self parameter and pass to c_str and possibly
other functions.
D - Pass existing self to c_str and/or other functions.
E - Define self and pass to other functions, not c_str.
X - Pass nil to c_str.
* buf.c (buf_strm_put_string, buf_str): B.
* chksum.c (sha256_str, md5_str): C.
(sha256_hash, md5_hash): D.
* eval.c (load): D.
* ffi.c (ffi_varray_dynsize, ffi_str_put, ffi_wstr_put, ffi_bstr_put): A.
(ffi_char_array_put, ffi_wchar_array_put): C.
(ffi_bchar_array_put): A.
(ffi_array_put, ffi_array_out, ffi_varray_put): D.
* ftw.c (ftw_wrap): A.
* glob.c (glob_wrap): A.
* lib.c (copy_str, length_str, coded_length, split_str_set,
list_str, cmp_str, num_str, out_json_str, out_json_rec,
display_width): B.
(upcase_str, downcase_str, string_extend, search_str,
do_match_str, do_rmatch_str, sub_str, replace_str,
cat_str_append, split_str_keep, trim_str, int_str, chr_str,
span_str, compl_span_str, break_str, length_str_gt,
length_str_ge, length_str_lt, length_str_le, find, rfind, pos,
rpos, mismatch, rmismatch): A.
(c_str): Add self parameter and use in type mismatch diagnostic.
If the parameter is nil, use "internal error".
(flo_str): B, and correction to "flot-str" typo.
(out_lazy_str, out_quasi_str, obj_print_impl): D.
* lib.h (c_str): Declaration updated.
* match.c (dump_var): X.
(v_load): D.
* parser.c (open_txr_file): C.
(load_rcfile): E.
(find_matching_syms, provide_atom): X.
(hist_save, repl): B.
* parser.h (open_txr_file): Declaration updated.
* parser.y (chrlit): X.
* regex.c (search_regex): A.
* socket.c (getaddrinfo_wrap, sockaddr_pack): A.
(dgram_put_string): B.
(open_sockfd): D.
(sock_connect): E.
* stream.c (stdio_put_string, tail_strategy, vformat_str,
open_directory, open_file, open_tail, remove_path,
rename_path, tmpfile_wrap, mkdtemp_wrap, mkstemp_wrap): B.
(do_parse_mode, parse_mode, make_string_byte_input_stream): B.
(normalize_mode, normalize_mode_no_bin): E.
(string_out_put_string, formatv, put_string, open_fileno,
open_subprocess, open_command, base_name,
dir_name, short_suffix, long_suffix): A.
(run): D.
(win_escape_cmd, win_escape_arg): X.
* stream.h (parse_mode, normalize_mode,
normalize_mode_no_bin): Declarations updated.
* sysif.c (mkdir_wrap, do_utimes, dlopen_wrap, dlsym_wrap,
dlvsym_wrap): A.
(do_stat, do_lstat): C.
(mkdir_nothrow_exists, ensure_dir): E.
(chdir_wrap, rmdir_wrap, mkfifo_wrap, chmod_wrap,
symlink_wrap, link_wrap, readlink_wrap, exec_wrap,
getenv_wrap, setenv_wrap, unsetenv_wrap, getpwnam_wrap,
getgrnam_wrap, crypt_wrap, fnmatch_wrap, realpath_wrap,
opendir_wrap): B.
(stat_impl): statfn pointer-to-function argument now takes
self parameter. When calling it, we pass name.
* syslog.c (openlog_wrap, syslog_wrapv): A.
* time.c (time_string_local, time_string_utc,
time_string_meth, time_parse_meth): A.
(strptime_wrap): B.
* txr.c (txr_main): D.
* y.tab.c.shipped: Updated.
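
Illustration (a sketch, not part of the commit): with pattern B, an
intrinsic such as length-str defines its own self, so a type error
from a non-string argument is now reported under that function's name:
  (length-str 42)   ;; the resulting type error now identifies length-str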
* RELNOTES: Updated.
* configure, txr.1: Bumped version and date.
* share/txr/stdlib/ver.tl: Bumped.
* txr.vim, tl.vim: Regenerated.
* protsym.c: Likewise.
* y.tab.c.shipped: Regenerated. Good catch: forgot to do this
for the most recent change to parser.y.
* parser.c (read_unknown_structs_s): New symbol variable.
(parser_common_init): Initialize read_unknown_structs flag
member of the parser structure from the new special variable.
(parse_init): Initialize read_unknown_structs_s variable.
Register the *read-unknown-structs* dynamic variable.
* parser.h (struct parser): New member, read_unknown_structs.
(read_unknown_structs_s): Declared.
* parser.y (struct): Generate the struct literal syntax not
only for quasiquoted structures, but for structures with an
unknown type name, if the read_unknown_structs flag is set.
* txr.1: Documented.
* share/txr/stdlib/doc-syms.tl: Regenerated.
* y.tab.c.shipped: Regenerated.
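
A usage sketch (the struct name point is assumed to be undefined):
  (let ((*read-unknown-structs* t))
    (read "#S(point x 0 y 1)"))
  ;; with the variable bound true, the unknown struct type is read as
  ;; struct literal syntax instead of provoking an error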
* parser.y (json_val): We must nreverse the json_pairs which
were pushed in right to left order. This didn't matter for
constructing hashes so it was left out, but under quasiquoting
the order matters: it determines the order of evaluation and
of pattern matching.
* tests/011/patmatch.tl: New quasiquoting pattern matching
cases, including JSON.
* y.tab.c.shipped: Regenerated.
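
A sketch of where the order shows up, using the #J^ quasiquote syntax
covered further down in this log:
  (let ((n 0))
    #J^{"a":~(inc n),"b":~(inc n)})
  ;; the unquote under "a" is now evaluated before the one under "b"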
* eval.c (eval_init): get-json intrinsic registered.
* parser.c (prime_parser): Handle prime_json.
(lisp_parse_impl): Take enum prime_parser argument
directly instead of the interactive flag.
(lisp_parse, nread, iread): Pass appropriate prime_parser
value instead of the original flag.
(get_json): New function. Like nread, but passes prime_json.
* parser.h (enum prime_parser): New constant, prime_json.
(get_json): Declared.
* parser.l (prime_scanner): Handle prime_json.
* parser.y (SECRET_ESCAPE_J): New terminal symbol.
(spec): New productions around SECRET_ESCAPE_J for parsing
JSON.
* lex.yy.c.shipped, y.tab.c.shipped, y.tab.h.shipped: Updated.
* txr.1: Documented.
* share/txr/stdlib/doc-syms.tl: Updated.
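
Usage sketch for the new intrinsic (argument conventions assumed to
parallel nread):
  (get-json "[1, 2, 3]")   ;; parse JSON from a string source
  (get-json *stdin*)       ;; or from a stream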
* parser.y (json): Add forgotten call to end_of_json in
the '^' production.
(json_val): Pass zero to vector, not 0 which is nil.
* y.tab.c.shipped: Updated.
There are two problems.
One is that in #J{~foo:"bar"} the foo expression is parsed
while the scanner is in Lisp mode and the : token is grabbed
by the parser as a lookahead token in the same mode. Thus it
comes back as a SYMTOK (the colon symbol).
We fix this by recognizing the colon symbol as a synonym of
the colon character. But only if it is spelled ":" in
the syntax: we look at the lexeme.
The second problem is that even if we fix the above, ~foo
produces a sys:unquote form which is rejected because it is
not a string. We fix this in the most straightforward way:
by deleting the code which restricts keys to strings,
an extension I already thought about making anyway.
* parser.y (json_col): New non-terminal. This recognizes
either a SYMTOK or a ':' token, with the semantic restriction
that the SYMTOK's lexeme must be ":".
(json_vals): Use yybadtok instead of generic error.
(json_pairs): Do not require keys to be strings. Use json_col
instead of ':', and also yybadtok instead of generic error.
* y.tab.c.shipped: Updated.
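
A sketch of the case this fixes (foo bound for illustration):
  (let ((foo "x"))
    #J^{~foo:"bar"})   ;; parses now; the key is supplied by unquote and
                       ;; need not be a string literal in the syntax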
* parser.y (json_val): Handle json_vals being a list,
indicating quasiquoting. We must nreverse it and turn it into
a sys:vector-lit form.
* y.tab.c.shipped: Updated.
Because #J<json> produces the (json ...) form that translates
into quote, ^#J<json> yields a quasiquote around a quote. This
has some disadvantages, because it requires an explicit eval
in some situations to do what the programmer wants.
Here, we introduce an alternative: the syntax #J^<json> will
produce a quasiquote instead of a quote.
The new translation scheme is
#J X -> (json quote <X>)
#J^ X -> (json sys:qquote <X>)
where <X> denotes the Lisp object translation of JSON syntax X.
* parser.c (me_json): The quote symbol is now already in the
json form, so all that is left to do here is to take the cdr
to pop off the json symbol.
* parser.l (JPUNC, NJPUNC): Allow ^ to be a punctuator in
JSON mode.
* parser.y (json): For regular #J, generate the new (json
quote ...) syntax. Implement #J^, which sets up the nonzero
quasi_level around the processing of the JSON syntax, so that
everything is in a quasiquote, finally producing the
(json sys:qquote ...) syntax.
* lex.yy.c.shipped, y.tab.c.shipped: Updated.
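
A sketch of the two forms side by side (x assumed bound):
  #J [1, 2, 3]        ;; (json quote ...): literal data
  (let ((x 42))
    #J^[1, ~x, 3])    ;; (json sys:qquote ...): x is evaluated into place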
* parser.h (end_of_json_unquote): Declared.
* parser.l (JPUNC, NJPUNC): Add ~ and * characters to set of
JSON punctuators.
(grammar): Allow closing brace character in NESTED, SPECIAL
and QSPECIAL states to be a token. This is because it occurs
as a lookahead character in this situation #J{"foo":~expr}.
The lexer switches from the JSON to the NESTED start state
when it scans the ~ token, so that expr is treated as Lisp.
But then } is consumed as a lookahead token by the parser in
that same mode; when we pop back to JSON mode, the } token has
already been scanned in NESTED mode.
We add two new rules in JSON mode to the lexer to recognize
the ~ unquote and ~* splicing unquote. Both have to push the
NESTED start condition.
(end_of_json_unquote): New function.
* parser.y (JSPLICE): New token.
(json_val): Logic for unquoting. The array and hash rules must
now be prepared to deal with json_vals and json_pairs
producing a list object instead of a hash or vector. That is
the signal that the data contains active quasiquotes and must
be translated to the special literal syntax for quasiquoted
vectors and hashes. Here we also add the rules for ~ and ~*
unquoting syntax, including managing the lexer's transition
back to the JSON start condition.
(json_vals, json_pairs): We add the logic here to recognize
unquotes in quasiquoting state. This is more clever than the
way it is done in the Lisp areas of the grammar. If no
quasiquotes occur, we construct a vector or hash,
respectively, and add to it. If unquotes occur and if we
are nested in a quasiquote, we switch the object to a list,
and continue it that way.
(yybadtoken): Handle JSPLICE.
* lex.yy.c.shipped, y.tab.c.shipped, y.tab.h.shipped: Updated.
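
A sketch of the two unquote forms (bindings chosen for illustration):
  (let ((x "b") (xs '("c" "d")))
    (list #J^["a", ~x, "e"]      ;; ~ inserts a single value
          #J^["a", ~*xs, "e"]))  ;; ~* splices all elements of xs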
* parser.l (HASH_N_EQUALS, HASH_N_HASH): Recognize these
tokens in the JSON start state also.
* parser.y (json_val): Add the circular syntax, exactly like
it is done for n_expr and i_expr. And it works!
* lex.yy.c.shipped, y.tab.c.shipped, y.tab.h.shipped: Updated.
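
A sketch, assuming the usual #N=/#N# semantics carry over into #J:
  #J [#1="shared", #1#]   ;; both elements are the same string object
  #J #1=[1, #1#]          ;; an array that contains itself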
(needs buffer literal error message cleanup)
* parser.c (json_s): New symbol variable.
(is_balanced_line): Follow braces out of initial state.
This concession allows the listener to accept input
like #J{"a":"b"}.
(me_json): New static function (macro expander). The #J X
syntax produces a (json Y) form, with the JSON syntax X
translated to a Lisp object Y. If that is evaluated,
this macro translates it to (quote Y).
(parse_init): Initialize json_s variable with interned symbol,
and register the json macro.
* parser.h (json_s): Declared.
(end_of_json): Declared.
* parser.l (num_esc): Treat u escape sequences in the same way
as x. This function can then be used for handling the \u
escapes in JSON string literals.
(DIG19, JNUM, JPUNC, NJPUNC): New lex named patterns.
(JSON, JLIT): New lex start conditions.
(grammar): Recognize #J syntax, mapping to HASH_J token,
which transitions into JSON start state.
In JSON start state, handle all the elements: numbers,
keywords, arrays and objects. Transition into JLIT state.
In JLIT start state, handle all the elements of JSON string
literals, including surrogate pair escapes.
JSON literals share the {UANY} fallback pattern with
other literals.
(end_of_json): New function.
* parser.y (HASH_J, JSKW): New token symbols.
(json, json_val, json_vals, json_pairs): New nonterminal
symbols, and rules.
(i_expr, n_expr): Generate json nonterminal, to hook the
stuff into the grammar.
(yybadtoken): Handle JSKW and HASH_J tokens.
* lex.yy.c.shipped, y.tab.c.shipped, y.tab.h.shipped:
Updated.
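
A basic usage sketch (object mappings as described in txr.1):
  #J true                        ;; -> the symbol t
  #J [1, 2, 3]                   ;; -> a vector of numbers
  #J {"a":1, "b":[true, null]}   ;; -> a hash with string keys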
* parser.y (parse): When issuing the diagnostic indicating the
likely starting line of the unterminated expression, instead
of mentioning that line in the diagnostic text, let's just
issue the diagnostic against that line. The programmer's
text editor can then jump to that line.
* y.tab.c.shipped: Updated.
This is motivated by the recent crash regression in the #;
comment out mechanism. The parser doesn't have adequate
coverage in the test suite.
* tests/012/syntax.tl: New file, for testing syntax.
A problem was found: #;.expr did not work inside a list,
only at top level; it required a space before the dot.
* parser.y (listacc): A couple of productions to handle
hash-semicolon immediately followed by a dot without
any whitespace, and then by an expression.
* y.tab.c.shipped: Regenerated.
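
The case the new test covers, roughly:
  '(a #;.b c)   ;; now reads as (a c): the .b expression immediately
                ;; after #; is skipped without requiring a space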
A crash showed up when processing commented-out objects.
This is due to the most recent gc fix to the parser.
* parser.y (set_syntax_tree): Avoid the set macro when tree
has the special value nao (not an object).
* y.tab.c.shipped: Regenerated.
* y.tab.h.shipped: Likewise.
* parser.y (set_syntax_tree): New static function. Uses
GC-correct assignment via set macro.
(spec): Call set_syntax_tree instead of raw assignments to
parser->syntax_tree, which violate GC rules.
* y.tab.c.shipped: Regenerated.
This picks up the changes introduced by the previous
three commits.
* lex.yy.c.shipped: Updated.
* y.tab.c.shipped: Likewise.
* y.tab.h.shipped: Likewise.
Whereas @a..@b parses and transforms to (rcons @a @b),
@(a)..@(b) goes to @(rcons a @(b)).
* parser.l (grammar): Under 248 compatibility or lower, the @
character now produces the OLD_AT token. Otherwise it produces
the '@' character, as before.
* parser.y (OLD_AT): New token replaces the '@' at the old
low precedence position. '@' is now at the highest precedence,
together with OLD_DOTDOT. (We don't care about interactions
between '@' and OLD_DOTDOT, because OLD_DOTDOT only exists in
185 compatibility, in which '@' is OLD_AT).
(meta): The two rules unfortunately have to be duplicated for
OLD_AT, since there is no BNF OR operator in Yacc.
* txr.1: Compat note added.
* lex.yy.c.shipped: Updated.
* y.tab.c.shipped, y.tab.h.shipped: Likewise.
* METALICENSE: 2020 copyrights bumped to 2021. Added note
about SHA-256 routines from Colin Percival.
* LICENSE, LICENSE-CYG, Makefile, alloca.h, args.c, args.h,
arith.c, arith.h, buf.c, buf.h, cadr.c, cadr.h, chksum.c,
chksum.h, chksums/crc32.c, chksums/crc32.h, combi.c, combi.h,
configure, debug.c, debug.h, eval.c, eval.h, ffi.c, ffi.h,
filter.c, filter.h, ftw.c, ftw.h, gc.c, gc.h, glob.c, glob.h,
hash.c, hash.h, itypes.c, itypes.h, jmp.S, lex.yy.c.shipped,
lib.c, lib.h, linenoise/linenoise.c, linenoise/linenoise.h,
lisplib.c, lisplib.h, match.c, match.h, parser.c, parser.h,
parser.l, parser.y, protsym.c, rand.c, rand.h, regex.c,
regex.h, share/txr/stdlib/asm.tl, share/txr/stdlib/awk.tl,
share/txr/stdlib/build.tl, share/txr/stdlib/cadr.tl,
share/txr/stdlib/compiler.tl, share/txr/stdlib/conv.tl,
share/txr/stdlib/copy-file.tl, share/txr/stdlib/debugger.tl,
share/txr/stdlib/defset.tl, share/txr/stdlib/doloop.tl,
share/txr/stdlib/each-prod.tl, share/txr/stdlib/error.tl,
share/txr/stdlib/except.tl, share/txr/stdlib/ffi.tl,
share/txr/stdlib/getopts.tl, share/txr/stdlib/getput.tl,
share/txr/stdlib/hash.tl, share/txr/stdlib/ifa.tl,
share/txr/stdlib/keyparams.tl, share/txr/stdlib/op.tl,
share/txr/stdlib/package.tl, share/txr/stdlib/param.tl,
share/txr/stdlib/path-test.tl, share/txr/stdlib/place.tl,
share/txr/stdlib/pmac.tl, share/txr/stdlib/quips.tl,
share/txr/stdlib/save-exe.tl, share/txr/stdlib/socket.tl,
share/txr/stdlib/stream-wrap.tl, share/txr/stdlib/struct.tl,
share/txr/stdlib/tagbody.tl, share/txr/stdlib/termios.tl,
share/txr/stdlib/trace.tl, share/txr/stdlib/txr-case.tl,
share/txr/stdlib/type.tl, share/txr/stdlib/vm-param.tl,
share/txr/stdlib/with-resources.tl,
share/txr/stdlib/with-stream.tl, share/txr/stdlib/yield.tl,
signal.c, signal.h, socket.c, socket.h, stream.c, stream.h,
struct.c, struct.h, strudel.c, strudel.h, sysif.c, sysif.h,
syslog.c, syslog.h, termios.c, termios.h, time.c, time.h,
tree.c, tree.h, txr.1, txr.c, txr.h, unwind.c, unwind.h,
utf8.c, utf8.h, vm.c, vm.h, vmop.h, win/cleansvg.txr,
y.tab.c.shipped: Copyright year bumped to 2021.
The following does not work and is fixed here:
@(output)
@(repeat)
@{lisp-expr ...}
@(end)
When a Lisp expression occurs in a braced expansion
syntax, it is not traversed to find variables.
* parser.y (extract_vars): Treat the second element of a
sys:var as a Lisp expression, and find variables. Don't do
this in 128 or older compatibility mode, because then that is
a TXR expression.
* y.tab.c.shipped: Regenerated.
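
A concrete instance of the kind of block now handled (a sketch; the
bind and upcase-str choices are illustrative):
  @(bind items ("a" "b" "c"))
  @(output)
  @(repeat)
  @{(upcase-str items)}
  @(end)
  @(end)
Previously the repeat did not see the items variable inside the braced
Lisp expression; with this change it is found and iterated over.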
* Makefile (BS_LIC_FROM, BS_LIC_TO): Variables removed.
(y.tab.c): Remove all filtering hacks. Don't remove the
license from y.tab.c. Don't remove yyparse declaration from
y.tab.h. Provide a pattern rule for producing any missing
file X from X.shipped. That's how y.tab.c and y.tab.h
get produced from y.tab.c.shipped and y.tab.h.shipped,
respectively, in user mode.
* y.tab.c.shipped, y.tab.h.shipped: New files, generated
using Bison 2.5.