* tests/011/patmatch.tl: New test case.
* txr.1: Heading fix: Quasiquote matching notation, not
quasiliteral. Examples of quasiquote notation added.
This is a revert of November 2016 commit
606132c336dbeb0dd8bb851a64c97f2c11b76a85.
The commit claims that it fixes a bug, but in fact it
introduces one. There is no discussion in that commit about
what motivated it.
The commit which follows that one introduces a
naively implemented diagnostic for catching unbound functions
at macro-expansion time, so the likely motivation for this
wrong fix was to suppress false positives from that naive
diagnostic for recursive functions. That has long since been
replaced by a better approach.
Because of the bug, we cannot do this very useful thing: we
cannot write an inline version of a function as a macro first,
and then a real function which just calls that macro:
(defmacro foo (arg) ...)
(defun foo (arg) (foo arg))
The bug causes the (foo arg) call in this function not to be
expanded, due to the shadowing.
* eval.c (do_expand): When expanding defun, do not introduce
the function's name as a lexical function binding, because it
isn't one.
FFI has the problem that when things go wrong in calls, or in
specifications of functions, the diagnostics refer to an
internal function like ffi-call or ffi-make-call-desc, which
is not helpful in identifying the error. We want the
diagnostics to refer to the foreign function, or foreign
callback wrapper, to which the problem pertains.
The approach taken is to stick the name symbol into the
ffi-call-desc object. Functions which work with a
ffi-call-desc can pull out the name and use it for reporting.
* ffi.c (struct txr_ffi_call_desc): Add name member.
(ffi_call_desc_print_op): Include name in printed
representation.
(ffi_desc_mark_op): Mark the name.
(ffi_make_call_desc): Take new argument to specify the name,
storing it into the structure. If it is specified, then use
that name for reporting errors; otherwise stick with
ffi-make-call-desc.
(ffi_call_wrap, ffi_closure_dispatch,
ffi_closure_dispatch_safe, ffi_make_closure): Use the name
from the call descriptor, or else the function's own name if
that is nil.
(ffi_init): Update registration of ffi-make-call-desc
intrinsic to five arguments with four required.
* ffi.h (ffi_make_call_desc): Declaration updated.
* share/txr/stdlib/ffi.tl (deffi, deffi-cb-expander): Pass the
name symbol down to ffi-make-call-desc.
* txr.1: Documented.
* lib.c (obj_print_impl): Print (sys:struct-lit ...)
syntax as #S(...).
* lib.c (out_json_rec): When printing keys that might be
quasiquoted symbols that look like ~abc, we
must avoid the condensed {~abc:~def} style. This is because
~abc:~def looks like a single symbol, in which abc is a package
qualifier. To be safe and readable at the same time, we add
spaces whenever either the key or the value is a cons,
indicating non-JSON syntax. Spaces are added on both sides of
the colon, and also after the preceding comma, if there
is a previous item.
* parser.y (json_val): We must nreverse the json_pairs which
were pushed in right to left order. This didn't matter for
constructing hashes so it was left out, but under quasiquoting
the order matters: it determines the order of evaluation and
of pattern matching.
* tests/011/patmatch.tl: New quasiquoting pattern matching
cases, including JSON.
* y.tab.c.shipped: Regenerated.
* share/txr/stdlib/quips.tl (sys:%quips%): New entry.
* lib.c (out_json_str): Strengthen the test for escaping the
forward slash. It has to occur in the sequence </script
rather than just </. Recognize <!-- and --> in the string,
and encode them.
* tests/010/json.tl: Cover this area with some tests.
* txr.1: Documented.
* share/txr/stdlib/match.tl (transform-qquote): Handle hash
error case with separate pattern. Use compile-error and *match
form instead of error. Diagnose splicing unquote and nested
quasiquote.
This allows
(when-match ^(,a ,b) '(1 2) (list a b)) -> (1 2)
which is a nice alternative that is supported by some
Lisp pattern matchers. We don't need it since we have (@a @b).
The motivation is JSON matching.
(when-match ^#J{"foo" : {"x" : ~val}}
#J{"foo" : {"x" : "y"}} val)
-> "y"
* share/txr/stdlib/match.tl (compile-match): Recognize qquote
case and handle via transform-qquote function.
(non-triv-pat-p): Let's declare quasiquotes to be nontrivial.
(transform-qquote): New function: transform quasi-quoted
syntax into regular pattern matching syntax.
* txr.1: Documented.
* tests/010/json.tl: On Windows, characters are limited to the
BMP range 0 to #\xFFFF. The character escape \x10437 is out
of range, and so throws an error, simply from that syntax
being read. The two test cases which use this character are
clumped into their own test form, which is executed
conditionally on wide characters being more than two bytes.
Because the expression is still parsed on Windows, we read the
troublesome character from a string at run-time, and
interpolate it.
* configure: Solaris 10 doesn't have mkdtemp, so we detect
it separately from mkstemp, which Solaris 10 does have,
producing HAVE_MKSTEMP and HAVE_MKDTEMP config.h symbols.
* stream.c (mkdtemp_wrap): Separately surround
with #if HAVE_MKDTEMP.
(mkstemp_wrap): Reduce the scope of #if HAVE_MKSTEMP to only
this function. Bugfix here: in the #else case
of #if HAVE_MKSTEMPS, we were calling the mkstemps function
instead of mkstemp.
* share/txr/stdlib/doc-lookup.tl (open-url): Handle :cygnal
together with :cygwin.
* RELNOTES: Updated.
* configure, txr.1: Bumped version and date.
* share/txr/stdlib/ver.tl: Bumped.
* txr.vim, tl.vim: Regenerated.
* protsym.c: Likewise.
* txr.1: Move the json macro description to the beginning of
the section.
* txr.1: Fix duplicate 4. bullet; mention that the control
character U+007F is rendered as an escape also.
* stream.c (mkdtemp_wrap): Rename template argument to prefix,
because template is a C++ keyword.
(mkstemp_wrap): Rename local variable from template to templ.
* stream.h (mkdtemp_wrap): Rename template argument.
* genvim.txr (list): Add txr_junqbkt: unquoted bracket.
(txr_junqlist): Support optional # for vector syntax.
(txr_junqbkt): New region for unquoted bracket expressions.
* genvim.txr (ws, jlist, jsonkw, jerr, jpunc, jesc, juesc,
jnum): New variables.
(txr_circ): Move down; this somehow clashes with JSON regions
beginning with #, so that even if we include txr_circ in JSON
regions, it doesn't work properly.
(txr_jerr, txr_jpunc, txr_jesc, txr_juesc, txr_jnum): Define
using variables.
(txr_jkeyword): Switch to regex match instead of keyword. Vim
8.0 does not recognize keywords when they are glued to #J, as
in #Jtrue, even though #J is excluded from the region.
(txr_jatom): New region.
(txr_jarray, txr_jhash): Define using jlist variable for
contained items.
(txr_jarray_in, txr_jhash_in): New regions for the inner
parts without the #J.
* lib.c (chr_iscntrl): Don't use iswcntrl; it fails to report
0x80-0x9F as control characters. A bit of hand-crafted logic
does the job.
* txr.1: Redocumented.
* Makefile (SHIPPED): New variable, holds names of files that
get copied to a .shipped suffix and committed to the repo.
(ABBREV3SH): New macro, version of ABBREV3 that doesn't assume
it is expanding an entire recipe line, and thus can be
embedded into shell commands.
(%.shipped): Rule removed.
(shipped): New rule that prints what it is copying.
Non-maintainer version of rule errors out.
* Makefile (%.shipped): Call the ABBREV macro to produce
output when "make *.shipped" is invoked, otherwise unless
VERBOSE=1 is used, it works silently. This is only in
maintainer mode. The rule in regular mode has the ABBREV
call. (But should we have that rule at all outside of
maintainer mode?)
* sysif.c (wrap_utimes, wrap_lutimes): Rename static functions
to utimes_wrap and lutimes_wrap, the convention used
everywhere for wrappers of library functions.
(sysif_init): Follow rename.
* lib.c (put_json): Turn on indentation if the flat argument
doesn't suppress it, and the stream is not in forced-off
indentation mode.
(tojson): Retarget to put_json, so we don't have to
repeat this logic.
* share/txr/stdlib/getput.tl (get-jsons): If the s parameter
is a string, convert it to a byte input stream so that the
JSON can be read from it.
(put-jsons): Add missing t return value.
(file-put-json, file-append-json, file-put-jsons,
file-append-jsons, command-put-json, command-put-jsons): Add
missing object argument to all these functions, and a missing
"w" open-file mode to several of them.
* stream.c (mkstemp_wrap): Calculate the length of suff, the
defaulted argument, not the raw suffix argument.
* tests/010/json.tl: New file, providing tests that touch every
area of the new JSON functionality.
* tests/common.tl (mstest, with-temp-file): New macros.
* txr.1: Document that get-jsons takes a source which could be
a string.
The parser maintains some token objects from one parse job to
the next in the tok_pushback and recent_tok arrays. When
recent_tok is assigned into the parser, it could be a
wrong-way assignment: the parser is a gen 1 object,
yet the token's semantic value of type val is a gen 0 object.
* parser.c (lisp_parse_impl, txr_parse): Before re-enabling
gc, indicate that the parser object may have been mutated by a
wrong-way assignment using the mut macro.
* txr.c (txr_main): Likewise.
The big comment I added above end_of_json_unquote summarizes
the issue. This issue has been uncovered by some test cases in
a JSON test suite, not yet committed.
* parser.l <JMARKER>: New start condition. Used as a reliable
marker in the start condition stack, based on which
end_of_json_unquote can intelligently fix up the stack.
(JSON): In the transitions to the quasiquote scanning NESTED
state, push the JMARKER start condition before NESTED.
(JMARKER): The lexer should never read input in the JMARKER
state. If we don't put in a rule to catch this, if that ever
happens, the lexer will just copy the source code to standard
output. I ran into this during debugging.
(end_of_json_unquote): Rewrite the start condition stack
intelligently based on what the Lisp lookahead token has done
to it, so parsing can smoothly continue.
* lex.yy.c.shipped: Regenerated.
The recursive JSON printer must check for circularity-related
conditions and emit #n= and #n# notations.
* lib.c (circle_print_eligible): Function moved before
out_json_rec to avoid a forward declaration.
(check_emit_circle): New static function. This is a block of
code for doing the circular logic, taken out of
obj_print_impl, because we would like to use it in
out_json_rec.
(out_json_rec): Call check_emit_circle to emit any #n= or #n#
notation. The function returns 1 if it has emitted #n#,
in which case we are done.
(obj_print_impl): Replace moved block of code with call to
check_emit_circle.
* configure: Check for mkstemp and mkdtemp.
* stream.c (stdio_set_prop): Implement setting the :name
property. We need this in mkstemp_wrap in order to punch in
the temporary name, so that the application can retrieve it.
(mkdtemp_wrap, mkstemp_wrap): New functions.
(stream_init): Register mkdtemp and mkstemp intrinsics.
* stream.h (mkdtemp_wrap, mkstemp_wrap): Declared.
* txr.1: Documented.
* share/txr/stdlib/doc-syms.tl: Updated.
* stream.c (tmpfile_wrap): New static function.
(stream_init): Register tmpfile intrinsic.
* stream.h (tmpfile_wrap): Declared.
* txr.1: Documented.
* share/txr/stdlib/doc-syms.tl: Updated.
* txr.1: Documented file-get-json, file-put-json,
file-append-json, file-get-jsons, file-put-jsons,
file-append-jsons, command-get-json, command-put-json,
command-get-jsons, command-put-jsons.
* share/txr/stdlib/doc-syms.tl: Updated.
* txr.1: Document return value of put-json and put-jsonl.
* lisplib.c (getput_set_entries): Register autoload for
get-jsons, put-jsons, file-get-json, file-put-json,
file-append-json, file-get-jsons, file-put-jsons,
file-append-jsons, command-get-json, command-put-json,
command-get-jsons and command-put-jsons.
* share/txr/stdlib/getput.tl (get-jsons, put-jsons,
file-get-json, file-put-json, file-append-json,
file-get-jsons, file-put-jsons, file-append-jsons,
command-get-json, command-put-json, command-get-jsons and
command-put-jsons): New functions.
Let's restore the previous commit and fix it differently.
The real problem is that the tn_flatten function does not need
to use the set macro.
When the flattened tree is rebuilt with tn_build_tree, every
left and right link of every node is set at that time. That
function uses the set macro, so all is well.
The dummy node is not pulled into the rebuild, so it doesn't
cause any problem; the dummy node is only mutated in
tn_flatten.
This is a better fix, not only because tn_flatten is now more
efficient. It is important for the dummy node to look like it
is a generation zero object. The dummy node ends up as the
pseudo-root of the tree, which means that if it looks like
it is in generation 1, all generation 0 objects below it will be
wastefully checked by the garbage collector.
* tree.c (tn_flatten): Do not use the set macro. Just use
straight assignment to convert the tree into a linked list.
(tr_rebuild): Restore dummy node to be all zeros.
* tree.c (tr_rebuild): The dummy on-stack node we use in the rebuilding
process must have its generation field set to 1 rather than 0, so it looks
like a mature heap object rather than a baby object. Otherwise
when its pointer is assigned to another node which happens to
be gen 1, the on-stack dummy will be added to the checkobj array,
and gc will try to mark it.
* eval.c (eval_init): Register put-json and put-jsonl
intrinsics.
* lib.c (out_json_str): Do not output the U+DC01 to U+DCFF
code points by masking them and using put_byte. This is
unnecessary; if we just send them as-is to the text stream,
the UTF-8 encoder does that for us.
(put_json, put_jsonl): New functions.
* lib.h (put_json, put_jsonl): Declared.
* txr.1: Documented. The bulk of tojson is moved under the
descriptions of these new functions, and elsewhere where the
document pointed to tojson for more information, it now points
to put-json. More detailed description of character treatment
is given.
* share/txr/stdlib/doc-syms.tl: Updated.
* lib.c (out_json_rec): In the CONS case, if ctx is null, bail
and report an invalid object. This lets us call the function
with a null context.
(tojson): Do not support (json ...) syntax. Instead of
obj_print, pass the object directly to out_json_rec.
* txr.1: Do not mention handling of json macro syntax.
Common leading text factored out of the bulleted-paragraphs section.
* lib.c (out_json_str): When the < character is seen, if the
lookahead character is /, output the < and a backslash to
escape the /.
* txr.1: Moved description of special JSON output handling
under tojson, and described the above escaping there also.
* share/txr/stdlib/compiler.tl (compiler comp-catch): The
tfrag's output register may be (t 0), in which case the
maybe-mov in the cfrag code generates a mov into (t 0), which
is not allowed. Instead of tfrag.oreg we create an abstraction
coreg, which could be either toreg or oreg, and consistently use
it.
* parser.c (repl): Syntax error exceptions carry some text as
an argument, which we now print. This distinguishes
end-of-stream syntax errors from real ones.
* parser.c (lisp_parse_impl): Access pi->eof directly instead
of going through parser_eof. We are using pi->errors in the
same expression! This is the only place pi->eof is needed.
(read_file_common): Don't call parser_eof. Assume that if
error_val emerges, and errors is zero, it must be eof.
(read_eval_ret_last): Simplify: the code actually does nothing
special for eof versus errors, so we just bail if error_val
emerges.
(get_parser): Function removed.
(parse_errors): This is the only caller of get_parser, which
now just calls gethash to fetch the parser.
(parser_eof): Function removed.
(parse_init): sys:get-parser, sys:parser-errors and
sys:parser-eof intrinsics removed. The C function
parser_errors still exists; it is used in a few places.
* parser.h (get_parser, parser_eof): Declarations removed.
* share/txr/stdlib/compiler.tl (compile-file-conditionally):
Use the new parse-errors function instead of the removed
sys:parser-errors. Since that works on the stream, we don't
have to call sys:get-parser. Because it returns nil if there
are no errors, we drop the > test.
* parser.c (parse_errors): New function.
* parser.h (parse_errors): Declared.
* txr.1: Documented.
* share/txr/stdlib/doc-syms.tl: Updated.
* eval.c (eval_init): get-json intrinsic registered.
* parser.c (prime_parser): Handle prime_json.
(lisp_parse_impl): Take enum prime_parser argument
directly instead of the interactive flag.
(lisp_parse, nread, iread): Pass appropriate prime_parser
value instead of the original flag.
(get_json): New function. Like nread, but passes prime_json.
* parser.h (enum prime_parser): New constant, prime_json.
(get_json): Declared.
* parser.l (prime_scanner): Handle prime_json.
* parser.y (SECRET_ESCAPE_J): New terminal symbol.
(spec): New productions around SECRET_ESCAPE_J for parsing
JSON.
* lex.yy.c.shipped, y.tab.c.shipped, y.tab.h.shipped: Updated.
* txr.1: Documented.
* share/txr/stdlib/doc-syms.tl: Updated.
* txr.1: Add missing .IP after .RE to return the indentation.
* eval.c (eval_init): tojson intrinsic registered.
* lib.c (tojson): New function.
* lib.h (tojson): Declared.
* txr.1: Documented.
* share/txr/stdlib/doc-syms.tl: Updated.
* parser.y (json): Add forgotten call to end_of_json in
the '^' production.
(json_val): Pass zero to vector, not 0 which is nil.
* y.tab.c.shipped: Updated.
* lib.c (out_json_rec): Save, establish and restore
indentation when printing [ ] and { } notation.
Enforce line breaks, and force a line break after the
object if one occurred in the object.
(out_json): Turn on indentation if it is off (but
not if it is forced off). Restore after doing the object.
First cut, without line breaks or indentation.
* lib.c (out_json_str, out_json_rec, out_json): New static
functions.
(obj_print_impl): Hook in json printing via out_json.
* txr.1: Add notes about output to JSON extensions.
The JSON null will map to the Lisp null symbol. I thought
about using : but that could cause surprises; like when it's
passed to functions as an optional argument, it will trigger
the default value.
* parser.l (JSON): Add rules for producing null keyword.
* txr.1: Documented.
* lex.yy.c.shipped: Updated.
* parser.l <JLIT>: Convert the \u0000 sequence to the U+DC00
code point, the pseudo-null. Also include JLIT
in the rule for catching bad bytes that are not
matched by {UANYN}.
* txr.1: Document this treatment as extensions to JSON.
* lex.yy.c.shipped: Updated.