
Delay and dynamic extent

> Note that the lambda form does not close over the dynamic
> environment so why should it be different from delay?  Well if you
> did this the "dynamic environment" would in fact be the lexical
> environment so you would lose all the benefits of dynamic binding.
> When one writes a function with lambda there is an expectation that
> the dynamic environment is abstracted over.  However when using a
> DELAY the expectation is that only the evaluation time is changed,
> not the computation being performed (unless side-effects are
> performed that expose the evaluation order).

I'm afraid I can see problems with that characterization of delay. I'd
also like to show an example that seems to demonstrate why delay
should _not_ close over the dynamic environment.

It is sometimes convenient to invert an iteration "inside-out" --
turn an iteration over a collection into a lazy list of the traversed
elements. Generators in Icon, Ruby and now Python use a similar
trick. Here's a generic iteration inverter:

(define (foreach->lazy-list foreach-fn collection)
  (delay
    (call-with-current-continuation
      (lambda (k-main)
        (foreach-fn
          (lambda (val)
            (call-with-current-continuation
              (lambda (k-reenter)
                (k-main (cons val
                              (delay
                                (call-with-current-continuation
                                  (lambda (k-new-main)
                                    (set! k-main k-new-main)
                                    (k-reenter #f)))))))))
          collection)
        (k-main '())))))

We can see how it works:

(define (force-print lazy-lst)
  (let ((lst (force lazy-lst)))
    (if (pair? lst)
      (begin (display (car lst)) (newline)
	(force-print (cdr lst))))))

(force-print (foreach->lazy-list for-each '(1 2 3)))

We should point out the expression (delay (call/cc (lambda (k) ...))).
It is critically important that 'k' be the continuation of the
'forcing' expression rather than the continuation of the 'delay'
expression. Indeed, it is the former. That seems to contradict the
quoted assertion that 'delay' merely delays the time at which the
computation is performed. If that were all it did, then in
	(let* ((x (delay (call-with-current-continuation (lambda (k) (k 1)))))
	       (_ (display "once"))
	       (y (force x)))
	  (list (integer? x) y))
'k' would have been the continuation that delivers the result to x. In
reality, 'k' delivers the result to 'force'.
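
The point can be checked directly; here is a minimal sketch of the
same experiment as a top-level program (assuming an R5RS Scheme):

```scheme
;; 'k' is the continuation of the forcing expression: (k 1) simply
;; returns 1 to the call of force, so "once" is printed exactly once
;; and x itself remains a promise rather than an integer.
(define x (delay (call-with-current-continuation (lambda (k) (k 1)))))
(display "once") (newline)
(display (list (integer? x) (force x)))
(newline)
```

If 'k' were instead the continuation of the delay expression, invoking
(k 1) at forcing time would deliver 1 to the binding of x and re-run
the rest of the program from that point, printing "once" a second time.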

> This is consistent with the concept of delaying the
> computation (you only want to change **when** the computation is
> performed, not **what** the computation performs)

In the case of call/cc, it is important *what* is performed (and
*where* it is performed).

Let us consider what happens when we try the following:

(define (for-each-str iteratee str)
  (with-input-from-string str
    (lambda () 
      (do ((val (read) (read))) ((eof-object? val))
	(iteratee val)))))

(force-print (foreach->lazy-list for-each-str "4 5 6 7"))

The following subexpression of foreach->lazy-list

	(k-main (cons val
		      (delay
			(call-with-current-continuation
			  (lambda (k-new-main)
			    (set! k-main k-new-main)
			    (k-reenter #f))))))

executes in a dynamic context in which (current-input-port) is the
string port. Therefore, the 'delay' expression is evaluated in this
context. Suppose delay closes over the dynamic environment. Our delay
thus captures
the fact that (current-input-port) is the string port. After the delay
and cons are evaluated, k-main is invoked, which switches
(current-input-port) to that of the main program. The main program
later on forces the cdr of the lazy list. This forces our delay --
which restores the binding of (current-input-port) to the string
port. Then k-reenter is executed, which re-enters the scope
of with-input-from-string and swaps the current binding for
the input port with that for the string port. Both bindings however
are the same. Therefore, when we re-escape after the iteration, the
input port remains bound to the string port. Thus the act of forcing
a lazy list has corrupted the binding of the input port in the main
program.

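To summarize, here is what the main program would observe under the
hypothetical dynamic-environment-capturing 'delay' -- a sketch of the
buggy behavior being argued against, not of what R5RS 'delay' does:

```scheme
;; Hypothetical semantics: if 'delay' captured and re-installed the
;; dynamic environment of its creation, then after the traversal below
;; the current input port would remain the (exhausted) string port.
(force-print (foreach->lazy-list for-each-str "4 5 6 7"))
;; Any subsequent (read) in the main program would then hit the string
;; port's EOF instead of reading from the program's real input.
(read)
```
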
> If this inheritance did not happen, the DELAY form would lose modularity (in
> the sense that the one writing the DELAY expression would have to take
> into account all the possible places where the promise is forced).

I guess that is the reason why a non-strict functional language must be pure.