Re: Too much of a good thing?

Per Bothner <per@xxxxxxxxxxx> writes:
> Sergei Egorov wrote:
> > <Flame>
> > I think that most of the operations that generate vectors element-by-element
> > are useless, especially when the performance is the same as in making a list
> > first and then turning it into a vector.
> 
> - which of course is never.

Yeah, you can easily do it *more expensively*.
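For reference, the "list first, then convert" strategy under discussion
looks roughly like this (my own sketch, not code from anyone's library):

;; Accumulate the elements in reverse as a list, then copy them once
;; into a vector of exactly the right size -- at the cost of n extra
;; cons cells along the way.
(define (vector-unfold-via-list f n)
  (let loop ((i 0) (acc '()))
    (if (= i n)
        (list->vector (reverse acc))
        (loop (+ i 1) (cons (f i) acc)))))

;; e.g. (vector-unfold-via-list (lambda (i) (* i i)) 5) => #(0 1 4 9 16)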

ISTM that a resizing+copying implementation, while it may win on cache
locality for small vectors, loses on big vectors. First, you've got the
problem of doubling the allocation of your million-element array when
you're only going to *need* 1,100,000 elements. Second, it's not
entirely clear to me that the cache-locality improvement will
compensate for the element copies you'll be making across the log2(n)
resizes along the way.
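
Concretely, the resize-and-copy constructor I have in mind looks
something like this (again my own sketch; the names are illustrative
only, not SRFI 43's):

;; Grow-by-doubling construction: fill a buffer with (f i) for
;; i = 0 .. n-1, doubling it whenever it fills up, then trim it to
;; the exact size at the end.  Each doubling copies the live prefix,
;; and the buffer is up to 2x over-allocated until the final trim.
(define (vector-resize vec size count)  ; copy first COUNT elements into SIZE slots
  (let ((new (make-vector size)))
    (do ((i 0 (+ i 1)))
        ((= i count) new)
      (vector-set! new i (vector-ref vec i)))))

(define (vector-unfold-by-doubling f n)
  (let loop ((buf (make-vector 4)) (i 0))
    (cond ((= i n) (vector-resize buf n i))          ; final trim copies n elements
          ((= i (vector-length buf))
           (loop (vector-resize buf (* 2 i) i) i))   ; full: double and copy
          (else
           (vector-set! buf i (f i))
           (loop buf (+ i 1))))))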

IMO, the only reasonable vector constructors require a priori
knowledge of the vector size. I still like and use resizable vectors
(my personal library calls them 'dynamic vectors'), but I'm
intentionally making a space/time tradeoff when I use them, and their
time profile is *not* the same as that of R5RS vectors, although they
still offer constant-time access.
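
By way of illustration only (these names aren't my library's actual
interface), a dynamic vector is essentially a backing buffer paired
with a fill count:

;; Pushes are amortized O(1); the price is up to 2x over-allocation
;; and an occasional full copy when the buffer doubles.
(define (make-dvector) (cons (make-vector 4) 0))
(define (dvector-length dv) (cdr dv))
(define (dvector-ref dv i) (vector-ref (car dv) i))

(define (dvector-push! dv x)
  (let ((buf (car dv)) (count (cdr dv)))
    (if (= count (vector-length buf))        ; full: double the buffer
        (let ((new (make-vector (* 2 count))))
          (do ((i 0 (+ i 1)))
              ((= i count))
            (vector-set! new i (vector-ref buf i)))
          (set-car! dv new)))
    (vector-set! (car dv) count x)
    (set-cdr! dv (+ count 1))))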

david rush
-- 
And Visual Basic programmers should be paid minimum wage :)
	-- Jeffrey Straszheim (on comp.lang.functional)