
Re: binary vs non-binary ports

On Sat, 18 Sep 2004 09:31:02 -0700, Per Bothner wrote
> 
> I get nervous when I hear about the "creative process of software
> engineers".

I personally like this human factor, but that is off topic here.

> * Portable programs cannot assume they can do binary i/o
> on ports opened in the default character mode, unless we
> make unreasonable demands on implementors.

How portable must it be? But I think I get the message. Do you want
to use the standard I/O primitives to write binary files?
I've argued that an I/O primitive should determine the interpretation
of the data, not the port; the port only has to handle the
communication or storage medium.
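
As a sketch of what I mean, assuming SRFI 56's proposed write-byte
living next to the standard display on one and the same port (the
record layout here is made up purely for illustration):

    ;; The primitive, not the port, decides how the data is
    ;; interpreted: write-byte emits raw octets, display emits text.
    ;; (Assumes a single-byte encoding for the payload, so that
    ;; string-length matches the byte count.)
    (define (write-tagged-record port tag text)
      (write-byte tag port)                   ; binary: one tag octet
      (write-byte (string-length text) port)  ; binary: length prefix
      (display text port))                    ; textual: the payload

The same port carries both kinds of data; only the operation applied
to it changes.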

> * Most file formats that mix text and binary i/o do *not* handle
> general strings: often they only support whatever character encoding
> the "creative" engineers are most familiar with.

Hmm. That's quite a pessimistic view of the world. Maybe it's more
like: "whatever character encoding is sufficient for the project
at hand". When the time comes that internationalization is necessary,
it is always possible to refactor your program.

> * I/O APIs designed by people unfamiliar with internationalization
> issues often have problems in today's internationalized world.

True.

> * A quick-and-dirty fix is often to specify that strings are in UTF8.

Could be, but that still leaves the trouble of decoding UTF-8
sequences of up to six bytes.
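
To give an idea of the work involved, here is a minimal decoding
sketch. It assumes SRFI 56's read-byte and the original RFC 2279
definition of UTF-8, which permits sequences of up to six bytes;
error handling is only sketched.

    ;; Read one UTF-8 encoded code point from PORT and return it as
    ;; an integer.  Uses quotient/remainder instead of bitwise ops so
    ;; it stays within portable R5RS arithmetic.
    (define (read-utf8-code-point port)
      (define (continuation-bits)
        (let ((b (read-byte port)))
          (if (= (quotient b 64) 2)        ; 10xxxxxx continuation?
              (remainder b 64)             ; keep the low six bits
              (error "malformed UTF-8 continuation byte" b))))
      (define (decode initial count)       ; fold in COUNT more bytes
        (let loop ((acc initial) (n count))
          (if (zero? n)
              acc
              (loop (+ (* acc 64) (continuation-bits)) (- n 1)))))
      (let ((b0 (read-byte port)))
        (cond ((< b0 #x80) b0)                        ; 1 byte: ASCII
              ((< b0 #xC0) (error "stray continuation byte" b0))
              ((< b0 #xE0) (decode (- b0 #xC0) 1))    ; 2 bytes
              ((< b0 #xF0) (decode (- b0 #xE0) 2))    ; 3 bytes
              ((< b0 #xF8) (decode (- b0 #xF0) 3))    ; 4 bytes
              ((< b0 #xFC) (decode (- b0 #xF8) 4))    ; 5 bytes
              ((< b0 #xFE) (decode (- b0 #xFC) 5))    ; 6 bytes
              (else (error "invalid UTF-8 leading byte" b0)))))

Every branch past the ASCII case has to validate and accumulate
continuation bytes, which is exactly the trouble I mean.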


--
Hans Oesterholt-Dijkema