
Re: binary vs non-binary ports



On Sat, 18 Sep 2004 09:31:02 -0700, Per Bothner wrote:
> 
> I get nervous when I hear about the "creative process of software
> engineers".

I personally like this human factor, but that is off topic here.

> * Portable programs cannot assume they can do binary i/o
> on ports opened in the default character mode, unless we
> make unreasonable demands on implementors.

How portable must it be? But I think I get the message. Do you want
to use the standard I/O primitives to write binary files?
I've argued that an I/O primitive should determine the interpretation
of the data, not the port. The port only has to handle the
communication or storage medium.
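
To make concrete what I mean, here is a rough sketch. Nothing in it
is a standard API: make-octet-port, read-octet and read-latin1-char
are names I made up for illustration. The port is modelled as a bare
octet source; the interpretation lives entirely in the primitive you
apply to it:

  ;; Model the port as a pure octet source; here it is just a thunk
  ;; over a list of integers, which is enough for the example.
  (define (make-octet-port bytes)
    (lambda ()
      (if (null? bytes)
          'eof
          (let ((b (car bytes)))
            (set! bytes (cdr bytes))
            b))))

  ;; Binary primitive: hand the octet through untouched.
  (define (read-octet port) (port))

  ;; Textual primitive: interpret the next octet as a Latin-1 character.
  (define (read-latin1-char port)
    (let ((b (port)))
      (if (eq? b 'eof) b (integer->char b))))

  ;; One and the same port serves both primitives:
  (define p (make-octet-port '(42 72 105)))
  (read-octet p)        ; => 42, read as a number
  (read-latin1-char p)  ; => #\H, read as a character
  (read-latin1-char p)  ; => #\i

The port never knows or cares whether it is "binary" or "textual";
only the primitives do.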

> * Most file formats that mix text and binary i/o do *not* handle
> general strings: often they only support whatever character encoding
> the "creative" engineers are most familiar with.

Hmm. That's quite a pessimistic view of the world. Maybe it's more
like: "whatever character encoding is sufficient for the project
at hand". When the time comes that internationalization is necessary,
it is always possible to refactor your program.

> * I/O APIs designed by people unfamiliar with internationalization
> issues often have problems in today's internationalized world.

True.

> * A quick-and-dirty fix is often to specify that strings are in UTF8.

Could be, but that still leaves the trouble of decoding UTF8
sequences of up to six bytes.
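
For reference, that decoding trouble looks roughly like the sketch
below (again nothing standard: read-byte is a thunk yielding octets,
as in my earlier example, and all validation is omitted). The five-
and six-byte forms come from the original RFC 2279 definition of
UTF8, which RFC 3629 has since restricted to four bytes:

  ;; Sequence length, read off the leading byte's high bits.
  (define (utf8-sequence-length b)
    (cond ((< b #x80) 1)      ; 0xxxxxxx: plain ASCII
          ((< b #xC0) #f)     ; 10xxxxxx: continuation byte, invalid lead
          ((< b #xE0) 2)      ; 110xxxxx
          ((< b #xF0) 3)      ; 1110xxxx
          ((< b #xF8) 4)      ; 11110xxx
          ((< b #xFC) 5)      ; 111110xx (RFC 2279 only)
          ((< b #xFE) 6)      ; 1111110x (RFC 2279 only)
          (else #f)))

  ;; Decode one code point; read-byte is a thunk returning octets.
  (define (decode-utf8 read-byte)
    (let* ((b0  (read-byte))
           (len (utf8-sequence-length b0)))
      (if (= len 1)
          b0
          ;; Payload bits of the lead byte, then 6 bits per
          ;; continuation byte.
          (let loop ((i 1) (cp (modulo b0 (expt 2 (- 7 len)))))
            (if (= i len)
                cp
                (loop (+ i 1)
                      (+ (* cp 64) (modulo (read-byte) 64))))))))

  ;; (decode-utf8 (make-octet-port '(#xC3 #xA9))) => 233, i.e. U+00E9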


--
Hans Oesterholt-Dijkema