oleg-at-pobox.com |haskell-cafe| wrote:
> Yang wrote:
>> (Something like this is straightforward to build if I abandon
>> Concurrent Haskell and use cooperative threading, and if the
>> operations I wanted to perform could be done asynchronously.)
> All operations could be done asynchronously, at least on Linux and
> many Unixes:
> http://www.usenix.org/events/usenix04/tech/general/full_papers/elmeleegy/elmeleegy_html/index.html
Thanks for this pointer.
>> (Something like this is straightforward to build if I abandon
>> Concurrent Haskell and use cooperative threading, and if the
>> operations I wanted to perform could be done asynchronously.)
> That seems like a very good idea. You might be aware that well-respected
> and well-experienced systems researchers call for *abandoning*
> threads. Threads are just a very bad model.
> The Problem with Threads
> Edward A. Lee
> http://www.eecs.berkeley.edu/Pubs/TechRpts/2006/EECS-2006-1.html
> Also, IEEE Computer, May 2006, pp. 33-42.
> From the abstract:
> ``Although threads seem to be a small step from sequential computation,
> in fact, they represent a huge step. They discard the most essential
> and appealing properties of sequential computation: understandability,
> predictability, and determinism. Threads, as a model of computation,
> are wildly nondeterministic, and the job of the programmer becomes one
> of pruning that nondeterminism. Although many research techniques
> improve the model by offering more effective pruning, I argue that
> this is approaching the problem backwards. Rather than pruning
> nondeterminism, we should build from essentially deterministic,
> composable components. Nondeterminism should be explicitly and
> judiciously introduced where needed, rather than removed where not
> needed. The consequences of this principle are profound. I argue for
> the development of concurrent coordination languages based on sound,
> composable formalisms. I believe that such languages will yield much
> more reliable, and more concurrent programs.''
I had read this not long ago. While the bulk of the paper argues for
determinism, my understanding is that he doesn't ultimately advocate
tossing out threads per se; he approves of using them for dataflow
(message passing) with state isolation.
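That actor-oriented, state-isolated style maps directly onto Concurrent
Haskell: give each thread purely local state and let it talk to the rest
of the program only through channels. A minimal sketch (the `doubler`
actor and the three-message protocol are mine, purely for illustration):

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.Chan (Chan, newChan, readChan, writeChan)
import Control.Monad (forM_, forever, replicateM)

-- A tiny "actor": all of its state is local, and its only interaction
-- with the outside world is through its input and output channels.
doubler :: Chan Int -> Chan Int -> IO ()
doubler inbox outbox = forever $ do
  x <- readChan inbox
  writeChan outbox (2 * x)

main :: IO ()
main = do
  inbox  <- newChan
  outbox <- newChan
  _ <- forkIO (doubler inbox outbox)
  forM_ [1, 2, 3] (writeChan inbox)
  ys <- replicateM 3 (readChan outbox)
  print ys  -- [2,4,6]
```

With a single writer and a single reader on each FIFO channel, the
interleaving is invisible: the output is deterministic, which is exactly
the property the paper is after.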
"This style of concurrency is, of course, not new. Component
architectures where data flows through components (rather than control)
have been called “actor-oriented” [35]. These can take many forms. Unix
pipes resemble PN, although they are more limited in that they do not
support cyclic graphs. Message passing packages like MPI and OpenMP
include facilities for implementing rendezvous and PN, but in a less
structured context that emphasizes expressiveness rather than
determinacy. A naive user of such packages can easily be bitten by
unexpected nondeterminacy. Languages such as Erlang [4] make message
passing concurrency an integral part of a general-purpose language.
Languages such as Ada make rendezvous an integral part. Functional
languages [30] and single-assignment languages also emphasize
deterministic computations, but they are less explicitly concurrent, so
controlling and exploiting concurrency can be more challenging. Data
parallel languages also emphasize determinate interactions, but they
require low-level rewrites of software."
> I believe that delimited continuations is a good way to build
> coordination languages, because delimited continuations let us build a
> sound model of computation's interaction with its context.
>
Aren't safepoints (+ no shared state) enough to tame this issue? What
visible difference is there between threads with safepoints and
delimited continuations?
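They are certainly close: a thread with safepoints is precisely a
computation that, at each safepoint, captures "the rest of itself" as a
delimited continuation and hands it to a scheduler. A sketch of that
correspondence, assuming the transformers package's
Control.Monad.Trans.Cont (`yield`, `runNext` and `spawn` are my own
names, not library functions):

```haskell
import Control.Monad.Trans.Class (lift)
import Control.Monad.Trans.Cont (ContT, evalContT, shiftT)
import Data.IORef (IORef, modifyIORef, newIORef, readIORef, writeIORef)

type Queue = IORef [IO ()]

-- A safepoint: capture the remainder of this "thread" as a delimited
-- continuation, append it to the run queue, and run whoever is next.
yield :: Queue -> ContT () IO ()
yield q = shiftT $ \k -> lift $ do
  modifyIORef q (++ [k ()])
  runNext q

-- Pop and run the next queued continuation, if any.
runNext :: Queue -> IO ()
runNext q = do
  ts <- readIORef q
  case ts of
    []       -> return ()
    t : rest -> writeIORef q rest >> t

-- Enqueue a cooperative thread; evalContT is the delimiter that
-- yield's shiftT captures up to.
spawn :: Queue -> ContT () IO () -> IO ()
spawn q task = modifyIORef q (++ [evalContT (task >> lift (runNext q))])

main :: IO ()
main = do
  q <- newIORef []
  spawn q (lift (putStrLn "A1") >> yield q >> lift (putStrLn "A2"))
  spawn q (lift (putStrLn "B1") >> yield q >> lift (putStrLn "B2"))
  runNext q  -- prints A1, B1, A2, B2
```

With no shared mutable state between tasks this behaves just like
cooperative threading; the observable difference is that the suspended
thread is a first-class value, so a coordination layer can reorder,
duplicate, or drop it explicitly instead of being at the mercy of a
preemptive scheduler.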
Another reason for separate threads is that they can run on separate OS
threads (cores), thus exploiting parallelism.
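True, and with GHC's threaded runtime (compile with -threaded, run with
+RTS -N) forkIO threads are multiplexed over OS threads and can be
scheduled on separate cores. A minimal fork-and-join sketch, joining
results through MVars (the two toy computations are just placeholders):

```haskell
import Control.Concurrent (forkIO)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, takeMVar)

-- Two independent computations run in forked threads; ($!) forces each
-- result inside its own thread (otherwise putMVar would store an
-- unevaluated thunk and the work would happen in the main thread).
-- The join through MVars keeps the final output deterministic even
-- though the threads may run in parallel.
main :: IO ()
main = do
  a <- newEmptyMVar
  b <- newEmptyMVar
  _ <- forkIO (putMVar a $! sum [1 .. 1000000 :: Integer])
  _ <- forkIO (putMVar b $! product [1 .. 10 :: Integer])
  x <- takeMVar a
  y <- takeMVar b
  print (x, y)  -- (500000500000,3628800)
```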