Describing resources and their relationships with organisms is a useful approach to a ‘unified ecology’, helping to fill the gap between natural and human-oriented processes and opening new perspectives for dealing with biological complexity. This Resource Criterion defines the main properties of resources, describes the mechanisms that link them to individual species, and gives particular emphasis to the biosemiotic approach that allows resources to be identified inside a heterogeneous ecological medium by adopting the eco-field model. In particular, this Criterion makes it possible to couple the matter, structured energy and information composing ecological systems to the biosemiotic and cognitive mechanisms adopted by individual species to track resources, transforming neutral surroundings into meaningful species-specific Umwelten. The expansion of the human semiotic niche, a significant evolutionary process of the present time, makes the Resource Criterion a powerful and efficient tool for evaluating the effects of the intrusion into natural systems of humans, who behave like a keystone species, under the challenge of the growing use of allochthonous, immaterial and symbolic resources in today's globalized societal models. The Resource Criterion interprets ecological dynamics, contributing to completing the epistemology of ecology, opening a bridge toward economics and other social sciences, and helping to formulate a General Theory of Resources.

We introduce a new “positive formalism” for encoding quantum theories in the general boundary formulation, somewhat analogous to the mixed state formalism of the standard formulation. This makes the probability interpretation more natural and elegant, eliminates operationally irrelevant structure and opens the general boundary formulation to quantum information theory.

This paper reorganizes and further develops the theory of partial meet contraction which was introduced in a classic paper by Alchourrón, Gärdenfors, and Makinson. Our purpose is threefold. First, we put the theory in a broader perspective by decomposing it into two layers which can respectively be treated by the general theory of choice and preference and by elementary model theory. Second, we reprove the two main representation theorems of AGM and present two more representation results for the finite case that "lie between" the former, thereby partially answering an open question of AGM. Our method of proof is uniform insofar as it uses only one form of "revealed preference", and it explains where and why the finiteness assumption is needed. Third, as an application, we explore the logic characterizing theory contractions in the finite case which are governed by the structure of simple and prioritized belief bases.

Universal Logic is not a new logic, but a general theory of logics, considered as mathematical structures. The name was introduced about ten years ago, but the subject is as old as the beginning of modern logic: Alfred Tarski and other Polish logicians such as Adolf Lindenbaum developed a general theory of logics at the end of the 1920s based on consequence operations and logical matrices. The subject was revived after the flowering of thousands of new logics during the last thirty years: there was a need for a systematic theory of logics to put some order into this chaotic multiplicity. This book contains recent work on universal logic by first-class researchers from all around the world. The book is full of new and challenging ideas that will guide the future of this exciting subject. It will be of interest to people who want to better understand what logic is. Tools and concepts are provided here for those who want to study classes of already existing logics or want to design and build new ones.

This chapter sets forth a general theory of gender stratification. While both biological and ideological variables are taken into account, the emphasis is structural: it is proposed that the major independent variable affecting sexual inequality is each sex's economic power, understood as relative control over the means of production and the allocation of surplus. For women, relative economic power is seen as varying, and not always in the same direction, at a variety of micro- and macro-levels, ranging from the household to the state. A series of propositions links the antecedents of women's relative economic power, the interrelationship between economic and other forms of power, and the forms of privilege and opportunity into which each gender can translate its relative power.

Institutional differentiation has been one of the central concerns of sociology since the days of Auguste Comte. However, the overarching tendency among institutionalists such as Durkheim or Spencer has been to treat the process of differentiation from a macro, "outside in" perspective. Missing from this analysis is how institutional differentiation occurs from the "inside out," or through the efforts and struggles of individual and corporate actors. Despite the recent efforts of the "new institutionalism" to fill in this gap, a closer look at the literature uncovers the fact that (1) it has tended to conflate macro-level institutions and meso-level organizations and (2) this has led to a taken-for-granted approach to institutional dynamics. This article seeks to develop a general theory of institutional autonomy: autonomy is a function of the degree to which specialized corporate units are structurally and symbolically independent of other corporate units. It is argued herein that the process by which these "institutional entrepreneurs" become independent can explain how institutions become differentiated from the "inside out." Moreover, this article offers five dimensions that can be operationalized to measure the degree to which institutions are autonomous.

In recent years there has been an outpouring of work at the intersection of social movement studies and organizational theory. While we are generally in sympathy with this work, we think it implies a far more radical rethinking of structure and agency in modern society than has been realized to date. In this article, we offer a brief sketch of a general theory of strategic action fields (SAFs). We begin with a discussion of the main elements of the theory, describe the broader environment in which any SAF is embedded, consider the dynamics of stability and change in SAFs, and end with a respectful critique of other contemporary perspectives on social structure and agency.

The philosophical literature on time and change is fixated on the issue of whether the B-series account of change is adequate or whether real change requires Becoming of either the property-based variety of McTaggart's A-series or the non-property-based form embodied in C. D. Broad's idea of the piling up of successive layers of existence. For present purposes it is assumed that the B-series suffices to ground real change. But then it is noted that modern science, in the guise of Einstein's general theory, poses a threat to real change by implying that none of the genuine physical magnitudes countenanced by the theory changes its value with time. The aims of this paper are to explain how this seemingly paradoxical conclusion arises and to assess the merits and demerits of possible reactions to it.

How can the propositional attitudes of several individuals be aggregated into overall collective propositional attitudes? Although there are large bodies of work on the aggregation of various special kinds of propositional attitudes, such as preferences, judgments, probabilities and utilities, the aggregation of propositional attitudes is seldom studied in full generality. In this paper, we seek to contribute to filling this gap in the literature. We sketch the ingredients of a general theory of propositional attitude aggregation and prove two new theorems. Our first theorem simultaneously characterizes some prominent aggregation rules in the cases of probability, judgment and preference aggregation, including linear opinion pooling and Arrovian dictatorships. Our second theorem abstracts even further from the specific kinds of attitudes in question and describes the properties of a large class of aggregation rules applicable to a variety of belief-like attitudes. Our approach integrates some previously disconnected areas of investigation.
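Of the rules mentioned, linear opinion pooling has a particularly concrete form: the collective probability assigned to each proposition is a weighted average of the individuals' probabilities, and an Arrovian dictatorship is the degenerate case in which one individual carries all the weight. A minimal sketch (the credences and weights below are illustrative, not taken from the paper):

```python
# Linear opinion pooling: the collective probability for each proposition
# is a weighted average of the individuals' probabilities. Weights are
# non-negative and sum to 1; an "Arrovian dictatorship" is the degenerate
# case where one individual's weight is 1 and all others are 0.

def pool(probabilities, weights):
    """Combine individual probability assignments into a collective one."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    propositions = probabilities[0].keys()
    return {
        p: sum(w * prob[p] for w, prob in zip(weights, probabilities))
        for p in propositions
    }

# Three individuals' credences in a proposition A and its negation.
individuals = [
    {"A": 0.9, "not-A": 0.1},
    {"A": 0.6, "not-A": 0.4},
    {"A": 0.3, "not-A": 0.7},
]

# Pooled credence in A: 0.5*0.9 + 0.3*0.6 + 0.2*0.3 = 0.69
collective = pool(individuals, weights=[0.5, 0.3, 0.2])
print(collective)

# Dictatorship: all weight on individual 0 reproduces their credences.
dictator = pool(individuals, weights=[1.0, 0.0, 0.0])
print(dictator)
```

Note that the pooled assignment is again a probability function (the weighted averages over A and not-A still sum to 1), which is part of what makes this rule attractive in the probabilistic case.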

This work represents an attempt to stake out the landscape for dynamicism based on a radical dismissal of the information-processing paradigm that dominates the philosophy of cognitive science. In Section 2, after setting up the basic toolkit of a theory of minimal representationalism, I introduce the central tenets of dynamic systems theory (DST) by discussing recent research in the dynamics of embodiment (Thelen et al. [2001]) in the perseverative-reaching literature. I then review a recent proposal on the dynamics of representation, the dynamic field approach (Spencer and Schöner [2003]), according to which the alleged representational gap between DST and representational theories of cognition needs to be bridged in order to explain higher-order cognitive activity. In Section 3, I shall argue that Spencer and Schöner's attempt to bridge the representational gap may jeopardize the whole (antirepresentationalist) spirit of the DST project. In order to show why, I shall introduce the key concepts of reliability of environment and primagenesis, and argue that DST can account for de-coupled, offline cognitive activity with no need to posit representational resources. Conclusions and directions for future research will follow.

Community ecology entered the 1970s with the belief that niche theory would supply a general theory of community structure. The lack of widespread empirical support for niche theory led to a focus on models specific to classes of communities such as lakes, intertidal communities, and forests. Today, the needs of conservation biology for metrics of “ecological health” that can be applied across types of communities prompt a renewed interest in the possibility of a general theory for community ecology. Disputes about the existence of general patterns in community structure trace at least to the 1920s and continue today almost unchanged in concept, although now expressed through mathematical modeling. Yet a new framework emerged in the 1980s from findings that community composition and structure depend as much on the processes that bring species to the boundaries of a community as on processes internal to a community, such as species interactions and co-evolution. This perspective, termed “supply-side ecology”, argued that community ecology should be viewed as an “organic earth science” more than as a biological science. The absence of a general theory of the earth would then imply a corresponding absence of any general theory for the communities on the earth, and imply that the logical structure of theoretical community ecology would consist of an atlas of models specific to place and geologic time. Nonetheless, a general theory of community ecology is possible, similar in form to the general theory of evolution, if the processes that bring species to the boundary of a community are analogized to mutation, and the processes that act on the species that arrive at a community are analogized to selection. All communities then share some version of this common narrative, permitting general theorems to be developed pertaining to all ecological communities.
Still, the desirability of a general theory of community ecology is debatable, because the existence of a general theory suppresses diversity of thought even as it allows generalizations to be derived. The pros and cons of a general theory need further discussion.

The numerical representations of measurement, geometry and kinematics are here subsumed under a general theory of representation. The standard theories of meaningfulness of representational propositions in these three areas are shown to be special cases of two theories of meaningfulness for arbitrary representational propositions: the theories based on unstructured and on structured representation respectively. The foundations of the standard theories of meaningfulness are critically analyzed and two basic assumptions are isolated which do not seem to have received adequate justification: the assumption that a proposition invariant under the appropriate group is therefore meaningful, and the assumption that representations should be unique up to a transformation of the appropriate group. A general theory of representational meaningfulness is offered, based on a semantic and syntactic analysis of representational propositions. Two neglected features of representational propositions are formalized and made use of: (a) that such propositions are induced by more general propositions defined for other structures than the one being represented, and (b) that the true purpose of representation is the application of the theory of the representing system to the represented system. On the basis of these developments, justifications are offered for the two problematic assumptions made by the existing theories.

An examination of time as featured in the General Theory of Relativity, which supersedes Einstein's Special Theory, serves to rekindle the issue of the existence of absolute time. In application to cosmology, Einstein's General Theory yields models of the universe featuring a worldwide time which is the same for all observers in the universe regardless of their relative motion. Such a cosmic time is a rough physical measure of Newton's absolute time, which is grounded ontologically in the duration of God's being and is more or less accurately recorded by physical clocks.

I present a general theory of abstraction operators which treats them as variable-binding term-forming operators, and provides a reasonably uniform treatment of definite descriptions, set abstracts, natural number abstraction, and real number abstraction. This minimizing, extensional and relational theory reveals a striking similarity between definite descriptions and set abstracts, and provides a clear rationale for the claim that there is a logic of sets (which is ontologically non-committal). The theory also treats both natural and real numbers as answering to a two-fold process of abstraction. The first step, of conceptual abstraction, yields the object occupying a particular position within an ordering of a certain kind. The second step, of objectual abstraction, yields the number sui generis, as the position itself within any ordering of the kind in question.

The conclusions derived by Keynes in his Treatise on Probability (1921) concerning induction, analogical reasoning, expectations formation and decision making mirror and foreshadow the main conclusions of cognitive science and psychology. The problem of weight is studied within an economic context by examining the role it played in Keynes' applied philosophy work, The General Theory (1936). Keynes' approach is then reformulated as an optimal control approach to dealing with changes in information evaluation over time. Based on this analysis, the problem of inductive justification, from a societal perspective, is not "What can we rationally believe will occur in the economic future, given our past experiences?" but "Can we make the future so as to attain specific economic goals with practical certainty?" An answer requires that restrictions be placed on the methodological individualist approach and the acceptance of a restricted holistic approach.

The considerations of Part I are extended and the experimental data and hypotheses that led to the establishment of the general theory of relativity are analyzed. It is found that one of the fundamental assumptions is that light is propagated homogeneously; i.e., by using arbitrary systems of coordinates, the propagation of light can be represented by a homogeneous quadratic form. This is shown to be an assumption that can be verified by experiment, at least in principle. As a result of adding a number of further assumptions to this, the usual formalism of the general theory of relativity can be established. In the above point of view, the general theory of relativity—like any other theory—cannot be built up ad hoc, but is built on distinct physical hypotheses, each of which can be subjected to test by experiment.

The purpose of this paper is to present in a uniform way the commutator theory for k-deductive systems of arbitrary positive dimension k. We are interested in the logical perspective of the research — an emphasis is put on an analysis of the interconnections holding between the commutator and logic. This research thus qualifies as belonging to abstract algebraic logic, an area of universal algebra that explores to a large extent the methods provided by the general theory of deductive systems. In the paper the new term ‘commutator formula’ is introduced. The paper is concerned with the meanings of the above term in the models provided by the commutator theory and clarifies the contexts in which these meanings occur. The work is presented in an abstracted form: the main ideas are outlined but proofs are deferred to the second part of the paper.

Hans Kelsen is considered by many to be the foremost legal thinker of the twentieth century. During the last decade of his life he was working on what he called a general theory of norms. Published posthumously in 1979 as Allgemeine Theorie der Normen, the book is here translated for the first time into English. Kelsen develops his "pure theory of law" into a "general theory of norms", and analyzes the applicability of logic to norms to offer an original and extreme position which some have called "normative irrationalism". Examining the views of over 200 philosophers and legal theorists on law, morality, and logic, and revising several of his own earlier positions, Kelsen's final work is a mandatory resource for legal and moral philosophers.

We formulate a general theory of conservation laws and other invariants for a physical system through equivalence relations. The conservation laws are classified according to the type of equivalence relation, with group equivalence, homotopical equivalence, and other types of equivalence relations giving respective kinds of conservation laws. The stability properties in the topological (and differentiable) sense are discussed using continuous deformations with respect to control parameters. The conservation laws due to the Abelian symmetries are shown to be stable through application of well-known theorems.

This paper seeks to make a contribution toward a general theory of responsibility and irresponsibility. Such a theory, or framework or model, addresses the relationship between responsibility and irresponsibility. The motive for the effort is that the literature on business ethics, corporate citizenship, and corporate social responsibility combines negative prohibitions with positive requirements, at both individual and organizational levels of action. A prohibition takes the form "do not", expressed in laws and ethics. A requirement takes the form "should" or "ought", expressed in theories of responsibility and stakeholder engagement. Armstrong (1977) points out that actually preventing harm may be socially much more valuable than promoting contribution.

Relativity is the most important scientific idea of the twentieth century. Albert Einstein is the unquestioned founder of modern physics; his Special and General theories of Relativity introduced the idea to the world. In this classic short book he explains clearly, using a minimum of mathematical terms, the basic ideas and principles of his theory of Relativity. Unsurpassed by any subsequent book on Relativity, this remains the most popular and useful exposition of Einstein's immense contribution to human knowledge.

Human knowledge is a phenomenon whose roots extend from the cultural, through the neural and the biological, and finally all the way down into the Precambrian “primordial soup.” The present paper reports an attempt at understanding this Greater System of Knowledge (GSK) as a hierarchically nested set of selection processes acting concurrently on several different scales of time and space. To this end, a general selection theory extending mainly from the work of Hull and Campbell is introduced. The perhaps most drastic change from previous similar theories is that replication is revealed as a composite function consisting of what is referred to as memory and synthesis. This move is argued to drastically improve the fit between theory and human-related knowledge systems. The introduced theory is then used to interpret the subsystems of the GSK and their interrelations, in order to demonstrate some of the new perspectives offered by this view.

Dayal’s (2004) theory of kind terms accounts for the definiteness and number marking patterns in kind terms in many languages. Brazilian Portuguese has been claimed to be a counter-example to her theory, as it seems to allow bare "singular" kind terms, which are predicted to be impossible according to her theory. However, the empirical status of the relevant data has not been clear so far. This paper presents a new data point from Singlish and confirms the existence of bare "singular" kind terms. A revised theory of kind terms is proposed that accounts for it. The proposed theory puts forth a number system with three basic categories, i.e. singular, plural and general. It is claimed that bare "singular" kind terms are in fact derived from general NPs, which are associated with number-neutral properties. The paper also discusses why bare "singular" kind terms are not perfectly acceptable in Brazilian Portuguese.

Part of Keynes's 'struggle of escape from habitual modes of thought and expression' (Keynes 1960: viii) involves an implicit attempt to break with the methodology as well as the theory of the past. Unfortunately, the rhetorical strategy Keynes adopted in The General Theory blurred this attempt. As a result, it is only by examining both the methodology and the rhetoric embedded in this work that it becomes possible to understand the book as a coherent whole. This paper demonstrates the validity of taking such an approach.

Is one of the roles of theory in biology answering the question “What is life?” This is true of theory in many other fields of science. So why should it not be the case for biology? Yet efforts to identify unifying concepts and principles of life have been disappointing, leading some (pluralists) to conclude that life is not a natural kind. In this essay I argue that such judgments are premature. Life as we know it on Earth today represents a single example, and moreover there is positive evidence that it may be unrepresentative of life considered generally. Furthermore, as I discuss, the prototype for theorizing about life has traditionally been based on multicellular plants and animals. Yet biologists have discovered that the latter represent a rare, exotic, and fairly recent form of Earth life. By far the oldest, toughest, most extensive, and most diverse form of life on our planet is unicellular, prokaryotic microbes, and there are reasons to suppose that this is almost certainly true elsewhere in the universe as well. If there are explanatorily and predictively powerful, biologically distinctive principles for life that can be gleaned from our insular example of life, it is more likely that they will be found among the microbes. I discuss some provocative ways in which unicellular microbes differ from multicellular eukaryotes and argue that some of them just might provide us with key insights into the nature of life.

The article seeks a third way in normative ethics between consequentialism or utilitarianism and deontology or Kantianism. To find such a third way, one has to analyze the elements of these classical theories and see whether they are justified. In this article it is argued that an adequate normative ethics has to contain the following five elements: (1) normative individualism, i.e., the view that in the last instance moral norms and values can only be justified by reference to the individuals concerned, as its basis; (2) consideration of the individuals’ concerns and interests—aims, desires, needs, strivings—insofar as they have a justificatory function; (3) a pluralism of references of these concerns, and hence of moral norms and values, to all possible elements of actions; (4) the necessity of a principle of aggregation and weighing with regard to these concerns; (5) finally, as a central principle of aggregation and weighing, the principle of relative reference to self and others, operating as a generalizing meta-principle that guides the application of concrete principles and decisions.

David Lewis famously argued against structural universals since they allegedly required what he called a composition "sui generis" that differed from standard mereological composition. In this paper it is shown that, although traditional Boolean mereology does not describe parthood and composition in full generality, a better and more comprehensive theory is provided by the foundational theory of categories. In this category-theoretical framework a theory of structural universals can be formulated that overcomes the conceptual difficulties that Lewis and his followers regarded as insurmountable. As a concrete example of structural universals, groups are considered in some detail.

The number of studies of natural and artificial mechanisms of learning is rapidly increasing. However, there is no general theory of learning that could provide a unifying basis for exploring the different directions in this growing field. For a long time the development of such a theory has been hindered by the nativists' belief that the development of a biological organism during ontogeny should be viewed as the parameterization of an innate structure, encoded in the genome, by an innate algorithm, with nothing essentially new created during this process. Noam Chomsky has claimed, therefore, that the creation of a non-trivial general mathematical theory of learning is not feasible, since an algorithm cannot produce a more complex algorithm. This study refutes the above argumentation by developing a counter-example based on the mathematical theory of algorithms and computable functions. It introduces the novel concept of a Universal Learning System (ULS), capable of learning to control in an optimal way any given constructive system from a certain class. The necessary conditions for the existence of a ULS and its main functional properties are investigated. The impossibility of building an algorithmic ULS for a sufficiently complex class of controlled objects is shown, and a proof of the existence of a non-algorithmic ULS based on the axioms of classical mathematics is presented. It is argued that a non-algorithmic ULS is a legitimate object not only of mathematics, but also of the world of nature. These results indicate that an algorithmic description of the organization and adaptive development of biological systems in general is not sufficient. At the same time, it is possible to create a rigorous non-algorithmic general theory of learning as a theory of ULS. The use of this framework for integrating learning-related studies is discussed.

In this article, I try to do two things. First I analyse critically the suggestion that the principles of criminal culpability can be explained by reference to a single, all-encompassing concept, such as “defiance of the law”. I then go on to explain the foundations of criminal culpability by reference to three interlocking theories — the capacity theory, the character theory, and the agency theory. I conclude that even these three theories may not be sufficient to explain the complex structure of culpability, which is shaped as much by shared cultural understanding as by moral theory.

The review argues that Lovett’s theory of domination suffers from a problem. Lovett is aware of the problem and bites a fairly large bullet in response to it. What he does not seem aware of is that the problem can be avoided by opting for an account of welfare that he unfortunately ignores, despite the fact that it would serve his purposes well.

A unified theory is offered to account for three types of definite descriptions: those with singular, plural, and mass predicates, and to provide an account of the word the in descriptions. It is noted that B. Russell's analysis ("On Denoting," Mind, 1905, 14, 479-493) failed to account for plural and mass descriptions. The proposed theory differs from Russell's only by the substitution of the relation ≤ for Russell's =. It is suggested that for every predicate G there is an appropriate "part of" or "some of" relation ≤ on the extension of G. For singular predicates, ≤ becomes just the identity relation =. The description "the G" is analyzed with the formula (℩x)(Gx . (Gy ⊃ y ≤ x)), which accounts for all cases of definite descriptions. It is suggested that the primary use of the is not to indicate uniqueness, but totality. C. Ornatowski.
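The contrast with Russell can be made explicit in symbols. The following is a reconstruction from the abstract (using the iota description operator and the horseshoe of material implication; the paper's exact notation may differ):

```latex
% Russell-style analysis of "the G": uniqueness is secured by
% the identity relation =.
(\iota x)\bigl(Gx \wedge \forall y\,(Gy \supset y = x)\bigr)

% The proposed generalization: = is replaced by the predicate-relative
% "part of" / "some of" relation \leq, so "the G" denotes the totality
% of the G's rather than a unique G.
(\iota x)\bigl(Gx \wedge \forall y\,(Gy \supset y \leq x)\bigr)
```

Since for singular predicates ≤ collapses to =, the second formula recovers Russell's analysis as a special case while also covering plural and mass descriptions.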

Modularity theorists have challenged the claim that there are, or could be, general learning mechanisms that explain theory-of-mind development. In response, supporters of the ‘scientific theory-theory’ account of theory-of-mind development have appealed to children's use of auxiliary hypotheses and probabilistic causal modeling. This article argues that these general learning mechanisms are not sufficient to meet the modularist's challenge. The article then explores an alternative domain-general learning mechanism, proposing that children grasp the concept belief through the progressive alignment of relational structure that occurs as a result of structural comparison. The article also explores the implications of the proposed account for Fodor's puzzle of conceptual learning.

This paper develops a method for extracting from data the quantum theoretical state representation belonging to any reproducible empirical scheme for preparing a physical system, provided only that at least one observable has its possible values limited to a finite set. In Part I, we formulate a general systematic procedure, based on the concept of irreducible tensor operators, for the selection of sets of observables sufficiently large to permit the unambiguous determination of an unknown quantum state.
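As a toy illustration of determining an unknown state from a sufficiently large set of observables (a single-qubit sketch, where the Pauli matrices play the role that irreducible tensor operators play in the paper's general procedure; this is not the paper's own formalism):

```python
import numpy as np

# Pauli operators: for a spin-1/2 system their expectation values
# fix the density matrix completely.
I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def reconstruct(rho_true):
    """Recover a qubit density matrix from three Pauli expectation values."""
    expvals = [np.trace(rho_true @ P).real for P in (X, Y, Z)]
    # rho = (I + <X> X + <Y> Y + <Z> Z) / 2
    return (I + sum(v * P for v, P in zip(expvals, (X, Y, Z)))) / 2

rho = np.array([[0.75, 0.25], [0.25, 0.25]], dtype=complex)  # a valid state
assert np.allclose(reconstruct(rho), rho)  # three observables suffice
```

The set {X, Y, Z} is "sufficiently large" in the paper's sense: any smaller set of these observables leaves the state underdetermined, since a qubit state has three real parameters.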

In this paper we show that the three main equations used by Bohm in his approach to quantum mechanics are already contained in the earlier paper by Moyal which forms the basis for what is known as the Wigner-Moyal approach. This shows, contrary to the usual perception, that there is a deep relation between the two approaches. We suggest the relevance of this result to the more general problem of constructing a quantum geometry.

From the contention that no worldview can be both consistent and complete is derived the insight that a worldview is contextually dependent on past worldviews that it both transcends and includes. Mādhyamika Buddhism illustrates the deconstructive aspect of this thesis--namely, that worldviews claiming completeness or independence are inconsistent. Process philosophy, on the other hand, is a theory that describes reality as the ongoing process of asymmetrical transcendence and inclusion of worldviews as perspectival events. It is argued that both Mādhyamika and process philosophies can be used to formulate a trans-cultural theory of worldviews that is both classificatory and evaluative.

There are several areas in logic where the monotonicity of the consequence relation fails to hold. Roughly, these are the traditional non-monotonic systems arising in Artificial Intelligence (such as defeasible logics, circumscription, defaults, etc.), numerical non-monotonic systems (probabilistic systems, fuzzy logics, belief functions), resource logics (also called substructural logics, such as relevance logic, linear logic, and the Lambek calculus), and the logic of theory change (also called belief revision; see Alchourrón, Gärdenfors, Makinson [2224]). We are seeking a common axiomatic and semantical approach to the notion of consequence which can be specialised to any of the above areas. This paper introduces the notions of structured consequence relation, shift operators, and structural connectives, and shows an intrinsic connection between the above areas.
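The failure of monotonicity common to these areas can be seen in a minimal defeasible-inference sketch (a toy example using the standard bird/penguin default, not any of the cited formalisms):

```python
def consequences(facts):
    """Toy defeasible inference: birds fly by default, penguins never do."""
    derived = set(facts)
    if "bird" in derived and "penguin" not in derived:
        derived.add("flies")          # default rule, defeasible
    if "penguin" in derived:
        derived.add("bird")           # strict rule: penguins are birds
        derived.discard("flies")      # the exception overrides the default
    return derived

# Monotonicity would require: A ⊆ B implies Cn(A) ⊆ Cn(B). It fails:
assert "flies" in consequences({"bird"})
assert "flies" not in consequences({"bird", "penguin"})
```

Adding a premise retracts a conclusion, which is exactly the behavior a classical Tarskian consequence relation forbids and a structured consequence relation must accommodate.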

This work represents an attempt to stake out the landscape for dynamicism based on a radical dismissal of the information-processing paradigm that dominates the philosophy of cognitive science. In Section 2, after setting up the basic toolkit of a theory of minimal representationalism, I introduce the central tenets of dynamic systems theory (DST) by discussing recent research in the dynamics of embodiment (Thelen et al. [2001]) in the perseverative-reaching literature. I then review a recent proposal on the dynamics of representation, the dynamic field approach (Spencer and Schöner [2003]), according to which the alleged representational gap between DST and representational theories of cognition needs to be bridged in order to explain higher-order cognitive activity. In Section 3 I shall argue that Spencer and Schöner's attempt to bridge the representational gap may jeopardize the whole (antirepresentationalist) spirit of the DST project. In order to show why, I shall introduce the key concepts of "reliability of environment" and "primagenesis", and argue that DST can account for de-coupled, offline cognitive activity with no need of positing representational resources. Conclusions and directions for future research will follow.