
Friday, 29 November 2019

Epistemology and the History of Science: The Problem of Historical Epistemology in the Italian Debate of the Twentieth Century

Abstract

The essay, based in part on unpublished writings, analytically reconstructs the Italian debate concerning the problem of historical epistemology and the different relationships that can be established between epistemological reflection and the history of science. We start from the awareness, à la Lakatos, that a “history of science without philosophy of science is blind, while a philosophy of science without the history of science is empty”. During the twentieth century, however, different theoretical positions emerged in Italy. Giulio Preti began by underlining how the history of science should be understood as the history of scientific thought. This position was close to that expressed by Giovanni Gentile, for whom the history of science had to be reduced to the history of philosophy. Against this neo-idealist claim, an epistemologist like Ludovico Geymonat reacted by underlining how science has its own history as a science. The critical debate between Preti and Geymonat finally led the former to underline how the history of science must be articulated in different conceptual traditions, while the latter ended up sharing the need to study the history of science as a history of scientific thought.

Aristotle’s Doctrine of Causes and the Manipulative Theory of Causality

Abstract

I will argue for the similarity between some aspects of Aristotle’s doctrine of causes and a particular kind of interventionist theory of causality. The interventionist account hypothesizes that there is a connection between causation and human intervention: the idea of a causal relation between two events is generated by human beings’ reflection on their own operating. This view is reminiscent of the Aristotelian concept of αἴτιον (cause), which is linked to the figure of the αἴτιος, the person who is responsible for an action. Aristotle conceives of the efficient cause as the active element which, in the φύσις, gives rise to movement and imposes the form, in analogy with the active element that carries out production in τέχνη: the craftsman. This analogy suggests that Aristotle conceives of causation on the basis of the human ability to purposefully modify the environment. Within the debate on the manipulative theory, the classical accounts worked out by Collingwood, Gasking and von Wright have recently been criticized by Woodward. Von Wright’s reductive account explains causation on the basis of human free action, while Woodward regards this reduction as a dangerous move which makes causal explanations too anthropomorphic. I will show that Aristotle’s doctrine of causes is closer to von Wright’s account and that the Aristotelian analysis of becoming supports von Wright’s reductive interventionism against Woodward’s criticism. Furthermore, I will draw a comparison between interventionism and dispositionalism, the latter being another contemporary account of causation that claims Aristotelian roots.

Linking Mind to Molecular Pathways: The Role of Experiment Tools

Abstract

Neurobiologists talk of linking mind to molecular dynamics in and between neurons. Such talk is dismissed by cognitive scientists, including many cognitive neuroscientists, due to the number of “levels” that separate behaviors from these molecular events. In this paper I explain what neurobiologists mean by such claims by describing the kinds of experiment tools that have forged these linkages, directly on lab benches. I here focus on one of these tools, gene targeting techniques, brought into behavioral neuroscience from developmental biology more than a quarter-century ago. Discussion of this tool does more than illuminate these claims by neurobiologists, however. An account of its development shows the doubly dependent role that theory plays in neurobiology. Our best current theories about “how the brain works” depend entirely on the experiment tools neuroscientists have available. And these tools get developed via the solution of engineering problems, not the application of theory. Theory is thus of tertiary importance in neuroscience, not of the primary importance that many cognitive scientists assume it to occupy.

The Underdetermination of Theories and Scientific Realism

Abstract

The empirical underdetermination of theories is a philosophical problem which until the last century had not seriously troubled actual science. The reason is that confirmation does not depend only on empirical consequences, and theoretical virtues allow us to choose among empirically equivalent theories. Moreover, I argue that the theories selected in this way are not just pragmatically or aesthetically better, but more probably (and/or largely) true. At present, in quantum mechanics not even theoretical virtues allow us to choose among the many competing theories and interpretations, but this is because none of them possesses those virtues to a sufficient degree. However, first, we can hope for some future advancement (new empirical tests, or new theories). Second, even if no further progress came forth, all the most credited competitors agree on a substantial core of theoretical assumptions. Therefore underdetermination does not show that we cannot be realists about unobservable entities in general, but at most that in particular fields our inquiry may encounter some de facto limits.

Cognitive Neuroscience and the Hard Problems

Abstract

This paper argues that the fundamental problem of cognitive neuroscience arises from the gap between the neuronal description of the brain and the phenomenal description of the conscious mind. In general, philosophers agree that no functional approach can explain phenomenal consciousness; some even think that science is forever unable to explain the qualitative character of our experiences. In order to overcome these challenges, I propose a distinction between intrinsic and extrinsic properties of the brain, according to which brain states are characterized by intrinsic properties, whereas the brain under the causal influence of an organism’s environment acquires extrinsic properties. These extrinsic properties may account for both phenomenal experiences and our thoughts about these experiences. Finally, I discuss this proposal’s viability in relation to higher-order theories.

Is Einstein’s Interpretation of Quantum Mechanics Ψ-Epistemic?

Abstract

Harrigan and Spekkens (Found Phys 40:125–157, 2010) introduced the influential notion of an ontological model of operational quantum theory. Ontological models can be either “epistemic” or “ontic.” According to the two scholars, Einstein would have been one of the first to propose an epistemic interpretation of quantum mechanics. Pusey et al. (Nat Phys 8:475–478, 2012) showed that an epistemic interpretation of quantum theory is impossible, thereby implying that Einstein had been refuted. We discuss in detail Einstein’s arguments against the standard interpretation of QM, showing that there is a misunderstanding in Harrigan and Spekkens’ attribution of an epistemic perspective to Einstein, whose point of view was actually statistical, but in a quasi-classical sense.

Subject of Cognition from a Cultural Neuroscience Perspective

Abstract

This paper assesses, from a philosophical point of view, the latest cultural neuroscience results, which suggest that the traditional interpretation of the subject of cognition should be essentially reconstructed. We must move from a universalistic interpretation of the cognitive process (mostly manifested in classical Kantian transcendentalism) to an interpretation that takes explicit account of the socio-cultural context of the subject’s activity, as well as the subject’s biological nature. The principle of the cultural and cognitive-neurobiological determination of knowledge acquisition is proposed. We claim that the subject of cognition is embedded in a historical and cultural context and is neurobiologically determined, and thus that classical Kantian transcendentalism should be reconsidered in light of recent neuroscience research.

In Defence of Metametasemantics

Abstract

In the paper I defend the idea of metametasemantics against the arguments recently presented by Ori Simchen (2017). Simchen attacks the view according to which a metametasemantics incorporating all possible metasemantic accounts is necessary to protect metasemantic theories from the notorious problem of the inscrutability of reference (see Sider 2011). Simchen claims that if metametasemantics is allowed, it ‘absorbs’ metasemantic theories to the extent that it diminishes their explanatory value. Furthermore, in this way Simchen sets up the two main metasemantic paradigms, i.e. productivism (roughly speaking, speaker’s metasemantics) and interpretationism (audience’s metasemantics), as rival theories that inevitably exclude each other. I endeavour to undermine Simchen’s point by demonstrating that his argumentation mixes up the deflationary reading of the predicate ‘is true’ with its substantial reading. Consequently, I demonstrate that accepting metametasemantics does not diminish the explanatory value of the various metasemantic theories, and thus that there is no good reason to forbid metametasemantics. I also argue that even if we ignore the above-mentioned confusion in Simchen’s reasoning, his arguments still fail, given various problems with the notion of the diminishment of explanatory value and because the analogy on which his arguments are based is fairly weak. Finally, I conclude that metametasemantics does not pose any danger to metasemantics and that it provides a solid ground for developing a theory that benefits from both productivism and interpretationism.

Wigner’s Puzzle on Applicability of Mathematics: On What Table to Assemble It?

Abstract

Attempts at solving what has been labeled Eugene Wigner’s puzzle of the applicability of mathematics are still far from arriving at an acceptable solution. The accounts developed to explain the “miracle” of applied mathematics vary in nature, foundation, and solution, from denying the existence of a genuine problem to designing structural theories based on mathematical formalism. Despite this variation, all investigations have treated the problem in a unitary way with respect to its target, pointing to one or two ‘why’ or ‘how’ questions to be answered. In this paper, I argue that two analyses, a semantic analysis ab initio and a metatheoretical analysis starting from the types of unreasonableness involved in this problem, will establish the interdisciplinary character of the problem and reveal many more targets, which may be addressed with different methodologies. In order to address the philosophical problem of the applicability of mathematics objectively, a foundational revision of the problem is needed.

Two Models of the Subject–Properties Structure

Abstract

In the paper I discuss the problem of the nature of the relationship between objects and their properties. There are three contexts of the problem: comparison, change, and interaction. Philosophical explanations of the facts indicated in these three contexts require reference to properties and to a proper understanding of the relationship between them and their bearers. My aim is to get closer to this understanding with the use of some models, but first I present the substantialist theory of the object and briefly argue for its main theses. The two models enabling us to understand the subject–properties structure are the plastic stuff model and the functional model. On the first model, a subject is compared to a piece of plastic stuff which is informed by different shapes. Properties are ways a subject is; they “give” some “figure” to a subject. The core idea of the second model is that essences (performing the role of subjects) are immanent functional laws governing correlations of properties. As such they are similar to mathematical functions which are saturated by values. The relationship between a subject and its properties can be grasped by analogy to such saturation.
