Models live in a state of exception. Their versatility, the variety of their methods, the impossibility of their falsification and their epistemic authority allow mathematical models to escape, better than other cases of quantification, the lenses of sociology and other humanistic disciplines. This endows models with a pretence of neutrality that perpetuates the asymmetry between developers and users. Models are thus both underexplored and overinterpreted. While retaining a firm grip on policy, they reinforce the entrenched culture of transforming political issues into technical ones, possibly decreasing citizens’ agency and thus favouring anti-democratic policies. To combat this state of exception, one should question the reproducibility of models, foster complexity of interpretation rather than complexity of construction, and encourage forms of activism aimed at achieving a reciprocal domestication between models and society. To breach the solitude of modellers, more actors should engage in practices such as assumption hunting, modelling of the modelling process, and sensitivity analysis and auditing.
This editorial lays out the core themes of the special feature and provides an overview of the contributions. It introduces the main argument, namely that the promises of far-reaching change made by recent bioeconomy policies are in fact strategically directed at avoiding transformative change to existing societal arrangements. Bioeconomy discourse showcases technological solutions purported to solve sustainability 'problems' while sustaining economic growth, but avoids issues of scalability, integration or negative consequences. Thus, bioeconomy policies, and particularly the latest versions of the predominantly European 'bio-resource' variety that have rhetorically integrated a lot of previous sustainability-minded criticism, serve to ward off or delay challenges to an unsustainable status quo, in effect prolonging the escalatory imperatives of capitalist modernity that are at the root of current crises. The editorial's second part highlights the contributions that the 13 featured articles, based on theoretical considerations as well as policy analyses and empirical case studies from a range of countries, make to this argument.
Pardo-Guerra takes us on a leisurely stroll through the predicaments of British academia, subject as it is to periodic evaluations known as the REF (Research Excellence Framework). His book The Quantified Scholar: How Research Evaluation Transformed the British Social Sciences (Columbia University Press, 2022) asks important questions about how a culture of quantified evaluation has affected the operation of academia and the life of its members in the UK.
The last half-century has seen spectacular progress in computing and modelling across a variety of fields, applications, and methodologies. Over the same period, a cross-disciplinary field known as sensitivity analysis has been taking its first steps, evolving from the design of experiments for laboratory or field studies ('in vivo') to so-called 'in silico' experiments. Some disciplines were quick to realize the importance of sensitivity analysis, whereas others are still lagging behind.

Major tensions within the evolution of this discipline arise from the interplay between local and global perspectives in the analysis, as well as from the pull between mathematical complexification and the desire for practical applicability. In this work, we retrace these main steps, with some attention to the methods, and conduct a bibliometric survey to assess the accomplishments of sensitivity analysis and identify the potential for its future advancement, with a focus on relevant disciplines such as the environmental field.
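To make the local-versus-global tension concrete, here is a minimal Python sketch that is not taken from the paper: a toy two-input function, chosen purely for illustration, where a one-at-a-time derivative at the nominal point suggests the second input is irrelevant, while a crude global, variance-based measure shows it dominates. The function, sample size and binning estimator are all assumptions made for this example.

```python
import numpy as np

# Toy model (an assumption for this example, not from the paper): the effect of
# x2 is invisible at the nominal point but dominant over the full input range.
def model(x1, x2):
    return x1 + 5.0 * x2 ** 3

rng = np.random.default_rng(0)

# Local, one-at-a-time view: finite-difference derivatives at a nominal point.
x0, h = np.array([0.0, 0.0]), 1e-6
local = [
    (model(x0[0] + h, x0[1]) - model(x0[0], x0[1])) / h,  # dy/dx1 at x0
    (model(x0[0], x0[1] + h) - model(x0[0], x0[1])) / h,  # dy/dx2 at x0
]

# Global view: share of output variance explained by each input when both are
# varied over their whole range (crude first-order estimate via conditional means).
n = 100_000
x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
y = model(x1, x2)

def first_order(x, y, bins=50):
    """Crude estimate of S_i = Var(E[y | x_i]) / Var(y) by binning on x."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

print("local derivatives at the nominal point:", local)        # x2 looks inert
print("global first-order indices:", first_order(x1, y), first_order(x2, y))
```

The point is only that local and global perspectives can rank the same inputs in opposite ways, which is the tension the abstract refers to.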
While sensitivity analysis improves the transparency and reliability of mathematical models, its uptake by modelers is still scarce. This is partially explained by its technical requirements, which may be hard for the nonspecialist to understand and implement. Here we propose a sensitivity analysis approach based on the concept of discrepancy that is as easy to understand as the visual inspection of input-output scatterplots. First, we show that some discrepancy measures are able to rank the most influential parameters of a model almost as accurately as the variance-based total sensitivity index. We then introduce an ersatz-discrepancy whose performance as a sensitivity measure is similar to that of the best-performing discrepancy algorithms, while being simpler to implement, easier to interpret, and orders of magnitude faster.
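The following Python sketch illustrates the general idea of scatterplot-based, discrepancy-style sensitivity measures; it is not the paper's ersatz-discrepancy. It uses SciPy's centered-L2 discrepancy on rank-transformed (input, output) clouds as a stand-in, and the toy model and sample size are assumptions made for illustration only.

```python
import numpy as np
from scipy.stats import rankdata
from scipy.stats.qmc import discrepancy

rng = np.random.default_rng(1)

# Toy model (an assumption for illustration): x0 and x1 matter, x2 is inert.
def model(x):
    return np.sin(np.pi * x[:, 0]) + 4.0 * (x[:, 1] - 0.5) ** 2 + 0.0 * x[:, 2]

n, k = 2048, 3
x = rng.uniform(size=(n, k))
y = model(x)

def discrepancy_importance(xi, y):
    """Map the (xi, y) scatterplot to the unit square via ranks and measure its
    departure from uniformity: an influential input produces a structured cloud,
    hence a larger centered-L2 discrepancy."""
    u = np.column_stack([(rankdata(xi) - 0.5) / len(xi),
                         (rankdata(y) - 0.5) / len(y)])
    return discrepancy(u, method='CD')

scores = [discrepancy_importance(x[:, j], y) for j in range(k)]
print("discrepancy-based importance:", np.round(scores, 4))
print("ranking (most to least influential):", np.argsort(scores)[::-1])
```

The inert input yields a nearly uniform rank cloud and hence a low discrepancy, while the influential inputs stand out, which is the intuition behind using discrepancy as a sensitivity measure.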
This article explores how the modeling of energy systems may lead to an undue closure of alternatives by generating an excess of certainty around some of the possible policy options. We retrospectively exemplify the problem with the case of the International Institute for Applied Systems Analysis (IIASA) global modeling in the 1980s. We discuss different methodologies for quality assessment that may help mitigate this issue, which include Numeral Unit Spread Assessment Pedigree (NUSAP), diagnostic diagrams, and sensitivity auditing (SAUD). We illustrate the potential of these reflexive modeling practices in energy policy-making with three additional cases: (i) the case of the energy system modeling environment (ESME) for the creation of UK energy policy; (ii) the negative emission technologies (NETs) uptake in integrated assessment models (IAMs); and (iii) the ecological footprint indicator. We encourage modelers to adopt these approaches to achieve more robust, defensible, and inclusive modeling activities in the field of energy research.
Present-day reasoning about difficulties in science reproducibility, science governance, and the use of science for policy could benefit from a philosophical and historical perspective. This would show that the present crisis was anticipated by some scholars of these disciplines, and that diagnoses were offered which are not yet mainstream among crisis-conscious disciplines, from statistics to medicine, from bibliometrics to biology. Diagnoses in turn open the path to possible solutions. This discussion is urgent given the joint impact of the crises on public trust in institutions. We ask whether the present crisis may be seminal in terms of drawing attention to alternative visions and governance arrangements for the relationship between science and society. We finish by offering a number of suggestions in this direction.
The present crisis of science's governance, affecting science's reproducibility, scientific peer review and science's integrity, offers a chance to reconsider evidence-based policy as it is practiced at present. Current evidence-based policy exercises entail forms of quantification – often in the form of risk analysis or cost-benefit analysis – which aim to optimize one among a set of policy options corresponding to what is generally a single framing of the issue under consideration. More cogently, the deepening of the analysis corresponding to a single view of what the problem is has the effect of distracting from possible alternative readings. When using evidence-based policy, those alternative frames become a kind of 'uncomfortable knowledge' which is de facto removed from the policy discourse, all the more so when the analysis is supported by extensive mathematical modelling. Thus evidence-based policy may result in a dramatic simplification of the available perceptions, in flawed policy prescriptions and in the neglect of other relevant world views of legitimate stakeholders. This use of scientific method ultimately generates – rather than resolves – controversies and erodes the institutional trust of the involved actors. We suggest an alternative approach – which we term quantitative story-telling – which encourages a major effort in the pre-analytic, pre-quantitative phase of the analysis so as to map a socially robust universe of possible frames, which represent different lenses through which to perceive what the problem is. This is followed by an analysis where the emphasis is not on confirmatory checks or system optimization but – the opposite – on an attempt to refute the frames if they violate constraints of feasibility (compatibility with processes outside human control), viability (compatibility with processes under human control), and desirability (compatibility with a plurality of normative considerations relevant to the system's actors).
This paper suggests adopting a 'post-normal science' (PNS) style and practice in scientific advice, and motivates the urgency of this methodological stance with the increasing complexity and polarisation affecting the use of science-based evidence for policy. We reflect on challenges and opportunities faced by a 'boundary organisation' that interfaces between science and policy, taking as an example the European Commission's Directorate General Joint Research Centre (JRC), whose stated mission is to be the "in-house science service". We suggest that such an institution can be exemplary as to what could be changed to improve the quality of the evidence feeding into policy processes in the European Union. This paper suggests how an in-house culture of reflexivity and humility could trigger changes in the existing styles and methods of scientific governance; at the JRC, taken as an example, this would mean opening up to the existing plurality of norms and styles of scientific inquiry, and adopting more participatory approaches to knowledge production, assessment and governance. We submit that the institutional changes advocated here are desirable and urgent in order to confront the ongoing erosion of trust in 'evidence-based policy', anticipating controversies before they become evident in the settings in which institutions operate.
Indices and Indicators of Justice, Governance, and the Rule of Law. Hague Journal on the Rule of Law, 3: 153–169, 2011. © 2011 T.M.C. ASSER PRESS and Contributors. doi:10.1017/S1876404511200010
This book collects contributions from the conference "Can creativity be measured?", which took place in Brussels on 28 and 29 May 2009 and was organized by DG Education and Culture (DG EAC) together with the Centre for Research on Lifelong Learning (CRELL) of the DG Joint Research Centre (DG JRC). The book provides an overview of two main approaches to the measurement of creativity. First, it considers aggregate-level approaches, where different existing statistical indicators can be used as pointers to creativity in a region or a nation. Second, it explores some aspects of the measurement of creativity at the individual level. The conference constituted the first step in the challenge of measuring creativity in an international, comparative way. If we want to foster creativity we need to measure it: without adequate tools to monitor whether the policies in place are actually raising the capacity to be creative, there is no way to know whether those policies are effective. The main co...
This paper deals with computations of sensitivity indices in sensitivity analysis. Given a mathematical or computational model y = f(x1, x2, …, xk), where the input factors xi are uncorrelated with one another, one can see y as the realization of a stochastic process obtained by sampling each of the xi from its marginal distribution. The sensitivity indices are related to the decomposition of the variance of y into terms due to each xi taken singularly (first-order indices), as well as into terms due to the cooperative effects of more than one xi. In this paper we assume that one has computed the full set of first-order sensitivity indices as well as the full set of total-order sensitivity indices (a fairly common strategy in sensitivity analysis), and show that in this case the same set of model evaluations can be used to compute double estimates of: the total effect of two factors taken together, for all k(k-1)/2 such couples, where k is the dimensionality of the model; and the total effect of k-2 factors taken together, for all k(k-1)/2 such (k-2)-tuples. We further introduce a new strategy for the computation of the full sets of first-order plus total-order sensitivity indices that is about 50% cheaper in terms of model evaluations with respect to previously published works. We discuss separately the case where the input factors xi are not independent of each other.
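As a companion to this abstract, here is a minimal Python sketch of the standard sampling scheme on which such computations rest: two independent matrices A and B plus the hybrid matrices obtained by swapping one column at a time, with commonly used estimators for the first-order and total-order indices. It does not reproduce the paper's extended double-estimate scheme or its cheaper design; the Ishigami-type test function and the sample size are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Ishigami-type test function, used only as a stand-in model (an assumption).
def model(x):
    return (np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2
            + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0]))

k, n = 3, 2 ** 14
a = rng.uniform(-np.pi, np.pi, size=(n, k))   # base sample A
b = rng.uniform(-np.pi, np.pi, size=(n, k))   # independent sample B

ya, yb = model(a), model(b)
var_y = np.var(np.concatenate([ya, yb]))

s1, st = np.empty(k), np.empty(k)
for i in range(k):
    abi = a.copy()
    abi[:, i] = b[:, i]                       # A with column i taken from B
    yabi = model(abi)
    # First-order index (Saltelli-style estimator):
    s1[i] = np.mean(yb * (yabi - ya)) / var_y
    # Total-order index (Jansen-style estimator):
    st[i] = 0.5 * np.mean((ya - yabi) ** 2) / var_y

print("first-order:", np.round(s1, 3))   # analytic values roughly [0.31, 0.44, 0.00]
print("total-order:", np.round(st, 3))   # analytic values roughly [0.56, 0.44, 0.24]
```

The design costs n(k+2) model evaluations for the full sets of first-order and total-order indices, which is the kind of budget the abstract's cheaper strategy improves upon.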
In this paper we first present an example of how Composite Indicators 'naturally' emerge in a context where country performance is being benchmarked. We then discuss some salient aspects of the Composite Indicators controversy, pitting "Aggregators" and "Non-Aggregators" against one another, and showing the pros and cons of the use of composite indicators. We next offer some examples of JRC experience.
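A tiny, made-up numerical example may help convey what is at stake between "Aggregators" and "Non-Aggregators": with the same hypothetical indicator scores and equal weights, a linear average and a geometric mean can rank countries differently, because the latter limits compensation between strong and weak dimensions. All numbers below are invented for illustration.

```python
import numpy as np

# Made-up normalized scores for three hypothetical countries on three indicators.
indicators = np.array([
    [0.90, 0.85, 0.10],   # country A: strong on two indicators, very weak on one
    [0.60, 0.60, 0.60],   # country B: uniformly average
    [0.70, 0.55, 0.50],   # country C
])
weights = np.array([1/3, 1/3, 1/3])

linear = indicators @ weights                       # full compensability
geometric = np.prod(indicators ** weights, axis=1)  # weak performance penalised

for name, l, g in zip("ABC", linear, geometric):
    print(f"country {name}: linear={l:.3f}  geometric={g:.3f}")
# The linear average ranks A first; the geometric mean demotes A because its
# weak indicator cannot be fully compensated by the strong ones.
```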
Introduction: The GESAMAC project. Nuclear fission, as an energy source, has been employed in the United States and Europe for more than 40 years (e.g., Balogh 1991), and yet the problem of safe disposal of radioactive waste arising as a by-product of power generation with this approach is still under study. Deep geological disposal, in which radioactive materials (e.g., spent fuel rods) are encapsulated and placed in a facility far below ground, is still the most actively investigated option, although in the 1970s disposal of nuclear waste in deep sea sediments was considered (e.g., Bishop and Hollister 1974) and some objections to underground storage persist today (e.g., Keeney and von Winterfeldt 1994, Shrader-Frechette 1994). It is fair to say that even after decades of research, the physico-chemical behavior of deep geological disposal systems over geological time scales (hundreds or thousands of years) is far from known with certainty (Pereira 1989, Draper et
In this paper, we present the results obtained in the framework of a European research project on the Assessment and Reliability of Transport Emission Models and Inventory Systems. As recommended by the European Commission in its Emissions Ceiling Directive, and also in the guidelines of the Intergovernmental Panel on Climate Change on emission inventories, atmospheric emission estimates from all sectors (transport, industry, agriculture, etc.) must be accompanied by uncertainty estimates. This has important implications for policy-making. Very little has been done so far, mainly because the characterization of the full chain of uncertainties (from errors in primary data down to model selection and use) is the most difficult step of the analysis. We use a methodological approach for the characterization of the uncertainty in emission estimates which is based on the Monte Carlo method. The sensitivity analysis of the model-based emission estimates is conducted using the so-called E...
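As an illustration of the kind of Monte Carlo uncertainty propagation described above (not the project's actual model or data), the Python sketch below pushes assumed distributions for traffic activity and an emission factor through a toy emission calculation, summarizes the resulting uncertainty, and adds a crude rank-correlation check of which input dominates. All distributions and parameter values are assumptions for illustration.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n = 100_000

# Toy emission model: emissions = traffic activity (vehicle-km) x emission factor
# (g per vehicle-km). Distributions and parameters are illustrative assumptions.
activity = rng.normal(loc=1.0e9, scale=0.05e9, size=n)                 # vehicle-km / year
emission_factor = rng.lognormal(mean=np.log(0.25), sigma=0.3, size=n)  # g / vehicle-km

emissions_t = activity * emission_factor / 1e6                         # tonnes / year

p5, p50, p95 = np.percentile(emissions_t, [5, 50, 95])
print(f"median estimate: {p50:,.0f} t/yr, 90% interval: [{p5:,.0f}, {p95:,.0f}] t/yr")

# Crude sensitivity check: rank correlation of each input with the output shows
# which source of uncertainty dominates the emission estimate.
for name, x in [("activity", activity), ("emission factor", emission_factor)]:
    print(name, "Spearman rho:", round(spearmanr(x, emissions_t)[0], 2))
```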
This paper deals with computations of sensitivity indices in global sensitivity analysis. Given a model y = f(x1, …, xk), where the k input factors xi are uncorrelated with one another, one can see y as the realisation of a stochastic process obtained by sampling each of the xi's ...
We illustrate a method of global sensitivity analysis and test it on a preliminary case study in the field of environmental assessment to quantify uncertainty importance in poorly known model parameters and spatially referenced input data. The focus of the paper is to show how the methodology provides guidance to improve the quality of environmental assessment practices and decision support.


On July 9, 2024, friend and philosopher Jerome R. Ravetz (https://en.wikipedia.org/wiki/Jerome_Ravetz) turned 95. A few friends gathered virtually in Oxford to honor him. Here are a few slides I presented.
Do we live immersed in fantastic numbers?