This work introduces a self-consistent model for software complexity, based on information that can typically be collected at early stages of the software lifecycle, e.g. in the functional specification phase, when a functional size measurement is also usually performed. The proposed model considers software complexity as structured into three bottom-up stages of the software architecture (from the internal complexity of each ...
This work explains the Worked Function Point model for effective software project size evaluation. This model was originally proposed in a contractual framework of fixed cost per function point, in order to achieve a more significant "worked size" to be correlated with development effort and cost. The model is also suitable for internal development estimation, wherever an average productivity is ...
Functional Size Measurement (FSM) methods have been used in industry since the late '70s and have been successfully adopted in software development projects for sizing the amount of functional requirements, often for software estimation purposes. Typically, measurements with FSM are performed at two moments of the project, early design and delivery, respectively supporting project estimation and validation. This approach 'works' for traditional waterfall cycles, typically requires adjustments when real-world requirements change over the project, but is simply ineffective in modern agile development methodologies, where Evolutionary Requirements Analysis is the rule and the user (providing requirements and feedback) is constantly involved in the development process. This work depicts the adoption of the new-generation COSMIC FSM method to address the agile development process with a standard measurement approach. Such an approach results in fine-tuned measures over the whole project cycle for agile projects, with a single metric that holds valid from early design to the end phases, while adequately covering all the intermediate iterations and refinements. Project management practices, such as Earned Value assessment, as well as general process improvement practices, such as the CMMI-DEV framework, are easily mapped onto COSMIC measurement and benefit from the proposed approach.
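As background for the abstract above: in the COSMIC method, each functional process is sized by counting its data movements (Entry, Exit, Read, Write) at one CFP each, which is what makes re-measurement across agile iterations a matter of simply re-counting movements. A minimal sketch of this counting rule, where the function name and the example user stories are illustrative assumptions, not taken from the paper:

```python
# Minimal COSMIC-style sizing sketch: 1 CFP per data movement.
# Movement types: E = Entry, X = Exit, R = Read, W = Write.
MOVEMENT_TYPES = {"E", "X", "R", "W"}

def cosmic_size(functional_processes):
    """Sum data movements over all functional processes (1 CFP each)."""
    total = 0
    for name, movements in functional_processes.items():
        assert all(m in MOVEMENT_TYPES for m in movements), name
        total += len(movements)
    return total

# Hypothetical sprint: two user stories refined into functional processes.
sprint_1 = {
    "create order": ["E", "W", "X"],   # receive input, store, confirm
    "list orders":  ["E", "R", "X"],   # receive query, read, display
}
print(cosmic_size(sprint_1))  # 6 CFP
```

When a later sprint refines a story, the dictionary is updated and the size is re-derived with the same rule, so the metric stays comparable from early design to delivery.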
... Enterprise Data Warehouse (EDW): an EDW contains detailed (and possibly summarized) data captured from ... Effort estimation for data warehouse systems: data warehouse systems productivity factors; the main peculiar productivity aspects of data warehouse ...
Due to the wider use of Function Point metrics in the software management field, there is an increasing need for methods to estimate, at an early stage of the software life cycle, a software application's value in Function Points. This work describes how to use the Early Function Points method (presented at IFPUG 97, Scottsdale, September 97) from a practical point of view, and it presents a set of very encouraging project results. Early Function Points are not a measurement alternative to IFPUG 4.0 Function Points, but only a fast and early estimate thereof. The method is based on identifying software objects, such as logical data and the functionalities provided by the software under evaluation, at different levels of detail. The key factors are: macrofunctions, functions, microfunctions, functional primitives, and logical data items. Each of these objects may be assigned a set of FP values based on a statistical table. The approach presented here has proved quite effective in providing a r...
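The abstract's idea of assigning FP value ranges to identified software objects from a statistical table can be sketched as follows. The object categories mirror those named above, but every numeric value in the table is an invented placeholder, not the method's actual calibration:

```python
# Early-FP-style estimation sketch: each identified object contributes a
# (min, likely, max) FP range taken from a statistical table.
# NOTE: all ranges below are illustrative placeholders only.
FP_TABLE = {
    "macrofunction":        (150, 250, 400),
    "function":             (30, 50, 80),
    "microfunction":        (8, 12, 20),
    "functional_primitive": (3, 4, 6),
    "logical_data_item":    (5, 7, 10),
}

def early_fp(inventory):
    """Sum per-object ranges into a (min, likely, max) project estimate."""
    lo = mid = hi = 0
    for obj_type, count in inventory.items():
        a, b, c = FP_TABLE[obj_type]
        lo, mid, hi = lo + a * count, mid + b * count, hi + c * count
    return lo, mid, hi

print(early_fp({"function": 4, "logical_data_item": 10}))  # (170, 270, 420)
```

The point of the range-based table is that an estimate made from coarse objects (macrofunctions) carries a wider min-max spread than one made from fine-grained primitives, which matches the method's different levels of detail.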
Functional Size Measurement (FSM) has been a relevant approach to software management and estimation for decades. Despite efforts to refine the measurement definitions and practices of FSM methods, real-world practitioners' needs have led to a variety of proposals for approximation techniques aimed at providing (Function Point) figures in early phases of software projects and lifecycles, based on 'fuzzy' requirements (e.g. 'Size Classes', Estimated and Indicative, Quick & Early, Light and Early & Quick, Fast or Simple Function Points). This paper summarizes the most common approximate sizing approaches, provides an up-to-date comparison of their generic features, confidence levels, and applicability, shows how most approaches are basically variations or instances of a single generic scheme (which can be derived from the so-called Smart Function Points), and introduces the latest evolution of such approximation techniques, for both the IFPUG and COSMIC methods, named 'EASY (Early & Speedy) Function Points'.
Historical datasets, or Historical Data Repositories (HDR), are important in software engineering because they support a variety of useful analyses for benchmarking as well as estimation purposes. Besides private repositories, some public-domain data repositories, such as the PROMISE and the ISBSG datasets, have been available for over 15 years. Regardless of their importance, such repositories have evolved more in the amount of data points than in quality or conformity to commonly agreed practices for data collection and verification. This paper provides some considerations about current issues and possible improvements of (public) HDRs.
Uncertainty is a measurable notion in statistics and measurement theory, and uncertainty typically propagates from one stage to the next in a measurement or estimation process. Software requirements constitute an event space following standard distributions; thus statistical concepts such as residuals, standard deviation, and uncertainty, including uncertainty propagation, are applicable from statistics and measurement theory. This work presents preliminary results that allow evaluating the benefits of pursuing such a research direction for Six Sigma practice. We present the results of a field study in which we compare the life cycles of new requirements for Personalised Customer Communications Software with the calculated uncertainties of those requirements. A general approach to uncertainty propagation is applied in this case to quantify to what extent the uncertainty in input variables (requirements definition, for instance) can affect the outcomes of software analysis and development processes (project size, for instance), and ultimately the true capability of our software estimation. Uncertainty propagation is also known as "error propagation" in science (not to be confused with defects in software engineering).
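The propagation rule alluded to above can be made explicit. For an estimate $f(x_1, \dots, x_n)$ with independent inputs, first-order uncertainty propagation (the standard "error propagation" formula) gives:

```latex
\sigma_f^2 = \sum_{i=1}^{n} \left( \frac{\partial f}{\partial x_i} \right)^{2} \sigma_{x_i}^{2}
```

For a power-law effort model $E = a\,S^{b}$, for instance, this reduces to the relative rule $\sigma_E / E = b\,\sigma_S / S$: with $b \approx 1$, a 10% uncertainty in size carries through to roughly a 10% uncertainty in the effort estimate.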
The spread of agile methodologies in software development raises the question of how to measure requirements once more, as happened in 'traditional' software industry development approaches decades ago. The difference is that requirements are not known in advance but detected as User Stories while iterating and enhancing the software product from one agile 'Sprint' to the next. Some authors promoting best practices for agile software development propose Story Points to size User Stories (e.g. Scrum, with Story Cards), yet not combined with base project estimation. Story Points are not standardized, thus leading to possible misconceptions and quantitative differences among practitioners and domains. The uncertainty implied in such an approach can therefore propagate to any estimate based on it, not to mention the difficulty in accurately tracing requirements and their variation over the project and across project iterations. This work investigates the benefits of adopting a standardized Functional Size Measurement (FSM) method, such as COSMIC Function Points, in place of Story Points. Using a Transfer Function (from Six Sigma practice) that transforms size into effort spent within a particular agile team, defect density prediction can be made using sensitivity analysis.
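A transfer function of the kind mentioned above is often modeled as a power law calibrated per team; the sketch below shows how a size uncertainty then propagates to the effort estimate via simple first-order sensitivity analysis. The coefficients and figures are assumptions for illustration, not calibration data from the paper:

```python
# Hypothetical, team-calibrated transfer function: effort = A * size^B.
# A and B are placeholder calibration constants, not from the paper.
A, B = 1.8, 1.05                  # scale (person-hours per CFP) and exponent

def effort(size_cfp):
    """Estimated effort (person-hours) for a given COSMIC size."""
    return A * size_cfp ** B

def effort_uncertainty(size_cfp, size_sigma):
    """First-order sensitivity: sigma_E = |dE/dS| * sigma_S."""
    return A * B * size_cfp ** (B - 1) * size_sigma

size, sigma = 120.0, 15.0         # measured size and its uncertainty (CFP)
print(effort(size), effort_uncertainty(size, sigma))
```

Note the relative form: since $\sigma_E / E = B \cdot \sigma_S / S$, the exponent B directly scales how strongly the sizing uncertainty hits the effort (and any downstream defect-density) prediction.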
... The multi-level approach lets us exploit all the knowledge we have on a particular branch of the system, without ... "detailed", "intermediate", or "summary", depending on the detail level chosen (or forced) for the early classification of functionalities ... Figure 2. Generic hierarchy scheme. ...
Data Warehouse Systems are a special context for the application of functional software metrics. The use of a single standard, such as Function Points, raises serious comparability issues with traditional systems or other paradigms, in terms of both numerical size and implementation effort estimation. Specific guidelines are therefore necessary in order to identify the user view, the software boundaries, and the data and transactional components of such systems. In particular, boundary identification may strongly affect the measurement result for a data warehouse project; consequently, one can find huge, unacceptable deviations in the estimation of effort, time, and cost for the given project. This paper shows the substantial differences between "traditional" software and data warehouse systems, the main guidelines that one can use when measuring the latter, and specific considerations for differentiating the effort estimation by measured element types. The depicted case studies highlig...
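The boundary effect described above can be illustrated with a toy calculation: whether the staging, warehouse, and data-mart layers are measured as one application or as three decides whether the data flows between them count as boundary-crossing interfaces. All component names and FP values below are invented placeholders, not figures from the case studies:

```python
# Illustrative only: how boundary choices change a functional size total.
# Hypothetical pipeline components with placeholder "internal" FP sizes.
components = {"staging": 80, "warehouse": 140, "data_marts": 60}
# Data flows between components; a flow crossing a boundary adds interface FP.
flows = [("staging", "warehouse", 12), ("warehouse", "data_marts", 10)]

def total_size(boundaries):
    """Sum component sizes plus interface FP for boundary-crossing flows.

    `boundaries` maps each component to the application it belongs to."""
    size = sum(components.values())
    for src, dst, fp in flows:
        if boundaries[src] != boundaries[dst]:
            size += fp   # flow crosses an application boundary: count it
    return size

one_app   = {c: "EDW" for c in components}   # single boundary
three_app = {c: c for c in components}       # one boundary per layer
print(total_size(one_app))    # 280: no crossings counted
print(total_size(three_app))  # 302: both flows add interface FP
```

Even in this toy case the two boundary choices disagree by about 8%, and any effort, time, or cost estimate derived from the size inherits that deviation.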
Papers by Luca Santillo