Search results

1 – 2 of 2
Open Access
Article
Publication date: 13 March 2024

Keanu Telles

Abstract

Purpose

The paper provides a detailed historical account of Douglass C. North's early intellectual contributions and analytical developments in pursuing a Grand Theory for why some countries are rich and others poor.

Design/methodology/approach

The author approaches the discussion using a theoretical and historical reconstruction based on published and unpublished materials.

Findings

The systematic, continuous and profound attempt to answer the Smithian social coordination problem shaped North's journey from young, committed Marxist to one of the founders of New Institutional Economics. Along the way, he converted in the early 1950s to a strict neoclassical position and became a leader in promoting New Economic History. The very success of the cliometric revolution exposed the frailties of the movement itself, namely the limitations of neoclassical economic theory in explaining economic growth and social change. Once transaction costs are incorporated, the institutional framework within which property rights and contracts are defined, measured and enforced assumes a prominent role in explaining economic performance.

Originality/value

In the early 1970s, North adopted a naive theory of institutions and property rights still grounded in neoclassical assumptions: institutional and organizational arrangements were modeled as socially efficient maximizing equilibrium outcomes. However, the growing tension between the neoclassical theoretical apparatus and its failure to account for contrasting political and institutional structures, diverging economic paths and social change propelled a modification of its assumptions and progressive conceptual innovation. In the late 1970s and early 1980s, North abandoned the efficiency view and gradually became more critical of the objective-rationality postulate. Through this intellectual movement, North's avant-garde research program contributed significantly to the creation of New Institutional Economics.

Details

EconomiA, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1517-7580

Open Access
Article
Publication date: 15 December 2020

Soha Rawas and Ali El-Zaart

Abstract

Purpose

Image segmentation is one of the most essential tasks in image processing. It is a valuable tool in many application domains such as health-care systems, pattern recognition, traffic control and surveillance systems. However, accurate segmentation is a critical task, since finding a single model that fits different types of image processing applications is a persistent problem. This paper develops a novel segmentation model intended to serve as a unified model for any kind of image processing application. The proposed precise and parallel segmentation model (PPSM) combines three benchmark distribution thresholding techniques (Gaussian, lognormal and gamma) to estimate an optimum threshold value that leads to optimum extraction of the segmented region. Moreover, a parallel boosting algorithm is proposed to improve the performance of the segmentation algorithm and minimize its computational cost. To evaluate the effectiveness of the proposed PPSM, different benchmark image segmentation data sets are used: Planet Hunters 2 (PH2), the International Skin Imaging Collaboration (ISIC), Microsoft Research in Cambridge (MSRC), the Berkeley Segmentation Data Set (BSDS) and Common Objects in Context (COCO). The obtained results indicate the efficacy of the proposed model in achieving high accuracy with a significant reduction in processing time compared to other segmentation models, across different types and fields of benchmarking data sets.

Design/methodology/approach

The proposed PPSM combines three benchmark distribution thresholding techniques (Gaussian, lognormal and gamma) to estimate an optimum threshold value that leads to optimum extraction of the segmented region.
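The abstract does not include code, but the minimum cross-entropy thresholding (MCET) criterion the model builds on can be sketched in a few lines. The following is a minimal illustration of the standard Li and Lee cross-entropy criterion over a gray-level histogram, not the authors' PPSM implementation; the function name and histogram handling are assumptions for this sketch.

```python
import numpy as np

def mcet_threshold(image, bins=256):
    """Pick a threshold by minimizing the cross-entropy criterion (Li & Lee).

    `image` is any array of integer gray levels in [0, bins).
    Illustrative sketch only, not the paper's PPSM algorithm.
    """
    hist, _ = np.histogram(image, bins=bins, range=(0, bins))
    levels = np.arange(1, bins + 1, dtype=float)  # shift by 1 so log() is defined
    cum_w = np.cumsum(hist * levels)              # cumulative intensity mass
    cum_h = np.cumsum(hist).astype(float)         # cumulative pixel counts
    total_w, total_h = cum_w[-1], cum_h[-1]
    best_t, best_cost = 0, np.inf
    for t in range(1, bins):
        n1, n2 = cum_h[t - 1], total_h - cum_h[t - 1]
        if n1 == 0 or n2 == 0:
            continue  # one class would be empty at this split
        w1, w2 = cum_w[t - 1], total_w - cum_w[t - 1]
        # cross-entropy cost of splitting at t, with class means w/n
        cost = -w1 * np.log(w1 / n1) - w2 * np.log(w2 / n2)
        if cost < best_cost:
            best_cost, best_t = cost, t
    return best_t
```

On a bimodal image the returned threshold falls between the two modes, so thresholding at it separates the two regions. The PPSM described above differs in that it evaluates this kind of criterion under several candidate distributions rather than the raw histogram alone.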

Findings

On the basis of the achieved results, the proposed PPSM–minimum cross-entropy thresholding (PPSM–MCET) segmentation model proves to be a robust, accurate and highly consistent method with high performance.

Originality/value

A novel hybrid segmentation model is constructed by exploiting a combination of Gaussian, gamma and lognormal distributions with MCET. Moreover, to provide accurate, high-performance thresholding at minimum computational cost, the proposed PPSM uses a parallel processing method to minimize the computational effort of MCET computation. The proposed model might serve as a valuable tool in many application domains such as health-care systems, pattern recognition, traffic control and surveillance systems.
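To illustrate the general idea of weighing several candidate distributions in parallel (this is a hedged sketch, not the authors' parallel boosting algorithm), one can fit each of the three distributions to the pixel data concurrently and keep the best-scoring one by log-likelihood. The function names, the thread-based dispatch and the moment-based gamma fit are all assumptions of this sketch.

```python
import math
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def loglik_gaussian(x):
    # Maximum-likelihood Gaussian fit, then total log-likelihood of the data.
    mu, sigma = x.mean(), x.std(ddof=0)
    return (-0.5 * len(x) * math.log(2 * math.pi * sigma**2)
            - ((x - mu) ** 2).sum() / (2 * sigma**2))

def loglik_lognormal(x):
    # Lognormal likelihood = Gaussian likelihood of log-data minus the Jacobian term.
    lx = np.log(x)
    return loglik_gaussian(lx) - lx.sum()

def loglik_gamma(x):
    # Method-of-moments shape/scale (an approximation, not the MLE).
    m, v = x.mean(), x.var(ddof=0)
    k, theta = m * m / v, v / m
    return ((k - 1) * np.log(x).sum() - x.sum() / theta
            - len(x) * (math.lgamma(k) + k * math.log(theta)))

def best_distribution(x):
    """Score the three candidate distributions concurrently; return the winner."""
    fits = {"gaussian": loglik_gaussian,
            "lognormal": loglik_lognormal,
            "gamma": loglik_gamma}
    with ThreadPoolExecutor(max_workers=3) as pool:
        futures = {name: pool.submit(fn, x) for name, fn in fits.items()}
    scores = {name: f.result() for name, f in futures.items()}
    return max(scores, key=scores.get)
```

In the actual PPSM the per-distribution work is the MCET objective rather than a plain likelihood, but the dispatch pattern is the same: the three candidate models are independent, so they parallelize trivially.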

Details

Applied Computing and Informatics, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2634-1964
