Search results

1 – 10 of over 28,000
Article
Publication date: 7 August 2009

Marek J. Greniewski

The purpose of this paper is to present the ability of Z‐notation to formulate a formal requirements specification of large application software, based on the example of the…

Abstract

Purpose

The purpose of this paper is to present the ability of Z‐notation to formulate a formal requirements specification of large application software, based on the example of the Manufacturing Resource Planning (MRP II) Standard System. Z‐notation supports a formal transformation approach to obtaining operating software, in place of traditional programming. The original MRP II software requirements specification takes a descriptive form, extended by a list of control questions. To build a formal requirements specification, the original specification must be extended with definitions taken from the APICS Dictionary. The definitions cover such concepts as item, item code, location, and order.
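
For readers unfamiliar with Z, the flavour of such a specification can be sketched in LaTeX with the standard zed/fuzz conventions. This is a hedged illustration only: the schema name ItemSystem, the given sets ITEM and CODE, and the invariant are assumptions for this sketch, not the paper's actual schemas.

\begin{zed}
  [ITEM, CODE]  % given sets: items and their item codes
\end{zed}

\begin{schema}{ItemSystem}
  items : \power ITEM \\
  code : ITEM \pfun CODE
\where
  \dom code = items  % every known item carries exactly one item code
\end{schema}

An operation schema would add primed after-state variables and a precondition in the same notation.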

Design/methodology/approach

Schemas were written following the subsystem order of the MRP II Standard System; treating the system as a three-level structure (user interface, business logic, and database), the schemas describe the business-logic level only. A conclusion was the necessity of extending the descriptive requirements specification with definitions. Owing to the limited size of the presentation, it contains only a few examples of the formalization process, mainly limited to the main schemas: item system (full definition), inventory system, bill of material, work centers and routings, generic order system, master production schedule, and material requirements planning.

Findings

As a result of the research, it can be said that the Z‐notation apparatus is sufficient to build requirements specifications of large application systems such as MRP II, enterprise resource planning, or customer relationship management.

Originality/value

Libraries of typical algorithms such as MRP II, designed through a formal approach, could replace traditional programming and open new prospects for the future development of broad computerization.

Details

Kybernetes, vol. 38 no. 7/8
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 15 June 2015

Maximiliano Cristia and Claudia Frydman

This paper aims to present the verification process conducted to assess the functional correctness of a Web voting system. The Consejo Nacional de Investigaciones Científicas y Técnicas…

Abstract

Purpose

This paper aims to present the verification process conducted to assess the functional correctness of a Web voting system. The Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET) is the most important research institution in Argentina. It reports directly to Argentina’s President, but its internal authorities are elected by around 8,000 researchers across the country. During 2011, CONICET developed a Web voting system to replace the traditional mail-based election process. In 2012 and 2014, CONICET conducted two Web elections with no complaints from candidates or voters. Before moving the system into production, CONICET asked the authors to conduct a functional and security assessment of it.

Design/methodology/approach

This process is the result of integrating formal, semi-formal and informal verification activities, from formal proof to code inspection and model-based testing.
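
A generic sketch of the model-based-testing idea, not the authors' actual process: test cases are generated from a small executable model of the tallying rules and checked against the implementation. The names tally_model and system_under_test are hypothetical, and the real back end would replace the stand-in below.

from itertools import product

# A tiny executable model of plurality tallying: the oracle against
# which the real implementation is compared.
def tally_model(ballots):
    counts = {}
    for b in ballots:
        counts[b] = counts.get(b, 0) + 1
    return counts

def system_under_test(ballots):
    # Stand-in for the real voting back end; deliberately identical to
    # the model here so that the sketch runs end to end.
    return tally_model(list(ballots))

# Enumerate all ballot sequences of length 0..3 over two candidates and
# check that the implementation agrees with the model on every one.
candidates = ["A", "B"]
for n in range(4):
    for ballots in product(candidates, repeat=n):
        assert system_under_test(ballots) == tally_model(list(ballots))
print("all generated test cases passed")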

Findings

Given the resources and time available, a reasonable level of confidence in the correctness of the application could be conveyed to senior management.

Research limitations/implications

A formal specification of the requirements must be developed.

Originality/value

Formal methods and semi-formal activities are seldom applied to Web applications.

Details

International Journal of Web Information Systems, vol. 11 no. 2
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 1 December 2002

Wiwat Vatanawood and Wanchai Rivepiboon

This paper proposes a systematic scheme for synthesizing a formal specification from the definitions of a relational data model – entity-relationship diagrams and their data…

Abstract

This paper proposes a systematic scheme for synthesizing a formal specification from the definitions of a relational data model – entity-relationship diagrams and their data dictionaries. The formal specification of both the structural and behavioral properties of the relational data model is generated as Z schemas. In our approach, the mandatory structural constraints – the uniqueness of primary keys, foreign keys, and referential integrity constraints among the relations in the model – are preserved. We propose a set of transformation rules to produce Z schemas of the states and primitive operations – cascade insertion, deletion, and updating. Moreover, a composition technique for constructing composite operations is presented, using requirements particle networks. Revision of the formal specification can easily be conducted, with mathematical proofs of the properties of the data model using a Z prover tool.
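
As a hedged illustration of what such generated schemas can look like (the relations emp and dept and the given sets are invented for this sketch, not the authors' transformation output): modelling a relation as a partial function makes primary-key uniqueness intrinsic, and referential integrity becomes a range condition.

\begin{zed}
  [EMPID, DEPTID]  % given sets: key domains of the two relations
\end{zed}

\begin{schema}{Database}
  emp : EMPID \pfun DEPTID  % partial function: each primary key occurs at most once
  \\ dept : \power DEPTID
\where
  \ran emp \subseteq dept  % referential integrity: every foreign key resolves
\end{schema}

A cascade deletion operation would then be a schema over Database and Database' whose predicate removes a department together with every emp pair referencing it.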

Details

Engineering Computations, vol. 19 no. 8
Type: Research Article
ISSN: 0264-4401

Article
Publication date: 8 October 2018

Natalia Miloslavskaya

Nowadays, to operate securely and legally, achieve business objectives, protect valuable assets and support uninterrupted business processes, all organizations need to comply with…

Abstract

Purpose

Nowadays, to operate securely and legally, achieve business objectives, protect valuable assets and support uninterrupted business processes, all organizations need to comply with a wide range of internal and external regulations such as laws, standards, guidelines, policies, specifications and procedures. An integrated system able to manage information security (IS) for their intranets in the new cyberspace, while processing tremendous amounts of IS-related data arriving in various formats, is required as never before. These data, after being collected and analyzed, should be evaluated in real time from an IS-incident viewpoint to identify an incident’s source, determine its type, weigh its consequences, visualize its vector, associate all target systems, prioritize countermeasures and offer mitigation solutions with weighted impact relevance. Security information and event management (SIEM) systems cope with this routine and usually complicated work through rapid detection of IS incidents and appropriate response. Modern challenges dictate the need to build these systems using advanced technologies such as blockchain (BC) technologies (BCTs). The purpose of this study is to design a new BC-based SIEM 3.0 system and to propose a methodology for its evaluation.

Design/methodology/approach

Many internet resources argue that BCT suits the intrusion-detection objectives very well, but they do not explain how to implement it.

Findings

After a brief analysis of the BC concept and the evolution of SIEM systems, this paper presents the main ideas for designing next-generation BC-based SIEM 3.0 systems, for the first time in open-access publications, including a convolution method for addressing the scalability issue of the ever-growing BC size. This new approach makes it possible not simply to modify SIEM systems in an evolutionary manner, but to bring their next generation to a qualitatively new and higher level of IS event management in the future.
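
The convolution method itself is not detailed in this abstract, but the BC mechanism it builds on, batching IS events into blocks chained by hash so that history cannot be rewritten undetected, can be sketched generically in Python. All field names below are illustrative assumptions, not the paper's design.

import hashlib, json, time

def block_hash(block):
    # Deterministic SHA-256 over the block's canonical JSON encoding.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(events, prev_hash):
    # One block batches a set of IS events and links to its predecessor.
    return {"timestamp": time.time(), "events": events, "prev": prev_hash}

chain = [make_block(["system start"], "0" * 64)]
chain.append(make_block(["failed login from 10.0.0.5", "port scan detected"],
                        block_hash(chain[-1])))

# Verification: recomputing each link exposes any tampering upstream.
for prev, cur in zip(chain, chain[1:]):
    assert cur["prev"] == block_hash(prev)
print("chain of", len(chain), "blocks verified")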

Research limitations/implications

The most important area of future work is to bring the proposed system to life. Implementation, deployment and testing on a real-world network would also show its viability, or show that a more sophisticated model should be worked out. After developing the design basics, we are ready to determine the directions of the most promising studies. What are the main criteria and principles according to which an organization will select events from PEL for creating one BC block? What is the optimal number of nodes in the organization’s BC, depending on its network assets, the services provided and the number of events that occur in its network? How should the SIEM 3.0 BC infrastructure be built and hosted? How should streaming analytics be arranged for block content containing the events taking place in the network? How should the BC middleware be designed as software that enables staff to interact with BC blocks to provide services such as IS event correlation? How should the results obtained be visualized to find insights and patterns in historical BC data for better IS management? How can the emergence of IS events be predicted in the future? This list of questions could be continued indefinitely for a full-fledged design of SIEM 3.0.

Practical implications

This paper shows the full applicability of the BC concept to the creation of next-generation SIEM 3.0 systems designed to detect IS incidents in a modern, fully interconnected organizational network environment. The authors present a first detailed description of the design basics for a BC-based SIEM 3.0 system, together with an evaluation methodology for the resulting product.

Originality/value

The authors believe that their revolutionary new approach makes it possible not simply to modify SIEM systems in an evolutionary manner, but to bring their next generation to a qualitatively new and higher level of IS event management in the future. They hope that this paper will evoke a lively response in this segment of the security-controls market from both theorists and developers of live systems that will implement the above approach.

Book part
Publication date: 23 June 2016

Alexander Chudik, Kamiar Mohaddes, M. Hashem Pesaran and Mehdi Raissi

This paper develops a cross-sectionally augmented distributed lag (CS-DL) approach to the estimation of long-run effects in large dynamic heterogeneous panel data models with…

Abstract

This paper develops a cross-sectionally augmented distributed lag (CS-DL) approach to the estimation of long-run effects in large dynamic heterogeneous panel data models with cross-sectionally dependent errors. The asymptotic distribution of the CS-DL estimator is derived under coefficient heterogeneity in the case where the time dimension (T) and the cross-section dimension (N) are both large. The CS-DL approach is compared with more standard panel data estimators that are based on autoregressive distributed lag (ARDL) specifications. It is shown that, unlike the ARDL-type estimator, the CS-DL estimator is robust to misspecification of dynamics and error serial correlation. The theoretical results are illustrated with small-sample evidence obtained by means of Monte Carlo simulations, which suggest that the performance of the CS-DL approach is often superior to the alternative panel ARDL estimates, particularly when T is not too large and lies in the range of 30–50.
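
Schematically, and as a stylized rendition from the CS-DL literature rather than the chapter's exact equations, the long-run coefficient vector \theta is estimated directly from a distributed-lag regression augmented with cross-sectional averages:

\[
y_{it} = c_i + \theta' x_{it}
       + \sum_{\ell=0}^{p-1} \delta_{i\ell}' \, \Delta x_{i,t-\ell}
       + \omega_{yi} \, \bar{y}_t
       + \sum_{\ell=0}^{p_x} \omega_{xi\ell}' \, \bar{x}_{t-\ell}
       + e_{it},
\]

where \bar{y}_t and \bar{x}_t are cross-sectional averages proxying the common factors behind the error dependence. Because \theta enters the regression directly, it need not be backed out of estimated short-run ARDL coefficients, which is the source of the robustness to misspecified dynamics noted above.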

Book part
Publication date: 18 January 2022

Badi H. Baltagi, Georges Bresson, Anoop Chaturvedi and Guy Lacroix

This chapter extends the work of Baltagi, Bresson, Chaturvedi, and Lacroix (2018) to the popular dynamic panel data model. The authors investigate the robustness of Bayesian panel…

Abstract

This chapter extends the work of Baltagi, Bresson, Chaturvedi, and Lacroix (2018) to the popular dynamic panel data model. The authors investigate the robustness of Bayesian panel data models to possible misspecification of the prior distribution. The proposed robust Bayesian approach departs from the standard Bayesian framework in two ways. First, the authors consider the ε-contamination class of prior distributions for the model parameters as well as for the individual effects. Second, both the base elicited priors and the ε-contamination priors use Zellner’s (1986) g-priors for the variance–covariance matrices. The authors propose a general “toolbox” for a wide range of specifications which includes the dynamic panel model with random effects, with cross-correlated effects à la Chamberlain, for the Hausman–Taylor world and for dynamic panel data models with homogeneous/heterogeneous slopes and cross-sectional dependence. Using a Monte Carlo simulation study, the authors compare the finite sample properties of the proposed estimator to those of standard classical estimators. The chapter contributes to the dynamic panel data literature by proposing a general robust Bayesian framework which encompasses the conventional frequentist specifications and their associated estimation methods as special cases.
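
For reference, the ε-contamination class mentioned here has the standard form (notation chosen for this note, not necessarily the chapter's):

\[
\Gamma = \left\{ \pi(\theta) = (1 - \varepsilon)\, \pi_0(\theta) + \varepsilon\, q(\theta) : q \in Q \right\},
\]

where \pi_0 is the base elicited prior, Q is a class of contaminating distributions and \varepsilon \in [0, 1] quantifies the doubt placed in \pi_0. Robust Bayesian analysis then examines how posterior quantities vary as q ranges over Q.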

Details

Essays in Honor of M. Hashem Pesaran: Panel Modeling, Micro Applications, and Econometric Methodology
Type: Book
ISBN: 978-1-80262-065-8

Article
Publication date: 1 September 1995

Sea Ling and Bohdan Durnota

Modelling by means of specification languages is increasingly being recognized as an important phase in system development. It encourages one to think about problems using models…

Abstract

Modelling by means of specification languages is increasingly being recognized as an important phase in system development. It encourages one to think about problems using models organized around real-world situations. The system to be developed should then be consistent, correct and unambiguous with respect to the models produced. The just-in-time kanban system is an example of a real-world problem with a multiple-supplier and multiple-client architecture. Uses two specification languages, LOOPN and Object-Z, proposed in the literature, to model the kanban system. Focuses on describing the kanban system in the different notations, thus investigating how well they can express the just-in-time system. The kanban system consists of many replicated components, each having the same state space and exhibiting the same behaviour. To describe each and every component in the system would be repetitious and tedious. Discusses the ease of describing such a system.
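
A toy sketch of the replication issue the abstract raises (plain Python, not the paper's LOOPN or Object-Z models; all names are invented): the shared state space and behaviour are written once as a class, and the system is built by instantiating it many times.

from collections import deque

class KanbanStage:
    """One supplier/client component; every stage shares this state space."""
    def __init__(self, name, cards):
        self.name = name
        self.free_cards = cards  # kanban cards authorizing production
        self.output = deque()    # finished parts awaiting a client

    def produce(self):
        # A stage may only produce while it holds a free kanban card.
        if self.free_cards > 0:
            self.free_cards -= 1
            self.output.append("part")

    def consume_from(self, supplier):
        # Taking a part returns the attached card to the supplier.
        if supplier.output:
            supplier.output.popleft()
            supplier.free_cards += 1

# Three replicated stages; describing each one separately would be
# repetitious, which is exactly the point the authors make.
line = [KanbanStage(f"stage{i}", cards=2) for i in range(3)]
line[0].produce()
line[1].consume_from(line[0])
print([(s.name, s.free_cards, len(s.output)) for s in line])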

Details

International Journal of Operations & Production Management, vol. 15 no. 9
Type: Research Article
ISSN: 0144-3577

Article
Publication date: 15 April 2024

Sarah Herwald, Simone Voigt and André Uhde

Academic research has intensively analyzed the relationship between market concentration or market power and banking stability but provides ambiguous results, which are summarized…

Abstract

Purpose

Academic research has intensively analyzed the relationship between market concentration or market power and banking stability but provides ambiguous results, which are summarized under the concentration-stability/fragility view. We provide empirical evidence that the mixed results are due to the difficulty of identifying reliable variables to measure concentration and market power.

Design/methodology/approach

Using data from 3,943 banks operating in the European Union (EU)-15 between 2013 and 2020, we employ linear regression models on panel data. Banking market concentration is measured by the Herfindahl–Hirschman Index (HHI), and market power is estimated by product-specific Lerner Indices for the loan and the deposit market, respectively.
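
Both measures have simple textbook forms, sketched below with made-up numbers; the paper's actual estimation, in particular of the marginal cost entering the Lerner Index, is considerably more involved.

# Herfindahl-Hirschman Index: sum of squared market shares (in percent).
def hhi(market_shares_pct):
    return sum(s ** 2 for s in market_shares_pct)

# Lerner Index: markup of price over marginal cost, relative to price.
def lerner(price, marginal_cost):
    return (price - marginal_cost) / price

# Four banks holding 40/30/20/10 percent of a market:
print(hhi([40, 30, 20, 10]))  # 3000, above the common 2500 threshold for high concentration
print(lerner(price=0.05, marginal_cost=0.03))  # 0.4, substantial pricing power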

Findings

Our analysis reveals a significantly stability-decreasing impact of market concentration (HHI) and a significantly stability-increasing effect of market power (Lerner Indices). In addition, we provide evidence for a weak (or even absent) empirical relationship between the (non)structural measures, challenging the validity of the structure-conduct-performance (SCP) paradigm. Our baseline findings remain robust, especially when controlling for a likely reverse causality.

Originality/value

Our results suggest that the HHI may reflect other factors beyond market power that influence banking stability. Thus, banking supervisors and competition authorities should investigate market concentration and market power simultaneously while considering their joint impact on banking stability.

Details

The Journal of Risk Finance, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1526-5943

Details

Prioritization of Failure Modes in Manufacturing Processes
Type: Book
ISBN: 978-1-83982-142-4

1 – 10 of over 28,000