Search results
Ebrahim Vatan, Gholam Ali Raissi Ardali and Arash Shahin
Abstract
Purpose
This study aims to investigate the effects of organizational culture factors on the selection of software process development models and develops a conceptual model for selecting and adopting process development models with an organizational culture approach, using the 12 criteria and their sub-criteria defined in Fey and Denison’s model.
Design/methodology/approach
The research hypotheses were investigated using statistical analysis, and the criteria and sub-criteria were then selected based on Fey and Denison’s model and the experts’ viewpoints. Afterward, the organizational culture of the selected company was measured using data from 2016 and 2017, based on Fey and Denison’s questionnaire. Because the criteria are correlated, the decision-making trial and evaluation laboratory (DEMATEL) technique was used to determine the correlations among the sub-criteria, and the analytical network process method, implemented in the Super-Decision software, was used to select the preferred process development model from among the 12 common models in information systems development.
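The inter-criteria correlation step described above resembles the standard DEMATEL procedure. As a minimal sketch, assuming a hypothetical 3-criterion direct-influence matrix (the values and criterion count are invented, not the paper's data): the matrix is normalized, the total-relation matrix T = A(I − A)⁻¹ is formed, and each criterion's prominence (r + c) and net cause/effect role (r − c) are read off.

```python
# A minimal DEMATEL sketch with an invented 3-criterion influence matrix.
import numpy as np

def dematel(direct):
    """Return total-relation matrix T, prominence (r + c) and relation (r - c)."""
    direct = np.asarray(direct, dtype=float)
    # Normalize by the largest row sum so the series A + A^2 + ... converges.
    a = direct / direct.sum(axis=1).max()
    t = a @ np.linalg.inv(np.eye(len(a)) - a)  # T = A (I - A)^-1
    r, c = t.sum(axis=1), t.sum(axis=0)
    return t, r + c, r - c

# Hypothetical direct-influence scores among three culture criteria (0-4 scale).
influence = [[0, 3, 2],
             [1, 0, 3],
             [2, 1, 0]]
t, prominence, relation = dematel(influence)
print(prominence)  # overall importance of each criterion
print(relation)    # net cause (+) vs effect (-) role
```

The prominence and relation vectors are exactly what an analytical network process step can then consume as inter-dependency weights.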
Findings
Results indicated a significant and positive effect of organizational culture factors (except the core values factor) on the selection of development models. Also, by changing the value of organizational culture, the selected process development model changed as well. Sensitivity analysis on the sub-criteria implied that, by changing and improving some sub-criteria, the organization becomes ready and willing to use agile or risk-based models such as the spiral and win-win models. In units where the mentioned indicators were at moderate or low levels, models such as the waterfall, V-shaped and incremental models worked more appropriately.
Originality/value
While many studies have compared development models, investigated their strengths and weaknesses, and examined the impact of organizational culture on the success of information technology projects, the literature indicates that the impact of the prevailing organizational sub-culture on the selection of development process models has not been investigated. This study addresses new factors and indicators affecting the selection of development models with a focus on organizational culture. Correlations among the factors and indicators were also investigated and, finally, a conceptual model was proposed for the proper adoption of system development models and methodologies.
Chaitanya Arun Sathe and Chetan Panse
Abstract
Purpose
This study aims to examine the enablers of productivity of the enterprise-level Agile development process using modified total interpretative structural modeling (TISM). The two main objectives of the current study are to determine the variables influencing enterprise-level agile development productivity and to develop a modified TISM for the corresponding components.
Design/methodology/approach
To identify enablers of the productivity of the enterprise-level agile software development process, a literature review was conducted and the opinions of domain experts were collected. A hierarchical relationship among variables showing direct and indirect influence was created using the modified TISM (M-TISM) technique with Cross-Impact Matrix Multiplication Applied to Classification (MICMAC) analysis. This study examined and analyzed the relationships between the determinants within the enterprise using the M-TISM technique.
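The Cross Impact Matrix-Multiplication Applied to Classification (MICMAC) step can be sketched as follows, assuming a hypothetical 4-enabler final reachability matrix (the matrix and enabler count are invented for illustration): driving power is an enabler's row sum, dependence power its column sum, and the two together place each enabler in one of four quadrants.

```python
# A toy MICMAC classification over an invented reachability matrix.
import numpy as np

def micmac(reachability):
    m = np.asarray(reachability)
    driving, dependence = m.sum(axis=1), m.sum(axis=0)
    mid = m.shape[0] / 2  # quadrant threshold at half the enabler count
    labels = []
    for d, p in zip(driving, dependence):
        if d > mid and p > mid:
            labels.append("linkage")
        elif d > mid:
            labels.append("driver")
        elif p > mid:
            labels.append("dependent")
        else:
            labels.append("autonomous")
    return driving, dependence, labels

# 1 = enabler i (row) helps achieve enabler j (column); diagonal is reflexive.
reach = [[1, 1, 1, 1],
         [0, 1, 0, 1],
         [0, 1, 1, 1],
         [0, 0, 0, 1]]
driving, dependence, labels = micmac(reach)
```

Enablers labeled "driver" sit at the base of the TISM hierarchy; "dependent" enablers (here, the last one) sit at the top.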
Findings
Through the literature review, the study identified ten enabling factors of the productivity of the Agile development process at the enterprise level. Results depict that program increment (PI) planning and scalable backlog management, continuous integration and continuous delivery (CI/CD), agile release trains (ART), agile work culture, delivery excellence, lean and DevOps practices, value stream mapping (VSM), team skills and expertise, collaborative culture, agile coaching and customer engagement have an impact on the productivity of the enterprise-level Agile development process. The results show that team collaboration, agile ways of working and customer engagement have the greatest impact on productivity improvement for the enterprise-level Agile development process.
Research limitations/implications
The developed model is useful for organizations employing scaled Agile development processes in software development. This study provides a recommended list of key enablers that may yield productivity improvements in the Agile development process at the enterprise level. Strategists should focus on team collaboration and Agile project management. This study offers a modified TISM model to help academicians understand the effects of numerous variables on maintaining the productivity of an enterprise-level Agile development process. The identified characteristics and their hierarchical structure can help project managers execute Agile projects at the enterprise level more effectively, increasing their success and productivity.
Originality/value
The study addresses a gap in the literature by establishing interpretative relationships between the identified enabling factors. The model was validated by a panel of nine experts from several information technology organizations deploying Agile software development at the enterprise level. This unique method broadens the knowledge base in Agile software development at scale and provides project managers and practitioners with a practical foundation.
Jayakrishna Kandasamy, Fazleena Badurdeen and Tharanga Rajapakshe
Gongtao Zhang and M.N. Ravishankar
Abstract
Purpose
Digital technologies create myriad innovation opportunities and have inspired the establishment of many new start-ups in recent years. Despite the growing knowledge on digital entrepreneurship, few studies explore how start-ups exploit these opportunities to achieve entrepreneurial success. The purpose of this paper is to explore start-ups’ capabilities for successful delivery of digital artefacts in a cloud computing infrastructure.
Design/methodology/approach
Empirical data were collected during a qualitative case study of an established start-up in the Chinese market, through interviews with 41 participants. Informed by the notion of dynamic capabilities and using the Gioia methodology, the case firm's life cycle was analysed in detail.
Findings
The study identifies start-ups’ ordinary and dynamic capabilities for successful development and delivery of digital services. The findings provide insights into a portfolio of start-ups’ capabilities, namely adaptation, networking, reengineering and refinement.
Originality/value
The study suggests that start-ups’ capabilities and underlying entrepreneurial actions determine the degree to which the adoption of digital technologies creates and transfers value to customers. The study offers specific insights into how start-ups successfully develop and deliver digital artefacts in a cloud infrastructure based on entrepreneurs' prior expertise, vision and accumulated experience.
Vaishali Rajput, Preeti Mulay and Chandrashekhar Madhavrao Mahajan
Abstract
Purpose
Nature’s evolution has shaped intelligent behaviors in creatures like insects and birds, inspiring the field of Swarm Intelligence. Researchers have developed bio-inspired algorithms to address complex optimization problems efficiently. These algorithms strike a balance between computational efficiency and solution optimality, attracting significant attention across domains.
Design/methodology/approach
Bio-inspired optimization techniques for feature engineering and their applications are systematically reviewed, with the chief objective of assessing the statistical influence and significance of “bio-inspired optimization”-based computational models, by referring to the vast research literature published between 2015 and 2022.
Findings
The Scopus and Web of Science databases were explored for the review, with a focus on parameters such as country-wise publications, keyword occurrences and citations per year. Springer and IEEE emerge as the most prolific publishers, with prominent journals including PLoS ONE, Neural Computing and Applications, Lecture Notes in Computer Science and IEEE Transactions. The “National Natural Science Foundation” of China and the “Ministry of Electronics and Information Technology” of India lead in funding projects in this area. China, India and Germany stand out as leaders in publications related to bio-inspired algorithms for feature engineering research.
Originality/value
The review findings integrate various bio-inspired algorithm selection techniques over a diverse spectrum of optimization techniques. Ant colony optimization contributes decentralized and cooperative search strategies, bee colony optimization (BCO) improves collaborative decision-making, particle swarm optimization balances exploration and exploitation, and bio-inspired algorithms in general offer a range of nature-inspired heuristics.
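The exploration-exploitation balance attributed to particle swarm optimization can be illustrated with a compact sketch. All parameter values and the test function below are illustrative defaults, not drawn from the reviewed papers: the inertia term keeps particles exploring, while the pulls toward personal and global bests exploit known good regions.

```python
# A compact particle swarm optimization sketch minimizing the sphere function.
import numpy as np

rng = np.random.default_rng(0)

def pso(f, dim=2, particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    x = rng.uniform(-5, 5, (particles, dim))   # positions
    v = np.zeros((particles, dim))             # velocities
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, particles, dim))
        # Inertia (exploration) plus pulls toward personal/global bests (exploitation).
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, float(pbest_val.min())

best, best_val = pso(lambda p: float((p ** 2).sum()))
```

Shrinking the inertia weight w over time is a common variant that shifts the balance from exploration toward exploitation as the search progresses.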
Eylem Thron, Shamal Faily, Huseyin Dogan and Martin Freer
Abstract
Purpose
Railways are a well-known example of complex critical infrastructure, incorporating socio-technical systems with humans such as drivers, signallers, maintainers and passengers at the core. Technological evolution, including interconnectedness and new ways of interaction, leads to new security and safety risks that can be realised both through human error and through malicious and non-malicious behaviour. This study aims to identify the human factors (HF) and cyber-security risks relating to the role of signallers on the railways and explores strategies for the improvement of “Digital Resilience” – for the concept of a resilient railway.
Design/methodology/approach
Overall, 26 interviews were conducted with 21 participants from industry and academia.
Findings
The results showed that due to increased automation, both cyber-related threats and human error can impact signallers’ day-to-day operations – directly or indirectly (e.g. workload and safety-critical communications) – which could disrupt the railway services and potentially lead to safety-related catastrophic consequences. This study identifies cyber-related problems, including external threats; engineers not considering the human element in designs when specifying security controls; lack of security awareness among the rail industry; training gaps; organisational issues; and many unknown “unknowns”.
Originality/value
The authors discuss socio-technical principles through a hexagonal socio-technical framework and a training needs analysis to mitigate cyber-security issues and identify the predictive training needs of the signallers. This is supported by a systematic approach that considers both safety and security factors, rather than waiting to learn from a cyber-attack retrospectively.
Bahman Arasteh and Ali Ghaffari
Abstract
Purpose
Reducing the number of generated mutants by clustering redundant mutants, reducing the execution time by decreasing the number of generated mutants and reducing the cost of mutation testing are the main goals of this study.
Design/methodology/approach
In this study, a method is suggested to identify and prune redundant mutants. In the method, the program source code is first analyzed by the developed parser to filter out the effectless instructions; the remaining instructions are then mutated by the standard mutation operators. The single-line mutants are partially executed by the developed instruction evaluator. Next, a clustering method groups the single-line mutants with the same results, so only one complete run is needed per cluster.
Findings
The results of experiments on the Java benchmarks indicate that the proposed method causes a 53.51 per cent reduction in the number of mutants and a 57.64 per cent time reduction compared to similar experiments in the MuJava and MuClipse tools.
Originality/value
This study makes the following contributions: developing a classifier that takes the program source code and, using a dependency graph, classifies the program's instructions into effective and effectless classes, where filtering out the effectless instructions reduces the total number of generated mutants; developing and implementing an instruction parser and instruction-level mutant generator for Java programs, where the mutant generator takes an instruction of the original program as a string and generates its single-line mutants based on the standard mutation operators in MuJava; and developing a stack-based evaluator that takes an instruction (original or mutant) and the test data and evaluates its result without executing the whole program.
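The pruning idea can be sketched in a few lines: single-line mutants whose partial evaluation on the test data yields identical results are grouped, and only one representative per group needs a full run. The mutation operators, the expression and the `partial_eval` helper below are invented for illustration (Python `eval` stands in for the paper's stack-based evaluator, and the mutants are not the MuJava operator set).

```python
# Sketch: clustering single-line mutants by their partial-evaluation result.
from collections import defaultdict

def partial_eval(expr, env):
    """Evaluate a single mutated expression against the test data (env)."""
    return eval(expr, {}, env)   # stands in for a stack-based instruction evaluator

original = "a + b * c"
# Hypothetical arithmetic-operator-replacement mutants of the one instruction.
mutants = ["a - b * c", "a * b * c", "a + b + c", "a + b - c", "a + b / c"]
test_data = {"a": 2, "b": 0, "c": 5}   # b = 0 makes several mutants coincide

clusters = defaultdict(list)
for m in mutants:
    clusters[partial_eval(m, test_data)].append(m)

# One full test-suite run per cluster instead of one per mutant.
representatives = [group[0] for group in clusters.values()]
```

On this toy input, "a - b * c" and "a + b / c" both evaluate to 2, so five mutants collapse into four clusters; the saving grows with the redundancy in the mutant set.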
Haider Jouma, Muhamad Mansor, Muhamad Safwan Abd Rahman, Yong Jia Ying and Hazlie Mokhlis
Abstract
Purpose
This study aims to investigate the daily performance of the proposed microgrid (MG) that comprises photovoltaic, wind turbines and is connected to the main grid. The load demand is a residential area that includes 20 houses.
Design/methodology/approach
The daily operational strategy of the proposed MG allows energy to be both sold to and procured from the main grid. The smart metre of every consumer provides the supplier with the daily consumption pattern, which is amended by demand side management (DSM). The daily operational cost (DOC), CO2 emission and other measures are used to evaluate the system performance. A grey wolf optimizer was employed to minimize the DOC, comprising the cost of procuring energy from the main grid, the emission cost and the revenue from energy sold to the main grid.
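The grey wolf optimization step can be sketched as follows. This is a minimal sketch under stated assumptions: the real objective (grid energy cost, emission cost, revenue of sold energy over 24 hours) is replaced by a toy convex function, and all parameters are illustrative, not the paper's settings.

```python
# A minimal grey wolf optimizer (GWO) sketch minimizing a stand-in cost.
import numpy as np

rng = np.random.default_rng(1)

def gwo(cost, dim=4, wolves=15, iters=200, lo=-10.0, hi=10.0):
    x = rng.uniform(lo, hi, (wolves, dim))
    for t in range(iters):
        fitness = np.apply_along_axis(cost, 1, x)
        alpha, beta, delta = x[fitness.argsort()[:3]]   # three best wolves lead
        a = 2 - 2 * t / iters                           # decays linearly 2 -> 0
        pos = np.zeros_like(x)
        for leader in (alpha, beta, delta):
            r1, r2 = rng.random((2, wolves, dim))
            A = 2 * a * r1 - a          # |A| > 1 encourages exploration
            C = 2 * r2
            D = np.abs(C * leader - x)  # distance to the leader
            pos += leader - A * D
        x = np.clip(pos / 3, lo, hi)    # average the three leader-driven moves
    fitness = np.apply_along_axis(cost, 1, x)
    return x[fitness.argmin()], float(fitness.min())

# Toy stand-in for the DOC objective: a shifted sphere with optimum at 1.
best, best_cost = gwo(lambda p: float(((p - 1.0) ** 2).sum()))
```

In the paper's setting the decision variables would be the hourly energy exchanges with the main grid rather than this toy vector.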
Findings
The results obtained for winter and summer days revealed that DSM significantly improved the system performance from economic and environmental perspectives. With DSM, the DOC on a winter day was −26.93 ($/kWh) and on a summer day was 10.59 ($/kWh); without DSM, the DOC was −25.42 ($/kWh) on the winter day and 14.95 ($/kWh) on the summer day.
Originality/value
As opposed to previous research that predominantly addressed long-term operation, the value of the proposed research is to investigate the short-term (24-hour) operation of an MG that copes with vital contingencies associated with selling energy to and procuring energy from the main grid, considering the environmental cost. Notably, the proposed research engaged consumers through smart meters to apply demand side management (DSM), while previous studies largely focused on supply side management.
Abdulmohsen S. Almohsen, Naif M. Alsanabani, Abdullah M. Alsugair and Khalid S. Al-Gahtani
Abstract
Purpose
The variance between the winning bid and the owner's estimated cost (OEC) is one of the construction management risks in the pre-tendering phase. The study aims to enhance the quality of the owner's estimate so that the contract cost can be predicted precisely at the pre-tendering phase, avoiding future issues that arise during the construction phase.
Design/methodology/approach
This paper integrated artificial neural networks (ANN), deep neural networks (DNN) and time series (TS) techniques to accurately estimate the ratio of the low bid to the OEC (R) for contracts of different sizes and three contract types (building, electric and mechanic), based on 94 contracts from King Saud University. The ANN and DNN models were evaluated using the mean absolute percentage error (MAPE), mean sum square error (MSSE) and root mean sum square error (RMSSE).
Findings
The main finding is that the ANN provides high accuracy, with MAPE, MSSE and RMSSE values of 2.94%, 0.0015 and 0.039, respectively. The DNN's precision was also high, with an RMSSE of 0.15 on average.
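The reported error measures can be computed as below. This sketch assumes MSSE is the mean of squared errors and RMSSE its square root (the abstract abbreviates the exact definitions), and the ratio values are invented for illustration, not the paper's data.

```python
# Illustrative computation of MAPE, MSSE and RMSSE for predicted bid/OEC ratios.
import numpy as np

def mape(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100)

def msse(actual, predicted):
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean((actual - predicted) ** 2))

def rmsse(actual, predicted):
    return msse(actual, predicted) ** 0.5

# Hypothetical ratios R = low bid / owner's estimated cost vs model predictions.
r_actual = [0.92, 1.05, 0.88, 1.10]
r_pred = [0.95, 1.02, 0.90, 1.08]
```

MAPE is scale-free (useful when contract sizes vary widely), while MSSE and RMSSE penalize large individual deviations more heavily.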
Practical implications
The owner and consultant are expected to use the study's findings to improve the accuracy of the owner's estimate and decrease the difference between the owner's estimate and the lowest submitted offer, for better decision-making.
Originality/value
This study fills the knowledge gap by developing an ANN model to handle missing TS data and forecasting the difference between a low bid and an OEC at the pre-tendering phase.
Miguel Calvo and Marta Beltrán
Abstract
Purpose
This paper aims to propose a new method to derive custom dynamic cyber risk metrics based on the well-known Goal, Question, Metric (GQM) approach. A framework that complements it and makes it much easier to use is also proposed. Both the method and the framework have been validated within two challenging application domains: continuous risk assessment within a smart farm and risk-based adaptive security to reconfigure a Web application firewall.
Design/methodology/approach
The authors identified a problem and provided motivation. They developed their theory and engineered a new method and a framework to complement it. They demonstrated that the proposed method and framework work by validating them in two real use cases.
Findings
The GQM method, often applied within the software quality field, is a good basis for proposing a method to define new tailored cyber risk metrics that meet the requirements of current application domains. A comprehensive framework that formalises possible goals and questions translated to potential measurements can greatly facilitate the use of this method.
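A GQM-style hierarchy that translates goals and questions into concrete measurements might be encoded as below. The goal, question and metric names are invented for illustration and are not taken from the authors' framework; the sketch only shows the structural idea of deriving a metric by choosing a goal and a question.

```python
# Sketch: a goal -> question -> metric hierarchy yielding a concrete risk metric.
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    compute: callable          # maps raw measurements to a number

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    purpose: str
    questions: list = field(default_factory=list)

# Hypothetical goal for a Web application firewall (WAF) scenario.
blocked_ratio = Metric("blocked_request_ratio",
                       lambda m: m["blocked"] / m["total"])
goal = Goal("Assess exposure of the WAF to injection attacks",
            [Question("What share of hostile requests is blocked?",
                      [blocked_ratio])])

# Choosing the goal and question selects the metric; measurements feed it.
measurements = {"blocked": 970, "total": 1000}
value = goal.questions[0].metrics[0].compute(measurements)
```

Formalising the goal and question catalogue up front, as the findings describe, is what lets different organisations reuse the same derivation machinery with their own measurements.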
Originality/value
The proposed method enables the application of the GQM approach to cyber risk measurement. The proposed framework allows new cyber risk metrics to be inferred by choosing among suggested goals and questions and measuring the relevant elements of probability and impact. The authors' approach proves generic and flexible enough to allow very different organisations with heterogeneous requirements to derive tailored metrics useful for their particular risk management processes.