Search results

1 – 10 of over 4000
Article
Publication date: 3 May 2018

Viktoriya Lantushenko, Amy F. Lipton and Todd Erkis

Abstract

Purpose

Knowledge of spreadsheet tools such as Microsoft Excel is a valuable skill in today’s job market. A preliminary assessment of a group of business school students shows that most struggle to perform simple tasks in a spreadsheet. The purpose of this paper is to propose using student tutors to teach these skills.

Design/methodology/approach

The authors identify students proficient in Excel to serve as tutors and organize one-on-one peer tutoring lessons, then compare students’ Excel competency before and after the tutoring sessions.
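
The abstract does not name a statistical test for the before/after comparison. One standard option for paired pre/post scores is a paired t-test, sketched below in Python; the scores and the choice of test are illustrative assumptions, not the paper’s method.

```python
# Illustrative only: the paper states that competency was compared before and
# after tutoring but does not name a test. A paired t-test on per-student
# scores is one standard approach; these scores are invented.
from scipy import stats

pre = [4, 5, 3, 6, 2, 5, 4, 3]   # hypothetical pre-tutoring assessment scores
post = [7, 8, 6, 8, 5, 7, 7, 6]  # the same students after tutoring

mean_gain = sum(b - a for a, b in zip(pre, post)) / len(pre)
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"mean improvement: {mean_gain:.2f} points")
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```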

Findings

The results suggest that most students with minimal Excel skills significantly improve their competency level after tutoring.

Originality/value

The proposed hands-on approach appears to be effective in helping students acquire basic Excel capabilities.

Details

Managerial Finance, vol. 44 no. 7
Type: Research Article
ISSN: 0307-4358

Article
Publication date: 1 August 2016

Claire Elizabeth Carlson, Paul A. Isihara, Roger Sandberg, David Boan, Kaile Phelps, Kyu Lim Lee, Danilo R. Diedrichs, Daniela Cuba, Johnny Edman, Melissa Gray, Roland Hesse, Robin Kong and Kei Takazawa

Abstract

Purpose

Disaster response requires a way to assess how reliably and equitably funding is accounted for and distributed. This paper addresses that need with a standardized report and index applicable to any disaster type.

Design/methodology/approach

Data from the Nepal earthquake (2015), Typhoon Haiyan (2013), the Haiti earthquake (2010), the Sri Lankan flood (2011), and Hurricane Sandy (2012) illustrate uses of a public equitable allocation of resources log (PEARL). Drawing on activity-based costing and the Gini index, a PEARL spreadsheet computes absolute inequity sector by sector as well as a cumulative index. Variations across these responses guide the interpretation of index values.
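
As an illustration of the kind of Gini-based computation described here, the following Python sketch scores sector allocations against assessed need. The exact PEARL formula is defined in the paper and its spreadsheet; the construction below (one minus the Gini coefficient of allocation per unit of need) is an assumption chosen only to match the reported reading that indices near 1 are equitable.

```python
# Hypothetical sketch of a Gini-based equity score over sector allocations.
# Assumed form: 1 - Gini(allocation per unit of need), so 1.0 reads as
# perfectly equitable, consistent with the reported interpretation
# (indices near 0.95 equitable, near 0.5 major inequity).
def gini(values):
    """Gini coefficient of non-negative values (0 = perfect equality)."""
    n = len(values)
    mean = sum(values) / n
    if mean == 0:
        return 0.0
    total_diff = sum(abs(x - y) for x in values for y in values)
    return total_diff / (2 * n * n * mean)

def equity_index(allocated, needed):
    """Equity of allocations relative to assessed sector need."""
    per_need = [a / n for a, n in zip(allocated, needed)]
    return 1.0 - gini(per_need)

# Invented example: hygiene kits allocated across four districts of equal need.
print(round(equity_index([120, 80, 40, 10], [100, 100, 100, 100]), 2))  # 0.63
```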

Findings

Index values indicate major inequity in the Nepal hygiene kit distribution and the Haiti earthquake response (both PEARL indices near 0.5), moderate inequity for the Sri Lankan flood (index roughly 0.75), and equitable distributions for Typhoon Haiyan and Hurricane Sandy (both indices approximately 0.95). The indices are useful for approximating the proportion of inequity in the total response and for investigating allocation under uncertainty in sector need specification.

Originality/value

This original tool is implemented via a website containing a practice PEARL, completed examples and a downloadable spreadsheet. Used across multiple sectors or for a single sector, PEARL may signal a need for additional resources, correct inequitable distribution decisions, simplify administrative monitoring and assessment, and foster greater accounting transparency in summary reports. PEARL also assists historical analysis of all disaster types to determine the completeness of public accounting records and the equity of fund distribution.

Details

Journal of Humanitarian Logistics and Supply Chain Management, vol. 6 no. 2
Type: Research Article
ISSN: 2042-6747

Article
Publication date: 4 October 2022

Paula de Santi Louzada, Tiago F.A.C. Sigahi, Gustavo Hermínio Salati Marcondes de Moraes, Izabela Simon Rampasso, Rosley Anholon, Jiju Antony and Elizabeth A. Cudney

Abstract

Purpose

This study aims to present an overview of, and to analyze, the Lean Six Sigma Black Belt (LSSBB) certifications offered by institutions operating in Brazil.

Design/methodology/approach

This research analyzed LSSBB certification courses offered by 48 institutions in Brazil by comparing the syllabi of the courses to the reference model proposed by the American Society for Quality (ASQ) in its Six Sigma Black Belt Body of Knowledge. The study employed content analysis and hierarchical cluster analysis to analyze the data.
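
The abstract does not describe how courses were represented for clustering. One plausible reading, sketched below with invented data, encodes each course as a binary vector over the ASQ Body of Knowledge techniques it covers and clusters courses by coverage similarity.

```python
# Hypothetical sketch of the hierarchical clustering step. Course names,
# coverage data, and the choice of Jaccard distance with average linkage
# are all assumptions for illustration.
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist

courses = ["Course A", "Course B", "Course C", "Course D"]
coverage = [         # rows: courses; columns: BoK techniques (1 = covered)
    [1, 1, 1, 0, 0],
    [1, 1, 1, 0, 1],
    [1, 0, 0, 1, 1],
    [1, 0, 0, 1, 0],
]

dist = pdist(coverage, metric="jaccard")   # distance between coverage profiles
tree = linkage(dist, method="average")     # agglomerative clustering
labels = fcluster(tree, t=2, criterion="maxclust")
print(dict(zip(courses, labels)))          # e.g. {'Course A': 1, ...}
```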

Findings

The results revealed a lack of standardization in the content of Lean Six Sigma (LSS) training in Brazil. Only four of the 108 techniques recommended by the ASQ Body of Knowledge were covered by 100% of the LSSBB courses analyzed (data types, measurement scales, sampling, and data collection plans and methods). In contrast, more than 75% of the courses covered all techniques related to the macro areas of organization-wide planning and deployment; organizational process management and measures; measure; and improve. The major shortcoming of LSS training relates to the Design for Six Sigma framework and methodologies macro area. LSS training in Brazil is highly concentrated in the country’s wealthiest region, where universities play a crucial role in disseminating LSS.

Originality/value

The literature lacks studies that critically examine LSS certification courses. There is little research on LSS in Brazil, and there are no studies on LSS training in the country.

Details

The TQM Journal, vol. 35 no. 7
Type: Research Article
ISSN: 1754-2731

Article
Publication date: 6 June 2018

Roland Erwin Suri and Mohamed El-Saad

Abstract

Purpose

Changes in file format specifications challenge the long-term preservation of digital documents. Digital archives thus often focus on specific file formats that are well suited for long-term preservation, such as the PDF/A format. Since only a few customers submit PDF/A files, digital archives may consider converting submitted files to the PDF/A format. The paper aims to discuss these issues.

Design/methodology/approach

The authors evaluated three software tools for batch conversion of common file formats to PDF/A-1b: LuraTech PDF Compressor, Adobe Acrobat XI Pro and 3-Heights™ Document Converter by PDF Tools. The test set consisted of 80 files, with 10 files each of the eight file types JPEG, MS PowerPoint, PDF, PNG, MS Word, MS Excel, MSG and “web page.”

Findings

Batch processing was sometimes hindered by stops that required manual intervention; depending on the software tool, three to four such stops occurred during batch processing of the 80 test files. Furthermore, the conversion tools sometimes failed to produce output files even for supported file formats: between three (Adobe Pro) and seven (LuraTech and 3-Heights™) PDF/A-1b files were not produced. Since Adobe Pro does not convert e-mails, a total of 213 PDF/A-1b files were produced. The faithfulness of each conversion was investigated by comparing the visual appearance of the input document with that of the produced PDF/A-1b document on a computer screen. Meticulous visual inspection revealed that the conversion to PDF/A-1b impaired the information content in 24 of the 213 converted files (11 percent). These reproducibility errors included loss of links, loss of other document content (unreadable characters, missing text, missing document parts), updated fields (reflecting the time and folder of conversion), vector graphics issues and spelling errors.

Originality/value

These results indicate that large-scale batch conversions of heterogeneous files to PDF/A-1b cause complex issues that need to be addressed for each individual file. Even with considerable efforts, some information loss seems unavoidable if large numbers of files from heterogeneous sources are migrated to the PDF/A-1b format.

Details

Library Hi Tech, vol. 39 no. 2
Type: Research Article
ISSN: 0737-8831

Article
Publication date: 10 April 2009

Robert J. Balik

Abstract

Purpose

Currently, many jobs for undergraduate finance majors require that the student demonstrate advanced Excel modeling skills. The purpose of this paper is to illustrate and explain the Excel Best Practices, which should enhance students’ financial modeling efficiency.

Design/methodology/approach

The focus is on teaching the Excel Best Practices in the context of financial modeling with Excel 2007. The paper uses a chronological modeling procedure that is consistent with current learning theory and with the way students should apply these Excel Best Practices. A capital budgeting replacement problem is used to illustrate many of them.
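
The paper’s models are built in Excel; purely as a language-neutral illustration of the kind of capital budgeting replacement calculation it describes, here is a minimal net-present-value sketch with invented figures.

```python
# Minimal sketch of a capital budgeting replacement decision: NPV of the
# incremental cash flows from replacing an old machine. All figures are
# invented; the paper builds this kind of model in Excel, not Python.
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] occurs at time 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

old_salvage = 20_000      # sale proceeds of the old machine today
new_cost = 100_000        # purchase price of the replacement
annual_savings = 25_000   # incremental after-tax operating savings
years, rate = 5, 0.10

flows = [old_salvage - new_cost] + [annual_savings] * years
print(f"NPV of replacement: {npv(rate, flows):,.0f}")  # positive -> replace
```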

Findings

It was found that using a chronological modeling procedure is consistent with current learning theory.

Originality/value

Using the procedures mentioned in this paper should result in efficient financial modeling. Efficient models are created in less time, have fewer errors, if any, and are designed for ease of use.

Details

Managerial Finance, vol. 35 no. 5
Type: Research Article
ISSN: 0307-4358

Book part
Publication date: 20 August 2018

Ronald Klimberg and Samuel Ratick

Abstract

During the past several decades, the decision-making process and the decision-makers’ role in it have changed dramatically. Because of this, the use of analytical tools such as Excel has become an essential component of most organizations. The analytical tools in Excel can provide today’s decision-maker with a competitive advantage. We illustrate several powerful Excel tools that facilitate the decision support process.

Details

Applications of Management Science
Type: Book
ISBN: 978-1-78756-651-4

Article
Publication date: 16 May 2016

Mohammad A. Rob and Floyd J. Srubar

Abstract

Purpose

The purpose of this study is to demonstrate how the large volumes of crime data that big cities already collect could be converted into significantly useful information by law enforcement agencies using readily available data warehouse and OLAP technologies. During the post-9/11 era, criminal data collection by law enforcement agencies received significant attention across the world. Rapid advancement of technology helped the collection and storage of these data in large volumes, but the data often do not get analyzed due to improper data formats, lack of technological knowledge and lack of time. Data warehousing (DW) and online analytical processing (OLAP) tools can be used to organize and present these data in a form strategically meaningful to the general public. In this study, the authors took a seven-month sample of crime data from the City of Houston Police Department’s website and cleaned and organized it into a data warehouse with the aim of answering common questions related to crime statistics in a big city in the USA.

Design/methodology/approach

The raw data for the seven-month period were collected from the website in Microsoft Excel spreadsheet format, one file per month. The data were then cleaned, described, renamed and formatted, and imported into an Access database, with facts and dimensions defined using a star schema. The data were then transferred to a Microsoft SQL Server data warehouse. SQL Server Analysis Services and the Visual Studio business intelligence tools were used to create a data cube for OLAP analysis of the summarized data.
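
As a compact illustration of the star schema and roll-up queries described here, the sketch below uses Python with SQLite rather than the Access/SQL Server stack the authors used; the table and column names are invented stand-ins.

```python
# Illustrative star schema for crime records: one fact table keyed to time
# and location dimensions, queried with an OLAP-style roll-up. SQLite and
# these names are stand-ins for the authors' SQL Server data warehouse.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_time(time_id INTEGER PRIMARY KEY, hour INTEGER, month TEXT);
CREATE TABLE dim_location(loc_id INTEGER PRIMARY KEY, street TEXT, premise TEXT);
CREATE TABLE fact_crime(crime_id INTEGER PRIMARY KEY, crime_type TEXT,
                        time_id INTEGER, loc_id INTEGER);
""")
con.executemany("INSERT INTO dim_time VALUES (?,?,?)",
                [(1, 8, "Jan"), (2, 11, "Jan"), (3, 22, "Feb")])
con.executemany("INSERT INTO dim_location VALUES (?,?,?)",
                [(1, "Main St", "residence"), (2, "Main St", "parking lot")])
con.executemany("INSERT INTO fact_crime VALUES (?,?,?,?)",
                [(1, "theft", 1, 1), (2, "theft", 2, 2), (3, "assault", 3, 1)])

# Roll-up: crimes by type on one street during early working hours (7-12).
query = """
SELECT f.crime_type, COUNT(*) AS n
FROM fact_crime f
JOIN dim_time t ON f.time_id = t.time_id
JOIN dim_location l ON f.loc_id = l.loc_id
WHERE l.street = 'Main St' AND t.hour BETWEEN 7 AND 12
GROUP BY f.crime_type
"""
print(con.execute(query).fetchall())  # [('theft', 2)]
```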

Findings

To demonstrate the usefulness of the DW and OLAP cube, the authors show a few sample queries displaying the number and types of crimes as a function of time of day, location, premises, etc. For example, they found that 98 crimes occurred on one major street in the city during early working hours (between 7 a.m. and 12 p.m.), when virtually nobody was at home, and that roughly two-thirds of those crimes were thefts. This summarized information is significantly useful to the general public and to law enforcement agencies.

Research limitations/implications

The authors’ research is limited to one city’s crime data, and other cities’ data sets might differ. Beyond the volume of data and the lack of descriptions, the major limitation encountered was the absence of major neighborhood names and their relation to streets. Other government agencies provide data to this effect, and a standard data set would facilitate the process. The authors also looked at data for the seven-month period only; analyzing data over many years would reveal crime trends over a longer period of time.

Practical implications

Many federal, state and local law enforcement agencies are rapidly embracing technology to publish crime data through their websites. However, more attention needs to be paid to the quality and utility of this information for the general public. At present, there exists no compiled source of crime data or of its trends as a function of time, crime type, location and premises. A coherent system is needed that allows an average citizen to obtain this information in a more consumable package; DW and OLAP tools can provide that package.

Social implications

Having the crime data of a big city in a consumable form is immensely useful for all segments of the constituency that government agencies serve, and it will become a service that these offices are expected to deliver on demand. This information could also be useful to decision makers, ranging from those seeking to start a business to those seeking a place to live who may not know which neighborhoods or parts of the city are more prone to criminal activity than others.

Originality/value

While there have been a few reports of possible uses of DW and OLAP technologies to study criminal data, the authors found that few of them used actual crime data, that the data sets and formats used differ in each case, that results are mostly not presented, and that the vendor technologies implemented can differ as well. In this paper, the authors present how DW and OLAP tools readily available in most enterprises can be used to analyze publicly available criminal datasets and convert them into meaningful information, which can be valuable not only to law enforcement agencies but to the public at large.

Details

Transforming Government: People, Process and Policy, vol. 10 no. 2
Type: Research Article
ISSN: 1750-6166

Book part
Publication date: 2 December 2021

Joseph G. Donelan and Yu Liu

Abstract

This chapter advocates a teaching approach for the statement of cash flows (SCF) that includes introducing the SCF early in the curriculum using the accounting equation format, which helps students visualize cash and accrual activities. We then adapt this accounting equation format to a worksheet model that can be used later in the curriculum with more complex data sets. This approach provides several advantages: (1) it maintains a consistent, accounting equation approach throughout; (2) it can be used for both the direct and the indirect report formats; (3) when used with Excel, the format is easier to explain, easier to use, and less prone to mechanical error than the worksheet approaches used in most textbooks; and (4) it is used by many professional accountants.
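
A minimal numeric sketch of the accounting-equation view the chapter advocates, with invented figures: rearranging Assets = Liabilities + Equity to isolate cash gives Cash = Liabilities + Equity - Noncash assets, so the period’s change in cash can be read off the changes on the right-hand side.

```python
# Hypothetical sketch of the accounting-equation format for the SCF:
# Cash = Liabilities + Equity - Noncash assets, so the change in cash equals
# the net change on the right-hand side. All figures are invented.
changes = {
    "liabilities": +15_000,      # e.g., new borrowing
    "equity": +40_000,           # e.g., net income less dividends
    "noncash_assets": +30_000,   # e.g., growth in receivables and equipment
}
delta_cash = (changes["liabilities"] + changes["equity"]
              - changes["noncash_assets"])
print(f"Change in cash for the period: {delta_cash:,}")  # 25,000
```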

Details

Advances in Accounting Education: Teaching and Curriculum Innovations
Type: Book
ISBN: 978-1-80071-702-2

Article
Publication date: 20 June 2016

Linda L. Rath

Abstract

Purpose

The purpose of this paper is to determine whether TAMS Analyzer and Viewshare are viable free and open-source tools for data creation and sharing for those with limited funding and technological skills.

Design/methodology/approach

The participant observer method was used to collect experiential evidence while applying the tools to a collection of text-, image-, and video-based digital cultural records.

Findings

TAMS Analyzer was found to be a low-barrier-to-entry tool for those with coding and qualitative data analysis experience. Those with general experience will be able to create datasets with the support of manuals and tutorials, while those with limited experience may find it difficult to use. Viewshare was found to be a low-barrier-to-entry tool for sharing data online, and accessible to all skill levels.

Research limitations/implications

TAMS Analyzer supports Mac and Linux platforms only, so a low-cost software recommendation was made for those in Windows environments.

Practical implications

Librarians can use these tools to address data access gaps while promoting library digital collections.

Social implications

With a greater understanding of data tools, librarians can be advisors, collaborators, agents for data culture, and relevant participants in digital humanities scholarship.

Originality/value

The research evaluates both the capabilities of the tools and the barriers to using or accessing them, which are often neglected. The paper addresses a need in the literature for greater scrutiny of tools that are a critical component of the data ecology, and will further assist librarians when connecting scholars to tools of inquiry in an environment with limited funding and technical support.

Details

Library Hi Tech, vol. 34 no. 2
Type: Research Article
ISSN: 0737-8831

Open Access
Article
Publication date: 11 September 2017

Alexandros Nikas, Haris Doukas, Jenny Lieu, Rocío Alvarez Tinoco, Vasileios Charisopoulos and Wytze van der Gaast

Abstract

Purpose

The aim of this paper is to frame the stakeholder-driven system mapping approach in the context of climate change, building on stakeholder knowledge of system boundaries, key elements and interactions within a system, and to introduce a decision support tool for managing this knowledge and visualising it as insightful system maps with policy implications.

Design/methodology/approach

This methodological framework is based on the concepts of market maps. The process of eliciting and visualising expert knowledge is facilitated by means of a reference implementation in MATLAB, which allows for designing technological innovation systems models in either a structured or a visual format.
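
The paper’s reference implementation is in MATLAB; purely to illustrate the data structure a system map implies, here is a minimal Python sketch of a directed graph whose edges carry promoting or hindering interactions. All element names are invented.

```python
# Illustrative only: the paper's tool is a MATLAB implementation. This sketch
# shows one way to encode a system map as a directed graph whose edges carry
# promoting (+1) or hindering (-1) interactions between system elements.
system_map = {
    ("carbon tax", "renewables investment"): +1,
    ("renewables investment", "grid flexibility"): +1,
    ("fossil fuel subsidies", "renewables investment"): -1,
}

def influences_on(element):
    """Elements that promote (+1) or hinder (-1) the given element."""
    return [(src, sign) for (src, tgt), sign in system_map.items()
            if tgt == element]

print(influences_on("renewables investment"))
# [('carbon tax', 1), ('fossil fuel subsidies', -1)]
```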

Findings

System mapping can contribute to evaluating systems for climate change by capturing knowledge of expert groups with regard to the dynamic interrelations between climate policy strategies and other system components, which may promote or hinder the desired transition to low carbon societies.

Research limitations/implications

This study explores how system mapping addresses gaps in analytical tools and complements the systems of innovation framework. Knowledge elicitation, however, must be facilitated and must build on a structured framework such as technological innovation systems.

Practical implications

This approach can provide policymakers with significant insight into the strengths and weaknesses of current policy frameworks based on tacit knowledge embedded in stakeholders.

Social implications

The developed methodological framework aims to include societal groups in the climate policy-making process by acknowledging stakeholders’ role in developing transition pathways. The system map codifies stakeholder input in a structured and transparent manner.

Originality/value

This is the first study that clearly defines the system mapping approach in the frame of climate policy and introduces the first dedicated software option for researchers and decision makers to use for implementing this methodology.

Details

Journal of Knowledge Management, vol. 21 no. 5
Type: Research Article
ISSN: 1367-3270
