Search results

1 – 6 of 6
Article
Publication date: 7 February 2018

Richard Allister Mills and Stefano De Paoli

Abstract

Purpose

The purpose of this paper is to further the debate on Knowledge Artefacts (KAs) by presenting the design of WikiRate, a Collective Awareness platform whose goal is to support a wider public in contributing to the generation of knowledge on the environmental, social and governance (ESG) performance of companies.

Design/methodology/approach

The material presented in the paper comes from the first-hand experience of the authors as part of the WikiRate design team. This material is reflexively discussed using concepts from the field of science and technology studies.

Findings

Using the concept of the “funnel of interest”, the authors discuss how the design of a KA like WikiRate relies on the designers’ capacity to translate general statements into particular design solutions. The authors also show how this funnelling helps in understanding the interplay between situativity and objectivity in a KA. They show how WikiRate is a peer-production platform based on situativity, which requires a robust level of objectivity to produce reliable knowledge about the ESG performance of companies.

Originality/value

This paper furthers the debate on KAs. It presents a relevant design example and offers in the discussion a set of design and community building recommendations to practitioners.

Details

Data Technologies and Applications, vol. 52 no. 1
Type: Research Article
ISSN: 2514-9288

Article
Publication date: 10 October 2023

Stefano De Paoli and Jason Johnstone

Abstract

Purpose

This paper presents a qualitative study of penetration testing, the practice of attacking information systems to find security vulnerabilities so that they can be fixed. The purpose of this paper is to understand whether and to what extent penetration testing can reveal various socio-organisational factors of information security in organisations. In doing so, the paper innovates theoretically by using Routine Activity Theory together with concepts from the phenomenology of information systems.

Design/methodology/approach

The articulation of Routine Activity Theory and phenomenology emerged inductively from the data analysis. The data consists of 24 qualitative interviews conducted with penetration testers, analysed with thematic analysis.

Findings

The starting assumption is that penetration testers are akin to offenders in a crime situation, dealing with targets and the absence of capable guardians. A key finding is that penetration testers described their targets as an installed base, highlighting how the vulnerabilities that make a target suitable often emerge from properties of the existing built digital environments. This includes systems that are forgotten or lack ongoing maintenance. Moreover, penetration testers highlighted that although testing is often predicated on planned methodologies, they frequently resort to serendipitous practices such as improvisation.

Originality/value

This paper contributes to theory, showing how Routine Activity Theory and phenomenological concepts can work together in the study of socio-organisational factors of information security. This contribution stems from considering that much research on information security focuses on the internal actions of organisations. The study of penetration testing as a proxy of real attacks allows novel insights into socio-organisational factors of information security in organisations.

Details

Information Technology & People, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0959-3845

Book part
Publication date: 20 August 2020

Paolo Ruffino

Abstract

This chapter explores what video games can teach us in light of the ongoing sixth mass extinction in the history of our planet, allegedly caused by global warming and the over-consumption of vital resources. Games made and played by nonhuman actors can shed light on the situatedness and partiality of our knowledge regarding the boundaries that separate and differentiate human and nonhuman, interactivity and passivity, entertainment and boredom, and life and death. Nonhuman games help us to articulate the space and time in-between these dualisms and have the potential to re-route gaming (and game studies) from false myths of agency, interactivity, and instrumentalism, and the masculinism inherent in these notions. Nonhuman games are companions for earthly survival, and as such they can be taken as useful references when considering a more ethical approach to the ecological crisis of the Anthropocene. The chapter investigates notions of posthumanism, interpassivity, and contemporary critiques of the early assumptions of game studies on the agency of human players. It looks at video games that play by themselves, idle and incremental games, and the emergence of nonplaying characters in ludic and open-world simulations. It explores forms of automatic play and the use of bots and Artificial Intelligence (AI) in online role-playing games, procedurally generated virtual environments, and games that far exceed the lifespan of their players.

Details

Death, Culture & Leisure: Playing Dead
Type: Book
ISBN: 978-1-83909-037-0

Article
Publication date: 8 June 2015

David Martín-Moncunill, Miguel-Ángel Sicilia-Urban, Elena García-Barriocanal and Salvador Sánchez-Alonso

Abstract

Purpose

Large terminologies usually contain a mix of terms that are either generic or domain specific, which makes the use of the terminology itself a difficult task that may limit the positive effects of these systems. The purpose of this paper is to systematically evaluate the degree of domain specificity of the AGROVOC controlled vocabulary terms as a representative of a large terminology in the agricultural domain and discuss the generic/specific boundaries across its hierarchy.

Design/methodology/approach

A user-oriented study with domain experts was conducted in conjunction with a quantitative and systematic analysis. First, an in-depth analysis of AGROVOC was carried out to make a proper selection of terms for the experiment. Then domain experts were asked to classify the terms according to their domain specificity. An evaluation was conducted to analyse the domain experts’ results. Finally, the resulting data set was automatically compared with the terms in SUMO, an upper ontology, and MILO, a mid-level ontology, to analyse the coincidences.
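
The automatic comparison step can be pictured with a minimal sketch (not taken from the paper), assuming the expert labels and the ontology term labels are available as plain term collections; the function name and the simple presence-based rule are assumptions, since the paper’s exact matching procedure is not reproduced here.

    def flag_generic_terms(vocab_terms, upper_ontology_terms, expert_labels):
        """Compare expert judgements of domain specificity with term presence
        in an upper- or mid-level ontology (e.g. SUMO or MILO labels).

        vocab_terms: list of terms drawn from the controlled vocabulary.
        upper_ontology_terms: iterable of term labels from the ontology.
        expert_labels: dict mapping term -> "generic" or "domain-specific".
        Returns the fraction of terms where the two judgements coincide.
        """
        upper = {t.lower() for t in upper_ontology_terms}
        agreement = 0
        for term in vocab_terms:
            # Presence in an upper ontology is taken as a weak signal of genericity.
            predicted = "generic" if term.lower() in upper else "domain-specific"
            if predicted == expert_labels.get(term):
                agreement += 1
        return agreement / len(vocab_terms)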

Findings

Results show the existence of a high number of generic terms. The motivation behind several of the unclear cases is also described. The automatic evaluation showed that there is no direct way to assess the degree of specificity of a term using the SUMO and MILO ontologies; however, it provided additional validation of the results gathered from the domain experts.

Research limitations/implications

The “domain analysis” concept has long been discussed and can be addressed from different perspectives. A summary of these perspectives and an explanation of the approach followed in this experiment are included in the background section.

Originality/value

The authors propose an approach to identify the domain specificity of terms in large domain-specific terminologies and a criterion to measure the overall domain specificity of a knowledge organisation system, based on domain experts’ analysis. The authors also provide a first insight into the use of automated measures to determine the degree to which a given term can be considered domain specific. The resulting data set from the domain experts’ evaluation can be reused as a gold standard for further research on these automatic measures.

Details

Online Information Review, vol. 39 no. 3
Type: Research Article
ISSN: 1468-4527

Article
Publication date: 19 June 2019

Cheng Zhong and Alexandra Komrakova

Abstract

Purpose

This paper aims to demonstrate the capabilities of a diffuse interface free energy lattice Boltzmann method to perform direct numerical simulations of liquid–liquid dispersions in a well-controlled turbulent environment. The goal of this research study is to develop numerical techniques that can visualize and quantify drop interaction with the turbulent vortices. The obtained information will be used for the development of sub-models of drop breakup for multi-scale simulations.

Design/methodology/approach

A pure binary liquid system is considered that is subject to a fully developed, statistically stationary turbulent flow field in a cubic, fully periodic box with an edge size of 300 lattice units. Three turbulent flow fields with varying energy input are examined and their coherent structures are visualized using a normalized Q-criterion. The evolution of the liquid–liquid interface is tracked as a function of time. A detailed explanation of the numerical method is provided, with emphasis on the choice of numerical parameters.
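
As an illustrative sketch only (not code from the paper), the Q-criterion used for vortex visualisation can be computed from the velocity gradient tensor as Q = 1/2 (|Omega|^2 - |S|^2), where S and Omega are the strain-rate and rotation tensors; regions with Q > 0 are rotation-dominated. The function name, grid spacing and normalisation below are assumptions.

    import numpy as np

    def q_criterion(u, v, w, dx=1.0):
        # Central differences with periodic wrap-around, consistent with the
        # fully periodic simulation box described above.
        def ddx(f, axis):
            return (np.roll(f, -1, axis=axis) - np.roll(f, 1, axis=axis)) / (2.0 * dx)

        vel = (u, v, w)
        Q = np.zeros_like(u)
        for i in range(3):
            for j in range(3):
                dui_dxj = ddx(vel[i], j)
                duj_dxi = ddx(vel[j], i)
                S = 0.5 * (dui_dxj + duj_dxi)   # strain-rate tensor component
                W = 0.5 * (dui_dxj - duj_dxi)   # rotation tensor component
                Q += 0.5 * (W ** 2 - S ** 2)    # accumulate 1/2 (|Omega|^2 - |S|^2)
        return Q

    # Example normalisation (an assumption; the paper's choice is not stated here):
    # Qn = Q / np.max(np.abs(Q)); vortices are then visualised as iso-surfaces of Qn > 0.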

Findings

Drop breakup mechanisms differ depending on the energy input. Drops break due to interaction with the vortices. Quantification of the turbulent structures shows that the size of the vortices increases as the energy input decreases. A drop interacts simultaneously with multiple vortices of a size comparable to or smaller than the drop size. Vortices smaller than the drop disturb its interface and pinch off satellites. Vortices of a size comparable to the drop size tend to elongate the drop and tear it apart, producing daughter drops and satellites. The addition of the second phase enhances turbulent dissipation at high wavenumbers. To obtain physically realistic two-phase energy spectra, the multiple-relaxation-time collision operator should be used.

Originality/value

Detailed information of drop breakup in the turbulent flow field is crucial for the development of drop breakup sub-models that are necessary for multi-scale numerical simulations. The improvement of numerical methods that can provide these data and produce reliable results is important. This work made one step towards a better understanding of how drops interact with the turbulent vortices.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 29 no. 7
Type: Research Article
ISSN: 0961-5539

Article
Publication date: 13 July 2022

Virendra Kumar Verma, Sachin S. Kamble, L. Ganapathy and Pradeep Kumar Tarei

Abstract

Purpose

The purpose of this study is to identify, analyse and model the post-processing barriers of 3D-printed medical models (3DPMM) produced by fused deposition modelling, in order to overcome these barriers and improve operational efficiency in the Indian context.

Design/methodology/approach

The methodology used interpretive structural modelling (ISM), cross-impact matrix multiplication applied to classification (MICMAC) analysis and the decision-making trial and evaluation laboratory (DEMATEL) method to understand the hierarchical and contextual relations among the post-processing barriers.

Findings

A total of 11 post-processing barriers were identified in this study using ISM, a literature review and experts’ input. The MICMAC analysis identified support material removal, surface finishing, cleaning, inspection and issues with quality consistency as significant driving barriers for post-processing; it also identified linkage barriers and dependent barriers. The ISM digraph model was developed using a final reachability matrix, which helps practitioners specifically tackle post-processing barriers. Further, the DEMATEL method allows practitioners to focus on the causal effects of the post-processing barriers and guides them in overcoming these barriers.
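
The MICMAC step can be illustrated with a short sketch under standard ISM conventions (this is not the authors’ implementation): driving power is taken as the row sum and dependence power as the column sum of the final reachability matrix, and the quadrant threshold of n/2 is an assumption.

    import numpy as np

    def micmac(adjacency):
        # adjacency: square 0/1 matrix; entry (i, j) = 1 means barrier i influences barrier j.
        R = adjacency.astype(bool)
        n = R.shape[0]
        np.fill_diagonal(R, True)           # every barrier reaches itself
        for k in range(n):                  # Warshall transitive closure -> final reachability matrix
            R = R | (R[:, [k]] & R[[k], :])
        driving = R.sum(axis=1)             # row sums: driving power
        dependence = R.sum(axis=0)          # column sums: dependence power
        mid = n / 2.0                       # assumed threshold separating the MICMAC quadrants
        labels = []
        for d, p in zip(driving, dependence):
            if d > mid and p > mid:
                labels.append("linkage")
            elif d > mid:
                labels.append("driving")
            elif p > mid:
                labels.append("dependent")
            else:
                labels.append("autonomous")
        return driving, dependence, labels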

Research limitations/implications

A few post-processing barriers may have been overlooked by the Indian experts that could be important from other countries’ perspectives.

Practical implications

The presented ISM model and DEMATEL analysis provide directions for operations managers in planning operational strategies for overcoming post-processing issues in the medical 3D-printing industry. Managers may also formulate operational strategies based on the driving and dependence power of the post-processing barriers, as well as their causal-effect relationships.

Originality/value

This study contributes to identifying, analysing and modelling the post-processing barriers of 3DPMM through a combined ISM and DEMATEL methodology, a combination that has not been examined before. The study also helps decision makers develop suitable strategies to overcome the post-processing barriers for improved operational efficiency.

Details

Rapid Prototyping Journal, vol. 29 no. 1
Type: Research Article
ISSN: 1355-2546
