Abstract
Background: The South African banking sector is renowned as world-class, exhibiting ample capital, cutting-edge technology, robust infrastructure, and a rigorous regulatory and supervisory framework. Business intelligence systems are fundamental to providing the data-driven decision support needed to keep the banking industry competitive. The problem is that many business intelligence (BI) reports are not used optimally, or not used at all, because of usability problems.
Objectives: The objective is to report on an investigation into BI usability criteria and to propose evidence-based, critical usability criteria for BI reports in the banking industry.
Method: A mixed-methods research design, guided by a pragmatist philosophy, was employed. Usability criteria for BI systems drawn from the literature formed the basis of a survey of employees at a South African bank. The company issues log (CIL) on BI reports was analysed to investigate the mapping between the reported issues and usability requirements. Interviews were conducted to validate the findings from the survey and the CIL analysis.
Results: The triangulation of the findings from the survey, CIL data analysis and the interviews revealed discrepancies, which were used to refine the initial set of criteria towards proposing the critical usability criteria for the BI banking context.
Conclusion: Despite the discrepancy between the survey findings and the CIL analysis, the reports’ usability issues were confirmed by the interview findings.
Contribution: The scientific contribution is the evidence-based critical usability criteria for South African BI banking reports. On a practical level, the usability criteria can be applied to design and evaluate financial reports.
Keywords: business intelligence; BI; usability; usability criteria; financial reports.
Introduction
South African (SA) banks rank among the largest banking corporations on the African continent (Smith 2021). The SA banking sector is renowned as world-class, boasting ample capital, cutting-edge technology, robust infrastructure and a rigorous regulatory and supervisory framework (Matemilola, Bany-Ariffin & Muhtar 2015). The prominent banks in South Africa include the Amalgamated Banks of South Africa (ABSA), Standard Bank Group, Nedbank Group and FirstRand Bank (Ramavhona & Mokwena 2016). The provision of banking services in the SA region holds significant importance, catering to a diverse range of needs for businesses, consumers and investors (Smith 2021).
Informed decision-making within the banking sector necessitates business intelligence (BI) to facilitate the transformation of data into actionable insights, instrumental in making astute business choices (Hočevar & Jaklič 2010). Business intelligence assists organisations in formulating a strategy harmonious with their organisational context, ensuring that implementation paves the way for sound decisions that enhance overall performance (Nithya & Kiruthika 2021). Business intelligence projects encompass the implementation of BI solutions, involving the design, development and deployment of business intelligence systems (BIS) tailored to an organisation’s needs.
A significant percentage of BI projects fail (Ain et al. 2019); Olszak (2016) quantified that to be between 60% and 70%. Isik and Sidorova (2011) suggested that these failures occur when organisations decide to adopt BI without a clear understanding of the critical capabilities that define the success of such applications. The primary reason for the failure of BI projects is the lack of user adoption (Villamarín & Diaz Pinzon 2017). The adoption and usage of BI depend on a network of interconnected factors, with usability being pivotal: usability influences adoption by ensuring ease of use and ease of learning (Thowfeek & Salam 2014). Furthermore, usability plays a crucial role in enabling users to access information and generate outputs (Macías & Borges 2024).
The study is presented in four parts. The first part covers the literature review of BI and usability, which is followed by the research design and the presentation of the findings. The study concludes by discussing the findings. The study is guided by the main research question, namely: ‘What are the critical usability criteria that should be used to evaluate BI banking reports?’ In response, the main question was decomposed into four sub-research questions, each mapped to a research action as depicted in Table 1.
TABLE 1: Research sub-questions mapped to research actions.
Literature review
The literature review aims to offer conceptual clarification of the key concepts and theoretical frameworks underpinning this study. The next section will discuss BI systems and that will be followed by a section on the usability of BI systems.
Business intelligence systems
Munoz (2017) proposed two perspectives on BI: the first views BI broadly, encompassing all data-gathering initiatives of an enterprise; the second entails only the information technology angle relating to software services. Many authors use the terms BI and BIS interchangeably, so it is not always possible to disambiguate their perspectives. In this study, the term BI refers to the broad perspective where the goal is improving the timeliness and quality of business-related information. The term BIS will be used when referring only to the information technology perspective. Within the realm of BI, there exist several distinct processes, concepts and components that collectively enhance an organisation’s decision-making capabilities. To elucidate their interrelations, consider Figure 1, which shows sub-themes within processes, concepts and components, explained as follows:
- Processes are a crucial component of BI; these include processes for loading data, be it real-time or historical, while filtering out irrelevant information, storing data in the data warehouse (DW) and supporting analysis by maintaining metadata (Wixom & Watson 2010).
- There are four key concepts of BI, namely, data collection (which involves extraction of data and consolidating the data into the DW), analysis (which involves accessing and analysing data), visualisation (which involves the creation of reports and dashboards) and decision-making (Xlogiatech 2023).
- The components of BI are essential for the virtual facilitation of decision-making (Bharadiya 2023). Key components of BI include DW; data sources; data mining; extract, transform and load (ETL); and online analytical processing (OLAP) (Wixom & Watson 2010).
FIGURE 1: Business intelligence processes, concepts and components based on Tikait (2023).
A user can access the data from the DW in different ways, for example, through OLAP, reports, dashboards, data mining and catalogues. Business intelligence reports play a pivotal role in organisations (Mansell & Ruhode 2019; Tavera Romero et al. 2021). Organisations employ reports to discern performance trends and financial anomalies (Wise 2012). To maintain competitiveness, interactive reports with drill-through capabilities and dashboard visualisations are essential (Wise 2012). Business intelligence dashboards, a subset of BI tools, offer a mechanism for business owners and executives to explore data through visual interfaces (Quamar et al. 2020). These dashboards condense performance metrics into easily digestible visualisations (Hansoti 2010). For instance, utilising OLAP, data are presented in data cubes, fostering quick analysis by transforming raw data into a comprehensible format (Jinpon, Jaroensutasinee & Jaroensutasinee 2011). Figure 2 shows the BI front-end access point, which is used by the users to access data, in different forms. This view includes the reporting, dashboard and OLAP components. The next section introduces the usability aspect of BI systems.
FIGURE 2: Front-end business intelligence access point based on Wise (2012).
Usability of business intelligence systems
Usability means ensuring that interactive products are easy to learn, effective to use and enjoyable from the user’s perspective (Preece, Rogers & Sharp 2002). According to the ISO 9241-11 (2018) definition, usability is the extent to which specified users can use a product to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use (ISO 9241-11 2018; Tullis & Albert 2008).
Towards identifying the relevant usability criteria, this study adopted a narrative (traditional) literature review (NLR) and a systematic literature review (SLR). An NLR is a qualitative method applying traditional reviews of the existing and past academic articles towards understanding concept relationships (Yang & Tate 2012). An SLR is a systematic examination of the scholarly literature about the research topic that critically analyses, evaluates and synthesises research findings, theories and practices by scholars and researchers related to an area of focus (Efron & Ravid 2018). While the purpose of the narrative literature review is to provide an assessment of the state of knowledge in a problem domain and identify weaknesses and needs for further research, the purpose of the SLR is to address a highly specific research question for which evidence from the literature is sought (Boell & Cecez-Kecmanovic 2015). This difference in purpose, methodology and especially outcome motivates the use of the two types of literature reviews.
The NLR was conducted to assess the state of knowledge in the domain of BI usability. An additional contribution was the 35 usability criteria identified from the literature, of which 12 were selected based on their relevance to the study’s goals. The SLR was conducted in 2019 to identify the usability criteria proposed for BI. The following selection criteria were used: articles published from 2010 to 2019 and selected from the Association for Computing Machinery (ACM), Institute of Electrical and Electronics Engineers (IEEE) and Scopus database engines. To maintain coherence and consistency across the selection process, the initial database query used the search string: ‘usability criteria’ AND ‘business intelligence’. Figure 3 depicts the SLR selection process, including the steps undertaken to refine and select pertinent sources. This search yielded a total of five sources from IEEE, two from ACM and nine from Scopus. This small number demonstrates the lack of research into the usability of BI systems. The combination of these sources resulted in a pool of 16 research articles. Subsequent analysis led to the exclusion of one source because its format was not consistent with that of a research article, three were duplicates and three did not have the keywords ‘usability criteria’ in their title, keywords or full text. The remaining nine articles were retained for full analysis towards identifying BI usability criteria.
FIGURE 3: Systematic literature review selection process.
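To make the selection arithmetic in Figure 3 explicit, the following minimal Python sketch tallies the screening counts reported above; the data structures are purely illustrative and are not the screening tool used in the study.

```python
# Illustrative tally of the SLR screening counts reported above; the
# dictionaries are hypothetical and only make the arithmetic explicit.
hits_per_database = {"IEEE": 5, "ACM": 2, "Scopus": 9}
pooled = sum(hits_per_database.values())          # 16 articles pooled

exclusions = {
    "not in research-article format": 1,
    "duplicate": 3,
    "'usability criteria' absent from title, keywords and full text": 3,
}
retained = pooled - sum(exclusions.values())      # 9 articles for full analysis

print(f"pooled={pooled}, excluded={sum(exclusions.values())}, retained={retained}")
```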
The SLR yielded a list of 20 usability criteria from which six criteria were selected based on their relevance to this study. The NLR usability criteria and SLR usability criteria were merged to create a list of 14 literature-based usability criteria (LBUC) (see Table 2) in response to SRQ1.
TABLE 2: Literature-based usability criteria.
The LBUC was refined through the findings of the survey, document analysis and interviews to identify the set of critical usability criteria, which is the main scientific contribution of this study. The next section presents the research design for this study.
Research methods and design
Application context
The research was conducted at one of the big four banks in South Africa. This bank was selected because one of the authors was employed at the bank. The participants were bank employees who have experience with BI banking reports, either creating reports or using the reports. The bank is not named in this article; the anonymity agreement and other ethical arrangements are described in the ‘Ethical considerations’ section.
Method
This study employed mixed methods research (MMR) to evaluate the usability of BI banking reports. The MMR was adopted based on Cameron’s (2010) and Creswell and Plano Clark’s (2011) five Ps framework (Paradigms, Pragmatism, Praxis, Proficiency and Publishing), which include philosophical considerations and approaches, methodological choices and processes, competencies, practicalities and political considerations (Cameron 2010). An overview of the five Ps is provided next.
Paradigms
A pragmatic approach was adopted for this study, to guide the research subject, the activities and the nature of the research outputs as advocated by Pickard (2013).
Pragmatism
Pragmatism concerns action, change and the interplay between knowledge and action (Goldkuhl 2012). It is a method for attaining clarity of ideas within a normative conception of logic, following the norms for continuing, self-correcting enquiry directed towards truth (Baran & Jones 2016; Goldkuhl 2012). A major underpinning of pragmatism as philosophy is that knowledge and reality are based on beliefs and habits that are socially constructed (Kaushik & Walsh 2019; Shannon-Baker 2015). This study gathers insights from the participants creating or using the BI reports for their operational and strategic work. As a result, the philosophical assumption underlying this study was that of pragmatism.
Praxis
According to Creswell (2010), praxis refers to the adoption and use of mixed methods. Mixed methods involve combining or integrating qualitative and quantitative research and data in a research study (Creswell 2014). Nastasi, Hitchcock and Brown (2010:318) suggested integrating MMR designs and research design typologies, thereby identifying themes that reflect an integrated perspective about ‘precursors and basic design criteria’: types of methods or data mixed, the timing of mixing, breadth of mixing, rationale for mixing and researcher orientation. Multiple mixed methods designs exist; in this study, quantitative research was conducted first, the results were analysed and then qualitative research was conducted to help explain the quantitative results in more detail. This type of design has been described as explanatory sequential mixed methods (Creswell & Plano Clark 2018; Hirose & Creswell 2023).
Data collection and capturing
The research method included a survey, secondary data (extracted from the CIL) and interviews as discussed in the following sub-sections. The sample population for surveys and interviews was obtained from the business unit’s central email address database, which consists of all the employees who work at the selected business unit of the South African bank.
Survey
The survey questionnaire (see Appendix 1) consisted of two sections: the first section captured the participants’ demographic information and the second section contained questions based on the LBUC (see Table 2), which participants rated using a five-point Likert scale ranging from ‘strongly disagree’ to ‘strongly agree’ (Brooke 2013). Participants were contacted via a central email list (containing 274 employee email addresses) with the survey questionnaire link, and 98 responded. The survey data were gathered using Google Forms and MS Forms.
Company issues log data extraction
The banking institution has a team that develops and maintains the BI banking reports to ensure that the reports are kept up to date and that issues are resolved as they arise. This team uses a portal, referred to as the company issues log (CIL) portal, where stakeholders log requests relating to BI reports, as well as report portal or database issues experienced by employees while doing their work at the bank. The CIL data extraction for this study captured issues relating to the BI banking reports. The CIL data were retrieved for the period June 2020 to March 2021 and then formatted in an MS Word document in preparation for analysis. Thematic analysis (TA) was used to code the extracted issues, as advocated by Braun and Clarke (2006).
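As a rough illustration of this extraction step, the sketch below filters a hypothetical CIL export to the study window (June 2020 to March 2021) and to records mentioning BI reports. The file name and column names (logged_date, description) are assumptions; in the study itself the extract was prepared manually in an MS Word document.

```python
# Hedged sketch of the CIL extraction step: restrict a hypothetical CSV export
# to the study window and to records relating to BI banking reports.
# Column and file names are assumptions, not the actual portal export format.
import pandas as pd

cil = pd.read_csv("cil_export.csv", parse_dates=["logged_date"])

in_window = cil[(cil["logged_date"] >= "2020-06-01") & (cil["logged_date"] <= "2021-03-31")]

# Simplified keyword filter for records that mention reports.
bi_report_issues = in_window[in_window["description"].str.contains("report", case=False, na=False)]

bi_report_issues.to_csv("cil_bi_report_issues.csv", index=False)  # input for thematic coding
```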
Interviews
Prior to conducting the interviews, the researcher randomly selected interviewees using the central email address list, while also ensuring that the interviewees were not known to her personally. Once a random list was compiled, the researcher emailed the participant consent form and the information sheet. Interviews were held to obtain the participants’ feedback about the usability issues of the BI banking reports. The responses were recorded with transcription (offered by MS Teams), and the researcher also transcribed the recordings by re-listening to them and formatting the transcription in MS Word. The interview (see Appendix 2 for the interview questionnaire) had three sections. The first section captured demographic information and a binary question regarding the participants’ satisfaction with the BI banking reports. The second section contained the semi-structured interview questions regarding the BI banking issues identified from the CIL data extraction findings, while the third section used a Likert rating scale of 1 to 5 to check whether the participants were satisfied with the BI banking reports. Because of the coronavirus disease 2019 (COVID-19) regulations at the time, the interviews were conducted via Microsoft Teams.
Data analysis – Statistical analysis of survey data
Descriptive statistical analysis and correlation analysis were used to analyse the data collected during the survey. The descriptive analysis for this study is reported in a table containing the number of participants, the minimum, maximum, mean and standard deviation (see Table 3). Graphs and tables were created using the Statistical Package for the Social Sciences (SPSS) version 27. For this study, the correlation analysis aimed to quantify the degree to which two constructs are related (see Table 4). A Pearson product-moment correlation was used to quantify the correlation between the constructs; a Pearson correlation measures the linear association between two normally distributed random variables (Schober, Boer & Schwarte 2018).
TABLE 3: Measures of variability and measure of central tendency.
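For readers who want to reproduce this style of analysis outside SPSS, the sketch below computes Table 3-style descriptives and Table 4-style Pearson correlations with pandas and SciPy; the input file and construct column names are assumptions.

```python
# Illustrative re-computation of Table 3-style descriptives and Table 4-style
# Pearson correlations with pandas and SciPy (the study itself used SPSS).
# The input file and construct column names are assumptions.
import pandas as pd
from scipy import stats

survey = pd.read_csv("survey_construct_scores.csv")
constructs = ["learnability", "robustness", "design", "flexibility", "usability_goals"]

# N, minimum, maximum, mean and standard deviation per construct (cf. Table 3).
print(survey[constructs].agg(["count", "min", "max", "mean", "std"]).T)

# Pairwise Pearson product-moment correlations with p-values (cf. Table 4).
for i, a in enumerate(constructs):
    for b in constructs[i + 1:]:
        r, p = stats.pearsonr(survey[a], survey[b])
        print(f"{a} vs {b}: r = {r:.3f}, p = {p:.4f}", "(significant)" if p < 0.05 else "")
```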
Data analysis – Thematic analysis of company issues log data extraction and interviews
Thematic analysis was used to analyse the data collected from CIL data extraction and interviews. Thematic analysis is a qualitative research method for describing data and involves interpretation in selecting codes and constructing themes (Braun & Clarke 2006, 2021; Kiger & Varpio 2020). Thematic analysis approaches typically acknowledge the potential for inductive (data-driven) and deductive (theory-driven) orientations to coding, capturing semantic (explicit or overt) and latent (implicit, underlying, not necessarily unconscious) meanings, processes of coding, theme development and the potential for some flexibility around the theory that frames the research (Braun & Clarke 2021; Kiger & Varpio 2020). The six phases of analysis when using TA, as suggested by Braun and Clarke (2006), are now delineated (a minimal coding sketch follows the list):
- Phase 1: Familiarising researcher with the data – in this phase, the researchers needed to familiarise themselves with the data.
- Phase 2: Generating initial codes – this phase involved generating initial codes based on the CIL data. This phase involved reviewing the data from the documents, one line item at a time and creating initial codes using specific words or text strings from the documents. The researcher used an inductive coding approach to create the codes.
- Phase 3: Searching for themes – this phase involved searching for themes and engaging with the initial codes (from the second phase) by collating all related, relevant data extracts as advocated by Van Biljon and Mwapwele (2023). The researcher used a deductive coding approach to create the themes.
- Phase 4: Reviewing themes – this phase focuses on refining the code groups, with each being assessed for internal homogeneity and external heterogeneity as advocated by Braun and Clarke (2006). In this phase, the researcher reviewed the allocated codes as well as the code groups. The researcher also revisited the codes allocated to each code group and ensured that the relevant codes were allocated to the correct code groups.
- Phase 5: Defining and naming themes – this phase involved defining and naming themes obtained during phase four as advocated by Braun and Clarke (2006) and Scharp and Sanders (2019). In this phase, the thinking behind the code groups was reconsidered and based on Phase 1, around the data capturing and familiarisation.
- Phase 6: Producing the report – this phase involved producing the research report – a complete data-based story to allow the reader to understand the merit and validity of the analysis.
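As a minimal, purely illustrative sketch of Phases 2 to 4 (the study used ATLAS.ti for this), initial codes can be pictured as labels attached to data extracts that are collated into code groups and then pruned to those relevant to BI banking reports; the codes and groups below are invented placeholders, not the study's codebook.

```python
# Minimal sketch of Phases 2-4: initial codes are collated into code groups
# (candidate themes), and codes unrelated to BI banking reports are dropped.
# Codes and groups are invented placeholders.
initial_codes = {
    "report not loading": "Incident",
    "add extra filter to report": "Change request",
    "sales totals look wrong": "Investigation",
    "printer out of toner": None,      # unrelated code, discarded during review
}

code_groups: dict[str, list[str]] = {}
for code, group in initial_codes.items():
    if group is not None:              # Phase 4: retain only relevant codes
        code_groups.setdefault(group, []).append(code)

print(code_groups)
```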
Ethical considerations
The research was conducted at one of the big four banks in South Africa. During the ethical clearance process with the bank, the researcher agreed that no data point that could in any way reveal the bank’s name would be reported, because of legal considerations including the potential loss of its operating licence. Ethical clearance was obtained from the College Research and Ethics Committee (CREC) at the University of South Africa as well as from the South African bank at which this study was conducted. Approval was granted in November 2020, reference number 2020/CSET/SOC/031, and written consent was required from all participants prior to participating in the study. Participants’ responses were identified by a number only, analysed and stored securely in accordance with the undertaking in the ethics application, which aligns with the World Health Organization’s guidelines.
Results and findings
The findings from the survey, CIL data extraction and interviews are presented in this section.
Survey
Demographics
A total of 274 questionnaires were distributed, of which 98 were returned. The sample comprised 58 male participants (59.2%) and 40 female participants (40.8%). Figure 4 depicts that most of the participants (63.3%) fell into the age group of 25–34 years, followed by the 35–44 years age group (28.6%); participants aged 45 years and above accounted for 5.1%, those aged between 18 and 24 years accounted for 2%, while 1% of the participants preferred not to disclose their age.
Figure 5 depicts the sample distribution by BI user experience. Considering the participants’ exposure to BI, the results revealed that most of the participants (45.92%) have been a BI user for 49 or more months, 21.43% for 25–48 months, 17.35% for 13–24 months, 10.20% for 0–3 months and 5.10% for 4–12 months.
FIGURE 5: Sample distribution by business intelligence user experience.
Figure 6 illustrates the participants’ BI experience areas. Most participants (38.776%) have experience with both the reports and dashboards areas of BI, while 10.204% have experience with data mining, reports and dashboards; 9.184% have experience with dashboards; 8.163% have experience with OLAP cubes, data mining, catalogues and reports; 7.143% have experience only with reports; and 5.102% have experience with OLAP cubes, reports and dashboards. Three combinations each account for 4.082% of participants: (1) data mining, reports, dashboards and catalogues; (2) OLAP cubes, data mining and reports; and (3) dashboards and reports, dashboards and catalogues. The combinations of data mining and reports, and of OLAP cubes, catalogues, reports and dashboards, each account for 2.041% of participants. Lastly, five combinations each account for 1.020% of participants: (1) data mining; (2) data mining and dashboards; (3) data mining, catalogues and reports; (4) data mining, OLAP cubes and reports; and (5) OLAP cubes, reports, dashboards and data mining. Most of the participants (79.6%) have used the BI system, while 13.3% are unsure and 7.1% have not used the BI system. These demographics demonstrate that the participants had adequate exposure to and experience of the BI banking system.
FIGURE 6: Business intelligence areas represented.
Measures of variability and central tendencies
The means and standard deviations for the learnability, robustness, design, flexibility and usability goals constructs are depicted in Table 3. Likert scale frequencies were used to describe the different variables, where 1 represents ‘strongly disagree’, 2 ‘disagree’, 3 ‘neutral’, 4 ‘agree’ and 5 ‘strongly agree’.
From Table 3, the following can be observed:
- The mean for learnability is 3.8878, which means most of the sampled population had no problem with the learnability. Given their relatively high experience levels, it is possible that their prior knowledge of banking reports helped them to use the report.
- The mean for robustness is 3.6122, which means most of the sampled population had no problems recovering the work from an unexpected situation (e.g. a power cut). Furthermore, it is easy to take corrective actions such as undo once the error has been recognised and the report site provides feedback to indicate continued progress.
- The mean for design is 3.9252. Design is a complex concept; in this context, it refers to the reports’ ability to provide informative responses to the user. That is influenced by the structure, the relevance of the information and the functionalities provided.
- Most of the sampled population (3.6939 on average) agree that the user can customise the report according to their priorities.
- Most of the sampled population (3.9184 on average) agree that the report provides the user with the information they require to achieve their goals and that the report aids the user in completing their tasks on time.
Notably, all the mean values become 4 when rounded to the nearest integer. Therefore, the findings indicate no significant dissatisfaction with the usability of the BI banking reports as measured by these constructs.
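As a quick arithmetic check of this observation, the sketch below rounds the construct means reported above; mapping the last two means to flexibility and usability goals follows the order in which the constructs are listed in the text.

```python
# Quick check that every construct mean reported above rounds to 4.
# The assignment of 3.6939 and 3.9184 to flexibility and usability goals
# follows the construct order given in the text (an assumption).
construct_means = {
    "learnability": 3.8878,
    "robustness": 3.6122,
    "design": 3.9252,
    "flexibility": 3.6939,
    "usability_goals": 3.9184,
}
assert all(round(mean) == 4 for mean in construct_means.values())
print({name: round(mean) for name, mean in construct_means.items()})
```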
Correlations
According to the correlations depicted in Table 4, all the constructs’ correlations have a p-value of < 0.05, which indicates that the correlation between the constructs is significant (Field 2013). The findings thus confirm that all the variables are correlated and that the changes in one construct affect the other. This suggests that all the constructs listed in Table 4 contribute to the usability of the BI banking reports.
Company issues log data extraction
The purpose of the CIL data extraction was to identify the usability issues of BI banking reports (in response to sub-research question 2). Thematic analysis was used, again following the six-phase approach advocated by Braun and Clarke (2006).
Phase 1: Familiarising with your data
In this phase, the researcher created a project in ATLAS.ti, named the CIL project, which stored all the project files.
Phase 2: Generating initial codes
A total of 162 initial codes were created for this study in ATLAS.ti.
Phase 3: Searching for themes
The researcher utilised the request types that the user can select from when logging a request. The different request types, their descriptions and their frequencies are summarised as follows (an illustrative tally sketch is given after the list):
- Small project (15): A small project refers to a project size category that is allocated when a project is logged; before a project is allocated a size category, scoping is required, together with a presentation from the client explaining their requirement at a high level.
- Investigation (51): An investigation refers to the situation where the data reflected on the reports may not be as expected by the client. This can be because of data integrity issues, data issues, etc.; for example, the client expects the total number of sales to have increased by 3%, but the report shows a 20% increase.
- Change request (69): Change request refers to when the client wants to change a report already developed and published. The client may want to add a business rule, remove a filter, change the report layout, etc.
- Data extract (50): The data extract theme refers to the case where the client requests once-off data, in most cases in the form of an MS Excel file. The clients provide the business rules and the fields they want to see. This type of request is usually once-off and is often called ad hoc.
- Incident (13): An incident is usually logged when an error occurs while the user wants to view a report. Incident refers to an unplanned event that interrupts the users, for example, when there is a network outage and the user cannot access the reports.
- Recurring ad hoc (3): The recurring ad hoc theme refers to the situation when a request of the same nature or issue keeps on being logged by the client.
- Large project (5): A large project is a project size category that is allocated when a project is logged. Before a project is allocated a size category, scoping is required, together with a presentation from the client explaining their requirement at a high level.
- Maintenance (4): The maintenance theme refers to a maintenance issue, for example, server-related and database-related issues. Maintenance refers to ensuring that the applications being used, such as MS PowerBI, Tableau and MyBI, are in good working order. Maintenance needs to be done so that business as usual is not impacted, that is, the business can continue to use the reports without errors.
- Scoping (8): Scoping is done for all new projects that need to be classified according to size category by the committee. A client logs this type of request and then gets allocated to a business analyst who will document this scoping. The level of detail and the client’s requirements will help the committee size the project – small, medium, large and extra-large.
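The tally sketch below recounts the request-type frequencies listed above from the hypothetical CIL extract produced in the earlier extraction sketch; the request_type column name is an assumption, and the study derived these frequencies through coding in ATLAS.ti rather than in code.

```python
# Illustrative recount of the request-type frequencies listed above. The
# request_type column and input file are assumptions; the study derived the
# frequencies through coding in ATLAS.ti.
from collections import Counter

import pandas as pd

cil = pd.read_csv("cil_bi_report_issues.csv")      # output of the earlier extraction sketch
observed = Counter(cil["request_type"])

reported = {"Small project": 15, "Investigation": 51, "Change request": 69,
            "Data extract": 50, "Incident": 13, "Recurring ad hoc": 3,
            "Large project": 5, "Maintenance": 4, "Scoping": 8}

for theme, count in reported.items():
    print(f"{theme}: counted {observed.get(theme, 0)}, reported {count}")
```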
The codes captured were allocated to the relevant code groups using ATLAS.ti to be able to create network diagrams. For example, for the code group named ‘Change request’, there is a corresponding network named ‘Change request’.
Phase 4: Reviewing themes
The researcher revisited the codes allocated to each code group and ensured that the relevant codes were allocated to the correct code groups. From the 162 codes that were initially created in Phase 2, 75 codes were retained. The codes not directly related to BI banking reports and BI banking report issues were removed.
Phase 5: Defining and naming themes
The generation of the initial codes and code groups was revisited and reconsidered but the researcher did not change the code groups. Figure 7 depicts the network diagram in ATLAS.ti, showing the nine CIL data themes created in Phase 3.
FIGURE 7: Network diagram showing the themes for company issues log data extraction.
Phase 6: Producing the report
In this phase, the researcher identified the BI banking issues for each of the themes created. Table 5 contains the themes created and their respective BI banking report issues codes.
TABLE 5: The themes and respective business intelligence banking report issues codes.
The BI banking report issues codes can be described as follows:
- Missing data: When data are missing, reports are not populated for certain days; the user cannot perform their tasks, which are then delayed until the data are updated on the report.
- System issues: System issues caused the reports not to be up to date with the data, or prevented the user from pulling or downloading the report.
- Enhancements of reports: Some reports needed to be enhanced to include further requirements such as additional filters, columns, business rules or a data source change. Another reason for an enhancement was that a client was not satisfied with a report and required it to be rebuilt for the team to be able to use it. Furthermore, clients wanted the reports to align with the similar key performance indicators (KPIs) they are reporting, ensuring that the reporting numbers are the same across reports to avoid confusion or misalignment.
- Report investigation: Investigations arose because the client wanted to ensure that the rules applied in the reports are up to date and, if not, to have the reports updated with the latest business rules, so that the same numbers are reported for the KPIs.
- Slow speed, where some reports were reported to be slow in loading or responding.
- Access to reports, which resulted from several reports not being accessible on the report portal and the requestor wanted to understand the cause of the issue.
- Data quality, where some reports required investigation between the front-end and back-end to identify the difference because the volumes on the reports reporting the same KPIs were not the same. These differences caused inconsistencies and some of the values returned were null.
From the BI banking issues code descriptions, the researcher identified 10 BI banking report issues. The relevance of the usability criteria (LBUC) in the BI environment was assessed against the BI banking report issues identified from the CIL.
To answer SRQ2: ‘What are the usability issues of BI reports in the banking industry?’, the BI banking report issues were mapped to the LBUC and presented as the BI banking reports usability criteria (BIBRUC). From the 14 LBUC in Table 2, 10 usability criteria were confirmed as relevant, and 4 usability criteria that could not be mapped to any BI issue were omitted.
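The mapping step can be pictured as a simple set operation: a literature-based criterion is confirmed when at least one logged issue maps to it and omitted otherwise. The issue and criterion names in the sketch below are placeholders; the study's actual mapping is presented in Table 6.

```python
# Placeholder sketch of the issue-to-criteria mapping logic: an LBUC criterion
# is confirmed when at least one BI banking report issue maps to it, otherwise
# it is omitted. Names are illustrative; see Table 6 for the study's mapping.
issue_to_criteria = {
    "missing data on report": {"effectiveness", "efficiency"},
    "report slow to load": {"efficiency"},
    "report not accessible": {"effectiveness", "robustness"},
}

lbuc = {"effectiveness", "efficiency", "robustness", "learnability",
        "flexibility", "satisfaction"}               # stand-in for the 14 LBUC

confirmed = set.union(*issue_to_criteria.values()) & lbuc   # mapped to at least one issue
omitted = lbuc - confirmed                                  # mapped to no issue
print(sorted(confirmed), sorted(omitted))
```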
Interviews
The interviews were conducted after the analysis of the survey data and the CIL data extraction. The survey findings indicated no BI banking report usability issues, while the CIL data extraction identified 10 BI banking report usability issues. The mapping of BI banking report issues to usability constructs is presented in Table 6. The interview data were used to verify the relevance of the criteria towards presenting a manageable, verified set of critical usability criteria for BI systems.
TABLE 6: Business intelligence banking report issues and associated usability criteria (BIBRUC).
Descriptive analysis
Because of COVID-19 regulations, the interviews were conducted via the online discussion forum MS Teams, which was acceptable to both the bank and the University of South Africa (UNISA). The responses were recorded, and the recordings were transcribed (a feature of MS Teams). A total of 31 interviews were conducted. There were no respondents between the ages of 18 and 24 years and only 3.2% of respondents were above the age of 45 years; 48.4% of the participants were between the ages of 25 and 34 years and the same percentage were between the ages of 35 and 44 years. This means that 96.8% of the respondents were between the ages of 25 and 44 years. Regarding gender, 16 (52%) of the participants were female, while 15 (48%) were male. Figure 8 illustrates the sample population of participants according to their experience in the different BI areas. Most of the participants (70.97%) have been using the system for more than 4 years, 6.45% for 24–48 months, 12.90% for 12–24 months, 3.23% for 3–13 months and 6.45% for less than 3 months. The results confirm that all the areas are covered by the participants’ experience, and hence, they represent a valid sample.
FIGURE 8: Experience in business intelligence area.
Thematic interview analysis results
The interview data were analysed using TA, following the same six phases proposed by Braun and Clarke (2006). After familiarising herself with the data, the primary researcher identified 50 initial codes using ATLAS.ti; of these, 35 codes relating to BI banking report issues were retained (see Phase 4 below). The 10 BI banking report issues identified from the CIL data extraction were used to inform the search for themes (code groups) and codes; Table 6 shows the different BI banking report issues. The BI banking report issues identified from the interviews were mapped to the BIBRUC and to the BI banking reports usability criteria from the user’s perspective (BIBRUCUP); from this analysis, 11 critical BI banking reports usability criteria from the user’s perspective (CBIBRUCUP) were identified. As before, the captured codes were allocated to the relevant code groups to create network diagrams. The names of the network diagrams and code groups correspond to the document names, which ensures the consistency of the network diagrams. For example, for the code group named ‘The report takes time to load’, there is a corresponding network diagram theme (Figure 9).
FIGURE 9: Network diagram showing the themes for company issues log data.
Phase 4: Reviewing themes
The researcher revisited the codes allocated to each code group and ensured that the relevant codes were allocated to the correct code groups. From the 50 codes that were initially created in Phase 2, 35 codes were retained.
Phase 5: Defining and naming themes
In this phase, the thinking behind the code groups was reconsidered based on Phase 1, around the data capturing and familiarisation. The researcher did not change the code groups, and they remained as per Phase 4. Figure 9 depicts the network diagram in ATLAS.ti.
Phase 6: Producing the report
In this phase, the researcher identified the BI banking issues for each of the themes created and mapped the codes to their respective BIBRUC and BIBRUCUP.
Satisfaction – Yes or no: The participants were asked if they were satisfied with the BI banking report and requested to provide a reason to substantiate the answer. While 77% of the participants answered with a ‘Yes’ and 13% of the participants answered with a ‘No’, 10% of the participants said that they were unsure as to whether they were satisfied.
Satisfaction – Scale of 1 to 5: At the end of the interview, the participants were asked to provide an overall rating of their satisfaction with the BI banking reports. The participants were allowed to choose only from a scale of 1 to 5, with 1 being strongly dissatisfied and 5 strongly satisfied. Of those who responded, 32% of the participants rated their satisfaction at 4, while 19% rated it at 3; 19% provided a rating of 3.5, while 19% provided a rating of 5; 3% provided a rating of 4.2, while 19% provided a rating of 4.5.
Triangulation
Figure 10 depicts the findings from all the research actions conducted in this study. The results were triangulated with the intent of revealing convergent evidence on the salience of the findings.
To answer SRQ4: ‘What are the critical usability criteria that should be used to evaluate the BI banking reports from the user’s perspective?’, Table 7 contains the mapping, including the following sets of usability criteria: the LBUC (literature-based), the BIBRUC and BIBRUCUP (survey-based) and the resulting CBIBRUCUP. There were 14 LBUC deemed relevant for this study; of these, 12 were deemed critical for the BI banking reports according to our findings. The 12 critical usability criteria are depicted in Table 7.
TABLE 7: Critical business intelligence banking reports usability criteria from the user’s perspective based on triangulation of all the previous business intelligence usability criteria in this study.
Notably, mapping the BI issues to usability criteria was based on the literature definitions of those criteria but it also involved some subjective decisions. While effectiveness and efficiency influenced most of those issues, the purpose was to see which of the other criteria are relevant. Therefore, the CBIBRUCUP is suggested as a starting point for future research rather than a final and complete set of criteria.
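One way to picture the triangulation step is as set refinement across the evidence sources: a literature-based criterion is carried forward as critical when the CIL analysis or the interviews provide evidence for it. The criterion names below are placeholders and the selection rule is a simplification; the study's final set appears in Table 7.

```python
# Hedged sketch of the triangulation logic: an LBUC criterion is kept as
# critical when at least one empirical source (CIL issues or interviews)
# supports it. Names are placeholders; see Table 7 for the study's final set.
lbuc = {"effectiveness", "efficiency", "learnability", "flexibility",
        "robustness", "satisfaction", "memorability"}        # stand-in for the 14 LBUC

evidence = {
    "cil_issues": {"effectiveness", "efficiency", "flexibility", "robustness"},
    "interviews": {"effectiveness", "learnability", "satisfaction"},
}

critical = {c for c in lbuc if any(c in supported for supported in evidence.values())}
print(sorted(critical))       # criteria carried forward as critical
```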
Discussion
The findings from the CIL data extraction (response to SRQ2) and the survey (response to SRQ3) were contradictory; therefore, interviews were conducted to investigate whether there are BI banking report usability issues from a user’s perspective (response to SRQ4) and, if so, to identify the critical usability criteria that should be used to evaluate the BI banking reports. The triangulation of the findings confirmed the need to incorporate the usability criteria when developing BI banking reports. The usability criteria identified in this study confirm most of the LBUC and expand the list of BIBRUC by adding the criteria of familiarity, multi-threading, recoverability and visibility.
This supports the findings from Mansell and Ruhode (2019), namely, that effort perception (which is related to ease of use and usability) is a dominant inhibitor of business intelligence use by managers in SA municipalities. Beelders and Kotzé (2020) investigated the causes of the low utilisation of BI and consequently proposed that usability should be included as a phase in the design and development lifecycle of BI tools. Eriksson and Ferwerda (2021) confirmed that user experience (a construct encompassing usability) is critical to BIS and identified performance and education as criteria influencing usability. In our study, performance is represented by the criteria of effectiveness and efficiency, so our findings confirm their criteria and add value by doing so on a more detailed level. Hamid et al. (2022) conducted a study to determine the impact of usability issues on the experience of mobile banking users. Their results showed that the trustfulness, memorability and efficiency of the mobile banking apps need to be improved. Macías and Borges (2024) proposed a BI approach aimed at helping key stakeholders in decision-making and obtaining new knowledge through indicators, measures, KPIs, trends and forecasts based on historical data related to assessments. These papers confirm the relevance of BI usability in general and of this study in particular, as none of them was performed in the context of banking reports. Despite the focus on BI banking employees, the criteria are based on general usability principles and could be applied to improve the practice of BI report design and evaluation for all stakeholders of the bank.
Conclusion
In this study, we proposed and evaluated the usability criteria for BI banking reports. Empirical data collection was done through a survey, CIL data extraction and interviews. The findings indicated that there are BI banking reports usability issues; therefore, there is a need for usability to be considered when developing BI banking reports. The usability criteria that have been validated by data triangulation of the three datasets are considered the critical usability criteria for BI banking reports and present the scientific contribution to the extant knowledge base. These criteria can also be formulated as guidelines and those can be applied to evaluate BI banking reports in practice. The discrepancy between users’ perceived ease of use and their performance in using a system is a well-known human–computer interaction phenomenon, but investigating the discrepancy between the CIL data extraction and the survey data at another bank might reveal more specific insights. Future studies are needed to focus on an entire bank instead of one business unit, and it would be useful to repeat the study at two or more different banks.
Acknowledgements
This article is partially based on the author, P.A.L.’s Master’s dissertation entitled ‘Usability guidelines for business intelligence banking reports: a mixed methods study in the South African banking industry’ toward the degree of Master of Science in Computing in the Department of Information Systems, University of South Africa, South Africa, with supervisors Prof. J.v.B. and Dr. R.v.d.M., received February 2024. It is available at https://uir.unisa.ac.za/bitstream/handle/10500/31635/dissertation_lebotsa_pa.pdf?sequence=1.
Competing interests
The authors declare that they have no financial or personal relationships that may have inappropriately influenced them in writing this article.
Authors’ contributions
P.A.L. did the research as her MSc study. P.A.L. crafted the draft of the manuscript under the supervision of the two supervisors. J.v.B. conceptualised the study and drafted it. R.v.d.M. gave input throughout.
Funding information
This study is based on the research supported by the South African Research Chairs Initiative of the Department of Science and Technology and National Research Foundation of South Africa (Grant No. 98564).
Data availability
The data that support the findings of this study are available from the corresponding author, J.v.B., on reasonable request.
Disclaimer
The views and opinions expressed in this article are those of the authors and are the product of professional research. It does not necessarily reflect the official policy or position of any affiliated institution, funder, agency or that of the publisher. The authors are responsible for this study’s results, findings and content.
References
Ain, N.U., Vaia, G., DeLone, W.H. & Waheed, M., 2019, ‘Two decades of research on business intelligence system adoption, utilization and success – A systematic literature review’, Decision Support Systems 125, 113113. https://doi.org/10.1016/J.DSS.2019.113113
Baran, M.L. & Jones, J.E., 2016, Mixed methods research for improved scientific study, p. 72, Information Science USA, Hershey, Pennsylvania.
Bharadiya, J.P., 2023, ‘Leveraging machine learning for enhanced business intelligence’, International Journal of Computer Science and Technology (IJCST) 7(1), 1–19.
Beelders, T.R. & Kotzé, J.E., 2020, ‘Augmenting the business intelligence lifecycle model with usability: Using eye tracking to discover the why of usability problems’, Electronic Journal of Information Systems Evaluation 23(1), 96–111. https://doi.org/10.34190/ejise.20.23.1.007
Boell, S.K. & Cecez-Kecmanovic, D., 2015, ‘On being “systematic” in literature reviews in IS’, Journal of Information Technology 30(2), 161–173. https://doi.org/10.1057/jit.2014.26
Braun, V. & Clarke, V., 2006, ‘Using thematic analysis in psychology’, Qualitative Research in Psychology 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
Braun, V. & Clarke, V., 2016, ‘(Mis)conceptualising themes, thematic analysis, and other problems with Fugard and Potts’ (2015) sample-size tool for thematic analysis’, International Journal of Social Research Methodology 19(6), 739–743. https://doi.org/10.1080/13645579.2016.1195588
Braun, V. & Clarke, V., 2021, ‘One size fits all? What counts as quality practice in (reflexive) thematic analysis?’, Qualitative Research in Psychology 18(3), 328–352. https://doi.org/10.1080/14780887.2020.1769238
Brooke, J., 2013, ‘SUS: A retrospective’, Journal of Usability Studies 8(2), 29–40.
Brosens, J., Kruger, R.M. & Smuts, H., 2018, ‘Guidelines for designing e-statements for e-banking’, in Proceedings of the Second African conference for human-computer interaction: Thriving Communities, AfriCHI ’18, Windhoek, ACM, New York, NY, December 03–07, pp. 1–6.
Cameron, R., 2010, ‘Mixed methods research: The five Ps framework’, The Electronic Journal of Business Research Methods 9, 96–108, viewed 07 February 2023, from https://academic-publishing.org/index.php/ejbrm/article/view/1272.
Creswell, J.W., 2010, Mapping the developing landscape of mixed methods research, pp. 45–68, Sage, Los Angeles, CA.
Creswell, J.W., 2014, Research design qualitative, quantitative, and mixed methods approaches, 4th edn., Sage, Thousand Oaks, CA.
Creswell, J.W. & Plano Clark, V.L., 2011, Designing and conducting mixed methods research, Sage, Los Angeles.
Efron, S.E. & Ravid, R., 2018, Writing the literature review: A practical guide, Guilford Publications, New York, NY.
Eriksson, M. & Ferwerda, B., 2021, ‘Towards a user experience framework for business intelligence’, Journal of Computer Information Systems 61(5), 428–437. https://doi.org/10.1080/08874417.2019.1693936
Field, A., 2013, Discovering statistics using IBM SPSS statistics, 4th edn., SAGE Publications, Inc., Los Angeles.
Goldkuhl, G., 2012, ‘Pragmatism vs interpretivism in qualitative information systems research’, European Journal of Information Systems 21(2), 135–146. https://doi.org/10.1057/ejis.2011.54
Hamid, K., Iqbal, M.W., Muhammad, H.A.B., Fuzail, Z., Ghafoor, Z.T. & Ahmad, S., 2022, ‘Usability evaluation of mobile banking applications in digital business as emerging economy’, International Journal of Computer Science and Network Security 22(1), 250–260.
Hansoti, B., 2010, ‘Business intelligence dashboard in decision making’, in Paper 5. College of Technology Directed Projects [Preprint], viewed 27 January 2024, from https://docs.lib.purdue.edu/techdirproj/15/.
Hirose, M. & Creswell, J.W., 2023, ‘Applying core quality criteria of mixed methods research to an empirical study’, Journal of Mixed Methods Research 17(1), 12–28. https://doi.org/10.1177/15586898221086346
Hočevar, B. & Jaklič, J., 2010, ‘Assessing benefits of business intelligence systems: A case study’, Management: Journal of Contemporary Management Issues 15(1), 87–119, viewed 11 November 2023, from https://hrcak.srce.hr/53609.
Isik, O. & Sidorova, A., 2011, ‘Business intelligence (BI) success and the role of BI capabilities’. Intelligent Systems in Accounting, Finance and Management 18(4), 161–176. https://doi.org/10.1002/isaf.329
ISO 9241-11, 2018, Ergonomics of human-system interaction: Part 11 – Usability: Definitions and concepts, International Organization for Standardization, Geneva.
Jinpon, P., Jaroensutasinee, M. & Jaroensutasinee, K., 2011, ‘Business Intelligence and its Applications in the Public Healthcare System’, Walailak Journal of Science and Technology 8(2), 97–110. https://doi.org/10.2004/wjst.v8i2.16
Jooste, C., Van Biljon, J. & Mentz, J., 2014, ‘Usability evaluation for Business Intelligence applications: A user support perspective’, South African Computer Journal 53, 145–153. https://doi.org/10.18489/sacj.v53i0.198
Kaushik, V. & Walsh, C.A., 2019, ‘Pragmatism as a research paradigm and its implications for social work research’, Social Sciences 8(9), 255. https://doi.org/10.3390/SOCSCI8090255
Kiger, M.E. & Varpio, L., 2020, ‘Thematic analysis of qualitative data: AMEE Guide No. 131’, Medical Teacher 42(8), 846–854. https://doi.org/10.1080/0142159X.2020.1755030
Macías, J.A. & Borges, C.R., 2024, ‘Monitoring and forecasting usability indicators: A business intelligence approach for leveraging user-centered evaluation data’, Science of Computer Programming 234, 103077. https://doi.org/10.1016/j.scico.2023.103077
Mansell, I.J. & Ruhode, E., 2019, ‘Inhibitors of business intelligence use by managers in public institutions in a developing country: The case of a South African municipality’, SA Journal of Information Management 21(1), 1004. https://doi.org/10.4102/sajim.v21i1.1004
Matemilola, B.T., Bany-Ariffin, A.N. & Muhtar, F.E., 2015, ‘The impact of monetary policy on bank lending rate in South Africa’, Borsa Istanbul Review 15(1), 53–59. https://doi.org/10.1016/j.bir.2014.09.003
Munoz, J.M., 2017, Global business intelligence, Routledge, New York, NY.
Nastasi, B.K., Hitchcock, J.H. & Brown, L.M., 2010, ‘An inclusive framework for conceptualizing mixed methods design typologies: Moving toward fully integrated synergistic research models’, in A. Tashakkori & C. Teddlie (eds.), Handbook of Mixed Methods in Social & Behavioral Research, pp. 305–338, SAGE Publications, Inc., Thousand Oaks, CA.
Nielsen, J., 1993, Usability engineering, Academic Press, Boston, MA.
Nithya, N. & Kiruthika, R., 2021, ‘Impact of Business Intelligence Adoption on performance of banks: A conceptual framework’, Journal of Ambient Intelligence and Humanized Computing 12(2), 3139–3150. https://doi.org/10.1007/s12652-020-02473-2
Norman, D., 1990, The design of everyday things, Doubleday, New York, NY.
Olszak, C.M., 2016, ‘Toward better understanding and use of business intelligence in organizations’, Information Systems Management 33(2), 105–123. https://doi.org/10.1080/10580530.2016.1155946
Pickard, A.J., 2013, Research methods in information, Facet Publishing, London.
Preece, J., Rogers, Y. & Sharp, H., 2002, Interaction design – Beyond human-computer interaction, John Wiley & Sons, New York, NY.
Quamar, A., Özcan, F., Miller, D., Moore, R. J., Niehus, R. & Kreulen, J., 2020, ‘Conversational BI: An ontology-driven conversation system for business intelligence applications’, in Proceedings of the VLDB endowment, vol. 13(12), Elsevier, Tokyo, 31 August–04 September 2020, pp. 3369–3381.
Ramavhona, T.C. & Mokwena, S., 2016, ‘Factors influencing Internet banking adoption in South African rural areas’, South African Journal of Information Management 18(2), 1–8. https://doi.org/10.4102/sajim.v18i2.642
Rogers, Y., Sharp, H. & Preece, J., 2012, Interaction design: Beyond human-computer interaction, John Wiley & Sons, Indianapolis, New Jersey.
Scharp, K.M. & Sanders, M.L., 2019, ‘What is a theme? Teaching thematic analysis in qualitative communication research methods’, Communication Teacher 33(2), 117–121. https://doi.org/10.1080/17404622.2018.1536794
Schober, P., Boer, C. & Schwarte, L.A., 2018, ‘Correlation coefficients: Appropriate use and interpretation’, Anesthesia and Analgesia 126(5), 1763–1768. https://doi.org/10.1213/ANE.0000000000002864
Shannon-Baker, P., 2015, ‘Making paradigms meaningful in mixed methods research’, Journal of Mixed Methods Research 10(4), 1–16. https://doi.org/10.1177/1558689815575861
Smith, A., 2021, Top 5 biggest banks in South Africa, BuzzSouthAfrica, viewed 26 March 2023, from https://buzzsouthafrica.com/top-5-biggest-banks-in-south-africa/#:~:text=Top%205%20Biggest%20Banks%20in%20South%20Africa%201,Nedbank%20Group%20…%205%205.%20Investec%20Bank%20.
Smuts, M., Scholtz, B. & Calitz, A., 2015, ‘Design guidelines for business intelligence tools for novice users’, in Proceedings of the 2015 annual research conference on South African Institute of Computer Scientists and Information Technologists, Stellenbosch, ACM, New York, NY, September 28–30, pp. 1–15.
Tavera Romero, C.A., Ortiz, J.H., Khalaf, O.I. & Ríos Prado, A., 2021, ‘Business intelligence: Business evolution after industry 4.0’, Sustainability 13(18), 10026. https://doi.org/10.3390/su131810026
Thowfeek, M.H. & Salam, M.N.A., 2014, ‘Students’ assessment on the usability of E-leaming websites’, Procedia – Social and Behavioral Sciences 141, 916–922. https://doi.org/10.1016/j.sbspro.2014.05.160
Tikait, P., 2023, ‘Business Intelligence Concepts, Components & Applications’, Business Intelligence, viewed 07 October 2024, from https://www.selecthub.com/business-intelligence/business-intelligence-concepts/
Tullis, T. & Albert, B., 2008, Measuring the user experience – Collecting, analyzing, and presenting usability metrics, Elsevier Science, Boston, MA.
Van Biljon, J. & Mwapwele, S., 2023, ‘Research collaboration in asymmetric power relations: A study of postgraduate students’ views’, The Journal for Transdisciplinary Research in Southern Africa 19(1), 1288. https://doi.org/10.4102/td.v19i1.1288
Villamarín, J.M. & Diaz Pinzon, B., 2017, ‘Key success factors to business intelligence solution implementation’, Journal of Intelligence Studies in Business 7(1), 48–69. https://doi.org/10.37380/jisib.v7i1.215
Wise, L., 2012, Using open source platforms for business intelligence: Avoid pitfalls and maximise ROI, Newnes, New York.
Wixom, B. & Watson, H., 2010, ‘The BI-based organization’, International Journal of Business Intelligence Research 1(1), 13–28. https://doi.org/10.4018/jbir.2010071702
Xlogiatech, 2023, 4 concepts of business intelligence, viewed 28 October 2023, from https://medium.com/@xlogiatech1/4-concepts-of-business-intelligence-xlogia-36f6238dd4a5.
Yang, H. & Tate, M., 2012, ‘A descriptive literature review and classification of cloud computing research’, Communications of the Association for Information Systems 31(2), 35–60. https://doi.org/10.17705/1CAIS.03102
Appendix 1: Survey questions
1. Background questionnaire
2. Survey questions
3. Submit
Appendix 2: Interview
Interviews
Background questionnaire
1. Are you satisfied with the BI reports (Yes/No)? Why?
2. Interview questions
3. Are there any other BI report issues that you encountered while using the BI reports?
4. On a scale of 1 to 5, are you satisfied with the BI reports?