Data analysis

Aim

To introduce the basics of qualitative data analysis and to point to relevant literature for further reading on the topic.

Definition

Qualitative research produces large amounts of textual data in the form of transcripts and observational field notes. Qualitative data analysis is the process of systematically searching and arranging the transcripts, field notes, and other materials that you accumulate, both to increase your own understanding of them and to enable you to present what you have discovered to others. Analysis involves working with data, organizing them, breaking them into manageable units, synthesizing them, searching for patterns, discovering what is important and what is to be learned, and deciding what you will tell others.

Basic principles of qualitative data analysis

Qualitative data analysis is characterized by its analytical openness: researchers can develop categories, themes, or concepts, or formulate a theory. In general, qualitative data analysis does not seek to quantify data. The results are presented in a descriptive, and possibly visual, manner, supported with quotes or images from the data. Qualitative data analysis is also characterized by its cyclical and iterative process. Data analysis often takes place alongside data collection, allowing questions to be refined and new topics to be explored further.

Different types of data analysis can be applied in qualitative research. A basic distinction is made between inductive analysis (the analysis starts from the collected data, which may successively lead to the discovery of themes or concepts) and deductive analysis (the analysis starts from a framework, for instance predetermined themes or categories based on a theory or the literature); the analysis can also mix inductive and deductive approaches. The choice depends on the perspective and aims of the research.

Approaches

There are many different approaches to qualitative data analysis, like grounded theory, content analysis or thematic analysis. The choice is related to the aim of the study. Wertz et al. (2011) offer insightful examples of how different analytic lenses lead to different processes of analysis and specific outcomes.

Grounded theory is the process of reducing data into categories, which are then developed and integrated into a theory. Grounded theory is an inductive process that develops theoretical descriptions of social phenomena emerging from the data. The theory is “grounded” in the data.

Content analysis is an approach in which existing theory or earlier findings are used as a framework for analysis. This approach can validate or extend a theory. Content analysis is both a deductive and an inductive process: the codes can be based on an existing theory or earlier results (the deductive part), while codes can be refined and parts of the data that do not fit the predetermined codes can be assigned new codes (the inductive part). The aim is to look for parallels and to refine or reject (parts of) the theory for your own research or population.

Thematic analysis is a method for identifying, analyzing and reporting patterns (themes) within the data. It organizes and describes your data set in detail, and interprets various aspects of the research topic. It can be used in an inductive process (the themes identified are strongly linked to the data themselves) or a deductive process (the themes are driven by the researcher’s theoretical or analytical interest in the area).

In general, researchers use the following three steps to analyze qualitative data:

  1. Explorative analysis using open coding: analytical techniques to search and find what kind of information is in a specific data segment. This step is used to reorganize the data by forming codes in an inductive or deductive manner. Interpretation takes place on the level of a text segment.
  2. Comparative analysis using axial coding: analytical tactics to compare the codes in order to form categories, see patterns or differences in the data. The interpretation takes place on the level of a group of text segments.
  3. Integrative analysis using selective coding: analytical strategies to interpret the data on a higher level. The interpretation takes place on the level of the whole dataset.
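For researchers who keep track of their codes digitally, the three phases above can be sketched as a toy data-organization example. This is a hypothetical illustration only: all segment texts, code names and categories below are invented, and the interpretive work itself cannot be delegated to code.

```python
# Toy sketch of organizing coded text segments across the coding phases.
# All example data are invented for illustration.

# Open coding: each text segment is assigned one or more codes.
segments = [
    ("I felt unprepared for the exam", ["anxiety", "preparation"]),
    ("My tutor helped me plan my study time", ["support", "preparation"]),
    ("I was nervous before every test", ["anxiety"]),
]

# Axial coding: codes are compared and grouped into broader categories.
categories = {
    "emotional response": ["anxiety"],
    "learning strategies": ["preparation", "support"],
}

def segments_per_category(segments, categories):
    """Count how many segments touch each category: a simple
    comparative view across groups of coded text segments."""
    counts = {cat: 0 for cat in categories}
    for _, codes in segments:
        for cat, cat_codes in categories.items():
            if any(code in cat_codes for code in codes):
                counts[cat] += 1
    return counts

# Selective coding (integrating categories into overarching themes for
# the whole dataset) would build on such an overview, but remains an
# interpretive step done by the researcher.
print(segments_per_category(segments, categories))
# → {'emotional response': 2, 'learning strategies': 2}
```

The point of the sketch is only that open coding interprets single segments, axial coding compares across groups of segments, and selective coding integrates at the level of the whole dataset.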

Within these three phases different analytical techniques, tactics or strategies can be used. Evers (2015) offers a useful overview of these methods.

Computer assisted qualitative data analysis software

Software programmes can be of great help in dealing with the administrative work related to the cyclical process of reading, comparing and reflecting on text segments. Various software programmes support the analysis of qualitative data, such as ATLAS.ti, MAXQDA and NVivo. These can be useful tools for ordering the data efficiently, although the ordering itself remains the researcher’s task. Such programmes should not be viewed as shortcuts to rigorous and systematic analysis, as no programme is capable of interpreting the data. The researcher therefore still needs analytical skills to take the analysis forward.

Quality procedures

There are several general principles in qualitative data analysis that lead to ‘good practices’, such as the importance of transparency, validity, reliability, comparison and reflexivity (see Green & Thorogood, 2010, chapter 8). For example, it is important to note down decisions and steps in a logbook, to use dual coding procedures in which two researchers code the data independently to prevent bias, and to use a member check to ensure that participants support the summary of the data.

Frambach et al (2013) offer a useful overview of different quality criteria:

Credibility / Internal validity
  • Use multiple data-analysis methods (methodological triangulation)
  • Ask feedback from participants on the data or interpretation of the data (member checking)
Transferability / External validity
  • Make the findings meaningful to others by describing them and their context in detail (thick description)
  • Discuss the findings’ resonance with existing literature from different settings
Dependability / Reliability
  • Continuously analyze the data to inform further data collection (iterative data collection)
  • Continuously re-examine the data using insights that emerge during analysis (iterative data analysis)
Confirmability / Objectivity
  • Search the data and/or literature for evidence that disconfirms the findings
  • Discuss the findings with peers/experts (peer debriefing)
  • Keep a diary to reflect on the data analysis (reflexivity)
  • Document the steps and decisions taken in the data analysis (audit trail)

Recommendations to increase the quality of qualitative data analysis:

  • Start the analysis after the first round of data collection (iterative)
  • Use thick analysis: apply different analytical lenses to the data
  • Search for patterns, mechanisms and reasons
  • Analyze the data together with peers
  • Go back and forth through the data
  • Work in a disciplined way and note down all decisions and steps in a logbook

Potential pitfalls in qualitative data analysis are:

  • Too much interpretation
  • Tunnel vision, or working on your own
  • Drowning in the amount of data
  • A lack of time
  • Doing no analytical work to identify patterns across the entire data set
  • Jumping to conclusions too easily
  • Using percentages and numbers in reporting
  • Working inconsistently and non-transparently