Theoretical Framework Analysis

The Theoretical Framework Analysis Method for qualitative data management and analysis has been used since the 1980s. The method originated in large-scale social policy research, but it is becoming an increasingly popular approach in other fields of research. However, there is some confusion about its possible applications and its limitations. While the leadership of an experienced qualitative methodologist is undoubtedly required, non-specialists from the wider team can and should be involved in the analysis process.

Background of the Theoretical Framework Analysis Method

The Theoretical Framework Analysis method is part of a broad family of analysis methods that are often called thematic analysis or qualitative content analysis. These approaches identify commonalities and differences in qualitative data, before focusing on relationships between different parts of the data, thus attempting to draw descriptive and/or explanatory conclusions clustered around themes.

The Theoretical Framework Analysis Method was developed by researchers Jane Ritchie and Liz Spencer of the Qualitative Research Unit at the National Centre for Social Research in the United Kingdom in the late 1980s, for use in large-scale social policy research. It is now widely used in other fields, including health research. Its defining characteristic is the output matrix: rows (cases), columns (codes), and "cells" of summarized data. This provides a structure into which the researcher can systematically reduce the data in order to analyze it by case and by code.
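As an illustration only, the matrix can be pictured as a simple table of cases against codes. The sketch below, in Python with pandas, uses entirely hypothetical interviewees, codes, and cell summaries; it is not part of the method itself, just one way such a matrix could be held digitally.

```python
# Minimal, hypothetical sketch of a framework matrix:
# one row per case, one column per code, short summaries in the cells.
import pandas as pd

matrix = pd.DataFrame(
    {
        "diagnosis_story": {
            "Interviewee_01": "Sudden onset; attributes it to work stress.",
            "Interviewee_02": "Gradual symptoms; first dismissed as indigestion.",
        },
        "beliefs_about_cause": {
            "Interviewee_01": "Sees family history as decisive.",
            "Interviewee_02": "Links illness to diet and 'destiny'.",
        },
    }
)

print(matrix)                        # compare cases (rows) across codes (columns)
print(matrix.loc["Interviewee_01"])  # read a single case across all codes
```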

What is a Case?

A case is usually an individual interviewee, but the unit of analysis can be adapted to others, such as predefined groups or organizations. Although in-depth analyses of key themes can take place across the entire data set, the views of each research participant remain connected to other aspects of their account within the matrix, so the context of individual views is not lost. Comparing and contrasting data is vital to qualitative analysis, and the ability to compare data easily across cases, as well as within individual cases, is built into the structure and process of the Framework Analysis Method.

Usefulness of the Theoretical Framework Analysis Method

The Theoretical Framework Analysis Method provides clear steps to follow and produces highly structured results from summarized data. It is therefore useful when multiple researchers are working on a project, especially in multidisciplinary research teams where not all members have experience in qualitative data analysis. It is also useful for managing large data sets where an overview and descriptive account of the entire data set is wanted.

However, caution is advised before selecting the method, as it is not a suitable tool for analyzing all types of qualitative data or for answering all qualitative research questions, nor is it an "easy" version of qualitative analysis for quantitative researchers. It is important to note that the Theoretical Framework Analysis Method cannot accommodate highly heterogeneous data; that is, the data must cover similar key themes or issues in order to be categorized.

Of course, individual interviewees may have very different views or experiences of each theme, which can then be compared and contrasted. The Theoretical Framework Analysis Method is typically used for thematic analysis of semi-structured interview transcripts, although in principle it could be adapted to other types of textual data, for example documents such as meeting minutes, diaries, or observational field notes.

Characteristics

For quantitative researchers working with colleagues experienced in qualitative analysis, or exploring qualitative research for the first time, the nature of the Framework Analysis Method is appealing. Its methodical processes and spreadsheet approach seem more in line with the quantitative paradigm. Although the Framework Analysis Method is a very systematic way of categorizing and organizing what may seem like unwieldy qualitative data, it is not a panacea for the problematic issues often associated with qualitative data analysis, such as how to make analytic decisions and how to make interpretive strategies visible and auditable.

Qualitative research skills are needed to interpret the matrix properly and to facilitate the generation of descriptions, categories, explanations, and typologies. In addition, reflexivity, rigor, and quality must be attended to in the Theoretical Framework Analysis Method, as in any other qualitative method. It is therefore essential that studies using the Framework Analysis Method be led by an experienced qualitative researcher, although this does not prevent those new to qualitative research from contributing to the analysis as part of a larger research team.

Theoretical Framework Analysis Approach

There are a number of approaches to qualitative data analysis. These include:

Those that pay close attention to language and how it is used in social interaction, such as discourse analysis and ethnomethodology.

Those that deal with experience, meaning, and language, such as phenomenology and narrative methods.

Those that try to develop a theory derived from the data through a set of interconnected procedures and stages, such as grounded theory.

Many of these approaches are associated with specific disciplines and are underpinned by philosophical ideas that shape the process of analysis. However, the Theoretical Framework Analysis Method is not aligned with a specific epistemological, philosophical, or theoretical approach. Rather, it is a flexible tool that can be adapted for use with many qualitative approaches that aim to generate themes.

Theme development is a common feature of qualitative data analysis. It involves the systematic search for patterns in order to generate full descriptions capable of shedding light on the phenomenon under investigation. In particular, many qualitative approaches use the "constant comparative method", developed as part of grounded theory, which involves making systematic comparisons across cases to refine each theme. Unlike grounded theory, the Theoretical Framework Analysis Method is not necessarily concerned with generating social theory, but it can greatly facilitate constant comparative techniques by allowing the data to be reviewed across the matrix.

Differences from the Deductive Approach to Qualitative Analysis

Perhaps because the Framework Analysis Method is so obviously systematic, it has often, as other commentators have pointed out, been confused with a deductive approach to qualitative analysis. However, the tool itself is not tied to either inductive or deductive thematic analysis; where a piece of research sits on this inductive-deductive continuum depends on the research question.

A question such as "Can patients give an accurate biomedical account of the onset of their cardiovascular disease?" is essentially a yes/no question, although the answer may be nuanced by the extent of the account or by the appropriate use of terminology. It therefore calls for a deductive approach to both data collection and analysis (for example, structured or semi-structured interviews and directed qualitative content analysis).

Similarly, a deductive approach can be adopted if the analysis is based on pre-existing theory, such as theories of behavior change, for example with a research question such as "How does the Theory of Planned Behavior help to explain GPs' prescribing?"

However, a research question such as "How do people construct stories about the onset of their cardiovascular disease?" would require a more inductive approach that allows for the unexpected and for more socially located answers from interviewees. These could include issues of cultural beliefs, food preparation habits, concepts of "destiny", or links to other important events in their lives, for example a bereavement, which the researcher cannot predict in advance.

How can I establish whether the Theoretical Framework Analysis Method is appropriate?

In all these cases, it may be appropriate to use the Framework Analysis Method to manage the data. The difference becomes apparent in the way themes are selected: in the deductive approach, themes and codes are preselected on the basis of previous literature, previous theories, or the specifics of the research question, whereas in the inductive approach themes are generated from the data through open (unrestricted) coding, followed by refinement of the themes. In many cases a combined approach is appropriate: the project has some specific questions to explore, but also intends to leave room to discover other, unexpected aspects of the participants' experience or of the way they assign meaning to phenomena.

In summary, the Theoretical Framework Analysis Method can be adapted for use with deductive, inductive, or combined types of qualitative analysis. However, for some research questions analyzing the data by case and theme is not appropriate, and in those cases the Framework Analysis Method should be avoided. For example, depending on the research question, life-history data might be better analyzed using narrative analysis; recorded consultations between patients and their healthcare professionals through conversation analysis; and documentary data, such as resources for pregnant women, through discourse analysis.

Requirements of the Theoretical Framework Analysis

Since no form of qualitative or quantitative analysis is a purely technical process, and all analysis is influenced by the characteristics of the researchers and their disciplinary paradigms, critical reflection throughout the research process is essential. This includes the design of the study, the generation or collection of the data, and the analysis. All team members should keep a research journal, recording reflective notes, impressions of the data, and thoughts about the analysis throughout the process.

Experienced qualitative researchers are more adept at sifting through data and analyzing it rigorously and thoughtfully. They must not become too attached to certainty, but should remain flexible and adaptable throughout the research. In this way, rich and nuanced findings can be generated that capture and explain the complexity of real social life and can be applied to complex social issues.

It is important to remember when using the Framework Analysis Method that, unlike quantitative research, in which data collection and analysis are strictly sequential and mutually exclusive stages of the research process, in qualitative analysis there is, to a greater or lesser extent depending on the project, a continuous interplay between data collection, analysis, and theory development. For example, new ideas or insights from participants may suggest potentially fruitful lines of inquiry, or close analysis may reveal subtle inconsistencies in an account that require further exploration.

Analysis procedure

Stage 1: Transcription

You will need a good-quality audio recording and, ideally, a verbatim (word-for-word) transcript of the interview. For analysis with the Theoretical Framework Analysis Method, it is not necessarily important to include the conventions of dialogue transcription, such as pauses or two people speaking at once, which can make a transcript difficult to read, because the content is what is primarily of interest. Transcripts should have wide margins and adequate line spacing for later coding and note-taking. The transcription process is a good opportunity to immerse yourself in the data and is highly recommended for new researchers. However, on some projects you may decide it is a better use of resources to outsource this task to a professional transcriptionist.

Stage 2: Familiarization with the interview

Becoming familiar with the entire interview using the audio recording and/or transcript and any background notes or reflections that have been recorded by the interviewer is a vital stage in interpretation. It may also be helpful to listen to all or part of the audio recording again. In large or multidisciplinary research projects, the people involved in data analysis may be different from those who conducted or transcribed the interviews. This makes this stage especially important. A margin can be used to record any analytical notes, thoughts or impressions.

Stage 3: Coding

After familiarization, the researcher carefully reads the transcript line by line, applying a paraphrase or label (a "code") that describes what they have interpreted as important in that passage. In the most inductive studies, "open coding" takes place at this stage: anything that might be relevant is coded, from as many different perspectives as possible. A minimal sketch of one way to record such coded passages follows the list of code types below.

Codes may refer to:

Substantive things. For example, particular behaviors, incidents, or structures.

Values. For example, those that inform or support certain claims, such as belief in evidence-based medicine or patient choice.

Emotions. For example, grief, frustration, love.

More impressionistic/methodological elements. For example, the interviewee found something difficult to explain, the interviewee became emotional, the interviewer felt uncomfortable.
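The sketch below (Python, hypothetical throughout) shows one possible way of recording coded passages during open coding, covering the kinds of codes listed above; the transcript excerpts, line numbers, and codes are invented for illustration and do not come from a real study.

```python
# Hypothetical record of open coding: each entry ties a code (and its kind)
# back to a specific passage in a specific transcript.
from dataclasses import dataclass

@dataclass
class CodedSegment:
    transcript: str   # which interview the passage comes from
    location: str     # where it sits in the transcript, e.g. "lines 34-38"
    passage: str      # the verbatim excerpt
    code: str         # the label applied by the researcher
    code_type: str    # substantive / value / emotion / methodological

open_codes = [
    CodedSegment("Interviewee_01", "lines 34-38",
                 "I just kept putting off going to the doctor...",
                 "delayed help-seeking", "substantive"),
    CodedSegment("Interviewee_01", "lines 52-55",
                 "You have to trust what the tests say, don't you?",
                 "belief in evidence-based medicine", "value"),
    CodedSegment("Interviewee_01", "lines 80-82",
                 "(long pause) It's hard to put into words.",
                 "difficulty articulating experience", "methodological"),
]

for seg in open_codes:
    print(f"[{seg.code_type}] {seg.code}: {seg.location}")
```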

Deductive Studies

In purely deductive studies, the codes may be predefined, for example by existing theory or by specific areas of interest to the project. In that case this stage is not strictly necessary and you could move straight to indexing. Even so, it is generally useful, when a largely deductive approach is taken, to perform open coding on at least some of the transcripts to ensure that important aspects of the data are not missed.

The goal of coding is to classify all data so that it can be systematically compared to other parts of the data set. At least two researchers (or at least one from each discipline or specialty in a multidisciplinary research team) should independently code the first few transcripts. This ensures that no one perspective dominates.

Inductive Studies

In inductive coding it is essential to look for the unexpected and not just code literally and descriptively. In this regard, the involvement of people with different perspectives can be of great help. In addition to giving an overall impression of what has been said, line-by-line coding can alert the researcher to what would otherwise remain invisible because it is not clearly expressed or does not "fit" with the rest of the account. Challenging the emerging analysis in this way, and reconciling and explaining anomalies in the data, can strengthen the analysis. Coding can also be done digitally using CAQDAS, which is a useful way of keeping track of new codes automatically.

Stage 4: Developing an analytical framework

After coding the first few transcripts, all the researchers involved should meet to compare the labels they have applied and then agree on a set of codes to apply to all subsequent transcripts. Codes can be grouped into categories (using a tree diagram if helpful), which are then clearly defined. Together these constitute the analytical framework. Several iterations of the analytical framework are likely to be required before no new codes emerge. It is always worth having an "other" code under each category to avoid ignoring data that does not fit; the analytical framework is never "final" until the last transcript has been coded.
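As a rough illustration, an agreed analytical framework might be held as a simple mapping of categories to codes, each category ending with an "other" code as described above. The category and code names below are hypothetical.

```python
# Hypothetical analytical framework: codes grouped into defined categories,
# each with an 'other' code so data that does not fit is not ignored.
analytical_framework = {
    "1. Illness onset": [
        "1.1 first symptoms",
        "1.2 delayed help-seeking",
        "1.3 other (illness onset)",
    ],
    "2. Beliefs about cause": [
        "2.1 family history",
        "2.2 lifestyle and diet",
        "2.3 destiny or fate",
        "2.4 other (beliefs about cause)",
    ],
}

for category, codes in analytical_framework.items():
    print(category)
    for code in codes:
        print("   ", code)
```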

Stage 5: Application of the analytical framework

The analytical framework is then applied by indexing subsequent transcripts using the existing categories and codes. Each code is usually assigned a number or abbreviation for easy identification and written directly onto the transcripts. Computer-Assisted Qualitative Data Analysis Software (CAQDAS) is especially useful at this stage because it can speed up the process and ensures that, at later stages, the data are easily retrievable. It is worth noting that, unlike statistical analysis software, entering the data into a CAQDAS package does not analyze it; the software is simply an efficient way of storing and organizing the data so that it is accessible for the analysis process.
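If the indexing is stored digitally rather than written in the transcript margins, it might look like the minimal sketch below; the code numbers refer to a hypothetical framework of the kind sketched under Stage 4, and all cases and locations are invented.

```python
# Hypothetical indexing of a new transcript against the agreed framework:
# each passage is marked with the number of the code that applies to it.
framework_codes = {
    "1.1": "first symptoms",
    "1.2": "delayed help-seeking",
    "2.1": "family history",
    "2.2": "lifestyle and diet",
}

# (case, location in transcript, code number)
index = [
    ("Interviewee_03", "lines 10-14", "1.1"),
    ("Interviewee_03", "lines 15-21", "1.2"),
    ("Interviewee_03", "lines 40-44", "2.2"),
]

for case, location, code in index:
    print(f"{case} {location}: {code} ({framework_codes[code]})")
```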

Stage 6: Charting data into the framework matrix

Qualitative data are voluminous (an hour of interview can generate between 15 and 30 pages of text), so being able to manage and summarize (reduce) the data is a vital aspect of the analysis process. A spreadsheet is used to generate a matrix, and the data are charted into it. Charting involves summarizing the data by category from each transcript. Good charting requires the ability to strike a balance between reducing the data, on the one hand, and retaining the original meanings and "feel" of the interviewees' words, on the other. The chart should include references to interesting or illustrative quotations.

These can be tagged automatically if CAQDAS is used to manage the data. In multidisciplinary teams it is useful to compare and contrast summarizing styles early in the analysis process to ensure consistency within the team, and any abbreviations used should be agreed by the team. Once team members have become familiar with the analytical framework and have practice in coding and charting, it takes on average about half a day per hour-long transcript to reach this stage; in the early stages it takes much longer.
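The sketch below shows, with the same kind of hypothetical names used above, how charted summaries (one per case per category, each with a reference to an illustrative quotation) could be pivoted into the case-by-category matrix using pandas; it is only one possible way of organizing the chart, not part of the method itself.

```python
# Hypothetical charting: one short researcher-written summary per case per
# category, pivoted into the framework matrix (cases as rows, categories as columns).
import pandas as pd

charted = [
    # (case, category, summary, reference to an illustrative quote)
    ("Interviewee_01", "Illness onset",
     "Sudden chest pain at work; delayed seeking help for two weeks.", "lines 34-38"),
    ("Interviewee_01", "Beliefs about cause",
     "Attributes the illness mainly to family history.", "lines 52-55"),
    ("Interviewee_02", "Illness onset",
     "Gradual symptoms initially dismissed as indigestion.", "lines 12-18"),
]

rows = pd.DataFrame(charted, columns=["case", "category", "summary", "quote_ref"])
matrix = rows.pivot(index="case", columns="category", values="summary")
print(matrix)  # one row per case, one column per category, summaries in the cells
```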

Stage 7: Interpretation of the data

Throughout the research, it is helpful to keep a notebook or computer file in which to record impressions, ideas, and early interpretations of the data. It may be worth pausing at any stage to explore an interesting idea, concept, or potential theme by writing an analytical note and then discussing it with other members of the research team. Gradually, characteristics of and differences between the data are identified: typologies are generated, theoretical concepts (whether pre-existing or emerging from the data) are interrogated, and connections between categories are mapped in order to explore relationships and/or causality.

If the data are rich enough, the conclusions generated through this process can go beyond the description of particular cases toward explanation: for example, the reasons for the emergence of a phenomenon, predictions of how an organization or other social actor is likely to instigate or respond to a situation, or identification of areas that are not working well within an organization or system. It is worth noting that this stage usually takes longer than expected, and any project plan should allow sufficient time for meetings and for individual researchers to interpret and write up the findings.

Quality Assessment in Qualitative Research

The Theoretical Framework Analysis Method has been successfully developed and used in research for more than 25 years. The question of how to assess quality in qualitative research has been much debated, but ensuring rigor and transparency in the analysis is a vital component. There are, of course, many ways to do this; in the Framework Analysis Method the following features are useful:

Summarizing data during charting

In addition to being a practical way to reduce the data, summarizing means that all members of a multidisciplinary team can engage with the data and offer their perspectives during the analysis process, without necessarily having to read all the transcripts or be involved in the more technical parts of the analysis.

Charting also ensures that researchers pay close attention to describing the data, using each participant's own subjective frames and expressions, before moving on to interpretation.

The summarized data are kept within the broader context of each case, encouraging a thick description that pays attention to complex layers of meaning and understanding.

The matrix structure is visually simple

It can therefore facilitate the recognition of patterns in the data by any member of the research team, including drawing attention to contradictory data, deviant cases, or empty cells.

The systematic procedure makes the analysis easy to follow, even for multidisciplinary teams and/or with large data sets.

It is flexible enough that non-interview data can be included in the matrix, for example field notes taken during the interview or reflective notes.

It is not aligned with a particular epistemological viewpoint or theoretical approach

It can therefore be adapted for use in inductive analysis, deductive analysis, or a combination of both: for example, using pre-existing theoretical constructs deductively and then revising the theory with inductive aspects; or using an inductive approach to identify themes in the data before turning to the literature and using theories deductively to help explain certain themes.

It is easy to identify relevant data extracts to illustrate themes and check whether there is sufficient evidence for a proposed theme.

Finally, there is a clear audit trail from the original raw data to the final themes, including illustrative quotations.

Difficulties in the Application of the Theoretical Framework Analysis

This approach also presents a number of potential pitfalls:

The systematic approach and matrix format, as noted in the background above, are intuitively appealing to those with a quantitative background. However, the spreadsheet-like appearance may increase the temptation for those without a thorough understanding of qualitative research to attempt to quantify qualitative data (for example, reporting how many participants "said X"). Such statements are meaningless, because sampling in qualitative research is not designed to be representative of a wider population; rather, it is intended to capture the diversity around a phenomenon.

Like all qualitative analysis methods, the Framework Analysis Method requires considerable time and resources, and when multiple stakeholders and disciplines are involved in data analysis and interpretation, the time needed is further extended. This time must be built into the project proposal at the pre-funding stage.

There is a significant training component to using the method successfully in a new multidisciplinary team. Depending on their role in the analysis, members of the research team may need to learn how to:

Code, index, and chart the data.

Think reflectively about how their identities and experience affect the analysis process.

Understand methods of generalization (that is, analytic generalization and transferability, rather than statistical generalization) so that they can legitimately help to interpret the meaning and significance of the data.

Team Responsibilities

While the Framework Analysis Method lends itself to the involvement of non-experts in data analysis, it is critical to the method's success that an experienced qualitative researcher leads the analysis, even if the principal investigator of a large mixed-methods study is someone else. Ideally, the qualitative lead should be supported by other researchers with at least some training or prior experience in qualitative analysis.

The responsibilities of the lead qualitative researcher are:

Contribute to study design, project timelines, and resource planning; guide novice qualitative researchers.

Train academic, lay, and other (non-qualitative) team members to contribute, as appropriate, to the analysis process.

Facilitate discussion meetings in a way that fosters critical and thoughtful engagement with the data and with other team members.

Lead the writing of the study.
