Abstract

This paper draws on the experiences of two researchers and discusses how they conducted a secondary data analysis using classic grounded theory. The aim of the primary study was to explore first-time parents’ postnatal educational needs. A subset of the data from the primary study (eight transcripts from interviews with fathers) was used for the secondary data analysis. The objectives of the secondary data analysis were to identify the challenges of using classic grounded theory with secondary data and to explore whether the re-analysis of primary data using a different methodology would yield a different outcome. Through the process of re-analysis a tentative theory emerged on ‘developing competency as a father’. Challenges encountered during this re-analysis included the small dataset, the pre-framed data, and limited ability for theoretical sampling. This re-analysis proved to be a very useful learning tool for the first author (LA), who was a novice with classic grounded theory.

Introduction

The concept of secondary data analysis appears to have first entered the literature nearly 50 years ago, when Glaser discussed the potential of re-analysing data ‘which were originally collected for other purposes’ (1963, p. 11). Despite this 50-year history, there remains a paucity of literature that specifically addresses the processes and challenges of applying secondary data analysis to primary qualitative data and explores the implications and outcomes of using a different methodology. This paper draws on the experiences of two researchers who attempted to use a classic grounded theory approach to analyse previously collected primary qualitative data.
Prior to discussing the approach to secondary data analysis used for this study, the differences between primary data, secondary data, primary and secondary data analysis, and metasynthesis are briefly outlined. Primary data originates from a study in which a researcher collects information him/herself to answer a particular research question. Secondary data, on the other hand, is data that already exists (Glaser, 1963). Consequently, the secondary data analyst is not involved in the recruitment of participants or in the collection of the data. Heaton (2004) defines secondary data analysis as ‘a research strategy which makes use of pre-existing quantitative data or pre-existing qualitative data for the purposes of investigating new questions or verifying previous studies’ (p. 16). In other words, secondary data analysis is the use of previously collected data for some other purpose. It is not itself a method of data analysis; therefore, methods such as grounded theory or statistical analysis can be applied within a secondary data analysis. Metasynthesis, on the other hand, differs from secondary data analysis in that it analyses qualitative findings from a group of studies and does not re-use the primary dataset, e.g. interviews, diaries, photographs, stories and field notes. Rather, it is ‘the aggregating of a group of studies for the purpose of discovering the essential elements and translating the results into the end product that transforms the original results into a new conceptualisation’ (Schreiber, Crooks & Stern, 1997, p. 314).
A review of the literature highlights a number of reasons for conducting a secondary data analysis including: applying a new research question (Heaton, 2004); using old data to generate new ideas (Fielding, 2004); ‘verification, refutation and refinement of existing research’ (Heaton, 2004, p. 9), and exploring data from a different perspective (Hinds, Vogel & Clarke-Steffen, 1997). Despite the fact that secondary data analysis has been in use as a research tool for quite some time it has, in the main, been applied to primary quantitative data (Brewer, 2006), and its use with qualitative data is relatively new (Heaton, 1998). Qualitative secondary data analysis has its supporters and its sceptics, and one reason why so few researchers use this approach is because they feel there may be something ethically, practically or epistemologically problematic about re-using qualitative data (Mason, 2007). The most common reason why researchers conduct a secondary data analysis, according to Fielding (2004), is in order to re-analyse the data from a new perspective with a view to gaining new insights. Most instances of qualitative secondary data analysis tend to be those where the primary researcher re-analyses his/her original work (Parry & Mauthner, 2005; Gladstone, Volpe & Boydell, 2007).

Secondary data analysis: benefits

Recent years have witnessed an increase in the number of databases where original qualitative data can be deposited and accessed for secondary analysis. Examples include the Irish Qualitative Data Archive (IQDA), established in 2011 (http://www.iqda.ie/content/welcome-iqda), and, in the UK, the Qualitative Data Archival Resource Centre (ESDS Qualidata) (http://www.esds.ac.uk/qualidata/about/introduction.asp). It is also becoming increasingly common for funders to request researchers, as a condition of funding, to deposit their data in a relevant database (Bishop, 2007). The development of these databases will no doubt lead to an increase in the number of qualitative secondary data analysis studies in the future.
A review of the literature suggests that there are a number of advantages to secondary data analysis. Heaton (2004) points out that secondary data analysis is an effective means of analysing data when there is difficulty accessing a hard-to-reach sample, and when dealing with particularly sensitive issues, small populations and rare phenomena. Another benefit includes enhancing quality control by verifying original research, thus adding to the transparency, trustworthiness and credibility of the original findings. Others take a more pragmatic view and consider the re-use of existing data an efficient way of conducting research, as it eliminates the need to spend time recruiting and gaining access to participants (Corti, 2008; Trochim, 2006) and minimises the time and financial expense associated with data collection (Corti, 2008), e.g. recording device, transport and transcription costs. A final and important benefit of secondary data analysis is that it is recognised as a valuable teaching and learning tool for novice researchers (Glaser, 1963). Re-analysing existing data enables students to engage in experiential learning about a substantive issue and/or a particular methodology and, in so doing, protects potential research participants while students are learning how to carry out research in a safe way (Brewer, 2006). Despite all the positives, secondary data analysis has its critics. A number of writers highlight the drawbacks of re-analysing interview data including a loss of control over data collection (Brewer, 2006; Szabo & Strang, 1997), lack of knowledge and information around the interview experience, and the inability to raise questions and probe about emerging themes in subsequent interviews (Bishop, 2007; Szabo & Strang, 1997).

Objectives of secondary data analysis in this study

The objectives of the secondary data analysis in this study were threefold. Firstly, to identify the challenges of using classic grounded theory with secondary data, as not all primary data may be amenable to secondary data analysis (Heaton, 1998); secondly, to explore the potential of secondary data analysis as a teaching and learning tool for the principles and procedures of classic grounded theory; and thirdly to explore whether the re-analysis of primary data using a different methodology would yield a different result.

Methodology for this study

The methodology that informed this secondary data analysis study drew on Glaser’s writing in the area of classic grounded theory (Glaser, 1978, 1998, 2001, 2003, 2005). The grounded theory method offers a rigorous, orderly guide for theory development. Although structured and systematic, it is designed to allow the researcher to be free of the structure of more forced methodologies. Its real strength lies in its open-ended approach to discovery. The four techniques that lie at the heart of the classic grounded theory method are: coding (open and theoretical), constant comparative analysis, theoretical sampling and theoretical saturation. These techniques are used to guide the analytical process towards the development and refinement of a theory that is grounded in data.
However, unlike qualitative research that focuses on producing ‘thick descriptions’ of data, the grounded theorist focuses on organising the ideas that emerge from the data, conceptually transcending the data to develop ideas at a level of generality higher in conceptual abstraction than the material being analysed (Glaser, 2001). Classic grounded theory was chosen to conduct this secondary data analysis in order to facilitate the first author’s (LA) need to learn the principles and procedures of classic grounded theory while actually conducting the secondary data analysis, as she was about to commence a larger, classic grounded theory study.

Brief description of primary dataset

The aim of the primary study was to explore first-time parents’ perceptions of their educational needs in the postnatal period (Andrews, 2000). Ten women and eight men were recruited during the women’s postnatal stay in hospital. All participants were interviewed separately three weeks after the birth of their baby. Data was collected using a semi-structured interview schedule based on a review of the literature. Interviews were audio-recorded and transcribed for analysis. The study was informed by the writings of Strauss and Corbin (1998) and their approach to grounded theory. Data was analysed using the constant comparative method, where eight categories were developed: four for the mothers’ data and four for the fathers’ data. For the secondary data analysis which is the focus of this paper, a subset of the data from the primary study, which included eight detailed interviews with fathers, was analysed.

Status of the authors in relation to the primary dataset

The first author (LA) collected and analysed the original data as part requirement for an academic award. The second author (AH) is an experienced researcher who has used and taught classic grounded theory methods.

Ethical issues

Like all research studies, secondary data analysis requires attention to ethical concerns. Writers in the area of secondary data analysis highlight issues such as copyright, informed consent, confidentiality and ownership of data (Parry & Mauthner, 2005; Heaton, 2004; Cobban, Edgington, & Pimlott, 2008). Parry and Mauthner (2005) view qualitative data as a joint venture between participants and researcher and, as a consequence, both parties should retain ownership rights over the data. In the context of this study, it was not possible to return to the participants of the primary study for further consent as the data had been collected 10 years earlier, and in keeping with the Data Protection Act (1988; Data Protection (Amendment) Act 2003) of that time and the original informed consent, the participants’ contact details and tape recordings had been destroyed. In keeping with the conditions of the Data Protection Act, all identifiable material was destroyed five years after the study commenced. Ethical approval for the secondary data analysis was received from the University Faculty of Health Sciences’ ethics committee and it was given on the basis that the original transcripts were anonymised and there was no possibility of tracing the participants.
In order to ensure confidentiality, LA, who completed the original study, revisited each transcript to check that all were anonymised. In addition, a new pseudonym was allocated to each participant before the other researcher was given access.
Giving permission to other researchers to view one’s own data can be a daunting and challenging experience, as it has the potential to expose the original researcher to criticism or academic inquiry. As part of the ethical process, the second researcher (AH) agreed to work in a respectful and supportive manner with the primary data collector and to use the opportunity as a learning process for both.

Benefits of having original researcher on secondary data analysis team

It is widely acknowledged that the re-use of qualitative data is maximised when extensive context is provided about the primary study (Berg, 2006; Fielding, 2004; Heaton, 2004; Van den Berg, 2005). Fielding (2004) notes that context and its relationship to the data is a practical rather than an epistemological or a theoretical issue. Therefore, secondary data analysts need to be given as much information as possible about the primary study so that they are familiar with the research and social context of the original study (Fielding, 2004; Heaton, 2004). Silva (2007) also emphasises the importance of knowing the context of the fieldwork practices. Without this knowledge, there is the potential to de-contextualise the data (Moore, 2007; Van den Berg, 2005).
One of the advantages of having the primary researcher involved in the secondary data analysis was that, within this study, she was in a position to provide information on research context including: the aim of the primary study, the methodology used, how and where participants were recruited, data collection methods and how these were recorded, why certain decisions were made, why certain questioning pathways were followed or not followed, as the case may be, what method of data analysis was used, and problems encountered. In addition, information on the social context of the study was provided, for example, where the study took place, when and where data was collected and the researcher’s and the participants’ backgrounds (Van den Berg, 2005). While this information was interesting, it was not essential in the context of a secondary analysis using classic grounded theory.

Objective 1: Identifying the challenges of using Classic Grounded Theory with secondary data

Using classic grounded theory on secondary data raised a number of issues for both researchers in relation to grounded theory, including issues around coding for the main concern, theoretical sampling, theoretical saturation and theoretical coding.

Coding for the main concern

The focus of classic grounded theory is on identifying the participants’ main concern and how they resolve that concern. In this way, the research problem emerges from the participants, as opposed to it being predefined by the researcher (Glaser, 1992). In order to identify the participants’ main concern and the process by which they resolve their concern, the researchers independently used the constant comparative method to code and analyse the transcripts and were guided by the following questions: What is this a study of? What categories does this incident indicate? What property of what category does this incident indicate? (Glaser, 1998, p. 123). This model of asking questions, comparing incident with incident, code with code and later category with category, resulted in the emergence of a main concern and the development of preliminary concepts and categories.
In contrast with the classic grounded theory approach to interviewing, which is characterised by ‘instilling a spill’ (Glaser, 1998, p. 111), the original primary data was collected using a semi-structured interview schedule. This posed a challenge in the re-analysis, as the participants’ responses were pre-framed within the original research question which was: What are first-time parents’ perceptions of their postnatal educational needs? In addition, the range and depth of participants’ responses was also limited by the use of an interview schedule and the researchers did not have access to the original field notes and memos. Consequently, it took a lot of reading, coding and recoding before the participants’ main concern became apparent. Indeed, the authors would strongly agree that, in the context of secondary data analysis and grounded theory methodology, ‘a large collection of recorded and transcribed in-depth interviews with detailed field notes may [have] offer[ed] greater potential for re-analysis than a more focused self limited set of semi-structured interviews’ (Corti, 2008, para. 3).

Theoretical sampling

Theoretical sampling is a form of non-probability sampling and is considered to be a defining property of grounded theory. Glaser (1998, p. 157) suggested that theoretical sampling is both directed by the emerging theory and further directs its emergence, and ‘is the conscious, grounded deductive aspect of the inductive coding, collecting and analysing’. The basic question in theoretical sampling is where to go next in data collection in order to develop the theory. Glaser (1998) believed that participants, events, sites or other sources of data (for example, documentation) are selected on the basis of theoretical purpose and relevance as opposed to structural circumstances. Within the secondary data analysis experience, although it was possible to move back and forth between the transcripts and to theoretically sample for emerging ideas and concepts, it was not possible to sample new participants, events or other sources of data to inform the emerging categories and their properties. Therefore, in secondary data analysis, theory development is limited to the data at hand, as concepts and questions that arise cannot be pursued in subsequent interviews (Bishop, 2007). However, researchers do have the option to saturate their theory by collecting new primary data, if they so wish.

Theoretical saturation

Within classic grounded theory there is no set sample size, nor are limits set on the number of participants or data sources, just sampling for saturation and completeness, which results in an ideational sample as opposed to a representative sample (Glaser, 1998). The criterion used, therefore, to guide the researcher on when to stop sampling is theoretical saturation. In the context of this secondary data analysis study, the limitations around theoretical sampling also restricted the researchers’ ability to achieve theoretical saturation. Although the main concern was conceptualised and some emerging categories and properties were identified, it was not possible to arrive at the stage where one could be confident that no additional data could be found to develop properties of a category (Glaser & Strauss, 1967). There is no doubt, however, that had the dataset been larger or had the researchers had the opportunity to return to the field, theoretical saturation would have been reached.

Theoretical coding

Theoretical codes are abstract models for the synthesis and integration of emerging categories (Glaser, 2005). Like everything else in grounded theory, a theoretical code must emerge from the data as opposed to being forced onto the data. Although some theoretical codes were beginning to emerge as possibilities for integrating the theory, theoretical codes which would create links between all the categories were not identified, due to the limitations of the size of the dataset and the inability to return to the field.

Objective 2: To explore the potential of secondary data analysis as an effective teaching and learning tool for classic grounded theory

As highlighted earlier, as far back as 1963 Glaser recognised secondary data analysis as a valuable teaching and learning tool (Glaser, 1963). Although it omits some important steps in the research process such as negotiating access, sampling and data collection (Szabo & Strang, 1997), the valuable aspect of secondary data analysis as an experiential learning exercise held true within this project. The application of a different methodology and the process of secondary data analysis also created a greater understanding of the differences and similarities between the Strauss and Corbin (1990, 1998) and the classic grounded theory (Glaser, 1978, 1992, 1998) approaches to grounded theory. This was important, as LA was about to embark on a study using classic grounded theory for the first time and wanted to avoid the potential pitfall of ‘blurring the methods’ (Cutcliffe & McKenna, 1999).
While conducting the secondary data analysis, LA learned a great deal about the procedures and principles of classic grounded theory and how this approach differed from the Strauss and Corbin approach to grounded theory which was used in the primary study. It is not the purpose of this paper to expand on the debate regarding the differences between the classic grounded theory and the Strauss and Corbin approach, what Glaser (1998, p. 35) calls ‘rhetorical wrestle’, as these have been well documented elsewhere (Cooney, 2010; Kelle, 2007; Walker & Myrick, 2006; McCallin, 2003; Annells, 1997a; Annells, 1997b; Glaser, 1992) but rather to discuss what has been learned from the experience of applying a different methodology to a primary dataset. Heath and Cowley (2004) state that ‘it is methodological rather than ontological and epistemological aspects that have been cited as the main source of divergence’ (p. 142).
As Walker and Myrick (2006) note, the crux of the differences lies in the ‘interventions and activities in which the researcher engages with the data’ (p. 549). It was interventions and activities such as the semi-structured nature of data collection, coding in a conditional matrix, and forcing versus emergence of theory which were the main differences found between the two approaches during this secondary data analysis. The primary study applied Strauss and Corbin’s (1998) approach, which was found to be a suitable method for the novice researcher at that time, as it provided structure. In contrast, the classic grounded theory method is less structured and requires more patience (Walker & Myrick, 2006), and this held true when coding for the main concern and theoretically sampling for concepts in the secondary data analysis.
Although the secondary data analysis did yield a tentative, albeit unsaturated theory, most of the problems arose when the classic grounded theory approach was applied to a subset of the primary dataset, as it was evident that the procedures of data collection and analysis differed greatly from the Strauss and Corbin (1990, 1998) approach that had been applied initially. One reason for the difficulty in searching for a new perspective was that the primary research began with a specific question, namely, ‘What are first-time parents’ postnatal educational needs?’ In contrast, classic grounded theory does not begin with a hypothesis or a preconceived theoretical framework, it begins with an area of interest and data collection proceeds from this (Glaser, 1998).

In the secondary data analysis the general area of interest was: What is the main concern of men when they become a father, and how do they resolve that concern? Glaser (1992) states that the logic of grounded theory is to ask two questions when examining the data, and this was adhered to throughout the secondary data analysis. The questions are: 1) ‘What is the chief concern or problem of the people in the substantive area, and what accounts for most of the variation in processing the problem?’ 2) ‘What category or what property of what category does this incident indicate?’ (p. 4).

This pattern of questioning is not used in the Strauss and Corbin (1990, 1998) approach, and as the primary data analysis used a preconceived theoretical framework to guide data collection and data analysis, it was incongruent with the classic grounded theory approach. Glaser (1992, p. 4) remarks that, in grounded theory, true emergence is interrupted by the asking of several pre-conceived questions, which takes the analyst somewhere different from what might be really going on and, in doing so, leads to the outcome being a preconceived conceptual description. The primary study, although valuable in itself, did result in a conceptual description of mothers’ and fathers’ postnatal educational needs. The secondary data analysis led, to a small extent, to the discovery of an underdeveloped theory but, as Glaser (1992) points out, the use of a preconceived set of questions was not flexible enough to facilitate true emergence, and although ‘this can be significant in its own right, it is not emergent grounded theory’ (Glaser, 1992, p. 4).

The application of a more open perspective using the classic grounded theory approach was restricted by the semi-structured interviewing technique used for initial data collection, which focused on a pre-framed set of questions based on a review of the literature. Problems arose during the secondary data analysis when certain concepts were being theoretically sampled for to fill in the emerging theory.
Although rich descriptions were evident in the data, and the questions asked were answered by the participants, there were some theoretical concepts emerging which were unrelated to the questioning framework and these were left relatively unexplored. That is to say that there were some areas which could have been explored in greater depth, for example, men’s experiences of becoming a father. One explanation for this is that during the primary data analysis LA was not focused on this particular theme at the time and simply wanted answers to the questions about fathers’ postnatal educational needs, which was the preconceived question from the outset. This may have restricted the flexibility and creativity which Glaser (1998) talks about, and inhibited true emergence of theory. What has now been realised through conducting this secondary data analysis is that a set of pre-framed questions is very restrictive and does force the outcome, as opposed to allowing the data to speak for itself which could have resulted in true emergence and, possibly, a different outcome.
In addition, by engaging in the process of secondary data analysis, LA was enabled to improve on interview technique, and to identify strategies for engaging in more open-style interviews. She also learned more about strategies to be employed when conducting classic grounded theory interviews. One of these was the tactic of starting the interview with a very open question, for example, ‘Tell me about your experiences of becoming a father’. Another was the ‘instilling a spill’ technique (Glaser, 1998, p. 111) which is useful if interviews become stagnant or wander off the beaten track, for example, ‘It’s not easy caring for a new baby’. Executing grounded theory is undoubtedly a skill that needs to be learned, and although certain elements of this were acquired during the primary study they required further development, in particular, the practice of remaining open and moving from the concrete to the abstract to allow for creativity (Glaser, 1998).
Conceptualisation of the data through coding is the foundation of grounded theory. Open coding was not problematic, as it had been applied in the primary study; however, this secondary data analysis led to a deeper understanding of the differences between using a predefined theoretical code and allowing the theoretical code to emerge. The Strauss and Corbin (1990, 1998) approach facilitated data analysis by fitting the emergent codes neatly into a coding matrix or paradigm, and this facilitated a more structured approach to the primary data analysis. ‘Axial coding is a set of procedures where data are put back together in new ways after open coding, which includes a coding paradigm that involves conditions, action/interactional strategies and consequences’ (Strauss & Corbin, 1990, p. 96). In contrast, the classic grounded theory method allowed true emergence of the theory and the theoretical code. To achieve this, LA was required to resist imposing order on the data and instead look for patterns of behaviour in the data and wait for the theoretical code to emerge.
Glaser (1992, p. 22) argues that Strauss’s approach facilitates ‘forcing data’, and this held true in the primary study where data was neatly compartmentalized into categories which emerged from a preconceived framework. Glaser (1992) states that ‘once this form of forced coding starts, the grounded theory is usually lost, because the analyst is led far away from relevance’ (p. 47). Although the classic grounded theory approach is less structured, it is a more flexible and far less prescriptive approach and is very useful when there is little known on an area, and where the goal is to discover the theory implicit in the data.
Another learning outcome was the difference between theoretical conceptualisation and conceptual description. As Glaser (1998) points out, abstract conceptualisations are tied to the substantive area of enquiry and not to people or time, whereas the Strauss and Corbin (1990, 1998) approach focuses on context, causal conditions, action/interactional strategies and consequences. Conducting a secondary data analysis also highlighted the value of memoing. Memos were employed in the primary data analysis; however, they were not available for the secondary data analysis, leading to some omissions as to the train of thought and why some avenues were left relatively unexplored. This reinforced the importance of memoing when conducting a grounded theory study. However, new memos were written during the process of secondary data analysis and these proved essential in the development and write-up of the tentative theory outlined below.
A great deal of knowledge has been gleaned from this experiential learning exercise, as LA was in the privileged position of being able to learn the principles and procedures of classic grounded theory while having access to advice and support from experienced grounded theorists. Conducting a secondary data analysis has been a very useful exercise in learning the method to take forward into a new, classic grounded theory study so that it is clear from the outset how this method should proceed without any confusion regarding the procedures and principles involved.

Objective 3: To explore whether the re-analysis of primary data using a different approach would yield a different result

The personal experience of revisiting a primary dataset that had been gathered years earlier, when LA was a complete research novice, was challenging on several fronts. The idea of examining one’s primary data with an open perspective to see if new ideas would emerge was exciting; however, when scrutinising the data it soon became evident that the dataset had certain limitations. Challenges were encountered in several areas when the classic grounded theory method was applied, for example, coding for a main concern, theoretical sampling, theoretical coding and theoretical saturation, which have been explained previously. Despite these challenges and the limitations imposed by the primary dataset, this secondary data analysis went some way towards developing a tentative preliminary theory. This is in line with Heaton’s (2004) comment that not all data are amenable to secondary data analysis.

Findings from primary data analysis

In order to facilitate a comparison between the primary and secondary data analysis outcomes, a brief overview of the primary study is provided. The aim of the primary study was to explore the postnatal educational needs of first-time parents. The primary study involved analysis of semi-structured interviews which were conducted with mothers (n=10) and fathers (n=8), three weeks after the birth of their baby. One overarching core category was generated which was conceptualised as ‘learning to be a parent – it’s not until it happens’. Data from mothers and fathers were analysed separately and eight sub-categories emerged. The four categories that emerged from the fathers’ dataset were: it’s a complete change (transition to fatherhood), orientated towards the mother (antenatal education classes), the system isn’t there to be involved (lack of involvement in postnatal care) and just to be there (taking time off after the birth of their baby). The four categories that emerged from the mothers’ dataset were: it’s a shock (transition to motherhood), I couldn’t visualise that at all (learning about postnatal issues during pregnancy), you have to experience it for yourself (postnatal educational needs) and you need support (the early postnatal period) (Figure 1).

Figure 1: Primary data analysis: Tentative theory and categories developed on ‘Learning to be a parent’.

Although becoming a father was a ‘complete change’, men described it as a smooth and gradual transition. However, men found that the content and focus of antenatal education classes was predominantly ‘orientated towards the mother’. During their partner’s postnatal hospital stay, they were of the view that midwives did not involve them in the sharing of knowledge and skills in preparation for life with a new baby. Thus they considered that ‘the system isn’t there to be involved’. ‘Just to be there’ refers to the time that men took off work after their partner and baby came home from hospital, to support their partner and to get to know their baby. Although it was not the focus of the primary data analysis, concepts were emerging on men’s experiences of becoming a father; however, owing to the pre-framed interview schedule and the academic timeframe constraints at that time, data saturation was not achieved in this category.

Findings from secondary data analysis

In contrast to Strauss and Corbin (1990, 1998), when using classic grounded theory one does not start with a preconceived question or agenda; rather, one has a substantive area of interest or a hunch in mind. The substantive area of interest used to re-analyse this data was men’s experiences of becoming a father.
Using Glaser’s principles, the fathers’ main concern was conceptualised as ‘developing competency as a father’ (see Figure 2). The processes that men engaged in to develop competence or resolve their concern were coded as selecting information for action, sourcing information to fill in the gaps, experiencing hands-on care, balancing competing demands and working it out by doing. These processes resulted in an outcome of gradual adjustment to fatherhood and developing competency. This group of fathers displayed readiness in becoming a father in that they were ready emotionally, economically, socially and pragmatically for their new role. However, they felt they lacked the necessary knowledge and skills to care for a new baby. When it came to men’s involvement in maternity care, this group of men felt they were on the periphery, as their postnatal educational needs were not met by maternity care staff at that time.

Figure 2: Tentative theory from secondary data analysis: Developing competency as a father

Comparison of primary and secondary data analysis outcomes

When the primary and secondary data analysis findings are compared, there are some similarities and also some notable differences. The similarities include the fathers’ sense of not being involved by midwives, their lack of access to knowledge and skills, and their view that adapting to fatherhood, although a change, was a gradual one. The notable differences produced by the classic grounded theory approach include the move away from mere description of the data, the clear identification of a main concern, and the conceptualisation of five processes used by fathers to resolve that concern. These differences can be explained in two ways. Firstly, the two analyses examined the data differently: in the primary study a specific pre-framed research question, focused on postnatal educational needs, was applied, whereas the secondary data analysis had no preconceived framework and used a more open analytical approach, allowing ideas to emerge from the data. Secondly, the classic approach placed a greater emphasis, during data analysis, on transcending and conceptualising as opposed to describing, and the passage of time facilitated a more objective approach to analysing the data.
Strauss (1987) recommends the use of integrative diagrams as a way of integrating threads of the emergent theory and as a means of explaining ideas to others. However, Glaser (1998) is of the view that diagrams oversimplify the theory and may result in people not reading the intricacies of the theory developed. As a diagram had proved, in the primary analysis, to be a useful tool in helping to visualise relationships between categories (see Figure 1), it was decided to produce a diagram for the secondary analysis (Figure 2). What is clear from both diagrams is that neither is sufficient to explain the outcome; interestingly, however, the diagram produced from the secondary data analysis does give a greater feel for a core concern and for how the various categories identified connect with that core concern.

Conclusion

Secondary data analysis is a research approach used to examine previously collected data. Several challenges were encountered when the classic grounded theory method was used for this secondary data analysis. One drawback, which affected coding for the main concern, theoretical coding and theoretical saturation, was the small dataset available for this re-analysis: only the fathers’ data from the primary study were re-analysed.
In hindsight, following Glaser’s (2001, p. 145) dictum that ‘all is data’, it may have been valuable to re-analyse the primary data from the mothers as well, as this data may have added to and completed the emerging theory. The pre-framed interview schedule, which was used from the outset to guide data collection in the primary study, also limited the secondary analysis. One principle of classic grounded theory is theoretical sampling for ideas and concepts, and a major drawback of secondary data analysis is that one cannot go back to the participants and probe for further responses to fill gaps in the emerging theory. However, researchers can, if they wish, recruit more participants and theoretically sample emerging concepts so that theoretical saturation can be achieved.
The second objective for conducting the secondary data analysis was for LA to learn more about the classic grounded theory method and to find out how it differed from the Strauss and Corbin (1990, 1998) approach, so that there would not be any blurring of methods when embarking on a new classic grounded theory study. This objective was achieved by working closely with AH, by practising how to think and code conceptually, how to focus on the latent behaviours of the participants, and by learning how to theoretically sample for ideas within transcripts. In addition, LA learned how to improve her interview technique and identified strategies that may be applied to a more open style of interviewing.
The third objective of this study was to see whether the application of a different methodology would yield a different result. The secondary data analysis did result in a slightly different outcome, for two reasons: firstly, classic grounded theory allowed a broader, more open perspective to be applied to this data, facilitating the true emergence of a tentative theory; secondly, there was a greater emphasis on identifying the participants’ main concern and on conceptualising the data as opposed to describing it. The greatest benefit of this exercise was learning by doing. As Glaser (1963) outlined almost 50 years ago, secondary data analysis is an effective teaching tool for learning the method, and this was borne out by conducting this secondary data analysis.
