Coding reliability [4] [2] approaches have the longest history and are often little different from qualitative content analysis. As the name suggests, they prioritise the measurement of coding reliability through the use of structured and fixed code books, the use of multiple coders who work independently to apply the code book to the data, the measurement of inter-rater reliability or inter ...
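The inter-rater reliability mentioned above is commonly quantified with Cohen's kappa, which corrects raw agreement for agreement expected by chance. Below is a minimal sketch; the code labels, coder data, and function name are hypothetical illustrations, not part of any particular code book.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters applying the same code book."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items both raters coded identically.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal code frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(
        (freq_a[code] / n) * (freq_b[code] / n)
        for code in set(rater_a) | set(rater_b)
    )
    return (p_observed - p_expected) / (1 - p_expected)

# Two coders independently coding ten interview excerpts (hypothetical data).
coder_1 = ["coping", "support", "coping", "stigma", "support",
           "coping", "stigma", "support", "coping", "stigma"]
coder_2 = ["coping", "support", "support", "stigma", "support",
           "coping", "stigma", "coping", "coping", "stigma"]

print(round(cohens_kappa(coder_1, coder_2), 3))  # prints 0.697
```

A kappa near 0.7 is usually read as substantial agreement; values near 0 mean the coders agree no more often than chance.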
Grounded theory can be described as a research approach for the collection and analysis of qualitative data, for the purpose of generating explanatory theory that helps to understand various social and psychological phenomena. Its focus is to develop a theory from continuous comparative analysis of data collected by theoretical sampling. [4]
Content analysis is an important building block in the conceptual analysis of qualitative data. It is frequently used in sociology. For example, content analysis has been applied to research on such diverse aspects of human life as changes in perceptions of race over time, [35] the lifestyles of contractors, [36] and even reviews of ...
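In its simplest quantitative form, content analysis reduces to counting how often terms appear across a corpus. The sketch below assumes a toy two-document corpus; the documents are invented for illustration.

```python
from collections import Counter
import re

# Hypothetical corpus: two short excerpts about contractors' lifestyles.
documents = [
    "The contractor described long hours and unstable income.",
    "Income worries dominated the contractor's account of daily life.",
]

# Tally word frequencies across the whole corpus.
words = Counter()
for doc in documents:
    words.update(re.findall(r"[a-z']+", doc.lower()))

print(words.most_common(2))
```

Real content analysis would map such counts onto researcher-defined categories (e.g. "financial insecurity") rather than raw words, but the tallying step is the same.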
Imagine conducting in-depth interviews with cancer survivors: qualitative researchers may use data saturation to determine the appropriate sample size. If, over a number of interviews, no fresh themes or insights show up, saturation has been reached and more interviews might not add much to our knowledge of the survivors' experience.
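One way to operationalise this stopping rule is to track the themes assigned after each interview and declare saturation once no new theme has appeared for some number of consecutive interviews. A minimal sketch, with hypothetical theme labels and an assumed window of three:

```python
def saturation_point(interviews, window=3):
    """Return the 1-based interview index at which saturation is declared
    (no new themes for `window` consecutive interviews), or None."""
    seen = set()
    since_new = 0
    for i, themes in enumerate(interviews, start=1):
        new = set(themes) - seen
        seen |= set(themes)
        since_new = 0 if new else since_new + 1
        if since_new >= window:
            return i
    return None

# Hypothetical themes coded after each cancer-survivor interview.
interviews = [
    {"fear of recurrence", "family support"},
    {"family support", "financial strain"},
    {"fear of recurrence"},   # no new themes (1)
    {"financial strain"},     # no new themes (2)
    {"family support"},       # no new themes (3) -> saturation declared
    {"fear of recurrence"},
]

print(saturation_point(interviews))  # prints 5
```

The window size is a researcher's judgment call; in practice saturation is argued qualitatively, with counts like these serving only as supporting evidence.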
Grounded theory combines traditions in positivist philosophy, general sociology, and, particularly, the symbolic interactionist branch of sociology. According to Ralph, Birks and Chapman, [9] grounded theory is "methodologically dynamic" [7] in the sense that, rather than being a complete methodology, grounded theory provides a means of constructing methods to better understand situations ...
Concerns include increasingly deterministic and rigid processes; privileging of coding and retrieval methods; reification of data; increased pressure on researchers to focus on volume and breadth rather than on depth and meaning; time and energy spent learning to use computer packages; increased commercialism; and distraction from the real ...
Tukey defined data analysis in 1961 as: "Procedures for analyzing data, techniques for interpreting the results of such procedures, ways of planning the gathering of data to make its analysis easier, more precise or more accurate, and all the machinery and results of (mathematical) statistics which apply to analyzing data." [3]
Data mining is a particular data analysis technique that focuses on statistical modeling and knowledge discovery for predictive rather than purely descriptive purposes, while business intelligence covers data analysis that relies heavily on aggregation, focusing mainly on business information. [4]