Monday, April 27, 2009

Northstar Scam, Dryer Vent



  • Taylor and Bogdan, op. cit., Chap. 6.
  • Castro, R., "In search of meaning: assumptions, scope and limitations of qualitative analysis."
a) The analysis process
Data analysis is an ongoing process in qualitative research; data collection and analysis go hand in hand. Throughout participant observation, interviewing, and other forms of qualitative research, researchers keep track of emerging themes, reread their field notes, and develop concepts and propositions to begin to make sense of their data. As the study progresses, they focus their research interests, ask directive questions, check out informants' stories, and follow up on hunches. Toward the end of the investigation, the researcher concentrates on the analysis and interpretation of the data.

Some researchers prefer to distance themselves from the research before beginning the intensive analysis. However, it is advisable to begin the intensive analysis as soon as possible after the fieldwork and data collection have been completed. The longer you wait, the harder it becomes to go back to the informants to clarify any points.


All researchers develop their own ways of analyzing qualitative data. The basic approach we use to make sense of descriptive data gathered through qualitative research methods is directed toward developing a deeper understanding of the settings or people being studied. This approach has many parallels with the grounded theory method: insights are grounded in the data and developed from them. But we are more interested in understanding settings or people on their own terms than in developing concepts and theories. We accomplish this through description and theoretically informed understanding. Sociological concepts are used to illuminate features of the settings or people studied and to facilitate understanding. In addition, our approach emphasizes the analysis of "negative cases" and of the context in which the data were collected.

In qualitative research, researchers analyze and code their own data. Unlike quantitative research, there is no division of labor between data collectors and coders. Data analysis is a dynamic and creative process. Throughout the analysis, researchers try to gain an ever deeper understanding of what they have studied and continually refine their interpretations. They also draw on their direct experience with the settings, informants, and documents to arrive at the meaning of the phenomena suggested by the data.

b) Editing and coding
Data analysis involves the following steps:
1) Ongoing discovery phase: identify themes and develop concepts and propositions. Researchers gradually come to make sense of what they are studying by combining insight and intuition with familiarity with the data. One must learn to look for themes by examining the data in every possible way.

Tips:
  • Read the data repeatedly.
  • Keep track of themes, hunches, interpretations, and ideas (e.g., "observer's comments").
  • Look for emerging themes.
  • Prepare tentative lists of themes (topics of conversation, vocabulary, feelings, recurring activities, etc.).
  • Construct typologies (useful for identifying themes and developing concepts and theories).
  • Develop concepts and theoretical propositions (moving from description to interpretation and theory), including sensitizing concepts (which suggest directions for observation) and concrete concepts (drawn from informants).
  • Read the literature (engaging with theory and a priori assumptions).
  • Develop a guiding story line (it guides the analysis and integrates the main themes of the data).

2) Coding the data and refining the understanding: this occurs once the data have been collected. Coding is a systematic way of developing and refining interpretations of the data. It includes bringing together and analyzing all the data bearing on themes, ideas, concepts, interpretations, and propositions. During this stage of analysis, ideas and hunches that were initially vague are refined, expanded, developed further, or discarded altogether.

How to code qualitative data:
a) Develop coding categories: write down a list of all the themes, concepts, interpretations, typologies, and propositions identified or produced during the initial analysis. Be as specific as possible when writing down these ideas. You should have some sense of the kind of data that fit each category. Once the main coding categories have been identified, go over the list again and remove those that overlap. The number of categories adopted will depend on the amount of data collected and on the complexity of the analytical framework. Assign a number or letter code to each category so the data can be identified and related easily.
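As a purely illustrative aid (not part of the original notes), the following Python sketch shows one way such a list of coding categories might be written down and reviewed; the study topic, category names, and codes are all invented.

```python
# A hypothetical coding scheme for a study of hospital waiting rooms.
# Each code (a short letter key) maps to a category label. All names are
# invented for illustration; a real scheme emerges from the researcher's data.
coding_scheme = {
    "W": "waiting and the experience of time",
    "S": "interactions with staff",
    "F": "presence and role of family members",
    "R": "rules and their informal negotiation",
}

# Review the list so that overlapping categories can be spotted
# and merged or deleted before coding begins.
for code, category in coding_scheme.items():
    print(f"{code}: {category}")
```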

b) Code all the data: code the field notes, transcripts, documents, and other materials by writing in the margin the number or letter assigned to each category. Both negative and positive incidents related to the category in question must be coded. As the data are coded, the coding scheme is refined: categories are added, removed, expanded, and redefined. The cardinal rule is to make the codes fit the data and not the other way around.

c) Separate the data belonging to the different coding categories: this is a mechanical operation, not an interpretative one. The researcher brings together the data belonging to each category. Working by hand, cut up the field notes, transcripts, and other materials and place the data for each category in file folders or envelopes. Include enough context for each excerpt to be fully understandable on its own. Keep an intact copy of all the materials in their original form. There are software programs that handle this mechanical stage of qualitative data analysis automatically.
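The mention of software for this mechanical stage can be illustrated with a minimal Python sketch; the codes and field-note excerpts below are invented, and a real project would use dedicated qualitative-analysis software or the manual filing system described above.

```python
from collections import defaultdict

# Hypothetical coded excerpts: each tuple is (code, excerpt with enough
# context to be understandable on its own). In a manual workflow these
# would be cut from a copy of the field notes, leaving the originals intact.
coded_excerpts = [
    ("W", "Patients glanced at the clock repeatedly; one said 'we always lose the whole morning here'."),
    ("S", "The receptionist explained the delay before anyone asked, which seemed to calm the room."),
    ("W", "A man left after two hours, saying he would 'try again another day'."),
]

# Separate the data: gather all excerpts belonging to each coding category,
# the electronic equivalent of filing them in folders or envelopes.
folders = defaultdict(list)
for code, excerpt in coded_excerpts:
    folders[code].append(excerpt)

for code, excerpts in folders.items():
    print(f"Category {code}: {len(excerpts)} excerpt(s)")
```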

d) See what data are left over: after all the data have been coded and separated, review the remaining data that have not entered into the analysis. Some may fit existing categories; others may suggest new categories related to those already developed or to the guiding story line. But no study uses all the data collected. If they do not fit, do not force all the data into your analytical framework.

e) Refine your analysis: coding and separating the data make it possible to compare the different fragments related to each theme, concept, or proposition and, consequently, to refine and adjust your ideas. You will find that some themes that seemed vague and obscure become clearly illuminated. It is also likely that some concepts will not fit the data and that some propositions will lose validity; these must be discarded and better-suited ones developed. Contradictions and negative cases almost always appear in the data set. Negative cases must be analyzed in order to deepen the understanding of the people being studied.
In qualitative research there are no guidelines that determine how much data is needed to support a conclusion or interpretation; this always remains a matter of judgment.

3) Relativizing the findings: understanding the data in the context and in the way in which they were collected; this is the final phase. All data are potentially valuable if we know how to assess their credibility. Nothing is discarded; only the interpretation varies.

a) Solicited and unsolicited data: researchers who try to let people talk about what is on their minds are nevertheless not totally passive. They ask certain kinds of questions and pursue certain topics, and in doing so they solicit data that might not have emerged spontaneously. It should be noted whether people say different things when responding to our questions than when they speak on their own initiative.

b) The observer's influence on the setting: participant observers try to minimize their effects on the people they are studying until they have acquired a basic understanding of the setting. But observers almost always influence the settings they study. During the early days of fieldwork, informants may be cautious about what they say and do.

c) Who was there: just as the observer can influence what an informant says or does, so can many other people who are present.

d) Direct and indirect data: when we analyze our data, we code both direct statements and indirect data bearing on the theme, interpretation, or proposition concerned. But the more that must be inferred from the data, the less certain we can be about the validity of our interpretations and conclusions.

e) Sources: there is a danger of generalizing about a group of people on the basis of what only one or a few of them have said and done. Do not be "absorbed" by a key informant. Attention must be paid to the sources of the data on which interpretations are based.

f) Our own assumptions: the researcher begins the study with a minimum of assumptions, but our own commitments and presuppositions are unavoidable. Data never explain themselves. All researchers draw on their own theoretical assumptions and cultural knowledge to make sense of their data. The best control on researcher bias is critical self-reflection.

c) Different processing techniques according to design
There are different ways of processing the data in an investigation, depending on the methodology adopted, that is, quantitative or qualitative.
If our research uses quantitative methodologies, the most suitable data-processing approach is the application of operationalization techniques (see Unit 7) and statistics.

If instead we use a qualitative methodology, the processing techniques best suited to this type of design are the interpretation and coding used in transcribing interviews.

Coding (categorization): the operation by which the material can be classified. The assignment of codes constitutes a preliminary identification of the findings. A code is the researcher's attempt to classify a word, a phrase, or a fragment of text into specific, meaningful categories that make sense within the theoretical framework being used.

Interpretation: the researcher interpretively converts those codes into "meanings", that is, theoretically consistent explanations of what was said.
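To make the distinction concrete, here is a small, purely hypothetical Python sketch: codes classify transcript fragments (categorization), and the researcher's interpretive memos turn each code into a "meaning". The fragments, codes, and memos are invented for illustration.

```python
# Hypothetical transcript fragments already assigned a code (categorization).
coded_fragments = [
    ("distrust", '"I never tell the doctor everything, they don\'t really listen."'),
    ("distrust", '"You have to figure things out on your own."'),
    ("support", '"My sister comes with me to every appointment."'),
]

# Interpretation: an invented memo per code, i.e. a theoretically
# consistent explanation of what the coded fragments say.
interpretations = {
    "distrust": "Fragments point to an erosion of trust in institutional medicine.",
    "support": "Family networks compensate for perceived institutional neglect.",
}

for code, fragment in coded_fragments:
    print(fragment)
    print("  ->", interpretations[code])
```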

e) Strategies for interpreting the specificity of communicational material
Quantitative: content analysis. This is a methodology of the social sciences and of bibliometrics that focuses on the study of the content of communication. Content analysis starts from the principle that, by examining texts, it is possible to learn not only their meaning but also information about their mode of production. That is, the text is treated not only as a set of signs endowed with a meaning known to the sender, but as signs that say something about the sender himself or, more generally, that give clues about how the text was produced.

Content analysis is not a theory but only a set of techniques, so it is essential that the technique be put to use with a particular theory of meaning that gives sense to the mode of analysis and to the results.
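As a toy illustration of the quantitative side of content analysis, the sketch below counts how often words from two hypothetical thematic dictionaries occur in a short invented text. It only shows the counting step; as the notes stress, the categories themselves would have to be grounded in a theory of meaning.

```python
import re
from collections import Counter

# Invented thematic dictionaries (category -> indicator words).
categories = {
    "conflict": {"strike", "protest", "dispute", "clash"},
    "negotiation": {"agreement", "dialogue", "talks", "compromise"},
}

# Invented sample text standing in for a news item.
text = """The strike continued for a third day while the government
called for dialogue; union leaders said no agreement was near and
warned of further protest."""

# Count occurrences of each category's indicator words in the text.
words = re.findall(r"[a-z']+", text.lower())
word_counts = Counter(words)

for category, indicators in categories.items():
    total = sum(word_counts[w] for w in indicators)
    print(f"{category}: {total} occurrence(s)")
```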

Discourse analysis emerged as an evolution of content analysis. Content analysis tends to be seen as using quantitative techniques and discourse analysis as using qualitative techniques.

Qualitative: discourse analysis. This is a trans-disciplinary field of the human and social sciences that systematically studies spoken and written discourse as a form of language use, as a communicative event, and as interaction in its cognitive, social, political, historical, and cultural contexts.

Discourse analysis (DA) emerged in the 1960s and 1970s in several disciplines and in several countries simultaneously: anthropology, linguistics, philosophy, poetics, sociology, cognitive and social psychology, history, and communication science. The development of DA paralleled the emergence of related transdisciplinary fields such as semiotics (or semiology), pragmatics, sociolinguistics, psycholinguistics, socioepistemology, and the ethnography of communication. In recent years, DA has become very important as a qualitative approach in the humanities and social sciences.

The methods of DA are qualitative: detailed description of the structures and strategies of written or spoken discourse at various levels: sound, visual, and multimedia structures; syntax (the formal structure of sentences); semantics (the structures of meaning and reference); pragmatics (speech acts, politeness, etc.); interaction and conversation; the mental processes and representations involved in the production and comprehension of discourse; and the relations of all these structures with their social, political, historical, and cultural contexts.
In this sense DA differs from content analysis, which is a rather quantitative social-science method applied to large quantities of text, for example through the coding of observable properties of texts.

f) Discourse analysis as an interpretive activity
The full discourse is recorded, then transcribed completely and literally, and finally analyzed and interpreted by the research team. The discourse can be analyzed at two levels: an empirical level, that of the group it represents, and a theoretical level, that of the discourse which speaks about the first level and which makes it possible to interpret or analyze it.

g) Methodological Triangulation
Triangulation is a strategy that combines the application of quantitative and qualitative methodologies and accounts for the possibility of the coexistence of paradigms in the practice of sociological research.

It is defined as the combination of methodologies in the study of the same phenomenon. It is a plan of action that allows sociology to overcome the biases inherent in any single methodology. Multiple triangulation occurs when researchers combine, within a single investigation, varied observations, theoretical perspectives, data sources, and methodologies. However, this strategy does not by itself guarantee that these problems will be overcome, because it is not enough to use several approaches in parallel; what matters is achieving their integration.

Triangulation is also called "methodological convergence" and rests on the assumption that quantitative and qualitative methods should be regarded not as rivals but as complementary fields. Its effectiveness is based on the premise that the weaknesses of each individual method will be compensated by the strengths of the others.

The fundamental strategy of the multi-method approach is to attack the research problem with an arsenal of methods whose weaknesses do not overlap and whose complementary advantages add up. Each method provides information that is different from that provided by the others and that is also essential for interpreting the others. Quantitative methods account for the regularities of social action and provide information about distributions. Qualitative research sheds light on the concrete social processes through which individuals create the rules that govern social action. There is no fundamental opposition, and every form of data is used both for verification and for the generation of theory.

There are four basic types of triangulation:
• Data triangulation: of time, space, and persons.
• Triangulation of researchers.
• Theoretical triangulation: involves the use of multiple theoretical perspectives in relation to the same object or set of objects.
• Methodological triangulation, which can be: a) intra-methodological, when the same method, or different strategies belonging to it, is applied on different occasions; b) inter-methodological, when different methods, in a relation of mutual complementarity, are explicitly applied to the same objects, phenomena, or situations.
