MSALT: For Past Participants
description
The MSALT study was originally designed to research the impact of transitions on adolescent development in the areas of academic and role-related behaviors, choices, and outcomes. However, as time has passed, we have broadened the scope of the study to incorporate transitions into junior high, high school, and either college or the workplace.
This study has been ongoing for 15 years. We have collected information from various sources, including the surveys given to you (our participants), your parents, and your teachers. We have also collected record data (data taken directly from school records, such as grades, attendance rate, and courses taken), and conducted phone interviews and intensive face-to-face interviews.
surveys
Most of our information has been collected through written surveys. Surveys given to the participants have asked about achievement, mental health, activities, values, perceptions of environments, and attachments. Parents also received surveys that were designed to parallel the participants' surveys. We also asked teachers in the 6th and 7th grades to rate the participants' academic motivation, social competence, and any physical or emotional impairments.
There are several different aspects to survey research. First of all, there's survey development. This is handled mostly by the upper-level graduate students and faculty members. It consists of collecting questions to ask on the survey, arranging them into a coherent format, and testing and re-testing the format (usually with willing staff members and/or their children who are around the age of those in our subject pool) to make sure it is clear and meets our time specifications.
Once we have a suitable survey ready, we print it and get ready to administer it. There are two different ways to do this: proctored administration in the schools, or self-administration of a mailed survey. In the past we have most often administered surveys in the schools, for several reasons. The first is that it promotes a higher response rate, since the students are already in school and often willing to take some time off class. Being in the school also gives us the opportunity to answer any questions the students might have about the survey itself or the study in general, and to see the atmosphere in which they attend school. When we administer surveys during the summer, to schools out of our area, or to participants who are out of school, we mass-mail surveys for self-administration at home. We have also used this method for the parent surveys we've done. Though a mass mailing requires less time, we tried to get into the schools as much as we could to collect more surveys and head off misunderstandings about the survey.
Once the surveys have been returned to our office, we check them into our database and, as we have in recent years, issue a check to the participant. While the surveys are coming into the office, we begin coding development. Coding is the process of turning the respondents' answers into numbers so that they can be used in statistics. For the most part, at this point we have codes developed for all the questions on the survey, although a few have been added, updated, or reworked, which requires a new coding scheme to be developed. This often consists of several people sitting down with some of the new surveys, collecting the answers, separating them into like types, and assigning them numbers to be coded later.
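To picture what a coding scheme is, think of it as a lookup table from answer categories to numbers, plus a reserved code for answers that fit no category. The categories and numeric codes below are invented for illustration; they are not MSALT's actual codes.

```python
# Hypothetical coding scheme for an open-ended question about
# post-high-school plans. Like answers are grouped into types,
# and each type is assigned a number.
CODING_SCHEME = {
    "attend college": 1,
    "enter the workplace": 2,
    "join the military": 3,
    "undecided": 4,
}
UNCODABLE = 99  # reserved code for answers outside the scheme

def code_answer(answer: str) -> int:
    """Turn a respondent's written answer into its numeric code."""
    return CODING_SCHEME.get(answer.strip().lower(), UNCODABLE)

print(code_answer("Attend college"))  # matches a category -> 1
print(code_answer("move abroad"))     # no category -> 99
```

An answer like "move abroad" would be flagged for discussion: if it comes up often enough, it earns a category and a code of its own, which is exactly how the scheme grows over time.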
When the majority of the surveys are in, we can begin actually coding them. This consists of individual people sitting down with stacks of surveys and deciding what number to assign each answer, using the previously developed coding scheme. Coding can be both tedious and interesting, since participants are unique and often creative. Usually, each person involved in coding is assigned a certain set of like questions and becomes one of our "experts" in coding that type of question. This helps our coding reliability and helps ensure that the same question with the same answer will be coded the same way every time.
After the coding of a section is finished, we check-code it. This consists of one of our experts in the section sitting down with another expert's books and, essentially, re-coding them. There are several reasons to do this - primarily to make sure that nothing was missed the first time around, but also to ensure that the answers were coded correctly. Often, in the process of coding, we come across a new answer that comes up often enough to deserve its own code, or some sort of error in the coding system. Instead of having each person go back over their own drawer numerous times to check for each and every mistake, we wait until everyone is finished and pay special attention to those specific errors.
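At its core, check-coding amounts to comparing two coders' numbers item by item and revisiting every disagreement. A minimal sketch of that comparison, using made-up codes rather than real MSALT data, might look like this:

```python
def check_code(original: list[int], recheck: list[int]) -> list[int]:
    """Return the positions where the first coder and the
    check-coder disagree, so those answers can be revisited."""
    return [i for i, (a, b) in enumerate(zip(original, recheck)) if a != b]

# Hypothetical codes from a first coder and a check-coder
first_pass  = [1, 2, 2, 4, 1, 3]
second_pass = [1, 2, 3, 4, 1, 1]

disagreements = check_code(first_pass, second_pass)
agreement = 1 - len(disagreements) / len(first_pass)
print(disagreements)       # the items that need a second look
print(f"{agreement:.0%}")  # simple percent-agreement reliability
```

The percent-agreement figure is the simplest way to summarize coding reliability; the disagreement list is what actually drives the re-coding conversation between the two experts.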
Once the surveys have been coded and check-coded, they are sent to an outside keypunching company to create the data file from the numbers we've written in the margins. To do this, the company requires us to tell them how to format the data file, and they return the completed, formatted file to us within approximately one month. When the data file gets back, we "clean" it, meaning that we check for errors made in coding or keypunching. Once the data file is clean, graduate students and faculty use it to research specific questions and to write papers for conferences and publications.
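One common cleaning pass can be pictured as scanning every value in the data file against the set of codes its question allows; anything outside that set points to a coding or keypunching error. The question names and valid ranges here are invented for illustration, not taken from the actual MSALT codebook.

```python
# Hypothetical valid code sets per survey question
VALID_CODES = {
    "q1_plans": {1, 2, 3, 4, 99},   # 99 = uncodable answer
    "q2_hours": set(range(0, 41)),  # hours worked per week, 0-40
}

def find_errors(records: list[dict]) -> list[tuple[int, str, int]]:
    """List (record index, question, bad value) for every value
    that falls outside its question's valid codes."""
    errors = []
    for i, record in enumerate(records):
        for question, value in record.items():
            if value not in VALID_CODES[question]:
                errors.append((i, question, value))
    return errors

data = [
    {"q1_plans": 1, "q2_hours": 20},
    {"q1_plans": 7, "q2_hours": 55},  # both values out of range
]
print(find_errors(data))
```

Each flagged value is traced back to the paper survey to decide whether the coder or the keypuncher made the mistake, and the file is corrected before anyone runs statistics on it.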
record data
Record data consists of semester grades, standardized test scores, and special education or gifted classifications as recorded by the school. We collect this data by going into the schools and copying it by hand onto separate sheets.
interviews
We have used two forms of interviewing to get a clearer picture of our participants. Phone interviews have been conducted to explore in greater depth some of the answers on the questionnaire. The face-to-face interviews were conducted to explore how participants managed the life transition from high school into the workplace or college.
For the most part, interviews are handled by graduate students and staff. First, they select a pool of participants to interview. Next, they train interviewers in a specific protocol for the interview and contact the participants to set up an interview time at a location of the participant's choice. Interviews are taped and transcribed, and each interviewer fills out an "Interviewer Observation Sheet" to capture the atmosphere in which the interview was conducted (for the most part, the family home).
Occasionally the interviews are transcribed by temporary workers in the office, but more recently we have been sending tapes out to a transcribing agency to free our temps' time for other tasks. After the transcripts come back, they go through a coding process broadly similar to that used for surveys, so that the data from them can be used in statistics as well.