SCCS 2.0

Notes from 10/3/08 meeting (Sally, Larry and Steve)
A guiding principle is achieving consistency between the CDL and AIR SCCS surveys and reports. Current Sketchy criteria and proposed additions to the Sketchy criteria are listed below.

Anchorage School District Data Cleaning Protocols (link to downloadable Word file): [|2008_Anchorage_School_District_SCCS_Data_Cleaning_July_17.doc] This file outlines the parameters AIR uses to clean Anchorage survey data and may help to further define the "lazy response" Sketchy criteria.
 * **Sketchy Results Parameters**
 * Respondent fails to answer more than a third of the questions.
 * Respondent takes the survey after school hours (5:00 pm cutoff time).
 * "Lazy" responses: the respondent selects the same answer for a number of questions. Feasibility of adding the following criterion to trigger a survey being flagged as Sketchy: 10 or more identical answers in a row in a given section, excluding the Substance Abuse and Delinquent Behavior sections.
 * Faking good or bad: should/could this parameter be added?
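Taken together, the criteria above can be sketched as a single flagging function. This is a minimal sketch, not the production logic; the data shapes, section names, and the use of the 5:00 pm cutoff are assumptions drawn from these notes.

```python
# Sketch of the "sketchy" flagging criteria described above (assumed shapes):
# sections maps a section name to a list of answers, with None for unanswered.
from datetime import time

# Per the notes, the lazy-response check skips these sections (assumed names).
EXCLUDED_SECTIONS = {"Substance Abuse", "Delinquent Behavior"}

def max_run(answers):
    """Length of the longest run of identical consecutive answers."""
    best = run = 0
    prev = object()  # sentinel that never equals a real answer
    for a in answers:
        run = run + 1 if a == prev else 1
        best = max(best, run)
        prev = a
    return best

def is_sketchy(sections, submitted_at, total_questions):
    """Apply the three criteria: unanswered > 1/3, after-hours, lazy runs."""
    answered = sum(1 for ans in sections.values() for a in ans if a is not None)
    if total_questions - answered > total_questions / 3:
        return True                        # failed to answer more than 1/3
    if submitted_at.time() >= time(17, 0):
        return True                        # taken after the 5:00 pm cutoff
    for name, answers in sections.items():
        if name in EXCLUDED_SECTIONS:
            continue
        if max_run([a for a in answers if a is not None]) >= 10:
            return True                    # 10+ identical answers in a row
    return False
```

The "faking good or bad" criterion is left out, since the notes leave it as an open question.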
 * **Survey Options Page Mockup**
 * TITLE: Should this naming convention follow the URL naming convention? After selecting from the district and school menus, respondents currently see that info on the survey they take.
 * URL:
 * DOMAIN: [|http://survey.aasb.org] (What wizardry needs to happen with the aasb.org domain to accomplish this?)
 * SUBDOMAIN: possible naming conventions:
 * [] (cdl school)
 * [] (non-cdl school)
 * [] (non-cdl school)
 * Do subdomains interact with the drop-down district and school menus (darkened active menus and grayed-out inactive menus)?
 * How would districts/schools in other states outside of Alaska be added? Would they appear in the district/school drop down menus?
 * INSTRUCTIONS (Sally needs to clarify with AIR): Possible change to "Your answers are confidential. No one at school will know." Ensure that any changes do not conflict with the No Parental Consent structure of the student survey.
 * DEMOGRAPHICS: Question #0 (When did you receive your laptop?) will only appear on the CDL version of the survey. How will not including question #0 on non-CDL surveys change questions and reports?
 * QUESTION ADDITION: On the question "Which group describes you best?" add "Multi-Racial" after "Other."
 * GENERAL CONCERNS:
 * How flexible is the framework for generating additional surveys? Will it allow the addition of totally different surveys using the framework of this database? As an example, Survey Monkey allows the generation of a totally new survey by typing in new questions. How would this affect the reporting function?
 * Will the database framework be flexible enough to change questions over time?
 * Technically feasible versus practically feasible. We want to have the correct expectations about the abilities of the survey. Is what you are building any different from our original discussions?
 * **Distribution Page Mockup**
 * NEW BUTTON: It might be helpful to have "Select All" and "Deselect All" buttons so all districts and schools listed could be selected or deselected. Is this feasible/necessary to add? (Devin will add)
 * CODE: There is currently a six-digit access code. The Distribution screen mockup shows an eight-digit code. Is the reason better security? (Devin will implement alphanumeric versus the current numeric-only code)
 * ADDING NEW SCHOOLS:
 * Devin and/or apprentice would be authorized to add new schools to database, not AASB staff. Once a school is added it is never deleted. Any survey results gathered from the school would remain in the searchable database, but if a school is unchecked in the Access Control Screen it appears "grayed out" (not selectable) in the School drop down menu.
 * How would combining districts or moving a school from one district to another be handled?
 * An existing district splits into two new districts?
 * Schools from one district moved to another?
 * Could a new district be added to the database? What would the effect be on the reporting feature?
 * CONCERNS:
 * Interaction between grayed-out text (inactive) or darkened text (active) in the district/school drop-down menus and the Administration screen check boxes? When a district or school is checked, it should appear in the district or school menus as active (darkened text).
 * How does the Access Control page interact with the Report function? Would de-selecting districts or schools that took a given survey negatively impact the report generation function later?
 * **Report Page Mockup**
 * ONSCREEN REPORT COMPARISON: This would be a very useful feature, eliminating the need to print out and compare multiple reports. We need more info about this feature:
 * How would it look? Are you envisioning two surveys side by side?
 * How would it work over multiple years?
 * How/where are multiple reports generated for comparison?
 * Would comparison screens be printable?
 * **Filtering Screen Description (no mockup)**
 * We don't quite understand this one. More info is needed (a mockup would help) describing how this feature would work.

**Notes from 9/4/08 meeting (Sally, Larry and Steve)**
Please review and make any necessary revisions. This outline can serve as the foundation for our Monday 9/8/08 teleconference at 10:00 am AST with Devin. It would be helpful for Monday's discussion if Devin could provide a cost breakdown for the various levels of hosting features. - Steve
 * **Survey Features Overview**
 * SCCS online surveys/databases for both CDL and AASB/ICE will be constructed to exist as separate, nearly identical instruments (exception: the "When did you receive your laptop?" CDL question, etc.). This approach provides maximum flexibility for independent deployment and administration of both surveys, while preserving the integrity of each survey's data set.
 * The survey framework should be developed so that it can be easily duplicated and used (with minor modifications) as a foundation for future surveys that may be developed (primary school SCCS, national SCCS, etc.). The framework should include online student and staff surveys, response database collection and storage, a simple report feature, and expanded administration functions (see admin function specs below).
 * The simple report feature will remain mostly as is. It is popular with districts and provides a quick, easy-to-understand snapshot of survey results. For deeper comparative analysis, raw data from both surveys can be output as .csv files, imported into a spreadsheet application (Excel, FileMaker, etc.), combined, graphed, etc., either in-house at AASB or by an AIR or contract analyst.
 * Add this feature to the top of the report: the range of dates respondents took the survey, depending on parameters set prior to report generation (district, school, etc.).
 * Districts will not be allowed direct online access to reports of survey data. Reports can only be generated by and obtained through AASB.
 * **Login Security Features**
 * Retain the current six digit numeric code that is randomly generated each time the survey is closed/reopened. This method is less secure than a per-school password system, but takes much less time to administer.
 * Three login steps: enter code, choose district from dropdown menu, choose school from dropdown menu.
 * Retool the District/School dropdown menus to interact with the Administration Screen "active/inactive" District/School check boxes (see Administration Screen specs below). Schools or districts not scheduled to take the survey can be checked "inactive." If a survey participant from a school flagged as "inactive" reaches the school dropdown menu during login, they should experience three things:
 * The name of the "inactive" school will appear "grayed out" in the school dropdown menu, while "active" schools appear in darker lettering.
 * Selecting the name of the "grayed out" inactive school will not allow the participant to advance to the survey.
 * Text should be posted just above the school dropdown menu saying something like "Schools not scheduled to participate in SCCS at this time will appear in gray on the dropdown menu below. If your school appears in gray you will be unable to advance to the survey. If you have questions please contact AASB." (button to generate new email message, AASB phone number)
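The code-regeneration step mentioned above (and the eight-character alphanumeric code proposed on the Distribution page) could be sketched as follows. This is an illustrative sketch, not Devin's implementation; the use of Python's `secrets` module and the exclusion of look-alike characters (0/O, 1/I) are assumptions.

```python
# Sketch: generate a random access code each time the survey is closed/reopened.
import secrets
import string

# Alphanumeric alphabet with easily-confused characters removed (assumption).
ALPHABET = "".join(c for c in string.ascii_uppercase + string.digits
                   if c not in "O0I1")

def new_access_code(length=8):
    """Return a cryptographically random access code of the given length."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))
```

The same function with `length=6` would reproduce the current six-character behavior, so the code length could stay configurable while the alphabet change is rolled out.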
 * **Administration Screen**
 * Develop expanded functionality for the Administration Screen that includes the following features:
 * Retain global survey open/closed button that generates random six digit access codes.
 * Add District/School "spreadsheet" with checkboxes to make schools and/or districts active/inactive. When a district box is checked, all schools within that district should automatically become checked, but could independently be unchecked.
 * Active/inactive District/School check boxes should be interactive with the District/School dropdown menus on the login screen. This feature assists with billing administration and works in tandem with the numeric login code to provide an additional layer of access control.
 * Include the ability to add a new school to any district. Once added, a school will never be deleted, but it can be made "inactive" using the check box; this preserves the survey data captured during the years the school was in operation (example: logging camp schools that come and go).
 * It would be optimal if it were possible to make a site inactive "on the fly" while the survey is in progress, without having any negative impact on the survey functions or performance.
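The district-to-school checkbox cascade described above (checking a district activates all of its schools, which can then be unchecked individually) could be modeled like this. A minimal sketch only; the class shape and school names are hypothetical.

```python
# Sketch of the active/inactive cascade for the Administration Screen.
class District:
    def __init__(self, name, schools):
        self.name = name
        self.active = False
        self.schools = {s: False for s in schools}  # school name -> active?

    def set_active(self, active):
        """District checkbox: cascades the setting to every school."""
        self.active = active
        for s in self.schools:
            self.schools[s] = active

    def set_school_active(self, school, active):
        """Individual school checkbox: overrides the district cascade."""
        self.schools[school] = active

# Example: activate a district, then uncheck one school independently.
d = District("Anchorage", ["East High", "West High"])
d.set_active(True)                       # both schools become active
d.set_school_active("West High", False)  # one school unchecked on its own
```

The login dropdown would then gray out any school whose flag is `False`, which is also how "inactive on the fly" could work mid-survey: the flag is consulted at login time, so flipping it would not touch responses already in progress.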
 * **Changes to Question: "Which Group Describes You Best?"**
 * Add "Multi-Racial" choice?
 * What impact, if any, would this have on past survey results?
 * Would results appear in report as "No Data" like they do now for questions that did not exist during 2006 survey?
 * **Database Response Cleansing**
 * Need ability to clean out responses from the database by hand. This would be done by Devin and/or his apprentice, not AASB.
 * Reasons: Every year, AIR has had to clean out a few responses by hand - things like early AASB tests of the system and a few erroneous entries from districts that we don't want to show up in the final data set.
 * **"Sketchy Results" Report Parameters**
 * Coordinate "sketchy results" parameters between current survey report and AIR. Clarify how incomplete responses are handled for entire survey and for individual sections.
 * Currently the CDL survey is set up so that if a respondent fails to answer more than 1/3 of the questions, the entire survey is given "sketchy" status. A survey is also flagged as sketchy if it is completed after school hours. (Can't remember the exact time this parameter is set at. Devin?)
 * Further discussion on "sketchy" topic between Larry West and Elizabeth Spier of AIR:

This time another question having to do with the "multi" category. We've been discussing how to handle the merger of the CDL and non-CDL approaches to racial identification, and what term to use.

The CDL approach was to only allow participants to mark a single answer, with no multiracial option (just "other"). The non-CDL approach has been to allow respondents to pick more than one answer (at least on the print surveys), but at the reporting level anyone who did so has been listed as "multiracial." Now we're leaning towards merging the two approaches by requiring a single answer, but adding "multiracial" as an option they can choose.

One of the things we've wrestled with is whether some multiracial people who identify strongly with one or both primary races might not identify their race at all if forced to make a single choice and uncomfortable with the generic "multiracial." Can you think of any other problems in taking this approach? Are there other options that we should consider? We want to take the most "racially sensitive" route that doesn't cause redundancy problems at the reporting level.

It occurs to me (though we haven't discussed this) that perhaps the only way to ensure that virtually everyone gets to pick their preference (without causing a redundancy problem) would be to offer all of the various permutations as options: Caucasian/Native, Caucasian/African American, African American/Native, Asian-Pacific Islander/African American, etc. This of course would result in such a large number of choices as to likely be a problem in and of itself - and we might still want to offer multiracial as an option!

As an aside (a very minor thing, as it only occurs once in the item reports), we just noticed that you used the terms "multi" and "multiracial" in the district reports and "mixed" in the item reports. No need to change anything for this year, but we should pick one term (preferably multiracial) for all sections in the future.

We'll look forward to your thoughts on this - thanks!

Larry

___________________________

Hi, Larry,

I do think the "pick all that apply" with the options that we have now works best. You'll notice on the item-by-item reports this year that we had to use the ultimate single designation for everyone (thus the addition of the multi/mixed category), since the CDL did not allow them to pick more than one (in previous years with non-CDL data, we reported straight frequencies, without percents because some people did check more than one). In my experience, if people are sensitive about it, they just skip the question (funny thing is the staff make more jokes or hostile remarks in the "other" category than the students do!). I wouldn't try to list all of the combinations because you just can't cover them all (my friend's kid is 1/4 each of German, Irish, Puerto Rican, and Cambodian!).

So if the CDL folks can fix it to how the other SCCS has worked (check all that apply, with an "other" category where they can fill in the blank), that would be great. Ah, and I would change the Latino category to read Latino/Hispanic or something like that - right now it starts off with "Mexican and we end up with a lot of Puerto Rican students filling that in for "other" (and then we have to re-categorize them in data cleaning), but they would probably correctly check "Latino/Hispanic" if it was more obvious.

Elizabeth Spier
Senior Research Analyst
American Institutes for Research
1070 Arastradero Road, Suite 200
Palo Alto, CA 94304
Telephone 650-843-8226 Fax 650-858-0958
 * **Annual updates to survey questions/sections**
 * There is an interface for this, but it's not very good.
 * **LW:** am I correct in thinking that it's more likely that questions (and maybe even sections) would be added or deleted than that existing questions would be altered? **SR:** I don't think so. We are trying to keep consistent over time and resist casual changes, but at some point we might have a compelling reason to modify it in some way (add or delete questions, or modify questions).
 * **Things To Resolve**
 * In raw-data .csv file exports, there are both blank cells AND some cells with zeros in them denoting a blank cell. When the .csv data is then imported into a spreadsheet, this discrepancy results in a six-value (0-5) response coding instead of the five-value (1-5) coding it is supposed to be. This requires a bunch of manual cleanup by AIR that costs them time and AASB money. Zeros need to be deleted so all "blank" cells are really blank.
 * In Larry's July 17 notes there is a reference to an issue with reverse-worded questions. Possible coding correction?
 * Names of domain/subdomains?
 * Cost breakdown for levels of hosting features?
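The zero-versus-blank .csv issue above could be handled by a small cleanup pass on the export. This is a hedged sketch, not the actual export code: it assumes (per the notes) that a cell containing exactly "0" always denotes a blank answer, and it simply rewrites those cells as truly empty.

```python
# Sketch: rewrite a raw-data CSV so every "0" cell becomes a blank cell,
# restoring the intended 1-5 response coding. Column names are illustrative.
import csv
import io

def clean_zero_blanks(csv_text):
    """Return CSV text with every cell that is exactly '0' made empty."""
    reader = csv.reader(io.StringIO(csv_text))
    out = io.StringIO()
    writer = csv.writer(out, lineterminator="\n")
    for row in reader:
        writer.writerow(["" if cell == "0" else cell for cell in row])
    return out.getvalue()

# Hypothetical export fragment: the zeros in the answer columns become blanks.
raw = "q1,q2,q3\n3,0,5\n0,2,\n"
print(clean_zero_blanks(raw))
```

If 0 ever becomes a legitimate answer value, the cleanup would need to be restricted to the 1-5 response columns, so this assumption should be confirmed with Devin and AIR first.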
 * **Questions**


 * When the survey is edited, how to handle presentation of earlier results? (Currently: removed questions are not shown on the report, and reports for old years show "no data" for new questions) **LW:** current method seems adequate to me. **SN:** Me too.
 * **SR:** is it now possible to go back and ask for a 2006 report for a district and get that year's results that reflect the survey as it was AT THAT TIME? **SN:** Yes, sort by year and generate a new report.
 * I'm sure there are more questions.