
=**SCCS 3.0** Feature Development Overview=


__**DATA FLOW CHARTS**__
 * **Attachment:** Draft Data Flow Charts
 * **Notes:** 6/11/09 second draft of Mark Jablonowski's Data Flow charts outlining the Registration screens and EED data fields.

__**DASHBOARD FLOW CHART**__
 * **Attachment:** Draft Dashboard Flow Chart
 * **Notes:** This draft dashboard flow chart shows the relationships of some of the various survey/database elements currently envisioned. Feel free to revise as needed.

__**ONLINE SURVEY ACTIVATION FORM**__
 * **Notes:** After signing an MOA with AASB, the district-designated survey coordinator will fill out an online survey activation form.
 * The ability to manually change any information entered on the form by the district is required.
 * All schools and grade levels that are taking the survey within a district need to be selectable (check boxes?).
 * Form could have a selectable "Paper" option to denote which respondents at any school or grade level are taking the paper survey rather than the online version. This is subject to discussion prior to implementation.

A status bar graphic across the top of the Activation Form will display how many steps there are to complete the form and where the district survey coordinator is in the process (similar to an Amazon.com transaction). Draft text/categories for each of the six survey activation form screens are below. An asterisk * denotes mandatory fields for each of the screen descriptions.
 * **Survey Progress Status Bar**
 * **Screen #1***
 * Extended information for contact person(s)
 * Name, Title, Email, Phone
 * **Screen #2***
 * District
 * School name(s) and city location(s)
 * **Screen #3***
 * District-reported EED enrollment numbers appear by grade level. A prompt gives the person filling out the form the option to update each enrollment number in case it has changed since the school reported to EED. If a number is not updated, the EED enrollment number will be used for comparison. If it is updated, the new number will be compared to the number of participants completing the survey to calculate the survey participation percentage at each school.
 * Staff numbers
 * Selection of open/close dates designating the two-week period the survey will be open for district staff and students to take it.
 * **Screen #4** (partially mandatory)
 * *Designate additional contacts for receiving information about the survey (list of schools entered on Screen #2).
 * *For each school, enter the principal's name, email, and phone number. Please add any additional contact persons at the district or at schools who should receive survey information, updates, and reminders.
 * **Screen #5***
 * Summary page that allows input and return to previous screens to make corrections without losing previously entered data.
 * **Screen #6**
 * Thank you message with a reminder of the survey window dates
 * Short explanation of the six emails the district/school survey coordinators will receive before and during the survey open period.
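The Screen #3 enrollment logic can be sketched as a small calculation. This is an illustrative sketch, not the SCCS implementation; the function name and parameters are assumptions:

```python
def participation_percentage(respondents, eed_enrollment, updated_enrollment=None):
    """Percent of enrolled students who completed the survey.

    Uses the district-updated enrollment number when one was entered on
    Screen #3; otherwise falls back to the state EED figure.
    """
    denominator = updated_enrollment if updated_enrollment is not None else eed_enrollment
    if denominator <= 0:
        raise ValueError("enrollment must be positive")
    return 100.0 * respondents / denominator

# Example: 170 respondents; EED reported 200, but the district updated to 190
print(round(participation_percentage(170, 200, 190), 1))  # 89.5
```

If the district leaves the enrollment figure untouched, the EED number simply becomes the denominator, matching the fallback rule described above.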

__**DASHBOARD**__
 * **Attachment:** Tracking spreadsheet
 * **Notes:** This spreadsheet was used to track district/school participation for both paper and online versions of the survey during the open period that recently ended. The column headings can serve as a foundation for the information that should appear in the SCCS 3.0 dashboard view. Other information not on the spreadsheet will also need to be included in the dashboard, including:
 * Default summary view of all districts
 * View of each district and all schools within that district, with the ability to zoom into grade level detail of each school
 * Displays information collected from the online survey activation form and other sources
 * Some information could be color-coded similarly to the spreadsheet
 * Dashboard view sortable by:
 * Selectable time parameters
 * All open/closed surveys
 * Percentage participation
 * Numeric and percentage comparison of each district and school's SCCS survey results with state EED data. Color coding for numbers and percentages: 0-84%=Red, 85-100%=Green
 * Ability to manually produce and output the summary tables that are embedded in automatic emails
 * **Question:** Is it possible to output a spreadsheet similar to the Tracking spreadsheet attachment from the SCCS 3.0 database?
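The red/green banding above can be captured in one small helper. A hypothetical sketch; the function name and color labels are assumptions, and only the 0-84%/85-100% thresholds come from the notes:

```python
def participation_color(pct):
    """Dashboard color band for a participation percentage.

    Draft rule from the dashboard notes: 0-84% -> "red", 85-100% -> "green".
    """
    if not 0 <= pct <= 100:
        raise ValueError("percentage out of range")
    return "green" if pct >= 85 else "red"

print(participation_color(84))  # red
print(participation_color(92))  # green
```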

__**ADMINISTRATION SCREEN**__
 * **Standardized Naming:** Districts, schools, and city names in SCCS 3.0 must be consistent with the state EED list. This list is updated annually and assigns a unique ID number to each school, which should also be included in the SCCS database.


 * **User Access Levels:** Assign password-protected access levels to each individual user accessing the Administration, Report, or Dashboard features.


 * **New Survey Generation:** When a new survey is generated or copied, all features and relationships between the online activation form data, state EED data, past survey data, etc. need to follow automatically, including:
 * Questions automatically follow and question text is editable
 * Auto email functions automatically follow and email text is editable
 * Auto email district/school contacts can be manually added/deleted
 * Auto email scheduling can be manually overridden and changed
 * Survey open/close dates can be manually overridden and changed. **Question:** Does changing the survey close date reset the auto email schedule?
 * Survey Status summary tables (the ones embedded in auto emails) can be manually generated and output
 * Parameters set when completing the online survey activation form, denoting which schools/grade levels are taking the survey, need to be manually overridable. Schools and grade levels need to be able to be added/deleted manually. The updated auto email schedule must reflect changes to these parameters.


 * **Discussion Point 1:** Pros and cons of __all__ online surveys including Question 0: "When did you receive your laptop?" The alternative currently in use is to direct all CDL participants to a separate survey. Inevitably, some respondents end up taking the wrong survey, which skews results. If Question 0 were on all surveys, some respondents would answer it incorrectly. Which is the greater risk: respondents taking the wrong survey, or answering Question 0 incorrectly?


 * **Discussion Point 2:** The ability is needed to designate CDL and QS2 participants taking the survey for later analysis and reporting. What is the best approach: adding a question, a checkbox, or something else?

__**AUTOMATED EMAIL SCHEDULE**__
 * **Attachments:** Sample text for six automated emails. We've put the most work into the Activation email text, so start there. The other five still need work.
 * **Notes:** Auto email settings should include the following features, with manual override capability:
 * Editable text
 * Recipient list
 * Start/end times/dates
 * Storage of all sent emails for reference
 * **Email Schedule:** The sample Survey Activation letter contains an embedded Survey Status table that draws from state data, survey results, and information supplied by districts in the online form. The information in this table should be standardized (**Example:** see the AutomEmail1 Activation file posted above). The survey open period for each district will be two weeks (10 school days, Monday-Friday, weekends not included). Following is the schedule for the automated emails, with DRAFT text for each of the six (text subject to revision):
 * Email #1. Survey Activation (sent 3 weeks before survey opens)
 * Email #2. Survey Instructions (sent 2 weeks before survey opens)
 * Email #3. Survey Opening (sent 3 days before survey opens)
 * Email #4. 1st Update (sent day 5 of survey)
 * Email #5. 2nd Update (sent day 8 of survey)
 * Email #6. Survey Closed and Final Update (sent day 10 of survey)
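The schedule above can be sanity-checked with a sketch of how the six send dates follow from a district's open date. This is illustrative only, assuming the survey opens on a weekday and "day N" counts school days with the open date as day 1; the function names are assumptions:

```python
from datetime import date, timedelta

def nth_school_day(open_date, n):
    """Return the nth school day (Mon-Fri) of the survey window,
    counting the open date itself as day 1."""
    d = open_date
    count = 1
    while count < n:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 ... Friday=4; skip weekends
            count += 1
    return d

def email_schedule(open_date):
    """Send dates for the six automated emails, per the draft schedule."""
    return {
        "1. Survey Activation": open_date - timedelta(weeks=3),
        "2. Survey Instructions": open_date - timedelta(weeks=2),
        "3. Survey Opening": open_date - timedelta(days=3),
        "4. 1st Update": nth_school_day(open_date, 5),
        "5. 2nd Update": nth_school_day(open_date, 8),
        "6. Survey Closed and Final Update": nth_school_day(open_date, 10),
    }
```

For a survey opening Monday 2009-09-14, day 5 lands on Friday 9/18, day 8 on Wednesday 9/23, and day 10 (closing) on Friday 9/25, with weekends skipped.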

__**AUTOMATED REPORT**__
 * **Standardized Report Size:** When output, reports should be formatted to fit a standard 8 1/2" x 11" sheet of paper or PDF page.

__**JUNE 1 TELECONFERENCE**__
 * **Participants:** Sally Rue, Steve Nelson, Mark Jablonowski
 * **Mark's discussion list:**
 * Hash-based survey entry
 * Instead of having one code for each survey and having the students select their district/school, generate a code for each possible combination. The code for each survey is displayed on the district's admin page (as well as the overall admin page) and will greatly reduce the possibility of someone taking the survey for the wrong school. Each code will be unique, and there will be little chance of hitting a valid survey by entering a random code.
 * Survey closure based on EED data
 * Automatic closure of a survey when the appropriate number of people have taken the survey from that school. (Decided NOT to do this. Survey will automatically close on end date set by district, but can be manually overridden by administrator.)
 * Single dashboard experience
 * Consolidate all dashboard experiences into one place. Features will change based on the role of the user signed in. You will not have to view different pages for reporting and survey control options.
 * Increased reliance on district dashboard
 * All district level survey data and options will be available here.
 * The survey progress status bar will be located here
 * When districts sign the MOA, they get a username and password for their dashboard where they enter all follow up information.
 * Would we ever want a school level dashboard?
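The hash-based entry idea from the discussion list could look something like the sketch below: derive a short code from a server-side secret plus the survey/district/school combination, so codes are unguessable but reproducible on the admin pages. All names and the salt are placeholders for illustration, not part of SCCS:

```python
import hashlib

def survey_code(survey_id, district, school, length=8):
    """Derive a short entry code unique to one district/school combination.

    The server-side salt keeps codes unpredictable; with 8 hex characters
    there are 16**8 (about 4.3 billion) possibilities, so a randomly typed
    code is very unlikely to hit a live survey.
    """
    secret = "server-side-salt"  # placeholder; a real deployment would keep this private
    digest = hashlib.sha256(f"{secret}|{survey_id}|{district}|{school}".encode()).hexdigest()
    return digest[:length].upper()

code = survey_code("SCCS3-2009", "Anchorage", "Central Middle School")
print(len(code))  # 8
```

Because the code is derived deterministically, the district admin page can re-display it at any time without storing a separate code table.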

__**PROJECT SCOPE**__
 * **Attachment:** 090427ProjectScope.doc
 * **Notes:** The original Project Scope and Timeline document.