DRAFT 200-45 Grading Rubric

Rubric Outline for SSWG 200-45 Reviews

(These are also areas where SSWG should provide standards or best practices, or link to existing resources.)

For SSWG reviews of 200-45 submissions, the review workgroups should assess the submissions on each of the following categories. As additional standards and best practices are recommended by SSWG, this rubric should be fleshed out into a checklist that can be made available to 200-45 submitters in advance of submissions. Note that the order of items is NOT an indication of relative importance.

  1. Platform: the stack of hardware, middleware, and software required to provide the application
    1. see preliminary recommendations
    2. Is the platform suitable to the scope of the application (e.g., campus-wide, inter-college, intra-college, interdepartmental, intradepartmental)?
    3. Does the platform account for future extension of the application (if the application is a likely candidate for wider adoption, will the platform scale)?
    4. Does the platform leverage campus middleware services vs developing internal functionality ("reinventing the wheel")?
  2. Development Methodology
    1. Does the methodology meet the needs of the application's target audience?
    2. Does the application roadmap allow for small, iterative releases (vs occasional massive changes)?
    3. Are end-users at all levels integrated into the development process (vs just sponsors or managers who don't actually use the application)?
      1. Is there a clear mechanism for application users to provide feedback AND track status of feedback (bug tracking or feature request system)?
  3. Coding Practices
    1. in-code documentation (commenting)
    2. appropriate design patterns
      1. MVC or other separation of concerns?
    3. code review process in place?
    4. vulnerability scanning (automated code review)
    5. version control for source code?
      1. source code shared appropriately?
    6. automated build process?
    7. unit testing?
    8. using common or standard frameworks where appropriate?
    9. modularity of design so that individual elements can be changed out easily (e.g., database transportability)?
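To make the unit-testing criterion above concrete, the sketch below shows a self-contained `unittest` test for a hypothetical business rule (the function name, fee rate, and field values are invented for illustration, not part of any 200-45 submission):

```python
import unittest

def apply_fee(amount, rate=0.025):
    """Hypothetical business rule: add a processing fee to a charge."""
    if amount < 0:
        raise ValueError("amount must be non-negative")
    return round(amount * (1 + rate), 2)

class ApplyFeeTest(unittest.TestCase):
    def test_typical_charge(self):
        # 100.00 with a 2.5% fee comes to 102.50
        self.assertEqual(apply_fee(100.00), 102.50)

    def test_rejects_negative_amounts(self):
        with self.assertRaises(ValueError):
            apply_fee(-1)

# Run with: python -m unittest <module_name>
```

Tests like these also pair naturally with the automated-build criterion: a build that runs the test suite on every commit catches regressions before release.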
  4. Data Management and Access
    1. Is the data model sufficiently separated from the application to easily enable direct data sharing (is there a database or other data store that can be made available to others on campus directly or via API vs only through the application's interface)?
    2. Has the data model been rigorously evaluated and broadly vetted to determine the value of the data to other campus units and to the campus as a whole?
    3. Are the data model and data store sufficiently granular to protect only security-sensitive data while easily enabling access to the rest of the data (vs denying or tightly restricting access wholesale because some fraction of the data has privacy and security implications)?
    4. Has the data been evaluated to determine if there would be value to extracting it into the campus data warehouse (and integrated/correlated with other warehouse data)?
    5. Have the sponsors established a clear policy and procedure for obtaining access to data?
      1. Is the policy/process integrated with the campus data management policy?
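The granularity criterion above can be sketched as field-level filtering: rather than restricting an entire record because some fields are sensitive, the data layer masks only those fields. A minimal sketch, where all field names and roles are hypothetical:

```python
# Hypothetical field names and roles, invented for illustration only.
PUBLIC_FIELDS = {"course_id", "title", "units"}
SENSITIVE_FIELDS = {"instructor_ssn", "student_roster"}  # privacy/security implications

def visible_view(record, requester_roles):
    """Return only the fields the requester is entitled to see, rather than
    denying the whole record because part of it is security-sensitive."""
    allowed = set(PUBLIC_FIELDS)
    if "registrar" in requester_roles:
        allowed |= SENSITIVE_FIELDS
    return {field: value for field, value in record.items() if field in allowed}

record = {"course_id": "ECS-101", "title": "Intro", "units": 4,
          "instructor_ssn": "xxx-xx-xxxx", "student_roster": []}

# A requester with no special role still gets the non-sensitive fields:
# visible_view(record, set()) -> {"course_id": "ECS-101", "title": "Intro", "units": 4}
```

The same filtering function can sit behind an API endpoint, which also supports the direct-data-sharing criterion: other campus units consume the data store through the API rather than through the application's interface.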
  5. Application Security
    1. Does the application comply with the CyberSafety standards?
      1. How was this assessed (internal checks vs external audits)?
    2. Does the application handle data protected by legal or regulatory frameworks:
      1. PII? 
      2. HIPAA?
      3. FERPA?
      4. Others?
    3. Has security been evaluated end-to-end (both the server-side and the client-side)?
    4. Refer to 200-45 Security Checklist
  6. Usability
    1. Does the application leverage UI of existing applications to reduce training requirements? 
    2. Does the application use middleware services or frameworks to generate UI (e.g., KNS)?
    3. Has the application been formally evaluated for usability using industry standard techniques (card sort, paper mockups, eye tracking, user testing)?
    4. Does the project plan include a usability improvement process?
  7. Accessibility
    1. Is the application Section 508 compliant per the campus web standards policy?
      1. How was compliance assessed?
        1. Automated scanning tools?
        2. Common checklists?
        3. Testing by users who require accessibility accommodations?
    2. Do project funding sources or other drivers require any additional accessibility standards compliance (e.g., federal grants)?
  8. Campus Core Middleware Services Integration
    1. Does the application leverage available middleware services for functionality in the following areas:
      1. Authentication (CAS, Shib, Kerb, DistAuth, KIM, etc.)?
      2. Authorization (LDAP, IAM, KIM, etc.)?
      3. Enterprise Data Resources (DaFIS DW, PPS DW, Campus DW, etc.)?
      4. Workflow (KEW)?
      5. Service Bus (KSB)?
      6. UI Generation (KNS)?
      7. Enterprise Notification (KEN)?
  9. Documentation and Training for Users
    1. Is there end-user documentation for the application?
      1. Online? Context-sensitive? 
      2. Printed book or electronic?
      3. Screencasts or other tutorials?
      4. Is there a process and sufficient resources to update the documentation?
    2. Is training required for end-users?
      1. How will training be made available?
  10. Project Management and Personnel
    1. Who is sponsoring the project? 
    2. How was the development team selected?
    3. Who are the end-users? 
    4. Are the interests of the end-users directly represented by the sponsors?
    5. Have sufficient resources been allocated for various phases of the application lifecycle:
      1. Development?
      2. Initial Data Loading (including departmental data loading)?
      3. Production?
      4. Maintenance and Revision?
      5. Support?
      6. Technical Support ("The application won't load in my browser.")?
      7. Subject Matter Expert Support ("How does business process X tie into process Y?")?
    6. Will developers be recruited (external) for this application?
      1. Are appropriate resources, such as campus technology experts, included in the hiring process?
    7. Is training required for developers?
    8. Is training required for systems administrators?
  11. Business Process Analysis
    1. Has the workload impact of the application been evaluated end-to-end (i.e., not just the workload impact on the sponsoring department, but also on everyone else who uses the application)? Some areas to consider include workload for:
      1. application systems administrators
      2. managers/approvers or other groups involved in workflow
      3. end-users
      4. departmental technical staff providing end-user support (e.g., how complex is the client stack to maintain?)
    2. Is the application creating electronic versions of existing business processes, or have the business processes been evaluated and appropriately reengineered?
  12. Sustainability
    1. Does this application meet UC and UC Davis sustainability guidelines in the following areas:
      1. reduction of printing in favor of electronic records?
      2. application platform hosted in a virtualized environment?
  13. Risk Management
    1. Has the criticality of the application to University operations been assessed?
    2. If the application holds critical data, does the project plan include the following:
      1. backup plan?
      2. disaster recovery plan?