MIV 200-45 Scoring

DRAFT

Rubric Outline for SSWG 200-45 Reviews

(These are also areas where SSWG should provide standards or best practices--or link to existing resources.)

For SSWG reviews of 200-45 submissions, the review workgroups should assess each submission against the following categories. As SSWG recommends additional standards and best practices, this rubric should be fleshed out into a checklist that can be made available to 200-45 submitters in advance of submission. Note that the order of items is NOT an indication of relative importance.

  • Platform: the stack of hardware, middleware, and software required to provide the application
    • see preliminary recommendations
    • Is the platform suitable to the scope of the application (e.g., campus-wide, inter-college, intra-college, interdepartmental, intradepartmental)?
      • Campus-wide, yes.
    • Does the platform account for future extension of the application (if the application is a likely candidate for wider adoption, will the platform scale)?
      • Yes [Section 8.0 Scalability]
    • Does the platform leverage campus middleware services vs developing internal functionality ("reinventing the wheel")?
      • Yes, it leverages campus middleware [Section 4.0 Infrastructure Integration]
  • Development Methodology
    • What is the methodology?
      • Not defined in the 200-45; not agile
    • Does the methodology meet the needs of the application's target audience?
      • The 200-45 does not define metrics for assessing and responding to user satisfaction beyond the "user group," and it does not specify whether that group includes end users or only middle managers. [Section 2.0 Stakeholder Value, see Chart]
    • Does the application roadmap allow for small, iterative releases (vs occasional massive changes)?
      • Not well defined, but this does not appear to be the case [Section 9.0 Timeline/Budget]
    • Are end-users at all levels integrated into the development process (vs just sponsors or managers who don't actually use the application)?
      • Only through the users' group. [Section 2.0 Stakeholder Value, see Chart]
    • Is there a clear mechanism for application users to provide feedback AND track status of feedback (bug tracking/feature request system)?
      • Not in the 200-45, but we believe they use Jira. It is not public, but it could be made public and shared with users.
  • Coding Practices
    • No information provided in the 200-45 on the following items:
    • in-code documentation (commenting)
    • appropriate design patterns
      • MVC or other separation of concerns? (see the sketch at the end of this list)
    • code reviews
    • vulnerability scanning (automated code review)
    • version control
      • source code shared appropriately?
    • using common/standard frameworks where appropriate?
      • Using Kuali and Spring.
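    • For illustration only, a minimal sketch of MVC-style separation of concerns using Spring annotations; the class names and packet example are hypothetical and not taken from the MIV codebase:

          // Illustrative sketch: the controller handles HTTP concerns, the service holds
          // business logic, and the view template handles presentation, so each layer
          // can change without touching the others.
          import org.springframework.beans.factory.annotation.Autowired;
          import org.springframework.stereotype.Controller;
          import org.springframework.stereotype.Service;
          import org.springframework.ui.Model;
          import org.springframework.web.bind.annotation.RequestMapping;
          import org.springframework.web.bind.annotation.RequestParam;

          interface PacketService {
              String findSummary(long packetId);       // business logic behind an interface
          }

          @Service
          class SimplePacketService implements PacketService {
              public String findSummary(long packetId) {
                  return "Packet " + packetId;         // a real implementation would query the data layer
              }
          }

          @Controller
          class PacketController {
              @Autowired
              private PacketService packetService;     // injected; the controller never touches the database

              @RequestMapping("/packet/summary")
              public String summary(@RequestParam("id") long id, Model model) {
                  model.addAttribute("summary", packetService.findSummary(id));
                  return "packetSummary";              // logical view name resolved by a ViewResolver
              }
          }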
  • Data Management and Access
    • Is the data model sufficiently separated from the application to easily enable direct data sharing (is there a database or other data store that can be made available to others on campus directly or via API vs only through the application's interface)?
      • The model is separate, but access is not currently available (see the sketch at the end of this section).
    • Has the data model been rigorously evaluated and broadly vetted to determine the value of the data to other campus units and to the campus as a whole?
      • Unspecified, but unlikely
    • Are the data model and data store sufficiently granular to protect only security-sensitive data while easily enabling access to the rest of the data (vs denying or tightly restricting access wholesale because some fraction of the data has privacy and security implications)?
      • Yes. [Section 5.0 Cyber Safety and Sensitive Data: "The new architecture also allows easier maintenance and implements more granular tools with respect to controlling who has access to sensitive data. Throughout the project, the project sponsors have worked with the Academic Senate to define data that is categorized as confidential, sensitive, or private."]
    • Has the data been evaluated to determine if there would be value to extracting it into the campus data warehouse (and integrated/correlated with other warehouse data)?
      • Unspecified
    • Have the sponsors established a clear policy and procedure for obtaining access to data?
      • Is the policy/process integrated with the campus data management policy (cite)?
        • No and no.
    • Analytics tools (decision support)
      • It's in the roadmap [Section 3.0 Business Impact; Section 7.0 Administrative Integration: "Goals of these integrations would be to standardize the stored data, reduce data entry, and provide decision support capability to departments, deans and central campus administrators."; Section 9.0 Timeline/Budget]
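    • For illustration only, a sketch of the kind of direct, read-only data access discussed above, with sensitive fields withheld at the field level rather than blocking access wholesale; the endpoint, class, and field names are hypothetical and not part of the MIV 200-45:

          // Illustrative sketch: exposes a shared, read-only view of the data outside
          // the application's own interface. Fields categorized as confidential,
          // sensitive, or private (per Section 5.0) are simply never added to the
          // shared view, so the remaining data can be released.
          import java.util.LinkedHashMap;
          import java.util.Map;
          import org.springframework.stereotype.Controller;
          import org.springframework.web.bind.annotation.RequestMapping;
          import org.springframework.web.bind.annotation.RequestParam;
          import org.springframework.web.bind.annotation.ResponseBody;

          @Controller
          class FacultyRecordApi {

              @RequestMapping("/api/faculty")
              @ResponseBody                              // serialized directly (e.g., to JSON) instead of rendering a view
              public Map<String, Object> facultyRecord(@RequestParam("id") long id) {
                  // A real implementation would read from the shared data store or a reporting view.
                  Map<String, Object> record = new LinkedHashMap<String, Object>();
                  record.put("id", id);
                  record.put("department", "Example Department");
                  record.put("appointmentTitle", "Professor");
                  // Sensitive fields (e.g., evaluation text, protected personal data) are deliberately omitted.
                  return record;
              }
          }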
  • Application Security
    • Does the application comply with the CyberSafety standards? How was this assessed?
      • Yes; how it was assessed is unspecified [Section 5.0 Cyber Safety and Sensitive Data]
    • Does the application deal with:
      • PII? - yes
      • HIPAA? - no
      • FERPA? - unsure--do student evaluations in their MIV form fall under this category?
    • Has security been evaluated end-to-end (both the server-side and the client-side)?
      • Unknown, but probably yes [Section 5.0 Cyber Safety and Sensitive Data: "The new architecture, using Java and Spring Frameworks, offers easier traceability and maintenance of functions that might otherwise present security risks. During the development cycle, the project team has scanned the application multiple times to identify and fix security risks."]
  • Usability
    • Does the application leverage the UI of existing applications to reduce training requirements?
      • No
    • Has the application been formally evaluated for usability using industry standard techniques (card sort, paper mockups, eye tracking, user testing)?
      • Unspecified; should follow up
    • Does the project plan include a usability improvement process?
      • Unspecified
  • Accessibility
    • 508 compliant per campus policy (cite both)? How was this assessed?
      • Automated scanning tools?
      • Common checklists?
      • Testing by users who require accessibility accommodations?
      • Yes, via the QA process; not specified in the 200-45
    • Do project funding sources or other drivers require any additional accessibility standards compliance?
      • Unknown
  • Campus Middleware Services Integration
    • Does the application leverage available middleware services for functionality?
      • Authentication (CAS, Shib, Kerb, DistAuth, etc.)?
        • Yes, CAS (see the sketch after this list) [Section 4.0 Infrastructure Integration]
      • Authorization (LDAP, IAM, etc.)?
        • Yes, Kuali RICE [Section 4.0 Infrastructure Integration]
      • Enterprise Data Resources (DaFIS DW, PPS DW, Campus DW, etc.)?
        • Yes? Online directory [Section 4.0 Infrastructure Integration]
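    • For illustration only, a sketch of how the application can consume the CAS-asserted identity; it assumes the standard Jasig/Apereo Java CAS client filters are mapped in web.xml (which is what makes getRemoteUser() return the CAS user ID), and the servlet name is hypothetical:

          // Illustrative sketch: authentication is delegated to campus CAS via the
          // CAS client filter chain; the application only reads the asserted user ID.
          import java.io.IOException;
          import javax.servlet.ServletException;
          import javax.servlet.http.HttpServlet;
          import javax.servlet.http.HttpServletRequest;
          import javax.servlet.http.HttpServletResponse;

          public class WhoAmIServlet extends HttpServlet {
              @Override
              protected void doGet(HttpServletRequest request, HttpServletResponse response)
                      throws ServletException, IOException {
                  String userId = request.getRemoteUser();    // populated by the CAS client filters
                  if (userId == null) {
                      response.sendError(HttpServletResponse.SC_UNAUTHORIZED);
                      return;
                  }
                  // Authorization (e.g., Kuali RICE roles or LDAP group lookups) would be
                  // applied here, after authentication has been handled by CAS.
                  response.setContentType("text/plain");
                  response.getWriter().println("Authenticated as: " + userId);
              }
          }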
  • Documentation and Training
    • Is there end-user documentation for the application?
      • Unspecified
      • Online? Context-sensitive?
      • Printed book or electronic?
        • Unspecified
    • Sysadmin/manager docs?
      • Unspecified
    • Training required for end-users? For Sysadmins/managers?
      • How will training be made available?
      • Unspecified
  • Database transportability / "Modularity" (per Chip)
    • Possible (see the sketch below) [Section 8.0 Scalability: "The application was developed in Java but has been converted to use the Java Spring framework which is platform independent. This technology makes the application easier to maintain and expand, and if MySQL were to have limitations (not anticipated) the application could be ported to Oracle."]
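    • For illustration only, a sketch of the kind of vendor-neutral data access that supports such a port: the Java code is written against Spring's JDBC abstraction and ANSI SQL, so a MySQL-to-Oracle move would, in principle, change only the externalized driver class and JDBC URL. The connection settings and table name here are hypothetical:

          // Illustrative sketch: no vendor-specific SQL or driver calls in the Java code.
          import org.springframework.jdbc.core.JdbcTemplate;
          import org.springframework.jdbc.datasource.DriverManagerDataSource;

          public class PacketCount {
              public static void main(String[] args) {
                  // In practice these four values come from an external properties file;
                  // pointing at Oracle would mean an Oracle driver class and a jdbc:oracle: URL.
                  DriverManagerDataSource dataSource = new DriverManagerDataSource();
                  dataSource.setDriverClassName("com.mysql.jdbc.Driver");
                  dataSource.setUrl("jdbc:mysql://localhost:3306/miv_example");
                  dataSource.setUsername("example_user");
                  dataSource.setPassword("example_password");

                  JdbcTemplate jdbc = new JdbcTemplate(dataSource);
                  // ANSI-standard SQL only.
                  Integer count = jdbc.queryForObject("SELECT COUNT(*) FROM packet", Integer.class);
                  System.out.println("Packets: " + count);
              }
          }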
  • Hiring practices for programmers ("Personnel")
    • How was the team chosen?
    • Who is managing?
    • Are there adequate resources for development, maintenance, and user support (both tech support and task-specific knowledge)?
  • Business Process and Workload Analysis
    • Initial Data Loading
    • Ongoing data upkeep (CRUD)
      • No workload analysis provided for end users and departments
    • Was changing the business process part of developing the application (vs creating an electronic version of a paper process)?
      • [Section 1.0 Business Need: "It is designed to reduce redundant data entry, allow remote review of packets and reuse data for several purposes (for example producing multiple versions of CV’s and NIH Biosketch forms). Through workflow, MIV offers sequential access from the faculty, to the department reviewers, the Dean’s office, the central academic review committees, such as CAP, and to the Vice Provost and Chancellor."]
  • Sustainability
    • Yes, working to remove paper from the process [Section 1.0 Business Need: "Until the introduction of MIV to a pilot group, the vast majority of academic merit and promotion actions were paper based or used Word attachments which were ultimately produced in a hard copy paper file for final review and archive."]
    • Running on central campus hardware for efficiency