This page and its sub-pages track the UCD 2-5-x release.

DB Conversion Scripts : SAK-1871

Gradebook : SAK-1832

  • Need to evaluate whether to port/refactor the following changes
    • SAK-1746 Adhoc groups interfere with grading
    • SAK-764 Gradebook : GradeDownloadConfig.properties points to ips instead of classes
    • SAK-1062 Error generated in Gradebook when submitting a points grade to a gb item which will not be included in the course grade.
    • SAK-1348 Hide Begin Final Grading button from TA's
    • SAK-935 Refactor gradebook modifications for 2.4
    • SAK-936 add site memberships (back) into default (all sections) views of roster and course grades
  • One of the first steps toward this overall objective is to inject our CourseGradesToFinalGradingToolConverter into the Gradebook in place of the sample one that is injected out of the box. This is done by modifying spring-facades.xml in the gradebook project; SAK-1870 tracks that work. See the sketch just after this list.
  • Unfortunately, the CourseGradesToFinalGradingToolConverter does not execute correctly unless the patch is applied to EnrollmentTableBean, and that patch does not apply cleanly, so it's necessary to get into the guts of that bean and rework it by hand.
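
For reference, the override in spring-facades.xml would look roughly like the sketch below. The bean id and the edu.ucdavis package are illustrative placeholders rather than the actual names in our tree; only the CourseGradesToFinalGradingToolConverter class name comes from the notes above.

       <!-- spring-facades.xml (gradebook project): swap the out-of-the-box
            sample converter for our local implementation; the bean id and
            package shown here are placeholders -->
       <bean id="courseGradesConverter"
             class="edu.ucdavis.gradebook.CourseGradesToFinalGradingToolConverter" />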
SAK-1746 "Adhoc groups interfere with grading"

r4417 /gradebook/branches/post_2-4-0/app/ui/src/java/org/sakaiproject/tool/gradebook/ui/EnrollmentTableBean.java

svn diff -r4416:4417

Index: app/ui/src/java/org/sakaiproject/tool/gradebook/ui/EnrollmentTableBean.java
===================================================================
--- app/ui/src/java/org/sakaiproject/tool/gradebook/ui/EnrollmentTableBean.java	(revision 4416)
+++ app/ui/src/java/org/sakaiproject/tool/gradebook/ui/EnrollmentTableBean.java	(revision 4417)
@@ -45,6 +45,7 @@
 import org.apache.commons.logging.LogFactory;
 import org.sakaiproject.section.api.coursemanagement.CourseSection;
 import org.sakaiproject.section.api.coursemanagement.EnrollmentRecord;
+import org.sakaiproject.section.api.coursemanagement.LearningContext;
 import org.sakaiproject.service.gradebook.shared.UnknownUserException;
 import org.sakaiproject.tool.gradebook.Category;
 import org.sakaiproject.tool.gradebook.GradingEvent;
@@ -237,7 +238,19 @@
 			i = officialEnrollments.iterator();
 			while(i.hasNext()){
 				EnrollmentRecord r = (EnrollmentRecord) i.next();
-				enrMap.put(r.getUser().getUserUid(), r);
+				// since ad hoc groups will end up in here make sure provided enrollments
+				// are not overridden by them
+				LearningContext lc = r.getLearningContext();
+				boolean add = false;
+				if(lc instanceof CourseSection) { 
+					//just to make sure, shouldn't add anything that is not of this ilk anyways
+					CourseSection s = (CourseSection)lc;
+					if(null!=s.getEid() || 
+							!enrMap.containsKey(r.getUser().getUserUid())) {
+						add=true;
+					}				
+				}
+				if(add) enrMap.put(r.getUser().getUserUid(), r);
 			}
 			
 			enrollments = new ArrayList<Object>(enrMap.size());

The code has changed extensively here. I can't even find the method we patched; it has been broken up into about four other methods, each of which delegates the info gathering to AuthzSectionsImpl. Interesting.

Melete : SAK-1833

Content Hosting, Resource Conversion : SAK-????

  • From the 2-5-x installation guide:
    Moving from 2.4 to 2.5 requires adding several columns to the database tables used by the Content Hosting Service
    (the service used by the Resources tool, Dropbox, filepicker and WebDAV). The conversion scripts contain DDL
    statements to accomplish those changes. The new columns enable a switch from XML serialization to "binary-entity" 
    serialization, which is faster and requires less memory. They also enable improved performance of quota calculations. 
    You can get the benefits of binary-entity serialization without running the conversion utility, but you will never get 
    the quota-calculation improvements without doing the full conversion.
    
    There are two start-up modes:
    
    1. Dual mode - if the code detects the binary-entity columns at start up, it runs in dual mode. This means it is 
    capable of reading XML but will write only binary. This gives you the benefits of binary-entity serialization 
    without running the conversion utility, but you will never get the quota-calculation improvements without doing the full conversion.
    
    2. Binary only - the system reads and writes only binary. This mode kicks in at start up when the code detects 
    that all of the XML fields are null, which can only happen after a full database migration. Only once this is 
    done can a system fully realize all the performance enhancements that binary-entity serialization offers.
    
    There are two methods to get a full database conversion:
    
    1. Run the conversion utility, which can be run on a live system. Systems running Oracle with large amounts of data should not attempt this right now.
    2. Run the conversion at start up. This is only recommended for installations with smaller datasets.
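
  • For illustration, the DDL the guide refers to has roughly this shape. The table and column names below are
    stand-ins rather than the authoritative statements; those live in the 2.4.x-to-2.5.x conversion scripts shipped
    in reference/ (e.g. init1.sql, mentioned in Jim's reply below).

       -- illustrative only; see the reference/ conversion scripts for the real DDL
       ALTER TABLE CONTENT_RESOURCE ADD BINARY_ENTITY BLOB;     -- binary-entity serialization
       ALTER TABLE CONTENT_RESOURCE ADD FILE_SIZE NUMBER(18);   -- enables fast quota calculation
       ALTER TABLE CONTENT_COLLECTION ADD BINARY_ENTITY BLOB;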
    
  • 2008-04-07 : Sent mail to Jim Eng requesting information/status about method #1, which seems not quite ready for large Oracle installations
    • Jim's Reply
      Here's what I recommend for oracle systems right now (until the oracle queries get updated in trunk and 2.5.x):
      
      I'd recommend you do a separate build to generate the conversion utility, rather than building it as part 
      of building sakai for your production servers.  The best way to do that for oracle at this point is to base 
      it on 2.4.x rather than 2.5.x.  The conversion utility will be essentially the same.  Here's how to do that:
      
      Check out Sakai 2.4.x.  You could start with the cafe checkout of 2.4.x, I think.  That should give you everything 
      needed.  Replace the content, db and entity projects with the SAK-12239 branches of those projects (a condensed 
      sketch of the whole sequence appears after the build step below).
      
             https://source.sakaiproject.org/svn/content/branches/SAK-12239
             https://source.sakaiproject.org/svn/db/branches/SAK-12239
             https://source.sakaiproject.org/svn/entity/branches/SAK-12239
      
      You need to select a project.xml file for the content/content-conversion project.  I do that by going to 
      dev/sakai/content/content-conversion and making a copy of project.xml.oracle named project.xml in that folder.  
      Then I go back to dev/sakai/ and run a maven-1 build of the entire Sakai checkout.
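      
      (Editor's note: a condensed sketch of the checkout-and-build sequence above, assuming the 2.4.x checkout 
      lives in dev/sakai; the maven goal name is an assumption and may differ in your setup.)
      
             cd dev/sakai
             # swap the three projects for their SAK-12239 branches
             for p in content db entity; do
                 rm -rf $p
                 svn checkout https://source.sakaiproject.org/svn/$p/branches/SAK-12239 $p
             done
             # select the oracle project descriptor for the conversion tool
             cp content/content-conversion/project.xml.oracle content/content-conversion/project.xml
             # maven-1 build of the entire checkout (goal name is an assumption)
             maven sakai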
      
      You will need the oracle jdbc driver to build this project.  We can't provide a mechanism to include it 
      automatically because you need to accept Oracle's license terms. So you need to get the latest oracle 
      driver and put it in your local maven-1 repository. Here's the maven dependency:
      
                     <dependency>
                             <groupId>oracle</groupId>
                             <artifactId>ojdbc14</artifactId>
                             <jar>ojdbc14.jar</jar>
                             <type>jar</type>
                             <url>http://download.oracle.com/otn/utilities_drivers/jdbc/10203/ojdbc14.jar</url>
                             <properties>
                                     <war.bundle>true</war.bundle>
                             </properties>
                     </dependency>
      
      That indicates that the jar can be downloaded from 
      http://download.oracle.com/otn/utilities_drivers/jdbc/10203/ojdbc14.jar, which was the version we 
      used.  If you have an earlier version of the driver, it may not work.  If there's a newer version, 
      I'd use that instead.  The default location for the maven-1 repo on my laptop is ~/.maven/repository, 
      so I put a copy of that file at this location:
      
             ~/.maven/repository/oracle/jars/ojdbc14.jar
      
      If you don't do that, your maven-1 build will fail and display the URL from which you can get the oracle 
      jdbc driver.
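      
      (Editor's note: concretely, after downloading ojdbc14.jar and accepting Oracle's license terms, placing 
      the jar is just the following; adjust the path if your maven-1 repo lives elsewhere.)
      
             mkdir -p ~/.maven/repository/oracle/jars
             cp ojdbc14.jar ~/.maven/repository/oracle/jars/ojdbc14.jar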
      
      Once you've built the war file, you can find it in dev/sakai/content/content-conversion/target.  That war 
      file contains everything needed to run the conversion utility.  You will need to do a little more configuration 
      before running the conversion utility.
      
      You could start up the conversion utility without any other preparation, but I recommend you first run the 
      2.4.x to 2.5.x conversion script that is in reference and then you can start up the 2.5.x code.  If you 
      haven't run the conversion script, you can find a script named "init1.sql" that contains the ddl statements 
      to add the required new columns and indexes.
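      
      (Editor's note: for example, running that script against the target schema with sqlplus; the connection 
      details here are placeholders.)
      
             sqlplus $dbUser/$dbPass@$dbSid @init1.sql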
      
      Here are the steps for running the conversion (using a war file built from the latest code from 
      subversion, r45197 or later):
      
      1) Expand the war file, which creates a folder named "sakai-content-conversion"
      2) Navigate down to the sakai-content-conversion/WEB-INF/classes directory
      3) Edit the file  sakai-content-conversion/WEB-INF/classes/runconversion.sh  and provide the absolute 
         path for the java runtime
      4) Edit the file named sakai-content-conversion/WEB-INF/classes/upgradeschema-step1.config and provide 
         the connection info (i.e. supply values for dbURL, dbUser and dbPass)
      5) Start the conversion utility for part 1 of the conversion by running the shell script named 
         runconversion.sh in no-hangup mode with the upgradeschema-step1.config file as a parameter and 
         piping output to log files.  For example, after changing directories to sakai-content-conversion/WEB-INF/classes/,
         from the shell prompt, issue a command with this form (where $LOGS is the path to a directory where you 
         have permission to write log files):
      
      # nohup ./runconversion.sh ./upgradeschema-step1.config 1>> $LOGS/conversion1.log 2>> $LOGS/conversion1.errors &
      
      6) Allow part 1 of the conversion to complete without intervention, if possible.  It is possible to
         stop the conversion by creating a file named "quit.txt" in the sakai-content-conversion/WEB-INF/classes 
         directory.  If this part of the conversion is stopped, it can be restarted by removing the quit.txt 
         file from the sakai-content-conversion/WEB-INF/classes directory and then repeating step 5.  The 
         conversion will pick up where it left off.  (A condensed sketch of this stop/restart procedure follows 
         step 9 below.)
      7) Edit the file named sakai-content-conversion/WEB-INF/classes/upgradeschema-step2.config and provide 
         the connection info (i.e. supply values for dbURL, dbUser and dbPass)
      8) Start the conversion utility for part 2 of the conversion by running the shell script named 
         runconversion.sh in no-hangup mode with the upgradeschema-step2.config file as a parameter and piping 
         output to log files.  For example, after changing directories to sakai-content-conversion/WEB-INF/classes/, 
         from the shell prompt, issue a command with this form:
      
      # nohup ./runconversion.sh ./upgradeschema-step2.config 1>> $LOGS/conversion2.log 2>> $LOGS/conversion2.errors &
      
      9) Allow part 2 of the conversion to be completed without intervention, if possible.  It is possible to stop 
         the conversion by creating a file named "quit.txt" in the sakai-content-conversion/WEB-INF/classes directory. 
         If this part of the conversion is stopped, it can be restarted by removing the quit.txt file from the 
         sakai-content-conversion/WEB-INF/classes directory and then repeating step 8.  The conversion will pick 
         up where it left off.
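      
      (Editor's note: a condensed sketch of the stop/restart procedure from steps 6 and 9, run from 
      sakai-content-conversion/WEB-INF/classes with $LOGS set as above.)
      
             # ask the running conversion to stop gracefully
             touch quit.txt
             # ...later: remove the sentinel and restart; it resumes where it left off
             rm quit.txt
             nohup ./runconversion.sh ./upgradeschema-step1.config 1>> $LOGS/conversion1.log 2>> $LOGS/conversion1.errors &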
      
      It is not necessary to convert the CONTENT_RESOURCE_DELETE table.
      
      The current code in this branch is very similar to what we used at Michigan to do the conversion last month. 
      The primary difference is that we separated out the ddl statements a little more and didn't allow the 
      conversion utility to do any ddl.  Letting the conversion utility handle some of the ddl simplifies the 
      process but uses the same sql statements.
      
      I'm available to answer questions or provide advice.
      
      • Jim's Reply to the Reply
        I'm not sure whether you're referring to the conversion scripts or the conversion utility when you say 
        "scripts", so I will try to answer about both.
        
        When I referred to "scripts" I meant the sql conversion scripts (from /reference/).  The part of those 
        scripts related to Content Hosting can be run while 2.4.x is still running.  They add columns and indexes 
        that will be ignored by the 2.4.x code.  Then you can shut down 2.4.x and bring up 2.5.0.  The 2.5.0 code 
        will convert a few of the existing content collection records during startup (the top-level folders).  It 
        will also convert a limited number of other records as people create or revise resources and collections.
        
        Then there's the piece Ian wrote that you start up using the runconversion.sh shell script.  I refer to that 
        as the "conversion utility" in an ineffective attempt to distinguish it from the sql conversion scripts.  The 
        conversion utility can be run while the new version of sakai (2.5.0 or later) is running.  Once the conversion 
        utility finishes migrating the data, the running instances of sakai will automatically start using the new 
        columns that were added.
        

    • Owen's Reply

We discovered last week that runconversion.sh upgradeschema-oracle.config
doesn't work for us.

Jim then very helpfully explained that he too found himself needing to
modify both the config and the script extensively. I think he is in the
midst of preparing a better version to share.

Side note: the switch to 2.5 also uncovered our local need to upgrade the
Oracle driver we were using. This needs to happen just to get things
working at all -- even prior to any conversion. The symptoms there were
problems uploading files (e.g., exceptions and seeing lots of SAX parsing
errors in catalina.out). See these for background:

http://bugs.sakaiproject.org/jira/browse/SAK-11960
http://article.gmane.org/gmane.comp.cms.sakai.devel/17574

JSR-170 : SAK-???

  • Sakai FND Confluence page: http://confluence.sakaiproject.org/confluence/x/kno
  • 2008-04-07 : Sent mail to sakai-dev and the production mailing list asking about the JSR-170 Content Hosting status. Who is planning on running it, and who is running it already?
    • Omer@rice.edu replied indicating that Rice is switching to JSR-170 this summer