For the last couple of days I have been working on a new plugin called “MIS Enrolment”, which allows you to connect to programme records held in an external database and then enrol users onto a Moodle course based upon their participation in those programmes. The enrolment method works by allowing teachers to create an abstract “link” between their course on Moodle and the programmes held by the MIS.
This creates a many-to-many relationship between Moodle courses and MIS programmes, and will allow teachers to easily enrol both staff and students onto the correct courses based upon the records held by an organisation’s MIS. More information to come on this over the next few weeks!
For now I am revisiting an old problem which I originally dealt with in my first month of developing for Moodle 2. To get this enrolment method working effectively we have to create and maintain an internal copy of the data from the MIS. Normally you could just truncate (wipe clean) your records and insert them again from source, but in this case I need to compare and check:
- What records already exist? Do they need updating?
- What records are new? They will need inserting!
- What records are missing from MIS? They will need archiving!
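As a rough sketch of that three-way comparison (the function name `classify_rows` and the hash-keyed arrays are my own illustration, not part of the plugin): if both the local copy and the MIS extract are keyed by the programme record’s unique id, with a content hash as the value, PHP’s key-based array functions sort the rows into the three buckets very cheaply.

```php
<?php
// Hypothetical helper: classify the MIS rows against our local copy.
// Both arrays are keyed by the record's unique id; the values are
// content hashes so changed rows are cheap to spot.
function classify_rows(array $local, array $remote): array {
    $insert  = array_diff_key($remote, $local);   // in the MIS, not yet local
    $archive = array_diff_key($local, $remote);   // local, gone from the MIS
    $update  = [];
    foreach (array_intersect_key($remote, $local) as $id => $hash) {
        if ($local[$id] !== $hash) {
            $update[$id] = $hash;                 // present in both, but changed
        }
    }
    return [$insert, $update, $archive];
}

// Tiny worked example (made-up ids and hashes):
$local  = ['P1' => 'aaa', 'P2' => 'bbb', 'P3' => 'ccc'];
$remote = ['P2' => 'bbb', 'P3' => 'xxx', 'P4' => 'ddd'];
[$insert, $update, $archive] = classify_rows($local, $remote);
// $insert holds P4, $update holds P3 and $archive holds P1.
```

Doing the classification in memory like this means only the rows that actually need work ever touch the database.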
And when you are looping through 10,000 records or more this can all result in a great many individual inserts, selects and deletions. So how can you maximise the efficiency of all this? This time round I am making use of delegated transactions in Moodle to handle the bulk of database inserts.
The mechanism defers the commit: instead of committing every statement individually, all of a process’s writes are wrapped in one large transaction which is committed once at the end. As a result you save a great deal of per-statement overhead and significantly reduce execution time, because your script isn’t constantly waiting on the database to confirm each write.
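In Moodle code this looks roughly like the sketch below. It assumes a Moodle environment where the global `$DB` object is available, and the table name `enrol_mis_participation` is a made-up example; `start_delegated_transaction()`, `insert_record()`, `allow_commit()` and `rollback()` are the real Moodle DML calls.

```php
<?php
// Sketch only: requires a Moodle environment (the global $DB DML object).
global $DB;

$transaction = $DB->start_delegated_transaction();

try {
    foreach ($participationrows as $row) {
        // Each insert runs inside the single outer transaction rather
        // than being committed individually.
        $DB->insert_record('enrol_mis_participation', $row);
    }
    // Nothing is made permanent until every insert has succeeded.
    $transaction->allow_commit();
} catch (Exception $e) {
    // Undo every insert if any single one fails (this rethrows $e).
    $transaction->rollback($e);
}
```

A useful side effect of this pattern is that the import is all-or-nothing: a failure part way through leaves your internal copy untouched rather than half-updated.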
For comparison I tried inserting my test participation data of 9483 new records. Using the standard DB functions the script took 47 seconds to execute, a steady 202 inserts per second. By comparison the delegated transaction handled exactly the same data inserts in just 10 seconds, an effective rate of 948 inserts per second.
If you are working with large datasets on Moodle I would definitely suggest checking this functionality out as part of your project!