Past Teaching Performance


Over the years my own teaching performance has improved as it has become more theoretically informed, and I rather wish I had had better exposure to theory before I started. I have certainly been helped by a gregarious and extroverted nature; often after a day's teaching I am mentally awake, even if physically exhausted. After several years of occasional informal teaching, first at the Parliament of Victoria and then at the Ministry of Foreign Affairs in Timor-Leste, I took up high-performance computing (HPC) education at the Victorian Partnership for Advanced Computing (VPAC) and at the University of Melbourne. It was only at VPAC that I undertook formal training in education, initially a Certificate IV ("CertIV") in Training and Assessment (2008-2009), then a Graduate Certificate ("GradCert") in Tertiary and Adult Education (2013).

Q. What do you do well?

When I previously delivered teaching on a more informal and ad hoc basis, there were some concepts which I grasped almost intuitively. It was clear to me, for example, that a primary or secondary school approach would not work in an adult environment, where equality, guidance, and shared exploration were more appropriate. Later I discovered (first in Timor-Leste, then with a high proportion of international students) that there were some political-cultural differences in this regard, especially when teaching those who come from societies where a degree of subservience to authority is normal and often necessary. In such circumstances the aforementioned principles still apply, but learners have to be introduced to what is, for them, a new teaching style.

Another aspect that worked early on derives from the "hacker" tradition: "Always yield to the Hands-On Imperative!" (Levy, 1984). I noticed that many IT educators simply told people what they should do rather than giving learners sufficient opportunity to do it themselves (including making mistakes). This did not seem a sensible approach to me, and my teaching material includes a great deal of direct "finger-to-keyboard" immersive activity. Later I discovered that this matched the idea that there are disciplinary learning styles (as opposed to individual learning styles) (Rogowsky et al., 2015). The hands-on approach, combined with collaborative learning ("paired programming" in the IT world), seems to be the most effective combination.

Some particular learning and content areas that have seen improvement through the GradCert include the concepts of structured learning and scaffolding. Some of my earlier material lacked these: some concepts were introduced before their prerequisites had been taught, and they were not always integrated with existing material or the wider context in a systematic way (although I have always emphasised the learning of low-level IT skills and theory in preference to icon-driven secondary notation, where one learns an interface rather than the system). This has certainly changed over the years, and I am now very conscious of ensuring that new material is not introduced until all prerequisites have been covered.

Q. What could be improved?

Perhaps the strongest source both of areas for improvement and of affirmation of teaching content has been anonymous feedback. Whilst the University of Melbourne insists on using what I consider to be a seriously flawed method (Lafayette, 2018), feedback from VPAC courses was based on a qualitative scale differentiated by delivery, content, and environment. With such feedback, specific changes could be made as needed with greater precision (e.g., HPC job submission examples with particular software applications). Learners themselves, as the final "consumers" of an educational service, have the highest priority in assessing teaching performance. Whilst they may not see everything that goes on behind the scenes, and indeed may misidentify problems, they are the people for whom course content is designed and delivered.

In terms of improvement, there are some very obvious institutional matters that I know will have to be dealt with in the coming year. Firstly, there is a waiting list of over 700 registrations for my courses, and I deliver approximately 60 registrations per month; the rate of new registrations is greater than my rate of delivering courses. Secondly, I need to introduce a new course on regular expressions, as this has been requested by life-sciences researchers, who now make up a much larger proportion of the research population on our HPC system. Finally, I am very aware of the need to update my own knowledge of more recent versions of multithreaded and message-passing programming and debugging tools; much of my own content is several years old now and, whilst not wrong, is falling behind recent developments.

Q. What will you do to improve?

Clarity about which areas need improvement also points towards the means of implementing it.

* Feedback system. I have raised with my managers the problems with the Net Promoter Score (NPS) as a metric; however, it is university-wide policy and will not be changing. Nevertheless, the opportunity exists to elaborate on the NPS in a manner which at least partially addresses its worst aspects. This includes seeking qualitative responses to the rating, along with additional measures for delivery, content, and facilities. In other words, to use the NPS as an overall evaluation that satisfies university policy, and to derive the genuinely useful measures from additional feedback metrics.

* To deal with the excess of user interest, I have sought a new teaching environment, as the room in which I currently conduct lessons holds only 20, and my manager is also seeking to incorporate video-recording and editing of these lessons. A particular challenge is that a small class size allows for a significant amount of learner input to tailor material to their own use-cases. Effort has been made to recruit an assistant in this process, but there are few who have the right combination of teaching skills and HPC knowledge.

* Whilst course revisions do occur at the end of each delivery (if only to add minor points, correct spelling mistakes, etc.), the revision of the multithreading and message-passing material requires a more thorough review. Fortunately I have received assistance from a number of international experts on these topics, including two members of the respective governing standards boards for these APIs and libraries.

* Review learners' engagement. Whilst learners are autonomous and are not required to attempt the examples provided in the workshop (these are not assessed), the opportunity exists to review their engagement with the course on the basis of the command history in their user accounts and the outputs of their job submissions. Whilst this can be time-consuming, it provides the opportunity to identify areas where the course requires additional emphasis, if learners are making mistakes and running into problems without asking.
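A minimal sketch of what such an engagement review might look like is below. The expected commands, the sample history content, and the history-file location are all assumptions for illustration, not the actual review process:

```python
# Commands a workshop participant would be expected to have tried
# (illustrative; the real list would follow the course exercises).
EXPECTED_COMMANDS = ["module load", "sbatch", "squeue"]

def missed_commands(history_text: str) -> list[str]:
    """Return the expected commands that never appear in a shell history."""
    return [cmd for cmd in EXPECTED_COMMANDS if cmd not in history_text]

# In practice the text would be read from a learner's account, e.g.
# (hypothetical path):
# history_text = open("/home/learner/.bash_history").read()
sample_history = "module load gcc\nsbatch job.slurm\nls -l\n"
print(missed_commands(sample_history))  # -> ['squeue']
```

Aggregating such results across a class would highlight exercises that learners are skipping or struggling with, without requiring them to ask for help.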


Lafayette, L. (2018). Net Promoter Score: The Most Useless Metric of All.

Levy, S. (1984). Hackers: Heroes of the Computer Revolution. Anchor Press/Doubleday.

Rogowsky, B., Calhoun, B., & Tallal, P. (2015). Matching Learning Style to Instructional Method: Effects on Comprehension. Journal of Educational Psychology, 107(1), 64-78.