Meet Jason Yang, Antares QBot Developer

December 20th, 2019

Meet Jason Yang, the brainy developer behind the now-famous QBot: a world-first, AI-infused Microsoft Teams chatbot that drives user engagement. It uses machine learning to build a body of knowledge that students can tap for 24/7 support. We sat down to interview him about the challenges he faced, the exciting new features he is working on, and what the future of QBot will look like now that it is being open-sourced.

What is your role here at Antares Solutions?

I’m a technical consultant. My passion is working with our customers to solve real-world problems by leveraging technology. As a solutions consultant at Antares, I have the privilege to work with many inspiring customers like Dr. David Kellermann to make a difference in everyone’s lives, including mine.

Where did the Qbot project originate?

It all stemmed from Dr. David Kellermann. He wanted to modernise the teaching environment at universities. He wanted one channel (Microsoft Teams) that students could go to for everything in the course, so they could talk, collaborate, and access lectures, notes, assignments, exams and so on. The problem was that Teams conversations are linear, so questions asked by students were getting lost. So we thought, let’s have a bot: every time a student asks a question, it notifies a tutor, the question is linked to that tutor’s activity feed, and the tutor is responsible for answering it.

What was the most challenging part of the project?

We started working on the project around late 2017. On the technical side, Teams was very new, only about one or two years old at the time, and Microsoft was still in the process of releasing new features. The bot really needed to be able to read the answers students wrote in reply, but there was no way to do that. Many things weren’t possible then; we needed to capture user replies but couldn’t, so we had to ask Microsoft to build that capability out as an API.

What we were doing had never been done before. David had lots of brilliant and innovative ideas about the user experience, but because we were on a new platform, using a new technology (the Bot Framework) and a whole new way of engaging with the user, we had to figure out how to bring his vision to life. It was hard to figure it all out on my own, as there were no existing guidelines on how to do this. It was fun, but quite challenging.

As for non-technical challenges, the deadline was quite hard. There were budget limitations too, and the bot had to be ready for the next semester of classes, so we only had eight weeks of budget and timeline. I did spend weekends working on it.

What do you think is the most exciting feature of QBot?

Once the bot went live, students would take photos of questions on paper and ask the bot “how do I do this?”. From the bot’s point of view, the question is just “how do I do this” with no other context, so this feature set out to solve that problem. We were thinking, ‘how do we give the bot that context?’ The solution we came up with was embedding QR codes in all of the question, exam and assignment material. When a student takes a photo of the content, we scan the photo for the QR code, look it up in the database, and the bot can figure out exactly which question the student was referring to. This happened quite often because we ran the pilot in an engineering course, which involves a lot of graphs, diagrams and equations.

The cool thing that came out of that is that we now have a way of knowing what context the bot is being used in, so we can do additional things. For example, if the bot detects the user is trying to get help on a specific question from the weekly homework booklet, it can provide a hint instead of the answer: “I see you’re working on question 13a, would you like a hint?” Or “would you like to watch a video of the tutor explaining the subject step by step?”
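To make that flow concrete, here is a minimal illustrative sketch in Python (not the actual QBot code, which isn’t shown in this article) of the lookup-and-hint idea Jason describes. The question codes, the in-memory “database” and the reply wording are all hypothetical, and the QR payload is assumed to have already been decoded from the photo by an image library.

```python
# Illustrative sketch only: resolve a QR payload embedded in course material
# back to a specific question, then offer a hint instead of the answer.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Question:
    booklet: str
    number: str
    hint: str
    video_url: str


# Stand-in for the database, keyed by the code printed in the material (hypothetical data).
QUESTIONS = {
    "ENGG1000-WK05-Q13A": Question(
        booklet="week 5 homework booklet",
        number="13a",
        hint="Start by resolving the forces acting at the pin joint.",
        video_url="https://example.edu/videos/wk05-q13a",
    ),
}


def reply_to_photo(decoded_code: Optional[str]) -> str:
    """Build the bot's reply given the QR payload decoded from the student's photo.

    In practice the payload would come from a QR decoding library run over the
    uploaded image; here it is passed in directly to keep the sketch self-contained.
    """
    question = QUESTIONS.get(decoded_code) if decoded_code else None
    if question is None:
        # No recognisable code: fall back to the normal "notify a tutor" flow.
        return "I'm not sure which question this is - I'll flag it for a tutor."
    return (
        f"I see you're working on question {question.number} from the "
        f"{question.booklet}. Would you like a hint, or a video of the tutor "
        f"explaining it step by step?"
    )


if __name__ == "__main__":
    print(reply_to_photo("ENGG1000-WK05-Q13A"))
    print(reply_to_photo(None))
```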

It was a lot of work, but the pay-off was a really incredible learning experience for the students.

How did it feel to have your solution showcased on a global stage at Microsoft Inspire, and more recently Microsoft Innovate?

Weird, I guess. On a personal note, I remember when I first started in this field, one of my role models was my cousin who is also a software developer. I used to always ask him ‘what have you built that’s gone into the public? What have you built that’s helping people’? Once this became public, I realised that it is something I built. I felt accomplished.

A lot of things we build are very internal; they help companies with very specific processes. Suddenly, we have this thing that is going to be open-sourced, and it will help improve the university life of thousands of students. I thought, why couldn’t Dr Kellermann have been my lecturer at university?

You’ve been nicknamed “The Master of the Bots”, where did that come from?

I don’t know who coined it. It might have been Dr Kellermann who came up with it out of nowhere!

What will it mean for the future of QBot now that it is being open-sourced?

That’s a hard question. The open-source version is almost like a trial version for universities to test and have a go with. Then [Antares] will build our own e-Learning solution with QBot as the base, plus everything else. Our QBot will have all the complex features, whereas this version has the basics – what we had at UNSW minus some functionality – packaged in a way that lets users set it up and build their own features. Essentially it is a QBot template: the starting point I began with, made free to universities across the world so they can build on it from there.

Do you have any plans for future features of QBot?

We are working on a few things. In engineering a lot of answers are clearly right or wrong, but one thing that is difficult to mark is when students are asked to draw a diagram, for example sketching the trajectory of some object as a graph. We are using machine learning to identify the components of the graph, and we train a model on which features an accurate answer should contain. The model then looks at other images to see whether those features appear. If this ends up working it will drastically reduce the time tutors spend on marking, and let them spend that time validating that the auto-marker we have provided is correct.
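As a rough illustration of the marking step Jason describes (not the actual system), the sketch below assumes an image model has already reported which rubric features it found in a student’s sketch, and turns that into a provisional mark plus a flag for tutor review. The feature names, weights and confidence threshold are invented for the example.

```python
# Illustrative sketch only: combine model-detected diagram features into a
# provisional mark, and flag low-confidence cases for a tutor to validate.
from typing import Dict, Set

# Rubric: each feature the model is trained to look for, with its weight (hypothetical).
RUBRIC: Dict[str, float] = {
    "parabolic_trajectory": 0.4,
    "correct_launch_angle": 0.2,
    "labelled_axes": 0.2,
    "maximum_height_marked": 0.2,
}


def provisional_mark(detected: Set[str], confidence: Dict[str, float]) -> dict:
    """Score a submission from the rubric features the model found.

    `detected` is the set of rubric features found in the image;
    `confidence` maps each rubric feature to the model's confidence (0-1).
    """
    score = sum(weight for feature, weight in RUBRIC.items() if feature in detected)
    # If the model was unsure about any rubric feature, send it to a tutor.
    needs_review = any(confidence.get(f, 0.0) < 0.8 for f in RUBRIC)
    return {"mark": round(score, 2), "needs_tutor_review": needs_review}


if __name__ == "__main__":
    print(provisional_mark(
        detected={"parabolic_trajectory", "labelled_axes"},
        confidence={"parabolic_trajectory": 0.95, "correct_launch_angle": 0.55,
                    "labelled_axes": 0.9, "maximum_height_marked": 0.7},
    ))
```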

Other than that: custom study packs. The way David runs his course, there is a quiz every week and a mini test every three weeks covering the content from those three weeks. By the end of the semester we have data for every student on how well they did on each topic. The idea behind custom study packs is to identify the topics a student struggles with and provide study material that brings them back to the basics. Like, “it looks like you’re struggling with question 3, so we’re going to give you some core concepts”. Students who did well are given more advanced material so they can get more practice. We then deliver the study packs during the mid-semester break, right before final exams, so each student gets a set of exam study material tailored to how they performed during the semester. This is how pass marks were brought up from 65% to 85%.
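For illustration only, the sketch below shows one way per-topic quiz results could be turned into a tailored study pack along the lines Jason describes: weak topics get back-to-basics material, strong topics get advanced practice. The topics, resource titles and thresholds are hypothetical.

```python
# Illustrative sketch only: assemble a custom study pack from per-topic scores.
from typing import Dict, List

# Hypothetical catalogues of study material per topic.
CORE_MATERIAL = {
    "statics": ["Core concepts: force balance", "Worked examples: free-body diagrams"],
    "dynamics": ["Core concepts: Newton's second law", "Worked examples: kinematics"],
}
ADVANCED_MATERIAL = {
    "statics": ["Challenge set: indeterminate structures"],
    "dynamics": ["Challenge set: rotational dynamics"],
}


def build_study_pack(topic_scores: Dict[str, float],
                     weak_threshold: float = 0.6,
                     strong_threshold: float = 0.85) -> List[str]:
    """Return a tailored list of resources from a student's topic scores (0-1)."""
    pack: List[str] = []
    for topic, score in topic_scores.items():
        if score < weak_threshold:
            # Struggling: go back to the basics for this topic.
            pack.extend(CORE_MATERIAL.get(topic, []))
        elif score >= strong_threshold:
            # Doing well: offer more advanced practice.
            pack.extend(ADVANCED_MATERIAL.get(topic, []))
    return pack


if __name__ == "__main__":
    print(build_study_pack({"statics": 0.45, "dynamics": 0.9}))
```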

Some fun facts about Jason

  • He went to UNSW Sydney for his studies
  • He went from student, to alumnus, to staff, and used his original student number as his login throughout the project
  • His go-to sustenance while working on the project was coffee

It was great to see the developer’s perspective on QBot’s development. It is clear that the project was quite extensive and required a lot of critical thinking to overcome the many challenges faced. Antares was very lucky to have Jason work so hard on the project and provide such a high-end solution for UNSW and Dr. David Kellermann. We look forward to seeing QBot develop over time now that it is open-sourced.

Written by Isabella Mitchell.