Voice technology is becoming the newest medium through which students can interact with their education. With Amazon Alexa, students worldwide are getting more from their educational experience through simpler ways to acquire information and stay organized. Institutions can personalize campus life by building private skills, providing daily flash briefings, and quickly retrieving campus information. Join Chris Dixon, Head of IT and Innovation at Lancaster University, to learn how the L.U. Alexa skill is adding another dimension to the student university experience, engaging students on a variety of topics from study skills and welfare issues to social life and academic progression.
Greetings. It’s a great pleasure to be here and to share with you our story. It’s a story of how we grew to recognise the amazing potential that voice has to transform the delivery of services to our community, and to improve the lives of our students. It’s a journey that introduced us to the power of cloud technology to rapidly address our requirements, and it’s a journey that has been fun, fast and fruitful.
Today I want to show you why at Lancaster we believe that voice technology is a vital way that we need to communicate with our communities. I want to share with you how we built our voice skill, with a small team in just 120 days, and I want to share where we think the journey may take us next.
But first, let me tell you a bit about Lancaster University. If we were to leave Brussels and fly across the English Channel, we would soon find ourselves flying past Manchester, and arriving just south of the beautiful Lake District into our 200-hectare parkland campus.
We have great transport links by air from Manchester Airport, and by road via the six-lane M6 motorway that runs just outside the campus.
Lancaster is one of the smallest urban areas in the UK to be categorised as a city, with a population of around 50,000 people, and it has both a historic castle and a cathedral.
We have around 13,000 students on-site from over 140 countries, and 2,500 staff,
and we also have international campuses in Ghana, China, Malaysia and Germany.
Given those numbers, we are a relatively small UK university, but many have said that we are a university that punches well above our weight.
We have consistently ranked in the top 10 of UK league tables, and we come in 130th out of over 1,000 universities worldwide.
And we’re not just good at research, but our teaching is also rated highly, achieving a Gold standard in the Teaching Excellence Framework.
And in 2018 we were chosen to be awarded the UK University of the Year award.
But what about our digital and IT Services?
We are a university that values our student experience, and we have strong ambitions to be more than just average in the delivery of our IT Services.
Our IT Services department has a history of innovation. For example, we created one of the UK's first superfast networks, which connected schools and public buildings and now, as a separate company, provides rural areas with gigabit broadband.
And we produced a sector-leading mobile phone app called iLancaster, which is used by over 98% of our students, as well as local residents and university visitors.
So, it’s our strong culture of innovation that means we are always looking for new opportunities on the horizon. And that’s why we started to look very seriously at the opportunity of using voice…
And this, for me, is the most compelling statistic. It’s just plain easier and quicker to use our voice. You’re in this place today because you don’t want to just read the words on a page, you want to listen to me (well, you sort of have to now!). And if we can get voice technology to work then it offers some really powerful ways to deliver our services.
So as part of our innovation process we started to look around for other examples. And here are a couple of my favourite ones.
Domino's Pizza had a simple but powerful idea. The device knew where you were, and they already knew your credit card information and your regular pizza. So they simply created a skill where you could say "Alexa, ask Domino's to order my regular pizza" and then... "knock, knock", there it is.
What's even more impressive is that Domino's gained an additional 4% of the worldwide pizza market share in 2017 when they released the skill.
Or how about Campbell's Soup, who had another great idea. In this skill you can give a list of ingredients that you have in your store cupboard, and it will come up with a recipe that uses them. Of course, it includes a tin of Campbell's soup! The skill also lets you follow along with the recipe hands-free, going back and forward as you need. It's suggested hundreds of thousands of recipes so far, and it's added to Campbell's bottom line.
And this picture brought it home to me. This is my four-year-old niece, who is so comfortable with using Alexa that when her mum and dad logged into their app, they found some unusual items on their shopping list.
So we had a number of compelling reasons why voice should be something we really look seriously into.
We had a number of compelling examples that we really loved, we knew that students (like my niece) were comfortable with using voice technology, and we rather suspected that we could use the cloud to build a foundation for other services.
So we were off <click> and we started by writing some prototypes. We started with a web-based chatbot that used frequently asked questions from our website to give out information. We then produced a bot using an Alexa device that could carry out simple question-and-answer responses. We demonstrated the bot at a university conference in the UK and met some representatives from AWS who encouraged us to go further. We followed the Amazon model of starting with the press release, and took a proposal to the university that we would build, in 120 days, a "Digital Friend" for students that would cover most questions that students would want to ask a chatbot.
They said yes, gave us the money for our developers, and told us to get on with it.
<click>
We started by inviting students who used our mobile phone application, iLancaster, to answer a survey with one simple question: "What would you like to ask a chatbot?" We got over 3,000 replies. So we categorised them into areas such as Academic, Campus Life and Student Societies. We then asked our project team to prioritise which we should deliver. They also asked us to include disabled students, a group which was not strongly represented in the survey but which might find voice technology especially useful.
<click>
In the first 20 days we built our infrastructure in the AWS cloud. We learned how to authenticate the skill and how to connect the AWS cloud to our local APIs.
<click>
Then we connected up our first services: friendly greetings to make the skill feel like a friend, and links to timetabled events and deadlines.
<click>
We realised that being able to easily add new question-and-answer responses would be really important to build out fast, so we added those, and the skill can now answer more than 200 different questions. We linked in more university-wide events, and started to build in answers specifically for disabled people.
<click>
We added exam data and staff data, so students could get information about their tutor or look up contact details. We linked to a large external source of knowledge so that students could research almost any information about their course.
<click>
Now we were really in the zone.
We wanted to add in opening times of shops, bars and other campus facilities. But it turned out, after a lot of looking, that there was no system to provide an accurate and up to date set of information. No wonder students wanted it!
So, we wrote a whole admin system for our estates colleagues to keep opening times up to date and to deliver us the APIs we needed. We linked in live computer lab and laundry availability, as well as integrating the skill into our existing mobile phone application.
<click>
And just before we ran into the deadline of our self-imposed launch date, we realised we had better add “cards” for Alexa devices with screens.
Sprinkle in a little magic sauce, and a bit of testing. Release to the Alexa skills store and
<click>
You’re done.
We had created "L.U".
<click>
And this is the team that built it.
We’re not just helping students with the app itself, but we’re helping and empowering students to actually build the app.
We used a total of three full-time developers; a part-time conversation specialist, to help us code for voice, which was something we had not worked on before; and around 20 part-time students who were involved in all aspects of development, design and testing.
Students are a really, really great resource for us. We have a large potential workforce to draw upon with a wide range of skills across the university.
They bring a really great perspective, as well as their user accounts for testing!
It’s like having our own focus group, on tap, all the time.
If the students think you’re building something rubbish that is not going to work, then they will tell you or refuse to build it.
Our students get great outcomes; in fact, all of our current students have achieved a first-class degree... no pressure on the rest!
So, let’s meet L.U
Main Intro Video
Open L.U
What do I have next?
What week is it?
What lectures have I got on Friday?
That’s everything thanks. Bye.
So, that's a glimpse of L.U, which we launched in early March.
L.U works in our iLancaster mobile phone application, and that was really important to us as we already had over 98% of our students regularly engaging with the app.
On the app we also have richer information presented back including web links, and we have easy to use quick selection for the most common phrases.
The app is also a place where students can set preferences, for example setting a PIN number to avoid sensitive information being read out without consent.
And it works with all the Alexa smart speaker range and gives extra information in the form of cards where the device has a screen.
So, let’s show you a little more of what L.U can do, and why it’s making a real difference for our students.
New students don’t know a lot about the campus, so they really need access to basic information, at a time when they might be stressed or not know who to ask for help.
They might not be familiar with the college system. They may not know where they are going or be familiar with their timetable.
And if they don’t know who to ask or have not yet made friends then L.U can be a great thing for them to talk to.
When they first launch the skill, it emails them our terms and conditions and more information about what they can ask. It also reads out some suggestions for what they can ask, and if they are opening the skill on their voice device then these suggestions are randomised to encourage them to discover more.
Ask L.U what college I'm in
Ask L.U to tell me my university email address
Ask L.U Where’s my next class?
Ask L.U How many credits is Emotional Intelligence worth?
Ask L.U what hours I’ve got next week?
So that’s a little of what L.U can do for new students, who are getting used to the campus, but there’s more to L.U and it’s really useful for current students as well.
Existing students tend to already know the simpler things, so having more complex questions and services built in is really useful. Exam deadlines, how many teaching hours they have, who their tutors are, and even the ability to book rooms and access services are key for existing students.
We also had to build in some safeguards to ensure that the system complied with our policies. For example, students need to set a PIN in our mobile phone application before the voice device will read out sensitive information such as their grades.
So, L.U knows about computers, washing machines and dryers, it can research almost any subject and it can book rooms and check availability.
But we also wanted to make it useful for people with a disability, perhaps people who found using a screen particularly difficult. So, as well as being able to use the features you’ve just seen hands free, we also built in specific questions that disabled students frequently ask. And we’ve had some great feedback.
Ask L.U What's my average grade?
Ask L.U How many teaching hours do I have next week?
<PERHAPS: Who is my academic advisor?>
Ask L.U How busy are the computers in the library?
Ask L.U to Research the distance between the earth and the moon?
Ask L.U to book a group pod for me on Friday?
3?
2?
I think this is a really powerful message.
When we set out to build this skill, we perhaps didn't think about the real power that students might find in having information read out.
We sometimes think that students are happy to talk to a person and find it easy to ask questions, but our feedback shows that sometimes they are embarrassed, or have social skill issues that mean they find it easier to talk to a voice device.
This is especially true when talking about their disability and support.
So that's why we built specific questions around disability, and worked with students and support staff to refine these and make them really useful.
DEMO – Benefits for disabled students
How can I get support for my disability?
How do I arrange mitigating circumstances?
What circumstances qualify for support?
How to see a college wellbeing officer?
How do I see a counsellor?
So you've seen how it works from a user's point of view.
What does the underlying technology look like?
How do we leverage AWS technologies to make this work?
Now, I'm not a programmer, but I hope I can explain how this is all woven together.
<click> Here is one of our endpoints, the iLancaster mobile phone app where users ask questions. They can do this by picking standard responses, or by voice or text search.
<click> Amazon Lex uses speech recognition and natural language understanding to work out the "intent" of what users are asking, and any data that goes with it.
<click> And here is an Alexa enabled device, like an echo dot.
<click> and here, the Alexa Skills Kit does the same job, working out the intent and any data.
So, for example, when I ask "When is the coffee shop open on Friday?", both Amazon Lex and the Alexa Skills Kit can <click> realise that my intent is to ask about opening times, and the data, which they both call slots, are "coffee shop" and "Friday".
From either side we feed these into serverless code using <click> AWS Lambda which allows us to run code without needing those pesky servers.
Then if needed we use <click> Amazon API gateway which acts as the front door to our
<click> local APIs, which power the examples you have seen.
And where things get a bit more complex, and we need to search or determine how best to ask things of our APIs, then we use <click> Amazon Elasticsearch.
The data flows back up the stack, and the result is returned to the user.
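To make this concrete, here is a minimal sketch of what the serverless end of that stack could look like. This is illustrative only, not Lancaster's actual code: the intent name `OpeningTimesIntent`, the slot names, and the `get_opening_times` helper are all assumptions, and the real lookup would go through Amazon API Gateway to a local API rather than an in-memory table. The request and response shapes, however, follow the standard Alexa Skills Kit JSON format.

```python
# Sketch of an AWS Lambda handler for the "opening times" example.
# OpeningTimesIntent, the slot names and get_opening_times are
# hypothetical; only the ASK request/response envelope is standard.

def get_opening_times(facility, day):
    # Stand-in for the call to the local opening-times API
    # (reached in production via Amazon API Gateway).
    sample_data = {("coffee shop", "friday"): "9am to 5pm"}
    return sample_data.get((facility.lower(), day.lower()), "unknown")

def lambda_handler(event, context):
    request = event["request"]
    if (request["type"] == "IntentRequest"
            and request["intent"]["name"] == "OpeningTimesIntent"):
        slots = request["intent"]["slots"]
        facility = slots["facility"]["value"]   # e.g. "coffee shop"
        day = slots["day"]["value"]             # e.g. "Friday"
        hours = get_opening_times(facility, day)
        speech = f"The {facility} is open {hours} on {day}."
    else:
        speech = "Sorry, I didn't catch that."
    # Standard Alexa response envelope, read out by the device.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```

The nice property of this split is that both Amazon Lex (from the app) and the Alexa Skills Kit (from a device) can deliver the same intent and slots, so one Lambda function serves both endpoints.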
So, let’s walk through an example of a student asking L.U via their echo dot to book a group work room in the library. So they say “Alexa, ask L.U to book me a group work room on Friday”
The Alexa Skills Kit converts the voice into an intent, which in this case is the group work room intent.
The Alexa Skills Kit also gathers all the data and puts it into the required slots. In this case we have "Friday", but when we talk to the Lambda function we also need the time and the number of people.
So the device prompts for those and fills the required slots.
Then the Lambda function talks to the Amazon API Gateway, which works out the required local API, in this case our library booking API, and ensures all the required security is met and the request is formatted correctly.
The local API returns the data via the gateway, which ensures it's returned correctly to the Lambda function, and the result is passed all the way up to be read out to the user.
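The slot-filling step in that walkthrough can be sketched like this. Again this is a hedged illustration, not the university's real skill code: `BookGroupRoomIntent` behaviour, the slot names, and the prompt wording are assumptions. The `Dialog.ElicitSlot` directive, though, is the standard Alexa Skills Kit mechanism for asking the device to prompt the user for a missing slot.

```python
# Sketch of slot filling for the room-booking example. Slot names and
# wording are illustrative; Dialog.ElicitSlot is the real ASK directive
# that tells the device to prompt the user for a missing value.

REQUIRED_SLOTS = ["day", "time", "people"]

def handle_booking(intent):
    slots = intent["slots"]
    # If any required slot has no value yet, ask the device to prompt for it.
    for name in REQUIRED_SLOTS:
        if not slots.get(name, {}).get("value"):
            return {
                "version": "1.0",
                "response": {
                    "outputSpeech": {"type": "PlainText",
                                     "text": f"What {name} would you like?"},
                    "directives": [{"type": "Dialog.ElicitSlot",
                                    "slotToElicit": name}],
                    "shouldEndSession": False,
                },
            }
    # All slots filled: in production this is where the Lambda function
    # would call the library booking API through Amazon API Gateway.
    day, time, people = (slots[n]["value"] for n in REQUIRED_SLOTS)
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText",
                             "text": (f"Booked a group work room for {people} "
                                      f"at {time} on {day}.")},
            "shouldEndSession": True,
        },
    }
```

So when the student says only "book me a group work room on Friday", the skill keeps the session open and prompts for the time and the number of people until every slot is filled.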
And to support all of this we
<click> use tools like AWS X-Ray to debug code across the entire stack, Amazon S3 storage to keep all our code safe, and Amazon CloudWatch to tune the whole thing and keep it working effectively.
Finally, a key part of having personalised information is authentication.
This is how we use AWS to make authentication work.
The user might install the skill from the Alexa Skills Store or an Alexa-enabled device, so as before the device uses the <click> Alexa Skills Kit. We use <click> Amazon Cognito as our identity source, so we look to see if a user identity exists.
If it does not, then <click> Amazon Cognito talks to our SAML provider to get user information, and a SAML token is returned with the details we need.
If, however, the user comes via the iLancaster mobile app and the AWS Lambda function, and there is no identity information, then we provide Amazon Cognito with user information from our local user database.
This means that Amazon Cognito will maintain the user information needed regardless of whether the user installed the skill from the skill store, on the device, or used the mobile app.
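The core of that reconciliation logic can be sketched very simply. In this sketch the `identity_store` dictionary stands in for the Amazon Cognito user pool (real code would use an AWS SDK client), and `local_lookup` stands in for either the SAML provider or the local user database; all the names here are assumptions for illustration.

```python
# Simplified sketch of the identity flow described above. The dict
# stands in for Amazon Cognito; local_lookup stands in for the SAML
# provider or the local user database. Names are illustrative only.

def ensure_identity(identity_store, username, local_lookup):
    """Return the stored identity, creating it from local records if absent."""
    user = identity_store.get(username)
    if user is not None:
        return user  # Cognito already holds this identity
    # No identity yet: pull details from the backing source. The
    # SAML-provider path and the local-database path both end here,
    # so the identity looks the same whichever route created it.
    details = local_lookup(username)
    identity_store[username] = details
    return details
```

The point of the pattern is the last sentence of the slide: whichever endpoint the user arrives from, the identity ends up in one place, in one shape.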
So, I hope I’ve been able to inspire you with the journey we’ve been on so far.
But we have much more we would like to do.
We would love to use machine learning to predict what a student might do, and to suggest interventions to them that might give them a better outcome.
We think that we could go much further with the concept of a Digital Friend, using Socratic questioning to help students move past barriers and engage with their wellbeing.
We want to use recorded audio of lecturers to give great revision help and support.
And we want to make sure that L.U can answer anything that a student or staff member might ask of it.
We've recently added Facebook as a new endpoint, which might appeal to the staff! And we're also planning on placing L.U into our student web portal to increase its reach further.
<DEPENDING ON TIME>
We focused our first phase of development on improving the lives of our students, but we also (almost accidentally) added benefits for staff. So a member of lecturing staff can ask about their students, look up phone numbers, or ask how many teaching hours *they* have.
But we can go much further with this. We can start to use voice to address staff digital frustrations. They could book rooms, order catering, place orders, check expenses and a whole bunch of other useful things.
</DEPENDING ON TIME>
<DEPENDING ON TIME>
We also want to investigate putting university-owned devices into rooms. For example, an Alexa in a lecture theatre to turn on the projector and display a presentation, or an Alexa in common areas to answer questions and give directions.
We may look at deploying Alexa for Business to meet some of these needs.
</DEPENDING ON TIME>
Feedback and usage so far show that engaging via Voice is a really popular way of consuming services, and it’s growing.
We have instrumentation within the skill to let us see which parts are getting the most use, and we can start to see which questions people are asking the app without getting an answer, and then target future development in those areas.
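The shape of that instrumentation can be sketched in a few lines. This is an illustrative in-memory version only; the names are assumptions, and in production the counters would feed a monitoring service such as Amazon CloudWatch rather than live in a Python object.

```python
# Hedged sketch of the skill's usage instrumentation: count intent use
# and record utterances the skill could not answer, so the most common
# gaps become candidates for future development. In-memory for
# illustration; production counters would feed Amazon CloudWatch.

from collections import Counter

class SkillMetrics:
    def __init__(self):
        self.intent_counts = Counter()
        self.unanswered = []

    def record(self, intent_name, utterance, answered):
        """Log one interaction: which intent fired, and whether it helped."""
        self.intent_counts[intent_name] += 1
        if not answered:
            self.unanswered.append(utterance)

    def top_gaps(self, n=5):
        # The most frequent unanswered utterances are the best
        # candidates for the next batch of Q&A responses.
        return Counter(self.unanswered).most_common(n)
```

This is the feedback loop that keeps a 200-question skill growing: every unanswered question is a signal about what to build next.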
So perhaps I'll leave the final word to L.U.
“Ask L.U what does your future hold”
Thank you…