Category: Events

JISC DigiFest Hackathon 2020

It’s been a year to the day since I published my last blog post reflecting on the DigiFest Hackathon of 2019. It’s been a busy year, but the opportunity to reunite the team for another Hackathon was one we couldn’t turn down!

As usual, the Hackathon was organised by JISC and themed around EdTech. JISC is a UK non-profit that provides technology advice and resources to schools and universities. Our team consisted of Nathaniel Read, Alex Lovett and Dan Tregoiing. We’ve completed a number of Hackathons together, and we’re all members (or alumni, in my case) of the University of Hull and the Hull Computer Science Society (HullCSS). The event is for students and recent graduates, and JISC were happy to invite us to compete again.

A photo of the four hackathon team members in matching hoodies, on a stage
Team HullCSS (from left to right): Alex Lovett, Harry Gwinnell, Dan Tregoiing & Nathaniel Read

Nathaniel wrote a fantastic piece about the event as a whole, and I’d recommend you read it. This post will focus more on the technology behind what we built, and how it works.

Solution

oneGrade is our automatic marking solution, aimed at Computer Science teaching. Marking code is an inherently complex problem given how many possible solutions exist. As such, we designed a solution to automate as much of the quantitative side of the marking process as possible.

The concept is simple: students submit their work (a git or svn link) via a web interface or VLE integration, the code is checked out and built into a container using Azure Kubernetes Service, and a suite of automated tests (unit, interface, integration and so on) is run against it. Azure Web Apps and Azure Container Service power this workflow, and results are delivered via the oneGrade web interface.
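For the curious, here’s a minimal sketch of what the submission endpoint might look like. It isn’t the actual oneGrade code; the SubmissionController, SubmissionRequest and IBuildQueue names are illustrative.

```csharp
// A minimal sketch of a submission endpoint; names here are illustrative,
// not taken from the real oneGrade codebase.
using System;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public record SubmissionRequest(string AssignmentId, string RepositoryUrl);

// Hypothetical abstraction that hands the submission off to the build workers.
public interface IBuildQueue
{
    Task<string> EnqueueAsync(string assignmentId, string repositoryUrl);
}

[ApiController]
[Route("api/submissions")]
public class SubmissionController : ControllerBase
{
    private readonly IBuildQueue _queue;

    public SubmissionController(IBuildQueue queue) => _queue = queue;

    [HttpPost]
    public async Task<IActionResult> Submit(SubmissionRequest request)
    {
        // Validate the source URI (git or svn) before queueing a build.
        if (!Uri.IsWellFormedUriString(request.RepositoryUrl, UriKind.Absolute))
            return BadRequest("A valid git or svn URL is required.");

        var submissionId = await _queue.EnqueueAsync(request.AssignmentId, request.RepositoryUrl);
        return Accepted(new { submissionId });
    }
}
```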

To set up an assignment, a lecturer provides a Dockerfile that defines the assignment environment (e.g. a .NET Core image) and any number of test scripts for whichever supported unit, integration or other testing frameworks they want to use. This keeps the platform flexible, supporting any mix of languages and practically unlimited testing. Everything is stored in the database against the assignment ID and pulled out for use at container build time.
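As a rough illustration (not the actual oneGrade schema), the record stored against each assignment ID could look something like this:

```csharp
// Illustrative assignment definition; property names are assumptions,
// not the real oneGrade database schema.
using System;
using System.Collections.Generic;

public class AssignmentDefinition
{
    public Guid AssignmentId { get; set; }
    public string Title { get; set; } = "";

    // The lecturer-supplied Dockerfile defining the build environment
    // (e.g. one based on a .NET Core SDK image).
    public string Dockerfile { get; set; } = "";

    // Test scripts keyed by the framework they target (unit, integration,
    // etc.); pulled back out at container build time.
    public Dictionary<string, string> TestScripts { get; set; } = new();
}
```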

The student interface is also simple, providing only a list of assignments. Each assignment shows either a submission box for a source URI or a link to the report containing the student’s marks.

A flowchart of the solution architecture

The solution consists of two components: a React frontend providing our Instructor and Student UIs, and an ASP.NET Core backend providing our container management services and testing environment. This hackathon was my first encounter with React, and I’m happy to say it was an experience I rather enjoyed. React proved a flexible and straightforward framework, and I was able to get going quickly, even though most of my web experience is with the ASP.NET MVC Framework and traditional jQuery.

The backend is responsible for taking a submission, building it into a container, and running the suite of defined tests. It does this using Azure Kubernetes Service (AKS). oneGrade accepts a submission, generates a worker process, and dispatches a request to AKS. Once AKS reports that the image is built and the container is running, the worker runs the defined tests and stores the results in the database. This decouples testing from submission, allowing for highly parallel operation.
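A simplified sketch of that worker loop is below. The ISubmissionQueue, IClusterClient, ITestRunner and IResultStore interfaces are hypothetical stand-ins for the queue, AKS, the testing frameworks and the database, rather than the real service boundaries.

```csharp
// Simplified worker loop; the interfaces are illustrative stand-ins.
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

public record Submission(Guid Id, string RepositoryUrl);
public record ContainerHandle(string Name);
public record TestReport(bool AllPassed, string Details);

public interface ISubmissionQueue { Task<Submission> DequeueAsync(CancellationToken ct); }
public interface IClusterClient   { Task<ContainerHandle> BuildAndRunAsync(Submission s, CancellationToken ct); }
public interface ITestRunner      { Task<TestReport> RunAsync(ContainerHandle c, CancellationToken ct); }
public interface IResultStore     { Task SaveAsync(Guid submissionId, TestReport report, CancellationToken ct); }

public class SubmissionWorker : BackgroundService
{
    private readonly ISubmissionQueue _queue;
    private readonly IClusterClient _cluster;
    private readonly ITestRunner _testRunner;
    private readonly IResultStore _results;

    public SubmissionWorker(ISubmissionQueue queue, IClusterClient cluster,
                            ITestRunner testRunner, IResultStore results)
    {
        _queue = queue;
        _cluster = cluster;
        _testRunner = testRunner;
        _results = results;
    }

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (!stoppingToken.IsCancellationRequested)
        {
            var submission = await _queue.DequeueAsync(stoppingToken);

            // Ask AKS to build the submission into a container and wait
            // until it reports the container as running.
            var container = await _cluster.BuildAndRunAsync(submission, stoppingToken);

            // Run the lecturer-defined tests against the container and
            // persist the report for the web interface to display.
            var report = await _testRunner.RunAsync(container, stoppingToken);
            await _results.SaveAsync(submission.Id, report, stoppingToken);
        }
    }
}
```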

Conclusion

Overall, the solution was fairly robust and solved the challenge we set for ourselves. There were a number of features we’d hoped to implement but simply ran out of time for, including a wider array of testing support, and using Azure Blockchain Service to store submissions for plagiarism detection. I’m pleased with how far we took the project during the Hackathon, and we may see some future iteration in our free time.

Our team was awarded the “Most Market Ready Solution” prize by JISC, for which we are immensely grateful. The competition was very strong, with many incredible solutions being proposed, and the whole team and I would like to thank JISC for the opportunity to attend!

JISC DigiFest Hackathon 2019

The wonderful people over at JISC invited us along to their two-day Hackathon, running alongside their DigiFest event. We pulled together a team of four (Nathaniel, Dan, Alex and myself) and began coming up with ideas. JISC identified a number of key themes they wanted us to build solutions for, including:

  • Student Wellbeing
  • Shaping the Curriculum
  • AI or Intelligent Agents to support learning
  • Intelligent Campus

Our idea was fairly straightforward. Our University (the University of Hull) has a whole host of services regularly accessed by students, often requiring student cards and logins to a variety of fragmented systems. Could we bundle all of these into a single source (a mobile app) and replace the 16,000+ physical plastic student cards with a digital alternative? We put this under a combination of Intelligent Campus and Student Wellbeing, aiming to provide a platform with NFC-powered mobile student cards and a range of student services including help, security, timetables, travel and more.

We soon heard back that we’d been accepted, and a couple of months later we headed over to Birmingham for the event. JISC provided travel, hotel, food and everything we needed, and we met on the first evening for dinner in a local restaurant. Much of the first night was spent throwing ideas around about how the app might take shape and the kinds of services we wanted to offer while keeping the interface uncluttered. The University currently uses an app called iHull, but it suffers from several usability issues and a lot of rarely used tiles on its home screen. We were looking for a fresh approach, something that aligned more strongly with the University’s brand identity.

As we made our way into the first official day of the event, we divided our four-person team into tasks. Nathaniel worked on the UI design (and some of the Xcode!), Alex and Dan worked on backend services, and I was tasked with building the app in Xcode. Soon after, designs started flowing in from Nathaniel, and we were quickly constructing UI and linking together pages and navigation menus. Day 1 went by fairly successfully, with a huge focus on getting the app looking presentable; we knew there wasn’t much time on Day 2, so we needed it to look good. We ended the day with a pretty satisfying-looking app and much of our backend for login, weather and calendar information working, though not yet linked to the app.

At this point JISC invited us to their drinks reception, and we spent a couple of hours chatting and drinking with JISC employees and University delegates alike. Many interesting conversations were had, from AI Rabbit Robots to JISC Apprenticeships and more. Dinner followed in the most delicious pizza place, with our delegation filling almost the entire venue!

We finished off day 1 with a late-night programming session, tidying up our messy UI, fixing scaling bugs and deploying to TestFlight. We managed to clean up most of the UI, ending the evening with a reasonably functional app.

Day 2 was split into two parts: before lunch, and after. We spent much of the morning pulling the backend into the app we’d made, linking our login pages, UI, calendars and more to the real data generated in the web platform. We then spent the afternoon building and tweaking our presentation, which consisted mostly of screenshots of the app; the strong brand identity made for an overall very professional-looking presentation.

It was then time to present what we had made to a team of JISC Judges, the other teams, and a bunch of Delegates from the conference that were invited to attend. JISC’s own team of interns went first, followed by us, and the other teams. You can watch the presentation we gave here.

As the presentations wrapped up, the winners were announced, and we were delighted to be named the winners of this year’s event. All of the teams produced fantastic solutions, from interactive chatbots for campus services to a live French lesson using speech recognition. We’re incredibly proud to have come out on top.

We’d like to thank JISC for inviting us to the event, and to everyone who was involved for making it a great competition.

We’re continuing to build out the app as a fun side project, and you can find more info on it here or contact me if you want to know more.

Microsoft Future Decoded 2018

The Event

Last week featured the 2018 installment of Microsoft’s annual Future Decoded event, hosted at the ExCeL in London (whether they chose the venue for the Office pun is as yet unconfirmed). To quote Microsoft themselves:

Microsoft Future Decoded provided two days of top-level keynotes, breakout sessions, networking and an action-packed expo, giving the information and practical advice needed to help grow your business in the changing world of Digital Transformation and AI.

That means both technical and business people head down to catch up on the latest from Microsoft and their partners. We all speculate on what the future might hold while showing off what we’ve got to offer today.

For those who don’t know me so well, I work for a Microsoft Mixed Reality Partner called VISR doing all kinds of technical wizardry. As you can expect, we showed off lots of shiny HoloLens tech we’ve been working on.

A picture of VISR's stand for Future Decoded

VISR’s Stand the day before the show!

FUSE

One of the demos we were running provides a guided maintenance platform in MR, giving frontline workers access to deeper information without tying up their hands. You can check out a video on YouTube or find more info on the product at VISR’s website, but essentially it adds information about the device being repaired, such as the status of internal components, and twins it with info on the last repair, notes left by previous maintainers, and so on. It also provides step-by-step instructions to ensure you complete the repair safely if you need them, overlaying holographic animations and instructions and verifying that each step has been completed successfully. If you ever put the device into an unsafe state, warnings are shown to guide you back.

My role in this application in particular is the IoT tech inside the appliance itself. In this demo, we’re using a UPS mains panel left over from our datacentre. The maintenance procedure guides you through replacing a fuse in the appliance, and each of the switches, the fuse itself, the door and the lock are all rigged with sensors. An embedded device inside the appliance keeps track of these sensors at all times, reporting status back to our PaaS solution for others to keep an eye on. In our demo, an app inside Microsoft Teams monitors the appliance, and if things go wrong, it prompts the users to create a maintenance job for the device. An engineer can then head onsite with a HoloLens and perform the maintenance, and the appliance will mark itself as repaired, with notes saved for next time.

FUSE Tech

The tech inside this example all runs on .NET Core on an embedded Linux device. It’s surprisingly performant, and we’ve had no issues with it so far. It gets a little shaky with the WiFi connection inside a conference venue (especially with its antenna locked inside a steel box!), but most people made it through the demo and could appreciate the value of the solution.
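To give a flavour of the approach (this isn’t the actual FUSE code, and the endpoint URL and payload shape are made up), the reporting loop on the embedded device boils down to something like this:

```csharp
// Minimal sketch of a sensor-reporting loop; endpoint and payload are made up.
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

public record ApplianceStatus(bool DoorClosed, bool FuseSeated, bool[] Switches);

public class StatusReporter
{
    private static readonly HttpClient Http = new();

    public static async Task Main()
    {
        while (true)
        {
            var status = ReadSensors();

            // Push the current appliance state to the cloud service so the
            // Teams app (or any other monitor) can react to faults.
            await Http.PostAsJsonAsync("https://example.invalid/api/appliance/status", status);

            await Task.Delay(TimeSpan.FromSeconds(5));
        }
    }

    // Placeholder for reading the switch, fuse, door and lock sensors
    // attached to the embedded device.
    private static ApplianceStatus ReadSensors() =>
        new(DoorClosed: true, FuseSeated: true, Switches: new[] { true, false });
}
```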

It aims to cover the “Head up, hands free” approach that Microsoft is targeting in this space, replacing the need for manuals or tablets for repair guides, and freeing up the hands of our engineers and front line workers. This app is packaged ready to go, with just a couple of config changes to meet client needs, so we’re pretty happy with how this rolled out.

An image of the FUSE UI

A slightly older build of FUSE

Talks

As always there was a wealth of technical talks and knowledge available, and I managed to miss all of them!

Instead, I was lucky enough to catch a keynote given by Dr Maggie Aderin-Pocock MBE and Sir Michael Caine CBE.

Dr Aderin-Pocock’s talk was an inspiring tale of her journey through life, working against her dyslexia and a raft of naysayers to become a renowned scientist. She has worked on the James Webb Space Telescope, appeared on TV as an astronomy expert, and given a number of talks and presentations, among many other achievements. Hearing her speak was a truly inspiring experience, and her story is one I won’t forget in a hurry.

Sir Michael Caine also offered an insightful journey through his career, highlighting decisions and events that have shaped his life. He has many interesting stories to tell, and a host of life advice that even the more experienced among us would do well to listen to.

Finally, the second day hosted a talk by Satya Nadella himself, covering the future of technology both from Microsoft and the industry alike. He highlighted the role of Cloud in the future of technology, and the role AI is playing in day to day life as more companies begin to integrate it into their businesses. It was an interesting view on the technical landscape, and we’ll see how his predictions play out.

Networking

I love attending these events, as it gives me a chance to meet some incredibly interesting people. Last year I managed to have a long discussion with the guy who created Paint 3D on the Windows team, before moving on to an interesting chap who put Cortana inside a robot dog!

This year was spent chatting to some awesome people, including several architects from BJSS, the ever amazing guys at Transparity, and more Microsoft employees than I know what to do with.

Huge thanks go to the Microsoft team who spent months preparing for this event, and to VISR for bringing me along for the third consecutive year. I can only hope the trend continues!

Humber Care Tech Challenge

On the 6th and 7th of September, fellow HullCSS member Dan and I travelled to Bridlington for the first Humber Care Tech Challenge. The event was hosted and sponsored by East Riding of Yorkshire Council, the University of Hull, Amazon, C4DI and The One Point, and challenged us to use technology to create solutions that help the elderly, infirm and others in care with their daily lives.

As we entered as a team of two, the organisers contacted us several days beforehand to pair us with another group of two from the City Health Care Partnership. We’d never met before, but we welcomed the pair onto the team. As it turned out, they weren’t developers, but their experience in the healthcare industry made them a great source of inspiration and ideas.

The idea we pitched on the first day was essentially Amazon Alexa for care homes, providing residents with information about their meals, activities and visitors, and access to their nurses. It twinned this with a range of smart home and smart health tech, allowing staff and relatives to monitor patients’ weight, blood pressure and so on, and to check that doors and windows have been closed. It also gives bed-bound patients more control over their environment, letting them open curtains, turn off lights and more. The app also ran intent analysis across all requests to pick up key words that might indicate signs of depression, dementia, etc., which can be used to flag a patient for a review with the mental health teams.

The judges were impressed with the idea, but we didn’t win the first day’s prizes, instead picking up the People’s Choice Award for our concept. We’d developed a strong presentation showing the idea, along with an Alexa Flash Briefing containing some of the information that would be available.

The second day allowed us to commence with actual development, and we chose to build an ASP.NET Core MVC application running on AWS. It provided a web interface for care home staff and relatives to monitor patients and update information about activities and meals, and it also exposed an API for Alexa to retrieve information and pass data back.
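As an illustration of the kind of endpoint Alexa could call (the route, models and data layer here are hypothetical, not the code we wrote on the day):

```csharp
// Illustrative API endpoint for the Alexa skill; names are assumptions.
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public record DailyInfo(string ResidentId, string[] Meals, string[] Activities, string[] Visitors);

// Hypothetical data access layer over the care home records.
public interface IResidentData
{
    Task<DailyInfo?> GetTodayAsync(string residentId);
}

[ApiController]
[Route("api/residents/{residentId}/today")]
public class ResidentInfoController : ControllerBase
{
    private readonly IResidentData _data;

    public ResidentInfoController(IResidentData data) => _data = data;

    // Returns today's meals, activities and visitors for a resident,
    // so the Alexa skill can read them back on request.
    [HttpGet]
    public async Task<ActionResult<DailyInfo>> GetToday(string residentId)
    {
        var info = await _data.GetTodayAsync(residentId);
        if (info is null)
            return NotFound();
        return Ok(info);
    }
}
```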

The finished product came out looking really smooth and polished, despite us only taking the one day to work on it (we have the commit graph to prove it!). We managed to win both the People’s Choice Award and the Best Solution prize, taking home the first ever Care Tech Trophy, and an Echo Dot each to boot!

We’re not certain what the future holds for the solution we developed, but Dan and I are continuing to work on it, so who knows where it might go!

Overall the challenge was great fun, and I’d recommend you check out the next one in 2019! I went in with pretty much no knowledge of the inner workings of the healthcare sector, and all 13 teams produced fantastic solutions, several of which are progressing to full roll-out!

Cyber Security Lecture – HullCSS 2018

This week, I presented a lecture on cyber security for HullCSS as part of the HullCSS lecture series, covering personal security, securing the applications you build, and organisational security.


I covered three main topics: personal security, developing secure solutions, and organisational security. The presentation stays fairly high-level; it covers common vulnerabilities and issues, but doesn’t go into much depth on how to protect against and mitigate them.

Personal security focuses on using password managers, two-factor authentication and other practices to develop good habits and keep your online activity secure.

Secure solutions covers the top vulnerabilities in applications and how to mitigate them. It also covers how to spot vulnerabilities and encourages developers to build good habits.

Organisational security covers the ways staff may inadvertently expose information, and suggests some ideas to reduce the chance of this happening.


If you want to check out the slides from the event, you can find them below. If you’d like a full copy of the presentation (Available as a Keynote or PowerPoint) including notes, let me know. If you’d like to use this in your own presentation, please ask first.

Presentation Link

Google HashCode @ Hull 2018

Yesterday (1st March 2018), a number of HullCSS students got together to compete in Google’s HashCode competition. Everyone managed to submit a solution and gain points, and great fun was had by all.

The Event

HullCSS (The Hull Computer Science Society) ran a HashCode Hub in the Fenner SuperLab, where teams from all over the University came together to compete. It was the first time HashCode was run at the University of Hull, and all students were welcome.

All teams managed to submit a working solution, with the lowest point total being 10 Million, and the highest being 45 Million. All were impressive solutions, with each taking a different approach to solving the problem. Everyone relished the challenge, and it was interesting to see all skill levels working together.

All teams were eagerly watching the scoreboard throughout the event. Everyone wanted to make it above the top 1000 teams (top 25%).

My team consisted of Josh Taylor, Alexander Rossa, and myself, and was named Gravity Gun. We kicked off at 5.45pm with the release of the problem (which you can find here), and got to work planning out our algorithm.

The Solution

We decided to take a Supervisor/Worker approach, with our main program acting as a coordinator of rides, and the cars doing much of the route finding/calculation themselves.

A basic set of properties was given to each car so it knows about itself, such as its position, the rides it has completed, and the current global step it is on. This allows the supervisor to ask a car how long it will take to complete a given ride; the car can calculate at which step it would pick up the customer and at which step it would arrive at the destination.

The supervisor knows more generalised information about the problem, such as the size of the grid, the list of cars and rides, and the number of steps in which the rides have to be completed. It prioritises the list of rides, then asks each car how long it will take to complete each ride. In this version the prioritisation is very basic: rides are simply ordered by their end step, ascending. The supervisor then gathers the results and immediately discards any car that would finish after the ride’s maximum finish step. If a car can’t complete the ride in time, we don’t even want to attempt it, as that time could be spent earning points elsewhere.

Next, it checks whether any of the cars can achieve the bonus points. If some can, it restricts the choice to those cars; otherwise it considers every eligible car. It then picks the car that reported the fewest steps to complete the job. This produces a fairly good allocation of rides (though clearly not the best!).
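To make that concrete, here’s a simplified C# sketch of the supervisor loop described above; the types and method names are illustrative rather than lifted from our actual submission.

```csharp
// Simplified supervisor/worker sketch; not the code we submitted on the night.
using System;
using System.Collections.Generic;
using System.Linq;

public record Ride(int Id, (int X, int Y) Start, (int X, int Y) End, int EarliestStart, int LatestFinish);

public class Car
{
    public (int X, int Y) Position { get; private set; } = (0, 0);
    public int CurrentStep { get; private set; }
    public List<int> CompletedRides { get; } = new();

    private static int Distance((int X, int Y) a, (int X, int Y) b) =>
        Math.Abs(a.X - b.X) + Math.Abs(a.Y - b.Y);

    // Step at which this car would pick the customer up.
    public int PickupStep(Ride ride) =>
        Math.Max(CurrentStep + Distance(Position, ride.Start), ride.EarliestStart);

    // Step at which this car would drop the customer off.
    public int FinishStep(Ride ride) => PickupStep(ride) + Distance(ride.Start, ride.End);

    public void Assign(Ride ride)
    {
        CurrentStep = FinishStep(ride);
        Position = ride.End;
        CompletedRides.Add(ride.Id);
    }
}

public static class Supervisor
{
    public static void AssignRides(List<Car> cars, List<Ride> rides)
    {
        // Very basic prioritisation: order rides by their latest finish step.
        foreach (var ride in rides.OrderBy(r => r.LatestFinish))
        {
            // Discard any car that could not finish the ride in time.
            var candidates = cars.Where(c => c.FinishStep(ride) <= ride.LatestFinish).ToList();
            if (candidates.Count == 0) continue;

            // Prefer cars that would start the ride exactly on its earliest
            // step (earning the bonus); otherwise consider every eligible car.
            var bonusCapable = candidates.Where(c => c.PickupStep(ride) == ride.EarliestStart).ToList();
            var pool = bonusCapable.Count > 0 ? bonusCapable : candidates;

            // Pick the car that reported the fewest steps to complete the job.
            pool.OrderBy(c => c.FinishStep(ride)).First().Assign(ride);
        }
    }
}
```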

Conclusions

This model is fairly good, achieving 45 million points, though not without its faults. The top team managed to achieve over 49 million points. Given more time (a fair amount was spent finding and fixing silly logic errors), we’d look to improve the prioritisation algorithm, balancing the choice between taking one long journey and doing multiple short ones.

The competition was great fun, and I’ll certainly look to do it again next year. I’d encourage any student who wants experience with a real-world, complex programming problem to compete! It offers something drastically different from a University module, and it can be really rewarding to build a working solution. As a student, I found the challenge really engaging, but not so difficult as to be unsolvable.

I’ll definitely be encouraging HullCSS to run the event again next year. If you’re in Hull and want to get involved, let me know. If there is enough interest, it may be that the Hub is opened up to the whole city!

Links

Want to know more about HashCode or get involved? You can find some info here:

https://hashcode.withgoogle.com/


Want to see my solution? Check it out on GitHub! (We were really tired and rushing when we wrote this, so apologies in advance!)

https://github.com/HarryGwinnell/HashCode2018


Want to have a go yourself? Find the PDF problem here! And the inputs are here.

