Redesigning A Singapore University Website — A UX Case Study
A revision of SUSS’ Information Architecture and Navigation
This article is the second entry in a series of retrospectives written to aid my growth as an aspiring UX designer enrolled in General Assembly’s User Experience Design Immersive. I look forward to hearing your thoughts about any college’s digital presence or advice on UX Design in general, so please leave a comment in the section below and we’ll connect!
Industry Type: Tertiary Education
Team Member(s): Jen and I
Project Duration: 2 weeks
Skills Exercised: Heuristic Evaluation, Content Audit, Competitive Analysis, Contextual Inquiry, User Interviews, Information Architecture, User Flow Mapping, Wireframes, Prototyping, Usability Testing
What this article covers:
- Discovery + Target User Persona
- Content Research and Evaluation + User Research
- Existing User Flow Maps
- Problem & Solution Statements
- Redesigning the IA
- New User Flow Maps
- Wireframing & Prototyping
- Usability Testing & Iterations
Discovery
The brief for our second project in the General Assembly UXDI course was to improve the Information Architecture (IA) and navigation of a Singapore University website, so as to make its content more comprehensible and easier to find.
Jen and I were paired up for this project, and we were assigned the Singapore University of Social Sciences (SUSS).
Previously known as SIM University (UniSIM), the school rebranded itself on 17 March 2017 and became the SUSS that people know today. While it has been restructured into Singapore’s sixth autonomous university, SUSS’ commitment to providing lifelong education and equipping its students to better serve society remains.
We did further research on the school, and its various schemes and alumni subsidies made it evident that SUSS’ outreach to adult learners remains as strong as ever.
Target User Persona
It became clear to us that SUSS’ target audience is working adults balancing family, career, and continual education to advance their professional development. This informed our decision to pick John (out of the three given personas) as our primary User Persona, with Mark and Jessica becoming the secondary ones:
Content Research and Evaluation
With our target User Persona in mind and knowing the size of the research challenge ahead of us, we wasted no time in running a cognitive walkthrough, heuristic evaluation, and content inventory of the existing website, together with a competitive analysis of 2 other Singapore universities.
We teamed up temporarily with the other team (shoutout to Aaron and Shiyang!) working on the SUSS website to speed up the process, and these are our findings:
Cognitive Walkthrough
1. The existing website has a very functional design.
2. For a relatively new, rebranded university, SUSS’ website looks dated and uninspiring for prospective students.
3. There was a common issue of presenting overwhelming information to users across many pages.
4. The website used several uncommon acronyms and terms which made the process of obtaining information confusing.
Heuristic Evaluation
With reference to Jakob Nielsen’s 10 Usability Heuristics, we evaluated the functionality, design, and navigation of the existing website based on the following metrics:
- Consistency and Standards
Users are directed to many different microsites with different navigation bar designs when trying to obtain information. Users become disorientated as this inconsistency makes the process confusing.
The website makes use of inconsistent and unclear button UI, causing users to be unsure of the interactions they are meant to have with the system.
- User Control and Freedom
When users attempt to use the search bar, unnecessary advertisements are automatically pushed to them, hampering the process of looking for relevant information.
- Recognition rather than Recall
The minimal similarity between the different sign-up forms and log-in pages on the website forces users to spend additional effort recalling and verifying the different steps taken while obtaining information.
Content Inventory
We then performed a content inventory exercise on the existing website in order to generate a sitemap:
At first glance, one will notice how extensive the IA of the existing website is. Through the exercise, we learned that users have to go through unnecessarily deep levels of navigation to obtain the information they need. The information is also presented in an overwhelming manner on several occasions, while clickable links that are meant to guide users on to their next steps do not work. Overall, the experience is a disorientating and unmemorable one.
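As a practical aside, once a content inventory is captured in a structured form, even a small script can make that navigation depth concrete. Below is a minimal sketch, using made-up page names rather than the actual SUSS sitemap, of representing the inventory as a nested tree and measuring how many levels deep content sits:

```python
# Minimal sketch: a content inventory captured as a nested dict, plus a
# helper to measure navigation depth. Page names are illustrative
# placeholders, not the actual SUSS sitemap.

sitemap = {
    "Home": {
        "Programmes": {
            "Undergraduate": {
                "Course Listing": {
                    "Course Page": {
                        "Module PDF": {},   # detail buried several levels down
                    },
                },
            },
        },
        "Admissions": {"How to Apply": {}},
    },
}

def max_depth(node: dict, level: int = 0) -> int:
    """Return the deepest navigation level in the sitemap tree."""
    if not node:
        return level
    return max(max_depth(child, level + 1) for child in node.values())

print(max_depth(sitemap))  # -> 6 levels deep for module-level details
```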
Competitive Analysis
Jen and I compared the process of learning more about a programme syllabus on the SUSS website against that of 2 other local universities (Singapore Institute of Technology and Singapore Management University). Our findings further supported the argument that users have to go through unnecessary steps to obtain information on the SUSS website.
User Research
In order to validate our earlier assumptions about the existing website and assess its usability with actual test users, Jen and I also ran contextual inquiries and user interviews with people who represented John’s User Persona.
“I like the consistent use of brand colours, but I had difficulty finding the course information needed for (Master of Applied Research in Social Sciences) because the navigation bar had too many options and the website uses naming conventions I’m not familiar with. It became overwhelming and troublesome to get to where I need.”
In our user interviews, a common theme in the feedback was that obtaining information was an unpleasant experience: the content was overwhelming, which made it difficult to get to the relevant webpages. This confirmed the need to redesign the IA and navigation of the existing SUSS website.
Existing User Flow Maps
Jen and I then mapped out the user flows for all 3 of our Personas.
We isolated John’s user flow map as he was the primary user we wanted to focus on. This helped us to visualise the pain points that John experiences while trying to accomplish his goals:
1. The need to download separate PDFs for each module of the same course forces John to go through unnecessary steps just to find out the full details of a course; the website offers him no concise way to do so.
2. While attempting to apply for a course, John also encounters broken links in the admission process.
At the same time, we could see from Mark and Jessica’s combined user flow map that 3 out of 4 tasks lead users down the path of microsites, which makes the different processes tedious and arguably confusing.
Problem & Solution Statements
We consolidated all our research findings and insights before crafting the following problem statement:
Working adults who wish to continue their education by taking up a course in SUSS encounter unnecessary steps on the school’s website to find the information they need for the course. Even then, the details are often presented in an overwhelming manner. This makes for a disorientating and tedious process of reaching the relevant web pages.
We believe we can solve this problem by:
Redesigning the IA and navigation of the SUSS website in order to empower prospective students (from undergraduate to continuing education level) to feel confident in their decision-making process every step of the way. This involves the following 3 approaches:
- Congregate: to consolidate the relevant and important information in one place for easy access.
- Improve Usability: to streamline the content and allow for easier navigation of the school website.
- Increase Efficiency: to design a more efficient course admission process in order to account for the stakeholders’ interests.
Redesigning the IA
To redesign the website’s IA, we planned to use a combination of card sorting and tree testing methods with test users in order to uncover their mental models behind the organisation of content and to validate the design decisions we made.
Content Audit
With clear design goals in mind, we proceeded with a combined content audit exercise to reduce over 276 unique cards (obtained from the previous content inventory) to 64.
Card Sorting Exercises
Jen and I then ran a 1st card sorting exercise internally to reduce that number further to 30, by grouping similar terms and removing those we felt belonged below the secondary navigation level.
With 30 unique cards and 5 categories, we ran a 2nd card sorting exercise (hybrid) with 23 test users.
The results were not ideal, with low agreement scores across several categories. Follow-up questions with some of the test users revealed ambiguity in the card names, which led to divergent groupings.
We clarified the terms used and expanded on the number of categories before running a 3rd card sorting exercise (closed) with 11 test users.
We received much better agreement scores this time, and we used the findings to refine the card names further. For example, “Academic Committees” was renamed to “Boards & Committee” as it was a better fit for the type of content shown on that webpage.
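For anyone curious how agreement scores work under the hood: the score for a card is simply the share of participants who placed it in its most popular category. A rough sketch with invented placements (not our actual data):

```python
# Rough sketch of per-card agreement scores for a closed card sort:
# the share of participants who placed each card in its most popular
# category. The cards and placements below are invented examples.
from collections import Counter

placements = {
    "Alumni Subsidies":   ["Fees & Funding", "Fees & Funding", "Alumni", "Fees & Funding"],
    "Boards & Committee": ["About SUSS", "About SUSS", "About SUSS", "About SUSS"],
}

for card, chosen in placements.items():
    top_category, votes = Counter(chosen).most_common(1)[0]
    print(f"{card}: {votes / len(chosen):.0%} agreement on '{top_category}'")
```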
Revised Sitemap
With those findings, we drafted a new Sitemap and reviewed it against our 3 Personas’ previous User Flows.
Tree Testing Exercises
To validate our design decisions, especially those affecting John’s User Flows, we crafted 5 tasks aligned with his goals and pain points before running our 1st tree testing exercise.
We learned that users struggled with a few issues in our revised Sitemap, namely the location of certain content (e.g. Networking Opportunities) and the definition of some terms used (e.g. Subsidies & Incentives). We made further clarifications and ran a 2nd tree testing exercise to validate our decisions on the cards’ location and naming.
We obtained more positive results this time round. For instance, most users correctly found the information on networking opportunities once we renamed the card “Networking Events” and placed it under “News & Events”.
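As with the card sorts, the tree test results boil down to simple per-task success rates. Here is a small sketch with hypothetical outcomes (one boolean per participant, True meaning they ended at the correct node):

```python
# Sketch of scoring tree test tasks as per-task success rates.
# Task names and outcomes are hypothetical, not our actual results.

results = {
    "Find networking events":      [True, True, True, False, True],
    "Find subsidies & incentives": [True, True, False, True, True],
}

for task, outcomes in results.items():
    print(f"{task}: {sum(outcomes) / len(outcomes):.0%} success")
```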
All the insights gathered from multiple rounds of card sorting and tree testing exercises finally enabled us to create the final Sitemap shown above.
New User Flow Maps
Jen and I then drew new User Flow Maps for all 3 Personas, with special attention given to John, our primary user.
We streamlined the process of obtaining course syllabus information for Mark and Jessica and removed the hassle of going through multiple microsites.
More importantly, where John previously had to go through 4 navigation levels only to download programme details separately for one single course, he could now get all the relevant details for that same course within a single page in just 3 levels.
We also addressed the problem of broken links which occurred during the previous admission process.
Wireframing & Prototyping
We translated all these design decisions into wireframes in order to quickly visualise the content hierarchy and flow between the different pages that John would encounter while fulfilling his user needs.
The wireframes became the foundation we built upon when we moved them to Axure to create our first clickable prototype. As it was our first time using the tool, it took a while to learn Axure’s many features, and we initially struggled with adding interactions to our prototype. However, through many rounds of testing and advice from others, we finally produced a version 0.1 that we felt confident enough to conduct Usability Testing with.
Usability Testing
We tested the usability of our prototype with 5 users who represented John’s Persona. After they were given a short contextual background and brief for each of the 4 tasks, our users took on the role of John and interacted with the prototype to locate the relevant information needed for John’s user flows.
After each test, we asked our users to provide feedback through both our follow-up questions and a survey form. The survey results showed an average task success rate of 80%, which gave us a good benchmark to build upon.
We then plotted the feedback from the 5 Usability Tests on a 2x2 matrix evaluating impact against time-cost. As shown by the items highlighted in red, we invested our time in implementing only the changes with a high impact on achieving our design goals.
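For illustration, here is a rough sketch of that triage logic in code. The feedback items and scores are invented; in practice we judged impact and time-cost qualitatively on the matrix itself:

```python
# Sketch of triaging usability feedback on an impact vs. time-cost matrix.
# Items and scores are invented for illustration.

feedback = [
    # (item, impact 1-5, time_cost 1-5)
    ("Add secondary path to Alumni Subsidies", 5, 2),
    ("Add hover feedback on course listing",   4, 2),
    ("Restyle footer links",                   2, 2),
    ("Rebuild search with live suggestions",   4, 5),
]

def quadrant(impact: int, time_cost: int) -> str:
    if impact >= 4 and time_cost <= 3:
        return "do now"          # high impact, low time-cost: implement first
    if impact >= 4:
        return "plan for later"  # high impact, but costly
    if time_cost <= 3:
        return "nice to have"    # low impact, low time-cost
    return "skip"                # low impact, high time-cost

for item, impact, cost in feedback:
    print(f"{quadrant(impact, cost):>14}: {item}")
```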
Iteration
With the class presentation quickly approaching, we had to make smart use of our time in incorporating the most important elements into version 0.2 of our prototype:
A. We gave John a secondary means of obtaining information regarding “Alumni Subsidies”. Upon clicking this new link, John is given a synopsis of the types of subsidies that apply to alumni like him. From there, he is directed to the specific subsidy relevant to the respective course.
B. During our Usability Tests, we observed that many users stopped at the course listing page while trying to find the course syllabus information for [Graduate Certificate in Human Capital Management]. This was because they received no feedback from the website when interacting with its different elements, which led them to believe they were on the wrong path.
We took that into consideration and added more interactivity to the course listing page for prototype version 0.2. This includes changes made to the filter bar feature and pre-selection feedback when hovering the cursor over each course.
Way Forward
The next step for Jen and me is to conduct a second round of Usability Testing using our updated prototype. Ideally, this should be done with 7 to 10 test users who again represent John’s Persona.
We will again plot the feedback on an impact vs. time-cost 2x2 matrix to decide which changes will contribute most to solving our design problem. These changes will feed into the next version of our prototype, and yet another round of Usability Testing awaits!
Lessons Learned
Test early and test often
Testing our prototypes early was the best way to validate our design decisions. The feedback we received from the Usability Tests also helped inform our next steps, especially in deciding which changes were most important to implement when time was a constraint. With that in mind, the earlier and more often we test, the better.
In hindsight, this principle should also have applied to our wireframing phase. Conducting Usability Testing with lo-fi wireframes helps ensure that we do not become too attached to our designs when we put them up for critique by our test users. At the same time, it helps test users focus on the navigation flows and organisation of content, rather than on visuals (images and colours) that may become a distraction.
Conclusion
The 2 weeks spent improving the IA and navigation of the existing SUSS website were tough but enriching. Picking up Axure prototyping skills aside, this experience made us more confident in approaching test users during the user research and usability testing phases. More importantly, we had the opportunity to practise making sense of user insights and making informed design decisions as the results of each test (from card sorting to usability testing) came in.
Thanks for reading this article! Feel free to comment below and we’ll connect!