
Narwhal Nation Interface


AT A GLANCE

BUSINESS PROBLEM:

The New School required an assessment of its new event management system, Narwhal Nation, to determine whether it was worth the investment and how it could be improved. Was it working as intended? 

MY ROLE & METHODS USED: 

I volunteered to lead a multi-method analysis of the platform and focused on three areas of usability for three types of users: 

  • How useful is it in documenting events?

    • User type: RA staff members / internal users

    • Methods: Usability testing, focus groups, quantitative data analysis

  • How useful is it in publicizing events?

    • User type: Students / External customer users

    • Methods: Survey, quantitative data analysis

  • How useful is it in evaluating events? 

    • User type: Professional staff / Internal users

    • Methods: Interviews, quantitative data analysis

SELECTED OUTCOMES:

  • Several alterations were made to the event submission form 

  • Training was revamped for internal staff based on findings

  • An investment was made in purchasing ID swipers for more accurate attendance logging

challenge

Narwhal Nation is the event management program the university uses to promote student engagement. It was new in the 2018-2019 school year and represented a significant investment and a change in how we handle engagement. The university spends thousands on its programming and needs an interface that is effective and usable. I was involved in the strategic planning assessment for the division, and one aspect of the strategic plan required “assessing the efficacy of Narwhal Nation”. I took the lead in examining the strengths, weaknesses, and overall usability of Narwhal Nation.

process

The Housing Department was the single biggest user of Narwhal Nation, using it to document, promote, and evaluate its almost 2,000 annual events for its customers, the students living in residence, so I decided to focus my research efforts there. I consulted with the department to learn more about what they needed to know, and then met with the Director of Student Involvement, who oversees the platform for the University, and with the Senior Director for Assessment for their feedback. 

 

I decided to examine whether it was doing what we needed it to do and to assess its overall usability in three key areas: 1) documenting events (usability for Resident Advisors, the RAs inputting events), 2) publicizing events (usability for residents, the customers checking it to see what is happening on campus), and 3) evaluating events (usability for the professional staff supervisors charged with providing reports about events happening).

 

I created a multi-pronged research plan covering the three areas above and involving three kinds of Narwhal Nation users: the RAs who input events; our customers, the students who learn about events through NN; and the professional staff who supervise the RAs and report on the events happening across the halls. The goal was to improve usability for these groups based on the jobs they were looking to get done and to give us an overall sense of the positives and negatives of Narwhal Nation.

METHODS & RESULTS:

DOCUMENTING:

  • DATA ANALYSIS: I collected data from NN, from the previous event management platform, and from the concurrent monthly reporting to get a clearer picture of how many events had been conducted in the halls (a sketch of this cross-source tallying appears after this list). 

    • RESULTS: The NN data showed that different halls had different standards for which events to include in NN, and more programming was reported in the previous system and in the concurrent monthly reports than in NN. 

      • This informed my later interviews with supervisors on what they required employees to enter into the database and prompted calls for standardization.

  • USABILITY TESTING: I sat with five RAs and asked them to talk me through entering a program into Narwhal Nation to get a sense of pain points and issues with that process. 

    • RESULTS: After analyzing these sessions, I identified several confusing aspects of the submission form, including identifying the type of program and unclear language around submission that left RAs unsure whether they had actually submitted. RAs also expressed a desire for personalization and for links to the other systems they have to log into for the job. 

  • FOCUS GROUPS: I trained some colleagues to help conduct focus groups with RAs about their usability experiences with NN for documenting programming. 

    • RESULTS: Identified features RAs found helpful (e.g., swiping IDs for attendance purposes) as well as things they found problematic, like repetitive steps within the form, a desire for better integration with the tools residents use more often, and issues with evaluation.
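
The cross-source comparison described in the DATA ANALYSIS step above was simple tallying; a minimal sketch of how it could be done in Python with pandas is below. The file and column names (nn_events.csv, hall, and so on) are hypothetical placeholders for illustration, not the actual exports.

import pandas as pd

# Each export is assumed to have one row per event with a "hall" column.
sources = {
    "narwhal_nation": "nn_events.csv",
    "previous_system": "old_system_events.csv",
    "monthly_reports": "monthly_report_events.csv",
}

# Count documented events per hall in each source.
counts = {
    name: pd.read_csv(path).groupby("hall").size()
    for name, path in sources.items()
}

# Side-by-side table; halls missing from a source show up as 0.
comparison = pd.DataFrame(counts).fillna(0).astype(int)
comparison["nn_vs_monthly_gap"] = (
    comparison["monthly_reports"] - comparison["narwhal_nation"]
)
print(comparison.sort_values("nn_vs_monthly_gap", ascending=False))

A positive gap for a hall would suggest events being reported in the monthly documents but never entered into NN, which is what prompted the later conversations with supervisors about standardization.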

 

PUBLICIZING: 

  • SURVEY: I surveyed customers (students) from all halls about their experiences with NN, with an eye toward how usable they found it and whether they used it at all. The survey consisted of quantitative questions as well as open-ended questions about their usage of the platform. I used affinity mapping (one wall of several featured below) to organize the qualitative information gained from the survey and surface themes in the data. 

    • RESULTS: 1) NN was clearly used more in first-year halls; 2) students preferred to hear about events through emails and posters; 3) a large number of students had not used NN at all; 4) qualitative analysis (affinity mapping) found the main issues with NN were that it felt like a hassle or redundant; 5) however, a good chunk of students wanted to be in the loop, they simply had not warmed to NN; and 6) students wanted NN better integrated into my.newschool or the other tools they already use. A sketch of the quantitative tallying appears just below. 
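
A rough sketch of how the survey's closed-ended questions could be tallied follows, again in Python with pandas; the file name nn_student_survey.csv and the column names are assumptions for illustration, not the actual survey fields.

import pandas as pd

survey = pd.read_csv("nn_student_survey.csv")  # one row per respondent

# Share of respondents in each hall who report having used NN at all
# (assumes has_used_nn is stored as 0/1 or True/False).
usage_by_hall = (
    survey.groupby("hall")["has_used_nn"].mean().sort_values(ascending=False)
)
print(usage_by_hall)

# Preferred-channels question assumed to be multi-select, stored as
# comma-separated text such as "email, posters".
channels = (
    survey["preferred_channels"]
    .str.split(",")
    .explode()
    .str.strip()
    .value_counts()
)
print(channels)

The open-ended responses were analyzed separately through the affinity mapping pictured below.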

[Photos: affinity mapping walls from the survey analysis]

EVALUATING:

  • INTERVIEWS: I interviewed all professional staff members about how they evaluate the programming being done, what expectations they have for programming, and how to better gather data for reporting. 

    • RESULTS: Analyzing these interviews, I found several themes, including: a sense that the data were hard to collect, a desire to clean up the approval process, and difficulty getting students to evaluate programming.

OUTCOME

I found many actionable insights and presented them to a number of stakeholders responsible for purchasing, implementing, and training on the Narwhal Nation platform. Some outcomes included:

  • Met again with the Director overseeing NN, as well as my own director, to discuss my presentation of results

  • Changes were made to the submission form to address usability issues noted by RA staff, specifically the confusing language around whether something had actually been submitted, the confusing learning outcome step, and the shared tags (residence hall tags were added)

  • Feedback from focus groups and usability tests helped inform the training we did for new RA staff

  • Worked with the team to show how some things were being done differently across halls, and we standardized our reporting and what we required from staff 

  • Invested in ID swipers for all RA staff, as they had reported enjoying this functionality and the attendance numbers were promising

  • Added tags indicating which hall an event is held at, making events easier to search, in response to residents saying they wanted to be in the loop

  • We moved toward having one page for HRE instead of separate pages for each building
