Kestral Gaian is the Head of Digital at the London Interdisciplinary School. In this post she looks at the future of usability and HCI within Higher Education, and why it’s time to champion the voice of the user.
1992 was a hell of a year for technology.
Windows 3.1 was released and sold one million copies within its first month – setting new records for computer software. Linux saw its first distribution go public just six months after Linus Torvalds announced the new project on Usenet. We got the first major worldwide computer virus scare, the Apple Newton debuted at the Consumer Electronics Show, and Wolfenstein 3D popularised the first-person shooter. We also said goodbye to two legends of computing and science fiction, Grace Hopper and Isaac Asimov, who passed away in January and April respectively.
It was a year once described by Queen Elizabeth II as her “annus horribilis” for a variety of reasons, not least the headlines made when Princess Diana and Prince Charles decided to separate. It was the year in which the USSR split into smaller states, several natural disasters ravaged the planet, and (perhaps most tragic of all) Take That released their first ever album.
Strange, then, that a world in turmoil could still make such huge strides in technological innovation. The releases and momentum of 1992 laid a strong foundation for the decades of innovation that immediately followed.
Thirty years later, the world of technology is almost unrecognisable. The web changed the way we work and think, connectivity requires no tether, and computers have transformed from beige boxes to tiny devices we carry on our wrists, in our pockets, and in backpacks wherever we go.
In 1992, just 17% of UK households had access to a computer [1]. In 2021, the average UK household contained at least four computing devices, often more [2]. With this thirty-year shift from relative obscurity to relative ubiquity has come several huge challenges, but perhaps none has been more obvious than the battle for usability.

A design’s usability depends on how well its features accommodate users’ needs and contexts. People often confuse usability with user experience (UX) and ease of use, but usability is less about which buttons go where and more a measure of how well a specific user, in a specific context, can use a product to achieve a specific goal quickly and easily.
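To make ‘usability as a measure’ concrete, here’s a minimal, purely illustrative sketch (in Python, with invented participants, tasks, and timings – none of it real data) of the sort of numbers a basic usability test produces: task completion rate and median time-on-task for specific users attempting specific goals.

```python
# Purely illustrative: participants, tasks, and timings below are invented.
from statistics import median

# Each record: (participant, task, completed?, seconds taken)
sessions = [
    ("P1", "update pronouns", True, 95),
    ("P2", "update pronouns", False, 240),
    ("P3", "update pronouns", True, 130),
    ("P1", "view assignment feedback", True, 40),
    ("P2", "view assignment feedback", True, 55),
    ("P3", "view assignment feedback", True, 35),
]

def task_metrics(sessions, task):
    """Return completion rate and median completion time (seconds) for one task."""
    attempts = [s for s in sessions if s[1] == task]
    successes = [s for s in attempts if s[2]]
    rate = len(successes) / len(attempts)
    return rate, median(s[3] for s in successes)

for task in ("update pronouns", "view assignment feedback"):
    rate, seconds = task_metrics(sessions, task)
    print(f"{task}: {rate:.0%} completed, median {seconds:g}s")
```

Even numbers this crude shift the conversation from ‘the interface looks fine’ to ‘a third of students couldn’t complete the task at all’ – which is the point of treating usability as a measure rather than a matter of taste.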
While ‘usability’ is certainly the lay term for this field, the term HCI (Human-Computer Interaction) has come to be the standard way to describe research into how people interact with technology – and the annual ACM CHI conference (the Conference on Human Factors in Computing Systems), celebrating its 40th anniversary this year, has been debating the major issues of the day since 1982.
“Usability is about human behaviour. It recognises that humans are lazy, get emotional, are not interested in putting a lot of effort into, say, getting a credit card and generally prefer things that are easy to do vs. those that are hard to do.”
On one hand, we want our technology to be highly personalised, customised, and to feel like it was made exactly how we as individuals want to interact with it. On the other, there’s the need for a standard experience so that people can be trained to use and operate things. On the third and fourth hands, which you’d only have if you were some kind of Vitruvian Man, there are the ever-competing arguments of new functionality vs keeping software simple.
We’ve all used websites that felt like they were designed for rocket scientists, and we’ve all used apps that felt patronising to tap through. Whether an app, an operating system, a website, or something more bespoke, almost every human being can tell you about an experience they’ve had with technology that just plain sucked.

An application can look and feel like it hasn’t been updated since Hanson released MMMBop, yet can still pull in millions of users and – perhaps more alarmingly – millions of dollars in revenue. The Yale School of Art’s website represents 126 students each paying $262,700 in tuition fees. That’s $33.1m USD of revenue every four years – but at the time of writing their website ranks amongst the most unusable in higher education. The Norwegian classifieds website Arngren looks like it was built in 1996. Closer to home, the Arts Council’s website (which is actually quite usable) is let down the second you click through to Grantium, their online bid-writing platform. But why do we tolerate these poor user experiences?
The problem, of course, is choice.
As Morpheus was keen to remind us in The Matrix, choice is often an illusion – and nowhere is that truer than in technology. We often have very little choice over the systems we encounter, both as users and people making software purchasing decisions. In the consumer world the illusion of choice is very real. You can choose between fifty different screen sizes, but ultimately the inner hardware and, crucially, the software of the devices remain woefully generic and have evolved little since 2007. In the business world, enterprise software has the opposite problem – through monopoly the choices are limited, and without competition the market has completely stagnated.
Everyone is looking for that ‘golden triangle’ of usability, cost, and ease of implementation. Add compliance, data security, and some fairly high-profile cases of change going wrong, and it’s no wonder that organisations have evolved to favour backend box-ticking and budget crunch over creating truly breathtaking experiences for end users.
The older an organisation is, the larger and messier its digital legacy. Universities, once at the forefront of new and emerging technologies, are often forced to standardise around backend systems so tired and outdated that they couldn’t support a modern UX if they tried. While student expectations of usability have skyrocketed, systems that handle basic functions like sharing feedback and grades, and allowing a student to change their name, gender identity, or pronouns, remain almost comically archaic and clunky.
We should be able to ‘have our cake and eat it too’, but it feels like we’ve almost forgotten what cake is, what it’s for, and that it’s a thing that can be enjoyably consumed at all.

And so here we are in 2022, three decades since Yeltsin, Linux, and Diana & Charles. Few knew on January 1st 1992 that it would be such a transformative year for technology, but the huge upheaval the world was experiencing unknowingly set the stage for a boom in innovation that ultimately birthed the Internet age and changed forever the way we use technology.
As we enter our third calendar year living with COVID-19, it’s hard not to draw comparisons. The British royal family is once again facing unpleasant headlines, the climate emergency grows ever more urgent, and there is socioeconomic turmoil across the globe.
We don’t yet know how transformative a year 2022 could be, but the opportunity before us seems as huge as it is scary. Recent world events have shaken the status quo, and technology has come under increasing pressure to adapt to our changing lives. The lessons we’ve learned through lockdowns and isolation have changed the nature of the conversations we’re having in higher education around pedagogy, remote teaching and learning, and what distinguishes an online experience from a hybrid one. To steal a phrase from Malcolm Gladwell, we are fast approaching a tipping point.
“It is not the strongest or the most intelligent who will survive but those who can best manage change.”
Now, it seems, is the ideal time for change.
I started my career in technology at Microsoft over two decades ago, and in that time I’ve been lucky enough to witness the inception of many incredible, potentially game-changing technologies. I watched product demonstrations in awe, wondering if this would be the thing that changed the world.
It wasn’t until I ventured beyond the comfort of my corporate desk into roles within charities, broadcasting, and education that I realised how naïve my thoughts around innovation had been. I’d been a self-confessed computer nerd since the age of eight, but for the first time I had to adjust my focus and see the real power behind the technology throne: the users.
A user can be anyone – technophile to technophobe, with an almost infinite combination of use cases, accessibility needs, and preconceptions. By shifting my focus from the innovative technology itself to the innovation that it could enable, I became acutely aware that technology wasn’t really the thing. To quote Lee Pace’s character from Halt and Catch Fire, technology is “the thing that gets us to the thing.”

That all sounds great, but how do we pioneer change in a changing world, one full of tight budgets, shifting demands, and 25% of us potentially in government-mandated isolation at any given time?
In higher education, we need to refocus and completely change the narrative before we can change the world. I believe that 2022 has the potential to surpass 1992 in terms of technological innovation, particularly in the education space, but this new frontier isn’t in CPU power or screen size or connectivity – it’s in the world of HCI and usability.
And it has to start with our users. Not just our ‘stakeholders’ like faculty, and estates, and the VC’s office, but our students. When I talk to current university students about the solutions they’re used to using – no matter how integrated through single sign-on or how well branded they are – the feedback is universally poor.
In 2022 I believe we absolutely must set a new standard for working in partnership with our students and members of faculty to create experiences that feel personal, connected, and supportive. We need to build emotion and care into all we do.
This isn’t a new concept – pioneers in meta-design and participation like Gerhard Fischer wrote some incredible papers on this more than a decade ago – but it is only now that all of these far-reaching and disparate factors have lined up to put us at that magical and tangible tipping point.
We need to treat our digital assets as extensions of our people, not just as optional extras. In the spirit of the phrase “nothing about us without us”, I encourage you all to think about how you build student feedback into every layer of the digital process, from idea generation to provider selection to managing existing tools and services.
“The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire.”
It’s fitting that, as CHI turns 40, its focus is more important now than ever before. We have an opportunity in 2022 not to build the fastest server or create the fanciest screen, but to refocus the entire conversation on usability, on co-operative design, and on building the future hand-in-hand with those who are living it every single day.
That’s my new year’s resolution – and I’d love to see everyone in technology make it theirs, too.
References:
[1] https://www.statista.com/statistics/289191/household-penetration-of-home-computers-in-the-uk/
[2] https://www.statista.com/statistics/365104/number-connected-devices-per-person-uk/
Image Credits
- Screenshot composites taken from versions of Windows in Virtual Machines provided by Kestral Gaian.
- Courtesy of Microsoft, 2010, from ‘Microsoft Office The Movie’
- Courtesy of Warner Bros, 1999
- Illustration by Kestral Gaian