The client is a large missions organization and religious nonprofit with a tour incorporated into its headquarters. After 20 years, the tour is being renovated, and many of the exhibits are moving to large screens incorporating stories on video, animated infographics, and still-photo exhibits.
We were given the task of creating a piece that incorporated text stories read by a narrator, accompanied by small avatars depicting the people involved in each story, or an icon representing some integral part of it. The 14 stories were divided into two categories – people who had been killed in the course of their work, and stories of apparent miracles – and were all different lengths. Some had specific dates tied to them, and all had locations. It had to be possible to add new stories, remove selected ones, and activate or deactivate them as needed.
A consultancy was doing the broader tour layout and provided mockups of the space.
Plans included providing for ADA compliance to allow persons in wheelchairs to access the interface.
We were not provided with any information about how high off the ground the screen would be mounted, but were told that the images were to scale. We knew that the wall was 8′ tall and that the screen was a 75″ touch screen, which allowed us to scale out the rest of the display.
The entire project happened during the height of the second COVID-19 wave in Central FL, so I was unable to interact with the 75″ screen in person, or to recruit people to do user/interface testing. I am average height for an American male (5’9″), my wife is average for an American female (5’4″), and we have children ranging from 9 years old to 10 months old. Additionally, we had a small number of friends that we had “containered” with during the worst of the pandemic. I laid out the proper placement of the screen on a wall.
I combined these data points with ADA-compliant reach ranges, laid the results over the initial visual designs (created by a coworker), and created mockups to assess how they would feel in a real-world space.
Challenges to Prototyping & Refinement
The visual designer and I were very aware of our limitations: we had no access to anything approaching a 75″ 4K screen in portrait orientation. Without approximating the experience, it would be far too easy to oversize images or undersize text (as in the mockups above). I got hold of a 4K monitor that could be rotated, but I had to make sure I was the proper distance from it to “test” the designs and confirm that the elements were readily discernible.
According to the Society of Motion Picture and Television Engineers, the optimal viewing angle in theaters is between 35° and 55° of the horizontal field of vision (FOV), and the middle 60° of the vertical FOV is considered the “central” portion. We decided to use 40° of the vertical FOV as our “sweet spot” to determine how far back users were likely to stand when not interacting directly with the screen, allowing for a range from 30° up to 50°. With our screen being approximately 65″ tall, I calculated the FOV at a given distance using the equation:
FOV° = 57.2958 × ( ARCTAN ( OBJECT SIZE / VIEWER DISTANCE ) )
| Viewer Distance | Screen Size | Vertical FOV |
|---|---|---|
| 55″ / 4’7″ | 65″ | 49.76° |
| 77″ / 6’5″ | 65″ | 40.17° |
| 113″ / 9’5″ | 65″ | 29.91° |
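As a quick sanity check, the equation and the table values line up; a small sketch in JavaScript:

```javascript
// Vertical FOV in degrees for an object of a given height (inches)
// viewed from a given distance (inches), per the equation above.
function fovDegrees(objectSize, viewerDistance) {
  return 57.2958 * Math.atan(objectSize / viewerDistance);
}

console.log(fovDegrees(65, 55).toFixed(2));  // "49.76"
console.log(fovDegrees(65, 77).toFixed(2));  // "40.17"
console.log(fovDegrees(65, 113).toFixed(2)); // "29.91"
```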
My monitor is 27.9″ in its widest dimension. I had to place it about 33″ away for it to fill approximately 40° of my vertical FOV.
The primary element that needed to be dialed in properly was the font size of the body copy. Generally, minimum reading size is understood to be 13 arc minutes (13/60ths of a degree) of the FOV.
Assuming that Chrome’s standard font size of 16px is sufficiently readable at the 33 inches I was from my screen, I calculated its size in arc minutes using the same equation as above, multiplying the result by 60 to get arc minutes instead of degrees. The result was 17.36 arc minutes; at that scale, rounding up to 18 is negligible, and would only improve readability.
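The same calculation in code (the 1/6-inch physical height for 16px text assumes a typical 96 dpi desktop display):

```javascript
// Apparent size in arc minutes: the same arctangent, multiplied by 60.
function arcMinutes(objectInches, distanceInches) {
  return 60 * 57.2958 * Math.atan(objectInches / distanceInches);
}

// 16px at 96 dpi is 1/6 of an inch tall, viewed from 33 inches away.
console.log(arcMinutes(16 / 96, 33).toFixed(2)); // "17.36"
```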
Units & Affordances
Both my visual designer and I are used to creating designs that will be implemented on a wide variety of screen ratios and sizes, and we were operating in that paradigm. It was only at this point that we realized that our screen would always be the exact same size and ratio. This led us to think of all elements’ sizes in terms of the CSS vh unit (1vh = 1/100th of the viewport height).
At a 4K portrait resolution, and at our assumed viewer distance, 18 arc minutes translated to approximately 0.8vh (8/1000ths of the screen height).
After some feedback, and to make the text of the stories crystal clear to viewers, we ultimately increased the body size to 1.155vh. This is approximately equivalent to 25.6 arc minutes, and to 22pt text on my own screen.
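The arc-minute-to-vh conversion can be sketched as a small helper. This is a hypothetical function, and the viewer distance is left as a parameter, since the exact assumed distance behind the figures above isn’t stated:

```javascript
// Hypothetical helper: convert an apparent size in arc minutes at a
// given viewer distance (inches) into vh on a screen of a given
// physical height (the exhibit's screen is ~65″ tall).
function vhFromArcMinutes(arcMin, viewerDistanceIn, screenHeightIn = 65) {
  const radians = (arcMin / 60) / 57.2958;
  const inches = viewerDistanceIn * Math.tan(radians);
  return (inches / screenHeightIn) * 100;
}

// Sanity check: 60 arc minutes (1°) from 57.3″ away subtends about 1″,
// which is roughly 1.54vh on a 65″-tall screen.
console.log(vhFromArcMinutes(60, 57.2958).toFixed(2)); // "1.54"
```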
Back End Structure
Making it Easy to Update
With the requirement that the experience be updatable, we decided it should be as easy to update as possible. During the ramp-up to the project we were provided with a document containing all of the stories (including some that would not be active at launch), and the images of the people who had been killed.
I created a JSON file for each category with the following structure:
```
name:       string
slug:       string (unique)
active:     boolean
location:   string
date:       string
paragraphs: [ an array of strings ]
```
I then used Mustache.js to create templates for each story, and each category’s menu pages. This allows the entire experience to be a single page, removing loading time between button pushes, and allowing for smoother transitions. All of the stories are loaded in using AJAX at the launch of the page.
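As a rough sketch of the data handling (the field values and directory names here are illustrative placeholders, not the project’s real content), each category’s JSON can be filtered to its active stories and mapped to view objects for the Mustache templates:

```javascript
// Hypothetical sketch: filter one category's stories down to the
// active ones, and derive each story's asset paths from its slug.
const stories = [
  { name: "First Story", slug: "first-story", active: true,
    location: "Somewhere", date: "1998", paragraphs: ["..."] },
  { name: "Second Story", slug: "second-story", active: false,
    location: "Elsewhere", date: "", paragraphs: ["..."] },
];

const views = stories
  .filter(story => story.active)
  .map(story => ({
    ...story,
    avatar: "img/" + story.slug + ".png",   // avatar named after the slug
    audio:  "audio/" + story.slug + ".mp3", // audio named after the slug
  }));

// Each view object would then be passed to Mustache.render(template, view).
console.log(views.map(v => v.slug)); // ["first-story"]
```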
Adding new content is a 3-step process:
- Add the content to the appropriate JSON file.
- Add the representative avatar/icon to the associated image directory. It must be named the same as the `slug` field in the story’s JSON entry.
- Add the audio track to the audio directory. The file must likewise be named the same as the story’s `slug`, with an `.mp3` extension.
When the page is reloaded, it will load the new story.
Activating a story that is already in the JSON file only requires setting its `active` field to `true`.
The storytelling experience is the core of what we were creating. The basic idea was to let visitors take in each story both audibly and visually. As such, it was necessary to coordinate the audio of each story’s `slug.mp3` with the scrolling of the story text (where necessary). Additionally, a music bed was requested under the story audio tracks, but the team didn’t want to mix each story with the music in a single track, preferring a separate bed track shared by all of the stories.
I landed on using Howler.js to coordinate the tracks and the scroll, primarily making use of the following functions:
- `.duration()` to retrieve the length of the story mp3.
- `.seek()` to make sure that the stories always start at the beginning when pages are reloaded.
- And the obvious `.play()`, `.pause()`, and `.fade()`.
The JS file does the following when each story is loaded (if the UI is not muted):
- Fade in the page, and then scroll it to the top (a user could have scrolled lower, and the browser maintains that scroll position).
- Find the length of the story mp3, using:
  `duration = eval(slug + "_audio.duration()") * 1000`
  Note: if this product were going to be used online, `eval()` would have to be avoided, as it is patently insecure.
- Set the playback position of the story mp3 to its beginning.
- Start scrolling the story text with a duration equal to the duration of the story mp3.
- Start the playback of the story mp3.
- When the story text reaches the end it stops, at approximately the same time as the mp3.
- 3 seconds later the audio bed fades out.
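If the Howl instances are kept in an object keyed by slug, rather than in slug-named global variables, the `eval()` lookup above can be avoided entirely. A sketch with stand-in objects in place of real `new Howl({...})` instances:

```javascript
// Stand-ins for Howl instances, keyed by slug instead of being held
// in slug-named globals. The duration here is a made-up example.
const audioBySlug = {
  "first-story": { duration: () => 90, seek: () => {}, play: () => {} },
};

// Equivalent of eval(slug + "_audio.duration()") * 1000, without eval().
function storyDurationMs(slug) {
  const track = audioBySlug[slug];
  return track ? track.duration() * 1000 : 0;
}

console.log(storyDurationMs("first-story"));  // 90000
console.log(storyDurationMs("unknown-slug")); // 0
```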
If the UI is muted, there is no auto-scroll, and the story loads in a paused state. The user can hit play, which will start the auto-scroll, but not the audio.
When the user navigates to a “next” or “previous” story, the audio and scroll will pause, fade out, and then proceed through the list above.
The pause button pauses the audio as well, and resumes from the current playback position, unless it’s within the first 3 seconds.
The UI also includes a “restart” button.
Every time a menu page is loaded, it fades in as it scrolls to the top, in an attempt to indicate that it can be scrolled. Once a user scrolls down on the touch screen, the links to all stories fall within ADA reach-range guidelines.
Throughout the experience, all control buttons are within the ADA recommended range, as well as the average reach-range for the average American adult.
The groups that regularly go through the tour experience range from a handful of middle-aged vacationers to hundreds of middle-school students over a 2 hour period. Inevitably, someone is going to tap some combination of buttons that will “break” the experience, requiring it to be reloaded.
We talked with the tour designers and managers and found out that they do not intend on having any other kind of interface (keyboard or mouse) connected to this display, and would not be able to easily interact with it other than through the provided touchscreen.
To that end, we added a reload “easter egg”: 8 consecutive taps in an invisible 5vw box in the bottom-left of the screen will refresh the entire page and re-load all of the story and audio data.
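A minimal sketch of that tap counter (the 2-second reset window is an assumption; the source only specifies 8 consecutive taps):

```javascript
// Hypothetical tap counter for the hidden reload hotspot: fires the
// reload callback on the 8th consecutive tap, and resets the count
// if taps stop for longer than the (assumed) 2-second window.
function makeCornerTapHandler(onReload, windowMs = 2000) {
  let taps = 0;
  let resetTimer = null;
  return function cornerTapped() {
    taps += 1;
    clearTimeout(resetTimer);
    resetTimer = setTimeout(() => { taps = 0; }, windowMs);
    if (taps >= 8) {
      clearTimeout(resetTimer);
      taps = 0;
      onReload();
    }
  };
}

// In the browser, this would be wired up roughly as:
// hotspot.addEventListener("pointerup",
//   makeCornerTapHandler(() => location.reload()));
```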
I have uploaded the entire project repository to GitHub at:
Note: for the purposes of development, I recorded or synthesized the story mp3s, chose a stock video, and selected a stock music bed for the project. All of these were intended to serve as artifacts to work around and with, and will be updated and improved in the final implementation.
Viewing on Other Devices
If you would like to see and interact with the project, it is available at:
Nothing will look right unless you view it at the proper aspect ratio of a 4K screen in portrait orientation. You can achieve this in Google Chrome’s dev tools panel by creating a custom device view and setting the resolution to 2160 × 3840.
When the page loads, guidelines displaying the average and ADA reach parameters will appear on the page; you can hide them by tapping the bottom-left corner of the interface.