iOS App for the Large Pelagics Research Center
How I designed a mobile tool for collecting and analyzing data about global fish populations in 25 hours, using tangible prototypes and rapid feedback.
Overview
Ahead of their first fundraising efforts, the Large Pelagics Research Center contracted me to design an iOS app for tracking global fish populations.
In under three weeks, I delivered designs for a solution that met technical data collection requirements, responded to nuanced insights about the fisherpeople using the product, and balanced the constraints of the sole engineer developing the app.
In addition to delivering an end-to-end experience in Figma, I developed content design, a visual identity, a design system, and a style guide.
Background
Previously, the Large Pelagics Research Center’s data about the global movements of fish species relied heavily on the memories of recreational and commercial fisherpeople. It was a manual, asynchronous process in which information was entered into the system days or weeks after catch and release, so data was often incomplete and inaccurate.
The Large Pelagics Research Center, in partnership with the Olin College of Engineering Intelligent Vehicles Laboratory, has developed new tags that carry an RFID chip to automate the process of tag identification. A fisherperson can scan the tag at or near the moment of capture with a compatible RFID reader, such as a mobile phone, and immediately enter data about the fish in the HiTag app. When the app regains a cellular or wifi connection, the data is automatically uploaded to the Pacific Islands Fisheries Group database.
Minimizing the barrier between catching and reporting the fish improves catch and release fishing activities, while also giving scientists and researchers access to richer, more accurate data over the lifetime of a fish.
New tags carry an RFID chip to automate the process of tag identification. These tags can also be detected via Bluetooth.
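For readers curious how the scanning step maps to code: iPhones can read many passive RFID tags through Apple's Core NFC framework. The sketch below assumes the phone's built-in reader is used and that the tags speak ISO 15693 (a common passive RFID protocol iPhones can read); a Bluetooth-connected external reader would integrate differently, and the HiTag app's actual implementation may vary.

```swift
import CoreNFC

/// A minimal sketch of scanning a tag and capturing its unique ID.
/// Assumes ISO 15693-compatible tags; not the app's actual implementation.
final class TagScanner: NSObject, NFCTagReaderSessionDelegate {
    private var session: NFCTagReaderSession?
    var onTagRead: ((String) -> Void)?

    func beginScanning() {
        session = NFCTagReaderSession(pollingOption: .iso15693, delegate: self)
        session?.alertMessage = "Hold your phone near the fish tag."
        session?.begin()
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {}

    func tagReaderSession(_ session: NFCTagReaderSession, didInvalidateWithError error: Error) {
        self.session = nil
    }

    func tagReaderSession(_ session: NFCTagReaderSession, didDetect tags: [NFCTag]) {
        guard let first = tags.first, case let .iso15693(tag) = first else {
            session.invalidate(errorMessage: "Unsupported tag type.")
            return
        }
        session.connect(to: first) { [weak self] error in
            if error != nil {
                session.invalidate(errorMessage: "Connection failed. Try again.")
                return
            }
            // The tag's unique identifier becomes the catch record's tag ID.
            let tagID = tag.identifier.map { String(format: "%02X", $0) }.joined()
            session.alertMessage = "Tag scanned."
            session.invalidate()
            self?.onTagRead?(tagID)
        }
    }
}
```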
Team:
3 researchers from the Large Pelagics Research Center
1 iOS engineer
1 product designer (me)
My role:
UX/UI designer
Interaction designer
Content designer
Visual designer
Prototyper
Brand designer
Initial timeline: 20 hours
Due to budget constraints, I was initially given 20 hours to deliver a net-new, end-to-end design solution on a lean, fully remote team.
Requirements
I partnered with the iOS engineer to outline the MVP product requirements. We confirmed this with our clients, the 3 researchers.
Two flows:
First catch flow (for fish that did not yet have tags)
Recapture flow (for fish that already had tags)
Data collection requirements:
RFID tag ID
Photo
Species
Size & weight
Date & time
Location
Nice to have:
Visual identity
Design system
Information architecture
Users
Due to time constraints, I was not able to conduct generative research with commercial or recreational fisherpeople. Instead, I worked closely with the 3 researchers, who were also my clients, to gather nuanced insights about the three different user types, their needs, and their workflows.
3 core users:
Commercial fisherpeople: These individuals are monetarily incentivized to record data.
Recreational fisherpeople: These individuals are “citizen scientists” who are personally motivated to record data.
Marine researchers: These individuals are often aboard commercial and recreational expeditions to support data collection.
Insights:
The entire catch and release process must happen in under 45 seconds
There is often limited or no access to wifi or cellular data while using the app, so data must be recorded locally and uploaded asynchronously (this is important for engineering, but might also need to be communicated in the UI; see the sketch after this list)
There is often high glare from the sun while on the water, so the UI must be usable in bright settings
Recreational fisherpeople can be hesitant to share their location, but location is a requirement, so the experience must respond to these conflicting needs
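To make the asynchronous-upload insight concrete, here is a minimal sketch of how the app might queue records offline and flush them once connectivity returns. The endpoint and serialization are placeholders, not the production implementation, and a real version would persist the queue to disk and retry failed uploads.

```swift
import Foundation
import Network

/// Queues serialized catch records and uploads them when connectivity returns.
final class CatchUploader {
    private let monitor = NWPathMonitor()
    private var pending: [Data] = []
    private let endpoint = URL(string: "https://example.org/api/catches")! // placeholder

    init() {
        // Flush the queue whenever the network becomes reachable.
        monitor.pathUpdateHandler = { [weak self] path in
            if path.status == .satisfied { self?.flush() }
        }
        monitor.start(queue: DispatchQueue(label: "catch-uploader"))
    }

    func record(_ serializedCatch: Data) {
        pending.append(serializedCatch) // in production, also persist to disk
        flush()
    }

    private func flush() {
        guard monitor.currentPath.status == .satisfied else { return }
        while let record = pending.first {
            pending.removeFirst()
            var request = URLRequest(url: endpoint)
            request.httpMethod = "POST"
            request.httpBody = record
            URLSession.shared.dataTask(with: request).resume()
        }
    }
}
```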
I translated these insights into a high-level catch-and-release workflow. The digital experience would need to be integrated with this workflow.
Catch and release workflow.
Iteration 1: High-level flows
With an understanding of the workflow and insights about the users, I proposed two high-level patterns for the digital experience.
Option 1: Step-by-step flow
Option 1 follows a pattern where every piece of information is collected on a separate screen during data entry. This supports more complex data inputs in the UI, but forces a specific order and can feel lengthy. I created a low-fidelity prototype to quickly gather feedback from the team, which can be viewed here.
Option 2: One single form
Option 2 follows a pattern where all data inputs are handled in a single form. This raised issues for entering complicated data, such as species and location, but felt faster and more flexible. Remember, the entire catch and release process must happen in under 45 seconds, so I was designing for efficiency. Also, this option would be easier for the engineer to implement. You can view the prototype here.
Option 1: As a pattern, each piece of data is handled on an individual screen.
Option 2: As a pattern, all data entry happens in a form on a single screen.
Feedback
The team agreed to move forward with option 2. It would be easier to implement and maintain, and it was faster and more flexible than ordered screens.
Iteration 2: Interaction Patterns
It was time to refine interaction patterns and design details, specifically for the various data inputs.
Simple numerical entry
Simple data inputs, such as fork length, could be handled inline with fields that follow native iOS patterns.
Simple inputs, such as numerical fork length and date/time, can be supported inline.
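A field like this is close to a stock SwiftUI control. The sketch below assumes a fork-length field measured in centimeters; the names and unit are mine, for illustration, not the app's.

```swift
import SwiftUI

/// Inline numeric entry following native iOS patterns.
struct ForkLengthField: View {
    @State private var forkLength = 0.0

    var body: some View {
        HStack {
            Text("Fork length")
            Spacer()
            TextField("0", value: $forkLength, format: .number)
                .keyboardType(.decimalPad)          // numeric keypad for fast entry on deck
                .multilineTextAlignment(.trailing)
            Text("cm").foregroundStyle(.secondary)  // unit assumed for illustration
        }
    }
}
```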
Complex data entry: Species and location
Location and species had nuanced requirements based on differences between recreational and commercial fisherpeople.
In the initial exploration, species selection occurred in a dropdown menu with fuzzy search (the items in this menu would be pulled from a finite list).
Location was split into latitude and longitude numerical fields; this was informed by guidance from the 3 researchers regarding the language used during commercial and research expeditions.
Species and location needed to be usable by commercial and recreational fisherpeople.
Feedback
Species
In order to support efficiency, we decided on the following requirements:
Species should be optional. This is particularly useful for recreational fisherpeople, who might be uncertain.
The UI should also support “unknown” for users who do not know the species.
The UI should support “frequently used” or “recently used,” which responds to real-world scenarios in which fisherpeople repeatedly catch the same species.
How might we design a UI that supports the complex requirements for species?
Location
I learned from the researchers that recreational fisherpeople are not always fluent in latitude and longitude coordinates.
Also, as mentioned earlier, recreational fisherpeople are hesitant to share their precise location.
The backend, however, requires precise lat/long coordinates.
How might we design a UI that allows users to enter a general region or location, and that can be translated on the backend into specific lat/long coordinates?
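One possible answer, sketched below, is to forward-geocode a typed region name into coordinates using CLGeocoder. Whether this translation runs on-device or on the backend is an open implementation choice, and the function here is illustrative rather than the app's actual approach.

```swift
import CoreLocation

private let geocoder = CLGeocoder()

/// Translates a typed region name (e.g. "Cape Cod Bay") into lat/long
/// coordinates. A sketch; the production translation could equally
/// live on the backend.
func coordinates(forRegion name: String,
                 completion: @escaping (CLLocationCoordinate2D?) -> Void) {
    geocoder.geocodeAddressString(name) { placemarks, _ in
        // The first match's coordinate stands in for the region.
        completion(placemarks?.first?.location?.coordinate)
    }
}
```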
Iteration 3: Complex Interaction Patterns
I explored a more complex UI and interaction pattern that utilized modals to support the needs of both commercial fisherpeople and recreational fisherpeople.
Species
A modal improved efficiency by creating the opportunity for quick links, such as “frequently used” or “suggestions.” A modal also provided clear space for an “unknown” option.
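As a concrete sketch of that modal (the names are illustrative, and simple substring matching stands in for the fuzzy search):

```swift
import SwiftUI

/// Species modal sketch: recently used species surface first, the full
/// list is searchable, and "Unknown" is always available.
struct SpeciesPicker: View {
    let allSpecies: [String]          // finite list from the researchers
    let recentlyUsed: [String]
    @State private var query = ""
    var onSelect: (String?) -> Void   // nil means "unknown"

    private var matches: [String] {
        query.isEmpty ? allSpecies
                      : allSpecies.filter { $0.localizedCaseInsensitiveContains(query) }
    }

    var body: some View {
        NavigationStack {
            List {
                if query.isEmpty && !recentlyUsed.isEmpty {
                    Section("Recently used") {
                        ForEach(recentlyUsed, id: \.self) { species in
                            Button(species) { onSelect(species) }
                        }
                    }
                }
                Section("All species") {
                    ForEach(matches, id: \.self) { species in
                        Button(species) { onSelect(species) }
                    }
                }
                Button("Unknown species") { onSelect(nil) }
            }
            .searchable(text: $query)
            .navigationTitle("Species")
        }
    }
}
```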
Location
For location, the modal provided an opportunity for the user to share location in multiple ways. I explored the option to select a general region, as well as pinch-and-zoom interactions. We ultimately landed on a UI where users could type in exact lat/long coordinates, use their current location, or pinch-and-zoom to an area.
Handling location in a modal allowed for more flexible inputs, while also supporting exact lat/long coordinates.
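The pinch-and-zoom piece can be sketched with MapKit: the map's visible center becomes the reported coordinate, so users control how precise to be while the backend still receives exact lat/long values. Starting coordinates and names below are placeholders.

```swift
import SwiftUI
import MapKit

/// Pinch-and-zoom location entry: the map's center is submitted as the
/// lat/long pair, so the user chooses how precise to be.
struct LocationPicker: View {
    @State private var region = MKCoordinateRegion(
        center: CLLocationCoordinate2D(latitude: 41.9, longitude: -70.3), // placeholder start
        span: MKCoordinateSpan(latitudeDelta: 2, longitudeDelta: 2))
    var onConfirm: (CLLocationCoordinate2D) -> Void

    var body: some View {
        VStack {
            Map(coordinateRegion: $region)
                .overlay(Image(systemName: "plus").allowsHitTesting(false)) // center crosshair
            Button("Use this location") {
                onConfirm(region.center) // exact lat/long still recorded for the backend
            }
        }
    }
}
```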
Prototype
I created a prototype to reflect this new pattern, where simple inputs would be handled inline, but complex inputs would be handled in a modal. I brought this prototype back to the team for feedback. There was consensus that numerical inputs in fields combined with modals for complex inputs supported both efficiency and nuanced user needs. You can view the prototype here.
Hybrid approach, where simple inputs are handled inline, while complex inputs are handled in a modal.
Iteration 4: Visual Identity and Design System
At this point, the team was aligned about the digital workflow and interaction patterns. The engineer was unblocked to begin building, but I was nearing 20 hours of work. The 3 researchers, who were also the clients, were thrilled with the UX, so they wanted to extend the contract by 5 hours to refine the visual design. This app would be used for proof of concept in the field, but also for investor pitches, so the team was interested in polished visuals.
Maritime, utilitarian, and efficient
The team shared existing collateral and a high-level vision for a marine look and feel, which I translated into a handful of directions.
We wanted the app to feel utilitarian, but also modern and efficient. We ultimately chose a monospaced typeface for field inputs, and a sans serif typeface for all other content.
Designing for high-glare situations
The UI needed to be visible in high-glare situations. Knowing this, I proposed a handful of options for a dark theme, which is more legible in bright conditions than a light theme.
Design system and style guide
With the visual identity nearly ready, I began working on a design system in Figma to support future designers and the current engineer.
Results
The final deliverables included a design system in Figma, a lightweight brand identity and visual system, as well as an end-to-end user experience for an MVP iOS app that will be used for proof of concept and investor pitches. You can view the final prototype here.
The app is currently in development. I am continuing to work closely with the engineer for testing and to solve additional problems as they arise. The researchers and I are discussing plans to gather feedback once the app is in the hands of real users. If all goes well during fundraising, I am hoping to support the team with future design work.
While we do not yet have qualitative or quantitative feedback from users, I believe this process speaks to the value of design and rapid prototyping while developing a product from scratch. By repeatedly creating tangible prototypes for the team to react to, I was able to confidently make design decisions under tight timelines.
This process also effectively involved the entire team, including non-technical folks, in the design process. It created transparency, trust, alignment, and team-wide investment in the success of the MVP designs.
Final UI.