4moms infant car seat

Project Type: Integrated product and app experience

Company: 4moms

The 4moms car seat is a robotic infant car seat that automatically installs itself in the vehicle and continuously monitors its own state through a suite of sensors, with a real-time dashboard in the companion app.

4moms Infant Car Seat: System Overview

  • BASE: The car seat base itself contains motors, sensors, and an integrated user interface with a screen, button, and audio speakers.
  • APP: The app connects to the base over Bluetooth and sends commands as part of a choreographed installation experience. The app also displays a dashboard to monitor the state of the car seat in real time.

This project was a large team effort that spanned many years, so below I've bucketed some examples of my work into four categories: (1) Interaction Design - animated storyboards, (2) UX Design - wireframes and system diagrams, (3) Design Research - usability testing, in-home user studies, and one-on-one interviews, and (4) Collaboration with Embedded Software - designing a lightweight text-drawing library from scratch.

Since the launch of the 4moms car seat, I have been working with Henry Thorne, technical co-founder of 4moms, on a spinoff car seat startup business called Safest Seats. We are currently in the exploratory research and prototyping phase, and as the only employee of the startup, I am responsible for all UX design and research.


    Example #1: Interaction Design - Animated Storyboard

    During the installation process, the robotic car seat must use its internal motors to automatically level itself relative to the vehicle chassis. To do this, the app instructs the user to place their phone on the floor of the vehicle to take a reference angle measurement.

    Users initially had a great deal of difficulty with this task because it required a series of compound instructions about the proper placement and orientation of the phone on the vehicle floor during measurement. The task was further complicated by the fact that users had to receive instructions via the app while also physically manipulating and rotating their phone.

    We approached this problem by breaking down the compound instructions into individual steps, and storyboarding possible app graphics and animated transitions. Finally, we conducted guerrilla-style usability testing with 13 people at a local park to ensure that the final instructions were intuitive and understandable. 

    Animation storyboards: Ground angle measurement instructions in app

    Ground angle measurement: Potential error cases


    Example #2: UX Design - Wireframes and System Diagrams

    System Overview: Installation states and error states

    Infant Car Seat Base: Embedded software wake-up states, button events, and Bluetooth events (via connected app)

    User Experience Journey Map, Personas

    App Dashboard States and Notifications

    Detail View: All possible app dashboard center dial states

    Future Concept: Child Monitoring

    In a future iteration of the product, we considered adding sensors to monitor the child's harness and buckle, in addition to the installation state of the seat itself. This feature would allow the seat to detect the presence of the child within the seat, whether or not the harness is buckled, and whether or not the harness is sufficiently tightened.

    User Experience: Child monitoring events


    Example #3: Design Research


    Usability Testing: Large-Scale Field Test

    After development of both the car seat and the app was complete, we conducted an extensive series of usability and field tests to ensure that the product would be usable once we released it into the wild. The scope of testing included both embedded and app software, as well as overall product unboxing and usability with real families and babies. I helped organize and coordinate all final field testing with over 80 different users over the course of 4-5 months.

    Field test designs:

    • Users completed the installation process, under direct observation, in their own vehicles. Users were evaluated for accuracy, and were interviewed about their perception of the experience.
    • Users were sent home with complete boxed car seats and beta versions of the app, and were instructed to install the car seat in their own vehicles to the best of their ability. Users returned 2 days later, and certified technicians inspected the installations for accuracy. We also interviewed users about their perception of how accurately they performed.
    • Users took the car seat home and used it as their primary car seat for their baby for one month. Users provided periodic feedback through remote surveys and a final in-person interview.

     

    One-on-one User Interviews, In-Home User Studies

    As part of the exploratory research phase for the new car seat start-up business, I’ve conducted one-on-one interviews and in-home user studies with parents to better understand use case scenarios and unmet needs around car seats.

    Synthesis: One-on-one user research with mothers

    User Decision Process: Information architecture on car seats

    In-home user research (car seat research, decision, and purchase experience): Journey map and card sorting

    Facilitating team collaboration session on research synthesis


    Example #4: Collaboration with Embedded Software

    Designing a custom, lightweight text-drawing library from scratch

    After the initial release of the car seat, we continued to make updates to the car seat firmware to add additional features and fix minor bugs.

    However, in order to actually change the content on products that had already been sold and released into the world, we had to figure out a way to generate new visual assets through very minimal code that could be uploaded from the app via a firmware update. Any new assets would have to match the visual design of the assets that had been pre-programmed onto the base during manufacturing.

    To accomplish this, I worked with one of our robotics / embedded software engineers to create a custom text-drawing library from scratch.

    This enabled us to create new screen content and modify existing user flows for improved usability.

     

    1. Concept and background

    Our initial approach was to create a tiny individual bitmap image for each alphabet character, consisting of a matrix of 0 and 1 values. We could put these directly into the firmware code like ASCII art, and construct ransom-note-like sentences through code that manually listed the desired x, y screen coordinates of each letter. However, aside from being tedious, this text-drawing approach posed several challenges:

    • Font: Our ASCII art font was just a generic system font, not our 4moms brand font, so it did not look "designed."
    • Rasterization: There was no greyscale rasterization around the edges of the letters, which made them look coarse and jagged.
    • Letter spacing: Our simple algorithm used just a uniform letter spacing distance, whereas a properly designed font should have varied pixel spacing between characters to account for the "side-bearing" of each glyph.
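
    For reference, below is a minimal sketch of that initial bitmap-per-character approach. The glyph data, dimensions, and function names are hypothetical stand-ins for illustration, not the actual 4moms firmware.

    #include <stdint.h>

    #define GLYPH_W 5
    #define GLYPH_H 7

    /* "ASCII art" 1-bit bitmap for the letter A: 1 = pixel on, 0 = pixel off. */
    static const uint8_t GLYPH_A[GLYPH_H][GLYPH_W] = {
        {0,0,1,0,0},
        {0,1,0,1,0},
        {1,0,0,0,1},
        {1,1,1,1,1},
        {1,0,0,0,1},
        {1,0,0,0,1},
        {1,0,0,0,1},
    };

    /* Copy one glyph onto an 8-bit framebuffer at screen position (x, y). */
    static void draw_glyph(uint8_t *fb, int fb_w,
                           const uint8_t glyph[GLYPH_H][GLYPH_W], int x, int y)
    {
        for (int row = 0; row < GLYPH_H; row++)
            for (int col = 0; col < GLYPH_W; col++)
                if (glyph[row][col])
                    fb[(y + row) * fb_w + (x + col)] = 0xFF;
    }

    /* "Ransom note" assembly: each letter's position is listed by hand,
     * with the same fixed advance between every pair of characters. */
    void draw_aa(uint8_t *fb, int fb_w)
    {
        draw_glyph(fb, fb_w, GLYPH_A, 10, 20);
        draw_glyph(fb, fb_w, GLYPH_A, 10 + GLYPH_W + 1, 20);  /* uniform spacing */
    }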

    To solve this, we developed our own miniature font library for our 4moms brand font.

    Screen on the 4moms infant car seat base

    TOP: generic system font, no rasterization, uniform letter spacing

    BOTTOM: 4moms brand font, rasterized, proper letter spacing

    Brainstorm session: Text rendering techniques

     
    Glyphs aligned to baseline in vector format (top), then exported to rasterized bitmap format (bottom)

    2. Rasterized glyphs

    I started by creating a set of bitmap characters, in our brand font, with proper greyscale rasterization around the edges. We limited ourselves to upper case letters A-Z, numerals 0-9, and Spanish characters, in two fixed font sizes.

    To keep the characters consistent, I created outlines of the glyphs in vector format, then exported them to rasterized bitmap format. This ensured that all characters maintained consistent alignment relative to a common baseline on their pixel grid.
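
    As a rough illustration, a grayscale glyph exported this way might be stored in the firmware along the lines of the sketch below. The structure, field names, and pixel values are assumptions for illustration, not the actual implementation.

    #include <stdint.h>

    typedef struct {
        uint8_t width;          /* glyph width in pixels  */
        uint8_t height;         /* glyph height in pixels */
        const uint8_t *pixels;  /* width * height coverage values, 0-255 */
    } Glyph;

    /* A 4x6 letter "I" with softened edges (values are illustrative only). */
    static const uint8_t I_PIXELS[6 * 4] = {
         40, 220, 220,  40,
         40, 255, 255,  40,
         40, 255, 255,  40,
         40, 255, 255,  40,
         40, 255, 255,  40,
         40, 220, 220,  40,
    };

    static const Glyph GLYPH_I = { 4, 6, I_PIXELS };

    /* Blit a glyph onto an 8-bit framebuffer; the exported coverage value is
     * used directly as the pixel intensity against the black background. */
    void blit_glyph(uint8_t *fb, int fb_w, const Glyph *g, int x, int y)
    {
        for (int row = 0; row < g->height; row++)
            for (int col = 0; col < g->width; col++)
                fb[(y + row) * fb_w + (x + col)] = g->pixels[row * g->width + col];
    }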

     
    Measured letter spacing between pairs of glyphs, using the flat edges of the capital letter "I" as a default reference.

    3. Letter spacing

    I tried unsuccessfully to look up an existing letter spacing matrix for our 4moms brand font. This is something the original font designer would typically have encoded into the font library for every possible combination of glyphs.

    Unable to access this data, I decided to create my own table by reverse engineering the way digital fonts calculate letter spacing distance based on the side-bearing of each glyph.

    Since the upper case letter "I" of our Century Gothic font has flat edges on both sides, I used this glyph as a measuring tool to find out how much spacing Illustrator automatically added or subtracted when it was positioned to the left or right of each other glyph in the character set.

    I eventually came up with a table of letter spacing distances for each pair of characters in our set. These measurements were in units of "ems," which enabled us to calculate a fixed pixel distance for each pair based on our screen resolution and font size.
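
    A sketch of how such a table might be used in code follows. It assumes one em equals the font size in pixels, so an em value converts directly to a pixel advance; the index mapping, placeholder table, and function names are hypothetical, not the shipped firmware.

    #include <stdint.h>

    #define NUM_GLYPHS 36  /* A-Z followed by 0-9 in this sketch */

    /* Map a supported character to its table index. */
    static int glyph_index(char c)
    {
        if (c >= 'A' && c <= 'Z') return c - 'A';
        if (c >= '0' && c <= '9') return 26 + (c - '0');
        return 8;  /* fall back to "I" for characters outside this sketch's set */
    }

    /* spacing_ems[left][right]: measured spacing between a pair of glyphs,
     * in ems. Zeroed placeholder; the real table came from the Illustrator
     * measurements described above. */
    static const float spacing_ems[NUM_GLYPHS][NUM_GLYPHS] = { { 0.0f } };

    /* One em equals the font size, so the pixel distance for a pair is the
     * em value multiplied by the font size in pixels on this screen. */
    static int pair_spacing_px(char left, char right, int font_size_px)
    {
        float px = spacing_ems[glyph_index(left)][glyph_index(right)]
                 * (float)font_size_px;
        return (int)(px >= 0.0f ? px + 0.5f : px - 0.5f);  /* round to nearest */
    }

    /* Advance the text cursor: glyph width plus the pair-specific spacing. */
    int next_cursor_x(int cursor_x, int glyph_width_px,
                      char current, char next, int font_size_px)
    {
        return cursor_x + glyph_width_px
             + pair_spacing_px(current, next, font_size_px);
    }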

     

     

     

    4. Screen layout and design specifications

    For each new screen asset we wanted to generate, we used an existing image asset (pre-programmed onto the base during manufacturing) as a background canvas. We then overlaid black rectangles (defined in code) to cover up any text or areas of the image that we wanted to hide. Finally, we defined the x, y position coordinates of each new string of text we wanted to add, along with the string content. The library then figured out the position of each individual letter using our letter spacing table.

    We were able to define all of these specifications in a CSV table, which made it easy to import into the software.
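
    As an illustration of the idea, one imported screen specification might look something like the sketch below. The CSV column names, element types, and values are hypothetical assumptions, not the shipped format.

    #include <stdint.h>

    /* Example CSV rows (illustrative only):
     *   type,       asset_or_text,    x,  y,  w,  h
     *   background, SCREEN_LEVELING,  0,  0,  0,  0
     *   mask,       ,                12, 40, 96, 14
     *   text,       HOLD PHONE FLAT, 12, 40,  0,  0
     */

    typedef enum { ELEM_BACKGROUND, ELEM_MASK, ELEM_TEXT } ElemType;

    typedef struct {
        ElemType    type;
        const char *asset_or_text;  /* pre-programmed asset name, or the string to draw */
        int16_t     x, y;           /* position in screen pixels   */
        int16_t     w, h;           /* rectangle size (masks only) */
    } ScreenElement;

    /* One screen = an ordered list of elements drawn back to front:
     * background image, then black masking rectangles, then new text. */
    static const ScreenElement LEVELING_SCREEN[] = {
        { ELEM_BACKGROUND, "SCREEN_LEVELING",  0,  0,  0,  0 },
        { ELEM_MASK,       "",                12, 40, 96, 14 },
        { ELEM_TEXT,       "HOLD PHONE FLAT", 12, 40,  0,  0 },
    };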

    The final result enabled us to produce new screens with content that was indistinguishable from the original Photoshop-designed content.

    We used a CSV format to import the custom screen design specifications into the software.