design lead | UX design | virtual reality | learning apps
virtual reality STEM apps
for kids discovering science through VR
After six years as a hardware company developing an interactive desktop VR/AR display running on Windows, zSpace decided to get into the business of education technology software. The plan was to bundle the platform with apps, marketed to schools and districts for middle school science labs.
Product was tasked with producing an initial set of three applications to run on the platform: Newtonian physics, electromagnetism, and an open-ended gallery of artifacts for study and comparison. These would share common enterprise components for use in classroom and lab settings.
I joined the team in March, and the product release was a hard deadline in September, just six months away.
There was one other senior UX/UI designer on the team, with two product managers, five developers, one QA, and one developer acting as project manager.
The visual designers had previously developed an SDK that included a UI toolkit of menus, an inventory system, and basic stylus interactions. Given time constraints we were not able to redesign or improve on this UI for version 1.
Design and implement an all-new application for teaching Newtonian physics, making use of and showcasing zSpace's advanced interactive 3D technology. The application would also include enterprise features supporting lesson creation, distribution, and answer collection.
A few things had already been determined for this project.
Product Management provided a list of learning requirements, based closely on national Common Core standards, for what 8th grade students were expected to understand about Newtonian physics.
The virtual environment of the application should be reminiscent of a skate park or gym, built at skate-park scale, with ramps and other items on which balls of various types could roll and bounce under simulated physics.
The company had already developed an SDK with a UI Toolkit of menus, dashboards and other UI elements, lifted from conventional 2D interaction designs.
The short time frame and lack of a dedicated project manager did not allow for a defined design process. Flexibility was key, and the team took a “lean” approach. Essentially, my role was to come up with solutions that everyone could agree to, to describe those as clearly as possible, and to stay one step ahead of the developers. I tried to address the broad feature set and key 3D interaction details at the same time, top down and bottom up.
I would always start a new feature design by meeting with interested parties and asking a lot of questions, where we’d also bounce ideas around. I’d then go off on my own and document what I learned, then begin describing solutions - narratives, outlines, illustrations, etc. I made frequent presentations, eliciting feedback and then revising and expanding. I ultimately provided full specifications for engineering and for QA.
My first day at work, I held a meeting with the product managers and lead engineers to review the requirements spreadsheet in detail. I needed to hear from everyone their thoughts and assumptions, as well as limitations, constraints and priorities.
Together, we hashed everything out until I felt we had a consensus on as many items as possible. The requirements we started with were extensively revised and clarified, and we also found ways to group multiple requirements together into clearer, more cohesive statements.
I then took what I captured in my notes and on the whiteboard and crafted a more complete requirements document to be signed off by the product managers.
A building tool
It seemed pretty obvious that this application should allow users to build their own configurations of ramps and other items for trying out their own ideas with balls. My experiences at LEGO and on SimCity came in handy here.
I began an inventory of items to build with - ramps, platforms, targets, launchers, droppers, pushers, and deflectors. I also proposed a grid-constrained environment so that it would be easy to place these items together precisely.
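The grid-constrained placement idea can be sketched as a simple snapping function. This is an illustrative sketch only - the grid size and function names are assumptions, and the shipping app was built in Unity, not Python:

```python
# Illustrative sketch of grid-constrained placement.
# GRID_SIZE and all values are hypothetical, not from the shipping app.

GRID_SIZE = 0.5  # grid cell size in meters (assumed)

def snap_to_grid(x, y, z, grid=GRID_SIZE):
    """Snap a free 3D position to the nearest grid point,
    so ramps, platforms, etc. align with each other precisely."""
    return tuple(round(c / grid) * grid for c in (x, y, z))

# A ramp dropped at an arbitrary point lands on the nearest grid intersection.
print(snap_to_grid(1.23, 0.0, 2.68))  # -> (1.0, 0.0, 2.5)
```

The key design benefit is that users never have to fuss with precise alignment: any drop point resolves to a legal, tidy placement.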
A learning tool
While building in virtual 3D is a pretty neat use of zSpace, I wanted to make the space rich in data, to “make the unseen seen.” I proposed a Matrix mode the player could invoke, allowing manipulation of time and extraction of data from the balls in the scene. At every moment, each ball would have its own free body diagram - and in motion this would make for an eye-catching “dance of arrows.”
Students would be able to query these vectors at any moment in time, taking measurement of force, velocity, and acceleration.
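The per-ball data behind those queries is just textbook kinematics. The sketch below shows the kind of values a Matrix-mode query could surface for a ball under constant acceleration - names, structure, and numbers are hypothetical, not taken from the shipping app:

```python
# Hypothetical sketch of the vector data a "Matrix mode" query could expose
# for one ball. All names and values are illustrative.

from dataclasses import dataclass

G = 9.8  # gravitational acceleration, m/s^2

@dataclass
class Ball:
    mass: float   # kg
    v0: tuple     # initial velocity (vx, vy), m/s
    a: tuple      # constant acceleration (ax, ay), m/s^2

    def velocity_at(self, t):
        """Velocity vector at time t: v = v0 + a*t."""
        return tuple(v + a * t for v, a in zip(self.v0, self.a))

    def net_force(self):
        """Net force vector from Newton's second law: F = m*a."""
        return tuple(self.mass * a for a in self.a)

# A 0.5 kg ball launched horizontally at 3 m/s, queried 2 s after release:
ball = Ball(mass=0.5, v0=(3.0, 0.0), a=(0.0, -G))
print(ball.velocity_at(2.0))  # -> (3.0, -19.6)
print(ball.net_force())       # -> (0.0, -4.9)
```

Pausing time and reading these vectors off each ball is exactly the kind of measurement the free body diagrams were meant to make tangible.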
A lab tool
Configurations could be saved. Teachers could add a set of questions and then publish the activity to the class. Students could open the activity and answer the questions, submitting them back to the teacher. The UI for this was a simple extension of the existing UI Toolkit. For version 2, I designed an in-world lab workbook, interactive and feature-rich, and with pages that turn.
In addition to Newton’s Park, I was brought onto development of zSpace Studio - an application for studying and comparing 3D models. The beta version proved to be very difficult to use, a problem I reckoned had to do with the ambiguous scale and orientation of the space. I designed a set of controls for managing these issues. I also designed a dynamic ruler that would make it easy to take measurements.
Once version 1 was done and version 2 design was underway, I got more active in exploring and proposing new approaches and ideas for 3D interaction such as “desktop” environments, handling inventories, presenting menus - sometimes building mockups or prototypes in Unity. I also designed new STEM apps for chemistry and geometry - providing high-level designs and Unity mockups.