
Sydney Water

Sydney Water is an Australian corporation that provides drinking water, wastewater and some stormwater services to the Greater Sydney metropolitan area in the Australian state of New South Wales. The region has approximately 5.5 million inhabitants, and Sydney Water has over 5,000 staff and an area of operations covering 12,700 square kilometres.

My job as the Lead Product Designer covered the Discover, Define and Design stages of Haz Point. I worked closely with Sydney Water engineers, 3D designers and the hazard management team to create and define the UX, prototypes and interface designs of the initial MVP project.

Role: Lead Designer 

Project: Haz Point

Year: 2017 – 2018

The Problem

There was a clear problem statement that we needed to solve for the project.

“How do we make it easier and safer for new employees and contractors to learn about and avoid potential hazards before going to Sydney Water facilities?”

Many employees and contractors regularly visit Sydney Water work sites. The company wanted a way to show them the potential dangers that could await them at a water treatment facility, to help prevent accidents and increase awareness.

Water treatment facilities can be complicated places, with many different areas and pieces of machinery that can hurt people, sometimes fatally. By showing people in detail, and in advance, what to look out for when visiting these sites, we can minimise some of the risks involved and even save lives in the process.

The advantage of doing a project in virtual reality is that you get a realistic feel for a space and the objects inside it, in this case the machinery and hazards. Compared to 360 photos or videos, the depth and richness of a VR experience built with 3D models is much greater. In VR you can move close to an object and pick it up to see its structure and materials; you can also look at it from different angles and interact with it as if it were real. Here are some of the other benefits of doing projects in VR:

  • Ability to gamify the experience
  • Create a meaningful and memorable experience (Immersive 3D compared to flat 2D)
  • Word of mouth from the wow factor, which boosts usage of the app
  • Deeper understanding and learning thanks to focused, immersive experiences with no outside noise or distraction
  • Build it once and deploy to many devices and headsets

As you can see from the images, these facilities can be complicated and dangerous places to work in.

For this project, I used the Double Diamond design process to map the pieces together. I find it an extremely useful process guide and map to show stakeholders while you are on the design journey. It doesn’t matter whether you are designing VR, AR, apps or a website, you will need a process to follow; it helps you and the team stay on track.

The Double Diamond of Design, a simplified version of my design process.

Designing in VR

In VR you need to think about a lot of details that are irrelevant when you are designing in 2D. Here are some of the main points to consider.

In VR people will tend to look around and not focus on where you want them to. This makes it extremely important to prompt users so they know they are looking in the right direction. Pointing out directions with subtle arrows or overlays is a good way to get them back to where you need them to be. In our case we needed users to look at many different objects and identify which ones were out of place, so looking all around the space was not a bad thing. The human field of view is limited, though, so key prompts and content should sit within a comfortable viewing zone rather than at the extreme edges.
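
To make this concrete, here is a minimal Unity-style sketch of a subtle directional prompt: a small arrow object kept in front of the camera that rotates to aim at the current target. The component and field names are illustrative only, not taken from the project.

```csharp
using UnityEngine;

// A minimal "look over here" prompt: a small arrow kept in front of the
// camera that rotates to aim at the current target.
public class LookPrompt : MonoBehaviour
{
    public Transform head;       // the VR camera
    public Transform target;     // the object the user should look at next
    public float distance = 1.5f;

    void LateUpdate()
    {
        // Keep the arrow floating a fixed distance in front of the user...
        transform.position = head.position + head.forward * distance;
        // ...and rotate it so it always points toward the target.
        transform.rotation = Quaternion.LookRotation(target.position - transform.position);
    }
}
```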

In VR people can get motion sickness, with a range of negative symptoms similar to seasickness. There are a few theories as to why VR motion sickness occurs; it has been explained through sensory conflict, eye movement, evolutionary and postural instability theories. To help prevent it, keep movement within the experience as smooth and minimal as possible. A sensation of flying or gliding can also cause sickness, so it is much better to teleport to a new area in the scene by pointing and clicking.
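
As a rough illustration of the point-and-click teleport approach (a sketch, not the project's actual code), a simple Unity script might look like this, assuming a controller transform, a camera-rig root and floor colliders on a dedicated layer:

```csharp
using UnityEngine;

// A minimal point-and-click teleport sketch. Input is simplified; the real
// Daydream controller has its own input API.
public class SimpleTeleporter : MonoBehaviour
{
    public Transform controller;    // the hand controller's transform
    public Transform playerRig;     // root object of the camera rig to move
    public LayerMask teleportable;  // floor surfaces the user may land on
    public float maxDistance = 20f;

    void Update()
    {
        // On a click, cast a ray from the controller and jump the rig to the
        // hit point instantly rather than sliding it there; smooth artificial
        // movement is a common motion-sickness trigger.
        if (Input.GetButtonDown("Fire1") &&
            Physics.Raycast(controller.position, controller.forward,
                            out RaycastHit hit, maxDistance, teleportable))
        {
            playerRig.position = new Vector3(hit.point.x, playerRig.position.y, hit.point.z);
        }
    }
}
```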

Heavy headsets with tight straps can get uncomfortable for users, so it is important to pick a comfortable headset to design for. We chose the Google Daydream mobile VR headset, which was a good option for comfort with its soft material shell, eye frame and head strap. In general, mobile VR headsets are lighter than room-scale options. Besides the headset itself, the other comfort consideration is head and neck tilt: you shouldn’t make people move in big, awkward motions during the experience, as this can cause discomfort and even neck injury.

In VR people can experience discomfort in certain situations, such as heights, small spaces and the edges of cliffs or buildings. Keep in mind that the mind can play tricks on the body and convince a person they are in a real situation, making them feel like they are genuinely about to fall. It is a good idea to make the user feel safe at all times, unless the designer purposely intends otherwise.

When designing VR experiences you can use sound to your advantage. For example, it can draw a person’s attention to a certain location, make them turn their head or prompt them to go and look for something. If done right, the audio can be extremely convincing and essential to a VR project; if done poorly it can detract from the experience. Some of the main audio details to pay attention to are listed below, with a small example after the list:

  • Left and right (stereo) speaker sounds
  • The distance sounds come from, and how they change as you get closer to them
  • Small details, like the sound of walking or flicking off a switch
  • Moving objects around with realistic sounds; for example, putting down a metal object should make the right noise at the right time
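
As a small example of the distance and left/right behaviour above, here is how a 3D attention cue might be set up in Unity; the component name and clip are assumptions for illustration only.

```csharp
using UnityEngine;

// A 3D sound cue used to draw attention toward an object. Fully spatialised
// audio pans between the ears as the head turns and fades with distance.
public class AttentionSound : MonoBehaviour
{
    public AudioClip cue;

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = cue;
        source.spatialBlend = 1f;                           // fully 3D sound
        source.rolloffMode = AudioRolloffMode.Logarithmic;  // natural falloff
        source.minDistance = 1f;                            // full volume within 1 m
        source.maxDistance = 15f;                           // attenuation stops past 15 m
        source.loop = true;
        source.Play();
    }
}
```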

Viewpoints and comfort zones to keep the user happy.

Choosing a Headset

With so many VR headsets available, it is important to choose the right one for a project. There are two main types: room-scale VR and mobile VR. We needed a headset that was easy to set up, portable and fairly cheap, so we chose the Google Daydream headset.

We looked at room-scale VR options such as the HTC Vive and the Oculus Rift from Facebook. These were excellent options for in-depth immersive experiences, but for a project like ours they were a bit too expensive and complex to set up and maintain at the required locations.

Most room-scale setups need an expensive, high-powered computer to run them and at least 2 x 1.5 metres of clear area to use them in. This puts constraints on where you can use the setup and makes it much harder to move around and install. On the other hand, room-scale VR is higher quality and gives a richer experience.

The controllers are more tactile, and you get greater movement and haptic feedback from them. You also get a full range of body movement, which doesn’t happen with mobile VR. The room-scale tracking comes from sensors that you set up in the space, as you can see in the image below. Mobile VR does not yet come with, or work with, those sensors.

Mobile VR is a cheaper and easier way to go for some VR projects. If you don’t have access to a high-performance computer or a large space, and don’t need amazing detail and interactions, then this could be the way to go. With mobile VR you build an app package and then launch it in a mobile VR headset; it is as simple as opening an app on your phone and putting the headset on.

These days many phones are capable of running VR; Daydream is compatible with at least eight different phones. To select the mobile headset we tried out different options. We quite liked Google Cardboard as a testing and prototyping tool, and it really comes in handy at the lo-fi concept stage, but it doesn’t quite cut it for complicated projects where you need a controller.

Room-scale VR headsets on the left and mobile headsets on the right.

Design Workshops

To begin the design process, I facilitated design thinking workshops to start brainstorming and get ideas flowing with the teams. Some of the activities I used were from Google Design Sprints and IDEO’s Design Kit; I find these resources have the most user-friendly workshop plans. We looked at:

  • Experience Mapping – Write down who solves the user need best (e.g. finding hazards) and explain why the experience is good and any limitations.
  • How Might We – Write opportunities and pain points onto sticky notes and present them to the group; we then affinity map them into patterns and dot vote.
  • Crazy 8’s – Draw out 8 ideas on a folded piece of paper and present them; the group then dot votes on the best ideas.
  • Solution Sketch – Each person draws one solution concept in more detail and presents it; the group then dot votes on the strongest ideas.

The project had a defined user base, which made parts of the discovery and research phase clearer. The designs had to be accessible and usable for everyone, from a graduate engineer right through to a desalination plant worker in their 60s or beyond.

I ran design and discovery workshops with key stakeholders and created two main personas with specific needs. We referred back to them throughout the project, and they also helped with recruiting people for testing. I created these personas in Sketch.

Some of the main persona points were:

Administrators

  • Ability to submit and upload hazard information and locations
  • A way to see consumer names and task analysis status
  • Ability to create login and passwords for specific users

Consumers

  • Able to use the app with low tech skills and knowledge
  • Language and audio needs to be clear and concise
  • In-app experience needs to be self-explanatory and intuitive

Our two main defined user personas: the tech-savvy assessor and the low-tech consumer.

Design Discovery

To get the initial ideas out I sketched with pen and paper. I still think you can’t beat this quick and easy way to express your ideas no matter what you are designing. As we progressed I showed my ideas to my team to see if what I was thinking was possible to code up and model in Unity.

To show the steps in using the product, I created flows for the two user groups. These were excellent for mapping out the required steps and discussing them with the team. Below you can see how the consumer uses the app while the administrator helps out along the way. I sketched these out and finished them off in Whimsical.

To explore different options I took the user flows and sketches and turned them into simple wireframes in Sketch, to see higher-fidelity concepts of how the app and VR experience could work. We looked at the idea of an app that you launch and use before the VR experience. The benefit of this was that users could onboard and choose locations to learn more about before moving into the VR environment.

These wireframes were actually featured on the InVision blog; see them here:
https://www.invisionapp.com/inside-design/wireframe-examples/

VR & App Flows

Below you can see the detailed app to VR flows and the final app UI designs.
These detailed flows are excellent for showing the developers and the wider team what they need to follow.

App UI

The MVP app designs are shown below; this was the simplest way to get the project launched and a basis for more features to be added at a later stage. The app’s main purpose is to let the user log in and then launch the VR experience with the help of an assistant.

Pre-VR and in-VR flows for the user and their assistant.

App to VR journey wireframes.

App to VR journey wireflows.

VR Design Details

We looked at doing the project with 360 images instead of 3D-modelled environments. This approach could have worked, but we found it difficult to get good-quality images, and we also had issues building the interactive UI components over the images in Unity.

I learned that 360 video or imagery needs to be very high quality to be convincing, and the UI on top cannot offer as much interactivity with a particular object as you get with 3D models. For example, you can grab a handle and turn it, or flick off a light switch, in a convincing way with 3D models but not with static images.
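
To illustrate the kind of object-level interactivity that 3D models allow, here is a minimal Unity-style sketch of a clickable light switch. The component, field names and the pointer wiring are assumptions, not the project's implementation.

```csharp
using UnityEngine;

// A clickable light switch on an imported 3D model: give the mesh a collider,
// then toggle a light and play a realistic click when the user selects it.
[RequireComponent(typeof(Collider))]
public class ToggleSwitch : MonoBehaviour
{
    public Light controlledLight;   // e.g. a lamp in the scanned plant room
    public AudioSource clickSound;  // flick sound played at the moment of use

    // Call this from whatever pointer/raycast system handles controller clicks.
    public void OnClicked()
    {
        controlledLight.enabled = !controlledLight.enabled;
        if (clickSound != null) clickSound.Play();
    }
}
```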

There are a few tools out there that are making 360 image experiences easier to create, and for some projects, such as real estate sales, it makes sense to build them with imagery rather than 3D models.

We needed realistic models of the sites. To get these, we engaged a talented team of 3D modellers to create them. They used a process called photogrammetry, which involved going to a site, scanning it with lasers and cameras, and then creating a model from the captured data.

Once this was done we were able to drop the models into Unity and add interactivity to them. This process worked for us, but if you don’t have a big budget and want to create the models yourself, you will need to learn a 3D program such as Maya, Blender or Cinema 4D, all of which have a bit of a learning curve. If you don’t require exact custom models, you can find plenty of free and paid models at these websites:

TurboSquid
https://www.turbosquid.com/

Sketchfab
https://sketchfab.com/features/download

Unity Asset Store
https://assetstore.unity.com

Prototyping and testing are extremely important steps in VR design. Unlike app and web design, you can make someone very sick and nauseous in VR if you don’t get it right, so we did it early and often.

For rough wireframe prototypes, we used Tilt Brush and literally sketched out the scenes in VR; we then put people into them to see if they understood the environment and what they were trying to do. When we needed to be more realistic we did white boxing, which simply means placing plain white shapes into an area that you can move around in. We organised the research and testing insights in Miro and on sticky notes for presentations.

Sketching 

For UI elements, I began by sketching them out on the iPad in Procreate. This allowed for quick iterations of ideas that I could talk through with the team before moving into the digital versions. Here are some early sketched concepts for the component library and hazard scenes. On the left you can see a scene with many hazards; on the right, a panel that has opened to tell the user they have spotted a hazard.

Modelling in Unity and testing with a HTC Vive.

Sketching ideas in the Procreate iPad app.

VR Interface Design

I mocked up the final interface designs on top of water treatment site imagery using Figma. Inside this program I could demonstrate all the necessary UI components, menus and interactions, and even create clickable prototypes. The only thing missing was the 3D movement; I hope prototyping software like InVision and Figma will soon allow 360 images and 3D models to be used.

In the images below you see:

  • Detailed VR user flow map for the development and product team
  • Mock-ups of the main VR experience screens, including:
  • A person with the Daydream controller in their hand and the pointer
  • The status menu on the bottom left side with time, hazards spotted and help used up
  • On the top left, you have the home and help menus in black, subtle and out of the way
  • On the bottom right you have the exit button
  • Hazard information modals with more info and done buttons
  • Selected hazards
  • Completed hazards

In this scenario, the user has selected a site and entered it. They have the headset on and the controller in their hand; the controller appears as the white pointer on screen. The user then needs to point at and click on hazards they think are dangerous. If they are correct, the hazard is selected and changes colour, and a modal view gives them more information about it.
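
As a rough sketch of how this core interaction could be wired up in Unity (class, tag and field names here are assumptions, not the project's code):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the hazard-spotting interaction: raycast from the controller,
// and when a tagged hazard is clicked, tint it and open its info modal.
public class HazardSelector : MonoBehaviour
{
    public Transform controller;   // the Daydream controller's transform
    public GameObject infoPanel;   // world-space modal with "More info" / "Done"
    public Text infoText;          // label inside the modal
    public Color spottedColour = Color.green;

    void Update()
    {
        if (!Input.GetButtonDown("Fire1")) return;  // simplified input check

        if (Physics.Raycast(controller.position, controller.forward,
                            out RaycastHit hit, 30f) &&
            hit.collider.CompareTag("Hazard"))
        {
            // Mark the hazard as spotted by changing its colour...
            hit.collider.GetComponent<Renderer>().material.color = spottedColour;
            // ...then show the modal with its details.
            infoText.text = hit.collider.name;
            infoPanel.SetActive(true);
        }
    }
}
```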

A detailed VR flow map for the team to see how it links up.

VR Onboarding game main screens.

Main Takeaways

After months of concept testing and iteration, we engaged the developers and game designers and handed over the concepts, prototypes and documentation. They finished the project using Unity with Mono (C#), and the MVP pilot was released internally for further testing and improvements.

The project was a great learning experience, and I see big potential for VR, AR and MR projects. They can really help people learn and perform tasks in a more focused way, without all the distractions that constantly surround us.

Thanks to new technology and software, it will become easier to create working prototypes and detailed designs quickly over the next few years. Graphics are also getting much better, and there will soon be a point where the line between what is real and what is a 3D model starts to blur. Mobile devices will also be VR and AR ready, and a lot of big players are saying it will be the next big thing.

And as Tim Cook said:

“In a few years, we’re not going to be able to imagine our lives without Mixed Reality. It’s that profound a platform.”
