How do you design a software application in 3 days at a conference? Earlier this year my colleague James Murtagh, a Marketing Manager, pitched the idea of ‘Red Gate Live Labs’. His inspiration was drawn from Lean Startup and the Nordstrom Innovation Lab iPad app case study.
“By the time the product is ready to be distributed widely, it will already have established customers.”
Eric Ries, The Lean Startup
In Lean Startup, Eric Ries advocates getting a product idea in front of potential customers as early as possible, to establish if there’s a market for it. Once you’ve validated that customers will pay for your product, you iteratively improve the concept and design based on their feedback. James also read ‘The Next Evolution of Marketing: Marketing with Meaning‘ which promotes engaging customers to create meaningful communications with them.
Source controlling application code is common practice in most development teams. However, source controlling a database can be time consuming and painful. Developers use alternatives instead, such as backups, but these workarounds are less than ideal. Based on conversations with existing customers, we formed a hypothesis:
Oracle Developers and Database Administrators (DBAs) need a better way to get their database schemas into source control
Running a stand at a software conference (or tradeshow) typically involves giving demos, talking to customers and giving away promos. Kscope, the annual Oracle Development Tools User Group (ODTUG) conference, took place in June this year. The conference provided an ideal opportunity to validate our hypothesis using the Live Lab concept. We convinced the company to let us try it out, by designing a Source Control for Oracle tool in 3 days and getting feedback on our designs. Here’s what we did in Texas…
The Live Labs Stand Design
Our internal marketing agency came up with an awesome stand design!
- On the left was the user experience area: for lo-fidelity paper prototyping and gathering customer feedback. You will see later in this post how we used the whiteboard to the left and the rear wall of the stand for post-its from customer feedback sessions.
- On the right was the development area: for hi-fidelity HTML/CSS prototyping. The scrum board was integrated into the design of the stand to track progress with team tasks. This area was also used for product demos of our existing Oracle tools.
The Methodology and Process
We split the 3 days at the conference into 9 mini Agile sprints. The sprints coincided with break times when conference attendees were in the exhibition hall. In between breaks we aimed to review the feedback we had gathered and iterate the designs for the tool.
OK, so I admit we cheated a little bit. I prepared a basic paper prototype of our Windows application before we arrived in Texas, based on feedback from a marketing survey and a couple of telephone conversations with existing customers. This was partly because we had such limited time at Kscope that we wanted to have something to show people and start collecting feedback on our designs as soon as possible.
The prototype was quick and easy to change on the fly, as each interface control was stuck on separately. For the stationery geeks among you, I used Sharpies (fine and super fine), a Letraset warm grey marker, a red Sign pen, scissors and Blu-Tack. I also took printouts of two simple ‘frames’ I created in Balsamiq to emulate desktop application windows and a printed screenshot of a Windows desktop. As you can see, my handwriting is not very legible, but it was legible enough!
I created a rough flow diagram of the workflow to help participants understand the concept more easily.
Over 3 days I ran about 25 feedback sessions with developers, DBAs and team managers. I must stress that the sessions were not usability tests. They were somewhere between an interview, a participatory design session and a usability evaluation.
Before we went to Texas I searched online for write-ups about doing usability testing at a tradeshow or conference. Surprisingly, I found very few references in the literature. But one good reference I did find was a paper titled ‘Extremely Rapid Usability Testing’ by Mark Pawson and Saul Greenberg (2007). Pawson and Greenberg designed their method (ERUT) to:
- Assess the usefulness of the core functionality of a product, i.e., was the product’s unique selling proposition solving a problem that a majority of customers wanted solved?
- Find major usability problems in the core functionality.
My method was similar, but unlike Pawson and Greenberg I didn’t have a sectioned-off area for usability testing and I wasn’t evaluating an existing software application. My method was closer to the Nordstrom case study mentioned above. We were very keen that our process was transparent and engaging. We wanted customers to feel involved in the design and development of the tool.
Before the conference, I didn’t know if we’d have 2 minutes or 20 minutes with participants. As it turned out, many participants stayed for 15-20 minutes, and some stayed for 30 minutes to 1 hour!
Because of the fluid nature of the traffic to the stand, we weren’t able to recruit participants in advance or for a specific schedule of sessions. Instead my colleagues (from Marketing and Project Management) and I initiated conversations with anyone who visited the stand. We asked a few questions about the person’s role, development environment and processes, use of source control etc. If the person was interested in source controlling their databases, we would invite them to participate in a feedback session with me (provided I was available).
A feedback session with some of the speakers from Kscope: Cary Millsap, Dominic Delmolino and Ron Crisco.
Data Collection and Analysis
Unlike normal usability tests, I was unable to record the sessions. The environment was very noisy, so recording audio was tricky. And it felt awkward to ask consent to be recorded if a participant only had 5 minutes to spare. However, when a participant was relaxed and able to spend 20 minutes or so with us, we asked for consent and one of my colleagues used a Flip camera to get overhead shots. We had an action camera positioned above the coffee table, but we found the sound and image quality was fairly poor.
I used various methods for collecting feedback:
- During the sessions I wrote observations on sticky notes (as in the photo above) and stuck them on the rear wall of the stand. I used the KJ Technique (or affinity diagramming technique) to group similarly themed sticky notes together in between feedback sessions. I stuck orange dots on any feedback that I incorporated in design changes, so I could keep track.
- I used an empathy map (hat tip to Dave Gray) to collate snippets I picked up from conversations about what people feel, say and do, what’s important to them, their pain points and what they’ll gain from having the tool. I created some provisional personas before the conference, and have since used the empathy map to update them.
- It’s important to know what development environment our customers use (for Oracle database development) and what source control systems they use, as there are several. I wrote the names of systems on sticky notes and used sticky dots to track which systems participants said they used.
- Although I hadn’t planned for it, I found it extremely useful to sketch out someone’s process with them, let them draw a diagram of their environment or come up with UI design ideas. This was particularly useful to validate my understanding of a participant’s team, their development environment or process. With one participant, Wendy, we were able to identify her pain points together and then look at how our tool could solve her problems.
What was really great about my feedback ‘wall’ was that some people who participated in our feedback sessions returned on days 2 and 3 to see what feedback we had received and whether we’d implemented feedback from their session. I had conversations at the coffee stand with previous participants who were delighted to know I’d updated the UI based on their feedback.
The empathy map became a talking point and developers liked to see what development environments people used. One guy said he used Notepad, and a couple of participants noticed that post-it and said “Oh that poor Notepad guy!”
All the time I was learning not only about how Oracle developers work, but also about their organisation and team culture. This was something I wouldn’t have got from running remote usability tests back at the office.
At the end of each session I asked participants to complete a small feedback form which helped to validate our hypothesis, and collect email addresses for our Beta list.
Along the way we made several tweaks to the paper prototype and I was able to test these out and get immediate feedback, for example, changing the details of a system tray notification on the desktop with a participant.
If you want to read more about what we found out while we were at Kscope, you can read our blog post which we updated at the end of each day.
When I had slack time, I would go through the feedback with David, the developer who was working on an HTML/CSS interactive prototype. I found it hard to provide regular feedback to David as often feedback sessions were back-to-back. We found people skipped conference talks and drifted into the exhibition hall throughout the day and I didn’t want to miss opportunities to talk to potential customers.
We had stand up meetings between sprints with the rest of the team and updated the Kanban board to show progress on tasks. Sometimes our stand-ups were watched by people who arrived at the stand at that point!
The whole team had to wear multiple hats. We all did product demos of our existing Oracle tools, spoke to customers, got feedback on the prototypes and did coffee runs.
The Dev Area
Whilst I was running feedback sessions, David was working away on creating and updating the HTML/CSS version of our Source Control for Oracle prototype. We wanted to show more complex interactions, and the paper prototype could only go so far in showing how the tool would work. We used Twitter’s Bootstrap for the UI components. Bootstrap provides ready-made buttons, icons, tables and much more, meaning that we hardly had to write any custom CSS. We also used Knockout to make it interactive. Knockout’s simple framework made it quick and easy to add features such as the search-as-you-type filter.
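To give a flavour of how little code a feature like that needs, here’s a minimal sketch of the filtering logic behind a search-as-you-type control. This is illustrative only: the table names and function name are made up for the example, and in the real prototype the logic would sit inside Knockout observables (e.g. a `ko.computed` bound to a text input) so the list re-renders on every keystroke.

```javascript
// Hypothetical list of database objects shown in the prototype's table list.
const tables = ["CUSTOMERS", "ORDERS", "ORDER_ITEMS", "PRODUCTS"];

// Return the names containing the search term, case-insensitively.
// An empty search term shows everything.
function filterTables(allTables, searchTerm) {
  const term = searchTerm.trim().toLowerCase();
  if (term === "") return allTables.slice();
  return allTables.filter((name) => name.toLowerCase().includes(term));
}

// With Knockout, searchTerm would be a ko.observable bound to the input,
// and this call would live inside a ko.computed, so typing re-runs it.
console.log(filterTables(tables, "order")); // → ["ORDERS", "ORDER_ITEMS"]
```

Knockout’s data binding means the filtered list updates automatically as the observable changes, with no manual DOM manipulation.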
David’s screen was displayed on the large monitor in the middle of the stand, so passing traffic could see him coding live. This certainly attracted some of the developers to our stand!
We didn’t get to do as much testing with the HTML version of the prototype as we’d hoped. The paper version was still much quicker to update. But by day 3 we were testing it with people and getting feedback.
The Outcome, and What’s Next
Over 92% of the people we spoke to said they would purchase our tool, which meant our hypothesis was well and truly validated! We had some amazing conversations with potential customers and gathered some great feedback on our designs. The project team is now working full speed on development. I’ve got some time to reflect on running a Live Lab and will be posting up some lessons learned here on the UX team blog in the near future.
I’d love to know what you think about our Live Labs case study! Please do get in touch if you have any experiences you can share from doing UX research, design or evaluation at a conference or tradeshow.