Remote User Testing: Airbnb vs Wimdu

In many of my other blog posts I have discussed user research through various methods, from surveys to design feedback.

What I had not yet done was share my experience with performing remote user research. Technically I did, with my presentation ‘Prospective Optimization’ at the Dutch Web Analytics Conference back in March, but not specifically on this website.

In this blog post I would like to share with you a real-life example of working with UserTesting.com. To be more precise, a comparison between Airbnb.com and Wimdu.com.

Why UserTesting.com?

I like to write my articles based on real-life experiences in the world of conversion rate optimization. Although there are many worthy alternatives out there for remote user testing, such as Loop11, TryMyUI and YouEye, my practical experience with UserTesting.com is the most extensive.

In future articles I will definitely take the time to focus on the alternatives. Not only for remote user testing, but for all aspects of user research.

Comparing Airbnb.com with Wimdu.com

Airbnb has certainly established itself as a serious player in the field of travel. By creating a network of hosts who make their homes or rooms available for (short) rental to travellers, it has undoubtedly tapped into a vast source of revenue still available in the online travel agency (OTA) world.

Is Airbnb unique? Some beg to differ, because it was not the first. The same goes for Wimdu. Is Airbnb an improved idea inspired by the likes of Couchsurfing.org? Is Wimdu another spin-off of a spin-off? My opinion is… who cares. It should be clear that all of these companies are doing well, not only financially but also in filling a requirement in today’s travel market. So, who are we to criticize?

Well, enough politics for now…

Let the users speak!

This research was basically performed with two goals in mind.

  • First, to show you how I use UserTesting.com to gain insights from participants.
  • Second, to learn more about this niche in the travel industry. Pure curiosity.

This form of user research, recording participants’ screens and audio while they perform a set of tasks, is great because it allows you to do several things.

  • Listen to the participants speak their minds. Good participants will clearly share the positive and negative thoughts they have while performing your test. In some situations, you can even hear a participant’s anxiety, or pleasure, in performing the set tasks or using certain functionality.
  • You get to see how participants use your website (or whatever it is you are testing). How they move their mouse, what links they click, and how the website performs for them.

Basically, (remote) user testing picks up where analytics and feedback forms leave off. This is true in more ways than one, because what (remote) user testing lacks is an easy way to quantify the findings.

User Research = Manual Labor

Nothing is more laborious in the online world than performing and analyzing user research. It is good for the billable hours if you work freelance, but notoriously time consuming nevertheless. We have gotten so comfortable with all the great analytics tools on the market these days that we forget that real insights take real time and real effort… a lot of both, to be honest.

The same is the case with remote user research. In several cases I have spent up to 1 hour setting up a test, 5 hours analyzing the results (average video length 20 minutes, analyzing takes me 1 hour per video) and around 8 to 16 hours compiling and documenting the results. Time and effort ultimately depends on the test in question.
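For anyone trying to budget this, here is a minimal back-of-the-envelope sketch in Python. The default figures are simply my own averages from above (with 12 hours as a midpoint for the documentation step), so treat them as assumptions to adjust, not as fixed numbers.

```python
# Rough effort estimate for one remote user test, based on my own averages above.
# All defaults are assumptions; adjust them to your own pace and test size.

def estimate_hours(participants: int,
                   setup_hours: float = 1.0,
                   analysis_hours_per_video: float = 1.0,
                   reporting_hours: float = 12.0) -> float:
    """Return the estimated total effort, in hours, for one remote test."""
    return setup_hours + participants * analysis_hours_per_video + reporting_hours

if __name__ == "__main__":
    # 5 participants, as in the Airbnb vs Wimdu comparison below: roughly 18 hours.
    print(f"Estimated effort: {estimate_hours(5):.0f} hours")
```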

Still, it is quicker than alternative methods of performing user research, such as lab testing.

The Comparison Test: Airbnb vs Wimdu

So how did I go about setting up the test? In this specific test 5 participants were invited to perform a series of tasks defined around a single scenario. The tasks (some default ones from UserTesting.com) were meant to let the participants compare both websites and give their opinion on the experience.

FYI… The participants are pulled from a legion of testers who have signed up at UserTesting.com. In return for a small fee, they perform the test for you; this fee is included in the per-test price.

I have conducted 100+ tests and I can happily report that the quality of participants is good. The only barriers now are geographical ones, as the location of the participants is limited to the USA, Canada and the UK. YouEye does support European testing at the moment, in case anyone was wondering.

I formulated the following scenario for the test:

Put yourself in the shoes of someone who wants to compare two sites before booking a Bed & Breakfast accommodation for your upcoming holiday to New York City. You will be visiting two sites to compare offerings.

Within the scenario, participants were asked to perform the following tasks:

  1. Go to www.airbnb.com. Look at the home page for five seconds. Then look away and answer this one question (without peeking!): What do you remember?
  2. You are looking for a place to stay in New York City. You and your partner will arrive on Friday, April 20th, and return home on Monday, April 23rd. Go ahead and look for a suitable place to stay.
  3. When you are happy with the place you would personally want to stay at, go through the motions of booking the accommodation. Make sure to stop just before ‘actually’ booking. Remember, please talk us through your journey. Tell us what you like and what you found distracting.
  4. Repeat steps 1, 2, and 3 for www.wimdu.com
  5. Which site did you prefer? Why?

As you can see, the set of tasks basically repeats itself for both websites being tested.
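If you want to reuse this setup for your own comparison, here is a minimal sketch in Python of how such a test plan could be kept in one structured place before pasting it into the testing tool. The class and field names are my own convention for illustration; UserTesting.com only needs the final text.

```python
# Tool-agnostic sketch of a comparison test plan. Field names are my own
# convention; the testing tool only receives the rendered text.

from dataclasses import dataclass


@dataclass
class ComparisonTestPlan:
    scenario: str
    sites: list[str]
    tasks_per_site: list[str]
    closing_question: str
    participants: int = 5

    def render(self) -> str:
        """Flatten the plan into the task list you paste into the testing tool."""
        lines = [self.scenario, ""]
        for site in self.sites:
            lines.append(f"Site: {site}")
            lines.extend(f"- {task}" for task in self.tasks_per_site)
            lines.append("")
        lines.append(self.closing_question)
        return "\n".join(lines)


plan = ComparisonTestPlan(
    scenario="Compare two sites before booking a B&B for a holiday in New York City.",
    sites=["www.airbnb.com", "www.wimdu.com"],
    tasks_per_site=[
        "Look at the home page for five seconds, look away, and say what you remember.",
        "Search for a place for two, Friday April 20th to Monday April 23rd.",
        "Go through the booking flow and stop just before actually booking.",
    ],
    closing_question="Which site did you prefer, and why?",
)
print(plan.render())
```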

Findings

When viewing the recorded test sessions, I use the ‘Create Clip’ function whenever I find something that might be useful to discuss, and I did the same here. Now, I did test 5 participants, but I will only show you the findings of 1 of them, to avoid taking too much of your time and to keep this article as enjoyable as possible.

Creating a clip helps you find a certain issue after reviewing.

Anyone interested in discussing the findings with me is welcome to do so by contacting me, or by sharing your thoughts in the comments below. I will consider making a compilation video of all the sessions and publishing them, but for now, I will limit the compilation video to just 1 participant, to save some (personal) time.


DISCLAIMER: This list was created by me and is in no way ‘complete’. Some shortcuts were taken for the purpose of publishing this blog post, since I do all of this in my spare time, which, with 3 kids, is not much. I like to consider myself objective and skilled enough to detect possible issues on a website. I agree that because only 5 users were tested it is hard (near impossible) to claim statistical significance, but any finding could be a catalyst to improving the website. Since I work for neither Airbnb nor Wimdu, I cannot say what has already been tested and what has not. It could be that certain functionalities have been designed exactly as the user experienced them (e.g. the Search Button on the homepages).


Airbnb User Research Findings

TIP: Click on the toggle below the video for a (short) summary of the findings in the Airbnb user research session.

Airbnb Findings Summary (short) – from video

  1. Star icon by listings on the Search Results Page: Unclear. In travel, this icon is too easily read as the rating/review score of an accommodation.
  2. Search flow: Location input field plus search button, followed below by the number of nights and persons. Wimdu’s layout, with the search button under all input fields, was appreciated a little more (see screenshot). This is a very subjective matter.
  3. Host video: The homepage displays a video depicting a ‘host’. If you are a first-time visitor and new to the concept, it might be a deterrent. Can I find a place to stay, or is this a site to list a place to stay…
  4. Filters: Extremely low contrast makes the filters hard to find, tucked into the bottom right of the layout. When expanded, the filters become unusable (see screenshot).
  5. Extra costs: A $50 cleaning fee feels steep for a $72/night apartment. It causes hesitation.
  6. Third Parties: When trying to book, signing in through Facebook took a very long time. This is a risky dependency at such a crucial stage in the process.
  7. Many Questions: Potential bookers must answer many questions before being able to book. The questions are justified by stating that ‘the host would like to know more’, but their number seems over the top and some are irrelevant to the purpose of travel. A hotel feels much more hassle-free at such a moment.
  8. Call-to-Action: The location of the ‘continue’ call-to-action button is confusing, since there is a required field just above it. The button seems to be linked to that input field, but is not, causing an error message to pop up (see screenshot).

Wimdu User Research Findings

TIP: Click on the toggle below the video for a (short) summary of the findings in the Wimdu user research session.

Wimdu Findings Summary (short) – from video

  1. Calendar: The start of the week differs for US/EU users, Sunday vs Monday. A small difference, but a big impact.
  2. Listing of Locations: New York is both a state and a city. When searching, prioritize results at the city level instead of the state or region (for EU users). People who search will most likely already have a sense of the city they want to go to.
  3. Sorting by Recommendations: Be careful with sorting by recommendations when the number of recommendations is limited (i.e. fewer than 5 or 10). In my experience, the order in which visitors seek out their desired product is price first, then reviews (recommendations). Default to price, low-to-high (see the sketch after this list).
  4. Photo of Hosts: Placing a photo of the host on a search results page can be confusing, especially for first-time visitors. Who are these people: reviewers, other travellers? That the person portrayed is the host only becomes clear on the detail page. Progressive disclosure should be used here to prevent visitors from getting confused.
  5. Phone Support: The phone number below the Book Now button is very tempting to call. This would cause a channel shift and, in the end, give you a higher CPA.
  6. Booking Process: Booking seemed easier than with Airbnb, with no long list of questions. The trust logos seemed a bit small and few. The Complete Booking button sits too far to the right; move it closer to the left, aligned with the input fields.
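To make the sorting point from item 3 concrete, here is a minimal sketch in Python (my own illustration, not Wimdu’s actual logic) of defaulting to price low-to-high and only letting recommendations influence the order once there are enough of them to be meaningful. The threshold of 5 is an assumption.

```python
# Illustration only: default sort of price (low to high), with recommendation
# count used as a secondary signal only once it is credible. The threshold of
# 5 recommendations is an assumption, not a Wimdu figure.

from typing import TypedDict


class Listing(TypedDict):
    name: str
    price_per_night: float
    recommendations: int


MIN_RECOMMENDATIONS = 5  # below this, recommendation counts are too noisy to rank on


def default_sort(listings: list[Listing]) -> list[Listing]:
    return sorted(
        listings,
        key=lambda l: (
            l["price_per_night"],
            # More recommendations rank higher, but only once the count is credible.
            -l["recommendations"] if l["recommendations"] >= MIN_RECOMMENDATIONS else 0,
        ),
    )


sample: list[Listing] = [
    {"name": "Loft in SoHo", "price_per_night": 120.0, "recommendations": 2},
    {"name": "Room in Brooklyn", "price_per_night": 72.0, "recommendations": 11},
]
print([listing["name"] for listing in default_sort(sample)])
```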

Conclusion

The preferences were evenly split. Wimdu did get some credit for being easier to book, but lacked recognition. The latter is most likely (a very unscientific term) due to the fact that Wimdu is based in Europe (Germany) and Airbnb in the US. Not an excuse, but surely worth considering. I think both websites have their work cut out for them in optimizing the user experience. My advice: just keep testing.

Conducting user research this way is… well, call me a nerd, but it is fun. You are in charge of what you want to test, when, and to a large extent where. The insights gained through this research are powerful because it is the users/visitors themselves who indicate what they like and don’t like. Gut feeling plays only a small part, so the potential positive effect on conversion rate optimization based on this user feedback is huge.

No, there is no contact with the testers to ask additional questions or to steer a test while it is in progress. Some things you have to accept as they are, and take the feedback at a value that keeps your user experience and conversion rate optimization brain cells in motion.

When viewing the tests, I was glad that the importance of trust logos (something I have researched before) was mentioned on several occasions, but I did miss one item that is critical, at least to me. During my own pre-test, so to speak, I noticed that both websites fail to present some very important USPs (unique selling points) at a key stage in the booking process.

The one that caught my eye the most, and which was not visible while booking, was the payment process at Wimdu. So, in closing, please consider this…

At Airbnb, you are charged for your stay as soon as the host accepts your booking request.

Airbnb Payment Method

Airbnb's payment method differs from Wimdu's. Critical, yes or no?

At Wimdu, you are not charged until 24 hours after arriving at your destination.

Wimdu Payment USP

An important Wimdu USP hidden from view.

  1. How does that make you feel?
  2. Would seeing this USP make you feel safe booking?
  3. How would you test to see who else felt the same?

Exactly, (remote) user testing! Please share your experiences with remote user testing!

Need help performing user research? Feel free to contact us.
