Local Search in 2016: Wearables, Beacons, and Machine Learning

Right now, we are nearing a point where the convergence of several related technologies, combined with their improving accessibility (in both infrastructure and cost), means some big disruptions in local search are not far away. People will expect search results far more specific to their current context than ever before... and they'll get them. I've put together a simple and fairly typical story to illustrate some of the technologies (broken down in the section after the story).

A Search Story

Imagine someone who needs to pick up a gift for a friend; she is wandering through London and searches for 'jewellery shop' on her phone as she walks. She gets a bunch of results for stores nearby, but isn't happy with them, so refines her original voice search by simply saying 'show those with 4 stars or more', and gets a subset of her original results a moment later. She is still unsure, so jumps on a rental bike and heads towards Oxford Street. Her phone recognises she is now cycling and updates the results for a wider radius.

She checks her results on her watch at some traffic lights, decides the top result looks great, and taps it. Shortly after she starts pedalling again she feels her watch vibrate on the left side, and turns left when she reaches the next intersection. She follows the haptic feedback from her watch for a couple more turns before parking up her bike near Regent Street when it indicates she has reached her destination.

Macy's is already deploying iBeacons in its stores.

She walks into the store knowing she is looking for bracelets, but isn't sure where they might be. She pulls up Indoor Streetview on her phone, gets an instant map of the store, and sees she needs to head upstairs and to the back. As she goes up in the elevator she sees an ad for the store's own app, so she downloads it over the free in-store wifi and opens it to see what offers might be on.

As she heads out onto the floor she is now too deep inside the building and has lost her GPS signal, but by now the store's app has opened and uses the beacons in the store to guide her to the bracelets with pinpoint accuracy. She browses for a bit and really likes a couple of the bracelets she sees, but can't decide between them and decides to mull it over.

On the way to catch the train home, her phone buzzes to let her know that the electronics store she is nearing has the watch she was looking to buy as a gift for her boyfriend. She'd searched for the watch several times over the last few days, so her phone had set up a passive search.

Later that evening, the store's app (knowing she'd been in the store) throws up a voucher code giving her a discount on their website. She decides to take another look, opens up the site, and eventually makes up her mind and buys a bracelet using her voucher.

The Future is Now

All the technologies in this story already exist, and almost all are already available to customers (you'll need to wait until February to get your Apple Watch with haptic feedback), so nothing in this story should be particularly surprising. The real shift will come from two things:

  • All the technologies involved reaching widespread coverage.
  • Consumers' familiarity with these technologies, and their expectations of them.

Once the technologies are widespread and people have acclimatised, there is a lot of synergy between the various elements, and I believe we'll see a sharp uptick in how dramatically they affect searcher behaviour (which will in turn feed back into how businesses deploy these technologies).

I've discussed previously how we've noticed a trend: people who search on a mobile phone expect that Google/Siri/whatever will use not only their explicit search phrase to give them relevant results, but will also supplement their query with implicit aspects of their context (see this post for more discussion on that). A simple example is someone searching for 'breakfast' as the entire phrase. Not that long ago such a search would've seemed crazy, but now we know that Google will understand we are on a phone and infer what we want from our context (see this video for an extension of this).

There is no reason to believe this trend won't continue, with people coming to rely on further aspects of their context and on other technologies to augment their searches; nor is there any reason to suspect that the proliferation of the technologies in this story won't continue in the same fashion it has. With that in mind, let's take a look at what happened in the story.

Breakdown of the Technologies

So what happened during this search, what technologies were involved? Let's break it down.


  1. Search for 'jewellery shop' without any intent words ('buy', 'find', 'nearest'), or location ('cambridge'), or qualifiers ('luxury', 'cheap'). The searcher expected that both intent and location would be implicitly understood from her context.
  2. Search refinement: a second search ('show those with 4 stars or more') which builds on the first, rather than being a completely new query. Google calls this 'conversational search', and with ongoing improvements in Machine Learning and in the data Google has access to, it seems certain to get bigger and bigger (especially as wearable devices take off and users acclimatise to the concept). The Machine Learning powered Hummingbird update will drive improvements here further still, making it a lot more powerful.
  3. Mode of transport as context. Android already has an activity recognition API which will recognise whether the user is on foot, in a car or on a bicycle. When we are doing a local search, it makes far more sense to consider 'where can the user get to in 5 minutes' rather than 'what is within 500m of the user', and this activity recognition API allows that to happen (there is a rough sketch of the idea just after this list block). Furthermore, as per above, users acclimatise to their devices inferring their context, and come to rely on it in their searches.
  4. Non-visual feedback. In this example I've highlighted the haptic feedback mechanism of the new Apple Watch, but the same result might have come about with audio direction guidance for someone wearing Google Glass, or who has their phone in their pocket with headphones in. This is important because these non-interruptive forms of navigation are safer and easier to use, removing the friction that stops people from using them more heavily.
  5. Indoor Streetview. Google are already doing this (see here for a list of over 10,000 buildings worldwide already mapped), and as coverage and awareness improve we'll see people begin to use it more and more. We will see people who are in a building do a Google search for another part of the building! This will work fantastically for big buildings like airports, museums, shopping centres etc., but the accuracy of GPS will make it difficult to be accurate enough for ultra-fine navigation, or what people are calling hyper-local. Not to worry though, as here comes...
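
To make point 3 above concrete, here is a minimal Java sketch of how a search app might react to Android's activity recognition API. The class names come from Google Play services; the confidence threshold, the radius values, and the refreshLocalResults() hook are my own illustrative assumptions, not part of any real product.

```java
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

import com.google.android.gms.location.ActivityRecognitionResult;
import com.google.android.gms.location.DetectedActivity;

// Receives activity updates requested via the Play services activity
// recognition API, and widens the local-search radius to reflect how far
// the user could travel in about 5 minutes.
public class TransportModeReceiver extends BroadcastReceiver {

    @Override
    public void onReceive(Context context, Intent intent) {
        if (!ActivityRecognitionResult.hasResult(intent)) {
            return;
        }
        ActivityRecognitionResult result = ActivityRecognitionResult.extractResult(intent);
        DetectedActivity activity = result.getMostProbableActivity();

        // Ignore low-confidence guesses so the radius doesn't flap around.
        if (activity.getConfidence() < 75) {
            return;
        }

        int radiusMetres;
        switch (activity.getType()) {
            case DetectedActivity.ON_BICYCLE:
                radiusMetres = 2000; // roughly 5 minutes of cycling (assumed)
                break;
            case DetectedActivity.IN_VEHICLE:
                radiusMetres = 5000; // roughly 5 minutes of driving (assumed)
                break;
            default:
                radiusMetres = 500;  // on foot, or standing still
        }
        refreshLocalResults(radiusMetres);
    }

    // Hypothetical hook into the app's search layer; not a real API.
    private void refreshLocalResults(int radiusMetres) {
        // Re-run the last query with the new radius...
    }
}
```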

    Some of the existing beacons in Europe via WikiBeacon.

  6. Beacons. Built on Bluetooth Low Energy technology (a sibling of old school Bluetooth, and quite different from it), beacons have been around for a while but hadn't seen much interest until Apple announced their iBeacon technology in mid-2013. There are millions and millions of devices already out there which can detect these beacons, and more and more companies and organisations are starting to adopt them. They allow someone's position to be determined to within a metre or so, enabling hyper-local advertising and mapping (there is a sketch of the distance estimate below).
  7. Passive Search. Technologies like Google Now are already doing this with varying degrees of success. Sometimes it misfires, but sometimes it is astonishingly insightful in what it presents. We should expect to see this get better and better, drawing from more sources and interpreting our needs more accurately and more often.
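
For a feel of how beacons get to metre-level accuracy: each beacon broadcasts a calibrated 'measured power' (its expected signal strength at one metre), and the phone compares that against the strength it actually receives. Below is a minimal Java sketch of the standard log-distance path-loss estimate; the path-loss exponent is an assumed value for an indoor environment, and real deployments smooth the signal over many readings because a single one is noisy.

```java
// Sketch: estimating distance to a beacon from received signal strength.
// txPower is the beacon's calibrated RSSI at 1 metre (broadcast in the
// iBeacon frame); the path-loss exponent is ~2.0 in free space and higher
// indoors, so 2.5 here is an assumption.
public class BeaconDistance {

    static double estimateDistanceMetres(int rssi, int txPower) {
        double pathLossExponent = 2.5; // assumed indoor environment
        return Math.pow(10.0, (txPower - rssi) / (10.0 * pathLossExponent));
    }

    public static void main(String[] args) {
        // A beacon calibrated at -59dBm at 1m, currently observed at -75dBm:
        System.out.printf("~%.1f metres%n", estimateDistanceMetres(-75, -59));
    }
}
```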

    My vision when we started Google 15 years ago was that eventually you wouldn't have to have a search query at all – the information would just come to you as you needed it.

    Sergey Brin, Google
  8. Universal Analytics. Google's upgraded analytics platform introduces a bunch of new features, including Cross Device Tracking, which allows you to track a user's interactions with your website across several sessions which may have occurred on different devices. This can be combined with another new feature, the Measurement Protocol, which allows you to record into Google Analytics the non-web interactions your visitors may have had (sketched below). In my story, the unique voucher the app sent to the phone allowed the store to tie together the web session where our hero finally bought a bracelet with her earlier visit into the store in person. This is a potential step change in understanding user behaviour and improving attribution tracking.

Cross device tracking illustration, via Craig Bradford
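
Here's a rough sketch of what the store's side of point 8 might look like: a server-side hit recording the in-store voucher against the same User ID as her web sessions. The endpoint and parameter names (v, tid, cid, uid, t, ec, ea) come from Google's Measurement Protocol documentation; the tracking ID, client ID, user ID, and event names are placeholders.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

// Sketch: recording an offline, in-store interaction in Google Analytics
// via the Universal Analytics Measurement Protocol.
public class OfflineHit {

    public static void main(String[] args) throws Exception {
        String payload = "v=1"                    // protocol version
                + "&tid=UA-XXXXX-Y"               // UA property ID (placeholder)
                + "&cid=35009a79-1a05-49d7-b876-2b884d0f825b" // anonymous client ID
                + "&uid=customer-123"             // same User ID as her web sessions
                + "&t=event"                      // hit type
                + "&ec=in-store"                  // event category (placeholder)
                + "&ea=voucher-issued";           // event action (placeholder)

        URL url = new URL("https://www.google-analytics.com/collect");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(payload.getBytes("UTF-8"));
        }
        conn.getResponseCode(); // fire-and-forget; GA returns 200 either way
    }
}
```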

Wrap Up

We've seen the journey of a user buying a product via the internet, and nothing about the story was out of the ordinary or unexpected. However, if we look closely we can see that she ended up purchasing a product via a website, yet never performed a web-based search. She used the voice search built into her phone, various apps, and a visit to a physical store, conducting and refining various searches along the way.

Hordes of SEO consultants around the world will likely despair that all the links they've built up for their clients are no longer a focus in this model. CRO analysts will be miserable that the work they've done split-testing and iterating on improvements to their clients' websites has been relegated further back in the purchase funnel than they are used to. Obviously, this specific scenario won't often be the case, but variants of it will become more and more common.

Customer reviews have long been an essential element of local search, and there is no reason to think that is going to change. On the contrary, as scenarios like the one I've outlined above become more and more common, the relative importance of reviews in the whole search ecosystem is going to continue to grow (further supported by hints from Google about a possible "Merchant Quality update").

General brand signals will also get a bigger slice of the pie - these will include links, as well as social signals, website activity and on-page trust metrics (content quality and freshness), customer reviews and lots of user signals.

Going even further, we believe that Google treats instances where users bounce back from a website to a search result as a negative signal for ranking that site in the future (it seems the user didn't find what they were looking for). We could imagine a physical analogue of this scenario, where Google sees users leaving a store shortly after a search and going into a similar store a moment later - might Google start directing people to that second store right away in future?

I don't know where all of this is going, but I do think that the changing dynamics of how people leverage local search are going to have a big impact on how businesses cater to those users, and will also affect users' non-local search behaviour. It is something that marketers and businesses should be keeping an eye on.

Side Story: Passive Searches

In our little adventure story, we saw a second purchase made with no specific query; it was prompted instead by a passive, context-based search done without the searcher explicitly initiating it. It is easy to imagine that Google Now is not far away from this, and it is obviously something Google are interested in (see also the quote above from Sergey Brin).

Now you can imagine there are some contextual cues where you don't even have to type the first letter to fill out what most people in similar contexts do.

Amit Singhal, Head of Search - Google

Furthermore, last year Google bought Behavio, a startup working on using the various sensors in smartphones to analyse trends and patterns in where you go, when you go there, and who with. This can all be used, along with other data on your phone (calendar, recent calls, recent searches), to predict what you'll be doing next.

What seems obvious is that Google are getting far more data about user behaviour, the sensors that help understand a user's context are steadily improving, and machine learning capability is improving alongside them. I don't think we've fully envisioned the sorts of behaviours this will lead to just yet.

Come chat with me!

If you're coming to our SearchLove London conference in October then I'll be hosting a topic table discussing these technologies and others that might disrupt Search over the next couple of years, and would love to connect directly with people to swap ideas and thoughts.
