During this year, the thing that has stuck with me the most is Apple's iBeacon. Introduced in iOS 7 (2013), iBeacon is a new form of location awareness: using Bluetooth Low Energy (BLE), a device (which could be a phone) can establish a region around an object. I first came across the technology through conferences and blog posts, and the possible uses for it seem endless.
Behind the scenes
iBeacons were introduced in Apple's iOS 7 as part of the Core Location API. Any device with Bluetooth Low Energy can act as an iBeacon itself, but dedicated beacons can also be manufactured relatively cheaply. These beacons can be powered by a small coin-sized battery and last for years.
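To give a sense of what this looks like in code, here is a minimal Swift sketch of defining and monitoring a beacon region with Core Location. The UUID and identifier are made-up placeholders, and a real app would also need to request location authorisation from the user first:

```swift
import CoreLocation

// A beacon region is identified by a UUID (and, optionally, major/minor
// values). This UUID is a made-up placeholder for illustration.
let uuid = UUID(uuidString: "F7826DA6-4FA2-4E98-8024-BC5B71E0893E")!
let region = CLBeaconRegion(proximityUUID: uuid,
                            identifier: "com.example.myBeacon")

// Monitoring notifies the app when the device enters or leaves the region.
// In a real app you would set a delegate and request authorisation first.
let manager = CLLocationManager()
manager.startMonitoring(for: region)
```

This is only a sketch of the region-monitoring side; turning a device into a beacon itself goes through Core Bluetooth instead.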
As mentioned above, there is a massive range of uses for this technology, such as tour guides, shopping, and indoor location tracking. An interesting discovery is that iBeacons work alongside Passbook, which could be useful for checking in at a conference, for example.
The limitation is that you need to have an application downloaded and Bluetooth switched on. This is a big problem, as many people (including myself) do not keep Bluetooth switched on because it drains the battery.
However, while writing this blog post I attended the theweb.is conference in Cardiff, United Kingdom, where Scott Jenson (@scottjenson) discussed iBeacons (and beacons in general) and introduced the concept of a web browser recognising a beacon and opening a website instead of an app, which could solve a lot of accessibility problems. For more information, see the Physical Web project on GitHub.
My plan to experiment with beacon technology is to create an application that uses iBeacons for an event. The app will connect to several iBeacons and then deliver the corresponding content when the user is close to a particular beacon.
The aim is to explore the possibilities of location awareness and how we can improve user experience by understanding a person's environment and reacting to changes within it.
The first prototype was a setup between an iPhone 4s and an iPhone 5c. The 4s acted as the beacon, and using the app Locate, the 5c (the 'receiver') scanned for beacons in the area and, upon finding one, connected to the iPhone 4s beacon. The 5c 'receiver' could then report the distance between the devices, the power being emitted, and the beacon's major and minor values.
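The 'receiver' side of this prototype can be sketched with Core Location's beacon-ranging API. The class name and UUID below are placeholders; ranging is what exposes each beacon's major and minor values, along with a rough distance estimate:

```swift
import CoreLocation

// Sketch of the 'receiver': ranging beacons and reading their details.
// The UUID and identifier are made-up placeholders.
class BeaconReceiver: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "F7826DA6-4FA2-4E98-8024-BC5B71E0893E")!,
        identifier: "com.example.demo")

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(in: region)
    }

    // Called roughly once a second with every beacon visible in the region.
    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        for beacon in beacons {
            // major/minor identify the individual beacon; accuracy is a
            // rough distance estimate in metres.
            print("major: \(beacon.major), minor: \(beacon.minor), " +
                  "accuracy: \(beacon.accuracy)m")
        }
    }
}
```

Ranging only works while the region is nearby, so apps typically combine it with region monitoring to know when to start and stop.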
The second test used the same two phones but this time experimented with proximity (I did order an independent beacon, but unfortunately it was faulty). Core Location reports proximity on three levels: immediate, near, and far (plus 'unknown' when the distance can't be estimated). The second prototype demonstrated that, using these proximity levels, it is possible to deliver different content based on distance.
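The content-per-proximity idea can be sketched as a simple mapping. To keep this snippet self-contained I use a small local enum standing in for Core Location's CLProximity; in a real app you would switch on CLProximity itself, and the content strings here are placeholders:

```swift
// Simplified stand-in for Core Location's CLProximity enum, so this
// snippet runs anywhere without the iOS frameworks.
enum Proximity {
    case immediate, near, far, unknown
}

// Hypothetical content mapping for the event app; the strings below are
// placeholders for whatever the app would actually display.
func content(for proximity: Proximity) -> String {
    switch proximity {
    case .immediate: return "Detail page for this stand"
    case .near:      return "Summary of this stand"
    case .far:       return "Overview of the event area"
    case .unknown:   return "Beacon detected, distance unknown"
    }
}
```

As the receiver's reported proximity changes, the app would simply re-run this mapping and swap the displayed content.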
Every week or so I will post updates on Twitter and Google+, along with the occasional blog post to show the progression. I hope you will follow along with this project, and if you have any ideas or have worked with this technology before, it would be great to hear from you. I'm really excited about this technology and its potential.