This Bluetooth 4.0 (BLE) bike lock removes the need for a physical key, bringing greater convenience and safety for you and your bike. Once paired, the lock remembers your smartphone: when you approach, it automatically unlocks. No need to find your keys; just pull the lock apart. The battery currently lasts three weeks on average, and a notification is sent when the power is running low. There's also a backup key, just in case your phone runs out of juice. The lock also features added security: if an attempt is made to break into it, an alarm will sound (I'm currently looking into a GSM module for remote notifications).
The first prototype has been built and works. It used an Arduino Pro Mini alongside the BLE Mini from Red Bear Lab (see images below). A second prototype is under development: it miniaturises the circuitry for greater efficiency by using a single BLE112 (TI CC2540 SoC) chip for both the microcontroller and Bluetooth 4.0, along with a lower-power motor.
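The proximity auto-unlock described above could be sketched as a small decision loop on the lock's chip. This is an illustrative model only, assuming the lock reads the paired phone's RSSI (signal strength, in dBm) on each scan; the thresholds and hysteresis values are my assumptions, not the product's calibration.

```cpp
#include <cassert>

// Hypothetical sketch of the proximity auto-unlock decision. Unlock when the
// phone's signal is strong (it is close by); re-lock only once the signal has
// dropped well below that, so the lock doesn't chatter at the boundary.
class ProximityUnlock {
public:
    // Feed in the latest RSSI reading; returns whether the lock is open.
    bool update(int rssiDbm) {
        if (!unlocked_ && rssiDbm > UNLOCK_RSSI) unlocked_ = true;
        else if (unlocked_ && rssiDbm < LOCK_RSSI) unlocked_ = false;
        return unlocked_;
    }
private:
    static const int UNLOCK_RSSI = -55; // roughly arm's reach (assumed)
    static const int LOCK_RSSI   = -75; // roughly walked away (assumed)
    bool unlocked_ = false;
};
```

The gap between the two thresholds is deliberate: a single cut-off would make the lock flip open and shut as the signal fluctuates at the edge of range.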
Some additional exciting features are afoot for future revisions of the lock too.
Project developed for rehabstudio for Google+ and Converse
// Coded and built by me | part-creative input
An Arduino microcontroller, pressure pads and wireless transmitters were inserted inside the soles to capture the way people move and translate those movements into a unique pattern on-screen. Put simply, users were able to ‘paint’ with their sneakers to create the ultimate customised Chuck Taylors. This won second place at the Creative Social 'Hack-a-Chuck' competition at the Cannes Lions festival, 2013.
Lit is an interactive, multi-sensory exhibit that explores the generative relationship between motion, light and audio. Entering the experience, you are presented with an empty, dark space. As you start walking through, the installation reacts to your movement in a fun and playful manner.
This project was inspired by exploring reactive architecture through creative technology, venturing away from the dystopian, bleak concrete walls that invade our urban spaces.
Movements are captured via IR sensors mounted along a wall. These pick up movement and pass it to an Arduino, which then controls various RGB LEDs while communicating with a sound shield to play a particular sound depending on where you are.
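The sensor-to-output mapping above could be sketched as a simple lookup: each IR sensor along the wall owns an RGB colour and a sound clip. The hue sweep and one-clip-per-sensor mapping here are illustrative assumptions, not the installation's actual palette.

```cpp
#include <cassert>

// Illustrative mapping from a triggered IR sensor to an LED colour and a
// sound-shield clip index. Sensors near the entrance glow red, shifting
// towards blue at the far end of the wall.
struct Output { unsigned char r, g, b; int soundClip; };

Output reactTo(int sensorIndex, int sensorCount) {
    unsigned char r = (unsigned char)(255 * (sensorCount - 1 - sensorIndex)
                                      / (sensorCount - 1));
    unsigned char b = (unsigned char)(255 * sensorIndex / (sensorCount - 1));
    return Output{r, 0, b, sensorIndex}; // one sound clip per sensor
}
```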
Produced for Digital Shoreditch 2013 - Shoreditch Town Hall, London.
Keeping hydrated seems simple enough in theory. In reality, people don’t drink enough. HydroBolt is a smart, Internet of Things water bottle that analyses how much you drink and keeps you hydrated.
The hardware comprises a water-flow sensor, a BLE112 Bluetooth SoC (originally developed on an Arduino Pro Mini) to compute and transfer the data, and an RGB LED for a glanceable UI (a traffic-light system: green = good, red = bad). HydroBolt also comes with an app to help people learn about and understand their habits, exploring in more detail the information gathered wirelessly from the water bottle.
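The traffic-light UI above could be sketched as below. The pulses-per-millilitre figure and the daily target are assumed values for illustration (a typical hall-effect flow sensor emits a pulse train proportional to volume), not HydroBolt's actual calibration.

```cpp
#include <cassert>

// Illustrative sketch of a glanceable hydration traffic light: compare what
// you have drunk so far against where you should be at this point in the day.
enum Light { GREEN, AMBER, RED };

const float ML_PER_PULSE    = 2.25f;   // assumed flow-sensor calibration
const float DAILY_TARGET_ML = 2000.0f; // assumed daily intake target

Light hydrationLight(long pulsesToday, float fractionOfDayElapsed) {
    float drunkMl  = pulsesToday * ML_PER_PULSE;
    float expected = DAILY_TARGET_ML * fractionOfDayElapsed;
    if (drunkMl >= expected)        return GREEN; // on track
    if (drunkMl >= 0.5f * expected) return AMBER; // falling behind
    return RED;                                   // drink up
}
```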
Currently in working prototype stage (July 2013 - see below).
Gigalinc is an exhibition that allows participants to interactively explore large-scale panoramic images. High-definition gigapixel images (1,000 times the information captured by a 1-megapixel digital camera) are digitally projected onto a large cinema screen, which viewers navigate interactively via Kinect-based hand gestures, zooming in and out of areas of particular interest. A surround-sound audio accompaniment adds to the multi-sensory experience.
Gigalinc explores the world of immersive photography and the possibilities it offers for changing the way we look at and use photographic images. It does this through digital technologies that allow the viewer to ‘step inside’ and move around large panoramic images, presented in astonishing levels of detail. As the perspective changes, the viewer feels as if he or she is actually ‘immersed’ in the scene. Participants also have the opportunity to print out their own image of whatever they found interesting or appealing and add it to a pinboard, creating a collage of interest.
The technology consisted of several components. Hand tracking uses an Xbox Kinect to input gesture-based commands, running on open-source software. An Arduino lets participants select their image simply by tapping a touch-pad on the floor, and also handles printing: when a large red push button is pressed, the computer captures the image on screen and prints it out instantaneously.
GigaLinc has been shown internationally, including at TEDGlobal 2013 (click for images).
eSleeper is a 21st-century cat basket, an ideal resting location for any cat. When the feline decides it’s nap time, eSleeper’s automated lighting control turns on a relaxing wave of colour inside an eMac, while greeting the cat with the iconic Macintosh start-up chime (keeping the legacy of the eMac). When the cat walks out, eSleeper turns off the lights and tweets to @eSleeper1, displaying various phrases along with how long the cat has occupied eSleeper.
My cat loves sleeping. His old basket suffered from overuse and had to be discarded. Instead of buying him a replacement, I decided to bring cat baskets into the digital age and build eSleeper. He now loves eSleeper (and is sleeping even better).
An Ethernet Arduino is used to control eSleeper. Data is sent to the Arduino from an infra-red beam attached to the inside of the eMac. When the beam breaks, the Arduino turns on the RGB LED, talks to a sound shield to play the Macintosh startup chime and records how long the cat has been inside. When the beam is broken again, the Arduino turns off the LED and tweets a random phrase linked to the time spent inside the eSleeper.
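The beam-break logic above amounts to a two-state machine: the first break means the cat has walked in, the next means it has left. A minimal model is sketched below; the action strings stand in for the real sketch's LED, sound-shield and Twitter calls, and are illustrative only.

```cpp
#include <cassert>
#include <string>

// Simplified model of eSleeper's infra-red beam-break handling. Each break
// toggles occupancy; on exit, the stay duration feeds into the tweet.
class ESleeper {
public:
    // Returns a description of the action taken, for illustration.
    std::string onBeamBreak(unsigned long nowMs) {
        if (!occupied_) {
            occupied_ = true;
            entryMs_ = nowMs;
            return "lights on + startup chime";
        }
        occupied_ = false;
        unsigned long minutes = (nowMs - entryMs_) / 60000UL;
        return "lights off + tweet: slept " + std::to_string(minutes) + " min";
    }
private:
    bool occupied_ = false;
    unsigned long entryMs_ = 0;
};
```

One practical caveat a real sketch has to handle: a cat pausing in the doorway can break the beam several times in quick succession, so some debounce delay around each transition is usually needed.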
2012 was not only the year of the Olympics, but also the launch of the first ever ‘Digilympics’, a Twitter-powered race for sporting success where you determine the outcome. Four Lego athletes move down a physical racetrack as fans Tweet their team to move them further towards the finish line.
The competition was open to anyone on the web, allowing participants to Tweet their team to success using four unique Twitter handles. Tweets in support of a particular account/country moved that country’s contestant physically forward along a running track.
A Processing sketch scans the Twitter account of each country for any ‘@’ replies. Any new replies are passed to an Arduino with the Adafruit motor shield attached, one motor per team. The motors spin and advance each team forward in response to tweets received. An infra-red beam detects which country’s contestant reaches the finish line first and signals a reset for the next race.
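The tweets-to-movement step above could be modelled as below. The steps-per-tweet and track-length values are illustrative assumptions, not the installation's actual gearing.

```cpp
#include <cassert>

// Minimal model of the Digilympics race logic: each new '@' reply found by
// the polling sketch becomes a fixed number of motor steps for that team.
const int STEPS_PER_TWEET    = 20;  // assumed gearing per tweet
const int TRACK_LENGTH_STEPS = 400; // assumed track length in motor steps

struct Team {
    int position = 0; // motor steps travelled so far

    // Advance by the tweets received this poll; returns true on crossing
    // the finish line (in hardware, detected by the infra-red beam).
    bool advance(int newTweets) {
        position += newTweets * STEPS_PER_TWEET;
        return position >= TRACK_LENGTH_STEPS;
    }
};
```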
View the website here.
Project part of the Digital Sizzle 6 Hackathon – Group project – Idea, visual look and part coded by me.
Gaia is a mixed-media installation that explores people’s growing interconnectedness in the urban environment. Using 3D data visualisation and a soundscape, Gaia maps London’s feelings and movements (through social media and TfL data) to create an experience that looks at how the increasing amount of information we provide about our day-to-day lives and emotions can create feedback loops that bring us closer together and help create better urban environments.
From a technical perspective, we took a wide range of data from TfL, including bus stop and Underground station locations, and put it into a 3D mesh, interconnecting routes and journeys. On top of this, we layer real-time data, updated every 60 seconds, on the position of every bus and Tube train in London.
Exhibited in Whitechapel, London, and at Wired 2012.
Visualised using Processing.
A few weeks before I participated in AngelHack 2012, a pub conversation arose regarding the theft of a friend’s bag and the growing problem such crime presents in society. This led me to think, 'How can I creatively use technology to prevent the casual theft of valued bags and luggage?' Enter SafeCase.
SafeCase constantly monitors movement via an accelerometer connected to an ATmega324P (originally an Arduino). Owners of the bag wear an RFID wristband to identify themselves. If the bag is moved by anyone other than an identified user, an alarm triggers and LEDs flash, notifying the owner that their bag is in the wrong hands.
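The core decision above boils down to one condition: movement without an authorised wristband in range. A minimal sketch, with an assumed movement threshold:

```cpp
#include <cassert>

// Illustrative sketch of SafeCase's alarm decision. The accelerometer
// reports a change in acceleration magnitude (in g), and the RFID reader
// reports whether the owner's wristband is in range.
const float MOVE_THRESHOLD_G = 0.3f; // assumed: what counts as "moved"

// Trigger the alarm (and flashing LEDs) only when the bag moves while no
// authorised wristband is present.
bool shouldAlarm(float accelDeltaG, bool ownerTagInRange) {
    return accelDeltaG > MOVE_THRESHOLD_G && !ownerTagInRange;
}
```

In practice the threshold matters: too low and the bag alarms when brushed past, too high and a careful thief can carry it away slowly.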
The first prototype was made at the AngelHack Hackathon and Accelerator and received a 'special mention'.
September's issue of PC World Magazine featured my home automation system. It was published in print (745,000 circulation, mainly throughout the USA) and gained over 15,000 views online.
Read the full article online at PCWorld.com. Please note: that section has since been rebranded as TechHive.
The photo booth was designed as an accessible way of capturing family and friends of the bride and groom in a fun, light-hearted and quirky manner. People could see their reflections displayed on an LCD TV inside the booth, while a large red button was the trigger to capture each image. Props were also available for the guests to add a sense of humour into each image.
350 photos were taken on the night ranging from the mundane to the extraordinarily bizarre.
Written in Processing and Arduino.
The Learning Department at the British Museum had a special exhibit for the Chinese New Year and wanted children to try to draw the characters from the Chinese zodiac in an unconventional way.
I collaborated with Creative Technologist Becky Stewart, who helped develop the Kinect/Processing side of the exhibit. Children were able to select their Chinese zodiac symbol via RFID cards placed upon a hot spot. This changed the background image and allowed the children to 'draw' using their hands, with an Xbox Kinect as the interface.
My work most often focuses on designing experiences and inventing products that bridge the digital/physical divide. This usually sits between three cornerstones: Creative Technology, Interaction Design, and the technical ability to make and build (electronics, hardware, 3D prototyping and coding).
All work is my own (including the coding, images, thoughts and design), unless otherwise stated. Currently based in London, UK.
For more information, please visit my LinkedIn.