New York-based visual and performance artist Lia Chavez approached us for a one-of-a-kind collaboration. Lia works with multimedia to explore the behaviour of light in meditative states; rather than a still, calm and quiet place, she has discovered that deep meditation can in fact trigger visions of stroboscopic light and cataclysmic storm systems. She wanted to share this visceral experience with others, so she came to us to help create her next piece: an 8-hour live performance in Dundee, Scotland.
I created custom code that reads Lia's brainwaves and, via Bluetooth, transmits the signal to a strobe light. The signal's frequency and strength adapt to her mental state, so that when Lia is in a deep state of meditation the strobe flashes brightly and intensely, and less so when she is in intermediate states.
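The mapping can be sketched like this (a hypothetical Python sketch, not the actual performance code; the ranges and function name are illustrative):

```python
def strobe_settings(meditation_depth):
    """Map a meditation depth in [0.0, 1.0] to (frequency_hz, brightness).

    Deeper meditation -> faster, brighter flashes; intermediate states
    produce a gentler strobe. The ranges here are illustrative only.
    """
    depth = max(0.0, min(1.0, meditation_depth))  # clamp to valid range
    frequency_hz = 1.0 + depth * 14.0   # 1 Hz (shallow) up to 15 Hz (deep)
    brightness = 0.2 + depth * 0.8      # 20% floor so the strobe never fully dims
    return frequency_hz, brightness
```

In the real piece, equivalent logic ran continuously against the live EEG signal and drove the strobe over Bluetooth.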
Lia's final performance turned out to be the perfect meeting of technology, science and art that inspired us all to collaborate in the first place.
Project developed at rehabstudio. Code and hardware setup developed by myself.
Visit the website here.
Flood Beacon is an internet-connected device that broadcasts real-time flood level data and alerts.
The risk of flooding is increasing. Flood damage losses are expected to increase fourfold by 2050, costing $1 trillion globally. Unfortunately, those at risk currently have to rely on predicted flood information rather than real-time data. Flood Beacon aims to address this failing by providing flood data and warnings in real time.
The Flood Beacon is engineered to broadcast live data such as the current water level, GPS location and any sudden accelerations the beacon may experience. The data is sent to the cloud for processing and distributed via an open API, managed by Xively. Alerts can then be sent to our smartphone application (a push notification when someone is in immediate danger), to monitoring stations, or to anything else people wish to develop.
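The alerting idea can be illustrated with a small sketch (the thresholds below are assumptions for illustration, not the beacon's real values):

```python
def alert_level(water_level_cm, accel_g):
    """Classify one beacon reading: 'danger' triggers a push notification,
    'warning' flags monitoring stations, 'ok' is simply logged.

    Thresholds are illustrative: a high water level or a sudden large
    acceleration (the beacon being swept away) both count as danger.
    """
    if water_level_cm > 100 or accel_g > 2.0:
        return "danger"
    if water_level_cm > 50:
        return "warning"
    return "ok"
```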
The prototype hardware inside includes a GSM chip for communication, a microprocessor, an accelerometer, ultrasonic sensors, a rechargeable LiPo battery and a 3D-printed enclosure.
Visit the website here.
BitTag is a physical price tag that creates a seamless integration between Bitcoin (or any other digital cryptocurrency) transactions and "bricks and mortar" retail stores.
Once set up, the BitTag hardware displays information such as the product name, the price in USD (or another local currency) and the real-time price in Bitcoin. If the Bitcoin value suddenly fluctuates, the price on the BitTag will instantly reflect this. The customer therefore always knows how much the item is worth, whilst the retailer does not incur any financial loss if the value of Bitcoin changes.
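The core conversion is simple; a sketch of the idea (illustrative, not BitTag's actual code):

```python
def btc_price(usd_price, usd_per_btc):
    """Convert a fixed USD retail price into BTC at the latest exchange
    rate, so the displayed BTC amount tracks market fluctuations while
    the retailer's USD price stays constant. Rounded to 8 decimal
    places (1 satoshi, Bitcoin's smallest unit).
    """
    return round(usd_price / usd_per_btc, 8)
```

When the exchange rate moves, re-running this with the new rate gives the updated display price; the USD figure never changes, which is what protects the retailer.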
Each BitTag is managed with the assistance of an accompanying iPad app. The app can be used either to set up the Tag or to process the Bitcoin transaction by simply placing the BitTag on the screen. Alternatively, the transaction can be activated by a simple 'shake' of the BitTag hardware, displaying a Bitcoin QR code on the OLED display to be scanned by the user's smartphone using standard apps like BlockChain.
The prototype hardware inside includes a Bluetooth Low Energy (BLE 4.0) chip for communication to and from the app, a microprocessor (similar to Arduino), an OLED display, an accelerometer, a rechargeable LiPo battery and a 3D-printed enclosure.
Reported in the PSFK Future of Retail Report 2013 - read the article here.
Shop windows don't have to be a passive experience. As part of PSFK's Future of Retail Report, I, along with rehabstudio, designed, developed and prototyped an interactive display that adapts to whoever stands in front of it.
The technology identifies shoppers using Bluetooth low energy (BLE) and instantly reacts to a set of personal data stored on their mobile device, such as shopping habits and preferences. Shoppers can swipe through personalised content, place items in a digital shopping cart, and purchase straight from the physical display.
This concept could be deployed outside the context of a store environment, be it on street billboards, bus stop signs or in car showrooms.
Project developed at rehabstudio. Half of the creative input.
Project developed by me for rehabstudio, for Google+ and Converse.
Coded and hardware designed and made by myself; part-creative input.
Converse asked us to hack their sneakers, so I, along with rehabstudio, set out to create the ultimate personalised pair of Chuck Taylors.
We inserted a two-axis accelerometer, pressure sensors, an Arduino microcontroller and batteries inside each shoe, so that they would track the wearer's moves and send the corresponding data wirelessly to a computer, which rendered colourful graphics projected onto a screen.
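The sensor-to-graphics mapping can be sketched like so (a hypothetical Python sketch; the real pipeline was Arduino plus a rendering computer, and these ranges are illustrative):

```python
def motion_to_colour(accel_x, accel_y, pressure):
    """Map two-axis acceleration (in g) and sole pressure (0-1023, a
    typical 10-bit ADC range) to an RGB colour: livelier movement on
    each axis drives a channel, harder steps drive the third.
    """
    r = min(255, int(abs(accel_x) * 255))       # side-to-side motion
    g = min(255, int(abs(accel_y) * 255))       # forward motion
    b = min(255, int(pressure / 1023 * 255))    # step pressure
    return (r, g, b)
```

Because every wearer moves differently, the stream of colours (and hence the printed design) is unique to them.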
These graphics could then be printed onto a new pair of Chucks, for a truly personalised design that's entirely unique to how the wearer moves. Our hacked sneakers premiered at Cannes in 2013, where they picked up the silver prize in their category.
This Bluetooth Low Energy 4.0 (BLE) bike lock removes the need for a physical key and enables greater convenience and safety for you and your bike. Once digitally paired, the lock remembers your smartphone. When you approach, it'll automatically unlock: no need to find your keys, just pull it apart. There's also a physical backup key, just in case your phone runs out of juice. The lock also features added security: if an attempt is made to break into it, an alarm will sound.
The first physical hardware prototype has been made and coded using an Arduino Mini Pro alongside the BLE Mini from Red Bear Lab (see images below). A second prototype is under development which miniaturises the circuitry for greater efficiency by using the BLE112 (TI CC2450 SoC) chip for both the microcontroller and Bluetooth 4.0, along with a lower-power motor and more efficient code.
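The proximity-unlock decision can be sketched as follows (values and names are assumptions; the real lock runs equivalent logic on the microcontroller). RSSI is the received signal strength in dBm, and a closer phone gives a value nearer zero:

```python
def should_unlock(paired, rssi_dbm, unlock_threshold_dbm=-60):
    """Unlock only when a previously paired phone is close enough."""
    return paired and rssi_dbm >= unlock_threshold_dbm

def should_alarm(locked, tamper_detected):
    """Sound the alarm on a break-in attempt while the lock is engaged."""
    return locked and tamper_detected
```

An unpaired phone never unlocks the lock, however close it is, and the alarm only arms while the lock is engaged.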
Developed before the likes of BitLock and Lock8 hit Kickstarter.
Lit is a physical, interactive and multi-sensory exhibit that explores the generative relationship between motion, light and audio. Entering the experience, you are presented with an empty, dark space. As you start walking through, the installation reacts to your movement in a creative, fun and playful manner.
This project was inspired by exploring reactive architecture through creative technology, venturing away from the dystopian, bleak concrete walls that invade our urban spaces.
On the hardware side, movement is captured via IR sensors mounted along a wall. These readings are sent to an Arduino, which controls the RGB LEDs while communicating with a sound shield to play a particular sound depending on where you are.
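A minimal sketch of that control logic (the sensor count and mappings are assumptions): each IR sensor position along the wall maps to an LED hue and a sound clip for that zone.

```python
NUM_SENSORS = 8  # assumed number of IR sensors along the wall

def react(triggered_sensor):
    """Return the (hue, sound_index) for a triggered sensor position.

    Hue sweeps 0-360 degrees along the wall, so walking the length of
    the installation walks through the colour wheel; each zone also has
    its own sound clip.
    """
    if not 0 <= triggered_sensor < NUM_SENSORS:
        raise ValueError("no such sensor")
    hue = triggered_sensor * (360 // NUM_SENSORS)
    return hue, triggered_sensor
```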
Produced for Digital Shoreditch 2013 - Shoreditch Town Hall, London.
Gigalinc is a creative exhibition that allows participants to physically explore large-scale panoramic images through new technologies. Gigapixel images (containing 1,000 times the information captured by a 1-megapixel digital camera) are digitally projected onto a large cinema screen, and viewers navigate interactively via Kinect-based hand gestures, zooming in and out of areas of particular interest. Surround-sound audio accompaniment adds to the multi-sensory experience.
Several pieces of hardware make up the experience. Hand tracking uses an Xbox Kinect running open-source software to input gesture-based commands. An Arduino allows participants to select their image just by tapping a touch-pad on the floor, and also handles printing: a large red push button, when pressed, tells the computer to capture the image on screen and print it out instantly.
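The input handling can be sketched as a small event dispatcher (event names and limits here are assumptions for illustration):

```python
def handle_event(event, state):
    """Update the viewer state for one input event; return any action.

    Gestures pan/zoom the gigapixel view, the floor touch-pad cycles
    the image, and the red button captures and prints the current view.
    """
    if event == "pinch_out":
        state["zoom"] = min(state["zoom"] * 2, 1000)   # zoom into detail
    elif event == "pinch_in":
        state["zoom"] = max(state["zoom"] / 2, 1)      # zoom back out
    elif event == "touchpad_tap":
        state["image"] = (state["image"] + 1) % state["num_images"]
    elif event == "red_button":
        return "print_current_view"                    # capture and print
    return None
```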
GigaLinc has been shown internationally, including at TEDGlobal 2013 (click for images).
Keeping hydrated is crucial to our well-being. And whilst it seems simple enough in theory, in reality people don't drink enough. HydroBolt is a creative Internet-of-Things water bottle that uses technology to analyse how much you drink and remind you to keep hydrated.
The hardware includes a water flow sensor, a BLE112 Bluetooth SoC (originally developed on an Arduino Mini Pro) to compute and transfer data, and an RGB LED for a glanceable UI. If your HydroBolt water bottle glows green, you are keeping sufficiently hydrated. If it glows red, it's time to have a sip.
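The glanceable logic can be sketched like this (the calibration constant and daily target are assumptions, not HydroBolt's real values): flow-sensor pulses are converted into millilitres drunk, then compared against a time-proportional daily goal.

```python
PULSES_PER_ML = 2.25    # assumed flow-sensor calibration (pulses per ml)
DAILY_GOAL_ML = 2000    # assumed hydration target for a full day

def ml_drunk(pulse_count):
    """Convert raw flow-sensor pulses into millilitres consumed."""
    return pulse_count / PULSES_PER_ML

def led_colour(pulse_count, hours_into_day):
    """Green if on pace for the daily goal so far, red if falling behind."""
    expected_so_far = DAILY_GOAL_ML * hours_into_day / 24
    return "green" if ml_drunk(pulse_count) >= expected_so_far else "red"
```

Comparing against a pro-rated goal rather than the full daily total means the bottle nags you gradually through the day instead of only at midnight.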
HydroBolt also comes with an app, designed and coded to help people learn and understand their water-drinking habits, based on the information gathered wirelessly from their water bottle.
eSleeper is a 21st-century, technology-focused cat basket, designed to be an ideal resting location for any cat. When the feline decides it's nap-time, eSleeper's automated lighting turns on a creative wave of colour inside an eMac, while greeting the cat with the iconic Macintosh start-up chime (keeping the eMac's legacy alive). When the cat walks out, eSleeper turns off the lights and tweets to @eSleeper1, displaying various phrases along with how long the cat has occupied eSleeper.
On the technology side, an Ethernet Arduino controls eSleeper. Data is sent to the Arduino from an infra-red beam fitted inside the eMac. When the beam breaks, the Arduino turns on the RGB LEDs, sends a digital message to a sound shield to play the Macintosh start-up chime, and starts recording how long the cat has been inside. When the beam is broken again, the Arduino turns off the LEDs and tweets a random phrase along with the time spent inside the eSleeper.
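That two-state behaviour is a small state machine; a sketch in Python (the tweet text is illustrative, and the real code ran on the Arduino):

```python
class ESleeper:
    """Toggle between empty and occupied on each beam-break event."""

    def __init__(self):
        self.occupied = False
        self.entered_at = None

    def beam_broken(self, now):
        """Handle one beam-break; returns a tweet string on exit.

        `now` is a timestamp in seconds. First break: cat entered, so
        lights and chime go on. Second break: cat left, so lights go
        off and the occupancy time is tweeted.
        """
        if not self.occupied:
            self.occupied = True
            self.entered_at = now
            return None
        self.occupied = False
        minutes = (now - self.entered_at) / 60
        return "Cat napped for %.0f minutes" % minutes
```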
2012 was not only the year of the Olympics, but also the launch of the first ever ‘Digilympics’, a Twitter-powered race for sporting success where you determine the outcome. Four Lego athletes move down a physical racetrack as fans Tweet their team to move them further towards the finish line.
The competition was open to anyone on the web, allowing participants to Tweet their team to success using four unique Twitter handles. Tweets in support of a particular account/country moved that country’s contestant physically forward along a running track.
Processing code scans the Twitter account of each country for any '@' replies. New replies are passed digitally to an Arduino fitted with an Adafruit motor shield, with one motor for each team. The motors spin and advance each team forward in response to the tweets received. An infra-red beam detects the first team to reach the finishing line and signals the reset for a new race.
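The race logic reduces to something like this (a simplified Python sketch; the original was Processing plus Arduino, and the step size is an assumption):

```python
STEPS_PER_TWEET = 5  # assumed motor steps each supporting tweet is worth

def advance_teams(new_reply_counts, positions, finish_line):
    """Advance each team by its new tweets; return the winner, if any.

    `new_reply_counts` maps team name -> fresh @-replies since the last
    poll; `positions` (mutated in place) maps team name -> track position.
    """
    for team, count in new_reply_counts.items():
        positions[team] += count * STEPS_PER_TWEET
        if positions[team] >= finish_line:
            return team   # the IR beam would detect this and trigger a reset
    return None
```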
View the website here.
A few weeks before I participated in AngelHack 2012, a pub conversation arose regarding the theft of a friend's bag and the growing problem that such crime presents in society. This led me to think, 'How can I creatively use technology to prevent the casual theft of valued bags and luggage?' Enter SafeCase.
SafeCase is a physical prototype that constantly monitors movement via a digital accelerometer connected to an ATMEGA324P (originally an Arduino). Owners of the bag wear an RFID wristband to identify themselves. If no identified user is physically moving the bag, an alarm will trigger and LEDs will flash, notifying the owner that their bag is in the wrong hands.
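The decision logic is deliberately simple; a sketch (the movement threshold is an assumption):

```python
MOVE_THRESHOLD_G = 0.3   # assumed acceleration magnitude that counts as movement

def alarm_should_fire(accel_magnitude_g, owner_tag_present):
    """Fire the alarm and flash the LEDs on unauthorised movement:
    the bag is moving and no authorised RFID wristband is in range."""
    moving = accel_magnitude_g > MOVE_THRESHOLD_G
    return moving and not owner_tag_present
```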
The first prototype was made at the AngelHack Hackathon and Accelerator and received a 'special mention'.
Read the full article online at PCWorld.com. Please note: PCWorld has been re-branded as TechHive.
September's issue of PC World Magazine featured my home automation system. Published in print (745,000 circulation, mainly throughout USA) and gained over 15,000 views online.
The photo booth was designed as an accessible way of capturing family and friends of the bride and groom in a fun, light-hearted and creative manner. People could see their reflections displayed on an LCD TV inside the physical booth, while a large red button was the digital trigger to capture each image. Props were also available for guests to add a sense of humour to each image.
350 photos were taken on the night ranging from the mundane to the extraordinarily bizarre.
Written in Processing and made with Arduino.
Project part of the Digital Sizzle 6 Hackathon – group project – idea, visual look and part of the code by me.
Gaia is a mixed creative technology installation that explores people's growing interconnectedness in the urban environment. Using 3D data visualisation and a soundscape, Gaia maps London's feelings and movements (through social media and TFL data) to create an experience that explores how the growing amount of information we share about our day-to-day lives and emotions can create feedback loops, bringing us closer together and helping design better urban environments.
From a technical perspective, we took a wide range of data from TFL and built it into a digital 3D mesh of interconnecting routes and journeys. On top of this, we overlay real-time data, updated every 60 seconds, on the position of every bus and tube train in London.
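The update cycle can be sketched as a simple polling loop (function names are assumptions; the real feed came from TFL's API and the renderer was Processing):

```python
import time

def poll_positions(fetch, visualise, interval_s=60, cycles=3):
    """Fetch live vehicle positions every `interval_s` seconds and hand
    them to the visualiser. `fetch` and `visualise` are passed in so
    the data source and renderer stay decoupled.
    """
    for _ in range(cycles):
        visualise(fetch())     # overlay fresh positions on the static mesh
        time.sleep(interval_s)
```

Each poll replaces the vehicle layer while the static route mesh underneath stays untouched.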
Exhibited in Whitechapel, London, and at Wired 2012.
Visualised using Processing.
My work most often focuses on designing experiences and inventing hardware prototypes/products that bridge the gap between digital and physical. It usually sits between three cornerstones: Creative Technology, Interaction Design, and the technical ability to make, code and build (electronics, hardware, 3D design/print, etc.).
All work is my own (including the coding, images, ideas and design), unless otherwise stated. Currently based in London, UK.
For more information please visit my LinkedIn.