Make things happen with a swipe of the hand, or turn up the volume with an invisible dial. A 3D Gesture and Tracking Shield for the Raspberry Pi can turn surfaces and wearables into control panels.
Seeed Studio's 3D Gesture & Tracking Shield is an add-on HAT for the Raspberry Pi that adds magical controls. Simple gestures and movements can be programmed to activate home automation, sensors, lights and cameras, or to control programs on screen. The shield generates an electric field up to 10 cm above its surface; when an object, such as your finger, passes through the field, its location is detected. The shield also has zones on its surface to detect touch.
Seeed Studio's 3D Gesture and Tracking HAT can be used like a laptop trackpad, but as it works up to 10 cm above the device, height coordinates can also be used in your programs. The electric field it produces is sensitive enough to work through surfaces such as wood or cloth, so it can still track movement when hidden, though at a slightly reduced range.
So what movements can be detected?
Non-Contact Detection:
Moving your finger above the HAT returns X, Y and Z coordinates. When you swipe across the HAT, the direction of your movement is returned as text, such as south-north or west-east.
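One obvious use of those coordinates is moving a pointer. As a minimal sketch, assuming the Python library reports each axis as a normalised value between 0 and 1 (the helper name and screen size here are my own, not from the library):

```python
# Hypothetical helper: map a normalised (0-1) shield position to screen pixels.
def to_screen(x, y, width=1920, height=1080):
    """Scale normalised shield coordinates to a pixel position.

    The shield's Y axis runs south-to-north, so it is inverted here to
    match screen coordinates, which grow downwards.
    """
    px = int(x * (width - 1))
    py = int((1 - y) * (height - 1))
    return px, py

print(to_screen(0.5, 0.5))  # centre of the screen: (959, 539)
```

A real program would feed this from the library's move callback instead of calling it directly.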
If you move slowly in a circular motion it is detected as an Air Wheel, returning numbers that increase or decrease between 1 and 10,000 depending on which way you circle. This is useful as a volume control, though I found a bit of practice is required to get the motion and speed right.
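To turn that raw counter into something usable, the count can be scaled into a percentage. A sketch of the idea, assuming the 1 to 10,000 range above (the function and parameter names are mine):

```python
# Hypothetical helper: convert the cumulative air-wheel count into a
# 0-100 volume percentage, clamping values outside the expected range.
def airwheel_to_volume(count, counts_full_scale=10000):
    clamped = max(0, min(count, counts_full_scale))
    return round(100 * clamped / counts_full_scale)

print(airwheel_to_volume(5000))   # half way -> 50
print(airwheel_to_volume(12000))  # past the top -> clamped to 100
```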
Along with the non-contact sensing there are also five contact regions used for Touch, Tap and Double Tap detection. These return the compass positions North, South, East and West, as well as Center.
The device has a sensitivity of up to 150 dpi, updating at 200 readings per second. The surface area is 6.5 cm x 5.5 cm to fit on a Raspberry Pi. In comparison, my laptop's trackpad is about 10 cm wide, so a pointer controlled by the shield may be a bit lively for some applications. The advantage is that it senses in 3D space, so you can create 3D zones in which actions are detected.
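A 3D zone can be as simple as a box in the sensing space. A minimal sketch, again assuming normalised 0 to 1 coordinates on each axis (the zone layout and names are illustrative, not part of the library):

```python
# Hypothetical helper: test whether a tracked point falls inside a 3D zone.
def in_zone(point, zone):
    """zone is ((xmin, ymin, zmin), (xmax, ymax, zmax))."""
    (xmin, ymin, zmin), (xmax, ymax, zmax) = zone
    x, y, z = point
    return xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax

# e.g. trigger a light only when the hand hovers high over the left half
LIGHT_ZONE = ((0.0, 0.0, 0.7), (0.5, 1.0, 1.0))
print(in_zone((0.2, 0.5, 0.8), LIGHT_ZONE))  # True
print(in_zone((0.2, 0.5, 0.2), LIGHT_ZONE))  # False, too low
```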
The Gesture Shield connects to the Raspberry Pi via the 40-pin GPIO connector. It communicates through the I2C pins and is powered by 3.3 V at pin position 17; GPIO 4 and GPIO 17 are also used to access the firmware. Alternatively there is a Grove connector for use with Seeed's Raspberry Pi Grove HAT and the open-source Grove modular connection system for sensors, devices and other electronics.
All gesture and tracking movements are detected internally by the software suite on board the MGC3030 chip, with all actions accessible via an API.
Seeed supplies drivers for C, and Python libraries are also available.
All movements are sent over the API as they are detected. So as you go in for a double tap on the centre region, you will first receive X, Y, Z coordinates as your finger moves through the electric field towards the surface. When your finger makes contact with the shield there will be a Touch result for Center. The first tap of your double tap then sends Center for Tap, and finally Center for Double Tap.
You just need to decide which detection your program is going to concentrate on.
If you want to detect a Double Tap as well as a Tap, you will get results for both, but testing for Double Tap before Tap allows you to separate them.
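One way to do that separation is to hold each Tap briefly and discard it if a Double Tap follows. A sketch of the idea, assuming the events arrive in the order described above (the event tuples and window value are my own illustration, not the library's API):

```python
# Only act on a lone Tap if no Double Tap follows it within this window.
DOUBLE_TAP_WINDOW = 0.3  # seconds; tunable

def resolve_taps(events):
    """events: list of (time_in_seconds, kind, region) tuples as delivered.

    Returns the gestures a program should act on, filtering out the
    single taps that were really the first half of a double tap.
    """
    actions = []
    pending_tap = None
    for t, kind, region in events:
        if kind == 'double_tap':
            pending_tap = None          # swallow the tap that preceded it
            actions.append(('double_tap', region))
        elif kind == 'tap':
            if pending_tap and t - pending_tap[0] > DOUBLE_TAP_WINDOW:
                actions.append(('tap', pending_tap[1]))  # lone tap confirmed
            pending_tap = (t, region)
    if pending_tap:
        actions.append(('tap', pending_tap[1]))
    return actions

# A lone tap, then a tap that turns out to be part of a double tap:
seq = [(0.0, 'tap', 'center'),
       (1.0, 'tap', 'north'), (1.1, 'double_tap', 'north')]
print(resolve_taps(seq))  # [('tap', 'center'), ('double_tap', 'north')]
```

A live program would apply the same rule with real timestamps rather than a pre-recorded list.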
I tested the functions of the Gesture and Tracking Shield using the Python libraries. The library is based on Pi Supply's flicklib library used by the Flick HAT, but has been modified for use with Seeed's shield.
There is a flick-demo available to check out the movements straight away before trying to code something yourself.
This is a simple program to show how to detect a flick movement. Put the flicklib.py library file from the GitHub download and the i2c.py file in your program folder, then you can create a simple program like this. The library is for use with Python 2.7. When used with Python 3 the contact detection and flicks seem to be detected OK, but the X, Y, Z tracking is intermittent.
#!/usr/bin/python
# requires flicklib.py and i2c.py to be in the same location
import flicklib
from time import sleep

# Called whenever the Shield detects a flick, via flicklib.py
@flicklib.flick()
def flick(start, finish):
    global flicktxt
    flicktxt = start + finish  # do something with the flicktxt result

def main():
    print("Ready to Flick")
    try:
        global flicktxt    # get global variables for use
        global flickcount
        flicktxt = ''      # initialise variable to blank
        flickcount = 0
        while True:  # constant loop
            # flicktxt not blank and flick not already been detected
            if len(flicktxt) > 0 and flickcount < 1:
                print(flicktxt)  # print flick movement
                flickcount += 1  # +1 so same flick not detected twice
            else:
                # flick previously detected, so clear variables ready for the next flick
                flickcount = 0
                flicktxt = ''
            sleep(0.05)  # short delay before next detection
    except KeyboardInterrupt:
        print("\nAll Flicked Out")

if __name__ == "__main__":
    main()
Now when you move your finger above the shield from south to north, the output will show the direction of the flick.
Similar setups are used for Touch, Tap and Double Tap.
I have put together this Python script to show all the touch and flick gestures, as shown in the demo video: FlickView-python.zip
The Python program is based on flick-image by ric96: https://github.com/ric96/flick-image
Other Examples from YouTube
The Seeed Studio 3D Gesture and Tracking Shield is a fun little device for controlling all sorts of devices and programs, from IoT gadgets to turning surfaces into control panels. It senses your actions well, though with Python and flick-demo it can occasionally fail to report the N, S, E or W position for a double tap, which may be a timing issue. I also noticed in my own Python code that the occasional double tap would be missed. X, Y, Z tracking, flick, touch and tap all seemed fine.
The included Grove port is useful if you use a Grove setup with other devices and modular components.
It is simple to program in Python, and presumably just as simple in C.
Note: This product was supplied by Seeed Studio for review; the opinions expressed are my own.