The Arduino x K-Way Contest
As I've detailed in the Equipment video above, one of the tech pillars of this research has been the Nicla Sense ME board, which I received from Arduino after being accepted into the Arduino x K-Way Contest.

The contest drew participants from all around the world, with a submission deadline of February 14th, 2023. In this Lab Note I want to share what I submitted to the Arduino Project Hub. What follows is a copy of the original submission:
---
Animal Sensing - Arduino x K-Way Contest
This project seeks to experiment with haptic sensory substitution ("a way to bypass one traditional sensory organ by using another"), as a means to augment human perception in relation to other animals and the environment.
Project description
This project is a sub-experiment of a larger endeavor that seeks to nurture new senses for humans via the bio-signals gathered from other living beings.
It is also an entry to the Arduino x K-Way contest, which asks: "what new experiences can we create when integrating advanced AI and powerful sensors in one of the most iconic outdoors jackets?"
The First Approach
As a participant in the contest I received a kit with the Nicla board and a K-Way jacket (after a long shipping delay caused by the carrier), with the goal of developing a jacket that can sense the environment and other beings and deliver these signals to its wearer.
So I began my experimentation by trying to connect the Nicla board to the Adafruit DRV2605L Haptic Controller, which I have used in the past. My idea was to deliver changes in atmospheric conditions to the skin of anyone wearing the jacket. After some work exploring the board and its battery, I managed to connect the Nicla board to the haptic driver via protoboard. As the code on this project page shows, whenever there was an increase in temperature, the haptic driver would deliver a buzz through the vibration coin motor.
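For reference, the trigger boils down to a simple rise-above-threshold check. Here is a minimal sketch of that logic in plain C++ - the actual Arduino sketch reads the Nicla's temperature sensor and drives the DRV2605L through the Adafruit library, so the `buzzMotor` stand-in and the 0.5-degree threshold here are illustrative assumptions:

```cpp
#include <iostream>

// Stand-in for the DRV2605L call that plays a waveform on the
// vibration coin motor (the real sketch uses the Adafruit DRV2605
// library; this function name is hypothetical).
void buzzMotor() { std::cout << "buzz\n"; }

// Buzz whenever the temperature has risen by more than `threshold`
// degrees since the previous reading. `previous` is updated in place.
bool checkTemperatureRise(float current, float &previous,
                          float threshold = 0.5f) {
    bool rose = (current - previous) > threshold;
    previous = current;
    if (rose) buzzMotor();
    return rose;
}
```

In the real firmware, `current` would come from the Nicla Sense ME's temperature readings inside `loop()`.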
However, when I tried to connect the Nicla and the haptic driver directly via I2C - so as to have a more compact device - I came across a stumbling block. Even after hours spent trying to bridge the STEMMA QT and ESLOV connectors, I could not get the driver functioning this way. Now I see that this is probably because the adapter cable I made had no pull-up resistors for the driver's I2C lines. Unfortunately, all of the tests came at a cost: the DRV2605L Haptic Controller stopped functioning.

The Childish Approach
This situation forced me to change direction - with only two weeks until the contest deadline, and all while I was parenting solo for the first time ever. With very limited bench time but lots of play time alongside a child, a new idea emerged: what if the Nicla board was not attached to the jacket, but to my dog friend? What if I was able to feel what he's doing/feeling even when I'm not seeing him?
This new approach opened up many new possibilities for the embedded ML capabilities of the Nicla board. It could be used to recognize my dog's movements - and the characteristics of his surrounding environment - which could then trigger another microcontroller attached to the jacket (via BLE) to deliver these signals onto my skin. A first prototype within the larger horizon I'm looking at. So I began testing.
After a lot of trial and error I managed to connect the Nicla board to the Edge Impulse studio. I ran tests with acquired and existing gesture recognition datasets and realized that for what I'm attempting to do I need to collect the motion data myself. So I've attached the Nicla board to my dog and have been observing him for a while, paying attention to the movements of the board as he goes about his activities.
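Motion models in Edge Impulse are typically trained on fixed-length windows of accelerometer samples, so the recordings from the dog eventually have to be cut into such windows. A sketch of that segmentation step, where the window length (say, 2 s at 100 Hz = 200 samples) is my assumption rather than a value from this project:

```cpp
#include <cstddef>
#include <vector>

// One (x, y, z) accelerometer reading from the Nicla's IMU.
struct Sample { float x, y, z; };

// Split a stream of samples into fixed-length, non-overlapping
// windows for labeling and training. Windows shorter than
// `windowLen` at the tail are dropped.
std::vector<std::vector<Sample>> windowSamples(
        const std::vector<Sample> &stream, std::size_t windowLen) {
    std::vector<std::vector<Sample>> windows;
    if (windowLen == 0) return windows;
    for (std::size_t i = 0; i + windowLen <= stream.size(); i += windowLen) {
        windows.emplace_back(stream.begin() + i,
                             stream.begin() + i + windowLen);
    }
    return windows;
}
```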


I have identified three movement categories to work with: resting, running, and wiggle-playing. The next step is to simulate these movements to the best of my capacity with the board connected to the studio, so as to create a good training dataset.
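Once recognition works, each of these three classes will need to map to a distinct haptic signal on the jacket side. A sketch of that mapping, assuming one pattern byte per class that would later travel over BLE to the jacket's board - the class-to-pattern values and names are placeholders, not part of the project yet:

```cpp
#include <cstdint>
#include <string>

// The three movement categories identified so far, plus a fallback.
enum class DogActivity { Resting, Running, WigglePlaying, Unknown };

// Map an Edge Impulse class label to an activity.
DogActivity classify(const std::string &label) {
    if (label == "resting") return DogActivity::Resting;
    if (label == "running") return DogActivity::Running;
    if (label == "wiggle-playing") return DogActivity::WigglePlaying;
    return DogActivity::Unknown;
}

// One byte per activity, sent as a BLE characteristic value; the
// jacket-side microcontroller would map it to a DRV2605 waveform.
// Pattern numbers here are placeholders.
uint8_t hapticPattern(DogActivity a) {
    switch (a) {
        case DogActivity::Resting:       return 1;  // single soft pulse
        case DogActivity::Running:       return 2;  // fast repeated pulses
        case DogActivity::WigglePlaying: return 3;  // long playful ramp
        default:                         return 0;  // no vibration
    }
}
```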
At this stage, the project is by no means ready for today's (Feb 14th) contest deadline. However, I'm going to continue developing it, with plans to integrate sensor fusion capabilities, so I think it's important to share my progress with the community. Feedback would be highly appreciated, especially from people at Edge Impulse, given that this is my first ever experience with ML.
And also because there's a lot of work ahead. After collecting the first data, I need to filter it, run it through an ML model, analyze and improve the results, deploy the model to the Nicla board, send the live predictions over BLE to a microcontroller such as the Feather ESP32, and then use that to trigger the vibratory stimuli through a board attached to the jacket. Definitely a lot of work. But I look forward to doing it.
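As a starting point for the filtering step, something as simple as a moving-average low-pass filter can smooth the raw accelerometer stream before training. In practice Edge Impulse's processing blocks may handle this for me, so treat this sketch as illustrative only:

```cpp
#include <cstddef>
#include <vector>

// Moving-average low-pass filter over one axis of motion data.
// Returns one averaged value per full window of `k` samples,
// sliding one sample at a time. Returns an empty vector when the
// signal is shorter than the window.
std::vector<float> movingAverage(const std::vector<float> &signal,
                                 std::size_t k) {
    std::vector<float> out;
    if (k == 0 || signal.size() < k) return out;
    float sum = 0.0f;
    for (std::size_t i = 0; i < k; ++i) sum += signal[i];
    out.push_back(sum / k);
    for (std::size_t i = k; i < signal.size(); ++i) {
        sum += signal[i] - signal[i - k];  // slide the window by one
        out.push_back(sum / k);
    }
    return out;
}
```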

