Visual Wake Words

This example application implements the Visual Wake Words (VWW) CNN architecture. The VWW model is trained to classify images into two classes (person/not-person) and is a popular use case for microcontroller-class devices.

This example demonstrates how to receive input data using xscope.

Building the firmware

Run the following commands in the xcore_sdk root folder to build the firmware.

On Linux and macOS:

cmake -B build -DCMAKE_TOOLCHAIN_FILE=xmos_cmake_toolchain/xs3a.cmake
cd build
make example_bare_metal_vww

On Windows:

cmake -G "NMake Makefiles" -B build -DCMAKE_TOOLCHAIN_FILE=xmos_cmake_toolchain/xs3a.cmake
cd build
nmake example_bare_metal_vww

Running the firmware

Running with hardware

From the build folder, run:

On Linux and macOS:

make run_example_bare_metal_vww

On Windows:

nmake run_example_bare_metal_vww

The firmware now waits until image data is sent from a host application. Test images can be sent to the firmware using xscope; most RGB images should work. The test_image.py script requires Python, so ensure you have installed Python 3 and the XCore SDK Python requirements.

To send a test image to the xcore.ai Explorer board over xscope, run the test_image.py script, which can be found in the application directory:

./test_image.py path/to/image
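
For reference, the host side of such an xscope transfer looks roughly like the sketch below. This is a minimal illustration, not a replacement for test_image.py: it assumes the xscope_endpoint shared library shipped with the XMOS tools is on your library path, the port number (10234 here) is an assumption that must match whatever --xscope-port value the run target passes to xrun, and the payload file path is a placeholder.

import ctypes
import time

# Load the xscope endpoint library shipped with the XMOS tools
# (xscope_endpoint.dll on Windows).
ep = ctypes.CDLL("xscope_endpoint.so")

# Connect to the xscope server started by xrun. The port (10234) is an
# assumption: it must match the --xscope-port argument used at launch.
if ep.xscope_ep_connect(b"localhost", b"10234") != 0:
    raise RuntimeError("could not connect to the xscope server")

# Placeholder path: raw, already-preprocessed input tensor bytes.
with open("path/to/preprocessed_image.bin", "rb") as f:
    payload = f.read()

# Upload the bytes to the device; the firmware's xscope handler
# receives them (typically in chunks) and assembles the input tensor.
ep.xscope_ep_request_upload(len(payload), payload)

time.sleep(1)  # give the device time to drain the data
ep.xscope_ep_disconnect()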

Optimizing the model

An unoptimized, quantized model is included with the example.
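
Before optimizing, you can sanity-check the quantized model on a host machine with the standard TensorFlow Lite interpreter. The following is a minimal sketch, assuming the tensorflow and Pillow Python packages are installed; the input shape is read from the model itself rather than hard-coded, the subtract-128 step assumes the model expects int8 input in the usual uint8-minus-128 form (check against test_image.py), and the output class ordering is an assumption to verify.

import numpy as np
from PIL import Image
import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="model/model_quant.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Resize an RGB image to the model's input shape.
_, height, width, _ = inp["shape"]
img = Image.open("path/to/image").convert("RGB").resize((width, height))

# Shift uint8 pixels into int8 range (assumed preprocessing).
data = (np.asarray(img, dtype=np.int32) - 128).astype(np.int8)

interpreter.set_tensor(inp["index"], data[np.newaxis, ...])
interpreter.invoke()
scores = interpreter.get_tensor(out["index"])[0]
print("scores (class ordering assumed to be not-person, person):", scores)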

First, be sure you have installed the XMOS AI Toolchain extensions. If they are installed, you can optimize the model with the following command, which converts it to use xcore-optimized operators (here allowing those operators to use up to 5 hardware threads):

xcore-opt --xcore-thread-count 5 -o model/model_xcore.tflite model/model_quant.tflite

Converting flatbuffer to source file

The following command will generate a C source file that contains the TensorFlow Lite model as a char array:

python <path-to-sdk>/tools/tflite_micro/convert_tflite_to_c_source.py --input model/model_xcore.tflite --header src/vww_model_data.h --source src/vww_model_data.c --variable-name vww
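
Conceptually, the conversion is straightforward: the flatbuffer bytes are emitted as a C array plus a length constant. The sketch below only illustrates that idea and is not the SDK script; the real script's output layout, variable naming, and options will differ, so use the script itself for actual builds.

import os

def tflite_to_c_source(tflite_path, source_path, header_path, name):
    # Read the flatbuffer and emit it as a C char array.
    data = open(tflite_path, "rb").read()
    with open(source_path, "w") as src:
        src.write('#include "%s"\n\n' % os.path.basename(header_path))
        src.write("const unsigned char %s_model_data[] = {\n" % name)
        for i in range(0, len(data), 12):
            row = ", ".join("0x%02x" % b for b in data[i:i + 12])
            src.write("    %s,\n" % row)
        src.write("};\n\n")
        src.write("const unsigned int %s_model_data_len = %d;\n"
                  % (name, len(data)))
    with open(header_path, "w") as hdr:
        hdr.write("extern const unsigned char %s_model_data[];\n" % name)
        hdr.write("extern const unsigned int %s_model_data_len;\n" % name)

tflite_to_c_source("model/model_xcore.tflite",
                   "src/vww_model_data.c", "src/vww_model_data.h", "vww")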