nRF Machine Learning
The nRF Machine Learning application is an out-of-the-box reference design for embedded machine learning using Edge Impulse. The application gathers data from sensors, forwards the data to the Edge Impulse platform, and runs the machine learning model. It also displays the results of the machine learning model on LEDs. The Edge Impulse platform collects data from sensors, trains the machine learning model, and deploys the model to your Nordic Semiconductor device. To learn more about Edge Impulse support in the nRF Connect SDK, see Using Edge Impulse with the nRF Connect SDK.
Requirements
The application supports the following development kits:
Hardware platform | PCA | Board name | Build target
---|---|---|---
Thingy:53 | PCA20053 | thingy53_nrf5340 | thingy53_nrf5340_cpuapp
Thingy:52 | PCA20020 | thingy52_nrf52832 | thingy52_nrf52832
nRF5340 DK | PCA10095 | nrf5340dk_nrf5340 | nrf5340dk_nrf5340_cpuapp
nRF52840 DK | PCA10056 | nrf52840dk_nrf52840 | nrf52840dk_nrf52840
The available configurations use only built-in sensors or the simulated sensor signal. You do not need to connect any additional components to the board.
When built for an _ns
build target, the sample is configured to compile and run as a non-secure application.
Therefore, it automatically includes Trusted Firmware-M that prepares the required peripherals and secure services to be available for the application.
Overview
To perform its tasks, the nRF Machine Learning application uses components available in Zephyr and the nRF Connect SDK, namely the Common Application Framework modules and Sensors for sampling sensors, and UART or Nordic UART Service (NUS) for forwarding data. It also uses the Edge Impulse’s data forwarder protocol.
Sampling sensors
The application handles the sensor sampling using the CAF: Sensor manager module. This module uses Zephyr's Sensors to handle the sampling. This approach makes it possible to use any sensor available in Zephyr.
By default, the following sensors are used by the application:
Thingy:52 - Built-in accelerometer (LIS2DH).
Thingy:53 - Built-in accelerometer (ADXL362).
nRF52840 Development Kit - Simulated sensor (Simulated sensor driver). The simulated sensor generates predefined waves as acceleration. This development kit does not have a built-in accelerometer.
nRF5340 Development Kit - Simulated sensor (Simulated sensor driver). The simulated sensor generates predefined waves as acceleration. This development kit does not have a built-in accelerometer.
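The predefined waves produced by the simulated sensor can be illustrated with a short sketch. This is not the actual Zephyr simulated sensor driver; the function name, frequency, and amplitude parameters are illustrative assumptions, and only the wave names match the ones used by the application:

```python
import math

def simulated_accel_sample(wave, t, freq=1.0, amp=1.0):
    """Return one illustrative acceleration value at time t (seconds).

    Hypothetical helper, not the Zephyr sensor_sim driver code.
    """
    phase = (t * freq) % 1.0
    if wave == "sine":
        return amp * math.sin(2.0 * math.pi * phase)
    if wave == "triangle":
        # Rises from -amp to +amp in the first half period, falls back in the second.
        return amp * (4.0 * phase - 1.0) if phase < 0.5 else amp * (3.0 - 4.0 * phase)
    if wave == "square":
        return amp if phase < 0.5 else -amp
    return 0.0  # "idle" produces a flat signal
```

The machine learning model is then trained to recognize which of these waveform shapes is present in a window of samples.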
Forwarding data
The application uses Edge Impulse’s data forwarder protocol to forward data to Edge Impulse studio. By default, the following transports are used:
Thingy:52 uses Nordic UART Service (NUS).
Thingy:53 uses Nordic UART Service (NUS).
nRF52840 Development Kit uses UART.
nRF5340 Development Kit uses UART.
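Conceptually, the data forwarder protocol is simple: each sensor readout is sent as one line of comma-separated values over the chosen transport. The following Python sketch shows the idea; the helper name, the sample values, and the two-decimal formatting are illustrative assumptions, not the firmware implementation:

```python
def forwarder_line(sample):
    """Format one accelerometer readout (x, y, z) as a single
    comma-separated line, the shape consumed by the Edge Impulse
    data forwarder. Illustrative sketch only."""
    return ",".join(f"{axis:.2f}" for axis in sample)

# Each sampled readout becomes one line on the transport (UART or NUS).
readouts = [(-0.24, 1.25, 9.78), (0.01, 1.30, 9.75)]
payload = "\r\n".join(forwarder_line(s) for s in readouts)
```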
Machine learning model
The application handles the machine learning model using the Edge Impulse wrapper library available in the nRF Connect SDK. The model performs the classification task by assigning a label to input data. The labels that are assigned by the machine learning model are specific to the given model.
By default, the application uses pretrained machine learning models deployed in Edge Impulse studio:
Both Thingy:52 and Thingy:53 use the NCS hardware accelerometer machine learning model. The model uses the data from the built-in accelerometer to recognize the following gestures:
idle - The device is placed on a flat surface.
updown - The device is moved in the updown direction.
rotate - The device is rotated.
tap - The device is tapped while placed on a flat surface.
Unknown gestures, such as shaking the device, are recognized as an anomaly.
Both the nRF52840 Development Kit and nRF5340 Development Kit use the NCS simulated sensor machine learning model. The model uses simulated sensor data to recognize the following simulated wave types:
sine
triangle
idle
The square wave signal can also be generated by the simulated sensor. This signal is unknown to the machine learning model and is therefore marked as an anomaly.
The application displays LED effects that correspond to the machine learning results. For more detailed information, see the User interface section.
Power management
Reducing power consumption is important for every battery-powered device.
In the nRF Machine Learning application, application modules are automatically suspended or turned off if the device is not in use for a predefined period.
The application uses CAF: Power manager module for this purpose.
This means that Zephyr power management is forced to the PM_STATE_ACTIVE state when the device is in either the power management active or the power management suspended state, while the power off state is forced directly by the CAF: Power manager module as Zephyr's PM_STATE_SOFT_OFF state.
In the POWER_MANAGER_LEVEL_ALIVE state, the device is in working condition, Bluetooth is advertising whenever required, and all the connections are maintained.
In the POWER_MANAGER_LEVEL_SUSPENDED state, the device maintains the active Bluetooth connection.
In the POWER_MANAGER_LEVEL_OFF state, the CPU is switched to the off mode.
In the suspended and OFF states, most of the functionalities are disabled. For example, LEDs and sensors are turned off and Bluetooth advertising is stopped.
Any button press can wake up the device.
For the Thingy:53, the sensor supports a trigger that can be used for active power management. As long as the device detects acceleration, the board is kept in the active state. When the board is in the POWER_MANAGER_LEVEL_SUSPENDED state, it can be woken up by moving the device, which triggers the acceleration threshold.
You can define the time interval after which the peripherals are suspended or powered off using the CONFIG_CAF_POWER_MANAGER_TIMEOUT option. By default, this period is set to 120 seconds.
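For example, to shorten the timeout, you could set the option in the Kconfig configuration file for your build type. The 60-second value below is only an example, not a recommended setting:

```
# Suspend peripherals after 60 seconds of inactivity (default: 120).
CONFIG_CAF_POWER_MANAGER_TIMEOUT=60
```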
Firmware architecture
The nRF Machine Learning application has a modular structure, where each module has a defined scope of responsibility. The application uses the Application Event Manager to distribute events between modules in the system.
The following figure shows the application architecture. The figure visualizes relations between Application Event Manager, modules, drivers, and libraries.
Since the application architecture is uniform and the code is shared, the set of modules in use depends on configuration. In other words, not all of the modules need to be enabled for a given reference design. For example, the CAF: Bluetooth LE state module and CAF: Bluetooth LE advertising module modules are not enabled if the configuration does not use Bluetooth®.
See Application internal modules for detailed information about every module used by the nRF Machine Learning application.
Programming Thingy:52
The Thingy:52 does not have the J-Link debug IC and the application configuration does not use a bootloader. Use an external debugger to program the firmware. See Thingy:52 documentation for details.
Programming Thingy:53
If you build this application for Thingy:53, it enables additional features. See Thingy:53 application guide for details.
Programming nRF53 DK
If you build this application for the nRF53 DK, it enables additional features similar to the ones that are enabled for Thingy:53:
MCUboot bootloader with serial recovery and multi-image update.
Static configuration of Partition Manager.
DFU over-the-air using Simple Management Protocol over Bluetooth.
See Developing with Thingy:53 for detailed information about the mentioned features.
The nRF53 DK has a J-Link debug IC that can be used to program the firmware. Alternatively, the firmware can be updated using MCUboot serial recovery or DFU over-the-air using Simple Management Protocol over Bluetooth. Keep in mind that if you use the bootloader to update the firmware, the new firmware must be compatible with the used bootloader and partition map.
The nRF53 Development Kit uses RTT as logger’s backend. The RTT logs can be easily accessed, because the Development Kit has a built-in SEGGER chip.
Custom model requirements
The default application configurations rely on pretrained machine learning models that can be automatically downloaded during the application build. If you want to train and deploy a custom machine learning model using Edge Impulse Studio, you need a user account for the Edge Impulse Studio web-based tool. The user account is not needed to perform predictions using the pretrained models.
Data forwarding requirements
To forward the collected data using Edge Impulse’s data forwarder, you must install the Edge Impulse CLI. See Edge Impulse CLI installation guide for instructions.
Nordic UART Service requirements
If you want to forward data over Nordic UART Service (NUS), you need an additional development kit that is able to run the Bluetooth: Central UART sample. Check the sample Requirements section for the list of supported development kits. The sample is used to receive data over NUS and forward it to the host computer over UART. See Testing with Thingy devices for how to test this solution.
nRF Machine Learning build types
The nRF Machine Learning application does not use a single prj.conf file. Configuration files are provided for different build types for each supported board. Each board has its own prj.conf file, which represents a debug build type. Other build types are covered by dedicated files with the build type added as a suffix to the prj part. For example, the release build type file name is prj_release.conf.
If a board has other configuration files, for example associated with partition layout or child image configuration, these follow the same pattern.
When the CONF_FILE variable contains a single file and this file follows the naming pattern prj_<buildtype>.conf, then the build type will be inferred to be <buildtype>. The build type cannot be set explicitly. The <buildtype> can be any string, but it is common to use release and debug.
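The inference rule can be sketched as a small helper. This is only an illustration of the naming convention, not code from the build system:

```python
import re

def inferred_build_type(conf_file):
    """Return the build type inferred from a CONF_FILE value of the
    form prj_<buildtype>.conf, or None if the name does not match."""
    match = re.fullmatch(r"prj_(.+)\.conf", conf_file)
    return match.group(1) if match else None

# prj_release.conf -> "release"; prj.conf carries no build type suffix.
```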
For information about how to set variables, see Important Build System Variables in the Zephyr documentation.
The Partition Manager’s static configuration can also be made dependent on the build type. When the build type has been inferred, the file pm_static_<buildtype>.yml will have precedence over pm_static.yml.
The child image Kconfig configuration can also be made dependent on the build type. The child image Kconfig file is named <child_image>.conf instead of prj.conf, but otherwise follows the same pattern as the parent Kconfig.
Before you start testing the application, you can select one of the build types supported by nRF Machine Learning application, depending on your development kit and the building method. The application supports the following build types:
debug – Debug version of the application - can be used to verify if the application works correctly.
release – Release version of the application - can be used to achieve better performance and reduce memory consumption.
Not every board supports both mentioned build types. A given board can also support additional configurations of the nRF Machine Learning application. For example, the nRF52840 Development Kit supports the nus configuration, which uses Nordic UART Service (NUS) instead of UART for data forwarding.
Note
Selecting a build type is optional.
The debug
build type is used by default if no build type is explicitly selected.
User interface
The application supports a simple user interface. You can control the application using predefined buttons, while LEDs are used to display information.
LEDs
The application uses one LED to display the application state. The LED displays either the state of data forwarding or the machine learning prediction results. You can configure the LED effect in the application configuration files.
If the application uses the simulated sensor signal, it uses another LED to display the effect that represents the signal generated by the simulated sensor. The application defines common LED effects for both the machine learning results and the simulated sensor signal.
By default, the application uses the following LED effects:
Thingy:52 and Thingy:53 display the application state in the RGB scale. Thingy:52 uses the Lightwell LEDs and Thingy:53 uses the LED1.
If the device is returning the machine learning prediction results, the LED uses the following predefined colors:
rotate - Red
updown - Green
tap - Blue
Anomaly - Purple
If the machine learning model is running, but it has not detected anything yet or the idle state is detected, the LED is blinking. After a successful detection, the LED is set to the predefined color. The LED effect is overridden on the next successful detection.
If the device forwards data, the LED color turns red and uses the following blinking patterns:
LED blinks slowly if it is not connected.
LED blinks with an average frequency if it is connected, but is not actively forwarding data.
LED blinks rapidly if it is connected and is actively forwarding data.
Both nRF5340 Development Kit and nRF52840 Development Kit use monochromatic LEDs to display the application state. The LED1 displays the application state and the LED2 displays the signal generated by the simulated sensor.
If the device is returning the machine learning prediction results, the LED1 blinks for a predefined number of times and then turns off for a period of time. Then the sequence is repeated. The machine learning result is represented by the number of blinks:
sine - 1 blink
triangle - 2 blinks
square - 3 blinks
idle - 4 blinks
If the machine learning model is running, but it has not detected anything yet or it has detected an anomaly, the LED1 is breathing.
If the device forwards data, the LED1 uses the following blinking patterns:
LED blinks slowly if it is not connected.
LED blinks with an average frequency if it is connected, but is not actively forwarding data.
LED blinks rapidly if it is connected and is actively forwarding data.
Configuration
The nRF Machine Learning application is modular and event-driven. You can enable and configure the modules separately for the selected board and build type. See the documentation page of the selected module for information about the functionalities it provides and its configuration. See Application internal modules for the list of modules available in the application.
Configuration files
The nRF Machine Learning application uses the following files as configuration sources:
Devicetree Specification (DTS) files - These reflect the hardware configuration. See Devicetree for more information about the DTS data structure.
Kconfig files - These reflect the software configuration. See Kconfig - Tips and Best Practices for information about how to configure them.
_def files - These contain configuration arrays for the application modules. The _def files are used by the nRF Machine Learning application modules and Common Application Framework modules.
The application configuration files for a given board must be defined in a board-specific directory in the applications/machine_learning/configuration/
directory.
For example, the configuration files for the Thingy:52 are defined in the applications/machine_learning/configuration/thingy52_nrf52832
directory.
The following configuration files can be defined for any supported board:
prj_build_type.conf - Kconfig configuration file for a build type. To support a given build type for the selected board, you must define the configuration file with a proper name. For example, prj_release.conf defines the configuration for the release build type. The prj.conf file without any suffix defines the debug build type.
app.overlay - DTS overlay file specific for the board. Defining the DTS overlay file for a given board is optional.
_def files - These files are defined separately for modules used by the application. You must define a _def file for every module that requires it and enable it in the configuration for the given board. The _def files that are common for all the boards and build types are located in the applications/machine_learning/configuration/common directory.
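Putting this together, a board-specific configuration directory could look as follows. This tree is a hypothetical example; the exact set of files depends on the board and the enabled modules:

```
applications/machine_learning/configuration/
├── common/                       # _def files shared by all boards and build types
├── thingy52_nrf52832/
│   ├── prj.conf                  # debug build type
│   ├── app.overlay               # optional DTS overlay
│   └── led_state_def.h           # board-specific LED effect definitions
└── nrf52840dk_nrf52840/
    ├── prj.conf                  # debug build type
    └── prj_release.conf          # release build type
```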
Advertising configuration
If a given build type enables Bluetooth, the CAF: Bluetooth LE advertising module is used to control the Bluetooth advertising.
This CAF module relies on Bluetooth LE advertising providers to manage advertising data and scan response data.
The nRF Machine Learning application configures the data providers in src/util/Kconfig
.
By default, the application enables a set of data providers available in the nRF Connect SDK and adds a custom provider that appends UUID128 of Nordic UART Service (NUS) to the scan response data if the NUS is enabled in the configuration and the Bluetooth local identity in use has no bond.
Multi-image builds
The Thingy:53 and nRF53 Development Kit use multi-image build with the following child images:
MCUboot bootloader
Bluetooth HCI RPMsg
You can define the application-specific configuration for the mentioned child images in the board-specific directory in the applications/machine_learning/configuration/
directory.
The Kconfig configuration file should be located in the child_image/child_image_name subdirectory and its name should match the application Kconfig file name, that is, contain the build type suffix if necessary.
For example, the applications/machine_learning/configuration/thingy53_nrf5340_cpuapp/child_image/hci_rpmsg/prj.conf
file defines configuration of Bluetooth HCI RPMsg for debug
build type on thingy53_nrf5340_cpuapp
board, while the applications/machine_learning/configuration/thingy53_nrf5340_cpuapp/child_image/hci_rpmsg/prj_release.conf
file defines configuration of Bluetooth HCI RPMsg for release
build type.
See Multi-image builds for detailed information about multi-image builds and child image configuration.
Building and running
The nRF Machine Learning application is built the same way as any other nRF Connect SDK application or sample. Building the default configurations requires an Internet connection, because the machine learning model source files are downloaded from the web during the application build.
This sample can be found under applications/machine_learning
in the nRF Connect SDK folder structure.
When built as a non-secure firmware image for the _ns
build target, the sample automatically includes the Trusted Firmware-M (TF-M).
To build the sample with Visual Studio Code, follow the steps listed on the Building nRF Connect SDK application quick guide page in the nRF Connect for VS Code extension documentation. See Building and programming an application for other building and programming scenarios and Testing and debugging an application for general information about testing and debugging in the nRF Connect SDK.
Selecting a build type
Before you start testing the application, you can select one of the nRF Machine Learning build types, depending on your development kit and building method.
Selecting a build type in Visual Studio Code
To select the build type in the nRF Connect for VS Code extension:
When Building an application as described in the nRF Connect for VS Code extension documentation, follow the steps for setting up the build configuration.
In the Add Build Configuration screen, select the desired .conf file from the Configuration drop-down menu.
Fill in other configuration options, if applicable, and click Build Configuration.
Selecting a build type from command line
To select the build type when building the application from command line, specify the build type by adding the following parameter to the west build
command:
-- -DCONF_FILE=prj_selected_build_type.conf
For example, you can replace the selected_build_type variable to build the release
firmware for nrf52840dk_nrf52840
by running the following command in the project directory:
west build -b nrf52840dk_nrf52840 -d build_nrf52840dk_nrf52840 -- -DCONF_FILE=prj_release.conf
The build_nrf52840dk_nrf52840
parameter specifies the output directory for the build files.
Note
If the selected board does not support the selected build type, the build is interrupted.
For example, if the nus
build type is not supported by the selected board, the following notification appears:
Configuration file for build type ``nus`` is missing.
Providing API key
If the URI of the Edge Impulse zip file requires an additional API key, you can provide it using the EI_API_KEY_HEADER CMake definition. This definition is set in a similar way as the build type selection.
For more detailed information about building the machine learning model in the nRF Connect SDK, see Using Edge Impulse with the nRF Connect SDK.
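For example, the definition can be passed to west alongside the build type selection. The build target, header name, and key value below are placeholders you would replace with your own:

```
west build -b thingy53_nrf5340_cpuapp -- -DCONF_FILE=prj_release.conf -DEI_API_KEY_HEADER="x-api-key:<your API key>"
```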
Tip
The nRF Machine Learning application configurations available in the nRF Connect SDK do not require providing an API key to download the model. The model is downloaded from the web, but no authentication is required.
Testing
After programming the application to your development kit, you can test the nRF Machine Learning application. You can test running the machine learning model on an embedded device and forwarding data to Edge Impulse studio. The detailed test steps for the Development Kits, the Thingy:52, and the Thingy:53 are described in the following subsections.
Application logs
In most of the provided debug configurations, the application provides logs through the RTT. See Connecting using RTT for detailed instructions about accessing the logs.
Note
The Thingy:53 in the debug
configuration provides logs through the USB CDC ACM serial.
See Developing with Thingy:53 for detailed information about working with the Thingy:53.
You can also use rtt
configuration to have the Thingy:53 use RTT for logs.
Testing with Thingy devices
After programming the application, perform the following steps to test the nRF Machine Learning application on the Thingy:
Turn on the Thingy. The application starts in a mode that runs the machine learning model. The RGB LED is blinking, because no gesture has been recognized by the machine learning model yet.
Tap the device. The tap gesture is recognized by the machine learning model. The LED color changes to blue and the LED stays turned on.
Move the device up and down. The updown gesture is recognized by the machine learning model. The LED color changes to green and the LED stays turned on.
Rotate the device. The rotate gesture is recognized by the machine learning model. The LED color changes to red and the LED stays turned on.
Shake the device. The machine learning model detects an anomaly. The LED color changes to purple and the LED stays turned on.
Press and hold the button for more than 5 seconds to switch to the data forwarding mode. After the mode is switched, the LED color changes to red and the LED starts blinking very slowly.
Program the Bluetooth: Central UART sample to a compatible development kit, for example the nRF52840 Development Kit.
Turn on the programmed device. After a brief delay the Bluetooth® connection between the sample and the Thingy is established. The Thingy forwards the sensor readouts over NUS. The LED on the Thingy starts to blink rapidly.
Connect to the Bluetooth® Central UART sample with a terminal emulator (for example, PuTTY). See How to connect with PuTTY for the required settings.
Observe the sensor readouts represented as comma-separated values. Every line represents a single sensor readout. The Thingy forwards sensor readouts over NUS to the Central UART sample. The sample forwards the data to the host over UART.
Turn off PuTTY to ensure that only one program has access to data on UART.
Optionally, you can also connect to the device using Edge Impulse’s data forwarder and forward data to Edge Impulse studio (after logging in). See Forwarding data to Edge Impulse studio for details.
Testing with the nRF52840 or nRF53 DK
After programming the application, perform the following steps to test the nRF Machine Learning application on the Development Kit:
Turn on the development kit. The application starts in a mode that runs the machine learning model. Initially, LED2 displays the LED effect representing the sine wave (1 blink) and LED1 is breathing, because the signal was not yet recognized by the machine learning model. After a brief delay, the machine learning model recognizes the simulated signal. LED1 and LED2 display the same LED effect.
Press Button 3 to change the generated acceleration signal. Right after the signal change, the effects displayed by the LEDs are different. After a brief delay, the machine learning model recognizes the triangle wave and the same effect (2 blinks) is displayed by both LEDs.
Press Button 3 to change the generated acceleration signal again. The square wave (3 blinks) is displayed only by LED2. This signal is marked as an anomaly by the machine learning model and LED1 starts breathing.
Press and hold Button 1 for more than 5 seconds to switch to the data forwarding mode. After the mode is switched, LED1 starts to blink rapidly.
Connect to the development kit with a terminal emulator (for example, PuTTY). See How to connect with PuTTY for the required settings.
Observe the sensor readouts represented as comma-separated values. Every line represents a single sensor readout.
Turn off PuTTY to ensure that only one program will access data on UART.
Optionally, you can also connect to the device using Edge Impulse’s data forwarder and forward data to Edge Impulse studio (after logging in). See Forwarding data to Edge Impulse studio for details.
Forwarding data to Edge Impulse studio
To start forwarding data to Edge Impulse studio:
Make sure you meet the Data forwarding requirements before forwarding data to Edge Impulse studio.
Run the edge-impulse-data-forwarder Edge Impulse command line tool.
Log in to Edge Impulse studio and perform the following steps:
Select the Data acquisition tab.
In the Record new data panel, set the desired values and click Start sampling.
Observe the received sample data on the raw data graph under the panel. The observed signal depends on the acceleration readouts.
Porting guide
You can port the nRF Machine Learning application to any board available in the nRF Connect SDK or Zephyr.
To do so, create the board-specific directory in applications/machine_learning/configuration/
and add the application configuration files there.
See the Configuration for detailed information about the nRF Machine Learning application configuration.
Application internal modules
The nRF Machine Learning application uses modules available in Common Application Framework (CAF), a set of generic modules based on the Application Event Manager and available to all applications, as well as a set of dedicated internal modules. See Firmware architecture for more information.
The nRF Machine Learning application uses the following modules available in CAF:
See the module pages for more information about the modules and their configuration.
The nRF Machine Learning application also uses the following dedicated application modules:
ei_data_forwarder_bt_nus
The module forwards the sensor readouts over NUS to the connected Bluetooth Central. The sensor data is forwarded only if the connection is secured and the connection interval is within the limit defined by CONFIG_BT_PERIPHERAL_PREF_MIN_INT and CONFIG_BT_PERIPHERAL_PREF_MAX_INT.
ei_data_forwarder_uart
The module forwards the sensor readouts over UART.
led_state
The module displays the application state using LEDs. The LED effects used to display the state of data forwarding, the machine learning results, and the state of the simulated signal are defined in the led_state_def.h file located in the application configuration directory. The common LED effects are used to represent the machine learning results and the simulated sensor signal.
ml_runner
The module uses the Edge Impulse wrapper API to control running the machine learning model. It provides the prediction results using ml_result_event. The module runs the machine learning model and provides results only if there is an active subscriber. An application module can inform that it is actively listening for results using ml_result_signin_event.
ml_app_mode
The module controls the application mode. It switches between running the machine learning model and forwarding the data. The change is triggered by a long press of the button defined in the module's configuration.
sensor_sim_ctrl
The module controls parameters of the generated simulated sensor signal. It switches between predefined sets of parameters for the simulated signal. The parameters of the generated signals are defined in the sensor_sim_ctrl_def.h file located in the application configuration directory.
usb_state
The module enables USB.
Note
The ei_data_forwarder_bt_nus
and ei_data_forwarder_uart
modules stop forwarding the sensor readouts if they receive a sensor_event
that cannot be forwarded and needs to be dropped.
This could happen, for example, if the selected sensor sampling frequency is too high for the used implementation of the Edge Impulse data forwarder.
Data forwarding is stopped to make sure that dropping samples is noticed by the user.
If you switch to running the machine learning model and then switch back to data forwarding, the data is forwarded to the host again.
Dependencies
The application uses the following Zephyr drivers and libraries:
The application uses the following nRF Connect SDK libraries and drivers:
The sample also uses the following secure firmware component:
In addition, you can use the Bluetooth: Central UART sample together with the application. The sample is used to receive data over NUS and forward it to the host over UART.
API documentation
Following are the API elements used by the application.
Edge Impulse Data Forwarder Event
applications/machine_learning/src/events/ei_data_forwarder_event.h
applications/machine_learning/src/events/ei_data_forwarder_event.c
- group ei_data_forwarder_event
Edge Impulse Data Forwarder Event.
Enums
- enum ei_data_forwarder_state
Edge Impulse data forwarder states.
Values:
- enumerator EI_DATA_FORWARDER_STATE_DISABLED
- enumerator EI_DATA_FORWARDER_STATE_DISCONNECTED
- enumerator EI_DATA_FORWARDER_STATE_CONNECTED
- enumerator EI_DATA_FORWARDER_STATE_TRANSMITTING
- enumerator EI_DATA_FORWARDER_STATE_COUNT
- struct ei_data_forwarder_event
- #include <ei_data_forwarder_event.h>
Edge Impulse data forwarder event.
Public Members
- struct app_event_header header
Event header.
- enum ei_data_forwarder_state state
Edge Impulse data forwarder state.
Machine Learning Application Mode Event
applications/machine_learning/src/events/ml_app_mode_event.h
applications/machine_learning/src/events/ml_app_mode_event.c
- group ml_app_mode_event
Machine Learning Application Mode Event.
- struct ml_app_mode_event
- #include <ml_app_mode_event.h>
Machine learning application mode event.
Public Members
- struct app_event_header header
Event header.
- enum ml_app_mode mode
Machine learning application mode.
Machine Learning Result Event
applications/machine_learning/src/events/ml_result_event.h
applications/machine_learning/src/events/ml_result_event.c
- group ml_result_event
Machine Learning Result Event.
- struct ml_result_event
- #include <ml_result_event.h>
Machine learning classification result event.
- struct ml_result_signin_event
- #include <ml_result_event.h>
Sign in event.
The event that is called by modules to mark that the module actively listens for the result event.
Sensor Simulator Event
applications/machine_learning/src/events/sensor_sim_event.h
applications/machine_learning/src/events/sensor_sim_event.c
- group sensor_sim_event
Simulated Sensor Event.
- struct sensor_sim_event
- #include <sensor_sim_event.h>
Simulated sensor event.