GeForce NOW’s RTX 3080 membership is the next generation of cloud gaming. This GFN Thursday looks at one of the tier’s major benefits: ultra-low-latency streaming from the cloud. This week also brings a new app update that lets members log in via Discord, a members-only World of Warships reward and eight titles joining the GeForce NOW library.
Software: Neuton TinyML; Hardware: Arduino Nicla Sense ME
Less than half a year ago, a new Arduino board, the Nicla Sense ME (Motion & Environment), hit the market. Needless to say, I couldn’t wait for a chance to try this brand-new device and explore its capabilities.
The Nicla Sense ME is currently the smallest board in the Arduino PRO family of boards for industrial applications. It comes with a range of Bosch Sensortec smart motion and environmental sensors, and I’m quite sure this device will find wide use in industrial IoT, as it’s well suited to various research projects and commercial applications.
I’m really excited about the pace of development of the TinyML field because such small devices coupled with tiny machine learning capabilities indeed demonstrate enormous potential for many industries and domains.
Now, let me share my findings after an experiment with Nicla Sense ME.
Procedure:
In my experiment, I decided to play with in-air handwriting recognition using an accelerometer-based pen device, in order to test the ability of the Nicla Sense ME to identify and process data about handwritten digits on the edge. Compared to handwriting recognition on touch devices, this task is much more complex and challenging.
The device used in the experiment consisted of a triaxial accelerometer and the Nicla Sense ME. The accelerations generated by hand motions were processed on the MCU for inference with the Neuton TinyML model.
Step 1: Model Training
For model training, I created my own dataset by capturing data containing 200 samples for each handwritten digit (from 0 to 9). Each sample was recorded for 2 seconds with a frequency of 100 Hz. It’s easy to calculate how many times I had to move my hands in the air to collect this data, and it’s even easier to imagine what my family members were thinking about me 🙂
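The article does not show the capture tooling, so below is only a minimal sketch of how such samples could be recorded on a host machine, assuming the Nicla Sense ME firmware streams comma-separated accelerometer readings ("ax,ay,az") over USB serial at 100 Hz; the port name, baud rate, and file naming are assumptions for illustration.

```python
# Hypothetical capture script (assumed serial output format: "ax,ay,az" per line).
# Records one 2-second sample (200 readings at 100 Hz) per call and saves it as CSV.
import csv
import serial  # pyserial

PORT = "/dev/ttyACM0"   # assumed serial port, adjust for your system
SAMPLE_ROWS = 200       # 2 seconds * 100 Hz

def capture_sample(digit: int, sample_idx: int) -> None:
    rows = []
    with serial.Serial(PORT, 115200, timeout=1) as ser:
        while len(rows) < SAMPLE_ROWS:
            line = ser.readline().decode(errors="ignore").strip()
            parts = line.split(",")
            if len(parts) == 3:
                rows.append(parts)  # ax, ay, az
    with open(f"digit_{digit}_sample_{sample_idx}.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["ax", "ay", "az"])
        writer.writerows(rows)

if __name__ == "__main__":
    capture_sample(digit=0, sample_idx=1)
```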
I merged the captured samples into a single CSV file and added a target variable using a Python script. The resulting dataset consisted of three features (the accelerometer axes) and 400,000 rows (10 digits × 200 samples × 200 rows per sample).
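The merge script itself is not included in the article; the following is a minimal sketch of what it might look like, assuming per-sample CSV files named like digit_<d>_sample_<n>.csv (as in the hypothetical capture sketch above).

```python
# Hypothetical merge script: concatenates per-sample CSVs into one dataset
# and derives the target column "Digit" from each file name.
import glob
import re
import pandas as pd

frames = []
for path in sorted(glob.glob("digit_*_sample_*.csv")):
    digit = int(re.search(r"digit_(\d)_sample", path).group(1))
    df = pd.read_csv(path)   # columns: ax, ay, az
    df["Digit"] = digit      # target variable
    frames.append(df)

dataset = pd.concat(frames, ignore_index=True)
dataset.to_csv("in_air_digits.csv", index=False)
print(dataset.shape)  # roughly (400000, 4): 10 digits * 200 samples * 200 rows each
```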
I uploaded the dataset in CSV format to the Neuton TinyML platform and selected the target variable (Digit) and the target metric (Accuracy). Then I enabled the TinyML option and selected 8-bit calculations without float support. For this case, I also activated the Digital Signal Processing option for automatic data preprocessing and feature extraction. The model was trained automatically, and nothing needed to be compressed.
I chose Neuton for this experiment since the platform automatically builds models that are optimal in size and accuracy. I also tried to train the same model with TensorFlow. Check out the comparison of the resulting metrics:
Just have a look at the dramatic difference in the number of coefficients: Neuton’s model has 42 times fewer coefficients than the TensorFlow one!
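The TensorFlow code is not shown in the article, so the snippet below is only a rough sketch of the kind of Keras baseline that could be trained on the same data for such a comparison; the architecture, the flattened 200×3 input window, and the hyperparameters are assumptions, not the model actually used.

```python
# Hypothetical TensorFlow/Keras baseline for the comparison, assuming each training
# example is a flattened 2-second window (200 readings * 3 axes) with a digit label 0-9.
import tensorflow as tf

def build_baseline(input_len: int = 200 * 3, num_classes: int = 10) -> tf.keras.Model:
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(input_len,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# model = build_baseline()
# model.fit(x_train, y_train, epochs=30, validation_split=0.2)
```

Even a small dense network like this sketch carries tens of thousands of weights, which illustrates why the coefficient counts of a conventional neural network and a compact Neuton model can differ so dramatically.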
Step 2: Embedding into a Microcontroller
After the training was completed, I downloaded the archive containing all the files necessary for embedding the model into the microcontroller’s firmware project.
As you can see, the recognition accuracy is really high, and the model managed to identify all the digits with high probability. The model runs right on the device and weighs about 1 KB. That’s what I call real TinyML!
Conclusion:
I believe that the future belongs to such incredibly tiny machine learning solutions. The ability to sense and accurately process various types of data in real time, with low energy consumption but high computing power, opens up new opportunities in many fields. The implementation of such solutions can, for instance, streamline robotic science, ease the detection of seismic activities and dangerous gases in the air, advance IoT devices that we use on a daily basis, and a lot more. The scenarios are infinite! Isn’t that inspiring?
Please write in the comments below what other experiments you want me to do with Nicla Sense ME!
Record quarterly revenue of $7.64 billion, up 53 percent from a year earlier
Record fiscal-year revenue of $26.91 billion, up 61 percent
Record quarterly and fiscal-year revenue for Gaming, Data …
The practice of identifying raw data (such as pictures, text files, videos, etc.) and adding relevant and informative labels that provide context to that data is known as data labeling. It is used to train machine learning models in many use cases. For example, labels can be used in computer vision to identify whether a photograph contains a bird or an automobile, or in speech recognition to determine which words were spoken in an audio recording.
Overall, labeled datasets help train machine learning models to recognize and understand recurrent patterns in the input data. After being trained on labeled data, the ML models are able to recognize the same patterns in new unstructured data and produce reliable results.
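As a tiny illustration of the idea, a labeled dataset is essentially a mapping from raw items to class labels; the file names and labels below are made up for the computer vision example mentioned above.

```python
# Illustrative only: a minimal labeled image dataset pairing each raw file
# with a class label, so a model can learn "bird" vs. "automobile".
labeled_data = [
    {"file": "img_0001.jpg", "label": "bird"},
    {"file": "img_0002.jpg", "label": "automobile"},
    {"file": "img_0003.jpg", "label": "bird"},
]

# After training on such pairs, a classifier can assign the same labels
# to new, unlabeled images.
for example in labeled_data:
    print(example["file"], "->", example["label"])
```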
The types of edge computing and examples of use cases for each.
Many organizations have started their journey towards edge computing to take advantage of data produced at the edge. The definition of edge computing is quite broad. Simply stated, it is moving compute power physically closer to where data is generated, usually an edge device or IoT sensor.
This encompasses far edge scenarios like mobile devices and smart sensors, as well as near edge use cases like micro-data centers and remote office computing. In fact, this definition is so broad that edge computing is often described as anything outside of the cloud or the main data center.
With such a wide variety of use cases, it is important to understand the different types of edge computing and how they are being used by organizations today.
Provider edge
The provider edge is a network of computing resources accessed via the Internet. It is mainly used for delivering services from telcos, service providers, media companies, and other content delivery network (CDN) operators. Examples of use cases include content delivery, online gaming, and AI as a service (AIaaS).
One key example of the provider edge that is expected to grow rapidly is augmented reality (AR) and virtual reality (VR). Service providers want to find ways to deliver these use cases, commonly known as eXtended Reality (XR), from the cloud to end user edge systems.
In late 2021, Google partnered with NVIDIA to deliver high-quality XR streaming from NVIDIA RTX-powered servers on Google Cloud to lightweight mobile XR displays. By using NVIDIA CloudXR to stream from the provider edge, users can securely access data from the cloud at any time and easily share high-fidelity, full graphics immersive XR experiences with other teams or customers.
Enterprise edge
The enterprise edge is an extension of the enterprise data center, consisting of things like data centers at remote office sites, micro-data centers, or even racks of servers sitting in a compute closet on a factory floor. This environment is generally owned and operated by IT in the same way as a traditional centralized data center, though space or power limitations at the enterprise edge may change the design of these environments.
Looking at examples of the enterprise edge, you can see workloads like intelligent warehouses and fulfillment centers. Improved efficiency and automation of these environments requires robust information, data, and operational technologies to enable AI solutions like real-time product recognition.
Kinetic Vision helps customers build AI for these enterprise edge environments using a digital twin, or photorealistic virtual version, of a fulfillment or distribution center to train and optimize a classification model that is then deployed in the real world. This powers faster, more agile product inspections and order fulfillment.
Industrial edge
The industrial edge, sometimes called the far edge, generally has smaller compute instances that can be one or two small, ruggedized servers or even embedded systems deployed outside of any sort of data center environment.
Industrial edge use cases include robotics, autonomous checkout, smart city capabilities like traffic control, and intelligent devices. These use cases run entirely outside of the normal data center structure, which means there are a number of unique challenges for space, cooling, security, and management.
BMW is leading the way at the industrial edge by adopting robotics to redefine its factory logistics. Different robots handle different parts of the process: they pick up boxes of raw parts on the line and transport them to shelves to await production. The boxes are then taken to manufacturing and finally returned to the supply area when empty.
Robotics use cases require compute power both in the autonomous machine itself, as well as compute systems that sit on the factory floor. To optimize the efficiency and accelerate deployment of these solutions, NVIDIA introduced the NVIDIA Isaac Autonomous Mobile Robot (AMR) platform.
Accelerating edge computing
Each of these edge computing scenarios has different requirements, benefits, and deployment challenges. To understand if your use case would benefit from edge computing, download the Considerations for Deploying AI at the Edge whitepaper.
Sign up for Edge AI News to stay up to date with the latest trends, customer use cases, and technical walkthroughs.
As part of Jaguar Land Rover’s Reimagine strategy, the partnership will transform the modern luxury experience for customers starting in 2025
Software experts from both companies will jointly …
Is this the best podcast ever recorded? Let’s just say you don’t need a GPU to know that’s a stretch. But it’s pretty great if you’re a fan of tall tales. And better still if you’re not a fan of stretching the truth at all. That’s because detecting hyperbole may one day get more manageable.
Jaguar Land Rover and NVIDIA are redefining modern luxury, infusing intelligence into the customer experience. As part of its Reimagine strategy, Jaguar Land Rover announced today that it will develop its upcoming vehicles on the full-stack NVIDIA DRIVE Hyperion 8 platform, with DRIVE Orin delivering a wide spectrum of active safety, automated driving and parking.
However, from what I can observe so far, the TensorBoard graph does not really show user-understandable node names and cell names, which makes it difficult to track down the connections within the graph.