IoTool Save Data and Blobs
Introduction
This document consolidates the workflows for efficient management and storage of sensor data, particularly focusing on Binary Large Objects (BLOBs) and data ingestion and processing. Each flow serves a unique purpose but shares a common foundation in handling HTTP POST requests, sensor data management, and PostgreSQL database interactions. The workflows included are:
- /android/iotool_lite_v3_savedata
- /android/iotool_v3_savedata
- /android/iotool_v3_saveblob
- /android/iotool_lite_v3_saveblob
Common Components and Processes
Handling HTTP POST Requests
Each flow is initiated by receiving data through a POST request to a specified endpoint. The structure of these requests generally includes parameters such as username, password, database information, sensor count, and sensor data. An example JSON structure for these POST requests is provided below. Note that the values sent in this request, including the user credentials (up and un) as well as the host, url, and endpoint, are placeholders and should be replaced to fit actual use cases.
Example JSON file:
{
"info": {
"_postman_id": "56ff6dea-0cd1-40b3-80ef-8a7426935ede",
"name": "IoTool",
"schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json",
"_exporter_id": "4611963"
},
"item": [
{
"name": "iotool_lite_v3_savedata",
"request": {
"method": "POST",
"header": [],
"body": {
"mode": "urlencoded",
"urlencoded": [
{
"key": "un",
"value": "username",
"type": "text"
},
{
"key": "up",
"value": "password",
"type": "text"
},
{
"key": "db",
"value": "localhost",
"type": "text"
},
{
"key": "sensorcount",
"value": "1",
"type": "text"
},
{
"key": "db0",
"value": "senzor1",
"type": "text"
},
{
"key": "data0",
"value": ",1#,2#,3",
"type": "text"
}
]
},
"url": {
"raw": "http://dev.sandbox.engineering:1880/android/iotool_lite_v3_savedata",
"protocol": "http",
"host": [
"dev.sandbox.engineering"
],
"port": "1880",
"path": [
"android",
"iotool_lite_v3_savedata"
]
}
},
"response": []
}
]
}
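The sketch below shows how a Node-RED function node might unpack these urlencoded fields from msg.payload after the HTTP In node has decoded them. It is a minimal illustration only: the field names (un, up, db, sensorcount, db0, data0) follow the example above, but splitting the data string on the # character is an assumption made for this sketch, not a specification of the wire format.

// Minimal sketch: read the urlencoded POST fields inside a Node-RED
// function node. The HTTP In node places decoded form fields on
// msg.payload. Splitting dataN on '#' is an assumption for illustration.
const body = msg.payload || {};

const credentials = {
    username: body.un,
    password: body.up,
    database: body.db
};

const sensorCount = parseInt(body.sensorcount, 10) || 0;

// Collect each sensor name (dbN) and its raw reading string (dataN).
const sensors = [];
for (let i = 0; i < sensorCount; i++) {
    const raw = body['data' + i] || '';
    sensors.push({
        name: body['db' + i],      // e.g. "senzor1"
        rawData: raw,              // e.g. ",1#,2#,3"
        readings: raw.split('#')   // assumed separator between readings
    });
}

msg.credentials = credentials;
msg.sensors = sensors;
return msg;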
Sensor ID Management
Each flow must manage incoming sensor data: generating the sensor IDs that link readings to specific sensors in the database and preparing the sensor data for SQL insertion. Special consideration is given to data type transformations and entry structuring to ensure database compatibility. This process is handled by two dedicated subflows:
- Try to Add Sensor IDs Subflow: Enhances the parent flow by focusing on error detection and data structuring. It acts as a precursor to the addNewSensorsIDs subflow, analyzing that subflow's output to handle any potential errors and formatting encountered errors into HTML for easy interpretation.
- Add New Sensor IDs Subflow: Combines database management with error logging. It begins by ensuring the necessary table exists before invoking the getSID subflow, which provides the credentials required to access a specific user within PostgreSQL. Its main task is to search the database for the sensor ID from the incoming HTTP POST request and add it to the database if not found, with error handling mechanisms in place to manage failures. A sketch of this lookup-and-insert logic follows the list.
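The following is a minimal sketch of that lookup-and-insert step, written as a Node-RED function node that prepares a parameterized query for a downstream PostgreSQL node (one that reads msg.query and msg.params). The table and column names are assumptions, and the subflow's separate search-then-insert steps are condensed here into a single upsert for brevity.

// Illustrative sketch of the sensor-ID resolution performed around the
// addNewSensorsIDs subflow. Table and column names are assumptions;
// the actual schema is defined by the IoTool flows themselves.

// Step 1 (typically a separate node): make sure the lookup table exists.
// CREATE TABLE IF NOT EXISTS sensors (
//     sensor_id   SERIAL PRIMARY KEY,
//     sensor_name TEXT UNIQUE NOT NULL
// );

// Step 2: resolve the incoming sensor name to an id, inserting it if it
// is not yet known. The upsert condenses "search, then add if not found"
// into one statement.
msg.query = `
    INSERT INTO sensors (sensor_name)
    VALUES ($1)
    ON CONFLICT (sensor_name) DO UPDATE
        SET sensor_name = EXCLUDED.sensor_name
    RETURNING sensor_id;
`;
msg.params = [msg.sensors[0].name];  // sensor name parsed from the POST request
return msg;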
PostgreSQL Database Interaction
All flows engage with a PostgreSQL database for data storage and retrieval, involving:
- User login
- Table creation and manipulation
- Dynamic SQL query generation
- Data insertion
Reusable Node-RED subflows are employed for various tasks, including retrieving sensor IDs and logging database operations, ensuring efficient and secure database interaction across all flows.
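As an illustration of the dynamic SQL query generation mentioned above, the sketch below builds a parameterized multi-row INSERT from (timestamp, reading, sensor ID) tuples. The readings table and its columns are assumptions used only for this example; the real schema belongs to the IoTool flows.

// Illustrative sketch of dynamic SQL generation for bulk insertion of
// readings. Table and column names (readings, ts, value, sensor_id) are
// assumptions. Values are passed as parameters instead of being
// concatenated into the SQL string.
const sensorId = msg.sensorId;      // resolved earlier in the flow
const rows = msg.readings || [];    // e.g. [{ ts: 1700000000000, value: 21.5 }, ...]

if (rows.length === 0) {
    return null;                    // nothing to insert
}

const placeholders = [];
const params = [];

rows.forEach((row, i) => {
    const base = i * 3;
    placeholders.push(`($${base + 1}, $${base + 2}, $${base + 3})`);
    params.push(row.ts, row.value, sensorId);
});

msg.query = `INSERT INTO readings (ts, value, sensor_id) VALUES ${placeholders.join(', ')};`;
msg.params = params;
return msg;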
Sensor Readings Management
This stage is where the four example flows differ most, in how they interpret and organize incoming sensor data and how they build dynamic SQL queries for inserting it into a PostgreSQL table. Each flow adapts sensor data for SQL insertion in its own way, primarily within the "parse and insert data" function node (a sketch of these per-flow transformations follows the list):
- IoTool Lite v3 Save Data: This flow handles special numerical values like NaN, -Infinity, and Infinity by replacing them with their string representations. The data parsed from the incoming message is then used in dynamic SQL query generation, where tuples of (timestamp, reading, sensor ID) are generated for database insertion.
- IoTool v3 Save Data: Similar to the Lite version, this flow also manages special numerical values. Additionally, it includes functionality for handling zipped data; if the incoming message indicates that data is zipped, it attempts to unzip it using the pako.inflate method.
- IoTool Lite v3 Save Blob: This flow treats incoming data as UTF-8 encoded and converts the data into a hexadecimal string. It also structures the data for database insertion, including adding a prefix to the hexadecimal string.
- IoTool v3 Save Blob: Similar to the Lite version, but it treats incoming data as base64 encoded. This flow also converts the data into a hexadecimal string and prepares it for insertion into the database.
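The sketch below illustrates the per-flow transformations described above: replacing special numeric tokens, optionally inflating zipped payloads with pako, and converting blob payloads to a hexadecimal string. The helper names, the quoting of the replaced tokens, and the \x prefix are assumptions made for this illustration; only the general techniques come from the flow descriptions.

// Illustrative sketches of the per-flow differences. Only the general
// techniques (token replacement, pako.inflate, hex encoding) come from
// the flow descriptions; everything else is assumed for the example.
const pako = require('pako');  // in Node-RED, expose pako through the function node's module setup instead

// Save Data flows: replace special numeric tokens with string forms so
// they survive parsing and SQL insertion. The exact replacement format
// used by the flows is not shown here; quoted strings are one option.
function normaliseSpecialValues(raw) {
    return raw
        .replace(/NaN/g, '"NaN"')
        .replace(/-Infinity/g, '"-Infinity"')
        .replace(/(?<!-)Infinity/g, '"Infinity"');
}

// IoTool v3 Save Data: inflate the payload first when it is zipped.
function maybeUnzip(payload, isZipped) {
    return isZipped ? pako.inflate(payload, { to: 'string' }) : payload.toString();
}

// Save Blob flows: convert the payload to a prefixed hexadecimal string.
// The Lite variant treats the input as UTF-8 text, the full variant as base64.
function blobToHex(data, encoding /* 'utf8' or 'base64' */) {
    return '\\x' + Buffer.from(data, encoding).toString('hex');  // '\x' prefix is an assumption (bytea-style literal)
}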
Conclusion
This unified documentation provides a comprehensive overview of the IoTool flows, highlighting their shared foundations in HTTP POST request handling, sensor data management, and PostgreSQL database interactions. By understanding the commonalities and differences among these flows, users can better leverage the IoTool suite for efficient sensor data handling and storage.